Misinformation, Media Literacy and Listening to People Who Disagree
Join us this week for a discussion with Brian Knappenberger, a producer and director renowned for such documentaries as Web of Make Believe: Death, Lies and the Internet, The Internet's Own Boy: The Story of Aaron Swartz, We Are Legion: The Story of the Hacktivists, and Turning Point: 9/11, to name a few. He shares insights from his recent documentary series Web of Make Believe and traces the trajectory of misinformation from the 2016 election forward.
We explore themes around technology innovation and how society adapts in both positive and negative ways, and how it presents opportunities for cyber attackers to exploit cracks in the system for financial gain. We also discuss the impact of today's always-on, always-connected world, which, as Marshall McLuhan once observed, has become “quite as imperceptible to us as water is to fish.”
[01:22] The Best Director Documenting Misinformation
Rachael: Welcome to the podcast, Brian Knappenberger, a documentary filmmaker whom Decider magazine, just last month in June, called the best director documenting our digital age.
Brian: That's a good intro. Thanks for having me.
Rachael: It was such an amazing article. I think it was a wonderful overview of your work, and how deep you go into the storytelling. For those that aren't aware, Brian is the executive producer and director of Web of Make Believe: Death, Lies and the Internet, currently on Netflix. It's a riveting series. How do you even come up with that? It's six episodes, and the last one is a two-parter. It's such an expansive world out there. How could you even get it down to just those few stories to call out?
Brian: That's true. It was an effort to figure out the right combination of stories that would get at what we were after, that kind of blend of misinformation and this chaotic world we live in online. It was a bit of a challenge to figure out what stories would really get at the different elements we wanted to talk about.
Eric: What's that process?
Brian: In this case, I've done a bunch of different movies, and I'm attracted to these kinds of stories. I think over time, I've just collected a bunch of stories of things that I've wanted to either keep an eye on or just put in the back of my mind. I met somebody who was involved with this, which is the case with the Stingray story, and the Daniel Rigmaiden story.
The Larger Point of Misinformation
Brian: I met him right when he came out of prison initially. I was always curious about his story. These are stories, files, and things that I've kept around for a while, that just didn't make sense to make a full feature about. But they made sense as a piece of reaching for this larger point we wanted to make about misinformation.
Eric: You pulled them into that greater, I was going to say thesis, but that greater grouping around misinformation, disinformation, and it's riveting from week to week. They fit together. It's not a disconnected set of documentary interviews and stories, it's wow, what's next? Where's he going next?
Brian: We start the whole series with this quote, this sort of famous Marshall McLuhan quote. "We live invested in an electric information environment that is quite as imperceptible to us as water is to fish." It is this world we swim in that we don't always see, or we don't always understand. It’s having a real-life effect on us, shifting, and changing what we think even though we don't quite understand it. Marshall McLuhan wrote that in the sixties, by the way, but I think it reflects how we all feel right now.
Rachael: One of the things that I was really struck by because we had all heard about the Seth Rich story. I think that was the opening episode and I didn't ever really think about it from that perspective. You really dug into it holistically, from a 360-degree view. How do you even come to that point and how are you even able to tell the story from that vantage point?
The Seth Rich Story
Rachael: Getting all of these folks to share perspectives plus all the clips that you cut in from the media made me really sad.
Brian: The Seth Rich story is really fascinating to me. I think that the way it started is that we were able to interview Seth Rich's parents. This is the first time that they came forward with an interview. They also talked to David Folkenflik at NPR virtually at the same time.
This is the first time that they've come public about what happened, and what they've gone through. That was an interesting way into the story.
This is a story of misinformation. It's just so interesting how this escalated and was manipulated by so many different people and so many different forces. There's a real human story at the heart of this. These parents lost their son and are trying to figure out what actually happened, and what the truth is. Who murdered our son? It is very clear what they're after. All of this swirl, this tsunami of misinformation that's coming out of this story is leading them further away from their central quest.
It's leading them further away, because every day that goes by, it becomes more difficult for them to understand what really happened to their son. The story of what happened to their son gets more and more outlandish in the hands of these political operatives.
Eric: I want to take a step back for a second, for people who may not have seen the episode, or read about Seth Rich. He was a slain DNC, Democratic National Committee, staffer, who was killed in July 2016 during the election process, as Hillary Clinton's emails were being exposed.
Eric: His death was, I guess, exploited is the best way to describe it.
Brian: Part of the way of looking at this episode for us was looking at the story. Just following this piece of misinformation, and sort of tracking it, so you got like a map almost. How do you follow where this went, and what happened? It starts as a local news story.
Eric: Local in D.C., or Nebraska, where he's from?
Brian: In D.C.
Eric: The D.C. Police thought it was just, "We've had murders in this part of D.C. Standard robbery, probably not a big deal." I can't believe I just said that. We're talking about a murder in the nation's capital. They basically thought it was a robbery, and he got caught up in a robbery of some sort.
Brian: That's right. There had been some other deaths in that area before that. It was an area that had some crime in it. That's what the D.C. Metro Police believed at the beginning. But if you rewind back to that period of the election, the Trump election is in full swing. There is a lot of speculation about what's happening with Russian misinformation, or Russian interference in the election in some way, and this feeds right into that.
Seth had worked at the DNC in voting rights. He was, I think, obsessed with the mechanics of voting, getting people out to vote, registering people to vote, that sort of thing. Hillary had just gotten the nomination to be the Democratic candidate, and he had gotten a job offer to work on the Hillary Clinton campaign.
The Beginning of the Story
Brian: His parents said that was his preferred candidate. He was mulling that over that night and clearly had a lot to drink. He’s at this bar and grill and was walking home late. You can imagine yourself in your early 20s. It was a period of time when he was thinking through his choices. This was a big deal. He'd have to move away from his girlfriend. He had multiple, lengthy conversations with his girlfriend that night. It was one of those real early 20s life moments, as he's had a few drinks, and was walking home. That's when he's shot.
They think it was a robbery gone bad. I think they were scared; there was no video surveillance or anything. There was one clip where you see legs come into the frame, but other than that, it's inconclusive. That begins this story. But then there's the connection to the Hillary Clinton campaign, and the already established misinformation that Hillary Clinton is, I don't know, we've all heard of the cabal of killers, and the bodies she's left in her wake. This feeds into that, and we're off to the races in one of the weirdest conspiracy theories of the last few years.
Rachael: It really struck home too, as you're looking at all the sides of this, that what got lost was that he's a human being. This was a person, not a thing. But as the story grew, he just became this digital thing that people were following. It broke my heart for his parents, because it seemed like there was no one they could trust to help them find the truth of what really happened.
[11:10] Political Misinformation Environment
Eric: And they're in Nebraska, quite a ways away from the big city, D.C.
Brian: They're a fish out of water in this political misinformation environment. There's a series of people that come in who are ostensibly trying to help them. Or, at least, they say they're going to try to help them find out what happened to their son, and they step in, and they're doing anything but. The very best you can say of some of them is that they just wanted the publicity.
Others seem to be actively trying to use this story to deflect from potential Russian interference on the side of Trump, just to muddy the waters a little bit. In any case, there's a series of figures. We plot out these people who come in and try to use this story in some way to make a political play.
Rachael: It just really struck me, because you could see it, especially as younger people today. They only have the internet and social media. As I like to talk about on the podcast, I just recently got on TikTok just to see what it was all about. I get sucked in, because the algorithm figured out, I really like dog videos and kittens, so it's just serving me up on the serotonin. It's giving me the juice. But you do forget you're living your life online, and it's not real life. So, I got to put the TikTok down. But it dovetails into the gal that you got to go on camera, who had the Nazi affiliation somehow. She got into that trying to understand it, because she thought it was crazy and then got sucked into it.
A Slippery Slope
Rachael: I just think that's a really fascinating story, but I think a real slippery slope that people are forgetting about when you get sucked in online. It's like a game until it's not a game.
Eric: That was episode three, I think. I'm Not a Nazi.
Eric: That was scary.
Brian: I think that's exactly right. I think she would probably say this, that it was a difficult time in her life. She was confused and was going through these relationships. We do track her thinking and where she gets this information, and what leads her to the next step. It seems shocking at first, but then she learns to deal and live with it. That takes her to the next step. This pathway is aggravated by algorithms of YouTube and other social media outlets, that can sense you like dog videos.
Or it can sense, I guess more directly, that we like conflict. That there is a sort of amped up or just taking someone to that next level, it's increasing their engagement. This becomes this rabbit hole that people can go down. I was curious about that, where she was, and who she was. Look, I think there's a lot of confusion about her too. You look at her, and I met her and hung out with her.
You wonder, is she telling the truth now? Is this how she really feels? I do think it's how she feels but, to some degree, she was at a point where she went down this rabbit hole. A lot of people that are on YouTube don't do this, you have to understand. The complexity of her character is one of the things that makes it so compelling.
A White Supremacist
Eric: The way she described the questioning, and the desired response, was fascinating to me. It wasn't, "Are you a white supremacist?" It was easing into it, and then it seemed like they would build. There was a whole, scripted may not be the right word, but architected recruitment process to get you to where they wanted you to be. A white nationalist perspective, without saying, "We're anti-Semitic. We hate the Jews." They didn't go right there. It was almost easing you in, almost like a gating process, as she was interviewed. Just understanding that was fascinating to me.
Brian: I think that's exactly right. You have these parallel things happening. You have the algorithm, which is maybe nudging you towards more extreme thinking. But within these groups, as you're saying, there's an awareness that if they go right to the more extreme stuff, it could hurt their recruiting cause.
So, there is this sort of easing in with guarded language, and certain professors, and things like ideas on immigration policy. Those are stepping stones into this, and a relatively sophisticated awareness that you can walk someone down this path if they're already open to it.
Eric: It made me think about disinformation, misinformation in general. Like how do you take somebody from where they are to where you want them to be? What's that process, whether internet-driven or in person in meetings? She went to the party in New York in this episode. Then it starts building from there. How do you change the mindset of somebody, especially to be something very deviant or extreme? Let's use the word extreme. I don't care if you're right or left, and I don't care what you believe in.
The Tools and Techniques They Use for Misinformation
Eric: How do you take somebody from where they are to where you want them to be? What are the tools and techniques that they use? In my mind, as I watched the episode, I just started thinking about all the shaping of the clay almost. It was really a fascinating episode and scary for me.
Brian: You see that with her, and you watch her walk down this path, then you watch her from the perspective of where she is now. It is a place, at least if you believe her, of real regret, and trying to make amends for the decisions that she made. You wonder, what are those steps? I think part of it, and one thing she said that was interesting to me too, was, "It's like a cult." In the sense of, you do get yourself into this position where it's very difficult to get out. You establish friendships there.
It's one of those things like a cult, you're either in or you're out, there's no in between. Also, you've done things. Things have happened. You've said things that other people know won't look good for you. So there's a real social cost to turning around and getting out of it for some people, because by the time they're at that party, the veil has dropped. They're sieg heiling, and there's no illusion anymore. These are violent extremists, but at that point, maybe this person has said or done something that will do real damage to them if they leave.
There's this weird cultish social pressure to it as well.
Misinformation Disinformation in the Internet Age
Eric: We'll stick with the Nazi theme for a second. My one son asked me years ago, "Why did the Germans not stop Hitler? Why did they not stop this behavior?" I think it’s very similar to misinformation, disinformation in the internet age, it does build up. You get into a perspective where you get to a point where you can't back out easily because you're somewhat guilty. You're somewhat part of the process at some point, and that's what I saw in this episode. It was an interesting episode.
Brian: I think it's interesting to take apart the parallels of that in Germany and think of the nationalism that fed into that. The fear of immigration and the other, the way that those things were exploited to lead people down this path.
Eric: You have leaders that are very charismatic. They're driving you in one direction, same thing here. I think the advent of the internet has just made it so much easier for these extreme organizations to not justify but broadcast their perspectives. Even in the Seth Rich story, Fox News is incredibly involved, amplifying the story. Whatever you believe, for or against, believe what you want. The amplification process is incredible.
Brian: That's when it becomes legitimized in the mind of a lot of people.
Eric: That's the problem.
Brian: People start to say, "Is this real or not?" That's when you have just that head-spinning sense of what is reality and what isn't. Of course, Fox had to ultimately settle a case with the Rich family because of some of the things that were broadcast on Fox. But it's just wild to me that any detail in this story, no matter how insignificant, was just treated as gold.
[21:23] Giving the Story More Credibility
Brian: As long as it was some sort of bread crumb along the way to giving the story more credibility, it just didn't even matter. When you're making a documentary, and you're laying it out chronologically, you're seeing these steppingstones. You look back and they're disappearing like rocks in a river or something, then you see how flimsy every step along the way was. But some people are still hanging onto this flimsy idea.
Eric: As a documentarian, I feel like you've got the perspective that people don't have when they're in the middle of whatever the story is. I'm assuming you're looking at it from all perspectives. You're pulling all the data. You've got time and the big picture. I think every one of your stories I've seen so far pretty much comes to some conclusion, even if there's no conclusion of what exactly happened. You're able to see it after the fact, and the whole picture. Is that fair, or probably not?
Brian: That's what I'm trying to do. I think in the documentary, you do have the benefit of, in some ways, hindsight and trying to make sense of it.
Eric: I think that's what I was trying to say.
Brian: It's not done in the rush of headline writing. You've had time to look back and try to piece it together. I also think one of the things that we really try to do is find and listen to people who disagree with us. In fact, I think that's when documentaries get interesting, when you start to find people who are subverting, or have ideas that are clashing in some way.
The Figures That Came Forward
Brian: So, we reached out to Jack Burkman in the Seth Rich episode. He's talking about how he got shot in the butt. We really wanted to talk to Ed Butowsky. Ed Butowsky is another one of those figures that came forward, and at least on the surface said he wanted to try to help the Rich family. It became clear he also had other things happening.
We tried very hard to talk to him, and thought we would talk to him, actually. We flew all the way down to Texas, set up the whole interview, and started to roll the camera. Then his lawyer, Ty Clevenger, walks in. So, you know, I just did the interview with Ty.
Eric: Interesting. Do you think that those people, when they're in the middle of it, take a pause and say, "Let me gather all the facts before I go down a path I may not want to go down"? I've got to imagine they don't stop, look, and listen, like when you're going to cross the road. I don't feel that happens.
Brian: I think they sensed an opportunity here, either for publicity or to make a political point that they thought was expedient or could be beneficial to them. The candidate, in this case, Trump, would somehow benefit from a figure like Ed Butowsky. Some of the emails and text messages that came out later in some of the trials were released by the detective in this case, or rather the sort of private investigator that was hired by Ed Butowsky. Some of those are pretty revealing. "Trump has his eyes on this."
Misinformation Is Really Scary
Brian: They're basically describing, essentially, at least making the case that Trump is interested in the story. They wanted to hurry up and release something. When you go back, you look at what they said and what was the basis by which the Fox News story was printed and then broadcast. It's very flimsy.
Rachael: It seems like it's a tough time, though, for these things. Doesn't Fox News call itself entertainment or opinion? They hide behind this framing of how they present information, ostensibly as a get-out-of-jail-free card to say whatever they want. That's really scary.
Eric: It's on every side, right? Everybody, it seems like the news is not what it was. It's not what it was back in the '80s and prior.
Rachael: That's where we are today, not even close.
Eric: I heard Ted Koppel talk about it at one point, where you had to present both perspectives of an argument or multiple perspectives. I think the news has changed quite a bit now.
Rachael: That's what I'm saying. It's not really news, and I'm not picking on one outlet over another. I'm just saying, we're at this point today where it's all opinion. They hide behind, I'd say, the guise of opinion or entertainment. "Don't take what we're saying seriously," but people actually do. It's like Twitter and misinformation, you can say whatever you want. Some people will choose to believe it and not fact-check it. That's scary. TV is a powerful thing, and there are guardrails in place. You're not allowed to say certain words before a certain hour of the evening. Children's programming.
The Genesis of Misinformation
Rachael: But these things can happen throughout the day depending on the cable news channel. I just think that's really interesting. This was a great example of how the genesis of that really spurred forward.
Eric: As you say, I just got a Fox News alert from Apple News about Amber Heard’s exposé on Johnny Depp. I was like, perfect timing.
Brian: Critical. When it comes to news and the state of information right now, I think there are a couple of pretty profound forces at work. One is just that the news industry, and journalism in general, have had their financial underpinnings decimated by the internet. We were saying that five years ago; we were saying that ten years ago. It's really brutal. So many news organizations, local news in particular, where I think the loss is particularly egregious, in people's own cities. Many fairly large cities have lost their newspapers and just don't have that kind of reporting.
So much of that tradition of speaking truth to power, of looking out for the dangers, or events, or just news in general in people's communities, has really evaporated. Then you've got that at the national level too. They say the New York Times is doing better than ever, and the Washington Post has managed, in some ways, to find subscription models that work. But you don't want to underestimate the effect all of this has had on our information environment in general. Investigative reporters are having trouble in ways that are brand new.
The Position of Social Media in Our Lives
Brian: So, you add that with the easy prevalence of the position of social media in our lives. The way that can be algorithmically juiced in various ways by entities that we don't quite know who's doing it. It creates a real crisis in understanding our world and the forces at play in it.
Rachael: It's almost hard to tell what's real and what's not real today.
Eric: Jeff Bezos buys the Washington Post, and protects that. Elon Musk may or may not buy Twitter. It doesn't look like it's going very favorably for them, it may or may not protect our approved news outlet. But other than that, what should our listeners do to get to the truth? You're not doing a documentary film on everything in their lives. There isn't always somewhere to go where you can get the authoritative source distilled down. They can't do their own research many times to the extent, certainly not to the extent you and your crew do. So what do you do?
Brian: That's true. So, you look for and champion those outlets that you think are still approaching it with that traditional, fact-checking kind of journalism. I think some of it is fractured too. You end up following different individual reporters, or different figures. It's interesting that we have so much more information now, and so much more access to information. But in a lot of ways, what that really means is that you have to be more engaged with it to understand it.
I think this is part of what you're saying, you have to be involved in it a little more.
[30:49] A Degree of Skepticism
Brian: People find that exhausting, for sure but it is a degree of skepticism across the whole media landscape that just has to be there. That sort of media literacy is just part of everybody's lives now.
Eric: I think that's a great term, media literacy. How do you think through something, a piece of news that's put out there, validate its authenticity, and validate that it's good? You could watch CNN and Fox and meet somewhere in the middle. I do that sometimes, tune into the BBC which I consider to be a more trusted authoritative source on the news than a lot of what we have here. I'd love a class in, I was going to say college, but probably high school, that would teach students how to think through a problem such as this. Disinformation, misinformation, and how to understand the tools that are available.
The interests of the people who are publishing, whether it's the Internet Research Agency from Russia, or it's the local whack job from downtown who lives under the bridge. But they're getting their word out there. If you see a semi-naked person with a homeless sign protesting in the street, and bringing God to you, I think a lot of people question that in the city.
Brian: Right, but that could just look like a normal person online. It's not that obvious. Meet Jesus at 6:00 PM by the Safeway, it's probably not going to happen. But go to that person's Twitter account, and it looks more official somehow.
Eric: Exactly. But when it's not that obvious, when it's a news channel, or it's on the internet, or Seth Rich was the whole WikiLeaks type of interaction there.
Teaching the Next Generation to Think Critically
Eric: How do we teach the up-and-coming generation to think critically through the information that they're getting bombarded by so that they can be more normalized? I don't even know if that's the right way to look at it.
Brian: I couldn't agree more that this should be deep in our school curriculum, for sure. I give screenings occasionally and talk to colleges, and I do think there's a growing awareness of and engagement with this sort of thing. That can't happen soon enough. I have a nine-year-old son who likes to watch Minecraft videos on YouTube. I've had to put all sorts of restrictions in place, as you can imagine.
Eric: Yes, wait until Minecraft says vote for Joe. Bad example. Vote for Bob. Your nine-year-old is like, "Dad, are we voting for Bob yet?" "What are you talking about? Who is Bob?"
Brian: I know.
Eric: "Dad, we're voting for Bob this year."
Brian: No, we've had to have these conversations about what it is, what YouTube is, and how it could be. To try to describe this: there are misleading figures, things that are not true. You're sending them into this world where there's this kind of infinite number of potential influences. It can't happen soon enough that kids become aware of this information environment, that they start to understand how it is constructed.
Eric: Is it happening?
Brian: I think so. He gets it and wants to watch Web of Make Believe. I'm not sure I'm ready for that yet.
Eric: Let me ask you the real question. Do you have a career for life here on these documentaries, because we're never going to solve this problem?
Information Is Manipulated Forever
Eric: Are you worried about your next 10, or 20 years of filmmaking? And, "I may need to switch subject matter, because this problem may be solved by America at some point." Or pick your country.
Brian: I think, this is a funny question.
Eric: I think you're employed for life.
Brian: In some ways, it's a deep question, because information has been manipulated forever. If you go back in history, and this comes up in a project that we're working on now, you see things like election interference, false stories, and false flag stories all the time. You're seeing the use of these same tools for quite a long time. You do see humanity reacting to each new technology that comes along. Radio comes along, and it has a very profound effect, in fact, in Germany during the pre-World War II period. Then television, and then cable television, right? Which is where some of the standards, the sort of fairness standards, those laws kind of go away.
You start to see us reacting to these changes in technology, these shifts in the way that we get information. We have to figure out how to deal with it. I think we're in one of the most profound of those shifts; the internet is just as profound as any of those, or more, and we're still trying to deal with it. I don't think we're through it yet. It's going to be a while before we get through it, and there'll be more things too. There'll be other stuff. I do think that anytime there's a new technology, there are people who are going to abuse that technology.
Misinformation: The Dark Side of Technology
Brian: I think it's almost always smart to look for the dark side. We're attracted by the fun stuff, because technology is great. Technology is exciting and fun. It's Arthur C. Clarke's famous line: any sufficiently advanced technology is indistinguishable from magic. This is true. We are tool-creating creatures, and this is the most exciting thing that we can do. But we don't see that there's also a dark side to it, mostly because that's not the fun part at the beginning.
Eric: It's fun for some people.
Brian: That's true. But it's also new, so we don't know what to look for. I think that as technology shifts, changes, and becomes more profound, there's no stop to that. There will be both wonderful things that come out of it and really challenging things that will hurt us as a society, and then we'll have to deal with it.
Rachael: It's sad. That's a good segue to this extortion episode.
Brian: We just go from darkness to darkness.
Eric: From Nazis to Sextortion.
Rachael: It was interesting watching it too, because I'm like, "Wait a minute. All these girls are in the same area." Then to find out who it was, I don't want to ruin it for anyone who hasn't seen it yet. It was just interesting, because it really opened my eyes to how its mechanisms worked. I don't know any teenagers that have gone through it. I don't have children of my own, so in my little reptilian brain, I'm like, "Why would you do that? That's crazy." But I'm also not 14 or 15. And there was this other documentary that came out, about Hunter Moore and Is Anyone Up?
The Sextortion Episode
Rachael: To have seen that, and then to juxtapose it with the Sextortion episode, it's really scary and eye-opening how little control you have over your phone or your social media. It's so easy for somebody to get access, somebody you think is a friend who is not actually a friend. That's what broke my heart for these women. You almost have to lock your life down, and who can live like that? In a lot of ways, you have to completely change the way you want to live and interact with people, and that just doesn't seem realistic.
Brian: It's horrifying. It's a uniquely modern horror, not that people haven't been bad in all sorts of ways before this. But to be preyed upon by a figure when you don't know where they're coming from, they could be anywhere in the world. That seems uniquely new to this moment, or at least this period of time. To have the added horror of not understanding where the threat is, is really bizarre.
To be so immersed in this digital world of social media, which is a reflection of your own personality and your interests. You've posted so many things online. It's who you are, it's how you communicate. To be yanked out of that, locked out of it, because this person has managed to get your password and taken control of your account, is horrifying. It's horrifying in ways that feel very new. So yes, I feel that story, in general, really opens up a whole world that we're starting to learn how to confront.
[40:40] The Cascading Fallout Effect of Misinformation
Rachael: I wonder too. There's always this cascading fallout effect. Eric knows, I always talk about going off-grid. "Why don't we just unplug, because it just seems like an absolute nightmare."
Eric: But she doesn't, she's on the dark web, on Twitter, on TikTok. She doesn't use multifactor authentication.
Rachael: It almost seems like we need to unplug. I wonder if the younger generation, I don't know what they're called these days, Gen Z or whatever. You wonder if you get so disgusted as a youth, you're like, "I don't know who I can trust. I feel like I'm being exploited."
Do you see it going the other way? Did any of the teenagers you spoke to say, "You know what? I'm going to be a Luddite. I'm only going to deal with real-life stuff, and I'm going to put the phone down"? Or do we see any kind of societal changes coming out of all these bad things, to try to get a better handle on it?
Brian: No, I don't think so. I don't really see it. What's the next generation called, by the way? Generation Alpha?
Eric: I just looked it up for you. 2010 onward, Generation Alpha.
Brian: I don't feel like I see many people wanting to turn away from technology, especially young people. I think they want to dive in. It's exciting, interesting, and new.
Eric: Well, in fact, they are.
Brian: They're attracted to it. You don't hear it that often, the "Maybe we need a break." You hear that more from Gen Z than from the Alphas. But the Sextortion story is really interesting when you think about how it played out, in terms of how to solve it, or how not to solve it.
The First Person You Meet
Brian: Maybe we're not there yet. But first of all, the first person you meet in this series is this woman, Mackenzie. She goes to great lengths to go to the police. When she does, she finds out they don't even know what an IP address is. They don't have even the most basic tools to understand what she's saying or what she's going through, and they don't take her seriously. That's super frustrating for her.
But then you have this group of other women who are going through the same thing and starting to figure it out. They're starting to draw connections, taking whatever clue they can get, and beginning to talk to each other. They come forward. I think the really powerful part of that episode is that they start by talking with their moms and the rest of their families. It's really amazing when they set aside the potential shame of something like that, or the emotional fear they have, or the desire to just walk away.
One woman talks openly about contemplating suicide. She's really upset about this but ends up talking to her mom about it. They connect with other women going through the exact same thing, to the point where they start to gather these technical clues and put things together. Finally, they find an authority figure in Raechel Moulton who's sympathetic and wants to figure it out too. It's both of those things. The idea of this series is these reverberations, this earthquake we're all experiencing when it comes to new technology and information.
The Awareness of a Crime
Brian: How do we live our lives as highly evolved apes, essentially, sutured into this highly technological, electric environment? How do we do this? Clearly, we can't live in a world in which authority figures or police don't know what an IP address is. That can't be a thing anymore. I think it's also, maybe in a more powerful way, the fact that these women are talking to each other, stepping past that kind of shame, and trying to figure it out themselves. It's that awareness of this as a crime.
Recognizing this as something predatory, combined with greater awareness among authorities and police, is a step toward figuring this out and living online in ways that are comfortable for everybody. But I don't know if we're ever going back. I don't think we're ever going back to a place where we're not using these tools. The path is forward, but how do we make this a better environment for ourselves?
Eric: As you're speaking, I'm trying to think back to a technology where we went back, where we walked away from it. The biggest one is probably nuclear weapons. We haven't used them again, but we have nuclear power, and we still have nuclear weapons and the threat of nuclear weapons. But radio, television, the internet, pick your technology of choice. I can't think of one that we actually rolled back.
Brian: I agree. Although I'm not so sure about nuclear weapons.
Eric: Granted, it's a stretch. We certainly didn't roll them back. They're still there probably pretty close to as bad as it's ever been.
The Nuclear Threat Is Growing
Brian: It's funny. I've been looking at this a little bit with a new project too. Nuclear, I mean. I think the threat is growing. Not to derail your thought, but the treaties are melting away. If the New START Treaty is re-upped in the next few years, that'd be a good thing. If it's not, and things are not great between the United States and Russia, there are no more nuclear treaties. The United States is making nuclear weapons. China is going crazy with them. They're going from 300 to probably 750 nuclear weapons in the next stretch.
Eric: I thought it was even more. But again, there probably isn't a technology out there that we've rolled back. I don't see it happening.
Brian: Sorry, I went on a nuclear rant for a second. You're right about that.
Eric: That's as close as I can get, because we actually used nuclear weapons in 1945 and haven't used them on people since. Maybe we withheld some of that. But every other technology, and we don't have to talk weapons, I mean radio, television, the internet, and automobiles, continues to progress. They get more ingrained in our lives. Maybe, like radio, they change and fade out in some manner. The horse and buggy faded out and became the automobile. I don't know what the internet, or its replacement, will be in 2152, but there's probably something on the horizon.
Brian: You know it's going to be something.
Rachael: I think we're all going to have some private internet. That's my prediction. I think we're all going to have our own little private internet, and I can invite you to it. It's protected by a VPN.
Eric: Which you don't use.
Our Level of Acceptability on Misinformation
Rachael: It's either that or unplug. I think it's going to get more restrictive if you can't trust anybody. You've got to create these little mini bunkers.
Eric: I think our level of acceptability will increase. So, "There are nude pictures of me online. Whatever, they're out there. I said a lot of things." We almost become numb to the amount of information that's out there. And let's be honest, we keep putting the information out there. Brian, I think you have topics for a long time to come. What a great career choice. You could make documentaries on misinformation and disinformation for probably the rest of your career.
Brian: I think it's a big issue that we're facing right now. Just the way that technology has shifted our whole lives, the way we see the truth, and who we are. A sense of identity, and the whole thing, and I think we're dealing with it. We're trying to figure it out.
Rachael: Just the cracks in the system too, like the whole Stingray two-part episode. There are always going to be cracks in the system, and there are always going to be people who can figure out how to exploit them. I think that's what's so fascinating too about what you do in documentary filmmaking. We say we're going to plug these holes.
We plug the holes, but then there are new holes we just created because we innovated some more. We've got this great new technology, and that's just as fascinating to me. How do people uncover the cracks in the system and exploit them? Do you ever get ahead of that? I don't think you do.
[49:40] Cracks in the System
Brian: What's so fascinating to me about the Daniel Rigmaiden story is that he was such a savvy person that he was able to figure out these cracks in the tax system, where he could do this weird hack: file fraudulent tax returns, essentially at scale, and hide all his tracks from the whole system.
Rachael: He automated it. That was genius.
Brian: Yes. Then the system that caught him, the Stingray, exploits a security weakness. I'm sure you guys know more about this than I do, the security of cell phones, cell towers, and everything else. It's this sort of man-in-the-middle kind of thing with the Stingray. But then the twist is, what's right? What's the right use of those kinds of technologies?
Consider that we have the Fourth Amendment, which is supposed to protect us from unreasonable searches of our homes, papers, and effects. How does that basic right not to be searched find its way into the digital age? Things are radically different, and all of that means something very different. It takes this bigger turn. That is, I think, a very legitimate question about our freedoms now in this digital world.
Rachael: Or lack thereof. Because I think a lot of people don't know what their rights are in some areas. It's a gray area if you're in an airport. They're like, "Give me your phone, and give me the passcode." You're like, "What? No."
Eric: The first answer is always no.
Rachael: "No." They're like, "Well, we'll detain you for the next 18 days."
Eric: Unless you're a bad person, then it's okay.
Rachael: This has been wonderful, Brian. Really appreciate your time.
Brian: Great to talk to you guys.
Eric: There might be a season two coming up.
Rachael: That would be great, and there are endless stories. Congratulations on such great work, powerful but important stories that need to be heard. Thank you for doing that. It's like investigative journalism on steroids; it's a lot of hard work. You've got to put it together and get it out into the world. Congratulations on great work and storytelling.
Brian: Thank you. I appreciate it.
Rachael: We need more of you out there. I hope your son follows in your footsteps.
Eric: If our listeners want more, go to Netflix. Everybody should have a subscription at this point. Web of Make Believe: Death, Lies and the Internet, brand new for 2022. There are six episodes, and one is a two-parter. You can get a lot more context about what we were discussing. Hopefully, we see a lot more of your work in the future. It's really well done, well produced, well shot, and it impressed me.
Rachael: As is your IMDb page. I didn't realize how much of your stuff I had seen over the years. Everything you've done, they're all very important stories that need to be heard. You walk away with a lot of questions and things to think about. That's what I want out of a good documentary.
To all our listeners, thank you so much for joining us again this week. As always, hit the subscribe button, and you'll get the episode delivered right to your inbox every Tuesday. We have great guests, like Brian, whom we're so fortunate to speak with. You get to learn from all the amazing insights he's put into his work. Until next time, be safe.
About Our Guest
Brian Knappenberger is an American documentary filmmaker known for The Internet's Own Boy: The Story of Aaron Swartz, We Are Legion: The Story of the Hacktivists, and Turning Point: 9/11 and the War on Terror, as well as his work on Bloomberg Game Changers.
Listen and subscribe on your favorite platform