[00:47] An Ugly Truth
Rachael: We have technology reporter Sheera Frenkel of the New York Times joining us on the podcast today. She has just written a book, An Ugly Truth: Inside Facebook's Battle for Domination, with Cecilia Kang. And I have to say, I read this in one sitting. Literally, I couldn't turn the pages fast enough, I was getting paper cuts.
Rachael: I was so excited, and I was so shook, after reading it. The totality of everything that you guys captured in your reporting, I'm still shook. So welcome to the podcast, Sheera. I'm so excited you're here.
Sheera: Thank you so much for that amazing compliment, it's what we wanted. We wanted it to be a good book about Facebook, but we really wanted it to be a page-turner.
Eric: Okay, hold on, I feel so insufficient right now. It's 321 pages and I was not able to read it all in one sitting. It's a fascinating read, don't take anything away from that. I'm a quick prolific reader, but Rachael, that's impressive.
Rachael: I know. The only other time I've done that was for the Da Vinci Code, so take from that what you will.
Eric: Yes, we need some help here, but okay.
Sheera: I think there's been a lot of books written about Facebook and they often are really comprehensive. They get really into the weeds of lots of facets of that company because it's a massive company. It touches on so many different business models. And I think we just kind of wanted to focus on this core question of who are the people running Facebook.
From Chapter to Chapter
Sheera: Why do they make the decisions they've made and what impacts have they had on society? And we thought if we kept it really focused on that idea and that prism, that we would keep that thrill alive. We would keep you reading from chapter to chapter.
Rachael: You absolutely did it.
Eric: Well, it worked with Rachael.
Sheera: Do you not care about society? Do you not care about democracy, Eric?
Eric: There were some great questions. As we were talking about before we got started, the back cover, you read it and it's like, "Calm down, breathe, we hear you. We never meant to upset you." And you read through and I'm like, "What do they want me to interpret from this?" Then I started turning it over in my mind, and then I dug into it, and the questions, the concepts, the ideas that were brought up. The decisions they have to make, which today, even in society, we're still making. And these are not black and white decisions. It was a fascinating read.
Sheera: Thanks. So, the back cover is a series of apologies by Mark Zuckerberg and Sheryl Sandberg. And the reason we did that is because, as reporters, that's a lot of how we started our reporting process for this book. We'd spent so much time writing one-off articles after each one of those apologies. Facebook messed up, Facebook apologizing, and specifically Mark Zuckerberg and Sheryl Sandberg saying they'll never do it again.
Keep That in Mind or Ship It
Sheera: As reporters, when you start with that, you're like, "Wait, there's a pattern here, there's something happening. It's not just one article or two articles, it's at this point, dozens of articles." And if we remind readers of that on the back cover, these are all the mistakes. When you're reading a book, we hope you kind of like, keep that in mind and think, "Oh, that's right. This is mistake number five or mistake number six, and they keep saying they'll do better, and why aren't they doing better?"
Rachael: Yes, that's what struck me too. And in the security world, we're so numb because it's like, "Oh, another breach. It's just 500 million user accounts, whatever." And you forget. When you look, like your book does, at the totality, particularly that period from the election to kind of present day, it's striking just what an impact it has had, and the cascading effects that we're going to be seeing for many years to come.
Rachael: Yes, I sat in silence after I finished it because I had to absorb it all. Just what remarkable reporting it was too. You guys are amazing reporters. So, I do want to thank you for being so talented at writing, but also so good at reporting because that's not easy to do. It's hard to do something, building up contacts like that, and really getting in there. And being able to dig in, and get to the inside. So, thank you to you and Cecilia for that because I just really appreciated it.
Facebook Is Polarized
Sheera: Thank you. I’d say it helped a lot that I had a co-author on this and she was in Washington, I was out in the Bay Area. I live in Oakland and did most of the reporting really in Menlo Park. But I think that helps because Facebook, it's really, well, I was going to say polarized. It's fractured almost between the policy side of the company, and those in Washington. And then the security teams that are in one part of Menlo Park and then the product people that are in a different part of Menlo Park.
Eric: And the employees too. I mean, the employees are even saying, "Hey, which way do we go?"
Sheera: Totally, I can't tell you how many times we would be focusing on a specific meeting. I would go talk to my sources, and Cecilia would go talk to her sources. And we realized that our own sources had no idea what the other people in the room were thinking, or doing, or what the motivations were. We were like, "Oh, they have no idea. This book is going to complete the picture." And we got that feedback after the book was published from some executives.
Sheera: Even at Facebook, they were like, "Oh, I had no idea that those were the other opinions in the room. I had no idea when I said that the reaction was this." And I just thought, "God, that's so striking because they're this company that's all about transparency and communication. Their own employees have no idea what's happening. And then they just have all these questions about what's happening in other parts of the company."
Should We Allow It or Just Ship It?
Eric: There are really difficult decisions that they have to make. Zuckerberg and the team, they've got to make these hard decisions. In the beginning, the book talks about access to confidential user data. Something almost as simple as that, which you would think is so easy. It's not, should we allow Donald Trump's messages to be published whether they're factual or not? Or somebody else's, a hate group's message, whatever. This is, how do we allow engineering or somebody access to the data so they can do their job better?
Sheera: Yes, that's an important example because you'd think it would be black and white: isn't user data private? Don't users have a right to privacy? Again, to back up for a minute here, we're not talking about a photo that I posted or whether Sheera Frenkel is male or female. We are talking about the kind of intricate data that Facebook builds on you through years. We're talking about predictive data, like am I likely to buy a house? Am I likely to have a child? We're talking about my real-time location information if I have Facebook on my phone. This is really specific data that Facebook has access to.
Sheera: They were giving their engineers total access to that because it made it faster for them to code and create programs. To me, that's such a typical example of Facebook's calculus and specifically Mark Zuckerberg's calculus. Which is, for the company to be successful, it had to grow as quickly as possible. The move fast and break things motto, which okay, technically they changed it, but let's be real here. Every engineer at the company says, "This is still our motto." And can I curse?
[08:13] F**k It, Ship It, Make Sure It’s Secure
Eric: Sure, we're a clean podcast. Break that, let's go.
Sheera: Engineers we spoke to that work there right now say one of their key mottos is f**k it, ship it. It's not check it carefully, make sure it's secure, it's f**k it, ship it.
Eric: I think we have a show title, Rachael, there goes our clean rating, but that's okay. So, what does that do though, Sheera? I mean, how does it change?
Sheera: So, what it does is it creates these decisions like, "Okay, well, we want engineers to f**k it, ship it. We want them to get these products out. And so let's just give them access to user data because they can test things in real-time. They want to make a new little, I don't know, fireworks display that goes off when you wish somebody a happy birthday. Let them test that out on 1000 people by watching the real-time reaction, do they like it, do they not like it? And that way, we get really quick data and it tells us, yes, let's roll out this product. No, let's not roll out that product."
Eric: I think it's awesome, until I start looking as an engineer into your whole entire history. You've put up their private messages and everything because I just met you on a dating app. And I want to know a whole lot about you, I'm a stalker. It's great until that point.
Sheera: That's the issue. And we opened the book with this one report that was created by Alex Stamos who joins the company as their first CISO, the first head of security.
Comprehensive Security Assessment
Sheera: One of the first things he does is this comprehensive security assessment and he flags it. He's like, "You guys have fired 52 people just in the last 18 months for abusing access to data." It's by and large men that are spying on women that they're interested in, just to be clear. I mean, there's people doing other things, but that's the majority here. And it's dangerous.
Eric: We're going to go with that stereotype, by the way, that's clean, let's keep moving on, I agree.
Sheera: I mean, that's in the reporting account.
Eric: It's factual.
Sheera: That's in the report. So, it's there in the report and it's there as part of their meeting notes. So, he tells them like, "This is an..." And they say, "Yes, we fire these people." He goes, "Yes, but it's going to keep happening, and it has happened. And you have no idea how many people you didn't even catch. But the point is not that you're catching them, the point is that you let this keep happening because you've chosen to give all these engineers access to data. You shouldn't do this. You've got to limit who has access to what. That's kind of a privacy standard at a lot of other companies at this point."
Sheera: He convinces them, but he makes a lot of enemies at the company with this decision. And that's kind of how we open the book, and how we set the stage for his entire time at the company. Which is, he comes in, he tries to say these are best practices for security and privacy. And he makes a ton of enemies.
A Simple Problem
Eric: So I'm reading this as a security practitioner and I have experience in the intelligence community where you're not allowed to spy on US citizens. Certainly not your prospective girlfriend, or your ex-wife, or your husband, or ex-husband or whatever. You just can't use these tools to do that, it's bizarre to me. This is just the beginning of the book, by the way. This is a relatively simple problem compared to some of the larger ones, in my opinion. But then it goes to think about nation-states. All they have to do is get access to the system or somebody, any employee pretty much, because it's wide open.
Sheera: Of course, they have done that. This is actually what we're getting at in the book. It's funny when you were saying the book is 300-and-I-can't-remember-how-many pages.
Eric: It was like 321 pages, I do remember.
Sheera: The book is long, it's more than 300 pages.
Eric: It's 300 pages until you get to the acknowledgements, how about that?
Sheera: There you go. So there was a version of this book that was twice as long when we first finished it. We had to cut half of it because we could have gone in so many different directions. I mean, at one point, the Russia chapters alone were like four whole chapters. We just had so much material. And, again, we had to trim it because we wanted to make it really readable. We wanted to make it more of a page-turner. We’re like, "We can't spend 10 pages on this one thing." But yes, that's absolutely right.
How China and Other Countries Ship It
Sheera: In fact, there are people who have spoken to us about how absolutely China and other countries have done exactly that. You've left yourself vulnerable to nation-states and to hacking by nation-states by giving that level of employee access to user data.
Eric: I mean, it takes open-source intelligence to a whole new level. As intelligence, with my background, I just looked at that and I was like, "Okay, this is a problem."
Sheera: Yes. It's interesting because a lot of the people that Alex Stamos brings into the company are former NSA. They have intelligence backgrounds and yes, they are. I mean, as a security team, they still have had access to quite a bit of that data. But we're talking about a group of like 12 or 15 people as opposed to a group of 20,000 or whatever security people they currently have.
Eric: Well, and to your point, they fired 52 that they really had in that window, but they hadn't fixed the problem. You're going to fire another 52 until you fix the problem.
Sheera: Right. That's kind of, in some ways, the theme here at Facebook: they have, at the core of the company, a problem. Which is that they have been motivated by growth. They've been motivated by growing as quickly as possible into new markets. They have been motivated by business growth, which relies on keeping people online as much as possible. So, for Facebook to be successful, they need to keep you online for hours a day, opening the app multiple times a day.
Sheera: The only way to do that is creating algorithms that amp up emotive content. So you open your news feed, you open Instagram, and the first thing you see is something that's going to inspire emotion in you. When you make that calculation, when that is at the core of your success as a business, how are you not going to inevitably have a problem with hate speech and misinformation?
Eric: One, if you're a nation-state, once again, I think that you're very well aligned there. That's exactly what they're trying to do. And now there's a US social media platform, I don't know, they're many. We're not just talking Facebook here, where their whole business goal is aligned with you. We want to create emotion; we want to get people excited about things on both sides. Because that drives eyeballs and the eyeballs drive money and growth.
Sheera: Right. Then it's interesting because here in the US, we have some safeguards for that. We have independent news organizations that counter some of that, we have many civil society groups. And we have a long history of NGOs. We just have institutions where people can go for other sources of information, which people trust.
Sheera: One of the things we tried to do in the book is to show what happens in other countries where those don't exist, how the introduction of Facebook there is kind of a worst-case scenario. Myanmar is the most obvious example of what happens. We have a whole chapter dedicated to Myanmar because it is. Like okay, if you could create a situation where everything that can go wrong does go wrong, that is it.
Rachael: That was it.
[15:03] No Media Literacy
Sheera: You have the introduction of social media to an entire country which has no media literacy. In the book, I talk about reporters in Myanmar in 2015 emailing Facebook, and I was one of those reporters. I was in Yangon reporting on the introduction of the internet. And I was sending Facebook email after email saying, "This is a tinderbox, this is bad. The hate speech here is off the rails." And I was getting back these form responses that were like, "Don't exaggerate. It's not that bad."
Sheera: That's what happens when you have an algorithm that amplifies hate speech. A population where so many people I interviewed in Myanmar told me that, "Well, Facebook is an American company. Surely they vet everything on their platform. And so, if we're seeing it on Facebook, it must be real."
Eric: It must be real.
Rachael: Wow, it must be real. And there was one content moderator for that whole region.
Eric: Think about Myanmar, I mean, how much is Facebook really monetizing that? It's almost not even a monetary thing, if you ask me.
Sheera: Right. Well, the amount of money it would cost them to be responsible in terms of their content moderation practices there probably doesn't make financial sense. This is a country with over 100 languages spoken. We couldn't get into this in the book because it's too complicated, but the typeface in Myanmar is also incredibly complex. They have so many characters in their alphabet. That makes it really difficult for Facebook's AI systems to even adequately monitor written content, which is usually a very easy thing for them to do in countries with Latin alphabets.
The Question Whether to F**k It or Ship It
Sheera: So there are a lot of reasons why moderating content for Myanmar is really difficult and really expensive. That raises the question, why enter that market? Why enter a market very aggressively, I should say? Because Facebook was the internet in Myanmar for the longest time. It was through a program called Free Basics where everybody got it for free on their phone. It just became the go-to, what everyone used. Nobody emailed each other there, people just Facebook messaged each other.
Eric: But it is free. So other than advertising and providing rapid internet access.
Rachael: Well, free.
Eric: I mean, but you're not paying Facebook for a subscription.
Rachael: I'm paying them in my personal data though. What's my birth date worth, you know what I mean? What is my location worth? I want to translate that into dollars.
Eric: But if you're in a third-world country and you can now communicate with your friends at no additional cost. I mean, I don't know, maybe text messaging costs money there, I have no idea, but Facebook's free. Facebook, other than data charges, it's basically free. So you were just provided a communications capability you never had before. I can see why adoption would be off the charts, makes sense to me.
Sheera: It was amazing. And people were amazed by that ability. I mean, this is a country where before the introduction of a lot of this, it's really the introduction of a cell phone that made it possible in Myanmar. Because very few people, almost no one there has a computer, but they have cell phones. So telecommunication companies move in and all these cell phone towers go up.
A Gold-Gilded Pagoda
Sheera: Sorry, this isn't in the book, but I just remember it from being there. In the middle of Yangon, there's this beautiful gold-gilded pagoda. All the streets around it used to be well-known as streets where you would go to buy stamps and postcards, because that was the way people talked to one another, you would send letters. The mail system there was the way you kept in touch with family and friends. And in the course of two to three years, all of those stores got replaced by cell phone stores and cell phone carriers.
Sheera: There were these shops, I love these shops. They just had the word Facebook in big letters. And they would charge you the equivalent of $2 or $3 to set up a Facebook account for you. Because people didn't know enough about the internet to create an email address, to make themselves a Facebook account. So they would pay somebody two bucks or three bucks to create an email address for them on Gmail. And then register a Facebook account for them because that's how important it was to get on Facebook.
Eric: And then off to the races?
Eric: At that point, they were good. They got up to speed quickly, I bet, and became more technology literate.
Sheera: Yes, they did become incredibly active Facebook users. But again, I can't tell you how many people would pull out their phones and show me a video of ISIS. Because this is 2015, 2014, they'd show me a video of ISIS and say, "Look, this is a video of Rohingya Muslims killing Buddhists in Thailand." And I would say, "No, no, that's ISIS in Syria in the middle of an execution."
The Price of Innovation
Sheera: They said, "No, look at this caption, it says right here in the caption that this is happening in Thailand, these are Rohingya killing Buddhists." And I said, "No, no, no, I can guarantee you that is a video from Syria. I used to report in Syria." They wouldn't believe me because the internet was telling them something else. And you can see how quickly that stokes anger, and hatred, and outrage.
Eric: What do we do?
Rachael: Well, I guess that's the question though. It's the price of innovation and the price of connectivity. Like WhatsApp, you can message people in Singapore. I have a friend in Singapore and we always WhatsApp and it's so easy. But yes, what is the responsibility long-term because you see such divisiveness. And there were decisions made during that one election that we all remember. How do you not fact-check political ads?
Rachael: And you're seeing these ramifications, and clearly something needs to be done, but what's to be done? I mean, there's the federal government, there's big tech. Is it even possible for them to come to an agreement on kind of a way forward so we don't end up in a Blade Runner 2049 dystopian future in five years?
Sheera: Now that the book has come out, I get asked that question a lot. I think there are certain things that are obvious. Early on in the 2015 campaign, Zuckerberg makes this call to give Trump a carve-out to say things that he would not otherwise allow.
Eric: Because he's a political candidate running for the office of the president. He gets special treatment outside of their stated policies.
A Domino Effect Decision
Sheera: Exactly, that is a domino effect decision. The people sitting in the room that day, Mark Zuckerberg and his team, and Sheryl Sandberg, I should add, in their minds, they were making a one-off decision. Donald Trump says he's going to ban Muslims. They say, "Yes, that's probably hate speech. We might remove it if the average person said it. But because it's Trump saying it, and he's a candidate, as you just said, we're going to allow it."
Sheera: In their imagination, they're sitting there being like, "Oh, well, Americans are going to be outraged. They're going to say, this is insanity, we can't ban Muslims from America. We have to let Americans see this because so many Americans are going to get angry when they see this." They cannot imagine that what's actually happening is that their algorithms are boosting it, making it the most popular post on Facebook. Even if you disagree with it, you're probably engaging with an angry emoji.
Sheera: Even though they have all these examples from history that show them, "No, this is exactly what's going to happen. You're actually going to make Donald Trump the most popular person on Facebook because he always says things that are so outrageous that people have to respond." They can't see that for some reason. So, yes, they essentially succeed in making him the most popular and probably the most powerful person on Facebook other than Mark Zuckerberg.
Sheera: They create a system by which populist leaders in other countries, such as the Philippines, Hungary, and Turkey, can start using Facebook themselves as a way of messaging their own populations, making sure that they are dominant within the local ecosystem of Facebook in their countries.
[22:46] Life and Death
Rachael: It's truly frightening because it's life and death. That one person can make a unilateral decision and it literally impacts lives.
Sheera: Again, I go back, here in America, we have safeguards.
Eric: We do?
Sheera: Well, some. We have a few kinds, we have NGOs, we have civil society groups, we have others.
Eric: I'm with you.
Sheera: We're not in a country that's been a military dictatorship for 30 years, or an autocracy, or a monarchy for that matter. Many people I've spoken to since the book came out have been like, "Oh, well, Trump's off Facebook now and he's banned for two years." I'm like, "Well, yes, but the Philippines is about to have really important elections. Hungary's about to have important elections, and India has more elections coming."
Sheera: There are countries that cannot wait two more years for Facebook to make up its mind on whether or not it's going to permanently ban Trump, and what it's going to do about political figures all over the world. There are countries all over the world where this is still having life or death consequences. To not address it, not figure out systematically what Facebook's approach to political speech is going to be, is incredibly dangerous.
Eric: But they are, and this is something I found myself struggling with in the book. It's this constant yin and yang, this tension: who determines what the right decision is? I mean, very early on, was it Joel Kaplan?
Sheera: Joel Kaplan.
Eric: Says something like, "Don't poke the bear," as they're going through 2016, the election time. Was that the right move or was it time then to do something and make a stand?
How Do You Decide to Ship It
Eric: But if you're going to make a stand, put yourself in Zuckerberg's shoes, just for a second. How do you decide? You've got employees saying one thing, you've got advisors saying other things. It is a country where we have an open and free press and you have freedom of speech. Should you be allowed to talk about things, and as a citizen, use a platform to write something that is factually just wrong? Now, I totally, fundamentally, disagree with it, but I don't know where to draw that line.
Sheera: What's interesting, and what we try to show in this book, is that Mark Zuckerberg struggles with these decisions. He listens to people like Joel Kaplan and he listens, well, sometimes, to Sheryl Sandberg. Though he kind of seems to stop listening to her by the end of the book. He makes bad calls over and over again on what the right thing is. And yet, and this is what I would pose back to you, despite making bad calls, he is not only still in power, he has consolidated power. You would think that given the track record he's got going, he would say, "Maybe I'm not the one that should be making these calls."
Eric: Oh, come on! One of the richest guys in the world, one of the most famous companies in the world, think about his psyche. I've never met the guy.
Rachael: I think he was a wartime CEO.
Eric: I'm better than anybody.
Rachael: That's really fascinating.
Sheera: Sure. But objectively, if you had a moment of hubris, wouldn't you sit there and say ...
Eric: Who does that?
Acknowledging the Mistakes
Sheera: Well, mensches. People that have a moment to acknowledge the mistakes that they've made, which he, at least the PR line is that he acknowledges the mistakes that he's made. But our book came out two weeks ago, roughly, let's say.
Eric: Rachael read it two weeks ago in one sitting.
Sheera: Mark Zuckerberg declined to be interviewed for the book. But he did not decline to be interviewed after the book was published. He knew it was a best-seller, he gets these reports. The first interview he gave after the book's publication was to Casey Newton, a fantastic journalist who writes a newsletter about Facebook. And do you know what he spent the entire time talking about? How he wants to turn Facebook into the metaverse.
Eric: You better define that for most of our listeners.
Sheera: Right, imagine the movie Ready Player One. It's like a virtual world where you live your entire life on Facebook. He's just had a book come out about Facebook which shows really systemic problems that are ongoing, really serious, and a threat to democracies all over the world. And he doesn't address any part of that. He gives an entire interview where he talks about the metaverse.
Eric: I get that, I really do. How old is he?
Sheera: He's 37.
Eric: So he's 37, he's relatively young. He's not Kissinger's age or anything, with lots of wisdom and experience. He's one of the richest, most powerful people, he's incredibly intelligent. Am I good so far?
It’s a Really Bad Idea to Ship It
Eric: Very few people have ever said no to him. Even his parents sent him off to boarding school, if you remember. Very few people have probably ever said, "That's a really bad idea," or "Have you thought about this?" How often do you think he's been challenged? I get that, from his perspective. I'm not justifying it, but I get it. That's the way he thinks.
Sheera: It's incredibly dangerous to the rest of us though.
Eric: Of course, that's why we have diversity. That's why, one, we don't have dictatorships at least in the United States. To protect ourselves from ourselves and to make ourselves better.
Sheera: I think that's, again, where we wanted to kind of leave readers: this idea that people thought Sheryl Sandberg was a safeguard. People really had their hopes banked on Sheryl Sandberg. But yes, we see a lot of her power diminished at the end of the book. And even Sheryl Sandberg, ultimately, I know she's a woman and I know she wrote Lean In, but she's been there for a decade. There are not many women up there in the executive ranks with her.
Rachael: No, yes.
Sheera: Up until two years ago, they had more men named Chris who were executives at Facebook than they did women. That's a problem.
Eric: Women in total, not women named Chris.
Sheera: Yes, women.
Eric: Women in total, yes.
Sheera: Women in total. That's a problem. And if you set aside women versus men for a minute, because I think that's one part of it. Diversity is also about diversity of background and experience.
Eric: And thought and yes.
A Runaway Success
Sheera: And you have this whole suite of executives who are like Mark. He went to Exeter, the most prestigious boarding school in America. Then he goes to Harvard and then he's basically a runaway success in Silicon Valley. This is not a person who's seen the dark side of the world. I mean, you have Sheryl Sandberg.
Sheera: It's funny when they make mistakes. I can't say which person it was at Facebook because it was an off-record interview, but I had an off-record interview with someone very, very senior there once, who said, "We failed to see around the dark corners, we don't think to look around corners." And I kept thinking, "Well, of course you don't. You're not people who have ever had anything really bad happen to you."
Eric: I'm not going to say that, but I'm sure, death has happened or something like that. But they don't have people saying no, they haven't had that experience.
Sheera: When I say bad, I say this because I mentioned this in our pre-interview, we were chatting a little bit. I was a reporter in the Middle East for 10 years, I covered war, and famine.
Sheera: Really horrible things that happened.
Rachael: On the front lines.
Sheera: You think differently when you've been exposed to that, when you've been exposed to that level of human misery and human tragedy, and also to what people do when they're truly desperate, when they're in those situations. That is a different way of thinking about the world. I also had this unique experience where I went straight from covering the Middle East to moving to Oakland and covering tech companies.
[30:29] What the Bad Guys Will Do
Sheera: It was really jarring for me. I look at things and my first thought is like, "How will it get misused? What are the bad guys going to do?"
Sheera: And my husband laughs at me, I often say, "Well, what are the baddies doing?" That's something I use with my toddler, I'm like, "How is the baddie going to use it?" And I'm like, no one at Facebook has ever taught a kindergarten class and looked at the class and been like, "What are the baddies doing?" It's just not where their head goes.
Eric: No, it's the art of the possible, the future. And that's a good thing also, that's where the diversity really helps.
Sheera: You need both. You can't run a company only thinking about the bad, but you need really strong voices in the room. I'm not talking about the token security analyst in the corner who's like, "Oh, look at me." No, you need executives who are empowered, who are in the room saying, "Here's how bad guys are going to use this."
Eric: As we're talking through this, and from some of the work I've done over my career, I almost equate it to nuclear weapons proliferation and use. Nuclear power can be used for good or bad; the airplane, good or bad. I mean, hell, you can go back to fire, good or bad. If you're in IT, IT, good or bad. And I agree 100% with you. I think there's a responsibility here. You create this amazing capability, and, to try not to butcher this, what will the baddies do with it? How will they misuse it, abuse it to harm people?
Where the Book Ends
Sheera: Yes, and that's the thing, again, with the book. I probably shouldn't talk about where the book ends. I want people who listen to this actually to buy the book.
Eric: If you get them to start reading it, you'll be okay. You've got to get them to start, yes.
Sheera: But I think that one of the places where we end is with those voices in the room who were doing that. Whether it's the people who founded WhatsApp or Alex Stamos, the people who were brought in to do that all ultimately left Facebook. They lost those outside voices because nobody was listening to them, or they were pushed out.
Eric: Or they couldn't change things, and at some point, you're beaten down. It's no fun going to work when you can't change anything.
Rachael: Yes, you're deemed an alarmist. They just write you off as an alarmist, and then it actually comes to pass. So it's heartbreaking. Ostensibly, this platform is enabling nation-state attackers and other groups with ill will. Coming back to the security standpoint, a lot of the talk is about disclosure. What is the responsibility of someone like Facebook? I think they had found some Iran-based hackers leveraging the platform to target US military personnel, things like that. It's quite an enabling platform, as we talked about. So what does that lens look like for folks like Facebook or Twitter or otherwise? What's their responsibility here?
Sheera: Yes, I'll say that Facebook's security team is fantastic. They have really developed one of the best capabilities of any security team in terms of finding nation-state attackers. Part of the reason is that the executives of the company have given them that mandate.
A Team of Experts
Sheera: They've said to them, "Here's a team of experts. They come, most of them from government intelligence backgrounds or really specific country expertise in Iran and Russia. And we want you to spend a lot of time looking for these attackers." Now we get these regularly published CIB reports, coordinated inauthentic behavior.
Sheera: It is an incredible thing that they're doing, not just here in the United States but, I would say, globally, because a lot of these CIB reports are not about the US. They're about Russian attempts at spreading disinformation in Ukraine, or Iranian campaigns targeting Israel, constantly. So there's a lot there that is helpful. I think where Facebook is still struggling is actually where we started this conversation: what do you do about misinformation? About the right of Americans to say things to other Americans that might be categorically false, but it's still their right?
Sheera: And I think there, where we go back to, is this question of, okay, people have the right to say things to one another. I'm not going to challenge that, I'm a journalist, I believe in free speech. There's a great quote by Renee DiResta, which is that freedom of speech is not freedom of reach. That's where I get stuck, which is, say what you want, but why does Facebook need to amplify it?
Sheera: You don't believe in that, fine.
Eric: Where do you draw that line? I can publish it in the newspaper.
Sheera: I draw the line on algorithmic recommendations and amplification. I ran an experiment last week where I joined a group. Again, I live in Northern California, so Facebook knows my geography. There are a lot of anti-vax people where I live.
Should This Group Exist?
Sheera: I joined a group for natural cures for the common cold because I wanted to see what would happen. Within one click, Facebook was pushing me into an anti-vaccine group.
Sheera: The next group recommendation was one claiming vaccines cause autism in kids. So that is where I get stuck. It's one thing to say, "Should this group exist? Whatever, freedom of speech, it can exist." But why are you recommending that people join these groups when your own studies tell you that it is dangerous to push people into them?
Eric: Okay, so I want to get your opinion here, because you are clearly more of an expert than Rachael and I, at least me. What if Facebook had pushed you to the CDC, or the NHS in the UK, a reputable source on vaccinations?
Sheera: That's what they should do.
Eric: You're good with that?
Eric: I am too. I just don't know where that line is.
Sheera: In fact, that's what Facebook's policies are. The tricky part is that Facebook technically tells reporters all the time, "Well, our policy is not to push people towards anti-vaccine groups or conspiracy groups." And yet that's still what their algorithms are doing. So even when they decide as a company, they're not going to do something anymore, they still struggle with it. That's what's so hard for me as a reporter. I'm like, "Well, you tell me you're not going to do this thing. But I can see through my own experiences at Facebook, that you're still doing these things. You're still pushing people towards these groups that are really dangerous, especially in the time of pandemic to be pushed into."
It’s All About the Algorithms
Eric: Why do you think so?
Sheera: Because that's what their algorithms are telling them.
Rachael: It's all about the algorithm, yes.
Sheera: Because that's what people join. I also understand that. Look at the studies that have been done by psychologists about conspiracy movements. I'm using the word conspiracy quite broadly here; I'm not sure whether the anti-vaccine movement is conspiratorial or not, but for a minute, let's just classify it there. Psychologists have shown that the biggest indicator of believing in a conspiracy is prior belief in a conspiracy. So whatever it is you're joining, whether it's, I don't know, flat earth or whatever, just name your own.
Sheera: You're going to automatically be pushed into other conspiratorial groups, because that's what every algorithm shows human psychology wants. It doesn't mean that's what's good for you. And here I use an analogy; I think it's a little extreme. It's not drugs, it's not heroin, it's sugar. It tastes really good. It's not good for you, but it tastes really good. So you have a little bit, and then you want it stronger, and you want more.
Sheera: Unless you have a dentist sitting next to you, the CDC is the dentist in this scenario. "No, bad, cavities. Remember the last cavity you got that really sucked, you don't want more cavities."
Rachael: Exactly. I remember in the book, in one of the hearings he was in, there was talk of breaking up Facebook. And yet the flip side of that, in that interview, is the metaverse and creating this world.
[38:26] To F**k It or to Ship It is the Only Reality
Rachael: That's truly frightening if you're cascading this across the entirety of one's being, in every aspect of your living life. Then this becomes the only reality you know. It's truly frightening and you could imagine if you're young and that's how you're raised. I would be so scared for children because then, are you really making decisions for yourself? That's a critical part of becoming a person.
Eric: You don't learn that critical thought process.
Rachael: Exactly. That's what truly frightens me. But how do you regulate that, I mean, who's looking out for that, in the name of convenience?
Sheera: Yes, that's a great question.
Eric: That's the problem.
Sheera: I think about that a lot. I have two small kids, and I spend a lot of time thinking about what misinformation is going to be like for my kids, with AI and deepfakes. They're going to see things that look really real but are not. We need to start media literacy right now, including here in the United States. When we were kids, we would go to the library, learn how to use the card catalog, and find accurate sources of information. How do you implement a program like that now, so that we're not dealing with a problem 20 years down the line, when it's totally out of our control and beyond our capabilities to even begin this kind of education?
Eric: You're coming at it from the perspective of a very well-versed, educated person in this space. Imagine somebody in Syria who's dodging bullets and bombs, just trying to survive and find food. They're never going to get there. But I will tell you.
A Five-Step Process
Eric: I printed something that I've used in the past for this discussion from CISA, the War on Pineapple. I used it in an op-ed piece I did once. CISA has a five-step process describing how foreign nation-states use misinformation and disinformation to skew readers.
Eric: So from an education perspective, go get educated. Go understand how people, or the algorithms as we've been talking about, may try to skew you in a certain direction, and then go find those reputable sources. We need to train people on how to find reputable sources even more than critical thinking. At least read what the experts are saying: determine who the experts are, and read that.
Sheera: It's funny, in some ways I don't think the US understands its power in this. Because when the US starts these kinds of programs, so many countries follow suit. I'm thinking of countries across Latin America right now that are struggling with this as well. By the way, Singapore and Estonia are the only two countries in the world that currently have media literacy programs as part of the curriculum for small kids. They're already seeing tremendous results in how smart kids are getting about what they share online and how they use social media.
Rachael: That is a great point, Sheera, because a lot of these countries are asking, "Where do we start? How do we move forward?" Then the US provides that path forward, and you can learn from it. That's so much of the information sharing we talk about in security, which is so valuable. It's the first domino to push down, and that's how you get there.
The Arab Uprisings of 2011
Eric: Sheera shared at the beginning of the discussion, I think it was about Myanmar, that an individual was saying, "Well, it came from the US," so they're inclined to believe it. It's scary.
Sheera: This is a whole chapter that got cut from the book, about the Arab uprisings of 2011. A lot of pro-democracy activists all over the world used Facebook because they thought they would be safe there. That's what we saw in Tahrir Square. Ultimately, just a few years later, when the military coup happened, they began by arresting LGBT groups in Egypt. Then they moved on to the Muslim Brotherhood. All these Egyptians who were getting arrested were saying, "But I put it on Facebook, and Facebook is American. Why wasn't I protected?"
Rachael: Wow, that's so scary.
Sheera: I mean, part of the power of a company like Facebook is that it is American, and it has that shine, that gleam to it. But, again, there isn't that literacy of, "Oh, well, if I put something on here, my own government can use it against me. There is no guarantee that it is real; there is no guarantee that it is fact-checked by anyone." Those are really important things if you're Facebook and your motto is Connect the World, which is maybe still their current motto.
Sheera: You're not connecting the world responsibly if you don't teach people that things they see on Facebook may not be real. It might be hate speech, and it might lead them down a path of misinformation.
Rachael: It's to connect the world at any cost and be the first one to do it.
Facebook Is a Great Platform
Rachael: The book is about Facebook, which I think is a great platform for discussing the challenges facing societies these days, but this is really about any social media platform. I just saw an article, actually it was today, saying it's not only the big platforms that spread misinformation from vaccine skeptics. There are a lot of organizations out there, large and small, that have similar responsibilities, with Facebook arguably the biggest.
Sheera: Yes. We wanted to focus on Facebook because it was the biggest. People often ask us, "But what about YouTube? What about Twitter?" I'm like, "Yeah, absolutely." Facebook's the biggest; across its family of apps, we're talking about over three billion people using their products. This is bigger than the Catholic Church at the height of its power, ultimately.
Eric: A third of the world, maybe, population-wise.
Rachael: They should lead by example though. Isn't that the responsibility when you're number one, that you do lead responsibly and that others will follow.
Eric: Is it, or is it just sheer profit and growth? I mean, it's a real, legitimate question.
Sheera: They're at a point right now where they're a trillion-dollar company, one of the first tech companies to reach that milestone. Do you say, "It might cost us a couple billion, but we won't be first in that market," or, "We'll grow a little bit slower"? At what point do you make that calculation of, "We're okay with our growth slowing down a little bit if it comes at the cost of being more responsible about people and setting an industry-wide standard"? That would be the other thing: they're among the only ones in a position to set an industry-wide standard.
[44:39] How Much Money is Enough to Ship It?
Rachael: Exactly, and how much money is enough.
Eric: You'd have to adjust goals, though. Your goals have to change from being the biggest, the best, the most profitable, whatever it may be. Connecting the world and doing it responsibly puts some drag on the system. That's a cultural change. In my 25 years of experience in the business world, it's been, "Hey, I need you to go fast, but not too fast. I want you to do it carefully." There's some tension there.
Sheera: Yes, absolutely. It would mean going against the entire ethos of Silicon Valley, which I'm sure isn't easy for them.
Eric: Which is tough.
Sheera: It's tough, but I don't know, are you big enough, are you rich enough to finally do that?
Rachael: Right, well, exactly. We've seen enough bodies littered on the highway, to use an analogy or euphemism, whatever that is. I'm hungry for lunch. But there's enough pointing at us saying, "You have to do this." We've seen all the problems that have happened, and lives lost. Literal lives lost when you're not thinking this way. At some point, it's just wholly irresponsible.
Eric: I almost think it's a bigger challenge. To be the biggest, the baddest, to grow the fastest, that Silicon Valley drive, I understand it. I've worked for Silicon Valley companies for my whole career. But I think the bigger challenge is to do it responsibly, "I want you to be the biggest, I want you to be the best. But I want you to do it in this manner. I want you to do it safely," or whatever it may be.
Ship It As Quickly As Possible
Eric: We see the same thing in cybersecurity companies and software companies: they want to ship products as quickly as possible, and they don't put the money into security that they should, all of them. Why? We've got these material goals; we've got these goals as organizations to get it out there. That's the higher challenge. Can you survive, though? Can you keep up? That's the question people struggle with.
Rachael: Well, if you're a trillion-dollar company, yes, you're good. You're good. Yes, I'm going to say that. I do want to be mindful of Sheera's time because you're on vacation. So, one last question. I would love to read part two. Just throw your notes and outtakes into a part-two book. I don't care what they look like, they could be handwritten notes, but I want to see those other 400 pages.
Sheera: You want to see my whole chapter of the Arab uprisings of 2011?
Rachael: Absolutely, I'm hungry for that book. So let me know when it's out there because I'll be the first one to buy it for sure.
Sheera: Thank you so much.
Rachael: Awesome, all right, we'll let you get back to your vacation. Thank you so much Sheera for joining us today. This has been a wonderful conversation. Again, congratulations on the book. It's a wonderful read.
Eric: This made you think, this was a thought piece today.
Sheera: Oh, it's always my goal.
Rachael: All right, well with that, thank you for joining this week's episode. Until next time, be safe.
About Our Guest
Sheera Frenkel, NY Times technology reporter and co-author, with Cecilia Kang, of the riveting must-read book “An Ugly Truth: Inside Facebook’s Battle for Domination,” joins us for this week’s podcast. Sheera takes us through her many years of reporting in the Middle East and Silicon Valley to paint a picture of the real-world consequences that newfound access to the Internet (aka Facebook in Myanmar), the lack of any mechanism for distinguishing mis/disinformation campaigns on social media, and absolute trust in Big Tech can have on cultures and society.