• Censorship and Freedom of Expression
    Press Freedom and Digital Safety
    Ela Stapley, digital security advisor at the International Women's Media Foundation, discusses strategies for the safety of journalists as they report on the 2024 election cycle. Tat Bellamy-Walker, communities reporter at the Seattle Times, discusses their experiences with online harassment and best practices for journalists on digital safety. The host of the webinar is Carla Anne Robbins, senior fellow at CFR and former deputy editorial page editor at the New York Times. A question-and-answer session follows their conversation. TRANSCRIPT FASKIANOS: Welcome to the Council on Foreign Relations Local Journalists Webinar. I’m Irina Faskianos, vice president for the National Program and Outreach here at CFR. CFR is an independent and nonpartisan membership organization, think tank, and publisher focused on U.S. foreign policy. CFR is also the publisher of Foreign Affairs magazine. As always, CFR takes no institutional positions on matters of policy. This webinar is part of CFR’s Local Journalists Initiative, created to help you draw connections between the local issues you cover and national and international dynamics. Our programming puts you in touch with CFR resources and expertise on international issues and provides a forum for sharing best practices. We are delighted to have over forty journalists from twenty-six states and U.S. territories with us today for this discussion on “Press Freedom and Digital Safety.” The webinar is on the record. The video and transcript will be posted on our website after the fact at CFR.org/localjournalists, and we will circulate it as well. We are pleased to have Ela Stapley, Tat Bellamy-Walker, and host Carla Anne Robbins with us for this discussion. I have shared their bios, but I’ll give you a few highlights. Ela Stapley is a digital security advisor working with the International Women’s Media Foundation. She is the coordinator of the course “Online Harassment: Strategies for Journalists’ Defense.” Ms. Stapley trains journalists around the world on digital security issues and provides one-on-one support for media workers in need of emergency assistance. Tat Bellamy-Walker is a communities reporter at the Seattle Times. Their work focuses on social justice, race, economics, and LGBTQIA+ issues in the Pacific Northwest. Tat also serves on the National Association of Hispanic Journalists LGBTQIA+ Task Force, as a member of the Seattle Times Committee on Diversity, Equity, and Inclusion. And Carla Anne Robbins is a senior fellow at CFR and co-host of the CFR podcast “The World Next Week.” She also serves as the faculty director of the Master of International Affairs Program and clinical professor of national security studies at Baruch College’s Marxe School of Public and International Affairs. And previously, she was deputy editorial page editor at the New York Times and chief diplomatic correspondent at the Wall Street Journal. Welcome, Ela, Tat, and Carla. Thank you very much for being with us today. And let me turn the conversation now over to Carla. ROBBINS: Irina, thank you so much. And, Ela and Tat, thank you so much for doing this. And thank you, everybody who’s here today. We’re going to chat among us just for about twenty, twenty-five minutes, and then questions. You’re all journalists; I’m sure you’re going to have a lot of questions. So, Ela, can we start with you by talking about the threat environment, as we national security people refer to it? 
The IWMF announcement on your safety training—and I want to talk about that—referred to, quote, a “spike in physical and digital violence” directed against U.S. newsrooms in particular. And it said, “This year alone, thirty journalists have been assaulted and eight have been arrested in the U.S., all following a surge of anti-media rhetoric.” And you also said that the U.S. currently ranks forty-fifth on the World Press Freedom Index, down from thirty-second just a decade ago; and also that this abuse disproportionately affects women and diverse journalists, who are often reluctant to speak out for fear of jeopardizing their careers. Can you talk a little bit about the threat environment here in the U.S., what’s driving it, and the different forms it’s taking? STAPLEY: Yeah, sure. So I’m Ela Stapley. I’m a digital security advisor. So when I look at the threat environment, I’m looking at it from a digital safety standpoint. What we do see in the U.S., and we have seen now for a number of years, is a massive uptick in online abuse—or online violence, as it’s now called, in order to get across the seriousness of the situation. So when we’re talking about online abuse/online violence, what we’re really saying there is attacks on journalists that are now so serious that it’s really limiting their ability to do their work. And it’s really having—I don’t say this lightly—an impact on democratic conversation. So one of the biggest issues that you see in the U.S. is this, along with common tactics that are used with online harassment and online violence. And that includes the publishing of journalists’ personal information online, known as doxing. This includes the home address or personal contact details, such as a personal email or personal phone number, for example, with an intent to do them some kind of harm. And we do see that being used against journalists in the U.S., especially if they’re covering particular beats. That includes kind of far-right or alt-right groups, for example; one of their tactics is doxing journalists online, or doxing people who talk about them in a way that they don’t agree with. So that is one of the biggest threats we’re seeing. And we’re in an election year. I do think we did see this during the last election. There will be an increase in online abuse and harassment during that time, and all the other threats that come with it, which include doxing but also things such as phishing attacks, for example; malware attacks; and possible hacking attacks of accounts, which could also be something that we see an uptick in. Other threats that journalists are facing: if they’re going out and they’re covering the election—so some of these rallies or places where they’re going—the chance of a physical confrontation might be quite high. So you’re seeing there kind of damaged equipment, which sounds like a physical safety issue but is actually a digital security issue as well. So journalists are quite often carrying their personal devices instead of work devices; that’s very common, especially for freelancers. And you know, if you haven’t backed those devices up, what about the content on them? Or if you’re detained and those devices are searched, for example, what about the content that you have on them? How safe is that content? Not very. And do you have sensitive contacts on there, or content that could put you or your sources at risk? That is something that journalists, you know, need to be thinking about, I would say. And we do see that across the U.S.
Obviously, some areas, some states may be more complicated than others. ROBBINS: So I want to get to Tat and talk about your experiences and that of your colleagues, and then want to talk to both of you, because it seems like there’s this intrinsic tension here because—I mean, I’m going to really date myself—back in the day, when I started in the business, the idea that we would not want to share our emails or our phone numbers—we wanted people to be able to reach out to us; we wouldn’t want to hide from people who could be potential sources. So I understand there has to be, you know, a separation between the public and the private, because the private can really be a vulnerability, but it’s certainly a very different world from the world in which—when I started. And I will add I started writing with a typewriter back in the—Tat, there used to be typewriters. For you young’uns. OK. So, Tat, can you talk about your experience and that of your colleagues, in Seattle but also the people that you deal with in the groups that you work with? BELLAMY-WALKER: Yeah. So I’ll talk a little bit about, like, my, like, personal experience. So last year I was covering, like, the local response to, like, the national, like, uptick in anti-drag legislation, and I interviewed, like, several, like, trans drag performers about, you know, how that had an impact on them. Like, it severely, like, limited their—you know, in terms of, like, violence, they were experiencing violence, and it was, like, difficult for them to navigate this increasingly, like, hostile climate where anti-drag, like, legislation was just going through the U.S. So from me, like, writing that story, I started to get, like, a lot of, like, transphobic emails targeting me and my sources. And then from, you know, the, you know, transphobic emails and messages, later on, there ended up being, like, this conservative Facebook page that had also seen, like, the stories that I cover. And I’ve been covering, like, LGBTQ issues for a very long time, including writing, you know, personal essays about my experiences as, like, a trans person. And they—like, they wrote this whole—this whole Facebook post about, you know, like, calling me, like, a girl, like—it was, like, this whole thing. And they included, like, you know, that I work at the—at the Seattle Times. Like, it was, like, this very intense situation. And it ended up escalating even more to the Blaze writing a story about me. And it just—like, it just escalated from there. You know, I wrote the story. Then, you know, there was the conservative Facebook page. And then, you know, it ended up in a story being written about me. And so, like, things like that are very—are very serious, and really do have, like, a negative impact on how, like, trans journalists, like, do our work. And for me, like, at that time, it did make me feel pretty, like, traumatized to see, like, how, you know, my story was, like, taken—like, it just felt like it was being used as this—like, this negative force, when I was trying to write about, like, why these drag performers were pushing for their craft, and why they, you know, felt so intensely about pushing for their craft, at a time of such hostility targeting drag performers. So for me, at that time, what was most important was to, like, assess, like, my online presence, and see how far was this going. Like, how far was this harassment going? So I made sure to, like, lock down my accounts.
You know, that was very important for me to do. Also, having a friend document the abusive language that was coming up under the different post was very helpful. And just kind of like logging what was happening to me on, like, a day-to-day basis. Yeah, so that's essentially what I experienced. And it made me want to—I guess, in some way it made me want to make sure that I'm very careful about the information that I put out there about myself. So I have since, like, removed, like, my email address, you know, from the Seattle Times website. I try to be pretty careful about what I put online about myself. Yeah, so that—I would say those—that is how that had, like, an impact on me and my role in journalism. ROBBINS: So before you wrote that story—because, of course, you were writing about people being harassed because of what they did—did you think about the fact that you were going to be harassed for what you did in writing about them? BELLAMY-WALKER: At that time, I did not—I did not think about that, about how, like, writing about this story would have an impact on me. At that time, I did not think about that. But now, like, in hindsight, I know that it's important to be prepared for those, like, online attacks. And, like, vitriol and in everything. But yeah, it just—like, I didn't realize, like, how far it would go. Because at that time I was also pretty vocal about, you know, the lack of diversity of trans journalists in—just in journalism and the industry in general. So that also caught fire online with folks, you know, targeting me for that as well. So, I feel like all of those situations started to make me, like, a very big target for—you know, for these for these folks. But I know now in the future that it's important for me to prepare for these online attacks and everything. ROBBINS: So you're talking about you preparing. Ela, I want to go back to the training that IWMF does, and this handbook, that we're going to share with everybody, that you all have developed. Which has really, I think, absolutely fabulous worksheets. This is one which I have here, which is an online violence risk assessment, which talks about things like have you previously been targeted. You know, questions that you want to ask yourself, that newsrooms want to ask themselves before the work starts. Can you talk about the training that you all have done, and some of the—some of the things that take place in that training, that that makes—preemptively, as well as once things have happened? Some of them are big changes—raising awareness in the newsroom—and some of them are actually technical changes. Like some of the things Tat’s talking about, training people about how to even reel back their information. When I read this I thought to myself, God, there’s so much information out there. Is it even possible to pull that back? STAPLEY: Yeah, so, unfortunately, Tat’s story is pretty familiar to me. It’s a story I’ve heard many times. And what we used to see—so I’ll talk a little bit about how online harassment kind of came to be and where it is now, just very briefly. So it used to be in the newsroom, online violence or harassment was seen as, you know, something that happened to normally women journalists, so nobody really paid that much attention, if I’m honest with you. It’s only in the last few years that newsrooms have started more seriously to pay attention to online violence as an issue in terms of protecting their journalists. 
So online harassers were also seen as kind of just a guy in a hoodie in their basement attacking a person over and over. And that stereotype still exists. That person still exists. But now there’s a whole other layer, there’s a whole array of other actors involved, including state-sponsored actors, particular groups online who are hacking groups, but also other groups who feel very passionately about particular topics on the internet. And I use the word “passionate” there not in a positive sense but a negative one. So they have strong opinions about it. And they will target journalists that publish on these issues. And I think before we could predict who those journalists would be. So if you were covering particular beats you were more likely to get harassment. But now we’re seeing it as just a general attack against journalists, regardless of the beat. So if you’re a sports journalist, you’re likely to get attacked by sports fans. Equally, the journalists who are covering LGBTQ+ issues, anything to do with women, anything to do with race, are disproportionately likely to face attacks. And if they are from that community themselves, even more so. There’s a lot of academic research that’s been done on this. So in Tat’s situation, unfortunately, for me in my position, I would see Tat and I would think: This is a story that Tat’s covering. The likelihood of Tat getting abuse is incredibly high. Now, from our work with newsrooms, what we began to see is that newsrooms started to think about how they could better protect their staff. In some newsrooms, you know, that conversation needed to be had. But some newsrooms were reaching out to us proactively. And I have to say, the Seattle Times was one of those. And I have to give a big shoutout to the Seattle Times for their interest in the safety and security of their journalists. And we’ve worked very closely with the Seattle Times on this guide, actually. So part of the pre-emptive support is not only raising awareness with upper management, because if upper management are not on board it’s very difficult to implement changes, but also putting good practices in place. So the more you can do in advance of an online attack, the better it is for you. Because it’s very difficult to be putting best practices in place when you’re in the middle of a firestorm. So the more pre-emptive steps you can take, the better it is for you, as a newsroom but also as an individual journalist within that newsroom, especially if you fit into one of those categories that are more high risk. So we, at the IWMF, we've been working very closely with journalists. We started training journalists and newsrooms in data protection. So, how best to protect your data online. This is the kind of information Tat was talking about—your email address, your cellphone, your home address. But what we realized was the training wasn't enough, because after the training the journalist would go, well, now what? And the newsrooms would be, like, well, we don't have anything. So what we needed was policy. We needed best practices that journalists could access easily and, ideally, roll out fairly easily to staff. Now, I will say that a lot of content for this does exist. There are other organizations that have been working on this topic for an equally long time, and they do amazing work. But what we were hearing from journalists was: There's a lot of information and we need short, simple one-pagers that will really help us protect ourselves. And also, editors were saying: We need it to help protect our staff. So they didn't want to read a fifty-page document. What they wanted was a one-page checklist, for example. So the guide that we created came out of a pilot that we ran with ten newsrooms in the U.S. and internationally—and the Seattle Times was one of those—where we worked with the newsroom very closely, with a particular person in that newsroom, to think: What do they need and how could we implement that for them? In some cases, as at the Seattle Times, they created their own guide for online harassment. In some cases, newsrooms could only really manage to have a checklist that would help them protect staff data as quickly as possible. So it really depends. Different newsrooms have different needs. There's really no one-size-fits-all when it comes to protecting staff. I can't say to this newsroom, you need to do this. I can say, what is your capacity? Because a lot of newsrooms are overstretched, both financially but also in terms of people. And how many cooks are in the kitchen? Generally, the bigger the newsroom, the more difficult it is to roll out change quickly because you need more buy-in from different areas within the newsroom. And the most successful pre-emptive support we see is from newsrooms where there is what we call a champion in the newsroom. Someone who pushes for this. Someone who maintains that momentum and is also able to communicate with HR, for example. Because some support needs to come from HR. What do you do if you've got a journalist who needs time off, for example, because they've been getting death threats? Support from IT departments. Traditionally, IT departments in newsrooms are responsible for the website, for making sure your email is running. They're not generally resourced and trained in how to deal with a journalist who's receiving thousands of death threats via their Twitter feed. So getting newsrooms to think about that, and also getting newsrooms to think about the fact that you have journalists who are using their personal social media for work-related content. And you request them to do this, but you are not responsible for protecting those accounts. And that’s a real gray area that leaves a lot of journalists very vulnerable. So their work email may have all the digital security measures in place, helped along by their IT team, but their personal Instagram account or their Facebook account has no security measures on it at all. And that is where they will be most vulnerable. Because online attackers, they don’t just look at the journalist in the newsroom. They look at the journalist, the whole picture. So the data that you have on the internet is really your calling card to the world. So when people Google you, what they see is who you are to them. So they make no distinction there. There’s no distinction for them in terms of work and personal. So at the IWMF what we’ve been doing is really working with newsrooms to help them roll out these best practices, as best as possible, to put them together, to help them write them, and then to sit with them and try and figure out how they can roll it out. And some do it quicker than others, but there’s been a lot of interest. Especially now, during the election. ROBBINS: So, Tat, can you—what’s changed since your experience? What do you do differently now? BELLAMY-WALKER: Yeah.
I would say maybe like one of the main things that I—that I do differently is, like, trying to prepare ahead of these potential attacks. So that includes like, doxing myself and removing personal info about myself, like, online. So like signing up for, like, Delete Me, sending takedown requests to data broker sites, submitting info removal requests to Google. Sometimes that works. Sometimes it doesn’t. But trying to, like, take away that, like, personal information about myself. I would also say, locking down my accounts and using more two-factor authentication. For, like, my passwords in the past, I have just used very simple, easy-to-remember passwords. But I have learned, like, since the training that it’s really important to have a password that’s way more secure. Even for me on the go, I just want something that’s easy to remember. So using, like, a password manager, like one password. So that has also been helpful for me. And also paying attention to my privacy settings. You know, on, like, Facebook or Twitter. You know, making sure that it’s only me that can look up, like, my phone or my personal, like. email address. So that is helpful. And just generally, like, using the resources from IWMF’s online violence response hub. That has been very helpful as well, and making sure that I have a good self-care practice. And having, like, a team of folks that I can process these different challenges with, because unfortunately, like, you know, this won’t probably be, like, you know, the last time that I experience threats like this, given the nature of my reporting. So it’s really important for me to also have, like, a self-care practice in place. ROBBINS: So maybe, Ela, you want to go through some of that a little bit more deeply, although Tat sounds like Tat’s really on top of it. So these online data brokers, can you just—do you have to pay them to delete yourself? Or are they legally—you know, do they have to respond to a request like that? STAPLEY: OK. So let me start by saying that the U.S. has some of the worst data privacy laws I’ve ever seen. ROBBINS: We’ve noticed that before. STAPLEY: So it’s very difficult for a journalist to protect their personal information, just because so much information in the U.S. has to exist in a public-facing database, which, for me, is quite astounding really. If you buy a house—I don’t know if this is statewide or if it’s just in certain states— ROBBINS: Let me just say, as journalists, we are ambivalent about this, OK? On a certain level, we want to protect ourselves. But on another level, that’s really useful if a corrupt person is buying that house, OK? So we’re—you know, we’re not really crazy about the ability to erase yourself that exists in Europe. So we’re ambivalent about this. But please, go on. STAPLEY: Yeah, but I think from a personal safety standpoint it makes you very vulnerable. And the reason for this is that journalists are public-facing. So but you don't have any of the protection that is normally offered to kind of public-facing people. If you work in government, for example, if you're incredibly famous and have a lot of money, for example, you can hire people. So a lot of journalists don't have that. So it makes them very vulnerable. And they're also reporting on things people have strong opinions about, or they don't want to hear. And they're also very—they're very visible. So and they give—this gives people something to focus on. And when they start digging, they start to find more and more information. 
So and when I talk about journalists having information on the internet, I’m not saying that they shouldn’t have anything. Because a journalist has to exist on the internet in some form, otherwise they don’t exist and they can’t get work, right? So it’s more about the type of information that they have on the internet. So ideally, if I were to look a journalist up online, I would only find professional information about them, their professional work email, where they work, probably the town they live in. But I shouldn’t be finding, ideally, pictures of their family. I shouldn’t be finding pictures of their dog in their home. I shouldn’t be finding photos of them on holiday last year, ideally. So it’s more about controlling the information and feeling that the journalists themselves is in control of that information that they have on the internet, rather than people putting information on the internet about you. So data brokers sites, you’re very familiar with them. As journalists, you use them to look up sources, I’m sure. But people are also using them to look up you. If I was a citizen, never mind just a journalist, in the United States, I would be signing up to a service. There are a number of them available. One of them is called Delete Me. And they will remove you from these data aggregate sites. Now you can remove yourself from these data aggregate sites, but they are basically scraping public data. So they just keep repopulating with the information. So it’s basically a constant wheel, basically, of you requesting the information to be taken down, and them taking it down, but six months later putting it back up. So companies will do this for you. And there’s a whole industry now in the U.S. around that. Now the information that they contain also is very personal. So it includes your home address, your phone number, your email. But also, people you live with, and family members, et cetera. And what we do see is people who harass online, if they can't find data on you they may well go after family members. I've had journalists where this has happened to them before. They've gone after parents, siblings. And so it was a bit about educating your family on what you're happy and not happy sharing online, especially if you live or have experienced already harassment. So that's a little bit about data broker sites. We don't really see this in any other country. It's very unique to the United States. With all the good and bad that they bring. But in terms of privacy for data for journalists' protection, they're not great. Other preemptive things that journalists can do is just Google yourself, and other search engines. Look yourself up regularly and just know what the internet says about you—whether it’s negative, whether it’s positive. Just have a reading of what the internet is saying about you. I would sign up to get Google Alerts for your name, and that will alert you if anything comes up—on Google only—about you. And when you look yourself up online, just map if there’s anything there that you’re slightly uncomfortable with. And that varies depending on the journalist. It could be that some are happier with certain information being out there and some are less happy. But that’s really a personal decision that the journalist makes themselves. And it really depends on, what we call in the industry, their risk profile. So what do I mean by that? That’s a little bit what I was talking about earlier, when I was talking about Tat’s case. 
The kind of beat you cover, whether you’ve experienced harassment previously, or any other digital threats previously, who those attackers may be. So it’s very different the far right or alt-right, to a government, to, you know, a group on the internet of Taylor Swift fans, for example. So knowing who the threat is can be helpful because it helps you gauge how more or less the harassment will be and also other digital threats. Do they do hacking? Are they going to commit identity theft in your name? So getting a read on that is very important. Identity theft, a lot of groups like to attack in that way, take out credit cards in your name. So it’s quite good to do a credit check on yourself and put a block on your credit if you are at high risk for that. And you don’t need to have this all the time. It could just be during periods of high levels of harassment. For example, during an election period where we see often a spike in online harassment. Once you have seen information about yourself online, you want to take it down. If you are the owner of that information, it’s on your social media, et cetera, you take it down. The internet pulls through, it removes it. Please bear in mind that once you have something on the internet it’s very difficult to guarantee it’s completely gone. The reason for that is people take screenshots and there are also services such as the Internet Archive, services like the Wayback Machine. These types of services are very good at taking down data, actually, if you request. You have to go and request that they remove your personal data. So you may have deleted information from Google or from your own personal Facebook, but maybe a copy of it exists in the Wayback Machine. And quite often, attackers will go there and search for that information and put it online. So if somebody has put information about you on what we call a third-party platform—they’ve written a horrible blog about you, or it exists in a public database—then it’s very difficult to get that data taken down. It will depend on laws and legislation, and that varies from state to state in the U.S., and can be quite complicated. I’ve had journalists who’ve been quite successful in kind of copyright. So if people are using their image, they’ve—instead of pursuing it through—there are very few laws in place to protect journalists from this, which is something else that that’s an issue. If you do receive online harassment, who do you go to legally? Or maybe even it’s the authorities themselves that harassing you, in certain states. So maybe you don’t want to go to the authorities. But there’s very little legal protection really there for you to get that data taken down and protected. So once you’ve done kind of knowing what the internet says about you, then you just need to make sure you have good account security. What do I mean by that? That means having something called two-factor authentication turned on. Most people are familiar with this these days. They weren’t when I was doing this five years ago. Nobody had heard of it. Most people are using it now. Most people are familiar with this through internet banking, where you log into your account and a text message comes to your phone or an email with a code. Most online services offer this now. Please, please turn on two-factor authentication. There are different types. Most people use SMS. 
If you are covering anything to do with the alt-right, the far right, or hacking groups, or if you’re covering foreign news, I don’t know if there’s any here, and you’re covering countries that like to hack a lot, you want to be looking at something a bit more secure, such as an app or a security key. And then, yeah, Tat mentioned a password manager. The most important thing about passwords is that they're long. They should be at least fifteen, one-five, characters. And they should be different for each account. Sorry, everyone. And the reason for that is if you are using the same password on many accounts, and one of those services that you have signed up for gets hacked because they've been keeping your password in an Excel sheet on their server instead of in encrypted form, then everyone will have your password for your Gmail account, your Instagram account, et cetera. That's why it's really important to have different passwords for different accounts. How can you do that? Using a password manager. Or, it is statistically safer to write them down and keep them safe in your home. If you feel safe in your home, if you're not at risk of arrest and detention and you don't cross borders, statistically it's much safer to write them down. Obviously, don't stick them to your computer, but you can keep them somewhere safe in your home. Much safer than having passwords that are very short or reusing the same password on many accounts. That will prevent hacking, basically, which online abusers do like to do. So that's kind of a little bit of a very quick walkthrough on that. And we do have resources that we can send out which will guide you through that. ROBBINS: So I want to turn it over to the group. I’m sure you guys have questions. You’re journalists. So if you could raise your hands or put it in the Q&A, please. I’m sure you have many questions for our experts here. While you’re doing that, I’m just looking at the participant list. If not, I’m going to start calling on people. It’s something I do all the time. It’s the professor side of me that does that. Well, while people decide what they’re going to ask: Tat, since Ela said that your newsroom is actually one that’s been trained, and that’s actually quite good, how much support did they give you? And what sort of support? I mean, if something costs money, did they pay for it, for example? You know, have they—have they paid for a password manager? And what’s the—what’s the support they gave you, and what do you wish they gave you? BELLAMY-WALKER: That’s a really good question. Well, I would say, maybe the first thing is that they, like, you know, sent over the different, like, resources for, like, online harassment. And also, they recommended that I take out my, like, email address from the bio online. Since so many of my—since so many of the messages were coming to my email. But in terms of, like, money towards, you know, getting, like, a password manager or, you know, trying to delete some of this, you know, information about me from the internet, I was not provided, like, support with that. And I think just, like, in the future, I—you know, at the time of these stories I was very new to my position. And I think it’s, like, you know, it would be great if, like, news organizations, like, gave more trainings on online, like, risk. I think that would be very, like, helpful.
Like, alongside having a guide, like a training as well, for, like, new employees. I think that would be very helpful. ROBBINS: So sort of basic onboarding? I mean, this should be a required—a required part of it. Ela, are there newsrooms that are doing that now? They've just sort of included this as part of the onboarding process. STAPLEY: Well, ideally, it would be included in the onboarding process. A lot of newsrooms we’ve worked with have included it within the onboarding manual. But obviously, training is money. Newsrooms are short on money these days. So it can be quite difficult. And also, if there’s a high staff turnover, one of the issues we’ve noticed is you can create the best practice, you can train journalists, but journalists leave. New journalists come. Who’s staying on top of that and managing that? And that’s why it’s important to get HR involved from the beginning because maybe HR—in some newsrooms, HR is the editor and also the IT person. So it really depends on the size of the newsroom and how much support they can offer, financially as well. Delete Me is expensive if you add it up for many journalists within your newsroom, or other data broker removal services. 1Password actually does free accounts for journalists. So I would recommend that you have a look at that. They have 1Password for Journalism. And you can—and you can sign up for that. But obviously, it costs money. You know, and there are bigger issues newsrooms need to think about as well. So one of the things we encourage them to think about is how much support can you offer, and also to be honest about that support. So what you don’t want is a journalist who’s been doxed, their home address is all over the internet, they’ve had to move out, but they find out their newsroom can’t pay for that. So where do they go? Do they still have to work during that period, for example? So getting newsrooms to think through these issues in advance is really helpful for the newsroom because then they can say, look, if this happens we are able to provide this for this amount of time, and after that, you know, we can do this, this, and this. Some newsrooms can't afford to pay for journalists to move out of their home because their budget is too small, but maybe they can offer time off, for example, paid time off, or mental health support through insurance. Maybe they can start to build community networks in the newsroom. This is increasingly more important, as newsrooms—we were speaking about this earlier—are more remote. So people aren't coming into the office so much. So you're not connected to people as much. There's no kind of chatting to people around the water cooler like there used to be. So, you know, this kind of almost informal exchanging of information between journalists around, like, how to protect against issues or which issues are causing more conflict could be tricky. It may not be being picked up on, especially for younger journalists coming into the newsroom because, you know, they're just starting out on their journalism career. They don't have years of experience behind them. And they can often be vulnerable to attacks and, you know, I, on several occasions, spoke to editors at newsrooms, small local newsrooms, who had sent out, you know, like, a young reporter or just a junior reporter to cover a protest, which was actually a far-right or alt-right march. And then that journalist would be doxed.
And the journalist was completely unprepared for that. The newsroom was completely unprepared for that. Because they hadn’t assessed the risk. They hadn’t seen what the risks were, and they wouldn’t have known that doxing was a very common tactic used by these groups. So planning for that in advance is really important. That’s why risk assessment can be really great—a great tool. Getting newsrooms to think through risk assessment processes. ROBBINS: So we have two questions. One from someone named Theo. I’m not sure, I don’t have a list in front of me. Do you recommend any apps for password managers? This person says: I went to a seminar that suggested LastPass, and then LastPass had its data stolen a few months later. This has always made me actually nervous about password managers. I sort of wondered how secure they are. It seems to me every time I get my snail mail I’m getting another warning that, like, something else of mine has been hacked. And we’re going to give you a year of, you know, protection. Are there any of these apps—are they actually secure? STAPLEY: So, one of the things about digital security and safety that journalists really hate is that it’s a changing environment. So, something that was safe, you know, yesterday, isn’t safe today. And the reason for this is that tech changes, vulnerabilities open up. Hackers attack. Governments and other groups are always looking for ways to attack and find access. And people in my industry are always looking for ways to protect. So it’s always in a kind of constant change, which is frustrating for journalists because they just want to be told, use this tool, it’ll work forever, and it’ll be fine. And I’m afraid digital safety is not like that. So nothing you use that is connected to the internet in any shape or form is 100 percent safe, or any device. And the reason for that is there is always a possibility that there is a vulnerability in some area that could be leveraged. So what you’re looking for is really for journalists to stay up to date with the latest tech information. And you’re all journalists. So this, you know, it’s just research. So it should be pretty OK for you to do. The best way to do it is just to sign up to the tech section of a big newspaper, national newspaper, and just get it coming into your inbox. And you’ll just stay up on, like, who’s buying who, what data breaches there have been, who’s been hacked, what hacking groups are out there. You don’t have to investigate in depth. You just have to have a general read of what’s happening in the global sphere around this issue. I think Elon Musk's buyout of Twitter, for example, is a very good example of, you know, what happens when a tech tool that we all depend on changes hands, right? I know journalists who built their entire careers on Twitter and are now just really floundering because it's so difficult to access audiences and get the information. So in order to answer your question, no, nothing is 100 percent safe. But if you're looking to use something, there are certain things that you should look for. Like, who owns this tool? What are they doing with your data? And how are they storing that data? So in terms of password managers, for example, password managers are currently the industry best practice for passwords for the majority of people. There are certain groups within that who may be advised not to use them; most of them are the more high-risk ones. So they—password managers are keeping your passwords in encrypted form on their servers.
What does that mean? If someone hacks a password manager, they can't gain access to those passwords. In terms of LastPass, what we saw was security breaches but no actual passwords being accessed. But the fact that they'd had several security breaches made people very unsettled. And, you know, people have been migrating off LastPass, basically. It means their general security ethos may not be as secure as people want. So, you know, you have to move elsewhere. And that is for any tech tool that you use. So now maybe people aren't using Twitter; they’re moving over to LinkedIn. You may be using iMessage one day but may have to migrate over to WhatsApp another. So having many options in play is always—is always good as well. So don’t just rely on one thing and expect it to work forever in the world of tech. Generally, it doesn’t. ROBBINS: So, Theo, I’m just going to answer your question really quickly, because that’s one that I actually know something about. Theo asked whether there are any suggestions—and Theo, I believe, is Theo Greenly, senior reporter at KUCB. Suggestions when finding/choosing a fixer on a reporting trip, especially abroad? Questions to ask or things to look for when initially assessing risk before a trip. I would just say, for finding a fixer, find somebody who’s worked in that country already and ask their advice. That’s the only way you can do it. It’s just—the same way that, if you’re going down a road and wondering whether or not there are mines on that road, you ask people who know. There’s, like, no—you just have to rely on the kindness of people who’ve already worked in that environment. And it’s just—that’s what I did for years and years and years working abroad, is that I always relied on people who knew more. I can tell you the first trip I had was in Haiti. The overthrow of Baby Doc. Yes, I’m that old. And I was flipping out. And I called my husband, a very experienced foreign correspondent. And he said to me, find Alfonso Chardy from the Miami Herald, and do everything that he’s already doing. He was completely right. And that’s how I learned how to do it. So that’s—you know, there’s no secret here. It’s just find more experienced reporters. And they’re usually really kind, and they’re really, really helpful. So there’s a question from—is it Steve Doyle? StDoyle. What suggestions do you have for journalists facing physical threats? How should journalists be prepared for that? Ela, Tat? I don’t know if you—this is focused on digital, but do you guys—have you heard of any training? I know that when my reporters at the Journal went overseas, they had a lot of training on security, particularly the ones who went to Afghanistan and Iraq. And we had to pay for it. We went to security companies that trained them. Have you heard anything about people being trained for physical protection in the United States? STAPLEY: Yeah, the IWMF is currently actually on their U.S. safety tour. So they’re visiting states and training journalists in physical and digital safety. So you can go to the website and check that out. They do the HEFAT training as well. I’m not a physical security expert, so I can’t really speak to that. But, yes, there are organizations that offer this. But there are a lot more that are paid for than are actually free. But, yes, there are organizations out there that do offer this type of training, press freedom organizations. ROBBINS: Tat, have you done any training on physical security?
Because you’re out and about in the community all the time. BELLAMY-WALKER: Hmm. Yeah. So I would also echo the IWMF’s HEFAT training. During the training, like, we learned how to, like, you know, if we’re in a protest and it gets extremely, like, hostile, we learned how to navigate ourselves, like, out of that situation. We learned how to navigate—if there’s a mass shooting, like, what to do. If—you know, if we’re, you know, getting kidnapped or something, we learned how to navigate that situation. So I would definitely recommend IWMF’s HEFAT training as something for folks to use to learn how to navigate these different physical threats that can come up in the field. ROBBINS: Great. Well, we will share a link to that as well when we send out our follow-up emails. That's great to know, that that's available. Also never go in the center of a crowd. Hug the buildings. You don't want to get trampled. It's another thing my husband taught me in the early days. These are all really useful things. Question: For a reporter who covers a remote minority community in a news desert, she must be visible on social media for sources to reach her. At the same time, she’s getting harassed/doxed. We provided Delete Me, but she still needs to be findable. Best practices? That was—I mean, it seems to me, sort of that’s the great paradox here. You know, how can you be visible so people can find you, but at the same time not have the wrong people finding you? How do we balance that? STAPLEY: Yeah. And, like I said, it’s different for each journalist. Depends on the degree of harassment, how comfortable you are, and who’s harassing you as well. So generally, if the people who live close to you are harassing you, the physical threat level is higher. So that’s something to be mindful of. So, you know, if you’re—some of the most challenging cases are journalists who report on the communities that they are living in, and those communities are hostile to them in some form. And it can be very, very difficult for them to stay safe, because they also know where you live. Because, you know, they know your aunt or whoever, like they live three doors down. But I think really it's then about putting best practices in place. So having a plan for what if this happens, what will we do as a newsroom to support this journalist? And maybe seeing—asking the journalist what they feel that they need. So when it comes to harassment on social media, I'm afraid—a lot of responsibility for managing that harassment should come from the platforms, but it doesn't. And there are very few practices now in place, especially, you know, what we've seen with X, or what was previously Twitter. You know, the security there is not as efficient as it once was. I think I could say that. So you can be reporting things, but nothing's happening. Or they say that it adheres to their community guidelines. Often we hear that from Facebook, for example, or Instagram. One thing you should know, if you’re reporting harassment, is you should read the community guidelines and see how that harassment violates them. You need to parrot the same language back to them. So you need to show them how the harassment is violating their community standards, and just use the same words in your—in your report. And document it. So keep a spreadsheet of what platform it happened on, take a screenshot of the abuse. Don’t just have the URL, because people delete it. So make sure you have the handle name, the date, the time, et cetera.
And the harassment, the platform it happened on, whether you reported it, who you reported it to, have you heard back from them. Why would you document it? Well, it really depends. Maybe, you know, it’s just personal, so you can track it. Maybe it’s for you to show editors. Maybe it’s to take to the authorities. But that’s not always appropriate for everybody. You may or may not want to document—and you can’t document everything. So you’re just looking for threat to life there, I would say. And it can be helpful to get—I know Tat mentioned this—to have, like, a community of people who can help you with that. So in the case of this journalist, like, what’s their external support network like? Are there other journalists that journalists can be in contact with? What can you offer that journalist in terms of support? So does that journalist need time every week to kind of document this during work hours so she doesn’t—or, he—doesn’t have to spend their time doing it on the weekend? Do they need access to mental health provision? Do they need an IT team? So it sounds like it's a small outlet, you probably don't have—maybe have an IT team? Or, you know, the owner's probably the IT person. That's normally how that works. So what can you do there to make sure their accounts are secure, and make sure they know that they don't always have to be online? So one of the most important things for journalists is for people to contact them. But if you're on a device all the time, and that device is just blowing up with hatred, it can be quite useful to have a different device, a different phone number that you use for personal use. And that, you know, maybe you don't work on the weekend, you switch your work phone off so you don't have to be reading all this abuse. I know switching the phone off for a journalist is like never going to happen, but in some cases it could be useful. If you’re in the middle of a sustained, like, vicious attack, you know, just having your phone explode with calls, messages, emails, all just coming at you 24/7, is really not great. And it really impedes your ability to do work as well. So, you know, putting a bit of separation there, and helping that journalist—letting that journalist know that you support that journalist doing that is really helpful. That’s a really good, important step for a newsroom to do, kind of giving them that support. ROBBINS: So one of the things that Ela said, and, Tat, I want to ask you about it. Ela said something about knowing something about who your attacker is, because then you might know more about whether they just—they’re just going to dox you—I don’t mean “just”—but if they’re going to focus on doxing, versus they maybe want to hack your personal accounts, or they want to go after your aunt, or they may actually come to your newsroom and physically threaten you. That people have patterns of their attacks. When you were getting attacked over the story you were doing about drag laws, did you have a sense—did you know who was attacking you? Did you research it? BELLAMY-WALKER: Yeah, I did. At first, it just seemed like it was just, like, random folks, you know, from, you know, the internet. But I started to see that there was definitely this, like, conservative Facebook page. Like, everyone from that conservative Facebook page. They were all definitely emailing me. 
You know, I’m maybe not 100 percent sure about that, but it seemed like the Facebook page took the harassment to a whole different level, especially because they included, like, where I work. They, you know, had spoken about, like, a tweet that I had written about, like, the journalism industry in general, in terms of diversity. So many of the attacks started to heighten from the Facebook page, and then the article that was written about me. And so for me, it’s really important for me to, you know, check, you know, what is being, you know, written about me through either Google searches or I will search Facebook, and that’s how I came across this, you know, conservative Facebook page. I think they were called, like, the Whiskey Cowboys, or something like that. Yeah, yeah. So that’s how I look at—that’s how I came across them. It was after I had done, like, a search of my name in Facebook. And if I had not done that search of my name, I would not have realized, like, why it was becoming so intense. Because before then, I did, you know, definitely get some emails here and there, but never something as targeted as it was. I’m like, whoa, like, these are getting, like, really, really personal. And then with the Facebook page, it was very, very personal attacks on me. ROBBINS: So, Ela, I think my final question to you is, sometimes a Facebook page isn’t necessarily who we think it is. I mean, it could be the Iranians. It could be somebody in New Jersey. It’s not—I mean, there’s Donald Trump’s line, it’s some 300-pound guy in a basement in Newark, New Jersey. OK, well, that’s a story for another day. Have you guys or has someone else, you know, done more forensic research so that if we’re getting attacked we can say: That looks like X group, and we know that they tend to mainly focus on doxing, or you probably should be more aware that they’re going to go after your financial resources? Is there some sort of a guide for particular groups in the way they do their work? STAPLEY: Not a guide, as such. But, yes, there are journalists who’ve researched the people who have harassed them. And it also makes very good stories—I know journalists who have written good stories about that. And, obviously, there are tech professionals, IT professionals, who can also look into that. They can study things like IP addresses and things. And it helps build up a picture of who the attackers are. But I think here, the important thing is if you are writing on a particular story—on a particular topic or on a particular region of the world—knowing who’s active online with regard to that topic and that region of the world, and what they can do in terms of their tech capacity, is important. Ideally, before anything happens, so that you can put steps in place. ROBBINS: But how would I, if I work at a medium-sized or small newspaper—you know, where would I turn for help for that sort of risk assessment, as I’m launching into that? You know, how would I know that if I’m going to go down this road that I might draw the ire of X, Y, or Z that has this capacity? Where would I look for that? STAPLEY: Yeah, speaking to other reporters who cover the same beat is very helpful, whether in your state or just, like, if you have reporters in other areas of the country or in other countries. You know, if you’re covering international news, like, speaking to them and finding out what digital threats they’ve faced is a really useful step.
So connecting to that network, like we talked about with fixers in different countries. Like, getting a feel for it. But ideally, this should come from the newsroom themselves. So, you know, ideally, newsrooms should be proactive about doing risk assessments. And ideally, they should train managers. They should train editors on this. So a lot of responsibility does kind of fall to the editor, but a lot of them haven’t been trained in how to, like, roll out a risk assessment appropriately. And so getting newsrooms to really be proactive about this, training their editors, and, you know, looking at the risk assessments, putting them in front of people, and asking them to fill them out. Because the risk assessment really is about mitigating risk. It’s getting you thinking, what are the risks? How can you reduce them in a way that makes it safer for you to go about your daily life, but also to continue reporting? Which, at the end of the day, is what all journalists want to do. ROBBINS: Has anybody—like Pew or anybody else—brought together sort of a compendium of, you know, significant online attacks that journalists have suffered, sort of organized by topic or something? That would be really useful. STAPLEY: Yeah, there’s a number of organizations that have published on this. There’s been a lot of academic research done. The ICFJ and UNESCO did one called The Chilling. That was a global look at online violence against women journalists, and involved a lot of case studies. We have our online violence response hub—which Tat mentioned earlier, which I’m very pleased to know that Tat was using—which is a one-stop shop for all things online harassment-related. And there you will find the latest research. So you can go there and search for academic research, but it also has, like, digital safety guides, guidance for newsrooms, as well as for journalists and for those who want to support journalists to better protect themselves. ROBBINS: That’s great. Ela, Tat, thank you both for this. I’m going to turn it back to Irina. We’re going to push out these resources. And this has just been—I’m fascinated. This has been a great conversation. Thank you so much, both of you. STAPLEY: Thank you. FASKIANOS: Yes. And I echo that. Ela Stapley and Tat Bellamy-Walker, and, of course, Carla Anne Robbins, thank you very much for this conversation. We will send out the resources and the link to this webinar and transcript. As always, we encourage you to visit CFR.org, ForeignAffairs.com, and ThinkGlobalHealth.org for the latest developments and analysis on international trends and how they are affecting the United States. And of course, you can email us to share suggestions for future webinars by sending an email to [email protected]. So thank you for being with us today. And thanks to all of you for your time. We appreciate it. ROBBINS: Ela and Tat, thank you for the work you do. Thanks, Irina. (END)
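One practical way to picture the documentation habit Ela Stapley describes above, keeping a running record of the platform, the handle, the date and time, a screenshot, and whether the abuse was reported and answered, is a small script that appends each incident to a spreadsheet-style CSV file. This is only a rough sketch; the file name, column names, and example values below are illustrative assumptions, not anything specified in the webinar.

```python
import csv
from datetime import datetime, timezone
from pathlib import Path

# Hypothetical log file and columns, chosen for illustration only.
LOG_FILE = Path("harassment_log.csv")
FIELDS = ["date_utc", "platform", "handle", "url", "screenshot_file",
          "summary", "reported_to_platform", "response_received"]

def log_incident(platform, handle, url, screenshot_file, summary,
                 reported_to_platform=False, response_received=""):
    """Append one incident to the CSV log, writing the header row if the file is new."""
    new_file = not LOG_FILE.exists()
    with LOG_FILE.open("a", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow({
            "date_utc": datetime.now(timezone.utc).isoformat(timespec="minutes"),
            "platform": platform,
            "handle": handle,
            "url": url,
            "screenshot_file": screenshot_file,
            "summary": summary,
            "reported_to_platform": reported_to_platform,
            "response_received": response_received,
        })

# Example entry (all values invented for illustration).
log_incident("X/Twitter", "@example_account", "https://example.com/post/123",
             "screenshots/2024-05-01_threat.png",
             "Direct threat referencing my neighborhood",
             reported_to_platform=True)
```

A plain spreadsheet kept by hand works just as well; the point is simply that every incident ends up in one place with the same fields, including a screenshot rather than only a URL.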
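Her password guidance, at least fifteen characters and a different password for every account, stored in a password manager or written down somewhere safe, can likewise be sketched with Python's standard secrets module. The length and character set here are illustrative choices, not a recommendation from the speakers.

```python
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Return a random password; the guidance above calls for at least fifteen characters."""
    if length < 15:
        raise ValueError("Use at least fifteen characters")
    # Letters, digits, and punctuation; some sites restrict punctuation, so adjust as needed.
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

# A different password for each account, never reused.
for account in ("work email", "personal email", "social media"):
    print(f"{account}: {generate_password()}")
```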
  • Digital and Cyberspace Policy Program
    Influence Immunity and Addressing Misinformation
    Dolores Albarracín, professor and director of the Social Action Lab and the science of science communications division of the Annenberg Public Policy Center at the University of Pennsylvania, discusses ways to address misinformation. Dana S. LaFon, national intelligence fellow at CFR, discusses malign influence campaigns, how to combat them, and their implications for national security and democracy. The host for the webinar is Carla Anne Robbins, senior fellow at CFR and former deputy editorial page editor at the New York Times. TRANSCRIPT FASKIANOS: Welcome to the Council on Foreign Relations Local Journalists Webinar. I’m Irina Faskianos, vice president for the National Program and Outreach here at CFR. CFR is an independent and nonpartisan membership organization, think tank, and publisher focused on U.S. foreign policy. CFR is also the publisher of Foreign Affairs magazine, and as always CFR takes no institutional positions on matters of policies. This webinar is part of CFR’s Local Journalists Initiative, created to help you draw connections between the local issues you cover and national and international dynamics. Our programming puts you in touch with CFR resources and expertise on international issues and provides a forum on sharing best practices. Thank you all for taking the time to join us. I want to remind everyone that this webinar is on the record and the video and transcript will be posted on our website after the fact at CFR.org/localjournalists. We are pleased to have Dolores Albarracín, Dana LaFon, and host Carla Anne Robbins with us to just have this discussion. Dolores Albarracín is a Penn Integrates Knowledge professor and renowned scholar in the fields of attitudes, communication, and behavior. She has published close to 200 journal articles and book chapters in leading scientific outlets and has been cited over 20,000 times. She is also the author of six books and her forthcoming book is titled Creating Conspiracy Beliefs: How Thoughts Are Formed. Dana LaFon is a national intelligence fellow here at CFR. She most recently served as chief and founder of the National Security Agency’s office of operational psychology, which is responsible for scaling psychologically-based insights for government operations to counter some of the most egregious national security threats, and she’s an expert in the fields of remote psychology assessment, influence psychology, and malign influence campaigns. And Carla Anne Robbins is a senior fellow at CFR and co-host of CFR podcast “The World Next Week.” She also serves as the faculty director of the Master of International Affairs Program and clinical professor of national security studies at Baruch College’s Marxe School of Public and International Affairs and prior to that she was deputy editorial page editor at The New York Times and chief diplomatic correspondent at the Wall Street Journal. So thank you all for being with us. I’m going to turn the conversation now over to Carla. ROBBINS: Irina, thank you so much, and it’s great to have everybody here today and this is a—quite a hot topic and just going to have the conversation among us for about twenty, twenty-five minutes, and you all are journalists—I’m sure you’ll have lots of questions. So feel free, please throw up your hands. Put questions in the chat and we’ll go from there. So, you know, this is the year of elections. There are going to be more than sixty elections around the world. In the U.S. we have a highly polarized presidential election. 
We have races for thirty-four Senate seats, every House seat, eleven gubernatorial elections, forty-four states with eighty-five legislative chambers up for grabs. So, Dana, if I may call you that— LAFON: Of course. ROBBINS: —if we can start with you. You know, what do we have to watch out for in the 2024 elections around the world and in the U.S. itself? When I think about misinformation I’m sort of still caught in 2016, to be perfectly frank. It’s the Russians doing it to us. You know, are the dangers in the U.S. still mainly foreign? How much of this is homegrown and is it different from what we saw in 2016? LAFON: Thank you, Carla. Thank you for having me. And I just want to remind everyone—I know this is on the record—I speak for myself and not any government agency or their constituent agencies. So thank you very much for hosting me and this topic. So here’s the thing. The problem of misinformation and disinformation, which together often get lumped under misinformation, is larger, in my opinion, than the average citizen in the U.S. understands. It’s a very—it’s a larger problem than we experienced in 2016. Nation states such as Russia, China, Iran, and more are actively motivated. Their motivation, if you will, is to keep us distracted, to divide us by driving wedges of social discord in the U.S., and what this does for them is allow certain autocratic nation states to achieve their global goals with less Western interference. So if we’re occupied with other things it’s easier for those folks to achieve their goals, and this strategy goes back decades, but we are seeing an increase, particularly from both China and Russia, and it is also a domestic issue. There is a domestic threat that is on the rise. I can’t give you numbers as far as percentages. I don’t know how we would know. But from our observations we can see that it’s a very large increase. For example, China, they have—I’ll give you a few examples. They have a state initiative of multilingual influencers. In other words, they have multilingual influencers online that reach an audience of over a hundred and three million people in forty different languages. So if you think about the influence strategies and techniques that they’re perpetrating, they’re using things like likability, because they’re speaking the same language. They’re able to pose as natives of the country that they’re targeting. In the run-up to the 2022 elections we saw activities such as Facebook removing Chinese Communist Party false personas that were targeting pro-democracy activities. They use what’s called a seed and amplifier strategy. In other words, they post a seed and then they’ll come back and refer to that seed for truth validation, to validate their position, and that then ignites an amplification strategy. So that’s one basic core of misinformation amplification that we see used through false personas. And from a techniques perspective this engages influence principles such as authority—you know, believing those in an authoritative position, perhaps by using pseudo or fake news sites that they’ve sponsored; likability, because of the commonality that they have with their targets; and social proof. If they can get folks to start amplifying and authentically sharing then that amplifies the strategies. We’re also seeing some things ahead of the 2024 elections. We’ve seen Meta remove thousands of Chinese Facebook accounts that were false personas posing as Americans to post on U.S. 
politics and relations between Washington and Beijing. They all have their own profiles of action; the ways they conduct their activities are rather distinct, but they’re also getting quite savvy at it. So as we progress it will be even harder to recognize those false strategies without strategies of our own in place to recognize them. Russia has a role, too. They have even more motivation to interfere in the 2024 elections than they had, in my personal opinion, in the 2022 elections. Their goal has traditionally been, at a high level, to denigrate democracy through sowing that divisiveness, that confusion, that distraction, and through other means. They’ll use state-owned media. They’ll leverage generative AI. They’ll use false personas and covert Russian intel-directed websites, as well as cyber-enabled influence operations like we saw in 2016 when they hacked the DNC. We’ll see more of that, in my opinion. How we combat it, I think, is something we’ll discuss later in this forum. But there are ways to combat it, and so I’ll pause there on the threat intel picture and we can come back to that topic. ROBBINS: Thank you. So, Dolores, I do want to talk about the APA report you co-authored on how to deal with this. But can we just—is there a difference between disinformation and misinformation? Because we tend to use these words interchangeably. Are we making a mistake using them interchangeably or can we get away with that? ALBARRACÍN: I prefer an umbrella term, misinformation, used more broadly. Disinformation, of course, implies—denotes this concerted attempt at introducing known falsities with intent. Regardless, even if it’s unintentional, misinformation can be equally harmful. So I don’t think the distinction is necessary, although, of course, everything that Dana has been describing would be clear disinformation. However, its impact as it spreads beyond the initial influencers is equally harmful, or perhaps more so, and ultimately that is the intended goal. Yeah. ROBBINS: So we shared your report, and I do want to talk with both of you about some of the recommendations. You know, as journalists we’ve really struggled since 2016 with how to deal with misinformation and outright lies, and we’ve been warned again and again that flagging something as untrue would only actually draw more attention to it, make it more viral and for some people even more trusted. You know, if you journalists write about it, I don’t trust you journalists in the first place, so it’s probably true if you tell us it’s not true. So this APA report, Using Psychology to Understand and Fight Health Misinformation, has a series of recommendations that seem really relevant not just for people covering health but also for people covering politics. And I wanted to start with the recommendation to avoid repeating misinformation without including a correction. Can we talk about this? You know, people talked about truth sandwiches. You know, what can you recommend to this group embarking on a year of really intense election coverage? If we know there’s a lot of misinformation or disinformation on a particular topic, you know, when do we know that it gets to a point at which we can’t ignore it? Do you—is there, you know, some rule of thumb—you say something’s a lie, then talk about the lie, then remind people it’s a lie. You know, how do we deal with that? 
ALBARRACÍN: One rule of thumb is if something is present in at least 10 percent of the population that makes it a highly salient belief and it’s likely to be used in various decisions throughout the day. So in that case I would recommend some degree of acknowledgment with all the caveats that you mentioned, yeah, like, you know, say it’s false each time you bring it up and then have the proper correction that’s detailed enough to tie fully to the initial representation of the event. So those are the general recommendations. They continue to be true. At the same time I would say that other approaches such as what in my research we’ve called bypassing can be equally impactful. So when trying to combat misinformation the question you want to be asking is, well, what do I want to achieve? Am I worried about the fact that if I say the election was stolen people are going to go storm the Capitol? Well, that’s one approach. Then you start there. So how—what would prevent storming the Capitol is a slightly different question from how do I make sure they know the election was not stolen, and there are different pathways, one being that, well, first of all, if you maybe address the misinformation but especially reinforce any belief and attitude that would protect the Capitol that’s our most urgent goal, in my opinion. It’s not the misinformation in and of itself. It’s the actual consequences. So with that in mind, I think in addition to corrections even more important sometimes is simply talking about everything that’s going to essentially lead to your goal and making sure that information is properly received. So, for instance, if you’re trying to get people to support GMOs how are you going to get there? You can say, well, one problem is people think they bring cancer. I can go ahead and correct that. But at the same time you can emphasize and introduce new ideas that people don’t have in mind and that is actually a lot more effective than trying to simply correct the misinformation. But that begins with knowing that you want to change the ultimate attitude towards GMOs. You don’t necessarily care about simply their belief in the cancer effects, if that makes sense. ROBBINS: Of course, it wouldn’t be the role of a newspaper to get people to, you know, support GMOs but it would be certainly—I say this as a former deputy editorial page editor, which is different from being a reporter but having been both—but on both sides it certainly is the role of newspapers to support free speech and democracy and the truth. I struggle—we certainly struggled even on the editorial page when I was at the Times with saying somebody was lying. You know, we didn’t used to use the word lie because lie implied intent. You know, I think the first time we even used the word lie on the edit page at the Times was we finally got around to say that Dick Cheney was lying. The lie had been said so many times that we finally decided we could do it, and when the Times finally used the word lie with Trump on the front page it made news. People were writing stories about the fact that the Times had said on the front page that Trump had lied. And part of it was, you know, the Gray Lady had finally done it but part of it was this question of were we drawing too much attention—were we undercutting the credibility. But I think there is this other thing. Is there a certain percentage of the population that you’re just not going to reach and what percentage of the population is up for grabs even for this conversation, right? 
I mean, I suppose that’s really—you know, Dana, to go back to you about can you change people’s minds with good coverage? LAFON: Yeah, I think it’s important to that final point you make, Carla, that you have to know your audience—you know, who are you reaching. It’s intuitive to believe that truth will counter misinformation and that’s psychologically not so—that there are certain cognitive biases and influence strategies that are going on that make the opposite actually true, right? So if you commit to one belief you will behave consistently to demonstrate your belief and when you’re faced with, you know, someone telling you, no, you’re wrong you will traditionally—human beings, right, will traditionally dig their feet in and look for evidence that supports their belief and naturally not even observe evidence that dissuades their belief. That’s confirmation bias. It’s a very normal human response. Facts don’t move people. Stories move people, and I think that is one of the core essence of a good misinformation campaign. It’s that narrative that speaks to an emotive knee-jerk response. The goal of a misinformation campaign is to get people to do something that’s in the narrator’s favor. So simply changing a belief is not the end goal of a misinformation campaign. The end goal is to provide an action. Whether it’s we’re distracted, whether we increase our social discord, whatever the action that’s the goal of the misinformation campaign. So journalists, I think, are in a really interesting yet very difficult position because countering it with truth is not savvy enough to counter the effects of the misinformation campaign. In my opinion I think it would be useful to use some of those strategies, for example, prebunking, right—getting the information out before it’s used in a misinformation campaign. Perhaps calling out the alternative or the critic’s side to your piece to say, you know, my critics may say this, which describes the potential narrative of a misinformation campaign, and then you address it from, you know, demonstrating the actual motivations of who is providing this information. So that’s one example. We saw that when in the beginning of the Ukraine invasion the U.S. government declassified some information that we understood Mr. Putin was going to use as fodder for misinformation campaigns, and if you notice those information campaigns were really, really thwarted because there’s a phenomenon that who says it first it’s true. So getting that information out there first gives us sort of credibility to that information. There’s also repeatability. You know, I’ll say repeatability is believability. So expressing the journalists’ article, their writing, in this way repeated through various forums, repeated through various mediums and different sources, is a way to build that credibility and the veracity of their article. Highlighting the motivations of the source and letting the reader—you know, you’re taking the reader on the story of this is why it would be in the best interest of the narrator, here’s what they’re trying to accomplish. It appears that this is what they’re trying to accomplish. Now, I’m not saying don’t use facts, of course, but if you can marry it into your own narrative into a story that is fact based I think you have a much more robust possibility to counter the effects. You know, the tricky thing is once that misinformation campaign or disinformation campaign is out there you can’t put that cat back in the bag, you know, as they say. It’s out there. 
It’s going to have an effect because it’s implicit. It works on implicit levels unconsciously. But you can shape a response to build that immunity for future influence campaigns. ROBBINS: Dolores, Dana went through quite a few things that were on your list. I wanted to have your reaction to those things, this notion of journalists playing a role in prebunking, journalists playing a role establishing their credibility by taking us behind the curtain saying—you know, explaining the motivations of sources, establishing more credibility that way. Can you talk about those things? ALBARRACÍN: Yes. I think the—I mean, the impact on actions brings us back to what is the ultimate goal here, and prebunking can be effective and corrections can be effective but in the end they will come up with a new false piece to bring to the table. So, in that sense, anything that’s tied to the content, yeah, that you have to go out and prebunk has a limited effect because they have ample time to come up with new facts that then you need to have and prebunk and we cannot even fully anticipate them. So I would say that a more general approach of fortifying the knowledge base of our citizens is key so then they have enough structures in their minds to be on the alert and be able to recognize it themselves, not have to be told each thing that’s false. Yeah. So that general structure is important, and second—I mean, secondly, of course, is protecting the integrity of our institutions, which you know how to do probably better than any other player in society. So what do we do about attacks on science? What do we do on attacks on academia, all the respected sources kind of falling apart? This seems to me like the real problem more than having to prebunk each potential misinformation piece that’s going to be produced. So if you protect some of the sources and they remain critical reference in society that is going to be a lot more impactful than operating on the level of each misinformation piece as a strategy and as a nation. What else can I tell you? The other is, of course, in denouncing the actions. Yes, we care about misinformation. It can be harmful. But still, there is a very long way between misinformation and behavior and for the most part the impact is very, very small and this is something that we show in the report. It’s not like, oh, you hear a piece of misinformation and you move from there to, you know, walking into the White House with some assassination attempt. So this is something important to keep in mind. What is really a source of concern, in my view, and based on a lot of research is it’s not just the misinformation but what do we do when influencers online introduce new processes by which we should take down, you know, the university leaders? That—it’s the actions. It’s not the misinformation. It’s that this has happened and now I’m going to move to do X and creating opportunities for a congressional hearing on X issue. So, to me, a very good catalog and denouncing of the actions of these actors that’s where I would go because that is very close to behavior and perhaps more or less, you know, focused on that aspect. I don’t know if that answers— ROBBINS: The decline of trust in institutions is across the board in institutions in this country. I mean, people’s—I mean, Congress is lower than universities. But if you look at things like the Edelman Trust Barometer and other—you know, like, the Gallup does this, you know, year after year and you see the decline of trust. 
I mean, trust in the, quote, “mainstream media” is—you know, we’re not going to be going around propping up trust in, you know, the president of Penn or the president of Harvard, because people don’t trust us in the first place. I mean, I think we certainly have to think about why people don’t trust us and how—you know, one of your recommendations is to leverage trusted sources to counter misinformation and provide accurate information on, say, health issues. I think we have to figure out how much we interact with the community and who people do trust when we tell our stories, because we can’t rely, as we did fifty years ago, on people just trusting us because we are the press. So is there a way of building up trust in our institutions so at least people can listen to what we believe we’re purveying, which is as close to the truth as we can get it? Dana? LAFON: Sure. I think what we’re talking about there is building credibility, right, and that’s a different animal than influence inoculation. There are different ways to build credibility, and I know that’s not the topic of our discussion, but I think having that reputation is key, right? Reputation is made or lost, particularly in a crisis. And I think that having that long-term reputation, and also that repeatability of the same messaging that your stories are providing from a journalistic standpoint, is valuable. But if you’re talking about how do we build institutional credibility, I think that’s a whole other topic for a whole other journalism seminar. But there are certainly psychological ways to build that credibility. ROBBINS: I wanted to ask both of you a question, which is: have we seen countries that have done a better job of getting through a barrage of misinformation, of outside attempts or inside attempts to get people to throw up their hands and say, well, I just don’t trust anybody so I’m just not going to trust reality? Certainly, we saw that in Ukraine, and that started, you know, from 2014 on, long predating what we went through, and you see this even today. I was just preparing for something for our podcast and I was looking at something in the pro-Russian press saying there were threats against the Hungarian foreign minister who’s going to have a meeting in Ukraine and he possibly couldn’t set foot in Ukraine. I mean, they are fabulists at disinformation. I mean, there’s—the EU has something called the EU disinformation review, and every week you see these sort of viral stories they’re pushing out there. But people in Ukraine have just gotten—I think they’ve just gotten to the point at which they just shrug this off because they’ve had a decade of it. Taiwan seems to have done a very good job of getting through their election by not taking it particularly seriously, because they were warned about it. I mean, are there countries that have figured out how to deal with this better than we have? Have you seen, you know, sort of strategies to prepare going into an election season, either of you? ALBARRACÍN: I’ll let Dana answer this question. I mean, I know that there is variability in specific forms of trust. So, for instance, Argentina is highest in trust in science, but I don’t know that there is one particular country that has managed to fight misinformation as a whole. 
LAFON: I think if we look to countries, particularly Russia’s near abroad, those countries have been dealing with this challenge for a long time and they’ve had some success in managing it, you know, living in the solution space of how do we counter those offenses toward their country. We’ve seen it in Georgia. We’ve seen it around, you know, countries that are bordering Russia. We see them do it through entertainment. There is a country—I want to say Georgia, but it might be one of the other countries, forgive me—that uses a television show to address disinformation campaigns and to provide, in an entertaining way, the alternative narrative—who the narrator is and what they’re proposing. Very interesting. One country—I believe this was Georgia—used artists, right? So they would give the narrative to the artists, and the artists would reinterpret it through their art and then communicate that to the populace, and that was extremely, extremely useful in countering the narratives that came from these disinformation campaigns. And it also continually acts to inoculate the people viewing those programs, so that they’re continually getting a booster, building their ability to maintain healthy skepticism, to slow down their thinking, to be more critical. So all of those attempts that we’ve seen in those near abroad countries to Russia have had some success. Now, we can’t simply translate those activities that have been successful into different cultures without some thought, without some appreciation for how they would be translated into different cultures and different countries. But I think there’s a body of work there—we can look in the solution space of those countries and take note. ROBBINS: So before we go on—we do have a question already, but I just wanted to very quickly ask Dolores: if I wanted to write a story about, you know, programs that are trying to prepare people, to better train people to resist misinformation, are there any cool programs out there that I could, you know, go and write about? I mean, I know it’s in early stages but—and I know that right after January 6 the military was looking into this, about how to prepare people who were new recruits so that they weren’t radicalized. I mean, are there effective programs out there, or at least ones that are in early stages that are worth looking at for training, whether they’re for young kids or for young adults or— ALBARRACÍN: There are media literacy types of training. I would argue that the K to 12 system should be the system that does that, because it’s not a matter of, OK, give me two lessons on how to identify misinformation. It’s going to change next month in response to what we teach them. So it’s the critical thinking skills, the fact that, you know, scientific decisions or health decisions are well informed by science, and that you don’t politicize science because that’s almost sinful. It’s really building the values that are going to protect society from manipulation by these institutional actors. I think that’s the only long-term solution I can imagine. There’s not going to be a quick fix when people have no understanding of science or cannot even think about the word “evolution” without thinking it’s potentially problematic religiously. 
You know, I mean, I don’t really know that we can change specific pieces without a massive overhaul of how we instruct kids on all of these issues and create spaces in which there are certain institutions that are trustworthy and others that shouldn’t be involved in certain spheres. ROBBINS: Can I ask a research question for either of you, which is so if I wanted—you talked—both of you talked about how important it is to sort of get ahead of a story, because it’s very hard to take—to get the cat back in the bag. So how do I do that? I mean, it’s certainly—I’m now going to date myself from how long I’ve been out of the daily business, but we had people at the Times whose job it was to monitor Twitter to see whether, you know, there was news that was breaking on Twitter before it broke out on the wires. But how do we know—when you said—you know, Dolores, you said when 10 percent of the population starts believing something, you know, how do we know that? How do we know when something is previral but threatening to become viral so that we can get out ahead of a misinformation or disinformation story? ALBARRACÍN: I mean, in our world we would know through surveys or some sort of check. I think journalists have—are pretty well tuned to whatever is in the mix that could be potentially growing, yes, and that could be leveraged to explain new things and especially, you know, in a false way. I don’t know any other way than following your gut on what might become problematic and putting it out there as early as possible. ROBBINS: So, Dana, you come from the intelligence community. How do you guys, you know, get at—know when something’s coming down that hasn’t, you know, totally blasted into mainstream consciousness? LAFON: Yeah. And we may not always know as well, right? So it’s too—I don’t think it’s reasonable to ask the journalists to get in front of that narrative. It’s impossible. I think our challenge as—to you as a journalist is once it’s there how do you report it in a way that begins that inoculation process by explaining, as Dolores spoke to, you know, linking it to that misinformation and explaining why. Not just that it’s misinformation but explaining why and then explaining what is the technique that was used—the influence technique that was used, and that is basically the influence inoculation process, right? You’re aware that it happens, you understand the link, and then you refute it through identifying the technique and then explaining how you could refute that, how you could counter that. If you could shape the writing to integrate those steps of the inoculation process that is one way but I don’t think it’s one—you’re not going to do it as a society with one aspect. To Dolores’ point, education is key and I absolutely agree K through 12 should increase critical thinking skills particularly when looking for ways that we’re influenced, and you can use different strategies. You can use—marketing strategies are full of influence campaigns for good or for bad. But you still—being able to recognize them helps inoculate them. And then I would put out a challenge. You know, here at the Council we talk a lot about AI and so I would challenge AI to help us identify what are the strategies to get in front of these information campaigns or misinformation campaigns that are going to be at our doorstep. So it’s education, building that immunity, building that critical thinking and healthy skepticism. 
It’s as journalists hopefully being able to report it in a way that increases that inoculation and connects the story that you’re writing to the misinformation and explaining why, and then looking to our technical—our technological advances to help us with this problem. ROBBINS: So both AI for good and evil, and I do want to get into— LAFON: There you go. ROBBINS: We have a question from Robert Chaney from the Missoulian. Robert, do you want to ask your question? I can read it but I’d much rather have him read it or speak it. Q: Can you hear me? ROBBINS: Yes, absolutely. Please. Q: Hi there. We’re in a situation in Montana where the state Republican Party is about to start a big campaign promoting mail ballot use for the 2024 elections. But there is a faction of our state GOP party that is an extremely John Birch Society election denier crew and they are bound and determined to counter their own colleagues with a lot of mail ballot is insecure, mail ballot is a fraud, election offices are untrustworthy. And, unfortunately, an awful lot of them consider most of the traditional media in the state as also untrustworthy so they’re not real receptive to us covering their own campaigns for mail ballots. So we’re kind of looking at this as do we just cover their internal battle over who’s going to be right about mail ballot security or do we try to somehow—I’m just looking for strategies for how to cover this because we’re going to see a whole bunch of anti-mail-ballot, election fraud stuff floating around in the waves in large part by a party that doesn’t want to have it. LAFON: I’m happy to let Dolores speak it or I can speak to it. ALBARRACÍN: Is it possible to not cover it? If that’s an option, I mean, deemphasizing it and minimizing would be probably the best solution in this case rather than giving them a megaphone to further disseminate their false claims. Is that an option? ROBBINS: More than 10 percent of the population believes it. I mean, by your own standard, I mean— ALBARRACÍN: Well, no, no, no. That was how do we know if this is potentially impactful but it may still—it may be 10 percent of the population but it’s not impactful. So, for instance, if people believe the Earth is flat it has no consequences for us. So that’s the other part. ROBBINS: Robert, is this impactful? It sounds like this is potentially impactful, isn’t it? Q: We’ve had a couple of county election offices won in political races by election deniers who have then caused all kinds of mayhem for school board and local mayor elections because they’re—they don’t know how to run the system they hate it so much. So we’ve got active internal disputes within the GOP party here who are vying for power and influence at the statewide level and part of this group is actively pushing that the mail ballot system is illegitimate. So I don’t think we can not cover it— ALBARRACÍN: OK. Q: —but we’re going to have a lot of difficulty getting what you might call the majority portion of the GOP party to look at us with any more credibility than they look at their own internal adversaries. ALBARRACÍN: And it’s a—how large proportionally is this group, the deniers? Q: I would give them between 15 and 30 percent. ALBARRACÍN: So, I mean, anything emphasizing that it’s a minority and probably putting them in a different context with experts from other states so then you kind of change the weight of the 15 percent as evidence for the credibility of the claim. 
That might be a way of doing it, sort of going outside Montana, see what this really, you know, says about election processes more broadly, perhaps. LAFON: And I might add that this is a local—very big local challenge for you and I want to be sensitive to the journalism integrity as you are that, you know, you report what’s happening but at the same time if you know that—if you know the potential narratives that come through the disinformation of saying mail-in ballots are not effective, saying mail-in ballots are not going—or lead to false election results, you know, you can put all of that in your stories ahead of time when you’re covering, you know, mail-in ballots. When you’re covering mail-in ballots you can cover, you know, the critics say this might not work or here’s some information about, you know, the veracity of mail-in ballots and their effectiveness prior to their campaign if possible. I don’t know the timing that you’re dealing with but I would recommend that, you know, getting ahead of it and then repeating it is a potential option. But I think it’s—yeah, it’s a local problem that’s not avoidable. ROBBINS: Particularly because it’s also a national story. I mean, there’s this gap between the GOP in lots of states, and meanwhile—there are a lot of states in which the RNC is pushing for people to vote early or to vote mail in; and then you have former President Trump, who is on the campaign trail saying: Don’t vote early. This stuff—the integrity of this is just—so this is going to go on between now and the election. So it’s a story. I don’t think you can avoid it. But I think it’s—you know, this is going to be a truth sandwich story also, which is every time you write about it you’re going to have to say there is no proof that mail-in ballots are insecure. So we have Gabrielle Gurley from the American Prospect. Gabrielle, do you want to ask your question? Q: Sure. I’ll read it. Let me just find where I am. And is there any evidence that hostile state actors are working with domestic groups to create or facilitate misinformation campaigns? Do you want me to— ROBBINS: And you referenced—can you describe the New Hampshire incident that you referenced? Q: Oh, the New Hampshire robocall incident is the robocall using—it’s a voice cloning, apparently, where President Biden’s voice was used telling people not to vote in the primary and wait until November to vote that went out from a former New Hampshire Democratic official’s phone number, apparently. ROBBINS: But you’re not suggesting that was from a hostile state actor? Q: No, I’m not. I was just asking. ROBBINS: So these are separate questions. Got it. Q: These are two separate questions, yes. ROBBINS: So, Dana, hostile state actors working with domestic groups to create—facilitating misinformation campaigns. LAFON: Yeah. I’ve not personally seen any evidence of that. I would also suggest that there certainly could be a false flag which means that it could be Iran posing as someone within the U.S. or someone within the U.S. posing as Russia. So there are false flag operations that go on and that is quite common. So I don’t see any—I personally don’t see any evidence. It doesn’t mean it doesn’t exist or the evidence isn’t out there. I haven’t seen it in my work, in my readings. But there are false flag operations. 
ROBBINS: Can I follow up with a question, which is that before the 2020 elections and before the midterm elections there was a pretty big effort by Homeland Security to work with states to avoid hacking and a variety of other things. People seemed to be quite vigilant about it and nothing bad happened that we’re aware of. Have they let down their guard, or is there continued vigilance going into this election that you’re aware of, Dana? LAFON: From what I see, CISA is the organization working with states, and I also see the FBI domestically working with states to shore up those secure environments. So I think that there are a lot of domestic efforts to ensure the security of the elections, particularly around the technology. I can’t speak to any particular action, but I think that’s under the purview of CISA at this point. ROBBINS: So there are probably quite good stories for people in looking at what CISA is doing in individual states, because you’re already hearing, you know, people are going to be raising questions about the integrity of the vote and voting machines and ballots, and there will only be paper ballots, you know, going forward—there will never be another electronic voting machine and all of that. So I think that will probably be a story closer to the election. But I’d be really interested to see what CISA is doing, particularly, you have to remember, Krebs quit and there were just all of these things that happened at the end of the Trump administration. I don’t think people have been covering CISA very much, but I think there are probably some really good state stories about state funding and CISA and what people have done to strengthen the integrity of their elections in particular. There are going to be a lot of them, because I’ve heard the Trump campaign talking a lot about electronic voting machines and how unreliable they are, and if they don’t have paper trails I think there are interesting questions there as well. LAFON: I would love to see that. I would love to see the journalists on this call dig into, you know, what is DHS doing—you know, what are they doing at a state level. I don’t personally know those stories, so I would love to see that. ROBBINS: Christina Shockley, who is with public radio, from the local All Things Considered at WUOM in Ann Arbor, Michigan. Christina, do you want to ask your question? Q: Hi. Yeah. So I’m no longer at Michigan Radio, which is now Michigan Public, but I am still heavily working with stations across the country. And because of the relationship that we tend to like to have with our listeners, and because of how we are funded, which is primarily through listener contributions, I’m wondering if you think that there are any specific, unique ways that public media can go about fighting disinformation and misinformation. Thank you. LAFON: I would love to see public media advertise the inoculation process and make that a common nomenclature amongst your audience, so that they are understanding and getting boosters and actually exercising that building of immunity to influence attacks. I think there are two ways we can stop disinformation. We can stop the actors doing it, or we can harden ourselves cognitively to be more prepared to thwart it. So I would love to see—I would love to see that. ROBBINS: Dolores, can I ask you about the role of the social media companies? Among your recommendations is demanding data access and transparency from social media companies for scientific research on misinformation. 
I mean, I have seen reporting that says that, you know, that Meta has laid off all sorts of people who were supposed to be, you know, hardening themselves against misinformation in the run-up to elections, and are they taking this seriously and are they willing to be transparent? Can you talk about the relation between researchers and what you’re seeing about their preparation for elections? ALBARRACÍN: Yes. It’s been a problem to ensure access to the data especially, of course, Twitter, which was completely cut off and became incredibly expensive. So our ability to get a glimpse even has diminished if not gone away completely. So, I mean, that’s what I can say about the researcher side of things. Companies are, obviously, not motivated by the goal of ensuring national security or anything like that. It’s a matter—it’s all financial gain, the basic. So whatever efforts they have tend to be very short lasting and often too late and then they are gone, and until we’re up to here of misinformation they typically don’t activate new methods. But I think Twitter has been perhaps the most problematic in terms of ensuring any sort of filter. ROBBINS: And does Meta share information? Because Dana in the beginning cited some numbers from Meta. I was just wondering whether—does Meta share data with you? ALBARRACÍN: No, they don’t. They’ve done the occasional project in which they will share results of what they analyzed internally but they normally don’t share anything broadly, and Twitter used to be the most accessible but still had a lot of limitations. You never knew what the sample really looked like and it’s all fairly restricted. Yeah. So that’s unfortunate. I don’t know what Dana has to say about this. LAFON: I know from what I’ve read in open source in reporting that Meta has—I think there was a recorded futures article I reference in my paper for Renewing America for CFR about how Meta had reportedly taken down a significant amount of false persona accounts attempting to pretend to be U.S. citizens. So what I know I learned from the folks in this call in the journalist environment. But there is some reporting of different companies. There’s also a wonderful Microsoft report on misinformation and disinformation campaigns from China, Russia, and North Korea that I would strongly recommend to folks who are interested in getting some understanding of activities and how they’re approaching social media and what—a little bit about what social media is doing. I would send your audience to that. ROBBINS: We can share links to all of that with everybody. We’ll push them out after this. Can we talk a little bit about AI while I’m—do we have another—we have another question? Yea? OK. Jordan Coll, can you—would you like to voice your question and tell us who you work with? I’m sorry. I’m not looking at the list right now. Q: Hi. Can you hear me? ROBBINS: Absolutely. Q: Wonderful. Wonderful. Yes. So thank you so much. Jordan Coll here. So recently—I am a freelance reporter and I just finished my graduate program at Columbia graduate journalism school so I just finished there. My question is in two parts. One, we know that there seems to be, especially with Twitter, you know, journalists are consumed by the platform. I think a lot of the messaging—you know, a lot of political journalists that cover politics are in X because there seems to be a lot of—you know, you have your senators, you have politicians that represent these platforms. 
So my first question is how, and especially given the fact that, like, CEOs like Elon Musk, tech company hosts, are replatforming figures that were clearly giving, like, disinformation ads. Like, we have Alex Jones, for instance, with the whole defamation case of Sandy Hook and then Trump as well spreading these. So how do we resolve this issue, A, and where—in terms of ad space some subscribers, you know, are being—like, advertisers are pulling out of X significantly because of what CEOs are—you know, the decision making process that they’ve placed. I’m not sure if I’ve clarified the question but one of them would be how do we resolve those actions taken by higher execs that permit—to give them open access to these people to rejoin these accounts again when they clearly have violated the guidelines and the misinformation guidelines, et cetera? ROBBINS: Do you mean from a public policy point of view? Do you mean you want—you think government should be doing something or do you mean that we as journalists should be boycotting? Q: I think it would be, like, more so the journalists part of it because, yeah, what are we—our job is not to—you know, we simply—again, we don’t write the policies that they’ve put for themselves. So how do we become more, I guess, alert and how to navigate through the—through that. ROBBINS: Would we—do we do the—Dolores, sort of just ignore them? You know, that’s sort of—I mean, I must admit that I’m so profoundly ambivalent to the point of—I mean, we still—the Council still do things on X. We do X Spaces, what used to be Twitter Spaces, and—(groans)—that was an editorial board language groaning. (Laughter.) So, I don’t know. Do you guys think that everybody should just boycott Twitter and get it over with? ALBARRACÍN: We all think that but institutionally people don’t find a way of getting out because they have their following and nothing has been—has emerged that’s equally successful, you know, or popular. So nobody can leave because how do they get their information out of the credible institutions that still have X accounts? LAFON: One point is, and I think Dolores spoke about this earlier, is that where do you get your news source from. You know, people should not be getting their news sources from social media. I wouldn’t—I shouldn’t say should. People—I encourage people not to get their news sources from social media. It’s a way to absorb information and that’s great. But this is a very difficult challenge, right? This has been a challenge for many years. There’s no easy button here, Jordan. It’s—and to—I can’t speak to how journalists could counter, you know, these effects. But I think looking at it as a social media platform versus a news platform is something that you could educate your audience on. And congrats for finishing your grad school. Q: Yeah. ROBBINS: Let’s hope you haven’t gathered— ALBARRACÍN: About Twitter, I mean, Twitter used to be an excellent source for all kinds of news so why couldn’t we have a platform like that that’s maintained by, you know, the main media sources and use it for that purpose, a more centralized— LAFON: Like a news source that is that media. ALBARRACÍN: Yeah. LAFON: A news media. Yeah, it’s a great idea. 
ROBBINS: Well, we are going to have—you know, if President Trump is reelected we’re going to have a real challenge, which is how much do we quote, you know, things that sound like social media posts every time they come out of his mouth. As it is right now, people are being very careful about how much they quote the things he says on Truth Social. But once they’re utterances from the Oval Office, these are real—these are major journalistic challenges. You know, how much are you repeating things that are frightening versus just sort of drawing attention to things that are frightening? These are—these are major, major challenges to come. Well, I just want to thank you both for really—I wish it were a happier conversation, but a really interesting conversation. Thank you, everybody, for raising questions. It’s always good when we end with more questions. And Emily Bell is wonderful to talk to about this—from Jordan. So, Jordan, I hope you don’t—haven’t graduated with too much debt and that you get hired soon. So I’m going to turn it back to Irina. Thank you all so much. FASKIANOS: Thank you very much. As Carla said, we appreciate it, and we will be sending out the link to the transcript and the video after this, as well as links to the resources that were mentioned. You can follow us and our work on CFR.org, ForeignAffairs.com, and ThinkGlobalHealth.org for the latest developments and analysis on international trends and how they’re affecting the United States, and do email us your suggestions for future webinars. You can email [email protected]. So, again, thank you all for being with us today. (END)      
  • Media
    Selective News Avoidance
Benjamin Toff, senior research fellow at the Reuters Institute for the Study of Journalism, discusses his research on the increasingly complex relationship with traditional news media that many people experience today. Ginnie Graham, editorials editor at Tulsa World, discusses her experience reporting on news avoidance and how this trend can affect local communities. The host for the webinar is Carla Anne Robbins, senior fellow at CFR and former deputy editorial page editor at the New York Times. TRANSCRIPT FASKIANOS: I’m Irina Faskianos, vice president for the National Program and Outreach here at CFR. CFR is an independent and nonpartisan membership organization, think tank, publisher, and educational institution focusing on U.S. foreign policy. CFR is also the publisher of Foreign Affairs magazine. And, as always, CFR takes no institutional positions on matters of policy. This webinar is part of CFR’s Local Journalists Initiative, created to help you draw connections between the local issues you cover and national and international dynamics. And our programming puts you in touch with CFR resources and expertise on international issues and provides a forum for sharing best practices. We’re delighted to have over sixty participants joining us today from twenty-four states and U.S. territories. Again, this discussion is on the record. And we will post a video and transcript on our website after the fact at CFR.org/journalists. We are pleased to have Ginnie Graham, Benjamin Toff, and Carla Anne Robbins with us today to discuss selective news avoidance. I will give you highlights from their bios. Ginnie Graham is the editorials editor at Tulsa World. She has over thirty years of experience as a journalist and writer in Oklahoma. Her work focuses on social justice, equity, education, and health issues around children, youth, and families. And she recently published an article about news avoidance in local communities and its implications for democracy, entitled “Extremism Thrives on People Not Checking Out the Daily Headlines.” Benjamin Toff is a senior research fellow at the Reuters Institute for the Study of Journalism, where he leads the Trust in News Project. He is also an assistant professor at the Hubbard School of Journalism and Mass Communication at the University of Minnesota. And among his projects with the Reuters Institute is an in-depth examination of news avoidance and infrequent news use among audiences in the U.K. and elsewhere. And finally, Carla Anne Robbins is our host. She is a senior fellow at CFR. She is also the faculty director of the Master of International Affairs Program and clinical professor of national security studies at Baruch College’s Marxe School of Public and International Affairs. Previously, she was deputy editorial page editor at the New York Times and chief diplomatic correspondent at the Wall Street Journal. And she is the cohost of CFR’s podcast The World Next Week, which I commend to all of you. So thank you all for being with us today. I’m going to turn it over to Carla to have the conversation with our distinguished panel. And then we’re going to go to all of you for your questions and comments. And, as a reminder, we like to use this as a way to share best practices. So please ask questions. You can either raise your hand or type them in the Q&A box. With that, Carla, over to you. ROBBINS: Thanks so much, Irina. And thank you so much, Ginnie and Ben, if I can call you by your first names. That’s great. 
The only person who ever called me doctor was my mother—(laughter)—who made me finish my Ph.D. before she let me become a journalist. So what can I tell you? It was a great fallback. So, Ben, can we start with you and this report that you just worked on? I actually never even heard the term “news avoidance,” although for someone who spent thirty years of my career in journalism, I sort of can relate to it right now. I mean, not that I’m not a total news junkie, but it’s pretty grim out there. So can you talk about what you found, particularly about the United States, because this is a local journalists group? TOFF: Sure. So yeah. So this is actually something I started researching back in 2017. Actually, no, 2016. I was, at the time, a postdoc at the Reuters Institute for the Study of Journalism at Oxford, which does a big annual survey of news audiences around the world. And one of the things that they had been noticing is they typically had screened out people who consume news less than once a month from their surveys each year. This is the digital news report that they publish each year. And they didn’t really know anything about them. And so that one of the things I started doing back then was just talking to people in that category to better understand what their lives were like, how they related to news. And the Reuters Institute started measuring what we’ve now started calling selective news avoidance in their surveys starting in 2017, and now, most recently, this year. And there has been this kind of steady increase in the percentage of people who are in this category, who say that they are actively avoiding news. And so the research I’ve done over time has been a mix of survey data analysis, but also a lot of in-depth conversations with people in this category in the U.K., in Spain, as well as in the U.S. And so what’s—it’s a very complicated set of different kind of intersecting phenomenon in terms of the way people think about news in their lives. So there isn’t really just sort of one kind of news avoider. But it does tend to be people who are younger, people who are less educated, people who are less interested in politics. And in the U.S., it also tends to be more frequent among people on the—on the right, conservative ideologically. And then the other big piece of it has to do with technology. And a lot of—particularly that measure of selective news avoidance—a lot of people were expressing a kind of feeling like they’re inundated by information at all times, and so they’re actually needing to actively screen it out in a way that was really quite different from, if you think about, you know, twenty, thirty years ago, the media environment that we were living in. You made much more of a sort of conscious choice to seek out news when you wanted to consume it. And so a lot of that percentage that of what we’re seeing over time of more and more people saying that they’re avoiding news, some of that is not necessarily to be super concerned about. A lot of those people are actually consuming a lot of news, but they’re just expressing a kind of frustration about the amount of news that they’re seeing, or the way that they have to kind of sift through news to kind of get what they want. ROBBINS: So I saw some numbers. And is it—was it 38 percent of Americans say they sometimes or often avoid news, including—of that, what, 41 percent of women and 34 percent of men? Is that the right—the right number? And that the highest percentage that you’ve seen over time? 
Because you’ve been doing this for a while. TOFF: Yeah. So in the survey data, 38 percent is the average across all the countries in the Digital News Report in 2022. The U.S. is slightly higher, 42 percent. And this is people who say that they actively avoid news sometimes or often. And that’s up from 38 percent in 2017 in the U.S. Brazil has actually seen the largest increase, from 27 percent back in 2017 to now 54 percent of people either often or sometimes actively avoiding news. And, yeah, gender is one of the other kind of divides you see across a lot of countries, where it’s slightly higher among women than men. ROBBINS: And, Ginnie—and I want to come back, Ben, also to talk about what stories and what people are talking about, because I know you do focus groups or narratives. Ginnie, you wrote about this. And you also—you have a pretty close relationship with your community, not just covering it but you also have this advisory board. You know, obviously, your advisory board cares about the newspaper or they wouldn’t be involved in the advisory board. What are you hearing from people about—you know, does it seem that there’s a much higher percentage of people who are just turning off from the news? GRAHAM: Yes. And what prompted me to write the editorial was that there was a situation in Oklahoma where our state superintendent and state school board were going to take over the Tulsa School District. There was a threat to do that. And that’s a big move. It’s the largest district in the state. These are politicians sort of on the right wing of politics that were wanting to do this. And we’ve been covering it for about six months. Came down to the day of the vote, and my phone was about to melt down because everyone in my bubble, like, woke up that day and they were asking me, like, what’s happening? What happened here? And these are people who, you know, with—how do you sum up in, you know, text messages, what we’ve been reporting for six months? And that was just sort of an example of sort of what I think a lot of journalists deal with. The other reporters in the meeting had the same thing. For whatever reason, the community had been avoiding this very—in avoiding news, they didn’t know about this big thing that was happening until it was almost too late. And I’d also been seeing this pop up in our editorial board meetings with other officials and other leaders in the community. Like all editorial boards, you know, periodically the mayor will come in or state lawmakers or, you know, CEOs of companies. And it became pretty obvious some of them were not reading news, were not up to date on what had been reported. And at one point, I remember this in the last election, a candidate who was in pretty high office was very upset that we hadn’t been covering all the dark money. Why aren’t you covering dark money? We had written in the last—the prior two weeks something like eight stories on dark money coming in—and one of them was in his campaign, in his race. And so we were seeing these trends of things happening. And you can tell the difference of who does and who doesn’t keep up with the local or state or national news, just because of what they know and don’t know. And it’s a difference of their being told what’s being reported rather than just knowing for themselves. And in thinking about that, there are real consequences to that. You know, when you don’t know who’s running for the school board, right now we have a lot of extreme candidates running. 
And so for people—when you think of 42 percent of Americans choosing not to know the news, that’s huge. And so that allows people who may not have, you know, the best motives to come in. And that’s where my headline came from, was extremism thrives when people don’t know what’s going on. Because I’m seeing that play out in real time right now. ROBBINS: And it was a really good piece, and we’ve just shared it in the chat. So, Ben, your study has a lot of different reasons. I want to get into them. But in part, it talked about particular stories that turned people off. And can we start with that? Are people identifying—you know, what is it that they’re identifying that they don’t want to read about? Or when they say, when I read about it, it makes me not want to keep reading? TOFF: Yeah. I mean, it’s just when you really talk to people about, you know, their news habits and why they’re making the choices they are, there’s kind of two different sets of responses that people will give you. Some of them will actually say, it’s not really about the news. It’s about me. Not me, but themselves. (Laughs.) They’ll talk about—you know, they’ll say that they’re not a news person, they’ll say it’s just never been part of their personality. Many of them will point to very stressful or demanding circumstances in their lives. You know, they may be raising three kids, taking care of an aging parent, they got a lot of stuff going on. And so there is, I think, a very strong element of this that is, you know, about recognizing where people are and what they’re dealing with in their lives. The other piece of it, though, you do hear people saying it is not really about me, but it’s about the news. And there, a lot of it, it tends to be a feeling like news is too much doom and gloom, too much negativity. It creates too much anxiety. A lot of—particularly in the U.S.—a lot of people feeling like they can’t trust it because they feel like it’s—they have these kind of ideas about what journalists do that is kind of motivated by ideological agendas. And then there’s a lot of people in the category whose biggest complaint is that it’s just difficult to make sense of it. The news is tedious or boring, or that they aren’t making the connections between what’s being covered and why it’s relevant to their lives. And they’d rather focus on things that are kind of more close to home, that are easier to make those connections. And I think for many of the journalists on this call, probably you’re feeling like, no those—it’s so relevant to things in their lives. But the connection there is sometimes implicit and not drawn out in a way that I think a lot of people who are not particularly engaged with news, they’re not following it day to day. And so it’s just very hard to kind of dive in in the middle of a story and be able to make sense of it. And given everything else in their lives, they’re looking for a way to not necessarily spend an hour going through it, but they do just want kind of the basic highlights and why this matters. ROBBINS: Ginnie, do you find that—I mean, the dark money story is the sort of thing that you, like, want to kill yourself after you—(laughter)—written all this— GRAHAM: It’s like you spend all this time—you spend all this time on stories, and then you’re, like, is anyone reading this? (Laughs.) But, you know, it’s always been—you know, I’ve been doing it for thirty years. And it’s an interesting phenomenon, people saying it’s all bad news. I think that’s what people remember. 
Because at one point, I went back and pulled—because a school official was saying all we do is cover bad news out of the school. That’s all you do. So I went back to look at all the stories for a year that had been written about that school. And at least three-fourths would be what you would call, like, features—good news type stories. But all they could remember are the things that—it’s like you remember when you have the flat tire. You don’t remember all the times your car ran well. So I think that’s an interesting phenomenon because I kind of wonder if that’s something about our human psychology that works on us, because we do write good news stories. You know, things going well and right and what we can—but those aren’t the things people remember. The other thing that—I got into your research, which I really found helpful and fascinating—to try to make sense of what I was seeing locally, that this is a global and national issue, that cost comes up a lot. That we’re too expensive. And I always get a kick out of someone telling me it’s too expensive, as we’re sitting at Starbucks and, you know, our digital subscription’s, like, you know, $4 a month. But there is a real sense of news is too expensive. And though what I can’t find is any agreement on, OK, well, if this traditional model doesn’t work, who should pay for news? If we don’t want—they don’t want government paying for news. People don’t want nonprofits or foundations paying for news, and advertisers are a corporate sellout. So there’s no real consensus of who should pay for news, but just news should be free. So that’s another thing that I’m hearing from people who don’t keep up with the headlines is, you know, it’s a lot to take in, and I don’t have time, and it costs a lot. And I thought it would be bias popping up a lot, but that’s not necessarily what I’m hearing from people. ROBBINS: So cost was one of the things on the list in your study. And I have to give full disclosure here, I sat on the committee that made the decision to go pay at the Times, in the fervent belief that people had always paid for newspapers. This is just—but there was this sort of odd, you know, the internet must be free phase. And, of course, there would be no New York Times or any other newspaper in America if people didn’t have to pay for subscriptions. But for another day, but everyone here knows this. Can we talk about some of the causes here? I mean, on your list there’s a repetitive agenda, the news is a bummer, you know, they’re overwhelmed by news, there is the untrustworthy factor, of course, if you look at, like, the Edelman Trust Barometer and all these other things, nobody’s trustworthy these days. Too hard to understand. Can you sort of talk about what—having worked on this, Ben, for a while, and Ginnie as well, what do you think are the—are the real reasons versus the ones that people present because they then have to justify why they’re not—(laughs)—why they’re not engaging with the news? TOFF: Yeah, I mean, I think that, you know, as a social scientist, I try not to kind of categorize them as real or unreal if—what people are telling us is their reasons. But I do think, you know, you’re right. There are a lot of very intertwined factors here. And I think if you—I think to really put yourself in the perspective of a lot of people who are in this category, it’s a kind of lack of connection to individual news organizations. 
So a lot of people’s perceptions of news are about the kind of ideas they hold about news and journalism in general, not necessarily specifically your organization. And so their perception of, like, negativity in local news, it could be very much driven by what’s on the local TV affiliate, regardless of what’s in the newspaper. And, you know, for people who are less and less likely to have a kind of regular habit around consuming news in the way that people once had much stronger habits around, there’s kind of a more impressionistic relationship that people have with news that makes it very hard to kind of change those attitudes. But I think that’s fueling a lot of people’s perceptions about what this product is, along with the cost. So, yeah, it’s not a huge cost, but they’re thinking in terms of, like, relative to what they feel like they’re getting from it. And a lot of people have this perception, rightly or wrongly, I think often wrongly, but they have a perception that there isn’t a lot of difference between all the different sources of news that are out there. And so they feel like, you know, if I don’t get it from this particular organization, I’ll just Google it and see the headline somewhere else. And so there—you know, there are real differences in terms of what’s actually being reported and who’s putting the resources into actually gathering the news. But for a lot of users, particularly those who are really disengaged, they aren’t necessarily thinking about it in those terms. They’re thinking about, like, why would I pay for it—why would I pay for this product, when I feel like I can get all the other information I want that’s actually relevant to me from all these other sources. ROBBINS: So, but Ginnie, I mean, that raises some really interesting questions, because I could see people feeling—and I’m always saying to my students—I teach a course on media and politics—I’m saying to my students, you know, media is a plural word. Don’t talk to me about the media. But even my students, who self-select to take a course about media and politics, start out saying, well, you know, you in the media, and— GRAHAM: Right, you know, I do—as a matter of fact, I just got some complaints. I mean, as the editorials editor, in the opinion section I get the complaints. But I’m finding that I get more and more complaints about stuff that is on TV, or it’s on—I was trying forever to find this one story this one gentleman was complaining about. Well, come to find out, he had heard it on, like, NBC. Because I thought maybe it was an AP story. We hadn’t run anything on it. And so there is maybe that because there is so much out there, people, I think, have a hard time parceling out where they’re getting it. And it’s that idea of what is your source, that we’re so used to asking, what’s your source? But I don’t think the general public necessarily thinks about it. You know, they see a story, they think it’s out there, and everyone’s covered it. And so for us at the local level, because I am not covering that. I’m in Tulsa, Oklahoma. We’re spending our time covering the state capitol, our city councils. We don’t cover President Biden. I’m not at Congress. We cover our congressional delegation. And so I always kind of, from my end of things, come back to say: We’re your local newspaper. If you want to find out why your tax rates are going up, you really need to read us because, you know, Fox News isn’t going to cover that. 
And that’s on our end as local journalists in America trying to connect with our audiences to say, you can’t get this anywhere else. No one else is sitting in on the Tulsa County Commission meeting that’s going to be determining jail policy. So to a certain degree, we have to sort of make that connection with our populations. And where I kind of see, you know, you kind of talk about self-selection. There’s so much national news out there that my concern, where I live, is the lack of local news in some communities that don’t have a newspaper. They’re in rural Oklahoma. All they have are the national channels. And so there is a disconnect somewhat there that when we talk about just journalism overall I’m concerned about the lack of connection of what’s happening at the local levels to our local populations. Because that whole idea of media, it’s not coming—it’s coming from everything from Fox News to MSNBC, to all of that. And so I think that sometimes that is overwhelming, and they’ll just tune it out. And then they’re tuning out everything. ROBBINS: So to go back to the—and I do want to get to—talk a little bit about ways to fix this. And I also know that Ben has to leave in twenty-two minutes. So we have limited time today, and I want to turn it over to the group. But just to look at, if you were going to put sort of an order on this, I suppose, based on what people say is the most important, how much of this is that we live in really grim times and the news itself is just really grim? Accurate news is grim. I mean, we’ve got wars going on. We’re just coming out of this pandemic. People’s economic situation isn’t great. You’ve got the polarization in the country. And that’s—a lot of this stuff is global. How much of it is the fact that accurate reporting is talking about—it’s not just that, you know, if it bleeds, it leads. It’s just that it’s accurate. It’s true. And how much of this is more of a reflection of political polarization and this is one more institution that people don’t trust? I mean, Ben, do you have a sort of sense of that? Is it content or is it institutional, I suppose, is what I’m asking. TOFF: Yeah, I mean, I think it’s both, and it’s also—I think there’s a third piece of it that has to do with kind of the growing—and this is one of the things that technology has really afforded—which is, you know, it’s easier—there are more and more voices, more and more places for people to express themselves. And I think in many ways, there’s a lot of positives that come from that in terms of pushing news media to be more representative of the communities they’re trying to serve and to incorporate more of those voices. But it also means there’s a lot more disagreement about kind of what is the truth on any given matter, which makes it that much harder for people to navigate the media environment at the same time. So I think it’s all these things at once, which, you know, does pose—(laughs)—a very complicated series of challenges. I think if I had to point to one thing in particular to my mind that’s unique, because of course there’s always been really negative, complicated, depressing moments in history, I think one of the things that’s really kind of at the core of this is the changing habits that people have around the ways that they use media and find information. And that’s, you know, tied to the kind of longer trend of the internet. 
But I think at the center of it is kind of people’s daily connection to individual news organizations has really frayed, because they’re just not consuming in the same ways that they once were. ROBBINS: So part of it is just exhaustion, because there’s just this firehose of information coming at them and they’re scrolling their phones all the time, and there’s just information blaring at them all the time. And so the notion that there is—carefully curated news out there, it’s just seen as one more thing blaring at them? TOFF: Yeah, it’s combined, though, with this attitude I think a lot of people have that the smart way to respond to that environment is to avoid it, to not be taken in and be manipulated. And so they feel like to actually be a good citizen, I need to be resistant to all this information I’m seeing. Rather than, I think, for those of us who do consume a lot of news, we have a sense of, like, which ones we can trust, which organizations are doing things differently than others. And we kind of come to it with a sense of how to—how to navigate it, which I think can be really empowering. But for a lot of people who are distrusting or avoiding news, they see that as naïve, that those of us who have organizations that we do trust. ROBBINS: So, Ginnie, you have the advantage that people know you. You’ve been there for—people know your paper, you’re in a smaller community than New York. And they know you. You’re the editorial page editor of this—of this paper. And it’s been almost an act of faith that people trust local newspapers more than they trust national newspapers. You’ve seen this. You guys must be sitting around talking about how to deal with it. I mean— GRAHAM: Oh, yeah. I mean, I would say that it’s not—I don’t think it’s the news itself, because we’ve been—our country has faced a lot of tragedies. And we’ve—I don’t think it’s that. I think that there is—and we did start seeing it, you know, under President Trump’s, you know, leadership. Because, I mean, he had two rallies here, remember, in Tulsa. And I just remember, we had a political reporter who’d been covering politics for decades, you know, being pointed at and saying: That’s the enemy. And then people we knew, you know, people that we covered their high school football games and their theater programs, you know, they’re yelling at us. And it was a weird theater to be in. And so there has been this sort of trickle-down distrust that, you know, this was a national—I mean, it was the weirdest thing we’ve ever been involved in, because you really felt like I’m in a theater production and I didn’t know it, you know? (Laughs.) We were like, OK, these same people would turn around the next week and want us to cover something else. But for that moment, they didn’t trust the media, they didn’t like us, and because they were told that’s the case. And so we are constantly trying to figure out how do we, you know, connect with our local community? And I also kind of wonder if part of the tuning out is there’s nowhere they feel—because we—I do think there’s a big middle. Because we talked about polarization. And there are certainly two sides. And in Oklahoma, it’s definitely a red state, but I hear people in the middle saying: I just don’t know where I fit. And so I think those people tune out, because if you have more analysis than you do news, and they don’t feel like they fit into those, you know, ends, I kind of wonder if part of that is what’s leading people to say: I don’t see myself reflected here. 
ROBBINS: So we have a question from Mary Ellen Klas. Ms. Klas, will you identify yourself and ask your question? Q: Hi, yes. I am a reporter with the Miami Herald and have been covering government and politics for more than thirty years. And, you know, there was a time before the internet when we actually did man-on-the-street interviews and, you know, before people relied on social media. And back then—back in the, you know, ’80s and ’90s, it was—there was a lot of focus on national politics because of cable and the network news. And I’ll tell you, I never have—I can’t remember any time when we did these man-on-the-street interviews where people actually could identify who—people sometimes didn’t know who their governor was. They rarely could tell you what the controversial issue was before the school board. So I think—I think news media—maybe that was selective avoidance, but I also think that is kind of where we are. So my basic question here is, is it really an issue of—you know, are people really selectively avoiding the news? Or is it the fact that, as Ben was kind of mentioning, people no longer have a connection to their local news source? And I feel like it is the disappearance of local news that has led to the fact that people don’t trust. They used to—you know, how many times have you seen somebody put a clipping on their wall because they were quoted in the local paper? There was some connection. Every politician I’d connect with knew a reporter. That no longer happens. And so I think it’s—I’m wondering how much is selective avoidance and how much is really about the deterioration of our local news ecosystem? GRAHAM: I’ll let Ben take the first— ROBBINS: My husband worked at the Miami Herald for eleven years, and you’re breaking my heart. I just want you to know that, but—(laughs)—you still have a paper, you know, Ginnie, in your—in your city. It sounds like it’s reasonably robust. GRAHAM: Well, I mean, to be honest, I’ve been here thirty years. When I started, our newsroom had about 180 people. We’re at about sixty. So, I mean, it’s—in the thirty years. And so, when you say people don’t know a reporter, there may be no reporters. I mean, there’s not as many to know. But I have the same concern. I do think there is news avoidance more than before, just by what I’ve seen and experienced and heard. But there is this disappearing of—you know, we’re Tulsa, we’re the metro area. It’s all the papers around us, you know, the small town in the panhandle of Oklahoma used to have a lot more papers. And so, but when that goes away—and this happened with my mom. She lived in a rural part of the area. And the bridge that took her to and from her community was out. She had no idea why it was out. There was no newspaper, no media to explain to people why is this bridge out? But she knew exactly what was happening in the White House that day, because the only news that she had access to was that national news. And so sometimes that—whatever the national rancor is, we kind of feel that on the local level. And so, but I will say that I think that more people—this was sort of, you know, talking about good news, we’re bringing back our newspapers in education program, which was kind of big in the ’90s. And our company, which is owned by Lee Enterprises, was offering to teachers a complimentary digital subscription to the Tulsa World, our local paper, and that would give access to their students. And they were hoping for, like, you know, a couple of hundred teachers. 
And, like, within a week 522 teachers signed up and 31,000 students. So that was great news. To me, it shows people want local news. They want to teach it. They want to learn from it. And so I don’t think it’s completely doom and gloom. I do think there are people out there, but maybe—but I was thinking, maybe that’s the cost issue. And so we’re always at the local level trying to figure out how do we connect? How do we find that, so people do know a reporter at their local paper? ROBBINS: So, I mean, that’s fabulous. I mean, that’s exciting to me, as someone who’s now a teacher. (Laughs.) GRAHAM: Yeah, and it’s still going up. So I’m excited about it. ROBBINS: So, Ben, you teach in a journalism school, which means that people are still paying to study journalism in hopes of getting careers in the business. So you must be giving people advice about how to—you know, how to produce news that people are going to want to read. So give us some—give us some advice here, Ben. (Laughter.) TOFF: Yeah, I mean, I think—so, yeah, I’ll start there. I mean, I do think that there are things that can be done on the kind of content side to meet people where they are in terms of, like, particularly the very disengaged audience. So thinking in terms of really focusing to communicate the relevance of a particular topic for somebody who’s only, like, half paying attention to this, and they’re going to see this in between eighteen other things are doing, and haven’t necessarily been following it. And so I think that there’s opportunity there to kind of get more targeted in terms of the way that you present a story for these different audiences that maybe go more in depth or less in depth, when people are looking for that. I think, obviously, you know, when it comes to like, a lot of negative news, there’s—you know, I’m not going to say anything that people haven’t heard before in terms of solutions journalism, but, you know, focusing on things that can actually be done, not just, like, hitting people over the head with all these horrible things happening over and over again. And then I think—you know, to be honest, I think a lot of it is really about the kind of larger relationship that people have with news, which is not always about the content. A lot of what I think—if we’re really honest about what drew people to local news or news in general, it’s often a sort of cultural connection, cultural practices that go back to habits. It goes back to communities they’re a part of, and the extent to which people talk about what was in the paper that day or what was on the news that evening. And are sort of embedded in larger communities of social ties where news as a kind of—news consumption as a practice is something that people enjoy, and get something out of, and it’s, like, part of those connections with other people in their communities. And to the extent that that’s going away, which is a matter of not only about the news itself but also a matter of these different ways in which people have to spend their time with all sorts of different media that just weren’t around before, they’re—so, yes, news avoidance has always been around in some form or another. But I think there was a lot more news consumption that was happening incidentally as people were keeping the TV on after primetime, and they would see the local evening news because of that. Or their family would always get the local paper because they’d always gotten the local paper. 
It was always around when people were exposed to—and then ultimately developed these habits, which became kind of ritualized, became a kind of a touch point in their lives, that they got a lot out of that wasn’t even about the actual content of the information. And those practices have gone away for a lot of different reasons. And they’re probably not coming back. Finding ways of kind of becoming part of people’s regular routines that they do have, whether it’s through newsletters or whether it’s through podcasts or things that kind of fit into the modern ways that people live their lives, I think is essential to be able to be sort of top of mind when people think about where they might turn for information. Because the other thing is, like, there’s a lot of local information that people can get from all sorts of places that are not the local news organization. The things that people really value local news for in the past, like the weather, or restaurant openings, or, you know, things that—you know, sort of announcements about people’s weddings or deaths or—these things that were sort of so essential to local news in the past, there are lots of other places that people turn to for that kind of information now. And so that’s what news organizations are competing with. They’re not just competing with other news now. They’re competing with all those other tools that people have at their disposal that people like. ROBBINS: So I want to ask Ginnie about whether there are any cool additional ideas in her newsroom. But we have a whole bunch of local journalists, or journalists, on this. Come on, you guys. You’re awfully quiet today. Is anybody doing anything cool in your newsroom? You have any suggestions, any ideas, debates going on? Susan Gile Wantuck, please. Q: Hi. I’m not—I don’t have a response to your last thing that you said. But I work for a public radio station in Tampa. And one of the challenges that we have is with the changes in people’s habits post-COVID. A lot of people aren’t listening in the car because they’re not commuting. So that’s affecting our bottom line. And I just—I just wondered if you all had any ideas about that. I mean, we’re trying to do more podcasts that are relatable and things that people want to hear. But other than that, I don’t know how to alter what we do to make ourselves more relevant and indispensable to people who are consuming news, or who have always consumed news from our station. They’re just listening less and because, of the economy, that’s probably why they’re not giving as much. GRAHAM: I mean, we’ve had really good response with podcasts. I mean, we’ve been playing around with it for a few years. I do see your issue with when people are wanting the headlines in the car, because that’s where I—you know, I turned to my local NPR station and get the headlines. But I do find people are listening more because they’re at the computer more. And so we’ve—for locally—what I do is I think, for people who are writing op-eds for me, I use the podcast to delve more into that. And depending on what the topic is and who I’m interviewing—you know, if it’s a lawmaker who’s doing something kind of big or controversial, they want to know more of, it’ll have bigger numbers, of course. So I always sort of use it as an extension of what’s in the paper. And then—(laughs)—I always tell them, I go, don’t give everything away. You want them to read what you’ve written for the opinion section. And so—and sports always post big numbers, because people love sports. 
I mean, it’s Oklahoma. We just beat Texas. You know, it’s a big thing. But we also have—and right now the Killers of the Flower Moon movie is getting ready to come out this month. And so we’ve done a lot around that, done a lot of podcasting around that. So I do find when people don’t read, sometimes they will listen. So we’re still playing with that. And we’re having some pretty good response from it. ROBBINS: So, Ben, can we talk about—a little bit about who the audience is? I mean, do we have to write off the right and the left and just focus on the shrinking middle for straight news? Because one of the things in your study I think I saw that sort of freaked me out was people are even moving away from Facebook. Now Facebook’s moved off of news. But people are moving on to Telegram and WhatsApp because they’re moving away from even MSNBC and Fox. That they seem to want to go even into a tighter and tighter bubble for their conversation, which is a pretty—to me, a pretty frightening notion. You know, we saw the news bubble in 2016 as being just separated between two universes. But the notion that you can have an ever-tighter black hole there. Are we basically just focusing on, you know, the swing vote here, and just see our audience that way? TOFF: I guess I would think about it as kind of more than just those—that kind of left-right dimension. That there’s actually a larger set of people who are—it’s not so much that they’re necessarily in the middle, but they’ve kind of—they’re less interested in politics altogether. And so, you know, when you ask them they will have a hard time placing themselves on the right or the left. They might not—they might have some ideas that are—you know, you would recognize as being very right-wing or very left-wing. But they—most of them, they want nothing to do with politics. And there’s a larger segment of those people in this country than there are people who really see themselves on the left or the right, or see themselves as moderate. And I do think that there’s a real, I think, both need and opportunity there to kind of focus on that set of people. You know, I think that there’s—it’s very clear that the left is currently—feels very well-served by the sources of news that are out there. And, of course, there are some—there’s people on the right who consume tons and tons of news, and they have lots of sources that they can turn to as well. But I think so much of our—as we’ve moved towards more of a, like, subscription model for most newspapers, and as local news has gotten smaller and smaller, everybody’s kind of fighting for—because those are the people most likely to subscribe, are the people who are most politically engaged, and they’re the ones who are kind of most interested in the, like, ins and outs of every little political story. But that, I think, has had an effect on—for the rest of the public, who’s not that interested in politics—which has always been the case. There’s always been a large segment of the American public who’s not particularly interested in politics, not particularly knowledgeable. But there are fewer news outlets that are really aimed at, when it comes to their political coverage, really trying to communicate to the audience because the incentives aren’t really there. And so I do think that’s, I think, a need and also, I think, a way—I don’t know, I do think of it as maybe an opportunity to try to engage with that set of readers, or listeners, or audiences differently. 
ROBBINS: Well, I know we only have a couple of minutes left because I know you have to catch a plane and we promised to tie this up. But, Ginnie, I’m going to ask you an out-of-left-field question here, but we are at the Council on Foreign Relations. You seem to sort of have this demarcation, and maybe I’m reading it wrong, but, I mean, I was horrified by the notion that your mother’s cohort didn’t know why this bridge was down. And clearly my first reaction was, where was Pete Buttigieg? (Laughter.) GRAHAM: Not in Oklahoma. ROBBINS: Who should have been—he was too busy fixing I-95, right? But does that mean that you all shouldn’t be, you know, writing about the war in Ukraine? Or do you only write about the war in Ukraine if there are Ukrainian refugees in your community? I mean, how do you take something as big as that and make it relevant for your local readership? Or do you just say, well, I’ll let the wire services do that. I’ll let the New York Times handle that. I’ll let, you know, the national news handle that. That’s not my job. GRAHAM: No, we still, when something big like that happens, there’s a local connection. I mean, we’re—we have 650,000 in our, you know, metro area and, you know, a million when you get out to the MSA. I mean, it’s—we have Ukrainian refugees here. We have, you know, a thriving Jewish community that is very much affected by what’s happening in Israel. And so we still have, you know, reporters, and we run that national story, but we have our local reaction. Because I think the local—because people know who these local people are. And, you know, we’re curious. Do you have someone that you’re worried about? How does this affect you? And then we also, you know, have people who are—you know, also we have Palestinian people, you know, people who have Palestinian families who, you know, are worried, how—you know, how will this affect us? You know, we don’t, you know, make that differentiation between Hamas and Palestinian, especially when you’re in a place like Oklahoma, where people may not know the ins and outs of the history of that conflict, to make that clear. Because we don’t know what they’re hearing on national television. But no, that’s still very much, I think, an important part of what we do, because, you know, we’re a diverse community. Politically, we don’t look diverse from the outside of Oklahoma, but for those of us who live here, we certainly are. And I think that’s our job, is to point out those different perspectives and people living here. And, you know, you talk about how most people aren’t political, and I would agree. But I think sometimes so much politics is just—it’s a turn off for people who particularly aren’t tuned in. But I do think that’s where some of the other things that media companies do—those food reviews. One of our biggest podcasts was we have an old Casa Bonita. It’s a cheesy restaurant that, you know, was on South Park, if you guys don’t know. But it’s empty. And we had the owner of the building do a podcast with us on what’s going to happen to that building. We had—I mean, it was like through the roof. So people will come to us for the different kinds of content. And I think as long as we keep doing that, doing just interesting local stories, eventually that trust is built. That, you know, we did a good job, we provided them information they are just naturally interested in. So when maybe something political comes up, we’ve already sort of established that we’re here doing that kind of work. 
So I think just—it sounds cliché—but just doing good journalism, I think, really is at the heart of what we have to continue to do. ROBBINS: It’s a wonderful place to end. We’re going to—Ben, thank you so much for doing this. Ginnie, thank you so much. Thank you, everybody. We’re going to share Benjamin’s study and some other clips with you all. And I’m going to turn it back to Irina. And, Ben, run to your—run to your plane. TOFF: Thank you. ROBBINS: Run, run, run. Don’t fall. FASKIANOS: Yes. Run, run, run. Thank you all. GRAHAM: Thank you. I loved your research. It was very helpful. TOFF: Thank you. FASKIANOS: It was. And we will share the links again, along with the link to this video and transcript. And, again, please do come to CFR.org, ForeignAffairs.com, and ThinkGlobalHealth.org for the latest developments and analysis on international trends and how they’re affecting the United States. We have a lot of information about the Israel-Hamas conflict on our website. So I encourage you all to go there. As well as tune into Carla’s podcast, The World Next Week. And thank you, again. We ask that you send us suggestions for future webinars. Email us at [email protected]. So thanks to Ben Toff, who’s already signed off, Ginnie Graham, and Carla Robbins. ROBBINS: Thanks, Irina. GRAHAM: Thank you. ROBBINS: Thanks, everybody. (END)
  • Economics
    CEO Speaker Series With Mathias Döpfner
    Play
    Mathias Döpfner discusses global trade, political polarization, the role of media in foreign policy, and lessons learned as chairman and CEO of Axel Springer SE, a multinational media company and owner of U.S. media brands including Politico and Business Insider. The CEO Speaker Series is a unique forum for leading global CEOs to share their insights on issues at the center of commerce and foreign policy, and to discuss the changing role of business globally.
  • Sub-Saharan Africa
    A “New Scramble for Africa”?
    When a phraseology says more about its users than the reality it purports to describe.  
  • Education
    Academic Webinar: Media Literacy and Propaganda
    Play
    Renee Hobbs, professor of communication studies and founder and director of the Media Education Lab at the University of Rhode Island, leads the conversation on media literacy and propaganda. FASKIANOS: Thank you, and welcome to today’s session of the Winter/Spring 2023 CFR Academic Webinar series. I am Irina Faskianos, vice president of the National Program and Outreach here at CFR.   Today’s discussion is on the record, and the video and transcript will be available on our website, CFR.org/academic. As always, CFR takes no institutional positions on matters of policy.   We are delighted to have Renee Hobbs with us to talk about media literacy and propaganda. Professor Hobbs is founder and director of the Media Education Lab and professor of communication studies at the University of Rhode Island. Through community and global service as a researcher, teacher, advocate, and media professional she has worked to advance the quality of digital and media literacy education in the United States and around the world. She is a founding coeditor of the Journal of Media Literacy Education, an open-access peer-reviewed journal that advances scholarship in the field. She’s authored twelve books on media literacy, published over a hundred-fifty articles in scholarly and professional journals, and she was awarded in 2018 the Research Excellence Award from the University of Rhode Island.   So, Renee, I can think of no one better to talk to us about this topic, very important topic, that you’ve been researching and advocating on for over thirty years. So, let’s start by defining media literacy and propaganda and why it is so critical for all of us to deepen our understanding of these topics.   HOBBS: So happy to be here, Irina. Thank you so much for the opportunity and the invitation to a dialogue.   I’ll take about—I’ll take about ten minutes and talk about media literacy defining and propaganda defining, and then we can have a robust and vigorous exchange of ideas. I’m looking forward to questions and comments from everyone who’s joining us today.   Why don’t we start with the phrase media literacy because media literacy is best described as an expanded conceptualization of literacy. So just as we think about literacy as reading and writing, speaking and listening, media literacy includes critical analysis of media and media production.   So to be an effective citizen in an information age, reading and writing and speaking and listening is no longer enough. One has to be skillful at critically analyzing all the different forms and formats and genres that messages now come to us in, and one has to be effective in communicating using media, using digital platforms.   So media literacy is literacy for the twenty-first century. Now, media literacy is sometimes taught in schools and often taught in the home but maybe not taught enough. The best evidence we have in the United States is that about one in three American students gets some exposure to media literacy in their elementary or secondary years, and because of deep investment in media literacy by the European Union, the European Commission, and quite a lot of research work happening in the twenty-eight member states, there is a robust and global community of media literacy educators and they come from very different backgrounds and fields.   They come from psychology, they come from sociology, they come from journalism, they come from education, they come from the humanities, even the fields of art and design. 
So to be media literate actually includes, if you think about it, a lot of different competencies, not just the ability to fact check, and media literacy isn’t just about news and information because we use media for all kinds of purposes, right, as media inform, entertain, and persuade us. And so media literacy considers media in all its complex functions as part of daily life.   OK. So how about the term “propaganda”?   Irina, this is a much harder word to define and, actually, some people have quibbled with me about my definition of propaganda. But my definition of propaganda is rooted in a deep understanding of the way the term has been used over, well, 400 years now.   In its original formulation propaganda was spreading the Gospel, the good news, as the Catholic Church tried to spread its messages about faith to people around the world. In the twentieth century the term began to be understood as a way to unify people. Propaganda was a way to build consensus for decision making, especially in democratic societies.   And then, of course, during the middle of the twentieth century it took a darker turn as we recognized how Nazi propaganda was used to lead to genocide, right, to destroy—to attempt to destroy and to create mass murder. So the word propaganda is kind of loaded with that twentieth century history.   But, yet, when we lived through the pandemic—here you are. You lived through it, didn’t you? (Laughs.) You lived through the pandemic because you got exposed to what I would call beneficial propaganda—propaganda that told you to wear a mask, propaganda that told you to get vaccinated, propaganda that said use social distancing.   So to understand propaganda and all its complexities we could say propaganda is communication designed to influence behavior, attitudes, and values, and propaganda is a form of mass communication, right.   So it isn’t persuasion that just happens, you know, you and me deciding, you know, should we go for pizza or Chinese for dinner tonight, right. I’ll try to persuade you. You try to persuade me. When we do it to large numbers of people and we use mediated symbols we’re engaging in propaganda.   So propaganda is a really important concept. Its meaning is situational and contextual, which is why when I work with students I often talk about how our understanding of propaganda is inflected by our cultural histories.   So, for instance, when I’m working with educators in Croatia, having had a long history of influence in the Soviet and, you know, in the communist era, their understanding of propaganda is inflected by the exposure to state-disseminated messages. And so the meaning of propaganda in your country and your cultural context might differ.   In Brazil, Irina, the word propaganda just means advertising, right, and advertising is a type of propaganda. Diplomacy can be a form of propaganda. The actions of government, politicians, can be a form of propaganda, but so can entertainment function as propaganda and so can education.   So propaganda is a really rich concept. Why is it important? Why is it important that we use media literacy skills like asking critical questions about media with propaganda?   
Well, because propaganda tries to influence us by bypassing our critical thinking and the best way that propaganda has tried to change our behavior and influence our attitudes is by activating strong emotions, simplifying information, appealing to our deepest hopes, fears, and dreams, and attacking opponents, and these four mechanisms of propaganda can be used responsibly or irresponsibly.   So we are vulnerable to the terrible side of propaganda if we aren’t vigilant.   FASKIANOS: Fascinating. So in terms of the literacy how are you teaching this? Are you teaching students how to discern between the propaganda that is the good propaganda and, I mean, what—how do you make that distinction?   HOBBS: Got it. So students—propaganda—there’s a bunch of big ideas about propaganda that are really useful to understand. One is propaganda is in the eye of the beholder. So I don’t try—I don’t tell students what’s propaganda and what’s information, right. I encourage students to engage in a process of asking critical questions to come to their own conclusions about that.  And just want to show you one tool I use, Irina, in my teaching, I call it the media literacy smart phone. I’m going to show it to you a little bit so you can see it. The smart phone has some buttons on it that invite you to ask these questions like this one. Reality check—what is accurate or inaccurate about this message? That’s a good question to ask when you’re trying to determine whether something is harmful or beneficial propaganda.   Or how about this one? Public gain or private good—who’s making money from this message? Answer that question and you can often gain insight on the difference between harmful propaganda and beneficial propaganda.   Or how about this one? What’s left out? You know, the best way to spot propaganda is to notice what’s missing, right, because all media messages have a point of view, right. All media messages are selective and incomplete. So to identify the point of view of a media message notice what’s missing, what’s not being said, what’s left out.  There’s the values check button, the read between the lines button, the stereotype alert button. Propaganda often uses stereotypes to create in groups and out groups. If you’re in the in group propaganda feels really good—(laughter)—and if you’re in the out group you are being painted as an enemy, a villain, a dangerous person. Solution’s too easy. And record—save for later, you know, with the world we live in where we’re constantly swiping, clicking, we’re devoting only a few seconds to media messages because we’re moving so fast through so many of them.   This button reminds us that we actually have to make choices about what to pay attention to, what to allocate our attention to, and that means we sometimes have to slow down, right. So learning to allocate your attention and decide which messages deserve your attention and which messages don’t, these are all media literacy competencies.   So we aren’t telling people what to think, right. We aren’t—we aren’t naming that’s misinformation, that’s malinformation. We don’t do any of that. What we do is invite people to ask critical questions like who’s the author and what’s the purpose? What techniques are used to attract and hold your attention? How might different people interpret this message differently? What’s omitted? What are the values presented?  
We want people to think for themselves because media literacy is a literacy practice and when people have these habits of mind built in, when they use them automatically when they’re reading the news, when they’re being persuaded, when they’re being entertained, then this goes back to the Enlightenment, Irina. We trust that people can differentiate between quality and junk, right, when they put in the cognitive effort, when they’re effortful and strategic. And this kind of work can’t be done by yourself. It has to be done with others.   I mean, think about that question, how might different people interpret the message differently. This is why discussion and dialogue are so critically important to analyzing propaganda and to developing media literacy competencies.   FASKIANOS: Great. Fascinating.   Let’s go to questions to the group. We already have a few in the chat. You can also raise your hand and I will go back and forth between the two. If you do write your question in the chat or the Q&A box, please tell us who you are.   So I’m going to go to the first question from Andrew Jones, who’s an assistant professor of communications at Davis & Elkins College in Virginia. Would you draw a distinction between propaganda and public relations or do you see the two terms as interchangeable?   HOBBS: Ha ha, great question. Of course, I’ve had vigorous discussions about this in—with my students and with my colleagues. In 1928, Edward Bernays wrote a book called Propaganda and it became a classic in the field of communication. He very quickly recognized that the term propaganda was so negatively loaded that he changed the name to Public Relations.   So the grandfather of public relations understood that the words propaganda and public relations are, I would say, kissing cousins. So I don’t generally differentiate because I like—I think there’s a lot of utility to using the word propaganda in its big tent meaning, right.   So we—what we don’t want to do is just have propaganda be used as a smear word. That’s the term Neil Postman talked about when he said, you know, that’s a shortcut to critical thinking, right. By labeling something propaganda, i.e., bad, now you don’t have to think about it. Now you don’t have to ask critical questions, right.   So we want people to—we want to do whatever we can to make people think. So advertising can be a form of propaganda, right, and education can be a form of propaganda and entertainment can be a form of propaganda, and to determine whether you think it’s propaganda or not you really have to look very carefully at the form, the context, the audience, the purpose.   You have to really look at the whole rhetorical situation, make that determination yourself. What do you think? Are propaganda and public relations the same or are they different?   FASKIANOS: OK. People are not raising their hand but they’re writing their questions. So the next question is from Chip Pitts, who’s a lecturer at Stanford University, and it kind of follows on to what we were just talking about, to distinguish between propaganda and truth or falsity if that distinction is important.   HOBBS: Oh, that’s a great question. This goes back to the earliest definitions of the term propaganda when it has long been recognized even at the very beginning of the First—when the First World War happened and propaganda really was becoming a tool used by governments, right, it was recognized that propaganda works best when it uses truth, right.  
So propaganda can use truthful information, half-truths, or lies and, of course Goebbels was famous for saying that the best propaganda is truthful, right. (Laughs.) So propaganda can be truthful and very, very dangerous, right. Very harmful.   And so I think it’s important to recognize that propagandists use—can use truth, half-truths, or lies.   FASKIANOS: Yeah. So how do you, though, distinguish or have people, if you’re not telling people—you know, you’re teaching students how to think critically, which is so important. But as we saw with January 6 there is a subset of people who do not call it an insurrection.  HOBBS: Right.  FASKIANOS: You know, so we do have different groups that have—are using a different basis—set of facts. So what do you do in that case?   HOBBS: So media literacy is really rooted in this idea that we are co-learners in the search for truth and that none of us have a handle on it completely and we all need each other to apprise the complexity of what’s going on in the world.   So dialogue and discussion becomes a really central pedagogy of media literacy with this idea that we want to engage with each other with—from a position of intellectual humility. When I come into the classroom and I decide you can only call it an insurrection and if you call it a riot there’s something wrong with you, then I’ve created an in group and an out group, haven’t I? And I’ve set up a hierarchy that says if you agree with me you’re right and if you don’t agree with me you’re wrong. I can’t really have a discussion, can I?   That discussion is going to be false or artificial. It’s going to be stilted. Some people are going to be silenced in a discussion where I set the terms of what truth is, and that’s the very phenomenon we’re trying to fight against, right.   But if I come in with these critical questions and put you in the position of having to say how are they grabbing my attention, what is true, what seems accurate and inaccurate, how are stereotypes being used, right, then you have to engage in some genuine thinking.   And so teachers take—in that position don’t take—choose to take—choose not to take the position of an authority telling people what to think but, really, as a co-learner guiding with critical questions for students to come to their own conclusions about that.   FASKIANOS: Mmm hmm. That’s great. All right.   So we do have a raised hand. I’m going to go to Beverly Lindsay. And, Beverly, if you could tell us who you are—I know who you are, but for the group.   Q: I’m Beverly Lindsay, University of California multi-campus.   I spent a number of years working in the Department of State, in particular the Bureau of Educational and Cultural Affairs, and I’m still doing some funded programs from them. And years afterwards I was able to speak with the late Secretary of State Dean Rusk. I wasn’t in the State Department when he was there so we’re talking about a more recent period.   One of the statements that he made to me was the best propaganda has no propagandistic values. Years later when I was an international dean at a former university the executive vice president and the provost said to me, because this is a university wide program, that getting a Fulbright was simply propaganda in developing countries.   So you had two different views from two knowledgeable people. How would you think we might think about those type of responses now? He valued the—if you got a Fulbright to Oxford?  
HOBBS: I really love this question, Beverly, and I actually do—I do something on this with my students as we look at the Voice of America, right, and we look at, well, this is journalism, right, and it’s journalism that’s designed to bring diverse perspectives on world issues to people in countries where they may not have this kind of journalism and, at the same time, there is a distinctly American ideology to this kind of journalism, right.   And so there’s a very interesting way in which maybe both of those ideas, maybe both of those frames that you just presented to us, maybe both of them are true, right. And I feel like it’s quite liberating to acknowledge that there’s some truth in both of those ideas, right, that the best diplomacy doesn’t have a propaganda intent and that soft power in whatever form it takes is strategic and intentional and it’s designed to accomplish a policy objective.   FASKIANOS: Great. So I’m going to take the next question written. Oh, Beverly has raised her hand. So I think there’s a follow-on before I go to the next one.   Beverly, do you want to follow up? You’re still muted.   Q: Sorry. If someone has a Fulbright to University College London or Oxford or one of the redbrick universities in the United Kingdom why would that not be propaganda in one country and not in another? Are we assuming that the people in England are more sophisticated?   HOBBS: Hmm. I like—I can’t speak to the specifics of that situation but I do think that one of the reasons why we say that propaganda is in the eye of the beholder is that meaning is not in texts. Meaning is in people, right. So as we humans try to use symbols to communicate and express ourselves, right, there’s slippage—(laughs)—right, between the meaning I’m encoding as I’m using language and words right now, right, and the meaning that you’re interpreting, because I’m making my choices based on my cultural context and you’re making meaning based on your cultural context.   So that humility, the humility of recognizing that we’re imperfect meaning makers, let’s be in a position where, again, both points of view might have validity, and one of the pedagogies that we try to emphasize in media literacy is listening with genuine curiosity and asking good faith questions with genuine curiosity is more generative of learning than asking questions or using questioning as a mechanism of attack, right.   And so we can see in our public discourse right now that we all are—we all have learned very well, right, how to weaponize information, right—(laughs)—how to use it for powerful purposes. But when we’re talking about education we adapt this stance of being open to the multiple interpretations that exist in any given context. So that’s the only way I can respond to that question.   FASKIANOS: So I’m going to go next to Asha Rangappa, who’s a senior lecturer at Yale.   It seems that the question is source and intention, not truth. Russia can say something that is true, but if they do it by covering up that they are the source of that content—black propaganda—with the intention of causing division and chaos, that’s still propaganda. So can you talk about how Russia is using propaganda in the war—in their war with Ukraine?  HOBBS: Oh, absolutely. What a great question and thank you so much for pointing out a very, very important—there’s two really important ideas in your question that I want to just underline and amplify.   
One is that to be critical thinkers about propaganda the first question we want to ask is who’s the author and what’s the purpose. So many propagandists try to disguise that authorship, right, and there are so many ways to do that.   It’s so easy to disguise your identity. You can use a technique called astroturfing, which is you can set up a nonprofit organization, give it a little bit of money, and it sends out the message, right, and you, the company or government, whatever you are, you have some distance from it.   There’s, of course, sponsored content. It looks like it’s news but it’s really funded. It’s really propaganda. It’s really a form of—it’s an influence operation. So the first thing we want to try to do whenever we can is figure out who made the message and what is the purpose, and that’s why your second point is so, so important and I want to amplify this idea, this question about intentionality—what’s the author’s purpose.   But there’s something complicated about that, too, which is that intentionality is fundamentally unknowable. (Laughs.) I mean, we can make inferences about intentionality. But that’s what they are. They’re inferences.   Now, that being said, of course, we definitely see the very many and very creative ways that Russia has been active in creating and stoking and leveraging in groups and out groups to deepen divisiveness in this country and all around the world and in Ukraine and well before even the invasion of Crimea.   The Ukrainians were very much tuned into this and some of the best work happening in media literacy education was happening in Ukraine even before Crimea because they were so clearly aware of how propaganda was being used to create division between Ukrainians.   So this is partly why one of the things we want to help students recognize is how in group and out group identities can be amplified or weaponized through the power of language, right, the words we use to describe others, right, through the power of symbols and metaphors, and this goes all the way back to George Orwell in the 1930s, who wrote brilliantly about propaganda, and said basically every time humans open their mouths they’re persuading, right—(laughs)—by the very word you choose, right.   Irina, you chose insurrection. I chose riot. In the very choice of language we’ve got a point of view there, right.   FASKIANOS: Mmm hmm.  HOBBS: As we have, like, heightened consciousness about that then that really helps us recognize the very subtle forms that propaganda can take, and I think in the case of the Russian propaganda we see some brilliantly devious and terrible ways that propaganda was used to divide Americans and to polarize, and the polarization that we’re now experiencing in our country was created intentionally and strategically and is still being created intentionally and specifically by a whole bunch of different actions, not actors, not just foreign agents, I might add.   FASKIANOS: OK. So I am going to Holley Hansen, who is a teaching assistant professor and director of undergraduate studies at Oklahoma State University, asks even if people are able to teach media literacy techniques to people how do you counter the impact of the algorithms in social media, especially when they seem to reward extremist messages?   HOBBS: Yeah. Great question, and this is absolutely huge. 
It’s why the media literacy community is really working hard on a concept we call algorithm literacy, right, which is understanding how increasingly the messages that are in your media environment are tailored in ways that reinforce your prejudices, reinforce your beliefs.  There’s a lot of really cool activities that you can do with this. We have—there’s lesson plans and materials, resources, on the—on our website at MediaEducationLab.com. But, you know, my Google is not your Google and my Facebook is not your Facebook.   So one activity that I always do at the beginning of every semester with my students is I have them—we have some—certain keywords that we might use. Like, we might put in country names like Finland, Slovenia, the Philippines, and now take a screenshot of what comes up on your Google, and my students—within the group of thirty students my students will have different results on Google and then they’ll be able to sort of unpack how their Google has been trained by them, right, algorithmically to present them with some results and to deny them some other results.   This is a big a-ha for students and I think for all of us we’re—it’s so easy for us not to be aware. Again, we tend not to notice what we don’t see, right. So we aren’t even aware often of how our—how algorithmic bias is influencing our worldview. That’s another reason why media literacy educators insist on using dialogue and discussion and why increasingly educators are bringing people together using the power of Zoom technology from different regions of the country, different states.   So my colleague Wes Fryer in Oklahoma is working with middle school students in New Jersey to bring Oklahoma middle school students and New Jersey middle school students together to have dialogue and discussion because we—the algorithmic biases are not—they are not just limited to individuals. They also exist within community context and cultural milieus as well.   FASKIANOS: Fantastic. Let’s go to Serena Newberry, who’s raised—has a raised hand.   Q: Hello. I’m Serena Newberry. I’m a Schwarzman scholar at Tsinghua University in Beijing.  And somewhat building upon the previous question on Russia-Ukraine propaganda in addition to the critical thinking questions that you mentioned earlier, how would we go about separating propaganda attached to existing institutions, be it an organization or a country, when there is bias attached to that nation or institution?  For example, you mentioned that the author changed the name of the book Propaganda to Public Relations because rather than trying to convince people of using their critical thinking skills that word had so many negative connotations attached to it. So how do we go about that when trying to move forward in foreign relations and building bridges in other ways?   HOBBS: Yeah. That’s a really great question and I’ll tell you my China story.  I had the opportunity to go teach students media literacy in China on several occasions now and the word propaganda is very complicated in that country, right. (Laughs.)   And so we came to the conclusion that understanding media messages in all their many forms was something that required people to evaluate different levels of—different levels of trust and trustworthiness and that whether—what you called it was less important. What the label is was less important than the reasoning process that you use to make sense of it.   
In China it’s called moral education, right, and it’s done in schools and it’s a way to create patriotic values, to disseminate patriotic values, and in the United States when I got taught I pledge allegiance to the flag of the United States that was also a form of moral education, patriotic education as it were, right.  And so I wouldn’t call that propaganda but I could see how someone might. And so I think it doesn’t matter what we call it. It matters that—what reasoning process and what evidence we use, what critical thinking skills we activate, in a dialogue and discussion.   FASKIANOS: So, John Gentry, adjunct professor at Georgetown University, has a question that also got an up vote—two up votes.   As I’m sure you know, the Soviets and then the Russians developed sophisticated propaganda mechanisms by what they called disinformation and active measures. They developed a doctrine known as reflective control designed to induce targets to make ostensibly independent decisions consistent with their interests.   How do you propose that targets identify and defend against such devices?   HOBBS: Wow. Yeah. That’s really a hard question because what is powerful about that framing is the way in which it is systemic, right, and that framing actually is really useful in understanding why people don’t act in their own best interest—(laughs)—right, sometimes—why sometimes people don’t act in their own best interests, right.   So I—what I appreciate about that observation, and this is—you’re acknowledging the way that sociologists have recognized that when propaganda is used in that way, systemic—in that systemic way it becomes actually really difficult or maybe even impossible for individuals to kind of work their way out of it or through it.  I think Jacques Ellul he defined—his framing for that—he called it sociological propaganda because of his sense that you couldn’t see the forest for the trees. So I think both the Russian framing of active measures and the way in which a whole worldview can be cultivated, right, that creates reality for people, and I think that’s partly why we value—we so much value freedom of speech and free markets as ways to protect us from the kind of abuses of power that are possible in more totalitarian or autocratic societies.   I think that’s why we so—we’re seeing countries, you know, sort of recognize and resist autocratic policies that allow one view of reality to be promulgated and all other interpretations of reality to be denied.   FASKIANOS: Mmm hmm. OK. So let’s go to Raj Bhala, who has raised hand.   Q: Thank you, and thank you for this wonderful presentation. So thought provoking.   So I’m asking you as a friendly member of the tenured professoriate like you, are we agents of propaganda, too? I have a new book coming out on the Sino-American trade war and I’ve often wondered have I fallen victim in researching and writing to propaganda from both sides. And, more generally, as you probably know from, you know, our careers, in our scholarship, in our teaching, in who we promote for tenure, the way we review their articles, are we also propagandists and even more so as universities get evermore corporatized with budget cuts?  HOBBS: Wow. What a—  FASKIANOS: And Raj is at the University of Kansas.  
HOBBS: Raj, that is—thank you for asking that really, really great question, and this is a great opportunity to acknowledge the important work on propaganda done by Noam Chomsky at MIT, and in his book Manufacturing Consent he said that the information elites, and by that he meant the 20 percent of us who are knowledge workers and work in knowledge industries, he said we’re the ones who are most deeply indoctrinated into a system, an ideological system where propaganda—propagandists work their hardest on us and they don’t bother with the others because if they get us then they get the control. The control is embodied.   So I do think it’s very self-aware and reflective for all of us knowledge workers to be aware of how our own world view and understanding of the world has been shaped through communication—through communication and information—and the stance of intellectual humility is most urgent because—well, I think we’ve seen all around us the dangers of righteousness. What happens when you become too certain that your view of reality is the only view of reality, right? Well, bad things happen, right. (Laughs.) Bad things happen when you become too sure of yourself, too righteous, because you close yourself off to other ways of knowing and other sources of information and other points of view that may be mostly false but have a glimmer of truth in them and that’s the piece of truth you actually need to solve the puzzle, moving forward.   So the problem of righteousness, the danger of righteousness, is something that everyone working in the knowledge industries needs to be aware of and the stance of intellectual humility is so hard because we’re experts, right.  So it’s one of those things that we have to call each other out on and call each other into, right. Come into a place where we can accept that we might have a piece of—we might understand a piece of this complex problem but not all of it.   And my guess is, Raj, that in your writing and in your scholarship you adopt that stance of intellectual humility and that helps your readers recognize you’re offering them something but you’re aware that you don’t have the whole story, because that’s what we do, right, and that’s how we help each other to come closer to the truth.   FASKIANOS: So I’m going to take a written question from Skyler Ruderman, who’s at University of California Santa Cruz.   How do we start investigating internal propaganda when it is so thoroughly and casually disseminated throughout American mass culture and media, for example, the Department of Defense having oversight and script rewriting authority on movie production if the producers want to use military equipment or the ways twenty years ago consent was heavily manufactured with bipartisan support for the Iraq war in the major news outlets?   These things are easily written off, much like reciting the Pledge of Allegiance, as patriotic or nationalistic. So where do we start?   HOBBS: Wow. What a great—what a great question. Can I share my screen, Irina? Is that possible?  FASKIANOS: You should be able to. We’ll turn—  HOBBS: I should be able to share my screen. Let’s see.   FASKIANOS: There you go.   HOBBS: Can you see my screen right now?   FASKIANOS: We can.   HOBBS: I want to show you two resources that I think are really helpful for broadening our understanding of propaganda in just the ways that your question proposes.   
One is: go explore my online learning modules on propaganda and check out propaganda in entertainment, right, or memes as propaganda, election propaganda, conspiracy theories, algorithmic personalization, and even art and activism as propaganda.   And then—let’s see if I can go back up here—and then go check out the Mind Over Media gallery. When I first started teaching about propaganda I was aware that my students live in a different media world than I do, right. I encounter some kinds of media and my students encounter different kinds of media because of what we talked about before—algorithmic personalization and this just gigantic flood of content that we get exposed to as creators and consumers.   So what I did was I created a tool that makes it possible for anyone anywhere in the world to upload examples of contemporary propaganda or what people think is examples of contemporary propaganda, and because I got some funding from the European Commission to do this work I have propaganda from a bunch of different countries and right now at the top of the list are these kinds of examples of different kinds of propaganda and, you know, some of them are really weird.   Like, for instance, this one, right. The person who uploaded this meme—the meme reads, for those of you who are not seeing the screen, remember when politics attracted the brightest and most intelligent—what the hell happened, right, and it’s got some pictures of politicians.   This person thinks this is propaganda because it attacks opponents and it attacks people who are Republican and it shows that Kennedy, Abraham Lincoln, and George Washington were good. However, it shows Trump as one who’s not very intellectual. And so some student uploaded this and I’m invited to rate this example, do I think this is beneficial or harmful. I think this is probably a little bit—no, I’m not sure how I feel. I’m going to be right in the middle here.   But take a look at the results, Irina. Twenty-seven percent of the people who’ve been to the website say they thought this propaganda was beneficial, 14 percent thought it was harmful, and then most of us are in the middle here. So it turns out that, in some ways, there is an opportunity to examine the stories we tell of the past and how they shape our understanding of the present day.   I’ve been doing that through recovering how propaganda used to be taught in the 1920s and ’30s in the years leading up to World War II as American educators began to be concerned about demagogues like Father McCoughlin (sic; Coughlin) on the radio, right, and the way in which the power of the voice coming—when the voice came into your living room it was a very powerful experience. It was so intimate. It was so personal. It had such an emotional power. And we realized that every generation has to address the emotional power of propaganda because the propaganda that you carry on your digital device, right, has got its own unique ways of bypassing your critical thinking and activating your emotions in ways that can be really, really dangerous.   FASKIANOS: So what would you say about TikTok?   HOBBS: Well, I’ve been fascinated. We’ve been using TikTok a lot in our education outreach initiatives and the project that I’m working on right now is called Courageous Rhode Island. It’s a federally funded project from the Department of Homeland Security and we’re using media literacy as a violence prevention tool to address the issues of domestic extremism, right.   
And so we’ve been looking at TikTok videos that on the surface seem, well, quite entertaining. But then when you spend time, actually watching it—you watch it twice, right, and you start asking those critical questions that I shared with you earlier, then you really discover it’s, like, oh my gosh, this thing actually has a white nationalism agenda or an anti-trans agenda or a(n) anti—or a misogynistic worldview or an anti-Semitic worldview. But at first viewing it just looked like fun.   So we think it’s really important to take—to help slow down our encounter with TikTok, and when adults do that with teenagers and when teenagers do that with each other and when young adults do that with people of different ages it can be a mind-blowing learning experience.  And participants who are here in this call can join us on this journey. Every two weeks we have what we call courageous conversations. The next one’s coming up on April 4 and it’s called “High Conflict.” We’re talking about the media messages that put us into conflict with each other and what we can do about them. So TikTok’s one of those medium that can incite high conflict.   FASKIANOS: I’m going to take the next question from Pyonhong Yin (ph) at the University of Illinois at Urbana-Champaign.  Q: Hi. Can you hear me?   HOBBS: Yeah.   Q: Oh, OK. Yeah. Yeah. Thanks for your very interesting talk.   So I am a Ph.D. student in political science at the University of Illinois and I’m currently working on a paper about the propaganda during the international conflict, and I just got a question from two professors in a different field in political science and they asked me whether—or do you think propaganda is costly.  Like, so because I think during the conflict the leaders they—you know, the people—they usually make some very aggressive statements, right, and sometimes they might make some empty threats. So, to me, I think it’s costly because if they do not follow their words then, you know, the public—the majority of the people they do not trust the leaders. But, yeah, but I—yeah, so this is the—just the question. Yeah. So do you think the propaganda is costly?   HOBBS: So that’s interesting how you’re using the phrase costly, right. The idea is does—you’re asking, in a way, what are the consequences of the use of propaganda, right, and I think it’s a really important question because, remember, propaganda can be used to unify, right. So propaganda can be a vehicle that people use to create consensus in a group, right, and that’s—coming to consensus is part of the democratic process, right.  That’s how we—we come to consensus because it’s an essential way of solving problems nonviolently. But as you’re using the term costly you’re imagining a person, a propagandist, who says one thing in one context for one audience and one goal and maybe has to walk that back in a different context or at a different time period, and then that may have a cost because people may lower—the trust might be lowered, and I think that’s actually, like, a very important calculus that politicians have to consider in their use of propaganda.   So I really appreciate the idea of the kind of—almost like the mathematical or the financial metaphor that’s behind your question. There is a cost because the cost is trust can be increased or reduced, right, and from a politician’s point of view that’s currency, right. That has real value.   But we often focus on propaganda that diminishes trust. 
I want to make sure that we don’t forget that propaganda can increase trust, right. So it works both ways—the cost and the costliness. And you can learn more about this in my book. I’m putting up a link to my book in the chat, Mind Over Media: Propaganda Education in a Digital Age.   I think one way to interrogate the cost issues is to look at different agents of propaganda. Look, for example, at how activists use propaganda. For instance, Greta Thunberg, the world’s youngest and most important environmental propagandist, right. She’s been very skillful in using her language, her imagery, her messaging, to increase her credibility, right, and to—and she’s very aware of how at certain times certain messages might have a cost, and we can go back and look at the history of her speeches and see when she’s made some mistakes, right—when her messages had a cost, right, that weakened her credibility.   And so I think being strategic—looking at that—looking at propagandists’ choices and the cost or the consequences or the potential impacts, very interesting strategy. So great question. Very thought-provoking question around that metaphor. Thank you.   FASKIANOS: So I’m going to take a question from Oshin Bista, who’s a graduate student at Columbia University: What are your thoughts on the tensions/overlaps between approaching information with generous curiosity and the inaccessibility of the languages of media? How do we make this form of literacy accessible?   HOBBS: Great, great question. You know, the reason why that’s such a good question is because there is a vocabulary that has to be learned, right. To critically analyze news as propaganda there’s a whole lot of words you need to know—(laughs)—right. There’s a whole lot of genres that you need to know, right, and that knowledge, for instance, about the knowledge about the economics of news. To understand propaganda as it exists in journalism you have to understand the business model of journalism, right, why likes and clicks and subscriptions and popularity are a form of currency in the business, right.  So how to make that more accessible? I think actually journalists and media professionals can go a long way and one of the groups that I’m paying special attention to are the YouTube influencers who are doing this work through messages that are entertaining and informational and persuasive.   For example, check out Tiffany Ferguson and her internet education series. She’s a twenty-three-year-old college—recent college graduate who’s been helping her audience, mostly teenage girls, I would say—helping her audience learn to critically analyze all different aspects of internet culture, right.  That is a great example of somebody who’s using their power as a communicator to help their audience be better informed and make better choices, and I feel like a lot of media professionals can play that role in society.   In fact, another good example of that is Hank and John Green, the quintessential YouTubers, right. So I think media professionals are really well poised to bring media literacy knowledge and concepts to mass audiences and that’s why they’re a vital part of the media literacy movement globally. Not just here in the United States but all over the world.   
FASKIANOS: So we’re seeing in Congress, you know, Congress taking on TikTok and wanting to ban it, and Chip has—Chip Pitts of Stanford has a follow-up question: Beyond education for media literacy, what laws, regulations, norms can our government and others deploy to help control the worst harms—required content moderation, you know, applying international human rights standards versus U.S.-style free speech, et cetera? So what is your feeling on that?   HOBBS: Yeah. Great question. Of course, we’re always—we’re often asked—some people think that media literacy is a substitute for government regulation. But we’re always very attentive to say, well, our interest is in focusing on what media consumers need to know and be able to do.   That doesn’t mean that there isn’t a role for regulation and, for example, I think one of the easy-to-document positive impacts of media regulation is the GDPR regulation, right, that Germany (sic; the European Union) enacted. That actually—that benefited the entire world, right.   And so the question about content moderation and Section 230 and the appropriate ways to regulate social media—these are complex issues that people—that we can’t solve in two seconds and we, certainly, can’t solve globally because we can’t.   But we can think about how different countries around the world, as they implement social media regulation, become like little laboratories. Let’s—so as countries pass laws about social media let’s see what happens, right. Let’s see what the results are culturally, politically. Let’s see what the benefits of that regulation are and let’s see what some of the unintended consequences might be.   So that’s the only way that we’ll design regulation that accomplishes its beneficial goals without its unintended consequences. So I’m kind of happy that states like California are regulating social media now, right. That’s awesome to see little laboratories of experimentation.   But I’m not prepared to tell you what I think the best approach to regulation is. I think we just need to be attentive to the fact that regulation will be part of the solution in minimizing the harms of communication in the public sphere.   FASKIANOS: Well, unfortunately, we have to end here because we’re out of time, and we have so many more questions and comments. I’m sorry that we could not get to you all.   We will send out the link to this webinar so you can watch it again as well as links to Renee’s book, to her community conversations. I see it, “Courageous.” I have it up on my screen now for the “High Conflict” event on April 4, and anything else, Renee, that you think. I especially love the questions that you showed us on your phone. I want to get those so I can share them with my family.  So thank you for being with us and for all of your great questions and comments. Appreciate it.   The last Academic Webinar of the semester will be on Wednesday, April 12, at 1:00 p.m. Eastern time. So please do join us for that. We’ll send out the invitation under separate cover.  And I just want to flag for you all that we have CFR-paid internships for students and fellowships for professors. If you go to CFR.org/careers you can find the information there, and you do not have to be in New York or DC. You can be remote, virtual. They’re great opportunities for students even if you are not in one of these two cities.   Please follow us at @CFR_Academic and visit CFR.org, ForeignAffairs.com, and ThinkGlobalHealth.org for research and analysis on global issues.   
Again, Renee Hobbs, thank you so much for this conversation, your research. We really appreciate it and look forward to continuing to follow the really tremendous work that you’re doing.   HOBBS: Thank you so much for the opportunity, Irina. I really enjoyed talking with everybody today. Bye now.   FASKIANOS: Bye-bye.   (END) 
  • Nigeria
    The Lagos-New York-London Echo Chamber
    Desperate to will a preferred candidate to victory, Western journalists fell into tunnel vision on Nigerian politics.
  • China
    China’s Global Influence Campaign, With Joshua Kurlantzick
    Podcast
    Joshua Kurlantzick, a senior fellow for Southeast Asia at the Council, sits down with James M. Lindsay to discuss how China uses its media power to influence the domestic politics of countries around the world.
  • Media
    Countering Disinformation Through Media Literacy
    Summer Lopez, chief program officer of PEN America’s Free Expression Programs, discusses the psychology and spreading of disinformation and how to avoid injecting it into public discourse. The webinar will be moderated by Carla Anne Robbins, senior fellow at CFR and former deputy editorial page editor at the New York Times. TRANSCRIPT FASKIANOS: Thank you. Welcome to the Council on Foreign Relations Local Journalist Webinar. I’m Irina Faskianos, vice president of the National Program and Outreach at CFR.   CFR is an independent and nonpartisan membership organization, think tank, publisher, and educational institution, focusing on U.S. foreign policy. CFR is also the publisher of Foreign Affairs magazine. As always, CFR takes no institutional positions on matters of policy.  This webinar is part of CFR’s Local Journalists Initiative, created to help you draw connections between the local issues you cover, and national and international dynamics. Our programming puts you in touch with CFR resources and expertise on international issues, and provides a forum for sharing best practices. The webinar is on the record. We will make the video and transcript available to all of you. It will be posted on our website at CFR.org/localjournalists.   We are pleased to have Summer Lopez and host Carla Anne Robbins today with us, to talk about “Countering Disinformation Through Media Literacy.”  Summer Lopez serves as the chief program officer of Free Expression Programs at PEN America. Previously, she served as deputy director of the Office of Democracy, Human Rights, and Governance at the United States Agency for International Development. Ms. Lopez was also vice president of operations at the AjA Project, a nonprofit organization that provides media-based programs for refugee, displaced, and immigrant youth in the U.S. and internationally.   Carla Anne Robbins is a senior fellow at CFR. She is a faculty director of the Master of International Affairs program and clinical professor of national security studies at Baruch College’s Marxe School of Public and International Affairs. And previously, she’s—she was deputy editorial page editor at the New York Times, and chief diplomatic correspondent at the Wall Street Journal.   So, thank you both for being with us, and having this conversation. I am going to turn it over to Carla to get us started.  ROBBINS: Thank you, Irina. And thank you, Summer, so much for doing this. And thank you to—for everybody for joining us. And thank you to all the journalists on this, and for the work that you do. It’s a difficult and challenging time with the 24/7 news schedule, and always in awe of the work that you’re doing.   So, Summer—first, the way we’re going to do this, Summer and I are going to chat for thirty minutes or so. If you guys have a lot of questions before that, you know, throw them in, and we’ll talk to order, and could throw it open to you after that or before that.  So, Summer, before the midterms, the OSCE, the European-based election monitoring organization, issued a pretty chilling report about the United States political system. You know, we’d all read OSCE reports about, you know, developing countries, but suddenly, they were writing about us. And they were warning about threats of violence against election officials, potential voter suppression and voter intimidation, the level of election denialism among GOP candidates, and they warned about election misinformation.   Luckily, we didn’t see the voter suppression and violence. 
Many of the key election deniers lost, especially those who can affect the next round of voting on the state level. But how good was the misinformation campaign in the lead-up to the midterms? Was there the deluge that many feared? And if it did take place, what were the topics?  LOPEZ: So, I think that there wasn’t sort of the deluge that people necessarily feared, but I’m hesitant to say it wasn’t really an issue, you know? And I think what we have been hearing from a lot of journalists, and community activists we’re engaged with on the ground, is that essentially, the main issue was still the 2020 election, that the big lie was really the narrative that continues to be pushed going forward, and sort of stoking doubt in issues like mail-in ballots, and sort of the process of the elections themselves. And some of those kind of narratives that had initiated in 2020 were just kind of continued and magnified.  And so, I think that is still a lot of what we’re seeing, and I don’t think that’s going away necessarily. Obviously, you know, the sort of outcome in, particularly, some of the secretary of state elections around the country are reassuring, in terms of some of the election deniers not being in a place to make decisions about the elections. But I think the narrative and the attempts to kind of stoke doubt in the process are definitely still there, and will likely ramp up as we go into 2024.  I think one of the other things we’ve really noticed more in the past year or two, in particular, is that a lot of the disinformation isn’t happening in this sort of very visible way, that maybe it had been previously, where a lot of things were really happening on Facebook and Twitter, sort of, you know, memes and messages that were pretty visible, and people could see a lot of it. What we’re hearing from a lot of folks we’re working with is really that so much disinformation now is also taking place in encrypted, you know, WhatsApp chats, family chat groups, that information is spreading in ways that are not always entirely visible, either to the media, to researchers, or to really, to anybody. And so, I think it’s also just hard to kind of really assess the scale of what’s happening right now, because a lot of it is happening kind of behind the scenes, in some ways.  ROBBINS: So, the main topics of misinformation that you’re hearing from journalists about, is it all COVID, all the time? Big lie and COVID, are those the two main topics?  LOPEZ: Big lie, COVID, we heard quite a bit about some issues around the raid of Mar-a-Lago, and sort of the questions around the role of the FBI, and is the FBI compromised, had become kind of a significant topic, quite a bit around the Dobbs decision, and Roe v. Wade, and sort of just using some kind of really significant political issues, you know, as kind of wedge issues to stoke tension.   You know, I think disinformation can be about projecting a particular narrative, but it can also just be about, you know, stoking people’s doubt in narratives and in institutions overall. And I think that’s a lot of what we’re seeing right now as well, and particularly, as I said, in sort of the institutions and processes related to elections and public health, unfortunately.  ROBBINS: So I spent some time with your very useful 2021 survey of reporters and editors on disinformation and newsroom responses, and I’d like to ask you some questions about that. We shared it, and we will—we’ll share the link with everybody again.   
So, first of all, I was struck by how much reporters and editors said disinformation had changed their approach to work. And according to the report, more than 90 percent of the people you guys interviewed had made one or more changes in their journalistic practices, as a result of disinformation.   You know, I’ve done a lot of writing about disinformation, but I thought back to the days in which—and I don’t want to make myself sound like I’m, you know, on the verge of death here—(laughs)—but it’s been a while since I worked in a newsroom—you know, I—writing misinformation, yes, but I hadn’t really thought about how it might change my own work.   So, what did you hear from reporters and editors about how it changed their daily work?  LOPEZ: Yeah, I mean, I—to some extent, I think it’s sort of a practical question, in that it adds, you know, time. People feel that they have to put more effort into, you know, fact-checking information, thinking about the sources that they’re utilizing, and to looking at, you know, is this photo legitimate? Is this source legitimate? Is somebody trying to get me to report something that’s false, providing false information in an attempt to call—you know, catch me out on something.   And so, I think there’s a lot—there’s both sort of the practical time that that takes. There’s also sort of the added emotional burden and stress of feeling that that’s a possibility. You know, I think we had 11 percent of respondents who said they had, at some point, accidentally reported false information. So I think that sort of consciousness of the fact that people are trying to manipulate journalists and the media as well, is adding stress and burden.  You know, then I think there’s also sort of things that people see they need to do more of, to respond to disinformation, so more kind of public engagement, community engagement, you know, spending more time kind of explaining how reporting is being done, which I think is really important. I think, you know, it’s part of—our media literacy work has always included a strong focus on just understanding a little bit more about the practice of journalism, and, you know, so that people don’t get manipulated into thinking, you know, that an anonymous source inherently means something can’t be trusted, or is false, right? Sort of just understanding a little bit of what goes into professional journalism, and that piece of it, I think, is really critical for the public.   But I think, you know, journalists and newsrooms feel like that’s something they need to spend more time doing, and just doing kind of the basic explanations of both the practice of journalism, and also some of the things they might be reporting on, in addition to, essentially, the kind of traditional reporting that they would be doing.  ROBBINS: So that’s something salutary about it, is that this sense that somehow—I just assume you trust me, because I am X newspaper, is—and as you know, there’s been a general decline—you know, a huge decline across the board—of trust in institutions. But if we as journalists, and as reporters, and as editors, are saying, we have to explain more to our—to our readers, how we made a decision, without it being overly navel-gazing—and how you find the balance there is an enormous—(laughs)—one, without sounding defensive. I mean, there’s something hugely salutary about that. And the difference between a hard-copy paper and the internet is that you actually have space to do it. And so I think there is something salutary.  
Now, did you find that there were a lot of—you know, that this has become a new standard among news organizations, that just basically explaining how they made news decisions has become increasingly the standard?  LOPEZ: I don’t know if it’s quite the standard yet, but I think there is a lot more recognition that it has to be a part of the job. And you know, I think we’ve seen that in a lot of different ways. I mean, I was just noticing, I think, last week, during one of the horrifying mass shootings we’ve had just recently, you know, that if you are—if you’re following it on the New York Times’ live feed, every so often, a message will pop up that explains how they report on breaking news stories, and that they may rely heavily on police information at the beginning, but they may make adjustments as they—you know, as that information becomes clearer, and as they speak to additional sources. And I—you know, I thought that was really helpful, honestly.   And we’ve done some work with the Texas Tribune. You know, they had a whole kind of webpage that just sort of explained how they were going to be reporting on elections. And that included a box about how they report on disinformation, including the fact that sometimes, they won’t report on disinformation, if they sort of assess the reporting might amplify something that might not find that much of an audience otherwise.  And so, you know, I think those kinds of explainers are becoming more common. And I think they—you know, I do think that’s actually a very good thing, in part just because, you know, I think we don’t have the kind of civic education in this country that we used to, that kind of walks everybody through the process of, you know, how the news gets made in high school, or something, that would—a lot of that has disappeared. And how that happens has also changed as the internet has come into play, and the way that news develops has changed too.  So, I do think that that’s, you know, a little bit of a silver lining out of all of this, is a bit more transparency about that, and public education, too.  ROBBINS: So, we were talking before we started about the CDC and the confusion about COVID, and so many of the rules. But we’re generally—Irina was talking about this expectation that somehow, we were all going to get shots and that nobody was going to get sick. And then we got sick anyway. And for those of us who nevertheless believe, or have some faith, in institutions, we didn’t come away and say, oh, well, it’s all a big, you know, lie, and I’m not going to get a shot because of this. And—you know—(laughs)—I can’t even count the number of times I’ve been vaccinated, and I still got COVID. But that doesn’t mean I’m not going to get the next booster that they offer.  But how much of that do you think was a failure of the CDC, which really had really bad communications, I think, in the Trump administration? And I think Walensky is doing a better job, but not a great job right now. But how much of that was a failure of the way science was reported by newspapers, in the coverage of COVID? And how much of it was inevitable because this was new, and we just didn’t know?  LOPEZ: Yeah. Well, I think—I think the answer is both of those, because I think, in part, there wasn’t enough explanation of the fact that this was new, and—(laughs)—we couldn’t possibly know all the answers, and that the vaccines are new.   
I mean, when I had COVID earlier this year, I took Paxlovid, you know, and the little insert was very short, because they didn’t know a whole lot about this drug. And they were pretty honest, too—(laughs)—you know, only so many people have taken this, so we only know so much.   You know, I think that—so I think that was a failure, you know, in terms of how this information was communicated with, you know, being honest about the degree of certainty that existed, and why it might still be really important to get the vaccine, or to take seriously the warnings that were being given, because these were cautionary measures that were worth—you know, worth the potential bits of it that maybe we weren’t completely certain about.  And I think one of the things that we hear a lot—and that I know there has been some research on—is, you know, people talking about feeling like, you know, even if you just sort of have questions about something, sometimes, the media or sort of society as a whole might dismiss you as a conspiracy theorist, or an extremist, or just being dumb, or whatever it might be. And so people feel like, you know—I think there’s sort of the middle of society, who, you know, isn’t sort of fully enveloped by conspiracy theories, but also, you know, has some questions about things. And feeling like those questions are dismissed can, you know, drive people away from those institutions, and from outlets that they feel might not be, you know, sort of acknowledging that they may have some legitimate doubts, or they may not really understand, you know, how the vaccine was developed. Or they may not really understand the electoral process, and what it means to do absentee ballots.  And so, I think a lot of it is also about just, sort of, treating, you know, our audiences—whether that’s the CDC, or the media—treating the audience with respect, and kind of acknowledging that, you know, we live in a really confusing time, honestly. (Laughs.) And there’s a lot of anxiety about a lot of different things. And so, you know, I think starting from a point of empathy and respect is really important, in terms of how we try to bring people—you know, to build trust, and to—and to counter some of the disinformation narratives that are out there, because there’s also research that, you know, when people feel more anxious, they will also just cling to information that is more aligned with what they already believe. And so, you know, I think that’s kind of a level of anxiety that we’re just living with at this—in this moment, and have been, especially for the past two and a half years, is contributing to making people more vulnerable, or more susceptible to disinformation. And so, you have to really think about that as a factor, when we think about our messaging, too.  ROBBINS: So, the other key takeaway from your survey was that only 30 percent of the journalists said their news outlets—outlet had generally effective processes in place to cope with disinformation, and 40 percent said no organization-wide approach exists. OK, my first reaction was, as a masthead editor, oh my God, they just whine. But—(laughs)—the more I thought about it, you—none of that is surprising, but I came away realizing that I wasn’t sure what, quote, effective processes are, and if there is now a consensus about the best way to push back against disinformation.   And so, you have a list in your report. And they go from, you know, some technical things, to best practices. And so, I wanted to ask you first about the technical things. 
So, you have this list here of bot detection devices, image verification tools, social media monitoring tools, reverse-image searching tools, using fact-checking sites. You know, is this something that should be assigned to a desk? To individual reporters? Should we be training every one of them in this? I mean, these sound—some of them sound expensive. Some of them sound very technical. It sounds like something that we used to assign to librarians to do, or ask librarians to do. Or you know, I mean, how do—how does—how does a smaller newspaper, you know, master something like that?  LOPEZ: Yeah, I mean, I think—I think consensus is probably a strong word at this point, for kind of what are necessarily the best practices, and also recognizing that the same things are not going to work for large newspapers as they are for smaller, local outlets, that don’t have the same resources or the same staffing. So, you know, I think that’s part of what we’re trying to kind of work on, coming out of this report, is to consult with more journalists and editors, you know, on the findings, hear more from them about, you know, what—and I’d love to hear from folks in this conversation today too, you know, about what you feel would be useful, because, you know, we would like to, kind of, develop additional resources and programming to support newsrooms in thinking about what they can do, especially those that might not have the resources to do things easily on their own.  Some of these tools, though, are relatively straightforward and free. And so there are—you know, some of this is pretty low-hanging fruit, right? I mean, reverse-image search is something you can do very easily on Google. You know, bot detection, there are some websites where you can just, you know, put in a Twitter handle, and it’ll tell you the likelihood that it’s a bot or not. There are, you know—  ROBBINS: We want—we can—you will share this with us. And we will share this with everybody here.  LOPEZ: Yes.  ROBBINS: OK, go ahead. I didn’t mean to interrupt you—   LOPEZ: No. (Laughs.) It’s fine.  You know, and there are—there are some websites that do things like, you know, track disinformation narratives online, or where you can search for things. There’s one in particular—I will find the name of it, and definitely share it with folks—you know, that was created by researchers, specifically for journalists, to help them be able to track and record disinformation, because, you know, sometimes, people will see something they want to report on, and then the thing gets taken off the internet, as well. And so you kind of lose the original content. And then, you know, they kind of track, you know, the way that narratives might be moving around the internet, as well.   And so, there are some relatively simple and straightforward resources out there. I think one of the things we were struck by was that even that, you know, wasn’t something that most people felt they knew how to make use of.   And so, I think there’s a lot of pretty straightforward education and resource-sharing that can be done. But then, you know, some of these things are obviously more complicated. Not every newsroom is going to be able to have somebody on a dedicated disinformation beat. And that might not make sense for everybody.  
So, you know, I think some of it is going to have to be quite tailored to thinking about what makes sense in an individual newsroom, but at least, you know, making some of these resources available, and thinking about it as a thing that is affecting, you know, basically every journalist, I think, is a really important first step.  For us, a lot of this is also based on work we’ve been doing over the past four years, about online abuse and its impact on journalists and writers. And that’s—a lot of our work on that was initially just kind of basic training for journalists—you know, here are some ways to keep yourself safe online. Here are ways to think about, you know, how you can respond if you experience online abuse, how to, you know, report it and protect yourself. That really transitioned into working more with newsrooms on sort of what were institutional best practices, and what they could put in place to support and protect journalists, as more of an institutionalized thing.  But again, that’s very tailored too. And we really—we work with individual newsrooms to think about what is needed, and what will work best for them. But recognizing that these are things that kind of have to be thought about at that level, I think, is really critical.  ROBBINS: So, you have this list of questions about whether—has your news outlet taken any of these actions. So I assume that means that you think that these are actions that would be better practices, if not best practices.   Put more emphasis on choosing headlines, ledes, and photos that minimize their potential misuse as disinformation. So, how do I write a better headline and lede, and choose better photos, that minimize the potential misuse of disinformation? I wrote a lot—I’ve written a lot of headlines in my life.   LOPEZ: (Laughs.) Well, you know, I think part of it is, you know, thinking about—there’s an—there’s an example in the report, actually—I’m not going to remember it off the top of my head, you know—but about, you know, headlines that don’t necessarily state very clearly what is actually happening, or leave some room for interpretation. Sometimes, I think, you know, headlines that get drafted with a little bit more of a clickbait mindset could potentially be misrepresented. And you have to remember that, you know, most people, obviously, are kind of skimming through their social media, and they might see just the headline. So, I think a lot of it is just remembering that, you know, it isn’t that people are going to see the headline, and then read the whole article. The headline might be all that they ever see of this information. And they—if they only interpret that, you know, how is that going to kind of lodge itself in their mind?  So, you know, I think some of it is really just a little bit of extra consciousness about that, and about how people are consuming information, and what information you’re kind of putting upfront as what people are going to be most, you know, most kind of struck by.  You know, and same for photos. Obviously, every photo could potentially be manipulated. (Laughs.) But again, you know, what’s the photo that’s going to be the lead associated with this article? If all people see is the Tweet about the article, and the photo that comes up at the top, you know, just being conscious about what that is, and what message it sends about the—about the article itself.  
ROBBINS: So, I think the example that you—in the report was a story in the Chicago Tribune with the headline, a, quote, “healthy doctor died two weeks after getting a COVID-19 vaccine. CDC is investigating why.”  LOPEZ: Right.  ROBBINS: And it turned out that far fewer people saw the follow-up news that after an autopsy the doctor’s death was attributed to natural causes and not the vaccine. But you can see how something like that, you know, would be shared wildly.  LOPEZ: Exactly. Exactly.  ROBBINS: But this is a—you know, these are—these are sort of—so, other best practices that you think that, you know, that—I mean, I can go through your list of questions here, all of—you know, your list of, you know, implied best—you know, better practices, that completely intrigue me. You know, implement changes to attract and hire journalists to ensure a wide variety of perspectives. That’s—obviously, has many, many advantages. You know, have systems in place to respond quickly to disinformation. That’s a hard one, as you said, particularly as things move away from—you know, I remember when I was at the Times, and—(laughs)—God, I remember when we first started driving cars—(laughs)—that, you know, suddenly, we had people who were monitoring Twitter all the time, and getting stories off of—off of social media. But, you know, the notion that things are moving into encrypted platforms, that becomes a change, and it becomes a real challenge itself.   What sort of systems should organizations, if they can do it, have in place to respond quickly to disinformation?  LOPEZ: Yeah, well, I think—I mean, it kind of comes down to the fact that we talk about both pre-bunking and debunking disinformation. So obviously, there’s sort of how you can debunk disinformation that’s already out there, you know, and that, too, requires some assessment. As I said, you know, is this disinformation relatively isolated? Is it not going to, you know, reach that large of an audience? Do you risk amplifying it by saying anything about it? Maybe you don’t need to respond to it at all. But then, thinking consciously about how you do, if you do, debunk.  I think, you know, what we hope journalists and newsrooms will get better at is the—is the pre-bunking piece, and kind of anticipating where disinformation is likely to occur, you know—and thinking about what sort of information can be provided proactively. This goes a little bit to the—to the question of, you know, more explainers, again, about—not just about sort of the practice of journalism, but also about the issues that are most likely to be contentious.   So, you know, talking to people—in the 2020 election, we did a lot of work, you know, thinking about how to head off disinformation around an election that was going to be unusual, right? There was going to be a lot of absentee voting, it was going to take longer to count the ballots. And so a lot of, you know, what we were really pushing for was a lot of messaging about the fact that people needed to know that it was not likely that we were going to have results on election night, and that that was going to be OK—(laughs)—you know, there was a reason for that. Here’s how the process works, you know, here’s how elections get called, and really kind of prepping people for that, so that, you know, ideally, they’re then more resilient when disinformation narratives are coming at them, and telling them that this is—means the election has been stolen.  
You know, and so I think there’s a lot of areas where that can be really valuable. And I think the—you know, again, kind of what we’ve heard from a lot of journalists, I think, is that they find a lot of hunger for that, and just sort of issue explainer information. There was a journalist we—I spoke with at a symposium I was at a few months ago, who was talking about, you know, that one of the issues that was really relevant in their community was about rent. And so, they did a whole explainer about sort of who had jurisdiction over rent in local government, and how you could reach out to people about it. They said it was the most popular article on their website for a month, because that was the kind of information people really just felt they needed, and that sort of thing, you know, on any host of issues can help people, you know, be more informed in advance of disinformation kind of coming at them.  So, you know—and then I think the other piece kind of goes to building community trust as well, and a lot—you know, I think there’s a lot of need for community engagement, especially for local outlets. And again, this is tough, because it requires time and effort. But being out in the community more, again, sort of explaining sort of your role in the community as a local journalist and a local outlet can really, again, kind of help build up that trust, so that when disinformation occurs, you know, your outlet is looked to as a trusted source. And I think that a lot of it is kind of about laying the groundwork for that, so that people—so that there’s a bit more of that resiliency in place in advance.  ROBBINS: So, I want to turn it over to the group. I’ve got a lot of questions about the resources that exist out there, including the resources that you guys have. But I—we do have one question in the Q&A, we’ll start with that one. But please, I want to—if we can remind everyone how to ask a question, can we—can we do that?  OPERATOR: Yes, as a reminder—  ROBBINS: Oh, thanks, Audrey.  OPERATOR: (Laughs.) No, no, just going to say.   (Gives queuing instructions.)  ROBBINS: That’s great. Thank you so much.  So, we have Mark Lewison asking a question. Mark, do you want to ask the question yourself, or should I read it?  I will read it. I’m good at reading.   Many of my college journalism students still treat Dem-GOP sides as equals, and both have legitimate political perspectives to cover in every story they write. They smile at, and forgive the disinformation, and then pretend the GOP is honorable, and worthy of, quote, equal-time coverage. What can we tell these future journalists about, quote, fairness?  You know, in their defense—I mean, it took me a very, very long time to get past the good people on both sides argument.   LOPEZ: So, I’m really glad you asked this question, because, actually, we just put out another report just a couple of weeks ago on, basically, this very issue, and really looking at how journalists have been reckoning with the question of, you know, how do you report—do political reporting in particular—but, you know, when one of the sides that you are reporting on, you know, represents a lot more extremist views than it used to, has a lot of, you know, candidates who are, again, election deniers or, you know, expressing white nationalist rhetoric, and things that, you know, are kind of outside the norms of political discourse, at least as they have existed for some time.   
And I think, you know, our sense—we interviewed seventy-five journalists and others for this report. And you know, our sense was really that people are thinking about this a lot—(laughs)—and that things have changed, you know, quite a bit over the past six years. You know, I think looking at even the reporting, you know, around Trump’s announcement that he was running was quite interesting, you know, in terms of looking at sort of the headlines, and how people brought in, you know, the fact that he had investigations against him that were active, that he had, you know, played a role in stoking an insurrection on January 6th, and sort of, you know, didn’t—I think didn’t kind of give in to some of the temptations that existed earlier to kind of, you know, take advantage of the hubbub that he—(laughs)—created, and that can, you know, can be useful, to some extent, in terms of generating views.  So, I think, you know, there is a lot of consciousness about this. I think—you know, I don’t think that anybody really believes that, you know, we should be abandoning principles of objectivity and of bringing all sides to a debate to bear. We certainly don’t think that’s a good idea either.   But I do think, you know, there’s a quote in that report that says something like, you know, that doesn’t mean that—you know, just being fair doesn’t necessarily mean that everybody gets an exactly equal perspective and time, or that you pretend that everybody is exactly the same, right? That it’s OK to acknowledge, you know, that this person was involved in an insurrection, or is an election-denier, or, you know, has some affiliation with an extremist organization. You know, that that is not something that should be sort of left out or that that shouldn’t be a factor in how you report on what they’re saying.  And this really connects to the disinformation piece, because a lot of the problem—you know, the problem is also that a lot of those same folks are the ones spewing a lot of disinformation, and from positions of power. Which, of course, makes it, you know, more impactful and more challenging to undo its impact. And so I think, you know, again journalists being kind of prepared to go into interviews equipped with the facts, so that they can counter false statements, making sure that they, you know, include that context in the reporting.   A lot of what the report looked at was also sort of the—some of the reporting around prominent white nationalists some years back, Richard Spencer and other folks like that, you know, that kind of made an attempt to humanize them and sometimes didn’t fully include all of the context and associations and viewpoints that they represented as starkly as it could have. And so I think, you know, that’s—there was a lot of reflection from the journalists we talked to about how some of that reporting needs to and has evolved.   But I think it’s—you know, one of the conclusions we kind of came to was that political reporting has become extremism reporting, to some degree. And extremism reporting is a particular thing—(laughs)—that requires some—you know, some particular knowledge and preparation as well. And so I think there’s a lot of lessons to be learned from reporters who have been covering those types of beats for a long time, and thinking about, you know, how we—how that gets a little bit more integrated into reporting more broadly.  ROBBINS: So Matt Rodewald has a comment here. Matt, as an editorial writer, I’m going to respond to you. 
I think—I think this is a truncated question. But would you like to—would you like to read your comment, or? Well, Matt wrote: This is the problem right here. You automatically assume the GOP is bad. Why aren’t the Dems bad? Who’s talking to independent voters? What about Republicans who don’t associate with Trump?  LOPEZ: Well, and I think this gets to my point earlier about the fact that you don’t want to be alienating your audience either, right? And alienating in particular the folks who very much are in the middle of society right now, and feeling, I think, very lost, to some extent, between a pretty polarized debate. And so, you know, I think there’s—you know, there’s a realization that there are, you know, a very significant number of GOP candidates this past election who, you know, had expressed denial of the outcome of the 2020 election. You know, that, to me, is concerning, and that’s a little different than sort of a policy debate.   But that doesn’t necessarily—that doesn’t mean you can dismiss, you know, an entire one of our two major political parties, by any means, or that you don’t want to still talk about those policy issues. So, you know, how do you balance these things? It’s, obviously, very tricky, but I think very much—you know, there is a risk to coming across as, you know, dismissive of one entire side of the political spectrum and of people who are really struggling, you know, to figure out what is their space within such an extremely polarized political dynamic. And so I think that that’s really critical as well.  And recognizing that people have legitimate questions, you know, that people do want both sides to be challenged and held to account, as they absolutely should be in a democracy, and that, you know, people need to feel like journalism is taking a sort of responsible approach to that. Or else, you know, it could also further stoke doubt and distrust too.  ROBBINS: And I do—Matt, I think you do have a very legitimate concern here. But I also do think that the responsible coverage is quite clear about where people sit in their perception of the elections—that they’ve lied, their allegiance to Trump or not to Trump. You know, that’s—I think people make pretty clear distinctions on that when they talk about the Republican Party. And—(laughs)—my husband covered the Hill for years for the Washington Post. And over breakfast every morning I’d ask him the same question: You know, all these people who we knew when we covered—you know, do they really believe the things that they say? You know, I don’t think we can make that judgment for them. And Matt and Mark are duking it out in the Q&A right now. (Laughs.) But we can’t make assumptions about what people believe. All we can do is work on what it is that they say. And that’s our responsibility as reporters.  Which does go to another question here, which is the truth sandwich question, which I wanted to ask you about. But I also wanted to ask you, you know, in a broader context here, when I was on the edit page at the Times we agonized over using the word “lie.” Just absolutely agonized over it. And we just—and this was on an editorial page. Because we said to ourselves, to say someone’s lying means that we know what’s inside their soul, we know what their intention is. So we would use things like “misspoke” or “prevaricated.” And we just came up with all these things. And I think we finally decided that Dick Cheney was lying about Iraq, OK? (Laughs.) And we just—it was just an agony to finally do it.   
And then—and then suddenly—you know, then after the Times, on the news side, came out with, you know, Trump lying about the election, and then Trump lying—something about Hillary Clinton and, you know, undocumented voters being bussed in from New Hampshire—or, to New Hampshire, or something like that. And they used it twice in a very short period of time. And I remember I wrote a piece about this, about how we had agonized, and how it migrated to the news side, and this question about calling a lie a lie was an important thing. On the other hand, did it lose its meaning if you used it too often?  Now, we use the term “lie” all of the time because people are lying. And I’m not saying—I’m not making a judgment here about which party here, Matt. I’m just saying that people lie, that we should call out when they’re lying. That said, how do you deal with the more general issue of issues that are not true? I mean, do you subscribe to the truth sandwich issue? Do you say, of course, the election wasn’t stolen. So-and-so said it was stolen. Let me remind you, it wasn’t stolen. I mean, what’s the best way of covering that?  LOPEZ: Yeah. I mean, I think—I mean, first of all, I think I agree that it is important to call an obvious lie a lie. But I do think that there’s a risk to using it too much and that that, you know, that should be somewhat reserved for things that are quite clear. And there are those things, right? So I don’t think it’s, you know, impossible to say that. But I do think it’s important that it be something that people don’t feel is getting bandied about kind of recklessly either. Because, again, I think that will undermine some degree of trust.  You know, I think the truth sandwich, which, you know, just so people know, is you kind of state the true thing, then you acknowledge the falsehood, then you restate the true thing, so that it’s kind of captured safely within the context of truth. You know, it can be very effective. Obviously, that’s a good kind of shorthand. But I do think there are cases where, you know, I’m not sure that every article right now about election denial needs to kind of acknowledge, you know, well, some people believe X. You know, but it can also just kind of go to how it’s presented, right?   This isn’t a question of, you know, some kind of giving credence to this as a belief that the election was stolen. I think stating there is zero evidence that the election was stolen, nothing has ever demonstrated that there was any manipulation of this election, is an important thing to keep repeating. And I do think that, you know, one of the reasons that disinformation works is just because our—the way our brains are wired, and it manipulates the way our brains are wired. So, you know, the more that you hear something, the more likely you are to believe it’s true. Even if you start out knowing darn well that it’s not, it just kind of becomes more normalized the more you hear it.   And that’s something that, you know, purveyors of disinformation very much use. So I think it’s something that we should use in reverse as well. (Laughs.) And so continuing to repeat the facts about a situation even if it feels like maybe they’ve been overstated, I think, is really critical. And so, you know, I think the truth sandwich is not a bad shorthand, but it might not necessarily be exactly the right approach in every case.  ROBBINS: So you got—I mean, we have a lot of journalists on this, who are not asking questions. Come on, you guys. Ask questions! 
I mean, I think the question that we want to hear from you is: What help can Summer’s group provide for you? And for resources so that we can make it easier for you? I mean, newsroom assets that journalists told PEN should be developed include a database of experts organized by topics that reporters can turn to for help in debunking disinformation, mechanisms for collaborating across news outlets, for—let’s face it, most news organizations are under-resourced these days. And you’ve got PEN, you’ve got Summer here. And they do have resources. They already do have things that Summer can describe of what’s already out there. But, you know, she’s here. Tell her what you want.  You want to talk a little bit about your resources while people think about what they want for Christmas from you?  LOPEZ: (Laughs.) Sure. Well, let me just say a little bit about the work that we’re doing right now. So there’s work we’re going to be doing to kind of develop resources. And we’re hoping to develop sort of an online hub that will pull together a lot of the existing resources that are out there into one place that’s very accessible, as well as other things that we might develop. You know, we’d like to do sort of some short video explainers and things so that, you know, if you want to make use of, you know, this bot detection tool, or whatever, you have kind of an easy way to figure out how to do that.  You know, the other part of our work right now is kind of looking at engaging in communities that are particularly targeted by disinformation and working with, you know, trusted figures within those communities who can be sources of resilience and sources of credible information for people. So local journalists, faith leaders, librarians, educators, community leaders, and kind of helping equip them with tools as well and connecting them to each other too.   And so, you know, I do think that one of the things we did in the runup to the 2020 election was to kind of hold some town halls, virtual town halls, that were, you know, bringing together some of those folks within different communities to talk with the public about, again, what the election was going to look like, what each of their roles in that process was going to be, and answer questions. And, you know, those were actually quite well-attended. And I think there is, you know, a lot of, as I said, just kind of interest in understanding what the process is going to look like and what different people within a community, what their role is going to be, and who people can kind of go to if they have questions.  I mean, disinformation is also—you know, debunking is more effective, and fact-checking is more effective, if it’s coming from people that the audience already trusts and that they identify with. And so if you can, you know, bring people into your efforts to fact-check and debunk disinformation, who are part of the communities you’re serving and who are, you know, trusted voices, that can be much more effective. So one of the things we’re trying to do is kind of help foster some of those connections in places where, you know, people are really working on a lot of these issues already, but may not necessarily always have a chance to talk to each other about it.   And that work right now is focusing in Miami and South Florida, and Austin and Dallas-Fort Worth in Texas, and Phoenix in Arizona, which are all places PEN America also has chapters and some existing engagement and presence. So would love to hear people’s thoughts on any of that work as well. 
And then, as I said, we have a lot of—a tremendous amount of resources about online abuse, which we’ll also share both for individual journalists and for newsrooms to think about: how to be safe online, and increasingly, you know, offline—we’re seeing a lot of that shift into offline intimidation and harassment as well, unfortunately.  ROBBINS: So that’s great. We have a question, and it’s a question I would love to hear the answer to, from Julie Anderson, who I gather is the editor of the Sun Sentinel in South Florida. Ms. Anderson, do you want to ask your question?  Q: I’m also the editor of the Orlando Sentinel. And my—especially in Orlando, my audience is very different than South Florida, which is very liberal. But my audience in Orlando is—it’s blue, surrounded by red. And I get letters almost every day from readers who say, you know: How come you’re not covering Hunter Biden? How come Hunter Biden isn’t getting the same treatment as, you know, I don’t know who. But they want it on the front page every day. And so I’m very—(laughs)—wary about this—you know, it’s going to be a storyline coming around again, it seems like. So how do you think the media can handle this better this time around, without covering the wildest conspiracies?  LOPEZ: Right. No, I think it’s a great question, and really challenging. You know, I do think that it is one of the opportunities to, you know, maybe get ahead of the story, in a way, and talk people through, you know, what are the facts of this story? You know, this is something that is on people’s minds, that people do have questions and concerns about. And so, you know, an explainer kind of laying out what has actually happened in this story from the beginning, you know, anticipating that it’s likely to come around again, could be very effective.   And I think explaining, again—you know, even explaining some of the disinformation narratives around this. I think this is another thing that, you know, some research has found. I think a lot of this research is pretty nascent, so acknowledging that—(laughs)—as I suggested others should earlier. But, you know, there is research that shows that if you kind of walk people through how disinformation is often taking a kernel of truth and then manipulating and twisting it into something false. But beginning with something that may be, you know, a legitimate piece of fact, and how that is manipulated and why that might be being manipulated, and who might be behind that manipulation—to the extent that you can kind of map that out for people, which is not necessarily straightforward, but that can be very effective.  Because people don’t like to feel like they’ve been duped either, right? And I think, you know, back when a lot of the disinformation was coming internationally and then some of it, you know, being stoked by the Russian government, you know, I think there was—when a lot of that was kind of exposed as being, you know, farms of disinformation creators who were, you know, paid by the Russian government to manipulate Americans and divide our country, you know, I think there was a real sort of sense that, oh, like, that’s—you know, understanding why that might be happening and how it was happening, you know, I think broke a little bit of the effect of that, to some degree.  It’s more challenging, I think, when it’s happening domestically, because people’s own kind of political identities are bound up in it. 
But I do think that kind of just walking people through the facts of the story—you know, how some of those facts have been manipulated into falsehoods and then, you know, how and why your outlet might choose to report or not report on it. And being pretty upfront about a lot of that could be very effective.  ROBBINS: That’s an—and I believe that it’s going to be all Hunter Biden all the time once the Judiciary Committee is seated. And there may be a there there and it may be a completely legitimate news story.  LOPEZ: Right.  ROBBINS: And so we will see. That’s very helpful and quite challenging.  Chris Joyner from the Atlanta Journal-Constitution. Chris, do you want to ask your question? Or it’s a comment.  Q: Sure. Thanks. This, as I said, is less of a question, more of a comment. But, you know, I find that readers who are vulnerable to disinformation aren’t reading my newspaper. You know, they’re—readers have become so siloed that they go to their own, you know, information sources that reinforce their own opinions. I’m wondering, you know, if—what’s our role as reporters or news organizations if those people who are vulnerable are not actually coming to our site? How do we handle that?  LOPEZ: Yeah. And definitely a challenge I’ve heard a lot of folks ask about as well. I mean, I think, you know, obviously you can’t sort of force people to come to your website and seek out your information. But I do think that can go to the point of connecting to other parts of the community that may have a different reach. You know, figuring out if there are ways that, you know, you can maybe reach communities, you know, through libraries or through faith communities, and that you might be able to partner with in some way. You know, I think that there are—you know, in our conversations, there are—we’ve had quite a few conversations with librarians.   We’re PEN America, so we like libraries. And, you know, librarians are still very trusted figures, most of the time, in their communities. They’re becoming increasingly politicized by external narratives as well. But, you know, they are people that folks go to for information. So I think thinking about, you know, how—is there information you can make available to librarians that might enable some of those resources to get out into the community in different ways? But it really is challenging.   And I think, you know, one of the things that—you know, was one of the researchers we spoke to for our most recent report talked about the fact that if you are kind of immersed in a certain media ecosystem, even if you go to read other sources, you’re still reading them through the lens of sort of your primary media consumption and your—and the sort of primary narratives that you’re hearing. So it also—you know, it can be very challenging to kind of break through that.   But I do think, you know, there is, as I said, I think, a significant portion of the population right now that is really just kind of seeking and trying to figure out what information can be trusted and what sources can be trusted. And the more kind of public engagement you can do to connect with some of those folks and build some trust and bring them, you know, potentially more into your orbit, you know, I think is an important thing to at least attempt.  ROBBINS: Thanks. Alex Hargrave from the Buffalo Bulletin in Wyoming, do you want to ask your question?  Well, Alex’s question is: My name is Alex and I work for a newspaper in a community of around 5,000 in Wyoming. 
Disinformation we encounter primarily comes on Facebook, in community groups and comment sections. For example, there was somehow a rumor, untrue, that the Game and Fish Department was moving grizzlies into our area from the greater Yellowstone ecosystem. How much do we respond to false claims, either on social media or in our publication? What is the most effective way to do so?  LOPEZ: Yeah. So, I mean, I think this is something that is always sort of a question, you know: do you kind of take it on? (Laughs.) And if so, how? And, you know, as I said, I think getting a sense of whether something is, you know, getting a lot of engagement—if something is getting a lot of engagement, and it’s false, then I think taking it on and countering that with factual information is really important. If something is maybe going a little bit under the radar and it’s incorrect but it doesn’t seem that people are paying that much attention to it, then it can be better to just kind of leave it be.   You know, we have guidance for people on how to—you know, how to deal with, like, a family member who might post false information or share false information to the family chat. And, you know, again, it’s kind of the case that if something—if somebody has kind of just posted something, you know, maybe you want to kind of see—you know, if it’s a friend or a family member, you might go ahead and say something to them at that point, because maybe they’ll decide to take it down. But if it’s something that, you know, is out there and hasn’t gotten that much engagement, then its potential for harm is relatively reduced.  You know, a lot of disinformation is out there but it doesn’t go viral, and it doesn’t have the same impact. It’s a relatively small amount of disinformation that really, you know, has the most significant impact. And so I think being a little bit selective and also, you know, your outlet doesn’t have the capacity necessarily to take on everything that’s out there. So, you know, you can—you can pick and choose a bit. And then, you know, I think it depends a little bit on the source. If there’s—you know, if it’s somebody, you know, who you think is sharing something by accident, you know, a lot of what we’re talking about is also misinformation that may start out with the intent to deceive people. But the people might just be sharing it because they don’t realize it’s false. You know, that can be a little bit easier to address.  But I think, again, sort of responding with facts, responding with credible information from sources that people trust—even if you’re not sure if they might trust them. You know, I think a lot of people do actually have relatively high levels of trust in, you know, local institutions, in local government outlets. And so I think there—you know, there is some ability to bring, you know, whether it’s the local election official or the local housing official, or whoever it is, to bring their voices into the conversation can actually be an effective way to counter some of that disinformation as well.  ROBBINS: You know, I was—I was struck when you talked about the pre-bunking issue, that—to how one makes—and this is an interesting news judgment. How you make a decision when to jump on something or whether it’s better to let it lie because you don’t want to add more fuel to the fire—using as many cliches as I can possibly do in one sentence. 
And, you know, we’re good at news judgment as editors, but sort of news judgment is—it’s different to say something you’d prefer didn’t capture people’s eyes. It might be really useful for news organizations if there were some sort of central hub of people monitoring across the country when things were taking off, to warn people.   Almost an early warning system. It’s one thing in a community like Alex’s, where she will know, you know, and have a sense of whether something’s taking off in her community. But the question about Hunter Biden is a good one that is going to have more of a national resonance. And what’s taking off, and how it’s taking off, and what particular aspect of it, that would be a huge service, I would think. And that would be the sort of thing that might have to have people on Telegram or getting, you know, almost investigative in encrypted platforms. And that would be really quite helpful to warn people, when is something taking off? Because the pre-bunking would be a major challenge.  LOPEZ: Yeah, I think that’s true. And I think, you know, there is a lot that can be very effectively done at the community level. And people do feel they’re, you know, attuned to issues that are arising at that level. And I think a lot of the disinformation, even some of the, you know, kind of larger-scale stuff about the big lie, it actually, you know, a lot of the actual examples of things that people are hearing about might be, you know, related to, somebody gave the example of, like, you know, when the absentee ballots were dropped off at a voting location, you know, they were described on somebody’s Facebook post as, like, ballots being dumped behind a building, and made to sound very questionable. And so, you know, there is a lot of—the local pre-bunking is really important.   But I do think in terms of, you know, tracking kind of what are some of the national issues that are likely to—that are either likely to spark disinformation or just kind of confusion and questions among the public, or where we start to see narratives emerging, I do think there’s probably, you know, more we could do to, you know, identify ways that journalists can kind of track that more easily, and see some of that as it’s starting to emerge.  ROBBINS: So Andrew, you have a very depressing comment. Do you want to make it quickly? (Laughs.) So Andrew Abel’s comment is—  Q: Yeah, I’ll let you read it. I’ll let you read it. You probably have a better microphone. Why don’t you go ahead?  ROBBINS: Well, Andrew Abel, who is an editor and reporter at the Mercersburg, Pennsylvania, Journal, says the problem he sees in his community is that disinformation is keyed to the struggle for power. And that people in his rural community feel powerless. Many are not interested in accurate reporting. The question is merely what messages empower. And he wonders if the approaches we currently use to counter disinformation are based on the faulty assumption that truth matters equally across social groups.  LOPEZ: Yeah. (Laughs.) You know, I think—I mean, I think the answer’s a little bit in your question, right? I mean, I think that people—the fact that people feel powerless makes them, to some degree, you know, more vulnerable to being targeted with disinformation, because they are looking for things that give them a sense that they have power and that they have, you know, again, the ability to kind of determine—to better understand COVID themselves than the CDC does, or something like that. 
And, you know, again, I think it ties into the sense of anxiety that people have just generally about society and the world we live in right now.   And so, you know, it’s not—again, I repeat the point about kind of coming at this with empathy. This is not about people, you know, just being uneducated or anything in particular. It’s really just that we’re all vulnerable to disinformation. Any of us could be duped. Probably most of us have been at some point. And but people are exploiting people’s, you know, sense of anxiety, and people’s sense of powerlessness in a really kind of traumatic moment. And so I think finding—you know, finding ways to make people feel like they, you know, have some ability to make these decisions themselves, to assess truth from falsehood.   I mean, that—as PEN America, we don’t believe the solution to disinformation is censorship, right? (Laughs.) We’re not all about taking everything down. That’s bad. We’re about empowering people to, you know, have the tools and the knowledge to assess the information that they’re consuming and make informed decisions. And I think that’s a really critical way to frame it as well. And even talking about disinformation can be very fraught for a lot of people at this point. But if you’re talking about, you know, empowering people as information consumers, talking about access to credible information, I think that can be a much more effective narrative that helps people feel they have a sense of power.  I don’t think we’ve really reckoned with the fact that people consume information now in a completely different way than ever before in human history, and that that’s happened in the last fifteen years. And we haven’t really adjusted our lives to it significantly. So, you know, I think it’s very real, what we’re experiencing right now. And it’s understandable. And so I think we have to, you know, help people feel that they can be part of the solution as well.  ROBBINS: Well, Summer, I wanted to thank you. And we’re going to turn it back to Irina, but just keeping in mind—and thank everyone for coming in strong in the end with questions. And we’re going to share all sorts of links with you that Summer’s going to share with us. And hope that you will share with us any questions that you have, and suggestions of support that you can use in your newsroom. So, Summer, thank you so much for doing this. And back to Irina.  FASKIANOS: Thank you very much, Carla and Summer. This was a really good conversation. And thanks to all of you. We couldn’t get to all of your comments, but we did the best that we could. As Carla said, we will send out the link to the webinar and transcript, and resources. You can follow Summer on Twitter at @summerelopez and Carla at @robbinscarla. And as always, we encourage you to visit CFR.org, ForeignAffairs.com, and ThinkGlobalHealth.org for the latest developments and analysis on international trends and how they are affecting the U.S. And please do write us to share suggestions for future webinar topics or speakers. You can email us at [email protected]. So thank you, again, for today’s conversation.   (END)