To understand today's guest and our conversation, first I need you to listen to the voice of Elon Musk.
And I'm not saying that I have all the answers here.
This is Musk back in April of 2022, when he had just started his quest to take over the company formerly known as Twitter. And he gave this interview with the head of TED Talks, Chris Anderson. Anderson asked how Musk, a self-described free speech absolutist, would handle hate speech on the platform.
It won't be perfect, but I think we wanted to really have speech as free as reasonably possible. And a good sign as to whether there is free speech is: is someone you don't like allowed to say something you don't like? It's damn annoying. That is a sign of a healthy, functioning free speech situation.
And you probably heard a bit about what happened after Elon started running the show. He reportedly brought back accounts that were banned for hate speech, and the company now allows the spread of tweets containing content that is racist, homophobic, neo-Nazi, or anti-Semitic, especially from the new people paying $8 a month for so-called blue check status. And when you hear a news report that explains these things citing research, it most likely came from Imran Ahmed. And today, we're going to tell you his story, because Elon Musk is suing him and the organization he founded, the Center for Countering Digital Hate.
Which is an organization that studies hate speech and disinformation online and tries to create costs for the production and distribution of that hate and disinformation to sort of disincentivize its spread.
This is the man in the middle of the misinformation wars. We'll talk to him about his twin battles with Elon Musk and House Republicans who accuse him of censorship, how he deals with the growing body of political partisans skeptical of his work, and what it all means for the next few years of addressing hate and conspiracy theories online. I'm Audie Cornish. And this is The Assignment. Behind every crusade, behind every pursuit of justice, there is an origin story. For Imran Ahmed, it all started in 2016.
When I was working in the British Parliament. So I was actually a special adviser, as they're called, to the shadow foreign secretary.
He was working for the Labour Party, which at the time was being accused of serious failings with how they dealt with allegations of anti-Semitism among some members of the party.
And we realized that they were being influenced by folks who were infiltrating Facebook groups and spreading anti-Semitic lies within them. So reshaping, resocializing the Labour Party.
Then came the battle that shook the political landscape of the UK: Brexit.
So I was lent out to the guy who was running the campaign for the Labour Party for us to remain in the European Union. And what I was seeing there was a wave of conspiracy theories and hate coming from the other political side, from the right. And that referendum culminated, for me anyway, in the assassination of my colleague Jo Cox MP, a mother of two. She was the most brilliant politician I'd met.
British member of Parliament Jo Cox was attacked in her constituency in northern England shortly after noon on Thursday. Local media say a man shot and stabbed her outside a library in Birstall, near Leeds, where she was meeting local people.
So this heartbreaking moment of violence, this is where it started for Imran.
I think I'm still grieving. I think I'm still... The Center for Countering Digital Hate is, in one respect, an expression of that grief. I mean, I could tell you about the time I spent, you know, talking to my therapist or talking to my family or talking to my wife and talking to others and crying. And I remember getting into the lift after I knew that... the initial reporting was that she'd been shot. No one knew that she'd died, but I'd been told, because I was advising the guy in charge of the campaign, and the police had told us that she'd died. And I got into the lift and I just sobbed. And a guy who was a few floors down from me got into the lift, and I don't think he'd ever seen another man sobbing, sort of sitting on the floor of the lift. And he asked me what happened. And I said, "They killed one of us. They killed one of us." But I didn't know who they were at that time, and I didn't know who was responsible. I don't remember the immediate days after that, you know, having to work, having to grieve, having to sort through all these emotions and thoughts and...
But I hear a person who doesn't stop.
Well. No. And you know, it is in my personality. I am the eldest of seven. I grew up in a very poor working class family in the UK. You know, I had to do a lot of parenting of my siblings. And whenever something needed to be done, I'd always go, okay, I'll do it.
Right. And you know what it's like, you have that instinct too: when everyone's arguing about who's going to go to the shop to buy the cookies, you're like, I'll just do it.
I'll get the cookies. (Laugh)
I'll go and do it. And so when I saw this problem, I was like, well, look, no one can see what I think I can see. Which is, at that time, in 2016, if you told people, "I think Facebook is partly responsible for what's happened," they'd go, "What on Earth are you talking about? That's online. It's not the real world." And so, you know.
Or that it's harmless, right. I think there's a sense of like, you know, people just sort of chattering online.
Online and offline worlds are completely separate in their minds.
And even the idea of trolling. I think people see it as just words, as actually just trying to get a rise out of you, and if you don't take it seriously, then it won't have effects. Or, the idea that Twitter is not real life, which we don't hear anymore, right? It was like, just because a thing is happening online doesn't mean it really has impact.
But of course we have these squishy simian brains that can't necessarily distinguish between the fear and the anxiety that we might feel because someone is abusing us and calling us bad names. And, you know, during the pandemic I actually worked with vaccinologists who were being trolled online, and I remember saying to them, the reason that they're trolling you is not because you're a bad person. They're trolling you because you're a good person, and they want you to stop doing what you do. And I remember one in particular burst into tears and said, "Oh my God, it feels like I'm being given permission to get on with my job, to just say to these people, actually, my greatest revenge is to continue doing my job. The most effective rebuttal I can think of is to do my job." You know, the bad guys know that online environments have an impact on our psychology offline, that they can significantly... they can resh...
Shift how we think about things.
Shift how we think individually, but also societally as well. Socially, it can reshape what we think is important, and it can create social proof for fringe ideologies and theories and ideas. They can use terror to try and persuade journalists not to cover certain issues, because those journalists just think, well, crumbs, do I really want to be exposed to a nonstop month of trolling after this?
Right. Or doxing or, you know, etc..
And that's where it started. What it started with was the question: what about actors weaponizing digital environments to create harm in the real world? That's what the project started as. And here's the funny thing. In 2016, when I started it, I started working with the social media companies. So I knew executives there, and I called them up and I said, hey, look, I think something really serious is happening.
So you create this center for countering digital hate. I'm going to ask you something that sounds really simple, but I think I need to hear it out loud. What do you consider online hate? How do you define it?
You know, there's a temptation sometimes to echo the words of a Supreme Court justice who, when asked what pornography was, said, "I know it when I see it." But what we tend to do is show, with examples, how platforms treat the most extreme forms of hatred. So, for example, organized violent extremist groups who talk about going and causing violence to Black people or to Jewish people or to others. We talk about the most extreme types of content. So stuff that glorifies Hitler and says that we should finish the job that he started, or that says that gay people should be shot and killed because of who they love. So we...
But online, that takes the form of what? Memes, jokes, forums where people are discussing their fantasy action. So what are you looking for?
There is, you know, this assumption that what we do is go and look through the dark reaches of the internet. We don't. What we do is look at, for example, Twitter or Facebook. So Twitter and Facebook have rules, right?
Formerly known as Twitter, now known as X.
They have community standards, and they are the responsibility... so we all sign up to abide by the community standards. We all sign up to the terms of service when we join these companies and start posting there. That's our responsibility as users: we're meant to abide by those rules. But every responsibility has a reciprocal right attached to it, which is that we expect others to abide by those rules when they engage with us, and we expect someone to enforce those rules. And that's where the failure happened.
So full disclosure: we've used research from the Center for our reporting on social media and young people and eating disorders. But I've been reminded that you and I have technically met before in an interview. Do you remember it? Because you do a ton of interviews.
I do. But I remember the interview with you distinctly because it had a real effect. It had a real impact. So at the beginning of the pandemic, right at the beginning, March 2020, we decided to shift all of our work from looking at identity-based hate to COVID-related disinformation. There was a calculation we made that that was going to take more lives than identity-based hate content, and that we should focus our efforts on protecting life as much as possible. So initially we looked at COVID disinformation, and then we realized that anti-vaxxers, this organized group that had been around for 20 years, were weaponizing those spaces really effectively. And we wrote a series of reports: The Anti-Vaxx Industry, The Anti-Vaxx Playbook, and then, most famously, The Disinformation Dozen, which showed that 12 individuals and their companies were producing two thirds of the disinformation that was being shared online. And that report, you decided to talk to us about. And I remember this very distinctly: you talked to us some months after the report came out, but it was only after I talked to you about it that it kind of exploded. And very shortly afterwards, the White House was talking about it and, you know, everyone was talking about it. So in part, thank you. In part, you have made my life bananas.
That's fair. That's fair. But the reason I'm asking is because it actually shows the power of, and the relationship between, what the Center does and the media. Right? So I didn't go into that conversation looking to do that. But the amplification of something coming from a mainstream news organization also adds a kind of legitimacy to the work that you do.
Oh, yeah, absolutely. And look.
And there's pros and cons to that.
I think one of the things about working with the mainstream media as well is that no one will ever give your work a more thorough check, a more thorough going-through, working out if it's legitimate, a more rigorous shakedown of whether or not you're doing an effective job, than when you brief the media. And, you know, I learned a lot of lessons from my time in politics. The first lesson being: never lie to a journalist. The second lesson being: if you provide bad data to a journalist once, they will never, ever talk to you again. And I think one of the things that I'm so proud of is that CCDH has maintained a reputation amongst the journalists that we talk to, from NPR through to Fox, that at the very least, you can cite our research and our findings, and you ain't going to have to correct that later on.
Does what you're doing actually work? Like, you put all the information out there, but does it really dissuade the people you consider bad actors? And, I ask this of myself as a journalist, does it actually mean anything to the public when that information reaches them?
I mean, in one respect, what we haven't talked about is that the actual structure of our organization is to try and create costs for the production and distribution of hate and disinformation. You know, we've spoken a lot about the emotional journey of CCDH and some of the philosophical aspects of it. But actually, at its core, what we are is a market solution to a market and regulatory failure: the production and distribution of hate and disinformation are profitable. And that's the problem that social media is creating.
And you want to make it not profitable.
And so how do you create costs? So, for example for...
But to come back to it, to what you're saying though: yeah, but does it work?
Like when you counter information.
By encouraging platforms... by encouraging platforms, through the Disinformation Dozen study, to not provide a megaphone to people on that list, what we actually did was create costs for them, because they were not able to use those platforms anymore.
But they're actually able to turn around and say, look, I'm under attack from the Democratic slash counter-information censorship regime...
But you're talking to your own audience. And this is a political question of, like, if you're talking to your own audience, saying, you know, you are essentially left in your tiny little echo chamber complaining about censorship. But we also know one other thing, which is the lawsuits that have been filed against us and the legal threats that have been made. And when someone files a lawsuit against you, they have to explain what harm has been done to them. We get a lot of complaints from people that our research, which brings to a wider audience and to the public what people are doing and saying, thinking it's in private, online on social media, and the harm that's created as a result, has ramifications for them, whether they be social, economic, being deplatformed, or whatever else. And they then complain, "Well, how dare you repeat our words?" Well, you shouldn't have said them in the first place. We get a lot of deflection and projection where people are saying, it's you that's the problem. And we're like, no, no, sir, you're the one that said it in the first instance.
My guest today is Imran Ahmed, CEO of the Center for Countering Digital Hate. When we come back: the lawsuit filed against the Center by Elon Musk, and the rising mistrust, by some, of watchdog groups like his.
(MUSIC) Remember at the start of this conversation, we played you a clip of Elon Musk giving his definition of free speech. Well, now he's challenging those who say that Twitter lets its users go too far, including the Center for Countering Digital Hate. He's suing CCDH for its reports that reveal an increase in hate speech on Twitter. We spoke with X Corp, not on tape, and they claim that CCDH illegally accessed the data for their studies through a third party and, quote, "created incomplete and misleading research. The CCDH is actively working against organizations it doesn't agree with by targeting their revenue streams." The statement also reads, "Free expression is fundamental to a healthy, functioning global society and X will continue to stand up for people's rights. It says X is a free public service funded largely by advertisers, and the CCDH misleads advertisers into actions that will hurt the public." Alright, so back to our conversation with Imran Ahmed about how the Center for Countering Digital Hate approaches its work.
Let's take an example of what Elon Musk has complained about. What we found was that in the month following his takeover of the platform compared to the year beforehand, that the volume of tweets containing the N-word went up by 202%.
The volume of tweets containing homophobic words...
The bad N-word or the good N-word, like, how do you make those decisions?
Well, so the way that we analyzed it was that there are some people who use that word in a reclaimed fashion; African-Americans can use it in a reclaimed fashion. That would form the baseline of its use. So there will be some people who were using it in a malignant way, some people who were using it in a normal way. But the sudden increase that's caused by the moment of his takeover cannot be explained by people using that word in a reclaimed way. When you consider as well that, in the same study, the use of the most offensive homophobic term, the most offensive misogynist term, the most offensive transphobic term, and anti-Semitic terms all rose in the same instance. And our argument was, that's caused by him essentially putting up the signal to bad actors to say, this is now somewhere where you are safer using... behaving in that kind of way.
And not that this is now a greater, wider venue for free speech?
Hate speech isn't... hate speech... I mean, it's a greater venue for hate speech, yes. That's what he was saying: that this is now somewhere where you can freely use hate speech without consequences. And there are very, very few places in society...
You can hear why I'm asking, right? Because this is an ongoing, full-throated counter-response over the last few years. And in the U.S., there is political backing for it, meaning the Committee on the Judiciary in the House, led by Jim Jordan. You know, they're having an active investigation, right? They've reached out to you guys specifically. Essentially, their argument is that this machine, your center and others, is about censorship.
And that in particular, conservatives bear the brunt of that.
I think it's a really bizarre thing to accuse an organization like CCDH of. What we do is put up a mirror to these platforms. And if they don't like the image they see in reflection, if advertisers, if other members of society then say, well, that's a bad thing, we don't want to see more hate speech, then that's not our fault for holding up the mirror. It's their fault for having permitted the proliferation of hate speech and everything else. You asked me the question earlier on: who are you to impose consequences? We don't impose the consequences. The X Corp lawsuit says we lost tens of millions of dollars in advertising because of your studies showing an increase in hate speech. We did about ten studies that showed various types of hate speech over an 8-to-10-month period. And it's others that impose those consequences.
But the line of questioning I'm putting to you now, you are facing from various other quarters, right? The U.S. House has reached out to you. You are now facing a lawsuit. What was it like to be served with that lawsuit?
The irony is that this has become a conversation in which I feel the lack of specificity, that is, the diffuseness of the charge: you're a censorship organization. Jim Jordan's specific charge was that we had worked with the Biden administration to censor the Disinformation Dozen. So his argument is that you in some way have a relationship with the federal government and that you are part of an apparatus of government. And our argument in response was: we don't have any contract with the government. We don't have any formal relationship with the government. If they want to read our research and then cite it, what's that to do with us? I mean, we can't do anything about that. And, you know, I think it's a mistake by Mr. Musk to actually file a lawsuit, because on Twitter he could say, these are a bunch of liars, and no one would do anything about it. They do not claim defamation in their lawsuit, in the charges that they put to us. Defamation would be a claim that our data was wrong or untrue. What they've claimed is that the mechanisms that we used to take data breach their terms of service. Now, we are going to respond to that, and we will vigorously contest it in court, and I look forward to doing so. But the beauty of a lawsuit is that it's now going to be tested in the one place where facts really, really, really matter, which is a court. And so these facts are going to be put to a judge, and he or she is going to have to make a decision on whether or not that case is true. And so I look forward to that.
And I just want to make sure... let me just read a line from it. It says that your organization basically took things out of context, intentionally mischaracterizing data in the research reports it prepares to make it appear as if a few specific users, often media organizations and high-profile individuals, are overwhelming social media platforms with content that CCDH deems harmful, and then used that contrived narrative to call for companies to stop advertising on X.
Well, if you take out the "as if" language, it is broadly true, isn't it, that what we did was produce research, which advertisers then read, and advertisers thought, crumbs, we don't really want to advertise on a platform that has lots of hate speech on it. And the irony is that their problem with us is that we used techniques to get that data which they don't like. Not that the data is untrue. What they seem to be annoyed about is that when we held up a mirror to Twitter, they didn't like the reflection in it, and others didn't like the reflection either. And Elon Musk, rather than doing what anyone else does when they don't like what they see in the mirror, which is to go on a diet or, you know, comb their hair or brush their teeth, is suing the mirror.
I want to ask you what it's like to start to feel this pressure: political, legal and, I assume, personal. Surely this hasn't come without cost?
I mean, I can't tell you it's scary, because it wasn't scary. What happened was, about a month before the lawsuit came in, I'd actually met with Linda Yaccarino.
Who is the marketing executive, who is now the chief executive of X.
And I put to her some of the data that we'd found, and she asked me if I wanted to come in and have a meeting with her one-to-one in San Francisco. And I said, I'll be there in two weeks; I was already going to be there in two weeks. And it was while I was in San Francisco that Musk started tweeting, this guy's a rat and his organization is evil. Again, I mean, you know...
And what was the response after he did that?
Some people were abusive, but what I did was screenshot the tweets and say, look, he keeps calling us evil. If you want to clap back, why don't you donate to CCDH? And, you know, Mark Ruffalo and a few other folks decided to go and amplify that. That must have really annoyed him, because the next day he called the chair of my board, as though I was a naughty child and he was calling my father to say, you know, you need to tell little Imran to stop playing on my lawn. And the chair of my board said, you can speak to Imran if you want to speak to anyone.
What I'm saying is, you are now playing in an arena with very high stakes and very intense players. People are doubting fundamentally the work of you and others...
Oh, I would definitely say in the U.S., a good number of Republican voters. The Pew Research on that is fairly good.
No no, doubting our research?
No, doubting the research of fact checkers and misinformation researchers in general. So, Pew Research, a 2020 report: majorities in both major parties believe censorship is likely occurring, but this belief is especially common and growing among Republicans. Nine in ten Republicans, and independents who lean toward the Republican Party, say it's at least somewhat likely that social media platforms censor political viewpoints they find objectionable. There is a group of people who consider this a kind of policing.
If you put it to people that that is policing, then I think they're worried about it. If you ask them...
Well, I don't think Pew said that; I'm just reading to you, you know, from the report. I'm trying to get a sense from you: do you feel a new kind of outward pressure?
Lawsuit? You think this is just literally one or two or three people. You don't see a broader kind of political pushback or backlash?
This is what happens if you are effective. This is the price of success, not failure. I think there is this assumption that maybe we're doubting ourselves, that we are scared, that we are vulnerable right now. But I don't feel scared or vulnerable. Of course, it's natural, it's healthy to doubt yourself all the time. So we've checked our work. We've made sure that we're confident. We're about to have that tested in court. We are, of course, really aware of that. But what I'm particularly aware of is that this is an opportunity for us to speak to even more people. So that when people say censorship, or they just shout whatever slogan they've got on the tip of their tongue, we can say, well, hold on a second. You're saying censorship. I'm saying we all agree that hate speech is bad, and that if we find that the use of the most extreme anti-Black term increases by 202% in the month after he takes over, surely we think that's a bad thing, right? And no one's ever said to me, not Jim Jordan or Elon Musk or Linda Yaccarino or Mark Zuckerberg, that that's not a bad thing. So we have a perspective that hate speech is wrong. It's evil. It reshapes our society. Its normalization would be retrograde and detrimental. It would harm the ability of Black people to, for example, tweet online, because they'd just think, well, why would I want to go into an environment that's just horrible and vicious? It actually impinges on, it reduces, the freedom of speech for some people, while giving other people the ability to spout hate speech without consequences. And our argument is that in every other sphere of human interaction, hate speech actually attracts consequences.
And so you want to create a similar kind of consequence in the digital space.
I think that where there is impunity for hate, hate will breed, and hate is corrosive to our society, to democracies, and to our ability to have prosperity, inclusivity, all the things.
What keeps you up at night?
What scares me is that there is a closing window. So we've just finished some research on 14-to-17-year-olds. We polled a thousand kids and a thousand adults, and we looked at the level of belief in conspiracy theories in this first generation that's been raised on algorithmically ordered short-form video platforms like Twitter, YouTube, TikTok, Instagram, etc. So 43% of 14-to-17-year-olds said that Jews have a disproportionate control of the economy and of our politics. They had a higher level of belief than adults in nine conspiracy theories that we put to them.
Was this a digital poll or a phone call survey?
It was an online poll using a panel that's, you know, approved by all the different authorities. So it was a thousand adults, a thousand children. Robust sample sizes, done by Survation, a respectable polling company that was just featured in the press a couple of days ago. And that research shows that young people, the generation that's grown up immersed in social media, have a much higher level of conspiracist belief. That makes me worry that the window for change is closing. Because, you know, with every other type of harm in our society, we always say, well, the kids are all right. They know what's going on. They'll fix this. But what if social media is so affecting our young people? Every parent knows that it affects their body image, it affects their self-worth, it affects their mental health. But what if we're actually damaging their grip on reality, on any kind of objective truth? What if we're undermining their ability to participate in democracy?
So there's a window. And what scares me is that I will not succeed before that window has closed. Because our democracy is the thing that protects us against everything else I fear, whether that's recession and economic decline, whether that's climate chaos, whether that's civil war or anything else. It's democracy that protects us. It's the strength of our democracy that protects us. And the values that underpin it are currently being weakened at a profound level by the way that social media algorithms work, the way that they amplify the hateful, the contentious, over the tolerant.
And build community around these ideas as well.
And so that's my fear: that we won't be fast enough. And you know, what really annoys me about this litigation is not actually the litigation itself, or Jim Jordan. It's not that they are imposing costs on us, though it is costing us lots of money; we've had to raise money, and we're still trying to raise money on our website right now for these cases, because they're expensive. It's that they actually take away from my time. And I know that I have a job to do, because someone needs to speak up for the majority of Americans who, yes, many of them may worry about censorship and freedom of speech, the most profound and fundamental of values in democracies, and I come from a liberal democracy too, the United Kingdom, one that's very similar in that respect to the United States. But more fundamentally, they worry about the impact of social media on their kids' psychology, because they see it every day, on their families, on their communities, on the nation, on democracy. And we've seen those all challenged in recent years. We've seen them all shaken.
There are, as you mentioned, stories that I'm sometimes pretty nervous to wade into because of the costs, so to speak. Black woman, journalist. It's going to cost to wade into some things. For you personally, what kind of toll is this all taking?
Two things have really affected my life. So I'm 44 years old. I didn't really grow up in the Cold War. I felt safe. I felt no exogenous threat; there was no outside threat that could harm me, until 9/11. I was working at Merrill Lynch, I was an investment banker, and 9/11 happened, and suddenly the world felt scary. And I actually quit everything. I went back to college; I studied politics at Cambridge straight after that. And that was the one thing that made me... you know, I realized who I was as a human being. I was someone who, when I saw fear, wanted to confront it, wanted to deal with it. The second thing that really changed me was the murder of my colleague and the work that we've done since then. I think the problem is that when I start getting upset, right at the beginning of the interview, talking about my colleague, it really affects my ability to think for the rest of it. I mean, you could see me in the studio, and my eyes were filling up, so I'm still a bit shaken from talking about it.
That's interesting, because you have talked about it publicly a lot.
I cry every time. Every single time. There are two things in the world that make me cry: talking about Jo, and Paddington, the movie. Because he's just a little bear that wants a home!
Oh, I know. See, you're good at plowing through. That's what I hear, right? Somebody who just, you know.
Just get on with it. Come on. Like someone's got to do it. You know, my...
And I wouldn't have known you were upset. Really.
I have good resting calm face.
Resting smile face. Let me ask it a different way. To do the work of bean-counting humanity's most toxic comments and ideas seems hard and stressful. What do you need to do to release that at the end of the day?
That's the irony: for me to be able to release at the end of the day, I need to have done something to confront the things that scare me. I had one spiritual moment in my life. I'm not a religious person. I'm not a spiritual person. But I had one moment where I was in a senator's office, and I saw he had that quote from Martin Luther King: the moral arc of history is long, but it bends toward justice. And I'm a scientist by training, so I kind of went, well, nothing bends toward justice without a force. You need a force. And in that moment, I suddenly saw this arc, and I saw hands on that arc pushing it towards justice. And I saw my own hands, and one was pushing towards and one was pushing against. And I thought about my behavior as a human being, and about myself as a human being, and what I dedicated my life to. And I decided in that moment I was going to push towards justice with both hands in everything I do. And I think that by doing that, by fulfilling that sort of moment of clarity, I have given myself the peace that I need to be able to sleep well at night, to be able to love without encumbrance, to be able to live without fear. Because I know that whatever happens, I have spent my life trying to make the world a better place. I'm quite confident of that.
Imran Ahmed is the CEO of the Center for Countering Digital Hate. Now, that's it for this episode of The Assignment. If you liked it, please share it with your friends. If you love it, rate it. Write a review. It matters. The Assignment is a production of CNN Audio. Now, this episode was produced by Laurie Galaretta. Our producers are Carla Javier, Isoke Samuel, Jennifer Lei and Dan Bloom. Our senior producer is Matt Martinez. David Schulman did our mixing and sound design and our technical director is Dan Dzula. Steve Lickteig is our executive producer and special thanks to Katie Hinman. I'm Audie Cornish, and I want to thank you for listening.