The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.
***
>> JIM PRENDERGAST: All right, good morning, everyone.
I think we'll get started. Let me get to the screen, the appropriate screen.
So thanks, everybody, for coming. Good morning, good afternoon, good evening. Whether you're joining us in person or virtually, welcome. My name is Jim Prendergast, and I'm your moderator for today's session, which is titled Combating Misinformation with Election Coalitions. If this isn't the session you thought it would be, we'd like you to stay anyway.
So 2024 is a watershed year for elections. The UN called it a super year for elections. 60 plus countries held elections this year, I believe that's an all-time record.
At a time when elections around the globe are increasingly vulnerable to the spread of misinformation, the stakes couldn't have been higher. Disinformation campaigns not only undermine electoral integrity, but also erode trust in civic institutions and, in some cases, polarize societies. But there's good news, and that's what we want to talk about.
Today we'll talk about partnerships between governments, Civil Society, the private sector, and fact checkers in countering the rising tide of misinformation. These coalitions have emerged as a promising approach to build trust, promote credible information, and strengthen election resilience.
But their effectiveness depends on a lot of factors, including strong coordination, shared resources, and clear strategies. I'm excited to be joined by a great group of experts on this topic who bring a diverse set of perspectives and extensive experience to the table, both in person and virtually.
First off, I'll introduce Alex Walden who is seated here at the table with me. Wave to everybody online.
Mevan Babakar, who's the news and information credibility lead for Google. She is joining us from London.
David Ajikobi, who's the Nigeria editor for Africa Check; he's also remote.
And then finally Daniel Bramatti who's joining us from Brazil. He's an investigative journalist and he wins the prize for the earliest time zone as a presenter -- speaker.
Before we begin, just a couple things to point out. There are our speakers. Our session is going to start off with a couple of brief presentations from our speakers. I'll kick it off with a couple of questions, but we really want this to be interactive and highly participatory. So for those of you online and those of you in person, we'll really encourage questions and conversation and discussion.
With that, let's get started. Alex, could you kick us off and help set the scene by explaining why you think election coalitions are important?
>> ALEX WALDEN: Sure, thanks.
I think, you know, this has been a banner year for global elections, and Google has taken that seriously across the dozens of elections that happened around the world this year. It's timely to have this conversation, reflecting on the successes of the approaches that industry and our partners have taken, and also looking forward to what we need to do to strengthen them. I'm really glad we're having this conversation today.
I also think it's appropriate that we would be having this conversation at the IGF, where we are all focused on the multi-stakeholder model and its importance. Everything about what we're doing here at IGF is focused on the necessity of government, Civil Society, and companies working together to ensure that we're all realizing the benefits of what technology can deliver, and those relationships and networks should also inform how we address problems that come before us.
That's true across many types of issues, and in particular that's true across elections. I think, you know, my colleagues across the panel today are the best experts to demonstrate and talk through the ways that we've seen these successes.
At Google we have billions of people who come to our products every day, and in the election context people are coming to find information, including information about where to vote. So we have an obligation and a responsibility to make sure we're doing our best to deliver good information to those users. But it's also incumbent upon us to engage with the rest of the ecosystem, because some things are not entirely in our power to change, and we need to work with the rest of the ecosystem to ensure that there's integrity in the way information is delivered to all these billions of users around the world in the election context.
So again, I'll stop there. I think Google's really excited to be having this conversation and hear the input from everybody in the room and online about how we do this work going forward.
>> JIM PRENDERGAST: Great. Thanks a lot, Alex.
I'd now like to turn to Mevan who's going to explain to us a project that she worked on, something she developed called the Elections Playbook. I'll be driving the slides for you, so just let me know when you want to advance.
>> MEVAN BABAKAR: Perfect. Thank you very much. Can you all hear me?
Excellent. Okay, great. Let me just quickly -- hi, I'm Mevan. I work at Google as well, on strategy for knowledge and information across the company. That touches on Ads and other products that we have.
But previously I worked in the Google News Initiative, and before Google I actually worked in fact checking for a decade. I used to work at Full Fact, the UK's independent fact checking charity. In my time at Full Fact, and also at Google, I saw the power of election coalitions. And one of the things that became very clear to me is that election coalitions are actually quite a magical way of scaling the work of journalists and campaigners around the world, especially during elections.
So today I'm going to talk to you a little bit about the short history of election coalitions; about an election coalitions research project that we've done, specifically to capture some of the learnings from around the world; about how you can form and organize an election coalition; and about some of the lessons learned from all of the interviews that we did as well.
We've got ten minutes, it will be a bit of a whistle stop tour, but if anyone has questions, feel free to jump in and ask them.
Next slide, please.
So 2024 was a very big election year, as Jim mentioned, and more than 2 billion people voted in over 60 different countries. But as we all know, mis- and disinformation has been around for a lot of elections. There are ways to combat it, but there is no silver bullet. There are very subtle, and sometimes not so subtle, nuances between countries that make quite a big difference in how they combat misinformation. All of these things actually necessitate specific country-level interventions.
Over the past decade, journalists and fact checkers have come together to form these election coalitions. At a very top line, an election coalition is when journalists, fact checkers, community organizers, and lawyers pool and share resources, and the impact of the work that they do, during a specific event like an election.
So it might be actually sharing the resources of the media monitoring, the actual research that they do. It might be sharing the learnings of the actual fact checks or the journalism. It might be sharing the impact or scaling the actual outcomes of the work itself.
One of the earliest examples was Electionland in 2016, a U.S. coalition of 1,100 journalists working together in a nationwide effort to cover voting rights. There was a narrative going around that the election was rigged; that narrative is one that still exists today, and key claims that the election was rigged come up each time.
Historically, at least in 2016, the news rooms were primarily focused on reporting the outcome of what happened on election day and the political ins and outs of the run-up.
Voting issues were sort of relegated to secondary coverage. So a bunch of journalists and news rooms came together and started an election coalition to combat that. Especially in the U.S., election laws vary drastically from state to state and even county by county, and no single news room was in a position at the time to cover all of this. All these news rooms came together, and one of the things they did, which was actually quite new at the time, was using social media to alert the local news rooms and journalists taking part to specific claims coming up about the election being rigged, so that they could localize specific narratives and claims to certain regions.
And on top of that, they had 1,100 journalists who could immediately and authoritatively rebut some of the pieces of misinformation that were coming out.
It kind of showed at the time that news organizations, A, could work together, and B, could collaboratively serve as a watchdog for this crucial democratic moment that was taking place. And the project went on to win an Online Journalism Award for its 2016 work.
Since 2016, there have been at least eight more coalitions, probably more like 12 at this point. They have operated not just for single elections but sometimes across multiple elections; one that we'll hear more about later has run for multiple years now. More recently, in 2024, although their logo isn't on this slide, we had a coalition in India of about 40 organizations coming together. And in the EU there was Elections24Check, with 45 organizations across Europe working across 37 countries, which published 3,000-plus fact checks around the EU elections.
I think it really shows that when news rooms come together or when fact checkers and news rooms come together, the impact can scale drastically. There's something quite special in that model. Next slide, thanks.
Google has a long history of supporting these election coalitions, and we wanted to understand how to effectively build them and what they should look like to serve the needs of voters in the run-up to elections. More importantly, there had already been so much learning over the past decade, and it felt a bit like every single time everyone was starting from scratch.
So we ran a six-month research project and talked to all of the election coalitions that had come before, to understand exactly what the best ways of setting them up are, what the lessons learned over the past decade have been, and how we can effectively build them going forward.
We ended up talking to 15 global experts, and the countries that we touched on were France, Brazil, Argentina, Mexico, Nigeria, and the Philippines, as well as the U.S.
Yeah, one of the key things that came out of the learnings is that there's really no one-size-fits-all approach to building a successful coalition. Each country is very unique in how it's set up: there are often different election laws and different voting systems, and there are different news consumption habits, with channels like radio and TV turned up or down, so you'll need to change how you do your monitoring.
There are often different types of disinformation taking place. There are instances of claims that are more honest mistakes, where something has been misinterpreted, but sometimes there's direct foreign interference, and these things need different approaches.
But having said that, there are things shared across all the successful election coalitions, and by asking the right questions, we can start to build something much quicker and much more viable.
So the first of the stages and needs that came up for election coalitions is to identify the need: actually understand what it is you're trying to do in the first instance. Although they're called election coalitions, a lot of them are really about misinformation and disinformation. They kind of share the same model, so figure out what the specific need is that you're trying to meet.
It's also become clear that you need to identify the lead. A specific organization often takes charge of the larger coalition, not necessarily as the spokesperson, but as the organizing lead, for any kind of coalition to take shape.
This is one of the things that came out of the interviews: it's really important that the organization that takes the lead in that country context is seen as neutral, or as balanced, as possible. The really key point of election coalitions is that you want a broad spectrum of actors and journalists that meet the needs of voters. And depending on how polarized the ecosystem is, you might want to use the coalition as a means of building trust in institutions, or in journalism, or in fact checking, whatever it is that's needed in that country.
So having that as a key aim really, really helps. And identifying a lead that is as neutral as possible helps build that bridge.
Determining the membership, whether it's formal or informal, whether actually you're focusing on subject experts or technology partners, these are all very important steps and things to formalize before the coalition comes together.
And then I think the next two are very important. Implementing capacity-building programs is especially important for an election coalition when there are multiple media organizations working together, because historically those media organizations have worked against one another.
They're competitors. And what they're doing here is actually quite unique: they're coming together, sharing resources, sharing outputs, and working in a much more collaborative way. So trust building is an incredibly important part of these election coalitions.
And trust isn't something that is earned overnight, it is earned through example, it's earned through case studies, it's earned through the experience of working with one another. And the more times that you can bring people in the election coalition together in person, the better it will be for that.
And then on top of that, making sure that people kind of have the same skills and resources available to them.
Developing clear coalition policies is key. There are actually two models for election coalitions that I have seen so far; we call them the collaborative approach and the independent approach. In the collaborative approach, the organizations actually share resources to do the media monitoring together, check together, and act together. It becomes one mega news room, and they publish the final outputs across their multiple media organizations as well. That's the collaborative approach.
We see the independent approach more often when people don't yet have the trust to jump in together. In the independent approach, there's no commitment to share the output, so often people will share the media monitoring side of things, but then do the check or the article through their own independent editorial processes.
And then that's kind of shared across news rooms or a platform and organizations can choose whether to share it or not. There's no commitment to share it.
But still there's a lot of value there in understanding what the shared narratives are that are happening in that country and actually what are the gaps that still need to be filled that haven't been filled across the ecosystem.
And then other things, like figuring out the branding of the coalition, the code of ethics and standards, and correction policies, are incredibly important when many news rooms come together.
Next slide, please. So some of the key things that came out were about preparation, starting early and planning for scale is incredibly important. Within an election coalition you can't start too early.
There are a lot of things to do, and the sooner you build trust the better. Diversity is a key part; we've mentioned that scale and that breadth of partners is very important. But often you have a layer of journalists, and then that intersects with the community as well.
And so in some places, you actually get media and Civil Society organizations taking part as well. And that's an opportunity to go even broader and more diverse in trying to get the stories out there.
And finally, context: actually understanding how the context in your own country might be changing. In some cases, for example, there might be a growth in AI misinformation; do you have tools across your election coalition to actually be able to combat that?
Next, please.
I'm going to quickly touch on two case studies. One of those is CrossCheck in France in 2017. It brought together around 30 organizations and was led by AFP, who took on the editorial leadership.
They had 37 partners and across the videos that they all shared, they had 1.2 million video views in total and published hundreds of articles between them.
Greg, the chief editor of the program at AFP, said: "This is for us one of the biggest wins in AFP history. CrossCheck will always be special to me personally. Sometimes I meet colleagues who took part in this project and they say, do you remember CrossCheck? That was so great."
I think that's a really key part is that trust that it builds across journalists is really important. And it lives beyond the election coalition too.
Next slide, please. Then we have FactsFirstPH, which is one of my favourites, sorry, everyone else. They had 131 partners working together and published 1,400 fact checks.
And Gemma Mendoza, the head of research, said that the thing with these is that they are experiments: "I wouldn't say FactsFirstPH was perfect. At the time we were experimenting, and the reason we wanted to experiment was because there was a huge challenge."
And it's true, there is a huge challenge. Even when we look at these numbers, 131 partners and 1,400 fact checks, it might not feel big enough to meet the scale. But one of the important things to remember is that with misinformation, there are often just a handful of narratives that are the most widely seen and that cause the most harm.
And actually, if we focus efforts on those narratives and those pieces that are seen the most or are most harmful, we can go quite a long way toward interrupting the flow of misinformation in each country.
Next slide, please.
I think one of the things that they did really well in the Philippines with FactsFirstPH was they introduced something called the mesh. They had all of these authoritative information sources in blue: journalists, expert institutions, fact checkers. These were the ones actually producing the research.
And then in red, the mesh, they had over a hundred organizations that were separate from the information providers. These were influencers, NGOs, communities, trusted people in their communities. And they would then go out and share the outputs of the election coalition more broadly.
And I thought that was a really amazing model and the kind of impact that that had in the Philippines in terms of building trust and showing people that there was an answer to misinformation, was actually very, very powerful.
And then more broadly on top of that, there was also research, taking all the learnings from that, and then finally accountability and change. There are some actors around the world who take the authoritative information that's being produced to combat harms and actually use it as evidence to hold people accountable, for example in international criminal courts or in legal cases. And I think that's also a really important part of combating misinformation. It's about finding that systemic change.
I'm going to leave it there. I appreciate I've been talking for a long time, but I'm pleased to share this with you all today. If you want to learn more about all of the case studies and go into depth on any of this, there's the Election Coalitions Playbook that we've published alongside Anchor Change, and there's actually a podcast as well, with Claire and with Daniel, who's here today, where you can get a summary of everything.
So please download it, enjoy it, use it, and if you ever make an election coalition, get in touch.
Thank you very much.
>> JIM PRENDERGAST: Great. Thank you very much, Mevan. I think I know the answer to this question before I ask it, but I've noticed that at least some people in the room are taking photos of the slides. Are you willing to share them with anyone who would like them?
>> MEVAN BABAKAR: That's fine.
>> JIM PRENDERGAST: That's what I thought. Perfect. We've had a couple of examples, case studies of country election coalitions. I'm going to now ask David to share his experience with election coalitions in Africa.
Good morning, David. Are you there?
>> DAVID AJIKOBI: Hi, everybody, can you hear me and see me?
>> JIM PRENDERGAST: We can hear you and see you.
>> DAVID AJIKOBI: I think Mevan set up the conversation well, so I just have a few things to add. Largely for us at Africa Check, we constantly lead with the independence of [?]. We have worked with election coalitions to help other countries set up coalitions around elections. It can be very problematic, you know, particularly on a continent like Africa, where historically media are often owned by [?] or politicians, or owned by government.
But what we've been able to do, essentially, is to say: look, we will bring everybody to the table, we will have a [?] to say we want our elections to [?]; we don't want disinformation in Africa's elections.
First, going back to some successes, I'll give you an example from Ghana, where we were able to foster an election coalition group with other partners. What we essentially saw was that people would otherwise do things in their own different corners, right?
But by coming together we were able to sort of form [?], and we saw how that panned out in the last elections in Ghana, which brought about the election of the new president of the country.
And so much so that the collaboration also helped with inclusivity; it was an inclusive collaboration. For example, in Ghana there were situation rooms in the south and situation rooms in the north.
What that did was allow us to map out not only the [?] but also the patterns we were seeing from region to region. I can say the same about the Senegal elections that brought about [?], where we had an election coalition, and we had the same thing in South Africa. If we follow what happened in that election, for the first time we had a government [?], and we saw how disinformation played out in all of that. And I'll give you my own context in Nigeria, where we had elections in 2023, in a year when Africa had more than 15 elections.
Having an election coalition was very [?]. We had already had elections where we did coalitions to say: we know that the coalition of this election was [?]. Around the election they would have things like [?], so we can prebunk that. We were able to use the coalitions to introduce AI tools, new tools, and capacity building.
For example, a tool developed in collaboration with Africa Check was provided for free to the coalition members in Ghana, Nigeria, and practically all the countries that had election coalitions in Africa. And that's very, very big, because individually those organizations might not be able to access it. But with the coalitions we were able to provide it for free, in collaboration with [?].
So for us, it wasn't just about the election, it was the opportunity to collaborate at a larger scale.
And I'll give an example. In my context in Africa, radio plays a very important role, so you cannot talk about election coalitions without talking about the impact of radio. If you look at the structure that was presented, where you have the collaboration, what we did was that apart from the fact checkers and CSOs, we partnered with the people doing the election work on radio. Through our contacts, we reached a lot more people, many of them in news deserts or underserved communities, through what we call media inclusion.
So for us, that was very, very important. And we think that moving on, with elections being a common [?], it's an opportunity to actually connect.
Thank you very much.
>> JIM PRENDERGAST: David, thank you very much. Turn your camera back on for a second because I want everybody to note that David gets bonus points for color coordinating his outfit with the theme color of the IGF. I want to thank you for calling out the importance of radio.
I think so many people are focused on the next technology and the future of technology and where these problems are happening, that sometimes including myself, we forget about what's already there or how different environments consume news. So your comments about radio are certainly hitting home with me and I'm sure with others.
So Daniel, you've got some experience with this in Brazil and I believe we got you up in the middle of the night to share those with us. So why don't you tell us how it went for you and some lessons learned there.
>> DANIEL BRAMATTI: Well, thanks for having me. I am the editor of the fact checking unit of a newspaper, and also a member of the Advisory Board of the IFCN, the International Fact Checking Network.
I'm going to talk about the largest, most successful, and most durable collaborative journalism project in the history of the Brazilian press, which of course is Comprova. In 2018, Abraji, the Brazilian investigative journalism association, was invited to organize and coordinate a coalition of 24 media outlets to combat misinformation and disinformation in that year's presidential elections.
I was its president at the time. The invitation came from the researcher Claire Wardle, head of First Draft and author of the famous 2017 report Information Disorder. Google was one of the sponsors of the project.
I have to say that at first, not all media outlets showed enthusiasm for the project. Of course the news market is very competitive in Brazil and there was no culture of collaboration between different companies here. But gradually, resistance was broken down, mainly because there was great concern about the impact of disinformation campaigns during the presidential race.
Everyone knew that the challenge of containing this information was too great to be faced in isolation. And all decisions related to the project were made through consensus building without imposing directions or rules.
Even the name of the project, Comprova, was chosen by the participants themselves. In Portuguese, comprovar means to verify or to check, and it also sounds like com prova, which means "with proof." There's a word play there.
An important decision we made was to limit our verification work to content generated by social media users. We didn't check the candidates' speeches or statements. As one of the candidates lied a lot more than the others, it was probable that he would be the one most often contradicted, so many media outlets were concerned about conveying the idea that they were against this candidate or wanted to benefit his opponents.
The vast majority of the media outlets invited to take part in Comprova did not have fact checking units in their news rooms, so dozens of journalists had to be trained using the methodology provided by First Draft. These professionals were from TV stations, radio stations, newspapers, magazines, and digital native media, organizations of different sizes that reached different audiences in different parts of Brazil.
In essence, Comprova put journalists from different companies to work together to debunk misleading content, and the final result was only published after a cross-checking process, meaning that at least three media outlets not involved in the original fact check had to give their approval to the work done by their colleagues.
In addition to working together, another important aspect was the amplifying power of the media outlets involved. The fact checks were almost always published by all 24 participants in the project.
So after a first face-to-face meeting in May 2018, the project was officially launched in June during the Congress of [?], and in August we started publishing our first fact checks.
The election campaign which ended in October confirmed our worst fears. There was a huge circulation of misleading content and this content generated enormous engagement with the public that wasn't prepared to deal with the problem.
We had a lot of work, but also a lot of enthusiasm. All the work was done remotely, and I'm talking about two years before the pandemic here. To coordinate our activities, we used a WhatsApp group. The amount of messages exchanged in this group was immense: in six months, around 50 journalists exchanged more than 18,000 messages. I did a word count on these messages and found that more than 350,000 words were written. For comparison, that's more text than any book of the Harry Potter saga.
So we learned some lessons. Number 1, a shared purpose motivates journalists much more than competition.
Number 2, collaboration works best if there is also central coordination.
Number 3, the role of the central coordinator, the project editor, is not to give orders like a boss, but to act as a diplomat who seeks to build consensus and break down resistance when needed.
And we learned fundamentally that fact checking is hard, very hard. Sometimes it took us days to get information we needed to disprove a piece of content that clearly had been created in minutes. We managed to publish around 12 fact checks per week, or 147 in total.
Organizers and participants were very satisfied with the experience and as a result, it did not end in 2018 as originally planned. The consortium remains active to this day with the mission of fact checking rumors related to public policies, health, climate change, and other topics.
We also worked together during the pandemic fact checking false rumors about vaccines and the virus. And the electoral campaigns of 2020, '22, and '24. The number of participants grew to more than 40.
Our work in 2022 was especially important because in that year, there was a wave of attacks on the integrity of the Brazilian electoral process. There was a lot of content citing false vulnerabilities in the electronic voting machines, and suggesting that there was fraud to benefit one of the candidates.
We didn't know it at the time, but many of these rumors were created and spread by state actors, agents from the Brazilian government, with the aim of destabilizing our democracy.
Recent investigations by the Brazilian federal police have revealed that we almost suffered a coup that year and that disinformation campaigns were part of the plan.
We still have democracy in Brazil, and I don't want to exaggerate our role, but I think I can say without fear of being wrong that journalism contributed to this result.
Thank you very much.
>> JIM PRENDERGAST: Great, Daniel, thank you.
So unlike many of the sessions you may have been to at the IGF to date, we have this room, both the physical room and the Zoom room, until 11:00 a.m. That's 45 minutes we have set aside for comments, questions, and discussion. As I told you at the outset, we want this to be as interactive and engaging as possible. I see we have a question already online.
While I get my act together on that, I'm going to throw one out to the group to let folks in the room think about it. One of the things that struck me -- and Dan, you talked about this -- is pivoting from an election to other things, like the pandemic, where you're doing fact checking.
I guess for everybody, you know, how do you keep the momentum going?
You know, I'm biased, I just came out of a national and Congressional election in the U.S. where we were bombarded nonstop with election ads and all sorts of stuff. And frankly, we're tired. I can't imagine how journalists feel coming off of a cycle like that. How do you keep the momentum going from elections and other issues, either state or local or other events like a pandemic?
So whoever wants to take that first, go ahead.
>> DANIEL BRAMATTI: I can go first. In our case, the media participants were the same, but not the journalists. We rotate the team, so that more people can get together and learn from the others.
So basically we have a fresh team working together every year, so this is not a problem for us.
>> JIM PRENDERGAST: Mevan or David, any comments on that one? Go ahead, Mevan.
>> MEVAN BABAKAR: Sure. I think, in my experience of being in election coalitions myself before Google, and of being a journalist for a long time, it's really important to look after yourself in those situations and to look after the team. And it's important to also step away when it becomes too much.
I think that actually the emotional burden that a lot of people take on in these situations is quite high. Especially, you know, when we talk about elections, that's one experience. But a lot of people are fact checking during conflict, in war zones, or doing work where you end up seeing things that are quite harmful yourself.
And I think the well-being of the team and the people is actually the thing that must be preserved and looked after beyond anything else.
I wanted to recommend a handbook that was written about vicarious trauma and how to look after people in a newsroom specifically. And I'm going to put a link to it in the chat.
But it has really great recommendations for how to look after journalists, newsrooms, and campaigners so they don't experience vicarious trauma through the work that they do. I think it's a really great resource for answering that question.
>> JIM PRENDERGAST: Thank you. David, I saw you flash your camera on. You want to weigh in?
>> DAVID AJIKOBI: Yeah. So first, we had a very interesting case in 2019. In 2019, we had an election coalition, and it was [?]. And the thinking behind it was also to pay some of the partners. What happened is that they expected to be paid, and I can tell you that when the money dried up, people left the conversation.
Only Africa Check [?]
But what we did with the 2023 elections in Nigeria was bring together people who understood the role of the media in a democracy like Nigeria. For example, we had decades of [?], so having elections done properly, and the outcome, in a country like this is very important.
And you know, the media are going to clean it up. And I'll give you an example: by doing so, we had partners who were coming to us, and in the 2023 election there was no single [?]. What we did was build our collaboration work into the coalition work.
And to speak to that point, Nigeria has state elections. So we had state elections in four or five states in 2023, and we came together again to sit in the coalition room; we set up one in [?], we set up one in Abuja, for example. But we're seeing a lot of funding and support coming from other funders and coalitions, you know.
But we think that where we have journalists and fact checkers and media partners who have a common understanding of the role of the media in democracies, it wouldn't be a problem. It's not been a problem in Africa, and some of these election coalitions are [?]. I think that's one of the key things [?].
>> JIM PRENDERGAST: Thanks a lot, David.
So we do have an online question, which I'll read out. It's from Hazel: any thoughts on children as fact checkers, or as part of disinformation campaigns, or as the target audience of these campaigns? Who would like to take that?
>> MEVAN BABAKAR: I can jump in really quickly. I haven't seen any young people being included in an election coalition specifically. But maybe David and Daniel will know more than I do on this.
But I do know that there have been media literacy efforts that include young people, for sure. And one that comes to mind is the Teen Fact-Checking Network that MediaWise runs out of Poynter in the U.S. They would actually go and work with teenagers and teach them what it even looks like to fact check. What is a fact check? How can you go out and check something that you see on social media?
And a while ago now, Chequeado in Argentina used to run a really big schools network as well of fact checking and fact checkers, and they had a series of videos that actually had nothing to do with politics. A lot of it was: you have seen a post on the Internet and it's about your friend, and someone is pretending that your friend has done something that they haven't. And they set it up as a kind of series where you were a detective trying to figure out what could have happened and what you could do -- you reverse image search, and run a couple of searches that would help you get contextual information.
And it really helped the young people who took part, not just to learn those skills, but also to ask the right questions. And I think that's a really important part of it. It's not necessarily about learning fact checking through the lens of politics; it's actually just being critical in your day to day when you see something.
And I think that questioning part of it is the most important. I'll link those two projects as well in the chat so you can see them.
>> JIM PRENDERGAST: Thanks. David, did you want to add something?
>> DAVID AJIKOBI: Yes. I want to add quickly that specifically for election coalition work, we involved Nigerian campus journalists -- that is, students who are based on campuses, pretty much like press clubs. We invited them to [?] for a week or two to see how we were putting everything together and how the fact checking process works, also in that context.
And then there was the fact that we had student volunteers who would come to us and say, can we join you guys?
So that was very important. But beyond the coalition work, we in Africa, for example, were also doing the Finnish model, where we were trying to reach [?] who were below a certain age. We had a project with the agency where we went to schools to actually teach them how to fact check, with simple exercises, and we incorporated games -- because with young people, you have to keep their attention with games and things like that. The feedback has been fantastic, particularly because these school learners will turn 18 and will be [?], and we thought that giving them skills now will help them access or navigate the next election coming up [?]. Thank you.
>> JIM PRENDERGAST: Thank you.
We've got a couple questions in the room. So I'm going to pass this microphone to the woman to my -- hopefully it's still working.
>> Yeah, hi. My name is Lina, I'm with Search for Common Ground and the Council on Tech and Social Cohesion. I just want to congratulate all of you, because this is exemplary work. It also aligns with what Google has signed on to, the election guidelines for tech companies that I worked on with you and many other partners, where there's hopefully momentum to try to put these kinds of things into practice.
So I just want to really acknowledge that it's hard work, good work.
But I have four questions that I want to raise, and I'm very curious to hear what you think.
The first is that there is a lot of evidence about Google's ad policies monetizing mis- and disinformation. While the fact checking work is critical, you actually have an upstream driver of misinformation that doesn't seem to be discussed in these kinds of conversations.
Secondly, we see how generative AI is very quickly taking over search, and that includes your own AI summaries and all of the competition between the AI companies, which could seriously disrupt the things that Google has done so well in terms of upranking higher quality information over the years. That's been a big point of credibility, but it risks disintegrating.
Number three, we had 60 to 80 elections, and we work in places that are struggling, that are conflict affected, and these kinds of coalitions don't happen in places like Chad. They don't happen in other places where the distrust between Civil Society and government is so incredibly deep that it's difficult.
So the question is, how does Google act when, in fact, there isn't a coalition? Do you try and take the initiative?
And the last question is similar, in the sense that I believe Google was part of the prebunking effort in Europe to try to tackle misinformation through prebunking. It was an initiative you took, but you haven't taken that initiative in all the other places, notably in the Global South, where we have similar issues. And sometimes the consequences of misinformation in these conflict affected societies are deadly.
>> JIM PRENDERGAST: Thank you very much. There's a lot to digest there. Mevan, do you want to take the first shot at some of it, what you can? I guess one of the things I would ask -- and it was one of the questions I actually had for David, because he used the term first -- is about prebunking. Up until this week, I had never heard that word before. So explain that as part of your answer.
>> MEVAN BABAKAR: Sure. Let me first say I think those are all very important questions. And I'm grateful that, you know, they have a forum to be asked.
Prebunking is for when there is a narrative that is trending in a country, or a series of claims that add up to a narrative, that might be seen at the top of a news outlet, et cetera. And instead of dealing with it after it's been published and after it's actually, you know, trending and viral, prebunking deals with it beforehand.
So for example, in the UK I know, based on my years of fact checking, that every single election there is going to be a claim that comes up around Day 1 of voting, or the day before voting, that will say: if you use a pencil to mark your X on your piece of paper, then your vote is invalid. That claim comes up every single election.
It's not true. But it comes up, and it's sometimes used to disenfranchise people.
Another claim that comes up is something that will say: if you're voting for this party, you vote on this day; if you're voting for this other party, you vote on that day.
And it might feel innocuous, but these are things that we know are going to come up. Sometimes they can cause harm. And they might disenfranchise specific populations.
So instead of dealing with that piece of misinformation after it's actually gone viral, a prebunk will warn people that this is going to happen, maybe weeks or months in advance. You know, it will say: one of the tactics we've seen is these kinds of claims being used to disenfranchise people. Or we'll teach people about straw man arguments, or the kinds of tactics and manipulations that take place, so that people are sort of inoculated or vaccinated against the misinformation when they see it.
So that's a prebunk.
I think a really important part of a prebunk is that it's not Google trying to push this out there. It's actually the community organizations that have the relationships. And I think this kind of goes to some of the -- the questions.
In a lot of these cases, I think it's really important, and this is why we do the work on election coalitions, that it's not just one organization pushing out a narrative, it's actually communities identifying misinformation that affects them, and those same communities being empowered to combat that misinformation themselves.
It's one thing for somebody in that community to fix it, it's another thing for an external party to come in and say this is how it should be.
And we both know which one's going to engender more trust. That's a really important part of this puzzle.
In the case of prebunking, it's still a relatively new effort, and it's one that's led by Jigsaw. And actually, at the beginning of December, they graduated prebunking into the real world and handed it over to a series of community organizations.
And the idea is that those community organizations will be the ones that kind of further it and grow it.
And that includes people from across the globe. So it shouldn't be just an EU-centric effort; it should be something that exists around the world. But it is resource intensive and it requires infrastructure. And I think part of this election coalitions work is building the infrastructure for things like prebunking to jump off of. Because having that layer of community organizations and journalists working together is the scaffolding that we need for things like prebunking to actually take effect.
Your other question was about how Google acts when there isn't an election coalition, or where there isn't that kind of infrastructure already in place -- like you mentioned Chad, for example.
I think that's a really important question. And part of it is understanding what the prerequisites for an election coalition are.
In some cases, yes, it does require community organizations to already exist. It requires services, it requires a certain amount of media organizations to be present. And I think in the cases when those things aren't present, we have societal conversations, we have societal challenges that we need to tackle.
And I think that's not something that Google should do in isolation, it's something we all need to do talking together and Google plays a part in it, but figuring out how to create those structures in a completely different environment.
And then finally, on gen AI and ads. On gen AI, I think it's a really important question, and a lot of my work at Google these days is about building tools to support fact checkers working with gen AI.
I think it's important to say that a lot of the fact checkers are excited about using gen AI and AI tools, and I think that that's sometimes missed as a part of this conversation. The scale of the misinformation that already exists is quite high, and I think we're all aware that manual efforts alone are not enough to fix it. So there's an opportunity there to use AI to actually help the battle.
And that's not to replace anybody, but actually just to support the efforts. And David already mentioned the Full Fact AI tools that are used by over 50 organizations around the world now. Those help fact checkers spot repeat instances of misinformation and do some preliminary checking; they don't take the fact checker or journalist out of the equation, but they supercharge them to do more and more at scale.
I think that's really important. And finally on gen AI: the EU election coalition that I mentioned, Elections24Check -- we funded them to do research with the 3,000 fact checks they did this time around, to tag the ones that involved gen AI.
Actually, the number of instances of gen AI causing harm in an election cycle was surprisingly low. I'm not saying that it won't cause harm; obviously it has the potential to. That's an obvious thing. But I also think it's interesting to consider that at this moment in time it's not doing that.
So how can we actually find instances where prebunking might help with gen AI? In Taiwan, for example, one of the ministers would put out deepfakes of themselves ahead of an election to inoculate the population against them.
And I think that's a really interesting case study. But having said that, the potential harms of gen AI are still quite high, and there's a lot of effort at the moment across Google to combat potential harm from AI-generated election information especially.
So there's something called SynthID, which is watermarking, where we add a signature so that we would be able to flag content as AI generated. We're also part of the C2PA coalition, which is an industry standard for assessing where information has come from -- the provenance of information. And those are being added into our tools right now.
So if you see an image, you'd be able to say: this is where it came from. And beyond provenance, we're also working on a series of tools that are about giving people more context. So when you see a piece of misinformation, or you see that something is AI generated, you can also go to things like About This Image or About This Page on Search, which tell you how old an image is, where it has come from, who first published it, and whether there are any fact checks about it.
And so encouraging people to do that kind of reading around something is a really important part of this. So that's the user intervention side of it.
And then finally, there are a whole host of policies that remove thousands and thousands of ads every single day. And whether or not those thresholds are in exactly the right places, that's a conversation that's constantly being had. And it changes in each country, and it changes with different laws and regulations.
I think it's an important challenge, but what I'd like to leave you all with here is that this isn't a thing where there's just one answer. It's different in every single country, it's different at every single threshold. The context keeps changing, the tools keep changing. And actually it's something where the seesaw of it is hopefully going in the right direction. But we do keep seesawing, if that makes sense. I've spoken a lot, but I'll leave it there.
>> JIM PRENDERGAST: You're entitled to another sip of tea there. We do have a couple questions in the room, please identify yourself and your affiliation. I'll turn it over to Milton.
>> Milton: I'm Milton, I'm at Georgia Tech. I want to begin by challenging the term misinformation. I'm in a sort of Computer Science, algorithmically driven university, and the term tends to encourage the idea that misinformation is something that has a signature you can just recognise and somehow kick out of the bit stream.
And I think the Google speaker was very perceptive in pointing out that it's really narratives, it's interpretations. And I don't know why we don't just say false or misleading information, because that makes it clearer that when you interfere in these discourses, you are essentially setting yourself up as an arbiter of truth. Now, I love the idea of coalitions of journalists coalescing to do fact checking, because that's fully in line with the liberal democratic idea of the role of the press in a free society. Right?
You are not forcing anybody to do anything; you are simply responding to bad speech with correct speech, or good speech.
But there's an elephant in the room here that I don't see addressed and that I want to ask you about. I'm sure the Google people are very aware of this. There's a high degree of concentration of communication and discourse around platforms, and as a result, the stakes in contestation over what those platforms suppress and what they promote are raised very high.
In particular, when governments get involved in trying to influence those decisions, you get problems. You also get problems with perceptions of bias from the platforms, which are well known to be situated in liberal California -- and Silicon Valley is not exactly red state territory.
And perhaps the Hunter Biden laptop story is a perfect example, where you think you're suppressing misinformation but you're actually responding to political pressure from people who think that a certain amount of information -- which might actually be true -- is going to harm the chances of their favorite candidate in the election.
So I'm concerned about how you set the threshold for where you actually intervene in these false or misleading narratives -- I don't want to use the word misinformation. And I'm particularly concerned about how you handle the role of government.
We have a series of court cases -- we have the case that went all the way to the Supreme Court. We have state legislation in Florida which is trying to regulate the way you make these decisions and impose common carrier obligations.
So this issue is really -- I mean, it's great to have these journalistic coalitions, but legally and economically this issue is a lot more fraught than you've made it out to be here. And I'd like to know how you handle those situations.
Particularly, again, when, you know, the government is an interested actor in the outcome of an election, obviously, right?
So what happens when you get pressure from governments to suppress information that may be damaging to them or that may be an extension of their policy?
>> I didn't know if you wanted to pass it to somebody else or take one at a time.
>> JIM PRENDERGAST: Sorry, we'll take a couple of questions from the room and then we can sort of bounce them around.
>> That's a good start. Good morning, everybody. I'm from the University of Amsterdam and also on the Executive Board of the European Digital Media Observatory.
I have three quick questions. I really appreciate the eye for the local context in which these coalitions are built. But I wonder if you could speak more as to how you choose partners who are in or outside of these coalitions, since there are so many new and relatively unknown actors when building these coalitions in different election contexts.
So that's sort of how you build these partnerships, inside and outside of the scope.
The second question would be: what, in your Google playbook, is the best advice that you give the coalitions in dealing with critiques they might get that they're trying either to stifle free speech or to intervene in the elections -- which is a common critique coming from different vantage points and different kinds of elections.
And then the third question would be: how is Google trying to be proactive in building a coalition that would also have multiple big tech platforms at the table, so that you would see a coalition driven by industry interest more broadly rather than by one or two individual companies? So those would be my three questions. Thank you.
>> JIM PRENDERGAST: Mevan, I hope you refreshed your palate because a lot of them are directed to you.
>> MEVAN BABAKAR: I'm actually out of tea, unfortunately. I feel like I need to top up. I want to make it clear, based on that last question, that these election coalitions are not Google-run election coalitions. These are communities of journalists and fact checkers and social organizations that have come together, created the coalition, and then gone looking for funding.
And Google just happens to be one of the people that funded it.
And so I think that's a really important part of this. These are interested people in their own countries coming together to build something that serves the voters of those countries. And the only thing really that Google has done is supported them, either with resources or funding, and done this research project to collect some of the learnings from them, so that if another group of organizations comes to us with an election coalition, we can say: hey, here are the lessons of the other coalitions that have come before you, and you can learn from them.
And so I really want to just make that very clear. So we don't choose who are the partners that are in the election coalition. It's not us picking and choosing, it's actually the organizations themselves coming up with their own ecosystem, their own collaboration, their own policies, their own membership models, their own capacity building programs.
And I think that's a really important part of that.
You had a question about how they deal with the challenge of being told they censor free speech. I'm going to take off my Google hat for one second and put on my fact checker hat. When I worked at Full Fact, the UK fact checking charity, we used to get a lot of "fact checks are censoring speech" kinds of conversations. And our response at the time -- and it still is -- was: fact checking is the free speech response to misinformation. We're not taking anything down as fact checkers; we're adding more context. We're giving you more information so that you can make up your own mind.
And I think that's how we used to deal with it as fact checkers. But Daniel and David can probably give you a much better answer for where things are these days.
>> JIM PRENDERGAST: Milton, I'll ask, we'll get a response from David and Daniel first and then we'll cycle back to you.
>> DANIEL BRAMATTI: I think that Mevan's response is perfect. I really contest this idea that fact checking is censorship. You have to have content moderation on platforms. You have content moderation regarding violence, regarding pornography, regarding other things, and you also have to have content moderation regarding the flow of bad information -- information that contributes to polluting our media ecosystem, our information ecosystem.
The other thing -- there were so many questions. I just want to mention briefly how we decide who enters the coalition, in the beginning and after.
At the beginning, our goal was to reach a large percentage of the Brazilian public. So we invited to the table all the big players in the media here. We also tried to balance different media organizations according to their editorial orientation, more to the left and more to the right, and also to include local players.
So it was a very diverse group, in my opinion. And since then, since we decided to keep it going, all the new participants have asked to enter the coalition. We decide collaboratively, and everybody has veto power over whether we give the okay or not to that applicant.
And to this day, to my knowledge, we have never closed the door to anybody.
>> JIM PRENDERGAST: So coming back to you, Mevan, on Milton's question about thresholds and how you determine when you take action and when you don't -- ah, one of you wants to take it, okay.
>> ALEX WALDEN: I can jump in quickly on the challenges of how we engage with government and with government pressure, and how that does not impact us, and then I'll kick it over to you, Mevan, to talk more about the definition of misinformation and the challenges around that. Although I will say on that piece -- having worked on this from the beginning, when fake news was the term and then we all decided fake news was not the right term -- there are so many conversations happening all over about mis-, dis-, and malinformation. I think as a company we have to land on something and figure out how to operationalize it while we're managing these harms. And we will also continue to be part of conversations around what's the appropriate lexicon for describing what is just sort of abuse or exploitation of the services we're providing.
But when it comes to sort of the challenges of government, on the one hand, obviously, we are deeply committed to partnering with governments across a lot of the work that we do, and increasingly so that's the case.
But also, you know, I agree with you that governments and parties are interested parties in the outcomes of elections. And so we have to be mindful of the role that we have in engaging neutrally. That gets back to the importance of us having clear and robust policies in place to make sure that we're consistently addressing any of these issues as they come to us.
On the one hand, that's about having clear product policies. How do we define election misinformation or misrepresentation or the variety of other things that might come up? How do we make sure that's clear?
And then we really do have to enforce that consistently across all of our policies in every country.
And being clear that for legal reasons we might need to remove something under a national law. It's perfectly legitimate for any government to say: this content violates our local law, here's an order, and you need to remove it. In that case, we would evaluate it under our standards. And then if, under our analysis, it's consistent with the local law and we've received the appropriate process from the proper authority, we may remove it -- and that would be something that we put in our transparency report, making clear that we've complied with the national requirement.
So those are the things we rely on everywhere we're operating, even though it's true we'll get pressure from governments. We have to have that to fall back on.
>> JIM PRENDERGAST: Thank you, Alex. Mevan.
>> MEVAN BABAKAR: I think that was a great summary. The only thing I would add on terms is that, like Alex, I've seen a lot of work done on the different terms in this space, and I think that sometimes different terms are helpful for different types of things. But the way that I like to think about it personally, which makes it very real and reminds me and others of the importance of this work, is through the lens of harms. There is some really excellent work being done at the moment by a professor called Peter Jones at Westminster University to develop a harms framework, and the European fact checking standards network -- well, a couple of organizations in Europe -- are looking to use that in a meaningful way. It's one thing to say there's misinformation; it's another to say there's been 5% interference in this context, or vaccine misinformation at that level of harm. That's the granularity we need to get to, and at that point we bypass some of the issues of the words and get to those claims and those narratives and those harms.
And it's only at that level of detail that you start to understand what interventions are meaningful and who should take them. Should that intervention come from the government? Should it come from a platform? Should it come from the community or the people affected?
Because I think actually we need interventions at all of those levels.
Thanks.
>> JIM PRENDERGAST: Alex, did you have anything? I've just been told that we have five minutes to wrap up. So what I'm going to do is ask each of our panelists: if you have one piece of advice that you could offer people who might be interested in starting their own election coalition, what would that piece of advice be? And then, as a call to action coming out of today, what would you like to see happen? We'll start with David, please.
>> DAVID AJIKOBI: (Muffled audio).
>> JIM PRENDERGAST: Thank you, David. Daniel, turn to you.
>> DANIEL BRAMATTI: Yes, sorry, my camera was off.
My advice is: choose wisely the organization that is going to coordinate your coalition. As Mevan said, it has to be an organization that is, if not neutral, as neutral as possible in terms of political stance -- independent from parties and government, and from private sector pressures.
I know that sometimes it's difficult to find an organization with those characteristics, but it is essential to gain trust and to lead the work.
>> JIM PRENDERGAST: Great. Thank you, Daniel. Mevan.
>> MEVAN BABAKAR: I'd say that relationships are everything in election coalitions -- similar to Daniel's point. It's really important not to underestimate the amount of work it will take to build those relationships across those media organizations and across those people. I think that when you get a group of people like that coming together who trust each other, that's when something special can happen. But that takes time.
And I'd also add, because relationships take time to build, maybe it's not the first election that's the best one; maybe it's the second one or the third one. I think the one in Brazil is a good example of that. And as those opportunities grow, so does the scale of those coalitions. Thank you.
>> JIM PRENDERGAST: Great. Thank you. Finally here in the room, Alex? No, okay.
I want to thank everybody. It was a great presentation, very good questions -- very pointed questions, frankly -- and good discussion. For those in the room who want a copy of the slides, come see me; I've sent them to somebody online who wanted them. I do want to thank our panelists, who all got up at varying hours of the night to join us, and Alex for joining us in person. Thanks, everybody, for joining us online, and those here in the room. Mevan's been dropping various links in the chat, so when the transcript is posted on the website, go back -- that information is there waiting for you.
Thank you, everyone, and enjoy the rest of your day.