The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.
***
>> MODERATOR: Hello, everyone. My name is Martin Samaan. I'm with the United Nations Department of Global Communications.
I'm here with Isabelle Lois, and joining us online is Julia Haas from the Office of the OSCE Representative on Freedom of the Media. We are at the Internet Governance Forum. This panel will be on Leveraging Technology for Healthy Online Information Spaces. And I will let the panelists introduce themselves.
>> ISABELLE LOIS: Hello, I'm Isabelle Lois. I'm a Senior Policy Officer at the Swiss Federal Office of Communications. We work on freedom of information online and these topics.
>> AWS AL-SAADI: Hi, I'm Aws Al-Saadi. I'm the founder of Tech4Peace, an organisation working on fact-checking and digital security. I usually work in the region. Thank you.
>> NIGHAT DAD: Hi, everyone. I'm Nighat Dad. I'm the founder and executive director of the Digital Rights Foundation. I also sit on the Oversight Board, looking at the content decisions of the company and holding it accountable.
>> MODERATOR: Thank you very much, and welcome to the people online and those who have joined us here in the room. Today we will hear about the main challenges of big tech and the information landscape, and we will identify how different stakeholders are addressing the many challenges that arise with quickly evolving technology.
The Internet Governance Forum has tackled the digital opportunities and challenges that face our world. And as the largest multistakeholder gathering, it brings together civil society and the tech community so we can talk together to shape a more inclusive and effective future. It's great to have you here at the Internet Governance Forum. And now I will hand over to Julia Haas online. She is also part of the OSCE team.
>> REMOTE MODERATOR: It's great to join you online. Thank you, Martin. It's great to be with you. The multistakeholder approach is reflected on this panel. It's great to see we have people from civil society, from international organisations, from the state, and from the Oversight Board, a particular body that stands in between.
But all fulfill a very important role. So I truly believe this will be a very important conversation, even if many people will be at lunch. In any case, I'm very excited to see you all here. And also to hear about this UN context and the importance of IGF, which of course the OSCE would agree that the challenges we are talking about can only be addressed in a multistakeholder manner.
So we briefly heard the title of this session, but to give a little more context: our intention with the session was really to discuss both the impact of big tech on the information landscape and how states can limit any undue power or control we see over the information space, or the power concentration. But then also, on the other side, what can be done in contexts where maybe states would not be so eager to intervene, or where maybe we don't have very strong democratic institutions or rule of law standards. So how can we find a way forward to address the challenges, and ideally, in the last step, see whether and how technology can be leveraged for a healthy online information space, which is the overarching theme of today's session. And that is also the name of a project we are currently working on at the OSCE.
In the framework of this project, we are currently developing recommendations for states on media and big tech. So within -- I mean, as you, Martin, already rightly referred to in the title of our organisation, the Representative on Freedom of the Media.
The media has a particular role to play, which we also acknowledge in the information landscape. It is about freedom of expression and privacy, but journalism has a particular role to play. So we are currently also exploring how media and big tech are interacting and intersecting, what the particular challenges for journalists are, and how to address them.
And in this context, we are particularly looking at media sustainability, and at the accessibility and availability of public interest information. So also questions of visibility on platforms, these kinds of things, and protections.
And I'm just mentioning this not for self-promotion but to indicate the conversation we will be having now is really feeding into this process. This is a multi-month process and we are currently holding several round tables and discussions including the conversation we are having today and we really want to build on all of your knowledge and expertise and experience to feed into the guidance we will be developing for states before summer next year.
So having said that, I'm really very much looking forward to hearing from all of you and learning from you. And seeing how we can empower the information space, and also the actors that are particularly important for the information space, such as journalists in that case.
But maybe before we dive into specifics or into the media question already, it would be very useful, I think, for all of us to get a little bit of scene setting. And if I may ask you, Nighat: you not only have great expertise from civil society, from the organisation you founded and are leading, but also have a lot of insights from the Oversight Board.
What are the key concerns and what are the challenges? Can you maybe, at the outset of this session, tell us what you would consider, from your expertise and your experiences, to be the main challenges of big tech power, so to speak, over the information spaces. Thank you.
>> NIGHAT DAD: Thank you very much, Julia, for setting the scene. I am wearing another hat on Meta's Oversight Board. It's been four years, and I'm one of the founding members, and I think the board is an institution, the first of its own kind in terms of an accountability and oversight mechanism when it comes to big tech platforms, in our case Meta. And I think I come from a very unique perspective, where, you know, in the jurisdiction that I come from, the rule of law is not really strong, right.
So many of us who have been advocating for open, safe, accessible online spaces have seen the emergence of regulatory frameworks. And I think we need to keep one thing in mind: these regulatory frameworks emerging from different jurisdictions are not all the same, right. So we have a very good example in the DSA, which many of us really look toward, looking forward to the enforcement of that regulatory framework from Europe on the big platforms.
But when we look at our own examples, we are kind of concerned and worried, in terms of -- there are not really meaningful consultations when it comes to multistakeholderism, so including civil society when these frameworks are being designed or debated by the governments or parliaments related to these regulatory frameworks, and in our case regulators as well.
And then we look at other mechanisms. We are in between two power actors. One is the state. And the other one is the tech giants. So what kind of other mechanisms do you have -- especially when it comes to the users -- that you can use to hold tech companies accountable?
And I think that's why I feel that the Oversight Board is in a way in a unique position, where it is independent. It is an oversight body. It started out as a unique experiment. And now it has become an institution.
And I would encourage people to look into the cases that we have decided. We are a diverse group of people. We are not all, you know, from the Global North or folks from Silicon Valley. We are from different regions, diverse backgrounds, diverse expertise. And we actually take up cases, appeals that are filed by the users.
But also cases where the company struggles. For instance, in some of the cases we have looked into journalists' cases. For instance, off the top of my head, there is one particular case that we decided in the very beginning: it was about the word Taliban, which was kind of wrongfully removed -- kind of over-enforced on Meta platforms.
And this is very much related to South Asia -- and I don't think many folks know about this in the Global North -- because it was putting hurdles in the way of journalists who were reporting using these platforms. And we took this up and deliberated on this case, and we told the company that there was over-enforcement of this word. So you really need to look into the terms of your community standards and the policies you have in terms of, you know, regulating such kinds of terms.
But it was basically to make it easier for media and journalists, especially in South Asia. This is just one example. But having said that, what I mean to say is that when we are looking into the role of states, governments and the power of tech giants, we should also broaden our own idea that it is not just regulatory frameworks that we need to look into.
But we need to look into other institutions as well. In the jurisdictions where states are actually regulating on their own terms and not in a meaningful way, what are the other institutions that can come to the help of users when it comes to protecting users' rights on these platforms?
>> REMOTE MODERATOR: Thank you so much for giving us this overview of the board's work. It's super interesting to hear the broad variety of the work.
And I think what is particularly useful, also for the work of people who are in this field and for the community, is to look at the guidance being developed. It's not only about individual cases and individual protections, which of course are crucially important, as you rightly point out, but also to learn from that and take it a step further and say how policies should maybe be adapted and what can be done in addition to it.
Another thing I found extremely useful that you mentioned is really this point of the need for diversity and understanding local perspectives, because it's really different. And this is something that we see over and over again with technology being deployed globally: the impact is very different, especially if we don't have the same capacities or even language capacities. Then it is a bit difficult. Or the implications may be worse.
I see the fifth speaker has joined but couldn't turn on her camera. So maybe while hopefully our technical colleague can help us -- because we already spoke about this aspect of understanding local context, where sometimes with specific words you might have over-enforcement or wrongful enforcement, where the context is crucially important.
And this is something I would like to bring in -- as you mentioned also when we had the preliminary conversation on this, you have this experience in the contexts you work in. The big tech challenges are really difficult there, because the context may have less strong democratic institutions or fewer checks and balances, both from the state side as well as from the platforms, which are very active in the area but don't deploy the same kind of resources and attention to the regions.
Can you maybe build on this a little bit and tell us what your experience is or the specific challenges you would see in this context?
>> AWS AL-SAADI: Thank you for the question. In general, we are working in both languages, Arabic and Kurdish. Generally, the platforms are not supporting Kurdish at all. There is no programme. Meta has a global fact-checking programme, and we are working in fact-checking. Last October, when we had elections in the Kurdistan area, there was a lot of misinformation.
And even when we exposed it, you cannot flag it, because there is no programme supporting this language, even though there are more than 60 languages in their programme. But still they are not covering this area. And for the Arabic content -- I'll give you an example: in Ukraine there are nine organisations working as fact-checkers. If you go to Spain, there are five organisations working as fact-checkers. But if you go to the whole region, you see just two organisations only.
One of them is a French organisation. And another one is from Tibania (?). And even there, there are conditions. One of the rules from Meta and TikTok is that if you want to be a partner for fact-checking, you need to be a signatory of the International Fact-Checking Network.
There are several organisations from the Arabic region that are signatories, but just one of them works as a third-party fact-checker. On the other side, also regarding the gaps that we have in the Arabic language: after the 7th of October, 2023, with what has happened between Israel and Gaza, there is a lot of restriction on the content. Even as fact-checkers, when we are exposing fake news, they are claiming we are doing fake news; they take the pictures that we are exposing and flag them as fake news.
And the fake news itself is not flagged. So instead of flagging the fake news, they are flagging the Arabic organisation that is working to expose it.
And because we have been partners with Meta since 2015, we sent it to them: we are exposing the fake news, not publishing it. And they just restored the content. But they didn't solve it completely for the Arabic content in the region. If you have a relationship -- if you have connections with them -- they will restore your content.
If you don't have that, I know a lot of organisations and journalists whose accounts stopped because they were taken down. And when they appeal, it's automation. There is no human reviewing it, and it's again the same decision. And another thing is the AI tools. As technology, most of the AI tools you can find started in the English language, including for fact-checking. But in general, Arabic is not supported.
So this is difficult, because if you want to build your own tools, you need funding. You need to work a lot on these technology issues. And it's difficult.
Also, talking about media literacy: media literacy in our region is a challenge, and around 30% of people in the region are not connected to the internet. I'll give you an example: in the Arab region, most of the parties depend on TV channels, so if they run a misinformation campaign, the people will believe it, and in the election they will go and vote for them.
I will stop here. There are a lot of points, but it's better to have more conversation afterward. Thank you.
>> REMOTE MODERATOR: Thank you very much. I think you captured well the interconnectedness of it all. There are so many different layers to the challenges, and technology is not necessarily helping but might perpetuate challenges if there are biases, or no language coverage or attention, or no fact-checking partners and trusted partners, and all of that.
So it's really important to see what angles can be leveraged and how the entire ecosystem can ideally benefit from it.
The second point I found really important that you mentioned is the necessity to have a human in the loop. This is really the kind of thing that has been called for by civil society and many actors for the entirety of the conversation around content moderation. But I think you pointed out that particularly in contexts where the automation is less specific with regard to the language, for example the Arabic language, it's a bit difficult, or even worse and more difficult. So thank you for outlining this.
In this first round of better understanding the checklist of issues when we speak about information spaces, big tech and technology, I also wanted to really bring in the particular aspect of journalists. Because this is something that both of you touched upon already a little bit, in the sense of how journalists are affected in your respective work and your respective areas.
But we know also from the global media scene that big tech poses challenges for journalists on other levels, because of increasing dependencies, as we all know. So I wanted to bring in Elena Perotti. We had a very brief introduction before you joined. So you can also start with briefly saying who you are and what your role is.
And also tell us about this particular role, or where you see -- because it's a global organisation that you are working in -- some kind of specific implications that big tech has on journalism.
In addition to the overarching challenges that we discussed so far, or the specific regional and language-specific challenges: is there something you could help us with to better understand the key overarching concerns about the current power that we see from big tech, and the concentration of power that impacts journalists and journalism as such? Thanks.
>> ELENA PEROTTI: I'm very sorry for being late. I'm Elena Perotti. I'm with an international organisation, a trade body for publishers all around the world. Our main constituencies are the national associations of news media publishers -- so the bodies that try to lobby their countries or the European Commission to obtain better conditions for the business of the publishers.
My constituency, in particular, is exactly those national associations, as I am one of the directors of the association. So I'm privileged enough to have a real global outlook on what the concerns are in the industry all around the world, and how incredibly similar they have become, particularly ever since the digital shift, which I would place at the start of Google in the 2000s; and everything changed again when Facebook became mainstream.
Well, Julia asked specifically about journalism. Again, my expertise is with the publishing business, but I do, of course, have also joined (?). And one interesting point for you to remember is that at the very beginning, when publishers were already starting to be worried about this increasing power of big tech, journalists generally were not. They just weren't. Because the interests did not align, you see.
Whereas for publishers it was clear right away that big tech was about to eat away at most of the revenues of the industry, the journalists -- I'm speaking of 15 years ago, 10 years ago -- still saw as much more relevant, on balance, the fact that Google or Facebook or the others who came afterwards would allow their content to be disseminated more widely.
So the interests of journalists and publishers really did not align at that point. What I've seen in the last three or four years is that journalists are now also joining us on a problem which is -- in a word, the problem is disintermediation.
So big tech has the power of giving to the person who is not extremely interested in news enough news on their platforms, so that they will not click through to the publishers' websites. And therefore the publishers lose money directly, and therefore they do not have enough money to fund the journalism of professional journalists. That is the macro problem.
The other macro problem, of course, is that in the last 15 years the advertising revenues of publishers all around the world have halved, literally halved. I actually prepared data to read out to you -- of course, remembering it is very hard, so I'm reading.
In 2024, about 1 trillion dollars in advertising will be transacted. 1 trillion. Which is an 80% increase compared to 2019, pre-pandemic. Of that 1 trillion, legacy media -- so the professional publishers of news -- will have about 30% of that ad spend, and two-thirds will go to Google and Facebook and so on.
And I'm not exaggerating: more than 80% goes to platforms. I know I'm only speaking of money, but by speaking of money I'm speaking of democracy and professional journalism and the ability to do investigations; I'm also speaking of the security of journalists being sent into war zones. Because all of that cannot happen if the publishers are not sustainable.
So there is really -- I would say, Julia, to answer your question: the main threat that big tech brings to journalism is the defunding, basically. That is what it is. And I don't know if big tech is (?) -- I have no idea -- but they have tried to work with news publishers around the world, signing contracts and so on. They have given handouts sometimes, but that is just not enough.
It is, I think, a democratic responsibility of governments, but also of stakeholders like us, to find solutions so that the defunding of professional journalism does not happen, because that is dangerous for democracy. I don't know how much time I have, Julia. I could speak of this literally for hours. So just let me know.
>> REMOTE MODERATOR: Excellent.
This was a really good overview. I think it's important that you underline how the question of sustainability or funding is not only linked to the question of running the business of a media outlet, right. It's really a question of what kind of information is available, what kind of investigations can take place, from where, and to what degree. So it's really a democratic question, as you rightly pointed out.
And if we want to discuss how we can fight disinformation, as we heard about, or how we can avoid it and protect people online, and how we make sure we have electoral integrity and information available -- all of this is linked to the availability of public interest information. We can only speak about visibility and accessibility of such information if it is available. And it can only be available if there is sustainability and funding in the background. Yes?
>> You asked if there was any other concern, very briefly. The main concern is in societies with a very specific language -- so of course not English, but not even Italian or French or Spanish or German. Of course the problem is misinformation. That is, I would say, the number one problem. Because platforms do not invest at all in fact-checking and in checking misinformation in small languages.
Excuse the term, but 80% goes to the English language, and as a consequence, we know this can have deadly consequences. And that is why local news is important. Because if you want to do fact-checks, you have to do journalism.
>> REMOTE MODERATOR: Absolutely. It's like what you said before about how difficult it is to have fact-checking if there are no trusted fact-checkers. You have all of these different layers that add up to the challenges. And local journalism is, of course, also at the forefront: not only is it very often under pressure from different actors, but local journalists are also the first ones who struggle with funding and advertising and all of that. So again this interrelatedness. And when we speak about this interrelatedness -- also, Elena, you talked about integrity.
And I want to hand it over to Isabelle. You are coming from the Swiss Federal Office of Communications. You do a lot of work in the Swiss context, and also on a more global level: Switzerland has been very engaged, and constructively engaged, in regional and global initiatives that try to work toward this healthy online information space that we are talking about. With this journalistic component, but also with regard to fact-checking and countering misinformation and user rights -- all of the things really that we talked about -- is there something still, before we move on?
I know we are already now moving in the direction of what can we do and what can states do. But can you either already refer to that, or say a few words still about how you, as a state, really see those challenges? And from the state perspective, how they are interlinked from this global perspective. What does it really mean from your point of view?
>> ISABELLE LOIS: Absolutely. And thank you for this very interesting question and very interesting panel. I wanted to move a little bit away from the notion of big tech, because, ultimately, I think when we are discussing these issues, they exist independently of how big the economic actors behind them are. And we work a lot with media. So when we are working with traditional media, what we want to promote is a media landscape that allows real debate in the public sphere. This is really the core aspect of having an independent media.
The largest actors are the gatekeepers, because they control what information is amplified or suppressed. This may not be by design, but it is something we have seen happen. So the issue here is that there isn't enough transparency and accountability from the bigger platforms in controlling or seeing how the information is flowing.
So that means that the public, who is reading and being online, does not have the knowledge of how and why some information is put forward, why some posts are put forward and read, and why others are not.
And this gives these bigger platforms a de facto agenda setting power. And this is where we can see the biggest issue. This new power of setting the agenda, of putting certain issues on the map reshapes the public debate.
Because we are, of course, I guess, all connected and using social media and other platforms. So it will prioritize certain debates and certain issues and hide others. And the content that is most engaged with is often polarizing content or misleading content, instead of informative content or other kinds.
This is where we see an imbalance, and we need more scrutiny. I think this is the main perspective that we try to come with as a government.
And in Switzerland we have identified that it is very important to work with the media sector, or the traditional media sector if I put it this way. Because of course -- and this was mentioned before -- there are significant challenges as the business model changes, with big technology platforms dominating the market.
We talked about advertising revenue that is diverted from certain platforms to others, and media organisations are struggling to sustain themselves financially, and this can lead to a lot of problems.
One of the things that we have seen in Switzerland is that there is a sort of consolidation of the media market. Many of the smaller or medium-sized media outlets have been bought up by larger media structures.
And it is difficult to keep very local news alive. So we try to do our best in this, but this is a complicated issue to deal with. The major problem we are seeing here is really the reduced diversity and availability of local news. It is also reducing the quality of the reporting: fewer journalists can work in it, and there is less money for investigative research. And this means that we have fewer in-depth stories being produced.
So this is one of the points we have identified, as well as online harassment, which is getting bigger and bigger when we get to these polarizing conflicts.
I'm moderating a session tomorrow that will delve into this point, but I wanted to highlight the importance of media and having it as a fundamental part of democracy.
Switzerland has done a study recently that I found very interesting and also quite shocking. It has shown that only half of the Swiss population believes that the media is essential for democracy.
So only half of the population believes that. The rest are either thinking that it does not matter, or worse: about 14%, I believe, thought it was not essential. So this is something that is quite scary to think about.
Because there is this detachment from understanding how important the media is as a pillar, as the fourth pillar, of democracy. If we don't see the importance of it, then we cannot safeguard it. And I think this is where we are working a lot on capacity building. And I know capacity building is something that is discussed a lot at IGF in many different sectors, but I mean capacity building on accessing and getting the information, understanding the information, and understanding why it's important to have a certain source or another.
I think that is really a strong point we need to work on. And I'm happy to delve into some ways to address these challenges. I don't know if you want me to continue immediately or if you want to pass it on to somebody else. I will let you decide, Julia.
>> REMOTE MODERATOR: I do have an online question, but it's almost two-fold, about literacy. Not only information literacy in the sense of understanding why something is shown, as you pointed out, but really understanding the importance of media freedom and pluralism.
This is also something we try to work on; we try to look at this concept of media freedom literacy to really say -- I mean, it's not only about individuals, as you say, but states too sometimes misunderstand the link between media freedom, media pluralism, and democracy. But also, more broadly, the link to stability, to peace, to safety and security.
So this is a very important point indeed. Thanks for underlining it. The second point I think was also very important: you pointed out this engagement factor, right. Currently content is shown more if it's engaging, but we know people might engage more with polarizing content.
And then if we speak about healthy information spaces and democratic deliberation: in the traditional media context we have obligations with respect to diversity and regional diversity, and all of that might not gain the same traction online if other content is being prioritized, even though it is crucial.
And so I think it's really an important question to ask whether we should have similar obligations in the digital context and in the online space, where both democratic states and all stakeholders should tell platforms it's not acceptable to prioritize content just because it's engaging and drives more time on the platform.
And this is maybe a follow-up question, or the lead-up to the follow-up question: where do you see, as a democratic state, or coming from this perspective, how states can work in that direction, while acknowledging that Switzerland of course still remains a fairly small market from a media perspective and from a big tech perspective and all of that. So what are the avenues and possibilities that states can take?
And with that, I bring it back to you. I also just want to offer people -- if there are people in the room who have a question or a comment or want to say anything, please indicate it to us. We also already put the same in the chat for the online participants, because somebody was asking. So please feel free to also think about questions after we hear from Isabelle.
>> ISABELLE LOIS: Thank you. It is a very challenging question to answer -- what can or should states do. And I think states definitely have a role to play in addressing the challenges posed by big tech, both at the national and at the international level.
So there is what we could do in our own countries, and at the international level through coordinated efforts and at multistakeholder platforms. I think it's important to say that it's in collaboration with other stakeholders that we could do the best work. So this is a first point, as a general view of what states can do: I think we can do things, but we should not do it alone.
On the national level, the Swiss perspective is that we have to protect the public's access to quality information and ensure some sort of accountability from platforms. And this can be done through some sort of regulatory measures that really ensure transparency in the platforms' systems, clarify obligations, and empower the users to make informed decisions. Because at the end of the day, users are at the center of this.
So state intervention is one of the ways to address this sort of power imbalance over information spaces, but it is not the only way. And it should be done in a very carefully balanced way, to respect fundamental rights and freedoms. There is a lot of potential harm that can be done with strong state control over which information is accessible and how.
So I think it is a very fine and complicated line to navigate, and there has to be a lot of careful consideration brought into this. So in turn we have an approach of strengthening the users' rights and increasing the transparency of the platforms instead of moderating the content. This would ensure that state intervention does not compromise freedom of expression or step over other rights that are essential for us.
We are currently developing our own regulatory framework for large online platforms, and it is largely based on the European legislation, the Digital Services Act, the DSA. The law we are envisioning -- it is not yet in place -- focuses on due diligence requirements while strengthening the transparency rules and the user rights.
So it is not really about controlling from above, but about getting the users to have all of the information. I can get more into the details on this if you are interested, but there are significant differences with the EU approach, in that our scope is much narrower.
We only focus on very large online platforms and very large online search engines, and we are also limited to what we call communication platforms -- those that enable the public dissemination of information between users for the purpose of opinion forming, entertainment and education. We do not include marketplaces or other platform types that are included in the DSA.
So really, our most important point is to protect the fundamental communication rights of users online, and this is meant to ensure that we have a well-informed public. This goes to the main point that we have in Switzerland, where we are trying to give the tools to the users, to the public, and not control it from above.
And I think this is where we could work more with other stakeholders, with the media agencies, with other countries, not looking at it in a top-down way or a bottom-up way. I will say one more thing on this. We also have a Swiss National Action Plan for the Safety of Journalists that was published last year, with a whole set of measures to address the safety of journalists, not only online but mostly offline.
But our main focus is to raise awareness of the importance of journalism for a functioning democracy. So this is really where we can bring an added value. And we are also working very closely with the Council of Europe, which has a great campaign on the safety of journalists that I can only encourage you to look into.
But, yeah, these are the ways in which we see that we, and states in general, could do something. Of course, if we are looking into regulation, we have to make it balanced. We have to make sure that we are empowering the public and not controlling what is put out there. So this is where we are happy to discuss and engage with any stakeholders at meetings like today's, to hear other ideas. Thank you.
>> REMOTE MODERATOR: Again, thank you so much for this. I think there are a lot of takeaways here; "empowering, not controlling" is one way of phrasing it. And this is something that we at the OSCE have been talking about: we think about process rather than about individual pieces of content.
It's the same when we talk about misinformation and fact checking. How can we make sure the processes work better? That we have, as you say, transparency and accountability and all of these things, and multistakeholder engagement in a meaningful way. This is also something we heard already today: if we want to address the undue power of big tech, we do not want to replace it with undue state power. This becomes especially important in contexts where we don't have the same democratic institutions.
We only have 10 minutes left. So again I want to ask if somebody in the audience has a question. If not, I would first ask Elena -- because we had a few sentences on journalism and on the public sphere and pluralism. Can you build on that and say briefly what you think should be the role of the media industry in responding to the power of big tech over information spaces, and whether you see an opportunity or possibility for technology to help in this regard?
>> ELENA PEROTTI: Yes, yes, thank you, Julia. And indeed this question comes at such a good time, because I just had an idea for a new project related to that.
But first of all, I wanted to thank Isabelle for her intervention, and I would like to add to what she just said. Indeed, only states, I think, can have the responsibility to choose what stays online and what goes offline -- I should say, only democratic states can have that responsibility. I believe that if we leave that responsibility to platforms, they will take down anything that could even potentially bring lawsuits and economic problems for them.
So the role of the state is very important. We saw how important it was in all the elections, and also for the sustainability of media, with all the law and antitrust decisions that have been taken around the world, in Australia, Canada and so on.
But I agree, Julia, that it is time for the media to do something internationally as well, in addition to what states can do to help their sustainability and, as a consequence, the democratic discourse, which is fostered by professional media.
I believe -- we believe -- that stakeholders should pull together: the publishers, but also the advertisers and civil society and so on, to try and sustain a media environment which is safe because it brings good information to the public. What I mean is that in my first intervention I spoke of how problematic it is for publishers that all of this advertising revenue goes to big tech instead of to news media.
But it's a problem also for advertisers, which very often find their advertising being placed by bots alongside content that is not flattering for their brand at all. It is actually a disaster. So in the end there is a double interest, for advertisers and publishers, in ensuring that good advertising -- at least good professional advertising -- goes to good professional brands.
So what I'm trying to foster -- what I'm trying to work on -- is an alliance between big stakeholders, which includes professional media, and not only the big professional media but in particular the small local media, along with advertisers and people who can advise us, like the OSCE for example, to find a way to make sure a certain portion of advertising goes back to media done in a professional way.
And in the brief that you gave me before this session, you asked whether there is anything technical that should be explored as well. Well, yes, there is, and it's really important. I just found out recently that local and regional media very often have websites that are not optimized to receive programmatic advertising.
So this matters even if we are successful in producing some sort of project which would drive more programmatic advertising to them. I'm sure everybody knows what it is, but programmatic advertising is advertising that is automatically placed onto websites that say "I'm waiting for advertising." That is basically what it is.
But websites need to have certain technical specifications to allow this to happen. So we will try to put in place, in the next 18 months at most, a process to ensure that more programmatic advertising goes to professional media sources, including local ones.
And we are also going to find a way to ensure that as many as possible of these websites are in the best condition possible to technically receive this programmatic advertising.
>> REMOTE MODERATOR: Thank you very much. It is, first of all, a call to action, which is very good, because this is also what we aim to come up with at the end. And it's important to also speak about advertisers, who are part of this multistakeholder approach, of course.
So maybe in the last 5 minutes I want to turn to the question of civil society. Aws, you talked about what you are doing in your work, and Nighat, you are also on the UN advisory body on AI. I know there is not a lot of time left. Aws, I want to ask you about the specific civil society experience, and Nighat, maybe you can close with a call to action. Thanks.
>> AWS AL-SAADI: Thank you very much. As an organisation, we put pressure on the tech companies to take us on as a third-party fact checker and to combat misinformation, because when one organisation acts alone, they ignore it because it's coming from the Arabic region. The other thing we are doing is raising awareness for people in different ways.
Online we have 2.3 million followers; we are the biggest fact-checking operation in the region. And we have a platform with different levels of fact-checking and digital security training. We are also doing webinars and raising awareness on the ground with IDPs, internally displaced people, in the camps, and we are supporting initiatives in other organisations to do fact checking.
For example, we did this in Yemen, in Tunisia and also in Libya, and we plan to do it in other countries in the future. And as an aside, we are building a tool too: the Tech4Peace mobile application. It does not only rely on our articles for fact checking; we are also building it so people can fact-check themselves.
Anyone can pull this up. It is in three languages, including Arabic and Kurdish, and you can fact-check by text and by video. Most of the fact-checking tools for videos require a computer to use; ours works on a phone. We now have more than 100,000 users, and it is still growing really fast. I will stop here.
>> NIGHAT DAD: Thank you so much, Julia. I will talk a little bit about the UN Secretary-General's High-Level Advisory Body on AI and the recommendations we have given. But before that, I would also like to talk a little bit about the white paper on AI and content moderation that we released at the Oversight Board.
Some of the really important points our panelists have already highlighted. As a board, we have delved into so many cases and engaged with so many stakeholders, including civil society around the world. So far, we have received 10,000 comments on our cases.
And I would really encourage people to look into those deliberations, the case decisions we have released and the recommendations, because those recommendations go deep into the tools and policies of the platform.
So some of the key lessons that we learned as a board -- lessons that civil society and the tech platforms can also learn from when we talk about AI and content moderation -- are, first, that automated moderation and curation systems must be rigorously and continually tested for the users who are most vulnerable and most at risk.
Another one is that human rights, freedom of expression and ethics experts should be involved with these tools early in the process.
And then, we always say that transparency is paramount: giving access to third parties or researchers is something we have been talking about a lot in our recommendations as well.
Now coming to the high-level piece, the UN Secretary-General's High-Level Advisory Body on AI. We gave several recommendations, and two of them, after being negotiated by 193 states, became part of the Global Digital Compact, the GDC. One was setting up an AI scientific panel, and another was a global dialogue on AI governance.
I think it is so important now for civil society to keep watching this space, especially because this dialogue is being set up alongside events like the IGF, and civil society will have a lot of space to engage with other stakeholders, like governments or AI companies, to shape the global governance of AI from the beginning, be it at the global level or at the state level. I will stop here. Thank you.
>> REMOTE MODERATOR: Thank you so much. I acknowledge that there is no time left, so we are really on the spot. This was such a rich conversation with so many takeaways, and we will now certainly go through our notes, take out the calls to action from your crucial input and report them back to the IGF.
So we can live up to what you said, like learning lessons from one another, learning from one another's experiences, and it will also feed into our work. I just want to thank you at this stage, all of you, for your insights and your crucial contributions, and also, Martin, for doing the local moderation, even if there were not so many questions from the floor.
I don't know if you still want to add a closing sentence, but in any case, thank you all very, very much from our side. Very crucial input that will be very useful for our continued work on this topic. Thank you very much.
>> MODERATOR: Thank you, Julia. A really great conversation, and thanks to the panelists and the people in the room.
>> REMOTE MODERATOR: Thank you all, and enjoy the rest of IGF.
>> ELENA PEROTTI: Bye.
>> REMOTE MODERATOR: Bye.