The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.
***
>> FAWAZ SHAHEEN: Welcome. You can get audio on channel number two. As the session progresses, you can check out the workshop policy. There's a shared document that we can all work on. We'll talk more about it as the session goes on. At the front desk is my colleague, Nadine. She has the QR codes. If anyone wants to scan and get the documents, you can do that.
Before we start, I would like to do some housekeeping and check if the online participants are able to speak and come on. Maitreya, can you come on? Or can you unmute them one by one and see if they are able to speak, and if you can hear them.
>> TITHI NEOGI: We are audible. Thanks.
>> FAWAZ SHAHEEN: Thanks.
As I said, we would like to talk about some of the ways to make the Internet more inclusive in a broader sense and more particularly as we move towards establishing data protection and privacy regimes, including those of us in India. This is a chance to have the conversation about how the data protection and privacy regimes can make the Internet more inclusive instead of exclusive. That's the basic idea and the basic sense for which we started.
For the session today, we have a workshop policy to make sure that the session is accessible to everyone joining us, both online and onsite, so that people with diverse abilities and different kinds of abilities are able to experience it just as the rest of us do. We are requesting all of the speakers to briefly describe their physical attributes in their own words. I'm Fawaz. I'm a bearded man. Kind of big. I'm wearing a gray jacket today. I have short hair. Just something brief like that when you are speaking. It will help bring everybody in and make it more of an inclusive conversation. Thank you in advance for that.
Apart from that, we have, if you want, a detailed look at the workshop policy. Although, I'm sure, there's nothing very difficult or extraordinary that we're asking. It is just basic stuff: being respectful and inclusive and bringing everybody into the conversation. The QR code for the workshop policy is with Nadine. Scan that and just fill it out. We have a shared Google document as well. We would like to build some common processes together, and that's what we invite you to participate in. Our online colleagues will be sharing the links in the chat. For those onsite, there's a QR code to scan for access.
Before we begin, I am going to introduce our two presenters. You can see them on the screen. They've authored a policy brief on disability and data protection which looks particularly at the new data protection law in India, its interaction with the rights of persons with disabilities legislation, and tries to get a sense of where we are standing on all of these issues.
I would like to request that they take ten minutes, walk us through the brief, and lay out what they envision for the session. After that, we have an excellent panel of speakers. I will introduce them one by one, and we'll take this forward. Thank you.
>> TITHI NEOGI: Thank you, Fawaz, for that introduction. I'm Tithi Neogi. My pronouns are she/her. I wear glasses. I have wavy hair. I'm wearing a red hoodie today.
Over to you, Angelina.
>> ANGELINA DASH: Hi, everyone. I'm Angelina Dash. My pronouns are she/her. I'm wearing a red jacket. I have wavy hair.
Over to you.
>> TITHI NEOGI: Today's session is based on the policy brief on data protection for persons with disabilities in India which, like Fawaz mentioned, Angelina and I have co-authored and have been working on for a while. While India has taken a step towards inclusion in its data protection framework, we have identified some gaps in the nuances and have tried to plug some loopholes to advance digital inclusion for Persons with Disabilities. The common themes that we have identified in the data protection framework are digital accessibility and inclusion, data autonomy, and sensitive personal data.
I'll start off with what we've found on digital accessibility and inclusion. Specifically, in our research, we have identified digital accessibility as a prerequisite for Persons with Disabilities to give meaningful consent on the Internet. While Indian law guarantees Persons with Disabilities the right to accessible information and communication technologies, the data protection law does not really mandate data fiduciaries to operationalise or implement consent mechanisms that have accessible interfaces and can be used easily by Persons with Disabilities.
Another thing that we have identified in the research is that the data protection law allows the lawful guardians of Persons with Disabilities to give consent on their behalf. That takes the onus of giving consent away from the person and shifts it to the guardian of the person with disability. That in turn takes away the incentive that data fiduciaries might have had to make their consent mechanisms accessible.
Based on the findings on digital accessibility, we recommend making consent mechanisms compliant with accessibility standards and compatible with assistive technologies and audio-visual formats, electronic text, et cetera.
I'll hand it over to Angelina to continue the discussion on digital accessibility. Over to you.
>> ANGELINA DASH: Thanks, Tithi.
Another concern that we have in terms of access and inclusion is that Persons with Disabilities have limited access to digital services without the consent of their lawful guardian. The law requires the consent of the lawful guardian in order for them to access digital services on the Internet. This is concerning because of two scenarios.
In the first case, we have Persons with Disabilities who cannot access digital services for which the support and consent of the lawful guardian may not be required at all, like a digital encyclopedia. In the second case, we have Persons with Disabilities who are hindered from accessing digital services which provide important resources, like, perhaps, online communities for Persons with Disabilities.
Access may also be blocked in cases where a conflict of interest arises between the lawful guardian and the person with disability. This could include access to helplines. Persons with Disabilities must be allowed to access digital services on the Internet without the consent of their lawful guardian in certain cases.
Over to you, Tithi.
>> TITHI NEOGI: Thanks, Angelina.
I'll now speak about the second theme that we have identified, and that is data autonomy. The Indian data protection law singles out two kinds of data principals for special treatment: children and Persons with Disabilities. In the law, children and Persons with Disabilities have been clubbed together and subjected to similar treatment. That is, the lawful guardian of a person with disabilities is able to consent on their behalf.
The guardians are thus being elevated to the status of parents, which is not the case in the disability rights framework. If you look at the disability rights framework in India, a lawful guardian for a person with disabilities is envisioned as somebody who helps or assists them with the decision-making process in a mutually consultative framework. But in the data protection law that India has right now, lawful guardians of Persons with Disabilities have the ability to give consent on their behalf completely, without accounting for any mutual decision making.
In our research, we have recommended decoupling Persons with Disabilities from children, with a separate provision that defines Persons with Disabilities. This definition should account for the various degrees of support that Persons with Disabilities may need. We have suggested that any consent collection processes online be informed by a consultative framework between Persons with Disabilities and their guardians, driven by principles of mutual decision making.
I'll now hand it over to Angelina to speak about the third thing. Over to you.
>> ANGELINA DASH: Thanks, Tithi.
I think, as Tithi mentioned, the third theme is in the context of data protection, specifically with regard to the absence of a category of sensitive personal data. The data protection law that was introduced in India came after a long process. Previous iterations of the law had carved out sensitive personal data. This was not included in the final law.
What is SPD? It was a distinct category of personal data warranting additional safeguards. What did the safeguards look like? Specific grounds for processing, including grounds like consent or specific purposes of state action. It is important to question why the need for SPD arises in the first place. There are some concerns regarding the sensitivity of data being used as a basis.
However, we feel that India's data protection framework is at a nascent stage. Additionally, certain data of Persons with Disabilities, like health data or financial data, can be more readily available, and this data is more susceptible to misuse for the purpose of discriminating against Persons with Disabilities, particularly in terms of employment, health care, and social welfare. Therefore, our recommendation in our policy brief is that sensitive personal data as a category should be introduced within India's data protection law.
We would now like to move from the policy brief to today's IGF session, where we will be continuing the discourse on centering disability in Internet Governance more broadly. Through our insights from working on the policy brief and our, admittedly limited, experience engaging with stakeholders for Persons with Disabilities, we have gained an understanding of certain gaps in Internet Governance with regard to disability.
I would like to hand it over to Tithi at this point. She will elaborate on the gaps we identified. Over to you.
>> TITHI NEOGI: Thank you, Angelina.
These are some of the gaps that we have discovered in the discourse on data governance, globally as well as in India, and we would really like participants today to share their experiences and what they feel about this. The first gap concerns the explicit categorization of Persons with Disabilities as data principals in the Indian data protection law.
This happens to be a novelty of sorts. We haven't really come across this distinct categorization elsewhere. This is something that we've been discussing: whether this separate categorization of Persons with Disabilities was a good measure if you are going to have specific consent processes for disabilities.
Is this the right approach or the wrong approach? We are aware that the GDPR refers to certain vulnerable classes of data subjects, but it does not mention Persons with Disabilities explicitly. We would like to get insights from the participants as to whether they feel the GDPR approach is the way to go, or whether the Indian approach of specifying Persons with Disabilities as a distinct category of data principals, with specific measures with respect to consent, is the approach to take.
We are also aware of discussions on the GDPR's precedent-setting effect on data governance, and whether the GDPR would be a good influence or whether the novel approach that the Indian law is taking is the way forward. We would really like to hear some discussion on this from our speakers as well as from participants.
I will now hand it over to Angelina to discuss some of the loopholes. Over to you, Angelina.
>> ANGELINA DASH: Thanks, Tithi.
I think another gap that we identified was the lack of a global majority perspective in the discourse regarding disability and Internet Governance. Persons with Disabilities are not a homogenous group. Intersectionality may apply in terms of gender, as well as broader infrastructure concerns and the digital divide. That's where we come in and aim to highlight some of these issues in the discussions in the workshop today.
With the context and the background that we've provided and the gaps that we have identified, we intend to build upon the work in the policy brief and extend the conversation to also address disability and data protection in the context of AI and automated decision‑making systems. We hope to use the session as a forum to facilitate multistakeholder conversations and collaboration. This will enable the co‑creation of best practices towards digital inclusion for Persons with Disabilities.
With this, I would like to conclude our presentation. I now open the floor for any questions. Fawaz will be assisting us in the Q & A.
>> FAWAZ SHAHEEN: Thank you.
We're making a slight change to the format, as we're running short of time. We will take all of the questions together. We'll have two rounds, with questions on the presentation after the first round. We have to change things on the fly. Sorry.
Now, without further ado, we'll move to the conversation. Join me on the stage. We also have two excellent speakers joining us online, Eleni and Maitreya. Just to remind all of you, we would like the session to be as conversational and open ended as possible. We have the first round of questions, four to five minutes each. Then we open it up for questions.
I encourage all of the onsite participants to ask all of the speakers, as well as Tithi and Angelina, about this topic and this issue. All of the online participants, feel free to put your questions in the chat. Our colleagues are taking the questions and relaying them to the speakers.
After the first round of speakers and questions, we'll have another round. We have one hour left. I will first invite Eleni Boursinou, who works in UNESCO's information sector on universal access to information. What role does digital accessibility play in sustainable development? And, drawing from your own experience, what is the role of digital accessibility and inclusive design in enhancing digital autonomy?
>> ELENI BOURSINOU: Thank you.
I'm Eleni. Today, I'm wearing an Indian shirt and I have wavy hair. I don't know if my camera can be switched on.
>> FAWAZ SHAHEEN: I think we can switch on the camera. Can you check?
>> ELENI BOURSINOU: Sure. Thank you very much for this.
So, thank you for the very meaningful question. Digital accessibility plays a critical role by fostering inclusion and equity, particularly for what we call Persons with Disabilities, but in general for any marginalised group.
By removing barriers, it bridges the digital divide in the digital economy and in what we at UNESCO's communication sector call the knowledge societies. This supports the IGF agenda of leaving no one behind, promoting social justice and equity. There are also accessible tools, such as OER and universal design for learning, that play a significant role in empowering education. This relates to SDG 4: reducing dropout rates from school, enhancing educational outcomes, and promoting lifelong learning.
Additionally, accessible digital solutions can address challenges related to gender and disability, as in SDG 5, and can empower women and girls with disabilities to access education, employment, and literacy opportunities. Accessibility standards can provide open solutions that contribute to global cooperation, fostering access to information and promoting partnerships.
In this context, digital autonomy and inclusive design allow individuals to control their data. Accessible platforms ensure that everyone, regardless of ability, can participate in data-driven decision making. What we at UNESCO call open solutions are cost effective, with open licencing; they can be free and open source. Open educational content and OER platforms provide resources that empower individuals to control their data and engage in lifelong learning. Accessible formats and assistive technologies enhance understanding of and trust in data-driven systems, while universal and inclusive design mitigates biases in automated decision making, ensuring fairness and safeguarding marginalised communities from discrimination.
So, the key takeaway is that embedding digital accessibility and inclusion principles in policies and practices can ensure equitable participation in the digital economy and knowledge society.
First, by improving data collection and analysis, AI can support more equitable decision making, making sure that marginalised communities are included in efforts aimed at achieving the SDGs. Second, a human-centred approach to AI and digital tools is essential for ensuring that the benefits of AI are distributed equitably and contribute to the Sustainable Development Goals, including promoting data autonomy and accessibility.
That's all from me for now. I'll be able to answer any questions that you might have.
>> FAWAZ SHAHEEN: Thank you. We'll come back to you.
Now I would like to go to the onsite speakers. Osama Manzar is Director of the Digital Empowerment Foundation and works with a large community network of digital fellows. They're not just working on improving access to the Internet, but are also effectively mandated to train people, to work with people, and to make the Internet a safer and more inclusive space.
Today, particularly, we are happy to have Osama with us. Recently, they published a report looking at ICT for empowerment, inclusivity, and access. It is a report which surveyed more than 250 Persons with Disabilities and mapped their various challenges and abilities. I would like to invite Osama to share some of its findings.
To start off, I would like you to talk about doing this kind of work. What are the challenges associated with gathering data on disability and talking to Persons with Disabilities, while maintaining autonomy, and maintaining anonymity where it is required, and balancing that with the need for good data to work on disability? This becomes an even more urgent question when it comes to issues of the census. Your report talks about the need for a new Disability Census in India, among other things. I know it is a very broad question.
I would like to start with that. We can move on from there.
>> OSAMA MANZAR: Yeah. Thank you, Fawaz.
I will focus my discussion more on ground realities and also with the entire community. Whatever you call it. We have five minutes. I will say there are three things that is are important.
One is that we treat our people with disability as subjects — I no longer say objects, though I would have said so earlier. More like somebody about whom we need to do something, where there's no role for them themselves, right? Exactly. We can, you know, exclude them and all of that. That's very much the behavior of the doers, whether it is the corporate sector, government sector, abled people, philanthropists, or anybody: there's something that needs to be done for them. It has been done like that for so long.
If you talk to disabled people, or People with Disability, they feel like somebody will do something someday. You know? We are just waiting. You know? So the whole ability to come out, to put forward their demands, or to ask for accountability is almost negligible. Almost negligible. I would say they are not treated any better than other poor people in remote areas, or the people who are caste-wise or class-wise treated as absolutely downtrodden.
I'm also coming from the perspective of the larger scale. We have about 50 million People with Disabilities in the population, even more. About 5% of the population of India is People with Disabilities. It is huge, about as large as the indigenous community, which is also about 5%. That's a huge, huge number in India. And in the last 20 years, we started seeing that digital development was becoming more like an enabler, and an automatic enabler, for People with Disabilities.
Let me put it this way: if I don't have access to information, I'm more able with a digital access device, right? If I don't want to talk to people, I have somewhere to talk online, and it enables me — my confidence, my requirements, and so on and so forth. At the Digital Empowerment Foundation, we realised this by going on the ground. Even when we were going on the ground to provide digital access or infrastructure, we were seeing that digital accessibility was not visible; nobody was thinking about People with Disabilities. All of the government entitlements that we ought to deliver for them, they were not coming to take. We didn't know how to talk to them.
Then, we also realised there's a lot of stigma and a traditional looking-down-upon kind of behavior in the community. We look at them from a lot of distance, you know? We don't want to be in close conversation with them, or touch them, or feel them. They are considered a curse on society, you know? That's the last thing one can imagine if you want to do a multistakeholder way of growing things. You know? They are not part of the stakeholdership.
In the next one minute: what we did is we started talking to them and including them in the doing, rather than talking about what's your problem and what's my problem and all of that. Okay, this person is doing connectivity; you can also do connectivity. If this person is running a public access point, you can also run an access point. If this person is accessing a computer and making a payment for somebody, you can also do that. Everything that we thought anybody can do, a disabled person can do too, and we started working with them on it. They became part of the working ecosystem.
Suddenly, they became social entrepreneurs, or entrepreneurs, or providers. Earlier, they were seekers; now, they are more like providers. We have created not one but 300 examples: people who run digital centers, digital service delivery points at the village level, and they provide services to all of the abled people, actually.
Then what we learned from them is that now you can talk about disability. You can talk about their misery. You can talk about their requirements and so on and so forth. That actually brought us on an equal footing, right? Then we thought that this model can be replicated. What I'm now coming to, the last part, is this: now there are able people providing services, talking, doing infrastructure development, digital access, everything, right? They also have more knowledge than others, because they know extra things about the special abilities needed to serve the disabled. Then we did the research. The government has not done a census of people with disability for so long. Why don't we have that?
Now, we're asking them: you demand it, rather than us demanding it for you, you know. We are going to demand a census for the whole country, right? Can we have a special one just for this? It has been about 15 or 20 years, actually. Two decades. One decade is already lost, and the next census is not coming. How can they get what they want? None of the government facilities on the ground — the enablement for access for people with disability, the whole digital center and public access point infrastructure — is accessible for them. You know, it is not user friendly. Digital exclusion is making even normal people disabled, let alone disabled people. They can't type. They can't go to the place. You have to do extra. You have to become audio-visual. Those kinds of things started coming in.
Third, people with disability have more networks and data about their own community, right? That can be a secondary source of data about People with Disabilities, and we must take advantage of it — that is one of the recommendations that we have made.
The next part is that data, in any case, is very important for anybody, right? And so is data protection. Why a more protection-oriented approach is required for People with Disability is because of the extraordinary deprivation and the way of looking at them. You have to be more protective about them. You have to make sure their participation and their consent are wider.
The last part is: do not treat People with Disabilities as though they had a mental disability. You know? Our researchers clearly said that the government has created legislation where you are infantilising their disability. They are very much able. Why should somebody else take a decision on their behalf, just because they cannot walk or cannot use their hands? That's not fair. From that point of view, the 40% of the population that cannot access the Internet is disabled — disabled from accessing the Internet, just because they are not digitally literate. How can you treat them that way? That's very, very important.
But my last point, you know, is that we must do conversation, action, and intervention with People with Disabilities in everything — whether you are doing research, or data collection, or doing some other work. They must be part of the ecosystem. Then it becomes complete, you know? The conversation needs to change. Don't do an all-male panel; always do a panel with a woman. Similarly, can we always have a panel with at least one person with disability? Can you always have your discussion with one person with disability, so that they are always part of it? Normalising the participation of people with disabilities in the normal conversation is what I would like to see.
>> FAWAZ SHAHEEN: I think that's an excellent point. It is important also to highlight how even sometimes very well intentioned interventions by Civil Society, by Human Rights actors can also be very infantilising.
I would like to move to our next speaker, Maitreya, who brings a unique perspective. Go and read some of his writings; you can find a lot of them on his page on the NLU web site.
Thank you for joining us. I would like to start by asking you: considering the increasing digitalisation of essential services, what are the gaps in legislation in India regarding the rights of Persons with Disabilities in the context of data protection? And how can the policies that we adopt encourage user-centric approaches in the development of technology that serves Persons with Disabilities?
Again, it is a sort of broader question. We would like you to come in on that and take us through some of these perspectives with your insight. Maitreya, are you able to unmute yourself?
>> MAITREYA SHAH: Yes. Hi, Fawaz. Thank you for inviting me. My pronouns are he/him. I'm wearing a button-down shirt. I'm an Indian man with dark brown hair.
>> FAWAZ SHAHEEN: Yes. We can see you now. Thank you.
>> MAITREYA SHAH: I think this is a great question. I'm wondering if I can build on what Tithi and Angelina covered about the current legislative situation in India, and speak about digital accessibility and data protection a little more broadly in the Indian context. To start with, India has always excluded People with Disabilities when it comes to digital technologies or other forms of emerging technologies. A lot of my work has been on how the digital ID space in India has, you know, not considered the privacy and accessibility implications for People with Disabilities — how biometric systems treat People with Disabilities, quote unquote, as outliers or non-normative.
So in India, I think, this issue has been outstanding since, you know, the digital ID project that was started way back in 2009, with implementation from 2012. Even as of today, People with Disabilities are facing issues with the technology and with authenticating their identity, especially for public services such as benefit transfers or welfare programmes. This has been a long-standing issue. Coming to the more recent legislative and policy developments: the data protection law of India does not treat People with Disabilities as equally as it should. There are several problems with its provisions.
To start with, you know, the Rights of Persons with Disabilities Act, when it was enacted, was meant to be the parent legislation when it comes to disability. My contention is that the data protection law should not be the place to design frameworks of guardianship and other very complex social issues anew; it is the disability law that has done this sufficiently, which is the right thing. But the data protection law suddenly starts a multiplication of legislative provisions and adds new complexities for People with Disabilities, right?
Tithi and Angelina raised an important question: whether the GDPR approach or the Indian approach is correct. India, although it borrowed heavily from the GDPR, did a lot of innovation when it brought in this law. As I said, we probably don't need a separate provision for People with Disabilities at all; we might be good with the GDPR approach. The idea is to respect the agency of People with Disabilities, not to develop new consent mechanisms.
Instead, the focus should be on, you know, making your technologies privacy preserving, making your technologies more accessible, and seeing that technologies do not unlawfully access the disability or health information of People with Disabilities. The focus needs to shift from the user to the data fiduciaries or corporations that are building technologies. In India, I think one of the biggest problems with the legislation is that a lot of the onus is placed on People with Disabilities.
And on privacy, you know, there's been a lot of commentary and a lot of criticism of individual, consent-based privacy frameworks. Individual privacy might not be the best way forward, especially with emerging technologies, where there is little real user agency anyway. I think we need to think about making technologies more privacy preserving, rather than putting additional complexities on people with disability to, you know, coordinate with their guardians and then work out consent just to access even the very basic necessities — because digital access and Internet access are now a necessity.
I can give you an example of how this is actually playing out on the ground. Recently, I was doing a training session for assistive technology manufacturers in India who are building technologies for People with Disabilities. A lot of them told me how this guardianship provision, and the complex consent provisions in the data protection law, are raising many issues for them, because it is not giving them an adequate opportunity to provide their technologies to people with disability. They have also written to the government saying that this provision needs an amendment.
This is one very broad issue, specifically when it comes to assistive technology. The other broad issue is that there's been this inherent tension between accessibility and privacy. At times, People with Disabilities are compelled to give away their privacy to access digital technologies. To give you an example, earlier this year I wrote an article on how some companies are developing automated tools, claiming that these tools can fix web sites and make them accessible without changing the source code. These practices are inherently deceptive. What such a tool essentially does is, in the name of making a web site accessible, violate user privacy, because it can profile a set of individuals, collecting data on who is a screenreader user or who uses a magnification device, and so on.
Although there's been some conversation on this outside India, especially in the United States, this conversation is not happening here. Why this is particularly problematic when it comes to privacy is the pace of adoption of these technologies: whereas it is in a way being checked now in the United States, companies and even the government are increasingly adopting these problematic technologies in India. To give you an example — you can go back and check this — there is a platform managed by the Ministry of Electronics and Information Technology, a quote unquote knowledge platform for India's AI economy, and even it handles user accessibility this way.
So I think my point is: we need to think about this comprehensively. Digital accessibility is not a tick-box solution. We have to think about it a little more comprehensively, especially when it comes to privacy. There are additional challenges due to AI and emerging technologies that are posing, you know, new issues for People with Disabilities.
Thirdly, I'll briefly touch on the issue that, you know, India's digital accessibility laws are still largely unenforced. We have not adequately made our government or private web sites accessible. They focus on accessibility for people who are blind, but not adequately on people who have other forms of disabilities. There are challenges of accessibility and privacy for people who have learning disabilities like dyslexia, for autistic people, and for people with intellectual disabilities. I think we are, you know, not adequately thinking about a cross-disability approach when it comes to accessibility and privacy. In India, there are many issues at a larger policy level.
There are people with disability and not adequately represented in the conversation. That's duplication and lack of harmonization in regulations. You know, I think we are not adequately regulating the larger sector when it comes to resolving privacy for People with Disabilities and ensuring accessibility. I'm happy to talk more about this in the question and answer.
>> FAWAZ SHAHEEN: Thank you. That's a good point to pause and get our first round of questions and observations. I know some of the conversation has been very India-centric, but these are problems that are universal. We are fortunate to have a lot of onsite participants from different areas and different countries, so if you have questions, observations, or interventions from your perspective, we welcome that. Please also feel free to flag any questions from the chat, from the online participants. Anyone have a question here, or an observation? Yes?
>> AUDIENCE: Thank you. The question is for the in-person speaker, since you try to be practical. I'm coordinating the implementation of a policy framework, and a lot of that is stakeholder sessions, as part of supporting some African countries in developing their national data policies. Our project is meant to be participatory and inclusive, but there's a challenge with accessibility. What is your best approach, and are there implementations to try?
>> OSAMA MANZAR: Thank you. I have a good answer. Imagine one typical African country. What country are you from?
>> AUDIENCE: I'm from Ghana. There are many villages.
>> OSAMA MANZAR: Listen, imagine a village where, generally, people are not connected and they don't know computers. They need to educate themselves on computers. There are people who want their work to be done, or to draw their money. Hundreds of services are now delivered digitally. That's the scenario you can see. We try to see how we can provide all of those services to the people, right? It is very simple: either you go to the school and set up a lab, or you create a common spot. To do that, is there a person there
who is ready to learn? Educated or not, it is not necessary; that's not the basic qualification. Are you ready to take the chance, sit at the computer, and provide, from your center, all of these services that we are talking about?
Suddenly, your job is to find a person with disability, with an intention to serve people, right? And that itself was such an empowering thing. Now the disabled person who can't move is sitting there and giving you education on computers, switching computers on and off, letting you know that this is the way you can log on, this is the way you can fill the form and access YouTube, this is the way you can do something. Right? Digital becomes ammunition for that person to provide the service.
Earlier, they were sitting at home, waiting for someone to do them a favor. Now: you do this from your home, you don't have to go to the shop. Everybody is coming there, you know, to get the service, the same service for which people used to go to faraway places. If the operator is highly educated, people may find that person arrogant; here, most of the people start coming there. That's one person.
That one example becomes a viral thing for the rest of the people, and the imagination changes. We did this not in one place: now, out of 2,000 locations, 500 locations are run by disabled people. The whole narrative changed. Then there are the questions, which I'm not going to go into now; maybe if I get a chance, I'll share the statistics with you. 85% of the people these centres serve are people without disability; you don't have to think of this as something done only for the 15%. The operators are the visible representatives of people with disability. Those are the things that we did. We can share with you the research on all of the 500 people and how it can become a replicable initiative in many other countries.
>> FAWAZ SHAHEEN: Thank you. We encourage more participants to come up with their questions and interventions.
Before we move to the next round of discussion, are there any questions from the online chat? Otherwise, we can move to the next round of questions and discussion, and come back to this as well.
>> TITHI NEOGI: There are no questions online in the chat. But there's one coming from Eleni commenting on the point ‑‑
>> FAWAZ SHAHEEN: We're going to ask Eleni to come in.
(Echoing)
>> ELENI BOURSINOU: I was saying it is very interesting.
>> FAWAZ SHAHEEN: We can't hear the conversation here. Especially because you asked the question about, you know, the work being done at the digital centres. The next round of discussion is around automated decision making.
(Echoing)
>> FAWAZ SHAHEEN: We wanted to ask a question because ‑‑ is it? Yeah. I think there's some problem. Should we wait a couple of minutes?
(Echoing)
>> FAWAZ SHAHEEN: We have done more for the local initiatives. One aspect is empowering people and making them providers and enablers. But there's also the question that the data around disability is extremely skewed.
So the picture of who a disabled person is, is very stereotypical, if I may say so. Through your experience and through your work, including the interventions as well as the research, you have, of course, a much better ‑‑
(Silence)
>> FAWAZ SHAHEEN: ‑‑ say at the government level, where we're trying to design interventions at a larger level. How do you think this diversity among Persons with Disabilities can be approached? How can it be accounted for?
>> OSAMA MANZAR: I would say that it is actually about innovation in data collection and also in data contextualisation. We have data that shows the prevalence of various kinds of disability. But you also have to build on the abilities of the same person who has the disability. The example I'm giving is: let's say my hands do not work. That's one disability. But should we only work on making that hand work? Or should we work with the fact that your mouth is working, your voice is working, your mind is working, the other parts of your body are working? How can those working abilities, and enabling devices, make it so that I don't even have to think about my hand? It is something like the remote control, which we used to handle by hand.
Now all of the remote controls are voice enabled. Voice controlled. I'm controlling all of the remote controls by voice; I don't need my hand for managing them. I'm just giving one example. Rather than thinking, you know, "I will create something non-handheld," think about what things are already enabled: voice enabled, visually enabled. That's where the real innovation and contextualisation is very important. While I'm not denying that our census and data collection must be done, we should put equal effort into asking what ability-oriented things are already available that can be contextualised.
For example, can we have a list of everything that a mobile phone does which directly relates to people with disability? The mobile device is the most able device among all of the other devices in terms of making you able to work. It can transliterate. It can speak out your messages without you seeing them. There are many, many things that the mobile gives you, in a secure manner, by keeping your data secure. I don't know them all, and many other people with disability do not know.
Especially if I don't know, and I'm a sort of enabler in some of these communities, it is a pity that I don't know. Can there be a whole, you know, write-up and checklist of everything that we must apply? All of the able things that a digital access public centre at the village level should have: it should have 100 other things, which has not been done.
And, for example, none of our websites are voice enabled. None of them are hearing enabled, you know, or visually enabled. That's the first thing that we should do. And to do that, you have to make them mobile enabled. If you make them mobile enabled, then automatically your content can be read out and so on, rather than staying on the typical or traditional web version. What I'm saying in the last sentence is that, you know, we must do the data collection and data solutions with a very high level of contextualisation, cross-pollinating all of the ability-oriented facilities, rather than focusing on disability.
>> MAITREYA SHAH: I would like to make a comment. Thank you. I would like to differ with my friend a little bit here. I think the problem with the data collection effort is not the focus on ability or disability. In fact, I feel that this very distinction between ability and disability is a medicalised conception of disability, you know, where we only focus on particular organs, senses, or impairments. We've come a long way, after a lot of advocacy, to kind of do away with that in society.
Especially with governments, which have increasingly been relying on, you know, impairment-based metrics to collect data about people or to categorize People with Disabilities. In India, this is specifically the 2016 law, which recognises a fixed list of disabilities. That is a very narrow way of defining disability. You know, if you look at the UN Convention on the Rights of Persons with Disabilities, it takes a really broader approach. It talks about social barriers, and disability is defined from more of a social model perspective, where you think about how society, you know, poses barriers for People with Disabilities, versus what the ability or disability of a particular individual is. And I think that's why, you know, the data collection efforts have been failing. Officially, People with Disabilities are 2-5% of the population in India, but according to the World Health Organization, 15% of the world's population lives with disability.
If you ask me, I think the number is only going to be higher in the Indian context. We also have high numbers of people who are poor, people who face other forms of marginalisation, including lack of access to health care. So the number of People with Disabilities is going to be higher.
So why have our data collection efforts been failing? We don't want to count all of those who might be, you know, invisible, or who don't identify as a Person with Disability. I think the ability distinction is very medicalised. I think we need to take the UN approach, a more social model approach, and think about universal access issues when we think about disability and data collection. Thank you.
>> FAWAZ SHAHEEN: All right. Thank you.
(Echoing)
>> FAWAZ SHAHEEN: That's most of it. We can continue the engagement.
>> OSAMA MANZAR: We can have this conversation; I'm not in disagreement. Maybe it is that different people are dealing with different data, laws, and frameworks, and I don't know how to resolve it. I agree that there's a perception of looking at disability from a very medical perspective. That's the reason why I mentioned that most of our activities are related to not looking at it that way, but looking at it alongside several other, you know, abilities, and taking a positive stance on that, to make it work better.
>> FAWAZ SHAHEEN: Sure. This is an interesting conversation, but we're down to the last 20 minutes. We have one more question each for Maitreya and Eleni. I would like to remind all of the participants who just joined us that we have a document of suggested best practices for an inclusive Internet. It is a collaborative document.
Before leaving, you can get access from my colleague standing here. We will continue to take suggestions until the end of the day, and we will try to get some of the suggestions included in the call to action.
Now, without further ado, Maitreya, I would like to put the question to you. This is a conversation that needs to include Persons with Disabilities much more. In the last answer, actually, we were talking about how most automated systems fail here. Some of that, we know, is because of, as we said, the corruption of the data set itself and the fact that it doesn't account for the diversity that is there among Persons with Disabilities. So I guess the question right now is: given that context, how do you think we can ask for accountability from automated decision-making systems, especially from the perspective of Persons with Disabilities?
>> MAITREYA SHAH: Thank you so much. That's a good question.
I would like to break down my answer based on the two kinds of broader technologies that I see in the market. One is quote unquote technologies designed for People with Disabilities; the other is mainstream technologies.
The first category, when we think about automated and assistive technologies, is assistive technologies particularly that are designed for People with Disabilities. When we think about accountability here, my first argument is that there is a lot of hype and a lot of these broader, you know, unchecked deployments of AI-powered assistive technologies. Everyone these days, I feel, is coming out with an AI-powered assistive technology in the market, without understanding what the implications would be for people with disabilities. Take caregiving support: people with certain types of disabilities might, based on the severity of the disability, require caregiver support. A lot of robots have been adopted across the world on the claim that these technologies will be transformative for people, without realising that you don't need the technologies. You don't need a robot.
There are many other issues. There are assistive technologies that claim to fix certain disabilities, like autism. That is deeply problematic, because disability is not an ailment or disease that needs to be fixed, or an abnormality that needs to be fixed. There are many AI-powered technologies currently in the market that claim to quote unquote cure disabilities. So we have to ask if we need certain technologies for People with Disabilities at all.
And if we need them, you know, who decides what is good for People with Disabilities? Should it be a certain corporation sitting somewhere in the U.S., run by non-disabled people, deciding whether a technology would be good or bad for a person with disability? Or should it be persons with disabilities thinking about these technologies themselves?
The second broader point is around mainstream technologies. There is so much AI and so much automation around us, and People with Disabilities are very marginalised in that conversation today. There's very little research on how these systems impact People with Disabilities and how fairness metrics can cater to disability. A lot of my recent research at Harvard has been on how AI fairness metrics and governance policies both effectively exclude disability.
To give you an example, a lot of LLMs, like ChatGPT, are trained not to discriminate against people of colour and minorities. In my work, I've illustrated how discrimination nonetheless manifests in ChatGPT and models like Gemini for People with Disabilities. I think there are several issues with the data. You know, this is a long-standing issue: we don't have enough data on People with Disabilities, and the data that we have is often very biased. That's the problem with data.
The other problem is that People with Disabilities are usually never considered. They are not consulted about technologies that are being designed or deployed; People with Disabilities are usually never at the table. Workforce participation of People with Disabilities, especially in the technology sector, is low. So there is a representation problem.
And the third is governance policies. They also do not adequately consider disability. When you think about how technology should be checked for bias and so on, disability is not considered. That's something that you can see in other jurisdictions as well.
The European AI Act, for example, does not consider how it might affect People with Disabilities; it puts many of these technologies in the low-risk category. There will be more such technologies in the future. So there are many issues with AI conversations when it comes to disability. To end, I would put this into three broad themes: one, the lack of representation of People with Disabilities in the conversation; two, a lot of false optimism and a lot of hype around AI-powered assistive technologies, without understanding whether they are even effective at what they are seeking to do, and what the intent of the companies manufacturing these technologies is.
And the third is the marginalisation of disability across the different stages of the technology life cycle, right from design to deployment and then to governance. This is broadly my understanding of the larger landscape.
>> FAWAZ SHAHEEN: Thank you. That's very detailed. I know you were trying to compress it into very little time. Thank you so much for doing that. We have ten minutes now.
Very quickly, Eleni, if we could go to you for three or four minutes. UNESCO has done so much work on open and distance learning.
From your own experience, could you talk a little bit about the challenges of automated decision making, especially in education, and some guidelines, safeguards, and principles that we should keep in mind when approaching AI systems that impact persons with disabilities? Of course, when I say AI systems, I mean AI systems that are absolutely necessary, as Maitreya would say, not just any systems built because we want to build them: systems that are necessary and going to impact persons with disabilities.
Eleni, if you could respond to that. Three or four minutes.
>> ELENI BOURSINOU: Thank you. Regarding the work that UNESCO has been doing: we see that learners with disabilities face a lot of challenges in online education, in both content and platforms, and this has to be recognised. There are tools that remain inaccessible or incompatible with assistive technologies, leaving learners with disabilities excluded. These barriers are systemic, and the lack of training for educators in universal design for learning further exacerbates the issues. Automated decision-making systems in education pose even more risks for persons with disabilities, because biases in the algorithms often result in discriminatory outcomes, such as biased admissions decisions, assessments, or resource allocation. Transparency and accountability remain significant challenges: ADM systems often make unjust decisions, and they fail to consider the diverse needs of Persons with Disabilities. What we are trying to do, to address all of these challenges, is to work with the Member States on guidelines and implementation policies for ethical AI development in education, and on guidelines that include not only Persons with Disabilities. As said in the comments just now, the definition has to include all marginalised communities and vulnerable groups, including autistic people and people with dyslexia and similar learning difficulties.
I'm going to share in the chat all of the links and some documents that UNESCO has been working on. I also want to end with the example that we had with an education board. We worked with the education board and the Light for the World NGO to foster skills development with teachers who have disabilities. We are sure it is really, really important to have teachers with disabilities represented within the community and the whole ecosystem. We had a very, very interesting case study based on that. They provided the most constructive feedback on the guidelines, and we actually enriched the guidelines for governments based on their feedback. That's all from me. I will share all of the links in the chat.
>> FAWAZ SHAHEEN: Yes. Thank you. We have observation from participants. I think we can ‑‑ yeah. We'll just give the mic to you, sir.
>> AUDIENCE: Thank you. I'm Dr. Al‑Ammar. I work with accessibility and disability. I'm interested by the discussion. I'm sorry that I joined a little bit late due to another session that I was speaking at.
As the coordinator of the Dynamic Coalition on Accessibility and Disability, I would just want to flag that we have accessibility guidelines, and the coalition is one of the oldest setups within the IGF system. It has been advising the IGF on organising accessible meetings for People with Disabilities. Our guidelines on accessibility are out there, and participants can go and get them. If someone needs a Braille copy, that's available as well.
Secondly, I would agree with the speaker, I think Maitreya, pardon me if I'm pronouncing the name wrong: there are deceptive practices within these data systems, particularly when we talk about AI and algorithm-based systems. Many of the things have already been said. One concern that I have as a person with disability: we all use a number of AI-based systems, and we all know that data, the oil of AI, is being used to train these systems. As persons with disabilities, we sometimes use these applications and systems for many of our personal documents, personal work, even transport. So a lot of private data of Persons with Disabilities is fed into these applications and systems. The same is the case for people who are deaf or hard of hearing, who use these technologies for interpretation and translation, whether sign language interpretation or otherwise in conversations. A lot of data is going into these applications, and we don't know what happens to it, because of the privacy policies. To my understanding, except perhaps for Be My Eyes and some other applications, many of the applications have not updated their privacy policies. So these also need to be looked at from a disability-friendly perspective.
One last point, which does not need a response; I want to leave this with you as an afterthought. There's a disability definition in the CRPD: people have impairments, and disability occurs when the impairment interacts with societal barriers. If those barriers are removed, disabilities are removed. This definition is broad and includes people with learning disabilities, autism, and so on. But my friends from India will understand that there is a lot of abuse of this definition, with people posing as Persons with Disabilities to get advantages meant for Persons with Disabilities. So on one hand, while I agree that we need an encompassing definition, impersonators should be kept out of the facilities, so that People with Disabilities are actually facilitated. Some impersonators come and get advantages in the name of persons with disabilities. Thank you so much.
>> FAWAZ SHAHEEN: Thank you.
There's more scope for conversation, but unfortunately we're out of time. Thank you for joining us, even for a little bit, professor. We're still here.
If anyone wants to chat or have a conversation, the document is still up. You can comment on it and leave your suggestions. I would like to particularly thank our onsite speaker for joining us. I would like to thank Maitreya for joining us from such a different time zone and bringing your insights. I'm sure we have a lot to learn from you.
I encourage everyone to check out Maitreya's page online; there are very interesting articles. I was recently reading one on how the AI conversation needs to include the voices of people with disabilities. So go and check those out. Also, thank you so much for joining us and sharing your valuable insights. Thank you, everyone, for taking the time. Let's keep the conversation going. See you around. Thank you.
>> ANGELINA DASH: Thanks, everyone.
>> TITHI NEOGI: Thank you, everyone.