The Harvard Gazette: AI fairness conversations must include disabled people. Comment by canisgallicus.com. Having sustained a TBI with amnesia and other difficulties (bipolar disorder, anxiety, aphantasia), I regard my exposure to computers, my access to education at Trinity College Dublin, and my life since as having been made possible at every stage by the arrival of computer technology, including the internet and social media such as X. X is my prompt for the day, and email too. I even managed to write a book, “Fortune Favours the Brave”, about my journey through breast cancer complicated by memory deficits.

AI fairness conversations must include disabled people

Naomi Saphra, Lawrence Weru, Maitreya Shah. Photos by Jon Chase and Kris Snibbe/Harvard Staff Photographers and courtesy of Maitreya Shah; photo illustration by Judy Blomquist/Harvard Staff

Eileen O’Grady

Harvard Staff Writer

April 3, 2024 7 min read

Tech offers promise to help yet too often perpetuates ableism, say researchers. It doesn’t have to be this way.

Third in a four-part series on non-apparent disabilities.

AI researcher Naomi Saphra faced “a programmer’s worst nightmare” in 2015. After a decade of coding and just as she was about to start a Ph.D. program in Scotland, neuropathy in her hands rendered typing too painful.

Seeking a solution, Saphra turned to the very technology she was studying. She began the long process of teaching herself how to code using voice-to-text dictation technologies. Today, a system called Talon, which Saphra has heavily customized to complete specific tasks, allows her to code and to write papers for her research on language models.
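Talon's customizations are written partly in Python, which is what makes the kind of heavy personalization Saphra describes possible. A rough sketch of what a user-defined command can look like (the action name and the snippet it types are invented for illustration; the script assumes Talon's documented Module/actions user-scripting interface and only runs inside Talon itself):

```python
# python_snippets.py -- a hypothetical Talon user script
from talon import Module, actions

mod = Module()

@mod.action_class
class Actions:
    def insert_function_skeleton(name: str):
        """Type a Python function skeleton for the spoken name."""
        actions.insert(f"def {name}():\n    pass")

# A companion .talon file would bind a spoken phrase to this action, e.g.:
#   define function <user.text>: user.insert_function_skeleton(text)
```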

“I rely on it completely,” said the research fellow at the Kempner Institute for the Study of Natural and Artificial Intelligence. “I would absolutely not have a career if we didn’t have AI for speech-to-text. These days it’s pretty hard to exist in the world if you’re not able to use a computer much. And as things have advanced, it’s been very important that the word error rate has gotten lower over the years.”
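The “word error rate” Saphra mentions is the standard accuracy metric for speech recognition: the number of word substitutions, deletions, and insertions needed to turn the system's output into the correct transcript, divided by the length of the correct transcript. A minimal sketch of that standard definition in Python (not from the article; the example sentences are invented):

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance divided by reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edits needed to turn the first i reference words
    # into the first j hypothesis words
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i  # i deletions
    for j in range(len(hyp) + 1):
        dp[0][j] = j  # j insertions
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # delete a reference word
                           dp[i][j - 1] + 1,        # insert a stray word
                           dp[i - 1][j - 1] + sub)  # substitute (or match)
    return dp[len(ref)][len(hyp)] / max(len(ref), 1)

# One misrecognized word out of five: WER = 0.2, i.e. 20 percent.
print(word_error_rate("insert a for loop here", "insert a four loop here"))
```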

AI technology can be a powerful assistive tool for people like Saphra with non-apparent disabilities — physical or mental conditions that are not immediately obvious to others. But disability advocates say these tools have a long way to go to become truly accessible. Experts say including disabled people in conversations on AI fairness and the development process is key.

“If we’re creating tools that we know are fed with information that can bias against certain groups, and we integrate those into very crucial aspects of our lives, what’s going to be the impact of that?” asks Lawrence Weru, an associate in biomedical informatics at Harvard Medical School. Kris Snibbe/Harvard Staff Photographer

Lawrence Weru, an associate in biomedical informatics at Harvard Medical School, was initially excited when voice-activated AI tools such as Siri and Alexa were released in the early 2010s. As someone who learned to code from a young age on public library computers before personal computers were common, he has long been fascinated by advances in digital technology. But Weru, who has a stutter, quickly found voice-activated technology more frustrating than helpful. When asking Siri for directions, the digital assistant would not understand the question if he stuttered.

While this was hardly a new experience — before AI, Weru remembers the frustration of trying to contact his bank and not being able to get past the automated phone system — it was disappointing to realize that the AI likely had not been trained on data from people with disabilities like his.

“People create things and people always have a vision in mind of who is going to be using their thing,” Weru said. “Sometimes not everybody is included in those personas.”

His experience with Siri makes Weru concerned about the future of voice-activated AI technology, envisioning a world in which critical tasks — making doctor appointments, applying for jobs, accessing education — are powered not by humans, but by technologies that can’t be used by everyone.

“If we’re creating tools that we know are fed with information that can bias against certain groups, and we integrate those into very crucial aspects of our lives, what’s going to be the impact of that?” Weru said. “That’s a concern that I hope people would be having enough foresight to try to address in advance, but historically accessibility is usually something that’s treated as an afterthought.”

“I would absolutely not have a career if we didn’t have AI for speech-to-text,” said Naomi Saphra, a research fellow at the Kempner Institute. Jon Chase/Harvard Staff Photographer

Maitreya Shah, a fellow at the Berkman Klein Center for Internet & Society, recently launched a research project analyzing different schools of thought on “AI fairness,” or movements seeking to mitigate AI bias against people in marginalized groups. Shah, a blind lawyer and researcher, wants to go beyond conversations about accessibility and examine what he believes is the root of the issue: People with disabilities are not being included in conversations about AI, even in conversations about AI fairness.

“A lot of research so far has focused on how AI technologies discriminate against people with disabilities, how algorithms harm people with disabilities,” Shah said. “My aim for this project is to talk about how even the conversation on AI fairness, which was purportedly commenced to fix AI systems and to mitigate harms, also does not adequately account for the rights, challenges, and lived experiences of people with disabilities.”

For his research, he’s interviewing scholars who have studied the issue and evaluating AI-fairness frameworks proposed by governments and the AI industry.

Shah said developers often consider disability data to be “outlier data,” or data that differs greatly from the overall pattern and is sometimes excluded. But even when it is included, some disabilities, particularly non-apparent ones, are overlooked more than others. If an AI is trained on a narrow “definition” of disability (for example, if data from people who stutter is not used to train a voice-activated tool), the result is a tool that is not accessible.

“There is a paradox,” Shah said. “If you don’t incorporate disability data, your algorithms would be open to discriminating against people with disabilities because they don’t fit the normative ideas of your algorithms. If you incorporate the data, a lot of people with disabilities would still be missed out because inherently, the way you incorporate datasets, you divide data on the axes of identity.”
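To make the “outlier data” point concrete: a routine preprocessing filter can silently discard exactly the samples that represent disabled users before a model ever sees them. A hypothetical illustration (the numbers and the two-standard-deviation threshold are invented), where the longest utterance might come from a speaker who stutters or pauses:

```python
import statistics

# Hypothetical durations (seconds) of the same phrase spoken by different people.
durations = [2.1, 2.3, 1.9, 2.0, 2.2, 2.4, 2.1, 2.2, 2.0, 1.8, 2.3, 7.5]

mean = statistics.mean(durations)
stdev = statistics.stdev(durations)

# A common "outlier removal" step: keep samples within two standard deviations.
kept = [d for d in durations if abs(d - mean) / stdev <= 2.0]
dropped = [d for d in durations if abs(d - mean) / stdev > 2.0]

print("kept:   ", kept)
print("dropped:", dropped)  # the atypical speaker disappears from the training set
```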

“Do people with autism or other disabilities even want these technologies? No one asks them.”

Maitreya Shah

In his own life, Shah uses some AI technologies as assistive tools including “Be My AI,” which describes images, and “Seeing AI,” which provides users with visual information such as text, color, light, and scenery. Blind people were very involved in the development and testing process for both those tools.

But Shah said too often people with disabilities are not included in the high-level decision-making and development processes for AI that is purported to benefit them. He cited, as an example, technology designed to diagnose autism or address learning disabilities.

“The question is: Do people with autism or other disabilities even want these technologies? No one asks them,” Shah said.

In his research, Shah proposes adopting perspectives from disability justice principles, such as participation.

“Let people with disabilities participate in the development and the deployment of technologies. Let them decide what is good for them, let them decide how they want to define or shape their own identities,” he said.

Saphra agrees, which is why she believes any developer creating assistive AI should make it easily customizable, not just by AI experts or coders, but by people who may not be tech experts. That way, users can set up the system to perform specific, essential tasks like Saphra did for writing code.

“It’s very important to make sure that everything you release is hackable, everything is open-source, and that everything has an accessible interface to start with,” Saphra said. “These things are going to make it more useful for the greatest number of disabled people.”
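One way to read “hackable” with “an accessible interface” in practice is to keep customization in plain files a non-programmer can edit, rather than buried in code. A hypothetical sketch (the file format and names are invented, not drawn from Talon or the article) in which a dictation tool loads its command vocabulary from a simple text file:

```python
from pathlib import Path

def load_commands(path: str) -> dict[str, str]:
    """Read 'spoken phrase :: text to insert' pairs from a plain-text file."""
    commands = {}
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        phrase, _, snippet = line.partition("::")
        commands[phrase.strip()] = snippet.strip()
    return commands

# commands.txt might contain:
#   arrow      :: ->
#   print line :: print()
# so users extend their dictation vocabulary by editing text, not Python.
```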



About michelleclarke2015

Life event that changes all: a horse riding accident in Zimbabwe in 1993, a fractured skull et al., including bipolar disorder, anxiety, chronic fatigue and co-morbidities (Nietzsche: 'He who has the reason why can deal with any how' details my health history from 1993 to date). 17th August 2017: operation for breast cancer (no indications, just an appointment that came from BreastCheck through the post). Trinity College Dublin, Business Economics and Social Studies (but no degree), 1997-2003; UCD (1997/1998 night classes): essays, projects, writings. Trinity Horizon Programme 1997/98 (Centre for Women's Studies, Trinity College Dublin/St. Patrick's Foundation (Professor McKeon), EU Horizon funded): a research study of 15 women over a 9-month period diagnosed with depression and their reintegration into society, with special emphasis on work, arts and further education (I was one of this group, and it became the cornerstone of my journey to now, 2017); notes from my time at the Trinity Horizon Project 1997/98; articles written for Irishhealth.com 2003/2004; St Patrick's Foundation monthly lecture notes for a specific period in time; a selection of poetry, including poems written by people I know; quotations 1998-2017; other writings, mainly with a theme of social justice, under the heading Citizen Journalism Ireland; letters written to friends about life in Zimbabwe; family history, including Michael Comyn KC, my grandfather, and my grandmother's family, the O'Donnellan ffrench Blake-Forsters. Moral wrong: an acrimonious divorce, but the real injustice was the Catholic Church granting an annulment; you can read it and make your own judgment, I have mine. Topics I have written about include the annual Brain Awareness Week, the Mashonaland Irish Association in Zimbabwe, suicide (a life sentence for those left behind), and nostalgia: Tara Hill, Co. Meath.
