This company used AI for its website and ended up in court
Automating the work of complying with these guidelines could make the web more welcoming. But more than 600 accessibility experts have signed a document asking website operators not to use such automation tools, including AccessiBe. Signatories include contributors to W3C guidelines and employees of Microsoft, Apple, and Google. “Automated detection and repair of accessibility issues is not reliable enough to bring a site into compliance,” the document says, accusing some vendors of “deceptive marketing.”
The document was started by Karl Groves, founder of accessibility consultancy Tenon.io, who provided a 35-page analysis of AccessiBe’s software for Murphy’s lawsuit against Eyebobs. Groves said he studied a total of about 1,000 pages from 50 websites using the startup’s technology and found a median of 2,300 violations of W3C guidelines per site. Groves says this is a significant undercount, because most guidelines can only be verified by expert manual analysis. “Artificial intelligence doesn’t work like that yet,” he says.
In his report on AccessiBe, Groves cited an image of a model wearing a white dress for sale on an e-commerce site. The alternative text provided, apparently generated by AccessiBe’s technology, was “Grass nature and summer.” In other cases, he reported, AccessiBe failed to correctly add labels to forms and buttons.
On the home page of its website, AccessiBe promises “automated web accessibility.” But disclaimers warn customers that its machine learning technology may not accurately interpret web page functionality if it “has not encountered these elements sufficiently before.”
AccessiBe community relations manager Joshua Basile, who is paralyzed below the shoulders, says that since joining the company earlier this year he has become more engaged with disability advocacy groups and has clarified that the company offers “manual remediation” alongside automatic fixes. “It’s an ever-evolving technology and we’re getting better and better,” he says.
In a statement, AccessiBe marketing manager Gil Magen said the company had analyzed Eyebobs’ website and found it to meet accessibility standards. AccessiBe offered its customer assistance with the dispute, but Eyebobs declined, according to the statement.
In its own statement, Eyebobs said it “no longer works with AccessiBe and will not do so in the future.”
Although the Eyebobs settlement, which is due to be finalized next year, does not include an admission that the company’s site had problems, it does require the company to pay for an audit by an outside expert and to dedicate one or more employees to working on accessibility. “Eyebobs is committed to ADA compliance and to supporting all visitors who visit our website,” said Chief Marketing Officer Megan McMoInau.
Haben Girma, a deafblind disability rights lawyer, says she hopes the Eyebobs lawsuit will discourage companies from using AccessiBe or similar tools. She thinks tech companies and regulators such as the U.S. Federal Trade Commission should take action against inaccurate marketing of accessibility tools. “Governments, Google, and social media companies can stop the spread of misinformation,” she says.
Experts who criticize automated accessibility tools usually don’t claim that the technology is completely worthless. Rather, they say that placing too much trust in the software is likely to cause harm.
A 2018 paper by W3C staff praised the potential of using AI to help people with poor vision or other needs, but also cautioned about its limitations. It pointed to a Facebook project that uses machine learning to generate text descriptions for images posted by users as an example. The system won an award from the American Foundation for the Blind in 2017, but its descriptions can be difficult to interpret. Sassy Outwater-Wright, director of the Massachusetts Association for the Blind and Visually Impaired, noticed that the system sometimes shows a preoccupation with body parts – “two people standing, beard, feet, exterior, water” – a quirk she has dubbed “the beard dilemma.”