Blanquivioletas EN

Confirmed – artificial intelligence doesn’t understand sign language… and the errors are already worrisome

This is one more example of technology ignoring the community it is meant to serve

by Andrea C
July 15, 2025
Artificial intelligence is everywhere, and many companies have run with it without realizing that, although the technology is impressive, there is still a lot it cannot do. The technology is still in its infancy, and despite growing by leaps and bounds in just a few months, complicated tasks like sign language interpretation remain well beyond its capabilities. That is what China is finding out the hard way, having tried to use it to interpret for its Deaf population with less than great results.

Zheng Xuan, a professor at Beijing Normal University's Faculty of Education, is one of the people who has realized that these efforts are not going all that well. China has around 20.5 million people with hearing disabilities, and while the government is trying to accommodate them with avatars and virtual presenters that provide real-time translation of some programs, the translations are subpar and often inaccurate, to say the least.

It seems like most of the problems started at the 2022 Beijing Winter Olympic Games, which were fully translated into sign language by Artificial Intelligence. While the world looked on impressed, Zheng studied the translations and was not as thrilled with the result.

“We transcribed and back-translated the sign language created by the avatars, then compared the results with the original audio, finding that a significant amount of key information was lost or distorted in the AI-generated version,” she wrote. “On closer inspection, the movements of the avatars differed considerably from everyday sign language in terms of hand shape, position, direction, and movement. Other issues were even more prominent — the avatars’ facial expressions and body language were off, and their mouth movements were distorted.”
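The back-translation check Zheng describes can be pictured as follows. The texts and the word-overlap metric below are illustrative assumptions for the sake of the sketch, not her actual data or method:

```python
# Hypothetical sketch of a back-translation check: compare the original
# audio transcript against a transcript recovered by back-translating the
# avatar's signing, and estimate how much key information survived.
# The sentences and the overlap metric are illustrative assumptions.

def content_words(text: str) -> set:
    """Lowercase the text and keep its unique words as a rough content set."""
    return set(text.lower().split())

def retention(original: str, back_translated: str) -> float:
    """Fraction of the original's words that survive back-translation."""
    orig = content_words(original)
    recovered = content_words(back_translated)
    return len(orig & recovered) / len(orig) if orig else 0.0

original = "heavy snow will close the mountain road tonight"
back_translated = "snow close road"  # what a viewer recovered from the avatar

# A low score flags the kind of lost or distorted key information
# Zheng reports finding in the AI-generated signing.
print(f"retention: {retention(original, back_translated):.2f}")
```

A real evaluation would of course involve Deaf transcribers and far more nuanced comparison than word overlap, but the principle is the same: information that does not survive the round trip was never communicated.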

Although it looked good to those not versed in the language, viewers reported “they generally couldn’t understand the avatars’ movements and noted that they seemed to have a limited vocabulary, while struggling to handle words with multiple meanings.”

Why Artificial Intelligence failed at using Chinese sign language

Well, that might be a bit complex to explain. As with spoken Chinese, sign language is quite complex and, as Zheng explains, “Chinese words cannot be found for the meanings expressed by 50 percent of gestures in Chinese sign language.” Developers overlooked “the difference between signed and spoken language. In particular, many perceive sign language as an accessory to spoken language, or else believe that translating between the two is similar to translating between two spoken languages. But the modalities of spoken language and sign language are quite different. The former is an oral-auditory language, while the latter is a visual-gestural or visual-spatial language. The term ‘gestural’ is a relatively broad concept that includes not just hand movements, but also facial expressions and body language. Full utilization of the body in space allows sign language users to express the meaning of an entire sentence – such as ‘a person walks into a room’ – with just one action.”

And there is one more barrier that was not accounted for: Chinese sign language includes "'natural sign language,' which is the quick everyday version used by the Deaf community, as well as 'signed Chinese language,' which is an expression of Chinese characters using signs." Both are complex and used in tandem, creating dialects and nuances that artificial intelligence developers could not hope to get right.

This is especially outrageous given that Chinese technology companies “have not involved sign language linguists or Deaf people in any great depth. Even in the cases where sign language teachers or interpreters are included, developers often only slot them into supporting roles, instead of taking the opinions of Deaf users as the final arbiter of their products’ effectiveness.”

Zheng even worked as a consultant and was sorely disappointed by the lack of preparation: "They seemed to underestimate the difficulty involved, overestimated the power of tech to solve problems, and lacked the necessary experience, resources, and ability to judge the quality of work done by third-party companies. By the time I joined the project, these shortcomings had already become apparent. Although the development team welcomed my participation, I felt that this respect was more for my technical knowledge as a university professor rather than my identity as a Deaf person."

And when she pointed out the flaws and shortcomings of the product for the Deaf community, she was not taken seriously. She felt that “my feedback was not fully embraced, as the developers seemed unable to fully empathize with my frustrations.” And there are “fundamental issues with the way tech approaches the problem of sign language translation” because “Tech companies are used to first launching a version that has a lot of bugs, then optimizing it through a large amount of user feedback.”

She is even fearful that the community will reject these products because of how dismissive they are towards the Deaf experience: "That's not to mention the fact that some companies mislead users by promoting their products using real humans rather than avatars, and then release an immature generative AI version. Techno-optimists may believe that these flaws will all be solved with time, but we shouldn't ignore the irreversible ethical harm: If the real needs of Deaf users are not responded to, they'll feel that they're being treated as guinea pigs."

  • Privacy Policy & Cookies
  • Legal Notice

© 2025 Blanquivioletas
