NEW YORK — Imagine a language tutor who is always on, available anytime to teach new vocabulary or check up on a student’s progress.
Nvidia recently introduced an AI-powered platform to help people learn American Sign Language, developed in collaboration with the American Society for Deaf Children and creative agency Hello Monday.
The platform, called Signs, uses a 3-D avatar to demonstrate signs. Users practice with their video cameras turned on, and an AI tool gives them feedback as they sign. Signs currently offers 100 signs, a vocabulary Nvidia aims to expand to 1,000.
Signs is just one example of how AI is advancing assistive technologies, tools designed to help people with disabilities, older adults and their caregivers. Meta, Google and OpenAI have used AI to improve features for people with visual impairments, and Apple has introduced AI-powered eye tracking to help physically disabled users navigate their iPhones. Blind users say such advances have already begun to simplify their daily lives and work.
American Sign Language is the third most prevalent language in the United States, behind English and Spanish, according to the groups behind Signs.
The ASL-learning platform is also a reminder that Nvidia has been trying to branch out beyond the hardware behind AI. The company has become a major supplier to the AI industry by building the chips most companies use to run the technology, alongside its own AI models and software platforms. Its stock has soared more than 100% in the past year, lifting its valuation above $3.4 trillion, as AI companies betting on the technology's promise buy up vast quantities of its chips.
Michael Boone, Nvidia’s manager for trustworthy AI product, said the company is committed to building AI products not just for corporate customers, but also to foster practical applications for the technology. “It’s important for us to produce efforts like Signs, because we want to enable not just one company or a set of companies, but we want to enable the ecosystem,” Boone said in an interview with CNN.
And, ultimately, more people using AI in any form is good for Nvidia’s core chip-making business. Some investors have in recent months raised concerns about whether tech giants have been overspending on AI infrastructure, including chips, and questioned how long it might take to earn a return on that investment.
Signs is free to use, and ASL speakers can contribute videos of signs not already on the platform to grow its vocabulary. That data could also enable Nvidia to develop new ASL-related products down the road, such as improved sign recognition in video-conferencing software or gesture controls in cars. The company says it will also make the data repository publicly available for other developers.
And for future iterations of Signs, Nvidia said the team behind the platform is exploring how to include non-manual signals that are crucial to ASL, such as facial expressions and head movements, as well as slang and regional variations in the language.
“Most deaf children are born to hearing parents. Giving family members accessible tools like Signs to start learning ASL early enables them to open an effective communication channel with children as young as six to eight months old,” Cheri Dowling, executive director of the American Society for Deaf Children, said in a statement about the new project.