Chipmaker Nvidia has launched Signs, an AI-powered platform designed to help users learn sign language. The tool supports American Sign Language (ASL) learning and accessible AI development.
According to Quartz, the initiative supports learners of all ages with ASL, the third most prevalent language in the US, used by the deaf community. The platform is free to access and features a 3D avatar that demonstrates signs to users.
Nvidia wants to expand the platform’s sign library
Developed in collaboration with the American Society for Deaf Children and creative agency Hello Monday, the web-based platform is meant to support ASL learning. According to Quartz, the platform will also offer a dataset validated by fluent ASL users and interpreters, which will enable developers to build more accessible AI applications.
The tool has a library of ASL signs for learners to build their vocabulary, along with a 3D avatar teacher. Signs also lets learners get real-time feedback on their signing via an AI tool that analyses webcam footage.
According to Quartz, Signs initially includes 100 signs and focuses on hand movements and finger positions. Users can also learn the meanings of facial expressions and head movements, which are essential components of sign language.
Nvidia indicated that it wants to grow the Signs library to 400,000 video clips covering 1,000 signed words. The open-sourced Signs dataset allows users to contribute by submitting their own videos. Nvidia is also planning to make the dataset publicly available for building accessible AI agents, video conferencing features, and other AI tools.
“Most deaf children are born to hearing parents. Giving family members accessible tools like Signs to start learning ASL early enables them to open an effective communication channel with children as young as six to eight months old,” Cheri Dowling, executive director of the American Society for Deaf Children, said in a statement.
“And knowing that professional ASL teachers have validated all the vocabulary on the platform, users can be confident in what they’re learning,” Dowling added.
Nvidia is currently working with Rochester Institute of Technology’s Center for Accessibility and Inclusion Research to enhance the AI-powered platform.
Nvidia is working on adding slang to the Signs library
The company is also looking at including regional and slang terms in the Signs library. Nvidia indicated that the dataset is expected to be released later this year, while the ASL learning service is already live.
Nvidia’s manager of trustworthy AI product, Michael Boone, told CNN about the company’s commitment to developing products for both corporate and individual customers.
“It’s important for us to produce efforts like Signs, because we want to enable not just one company or set of companies, but we want to enable the ecosystem,” Boone said.
Meanwhile, the chipmaking giant is preparing for its annual GPU Technology Conference in March, where attendees will also be able to use sign language. Nvidia is also expected to report its fourth-quarter earnings next week.
Investors are keeping a close watch on Nvidia’s upcoming Q4 earnings report, which could drive further movement in the stock.
John Vinh, an analyst at KeyBanc, recently raised his price target on Nvidia from $180 to $190, reflecting confidence in the company’s growth.
Analysts are hopeful that Nvidia will outperform consensus estimates in its earnings report, projecting $38.2 billion in revenue and an EPS of $0.85. This anticipated strong performance stems from increased capital spending by tech giants such as Amazon and Google, which are investing heavily in Nvidia’s hardware and data processing units.
The ongoing investment could provide Nvidia with sustained revenue growth throughout 2025.