NEW YORK -- Imagine having access to a language tutor available at any hour, ready to teach you new vocabulary and monitor your progress. On Thursday, Nvidia unveiled its innovative language learning platform, Signs, which utilizes artificial intelligence to assist learners of American Sign Language (ASL). Developed in collaboration with the American Society for Deaf Children and creative agency Hello Monday, this platform aims to bridge communication gaps within the deaf community.
Signs features a 3D avatar, providing real-time demonstrations of signs to users. With their video cameras activated, learners practice the signs, receiving instant feedback from the AI tool. Initially, Signs includes 100 distinct ASL signs; Nvidia envisions increasing this number to 1,000.
This initiative is one example of how AI is advancing assistive technology for disabled and elderly people and their caregivers. Major tech companies, including Meta, Google, and OpenAI, have likewise worked to improve technologies for people who are blind or have low vision, and Apple has introduced AI-enabled eye tracking to help physically disabled users navigate iPhones. These advancements are already helping blind individuals move through work and daily life more easily.
According to the organizations involved, ASL ranks as the third most prevalent language within the United States, trailing behind English and Spanish. The launch of Signs also signifies Nvidia's efforts to extend beyond its core business of AI hardware; the company has established itself as a key supplier to the AI industry through the chips utilized by many companies to operate their technologies.
Michael Boone, Nvidia's manager for trustworthy AI products, shared his insights on the endeavor: “It's important for us to produce efforts like Signs, because we want to enable not just one company or a set of companies, but we want to enable the ecosystem.” This commitment reflects Nvidia’s ambition to create practical applications for AI technology, reaching beyond corporate clients.
Signs is free to users and encourages ASL speakers to contribute video clips demonstrating signs not yet captured on the platform. This user-generated content could enable Nvidia to develop future ASL-related products, such as improved sign recognition for video conferencing tools and gesture controls in automobiles.
Nvidia plans to give other developers access to its growing data repository to enhance ASL learning and integration across various sectors. The team is also exploring whether future versions of Signs can incorporate non-manual signals, including facial expressions and head movements, which are key components of ASL, as well as slang terms and regional variations, to enrich the learning experience.
Cheri Dowling, Executive Director of the American Society for Deaf Children, noted the significance of accessible tools like Signs for families: “Most deaf children are born to hearing parents. Giving family members accessible tools like Signs to start learning ASL early enables them to open an effective communication channel with children as young as six to eight months old.” Her remarks highlight the platform's role as not only educational but also foundational for families seeking to engage with their deaf children early.
Signs gives students of any skill level resources for building their ASL vocabulary. The platform currently offers 100 signs that can be looked up and practiced on demand. Dowling emphasized its utility for families, stating, “The Signs learning platform could help families with deaf children quickly search for a specific word and see how to make the corresponding sign. It’s a tool to support their everyday use of ASL outside of more formal classes.”
While initially focused on hand movements and finger positions, the team behind Signs acknowledges the complexity of conveying meaning through non-manual signals within ASL. They are exploring ways to accurately track and integrate these additional elements, and they are collaborating with researchers from the Rochester Institute of Technology’s Center for Accessibility and Inclusion Research to improve the experience for deaf and hard-of-hearing users.
“Improving ASL accessibility is an ongoing effort,” said Anders Jessen, Founding Partner of Hello Monday, the agency that built the Signs web platform. “Signs can serve the need for advanced AI tools meant to transcend communication boundaries” between deaf and hearing populations, he added.
Nvidia plans to release the full dataset later this year; in the meantime, those interested can begin learning or contributing at signs-ai.com. Attendees of the upcoming Nvidia GTC conference, scheduled for March 17-21, 2025, will have the chance to try the platform live, another step in bridging technology and accessibility.