Conversing with an infant is vital to building their vocabulary and grammar skills, and this is especially true for deaf children.
According to Matt Simon, “What’s interesting about developing infant minds is that natural language, no matter if it’s spoken or signed, stimulates the same areas of the brain.”
To meet this need, researchers at Gallaudet University collaborated with three other universities to build a robot-avatar system. The culmination of robotics and brain-science research, the robot fosters face-to-face interaction with deaf children: it attracts the child’s gaze through its body language, then shifts that gaze to an avatar, which signs a nursery rhyme. The system employs a thermal camera that detects temperature shifts signaling increased awareness, coupled with a face-tracking feature that directs the child’s vision to the avatar while also monitoring engagement. This learning tool is so effective that some children have even signed back to the avatar!
Alongside this work, Johnson, a bioengineering student at Northeastern University, created a robotic arm to help people who are both deaf and blind. Unable to hear or see, they must rely on touch, typically communicating through a human translator. Her robot acts as an alternative translator, producing tactile sign language. Currently, Lard, a deaf-blind individual, is aiding the technology’s development by using it himself and pinpointing areas for improvement.
Right now, Johnson is training the robot to finger-spell simple words. She envisions the robot in settings that require private conversations, such as at the doctor’s office or at home. She hopes it can eventually master American Sign Language and facilitate digital communication by connecting to social media platforms as well. Furthermore, Johnson wants to build customizability into the robot, since sign language varies across regions.
Written by Amanda Y