The title says it: the two inventions that need to happen right away for the deaf community.
We really just want to understand what is being said around us. Imagine if someone who uses American Sign Language as his primary mode of communication were able to press a button on a smartphone or a smartwatch or a smart toothpick, and a holographic interpreter popped into existence?
“Ready for interpretation?” asks the interpreter, who is entirely composed of light.
“Yes,” signs the deaf man, pointing to the hearing person he’d like to communicate with. Commence interpretation.
It would absolutely be a must to customize your personal interpreter's specialized vocabulary to match your own, and its appearance too. Mine will look like a robot from a 1960s sci-fi show.
But it'll be a while before such technology can happen.
You know what I feel can happen today, if Google were motivated? The CC Glasses. We know that they have Google Glass already. We know that CC technology exists. It's imperfect and often unreliable, but imagine if Google were motivated to perfect that technology and load it into Google Glass. I'd love that. Pop on the Google Glass and have it caption everything that is said to me. While it's at it, it can caption the sounds of approaching cars, random dogs, babbling babies, and croaking frogs. Why not? That would be a rich experience. World-changing, honestly.
And it’s within reach. Google only has to do it. Then they can move on to the holographic interpreters, right?
———————
Disclaimer for my extremely valued interpreters, who I think are awesome sauce: It's not that I'd want to replace you guys. You're amazing. You make a tremendous difference. But I can't fit you in my smartphone (yet), so I have to be willing to accept such technological advances. But don't worry, it'll be at least 50 years before this can happen, so by then you'll be old with me and we can play chess at the Seniors' Center. 5pm on Tuesdays.