
Meet the $800 Windows tablet designed to interpret for deaf people

With a limited vocabulary and slow signing speeds, will MotionSavvy's new device catch on?

Cyrus Farivar

SAN FRANCISCO—“My… name… is… Ryan…”

In a world where most electronic devices can talk and an increasing number can listen and answer, an unassuming tablet speaking these words doesn't sound impressive at all.

But this particular tablet wasn't replaying a recording or broadcasting some typed message. Instead, Ryan Hait-Campbell, the CEO of an Alameda-based company called MotionSavvy, signed just inches above the device as it sat flat on a table. Instantly, it interpreted his American Sign Language (ASL) into written and spoken English. The tablet can also listen to speech and convert it into text. As Hait-Campbell's colleague Jordan Stemper looked up and smiled, the "Uni" had impressed again.

For someone who isn't deaf or fluent in ASL, it's hard to fully appreciate what the Uni tablet could mean for many. Still, the potential implications are clear. MotionSavvy's concept earned the team $25,000 from LEAP AXLR8R, an investment competition run by Leap Motion. That money helped MotionSavvy and its Uni, essentially a Dell Venue 8 Pro tablet with an attached Leap Motion sensor wrapped into a single case, relocate to the Bay Area this year after time at the Rochester Institute of Technology's National Technical Institute for the Deaf.
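MotionSavvy hasn't published how its recognition software works, but the hardware implies a simple pipeline: the Leap Motion sensor streams hand-tracking frames, the software condenses each short gesture into a feature vector, and that vector is matched against a dictionary of known signs. The Python sketch below illustrates that idea with an invented three-sign dictionary and a nearest-neighbor match; the feature values, sign names, and threshold are all hypothetical.

```python
import math

# Hypothetical feature vectors (palm height, palm speed, finger spread),
# averaged over a short gesture. A real system would derive richer features
# from the sensor's hand-tracking frames.
SIGN_DICTIONARY = {
    "HELLO": (0.80, 0.40, 0.90),
    "MY":    (0.30, 0.20, 0.10),
    "NAME":  (0.50, 0.30, 0.35),
}

def euclidean(a, b):
    """Distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def recognize(gesture_features, threshold=0.25):
    """Return the closest known sign, or None if nothing is close enough."""
    best_sign, best_dist = None, float("inf")
    for sign, template in SIGN_DICTIONARY.items():
        dist = euclidean(gesture_features, template)
        if dist < best_dist:
            best_sign, best_dist = sign, dist
    return best_sign if best_dist <= threshold else None

if __name__ == "__main__":
    # A gesture that roughly matches the "NAME" template.
    observed = (0.52, 0.28, 0.40)
    print(recognize(observed))  # -> NAME
```

With an initial vocabulary of only about 300 signs, even simple template matching of this sort is plausible, and it hints at why the device asks signers to slow down and why ongoing dictionary updates are part of the product.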

Now the Uni is set to launch, though the tablet will be limited at first. Its initial vocabulary will include about 300 signs, and it requires ASL users to sign at about half their normal speed. Pre-ordered units will sell for $499, with a planned eventual retail price of $799; either way, there will be an additional subscription of $20 per month that covers updated sign dictionaries. MotionSavvy launched an Indiegogo fundraiser today to begin taking pre-orders, with shipments planned for the fall of 2015.

Talking with your hands

In today's tech landscape, how do most deaf people manage to navigate the hearing world around them? “Paper and pen,” Hait-Campbell explained. Often, typing out messages on smartphones is the easiest method.

Jordan Stemper is the company's chief design officer.

But for MotionSavvy in particular, how does a deaf startup manage meetings with potential investors when few, if any, speak ASL? “Interpreters—and it makes our company go broke very fast! But talking to your teacher or your co-worker, you don't expect to have an interpreter. That's where we come in here," Hait-Campbell said.

ASL interpreters typically cost $50 per hour or more. And while there are other solutions to help deaf and hearing people talk to one another, they usually involve some sort of intermediary. Think specialized computers crossed with a human interpreter, like Interpretype or Ubiduo. Those solutions usually cost far more than MotionSavvy’s Uni. And while this is a problem many companies have identified as a target—just last year, Microsoft researchers came up with a similar solution to the Uni based on a Microsoft Kinect—it doesn’t appear to have translated into many viable products yet.

Both Hait-Campbell and his colleague Jordan Stemper, the company's chief design officer, are deaf. Stemper wears a cochlear implant that allows him to hear, while Hait-Campbell decided to stop wearing one a few years ago. In our meeting, for example, Stemper interpreted Ars' spoken English into ASL for Hait-Campbell, who then responded in spoken English. Stemper also signed his own speech so that Hait-Campbell could follow him. Something like the Uni could simplify these communication channels.

“Let's say you go to a meeting with the boss. Your boss can talk back and forth easily [by using the Uni]. Or a family situation—you have an outing and you can bring [the Uni] with you and sign across it. It applies to the same situations,” Hait-Campbell added.

The business realm is of particular interest to MotionSavvy. While there are current tax breaks in the United States to encourage companies to hire deaf employees, there are many instances where deaf employees have a much harder time than their hearing counterparts. Just this past August, a West Virginia truck loader sued his former employer, accusing it of firing him because he is deaf.

“No employer would have to pay for this,” Hait-Campbell explained. “Having this product in a deaf person's hands may make a difference between being a dishwasher or something else. I have a deaf friend who is a dishwasher. I'm deaf, but I'm able to talk. For him, he cannot talk—he can just sign—so he can only get a job as a dishwasher, and that speaks volumes. Most [deaf people] have huge potential and they are not able to live up to it because of their current communication barriers.”

“Will it catch what I really want?”

While the Uni's potential is great, the high price tag could still be off-putting for many with bottom lines in mind. Lislee Egbert, a professor of Deaf Studies at California State University, Sacramento, told Ars by e-mail that while she is excited about the proposition of any deaf-owned business and the use of technology within the community, she's concerned that the purchase price may be out of reach for most users.

"However, it could be a powerful temporary tool that companies might invest in, especially in rural areas," she said. "This device should never take the place of a certified interpreter. That being said, it might allow time to communicate until an [interpreter] could arrive or to order food at the McDonald's drive through. It has potential. I look forward to seeing how it plays out."

One Oakland-based professional dancer who is deaf, Antoine Hunter, told Ars by e-mail that he would be interested in trying it out but wouldn't rush out to buy one.

“First, I support all dreams that fit for anyone’s own life to improve [themselves] and their community,” he said. “[But] it's a prototype. Understanding what the limits are, as every sign is communicated differently, [would be important]. My voice and your voice sound different. Same with signs. Will this understand my meaning behind my sign? Will it catch what I really want? I want ‘popcorn ball,’ but it said ‘popcorn’ and ‘ball.’ We sign things that mean different things for the same signs. This is where face expression and body language are involved. This prototype is focused on hand motions, so it's sticky.”
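Hunter's "popcorn ball" example describes a concrete failure mode: if software looks up each sign in isolation, it can never produce a compound that spans two signs. The toy Python sketch below, using invented sign glosses and a made-up phrase dictionary, shows that even a longest-match pass over the whole sequence recovers the compound only when the dictionary explicitly lists it.

```python
# Toy phrase dictionary mapping sign-gloss sequences to English.
PHRASES = {
    ("POPCORN", "BALL"): "popcorn ball",   # the compound Hunter wants
    ("POPCORN",): "popcorn",
    ("BALL",): "ball",
}

def translate(signs):
    """Greedy longest-match over a sequence of recognized sign glosses."""
    words, i = [], 0
    while i < len(signs):
        # Try the longest remaining slice first, then shrink.
        for j in range(len(signs), i, -1):
            key = tuple(signs[i:j])
            if key in PHRASES:
                words.append(PHRASES[key])
                i = j
                break
        else:
            # Unknown sign: pass the gloss through unchanged.
            words.append(signs[i].lower())
            i += 1
    return " ".join(words)

print(translate(["POPCORN", "BALL"]))  # "popcorn ball", only because the compound is listed
print(translate(["BALL", "POPCORN"]))  # falls back to "ball popcorn"
```

Even with such entries, facial expression and body position still change a sign's meaning, and as Hunter notes, the prototype tracks only hand motion.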

Donald Grushkin, another Deaf Studies professor at Sacramento State, echoed that sentiment in an e-mail to Ars. As with any language, strict interpretation can be problematic when words are used so fluidly within conversation.

"To truly understand sign language, any device would have to not only understand the signs but be capable of facial recognition as well as body position to effectively work with ASL signing. I do not know whether this device is geared toward recognition of signed English (as I suspect it is), but if so, this is not what the deaf community needs. Another issue is that computer-based translation is far from 'there' at this point—one only has to turn on YouTube or Google's auto-captioning feature for voice-based videos to see just how far from reality translation services really are."

MotionSavvy's product video
