New hack lets Xbox Kinect read sign language

Researchers at Georgia Tech pair up the game device with custom software

By Charles Q. Choi

4/4/2011

Soon, you may not necessarily need to be fluent in American Sign Language
(ASL) in order to interpret it. Scientists have hacked Microsoft’s Xbox
Kinect motion control sensor to read ASL.

The real-world feat is reminiscent of Google’s April Fools’ Day prank this
year, in which the company falsely debuted a feature called Gmail Motion
that supposedly allowed users to translate bodily gestures into words and
email commands.

The Kinect, which debuted in November, offered a revolutionary way to
interact with computers without pushing any buttons or holding any device
whatsoever, using only body motions to control Microsoft’s Xbox game
console. The add-on, which is essentially a motion-sensing webcam, uses an
infrared scanner to create 3-D models of people as they move, allowing users
to play games by swinging their arms, shimmying their bodies or performing
other so-called natural interactions. The Kinect has proven very popular,
with 8 million sensors sold worldwide within 60 days of its launch.

The Kinect drew the attention not just of gamers but of programmers as well,
with a thriving community of hackers now testing the limits of what the
sensor can be used for, such as helping mobile robots respond to gestural
commands.

Now researchers at Georgia Tech are pairing up the Kinect device with custom
software that can interpret a very limited American Sign Language vocabulary
with greater than 98 percent accuracy.

The scientists initially used only a limited vocabulary of six signs — those
for “alligator,” “spider,” “box,” “wall,” “behind” and “in” — all of which
involve broad gestures with the arms and body.
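The article does not describe the researchers’ actual recognition algorithm, but a system like this can be pictured as matching an observed joint trajectory from the Kinect against stored templates for each sign. The sketch below is purely illustrative: the sign names come from the article, while the 2-D wrist paths, the resampling step and the nearest-neighbor matching are assumptions for the example, not the Georgia Tech method.

```python
# Toy sketch: classify a gesture by comparing the tracked wrist path
# against per-sign template paths. All coordinates are made up.
import math

# Hypothetical templates: each sign maps to a list of (x, y) wrist positions.
TEMPLATES = {
    "box": [(0, 0), (1, 0), (1, 1), (0, 1)],       # square-ish path
    "wall": [(0, 0), (0.5, 0), (1, 0), (1.5, 0)],  # flat horizontal sweep
}

def resample(path, n=8):
    """Linearly resample a trajectory to n points so lengths are comparable."""
    out = []
    for i in range(n):
        t = i * (len(path) - 1) / (n - 1)
        lo = int(t)
        hi = min(lo + 1, len(path) - 1)
        frac = t - lo
        x = path[lo][0] + frac * (path[hi][0] - path[lo][0])
        y = path[lo][1] + frac * (path[hi][1] - path[lo][1])
        out.append((x, y))
    return out

def distance(a, b):
    """Mean Euclidean distance between two equal-length trajectories."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def classify(path):
    """Return the template sign whose resampled path is closest."""
    sample = resample(path)
    return min(TEMPLATES,
               key=lambda s: distance(sample, resample(TEMPLATES[s])))

# A noisy square traced by the wrist should match the "box" template.
observed = [(0.05, 0), (0.9, 0.1), (1.0, 0.95), (0.1, 1.0)]
print(classify(observed))  # → box
```

A real system would work with full 3-D skeleton joints from the sensor and a statistical model trained on many signers, but the core idea — reducing a sign to a joint trajectory and scoring it against known signs — is the same.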

“What we’re doing now is working on computer vision algorithms to get more
information on hand shapes from the Kinect,” said researcher Helene
Brashear, a computer scientist and president of Georgia Tech spinoff company
Tin Min Labs in Austin, Texas. The researchers also hope a future Kinect
sensor will offer even higher-resolution imaging.

The scientists are working on a game called Copycat aimed at helping deaf
children practice sign language. “Ninety-five percent of deaf children are
born to hearing parents, very few of whom are fluent in sign, so we want to
support these children as much as possible,” Brashear explained.

In the long run, sensors such as the Kinect could lead to ways for computers
to understand sign language and translate it to English or other languages.
“That’s far off, but it could happen,” Brashear told TechNewsDaily. “Right
now, advances in technology have really helped the deaf community — video
chat is huge there, for instance.”

Source:

http://www.msnbc.msn.com/id/42419609/ns/technology_and_science-tech_and_gadgets
