The Sorry State of Closed Captioning

Streaming video must now provide subtitles for the hearing impaired. There's no guarantee of accuracy, though. One solution: crowdsourcing.

TAMMY H. NAM

JUNE 24, 2014

Imagine sitting down to watch an episode of Game of Thrones—and hardly being able to understand anything. That’s the case for non-native English speakers and for any of the 36 million deaf or hard-of-hearing Americans. HBO doesn’t expect its viewers to have a knowledge of High Valyrian; that’s why it takes care to offer subtitles so viewers understand exactly how Daenerys intends to free the slaves of Essos.

If only most online streaming companies took as much care in everyday captioning.

Nostalgia for Network TV

Machine translation is responsible for much of today’s closed captioning and subtitling of broadcast and online streaming video. It can’t register sarcasm, context, or word emphasis. It can’t capture the cacophonous sound of multiple voices speaking at once, essential for understanding the voice of an angry mob of protestors or a cheering crowd. It just types what it registers. Imagine watching the classic baseball comedy Major League and only hearing the sound of one fan shouting from the stands. Or only catching every other line of lightning-fast dialogue when watching reruns of the now-classic sitcom 30 Rock.

As of April 30, streaming video companies are required to provide closed captioning on all programming. There’s no doubt that we’re in a better place than we were even five years ago, when streaming video companies weren’t required to closed-caption any of their content. But there is still a long way to go in improving the accuracy of subtitles. Netflix and Amazon Prime users have bemoaned the quality of the streaming companies’ closed captions, citing nonsense words, transcription errors, and endless “fails.” These companies blame the studios for not wanting to pay for accurate translations, but such excuses aren’t flying with paying streaming video subscribers.

Marlee Matlin, the Oscar-winning actress and longtime advocate for better closed captions for the deaf and hard of hearing, recently mentioned in an interview that she knows she’s missing out on much of the action when she’s watching streaming video. “I rely on closed captioning to tell me the entire story,” she says. “I constantly spot mistakes in the closed captions. Words are missing or something just doesn’t make sense. My kids spot it too; they’re aware of sloppy captions and the pieces of information that I’m not being given.”

Context and Knowledge of Cultural Nuance Matter

Machines also fall short when it comes to translating one language into another. It isn’t sufficient to merely exchange words in one language for their equivalents in another. When it comes to translating emotional writing and an actor’s subtle delivery of a piece of dialogue, there’s no substitute for the human touch. Let’s take it a step further. Imagine being a Japanese viewer watching the 1996 film Trainspotting and having to rely on a word-for-word translation of its heavy Scottish dialect and slang. To say that much would get lost in translation is an understatement.

Good translators will have lived in the countries where the respective languages are spoken, and will be aware of cultural and linguistic nuances. They’ll keep up to date with current affairs and the introduction of new words and phrases. Most importantly, they will have an intuitive sense for the languages. A good translator will understand how important these details are, because she wants others to be as excited and horrified about everything that’s unfolding on the screen as she is.

The Heart of Language and Understanding

Humans can ensure quality and quantity when it comes to giving beloved films and TV shows proper translations. Machines can’t be fans in the same way that people can. They don’t go back and add more details just to enrich the experience, or think carefully about whether an audience will understand why a certain word sounds silly in one language but is a deep and unforgivable insult in another.

Crowdsourced subtitling platforms like Viki, a TV site powered by a community of fans who translate shows into multiple languages, place no limits on the number of languages a show or film can be translated into, or on how quickly and accurately the content can be made available to viewers all over the world.

The demand for better closed captioning is yielding positive results. Congress recently passed a law requiring broadcasters to caption Internet-distributed video if the content was broadcast on TV with captions. Netflix is working to comply with the Americans with Disabilities Act to make sure that all its streamed content has subtitles. Amazon Prime is also putting effort behind making sure all of its Instant Video is closed-caption ready. YouTube has likewise improved its closed-captioning and crowdsourcing capabilities. On the human translation front, other companies, such as the new transcribe-and-translate platform Amara, are seeing the value of using people to provide better options than machines.

Up in the air, there are changes coming as well. Last week, Sen. Tom Harkin (D-Iowa) demanded that US airlines add closed captioning to in-flight movies for the benefit of hearing-impaired airline passengers. Closed captioning for all is a fantastic goal, but until there’s accurate closed captioning for all, there’s still work to be done.

SOURCE:

http://www.theatlantic.com/entertainment/archive/2014/06/why-tv-captions-are-so-terrible/373283/
