Network: Sign of progress for deaf viewers

Meg Carter on a radical sign language system for television

Meg Carter
Sunday 28 March 1999 23:02 BST

Meet Simon, the virtual man. Dark, handsome and with a wonderful way with his hands, he's the latest recruit to the brave new world of digital television - the public face of a system now in development to provide deaf viewers with live and simultaneous sign language alongside TV programmes.

Simon was born out of a collaboration between Televirtual (a Norwich-based 3D graphics specialist), linguistic and facial processing specialists at the University of East Anglia, and the Independent Television Commission (ITC), which funds and co-ordinates the project.

The aim is to develop a real-time, computer-generated deaf-signing system to meet sign language requirements laid down in the Broadcasting Act of 1996. The Act set a number of targets for broadcasters operating digital terrestrial television services, including the transmission of 50 per cent of output with both signing and subtitling within their first 10 years of operation.

Broadcasters have provided subtitling for the deaf for some time, but live signers are used only selectively - because of cost, the additional facilities that are needed, and the concern that "in-vision" signing is often distracting to the majority of other viewers, says Dr Nick Lodge, ITC head of standards and technology.

"The system now being developed is intended to generate deaf signing at the viewer's end of the broadcast chain, using the teletext subtitle signal which already accompanies many broadcast programmes," he explains.

It comprises three elements: a linguistic translation software program, a lexicon of sign language gathered by motion capture techniques, and virtual human animation.

Linguistic translation software takes in and processes information from the subtitles accompanying a programme, analysing sentences and removing redundant words not used in deaf-signing, such as the definite or indefinite articles or prepositions.

This reconstructs the words in a sequence appropriate for signing. Where subtitling is particularly wordy, such as for news broadcasts, the software ranks words and edits out the least important to enable signing to keep pace with the action on screen.
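As a rough illustration of the kind of processing involved, the sketch below (in Python) strips out a list of unsigned function words and, when a line is too wordy, keeps only the highest-ranked words. The stop-word list and the length-based ranking rule are assumptions made for the example, not the project's actual linguistic rules.

```python
# Illustrative sketch only: reduce a subtitle line to the words to be signed,
# dropping function words and trimming wordy lines so signing keeps pace.
# The stop-word list and ranking scheme here are invented for the example.

STOP_WORDS = {"the", "a", "an", "of", "to", "in", "on", "at", "by", "for", "with"}

def prepare_for_signing(subtitle: str, max_signs: int = 8) -> list[str]:
    """Return an ordered list of words from one subtitle line to be signed."""
    words = [w.strip(".,!?").lower() for w in subtitle.split()]
    content = [w for w in words if w and w not in STOP_WORDS]

    if len(content) <= max_signs:
        return content

    # Crude importance ranking for illustration: prefer longer, usually more
    # content-bearing, words, then restore the original word order.
    keep = set(sorted(content, key=len, reverse=True)[:max_signs])
    return [w for w in content if w in keep]

print(prepare_for_signing("The minister arrived at the conference in Brussels this morning"))
```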

Once this information has been processed, it is passed to the virtual human, which relies on a dictionary of signed words and phrases recorded from a real person by motion capture.

Televirtual is working with two professional signers to compile this sign language library, attaching markers to their faces and filming moves which are then processed by computer. Initially, the system will run on the mix of Sign Supported English and British Sign Language commonly used by existing in-vision TV signers.

"There are now lots of different ways of capturing body movement data, but this project has been unique in using three different types of motion capture," says Dr Lloyd. "For sign language it's not just hand movements which are important - facial expressions and body position are also critical."

To gather every nuance of the signer's hand gestures, data gloves fitted throughout with optical fibres were used to track finger and hand movements.

Face markers were also used, along with a magnetic tracker that measured the movements of the signer's torso. "Such detail was essential to ensure the result was more than a pair of animated, dismembered hands floating in space," he says.
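By way of illustration, one way such a lexicon entry could be organised is sketched below, bundling the three capture streams described above - hand data from the gloves, face-marker positions and the torso pose from the magnetic tracker. The field names and frame layout are assumptions made for the example, not the project's actual data format.

```python
# Illustrative data structure for a sign-lexicon entry combining the three
# motion-capture streams described in the article. Field names and layout
# are assumptions for the example, not the project's real file format.

from dataclasses import dataclass, field

@dataclass
class SignFrame:
    hand_joint_angles: list[float]                            # from the optical-fibre data gloves
    face_marker_positions: list[tuple[float, float, float]]   # from the tracked face markers
    torso_pose: tuple[float, float, float, float, float, float]  # position and rotation from the magnetic tracker

@dataclass
class SignClip:
    gloss: str                                 # the word or phrase this clip signs
    frames: list[SignFrame] = field(default_factory=list)  # motion sampled at the capture rate
    fps: int = 50

# The lexicon itself is then a simple lookup from gloss to captured motion.
lexicon: dict[str, SignClip] = {}

def add_to_lexicon(clip: SignClip) -> None:
    lexicon[clip.gloss] = clip
```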

However, the process is still under way. "It will take quite some time to complete the database of words and phrases needed to make the system fully operational," Dr Lloyd explains, "not least because of the care that must be taken in accommodating colloquialisms and regional differences in signing and ensuring what we end up with is readily understood by everyone."

Once complete, data from the linguistic translation will be combined with the stored hand movements from the system's signing lexicon - which can be called up in any order - and a detailed 3D graphic model of a virtual human that can run on a top-end home PC.

An important feature of the system is its ability to ensure that the transition between the virtual human's different signed words or phrases is smooth rather than jerky - a typical problem with early prototypes of this approach, which relied on splicing together video clips instead of using computer animation.
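A simple way to picture that smoothing step is sketched below: instead of cutting straight from the final pose of one sign to the opening pose of the next, the animation interpolates between them over a short transition window. Plain linear interpolation is used purely for illustration; the actual system may blend poses quite differently.

```python
# Illustrative sketch of smoothing the join between two signed clips by
# interpolating poses over a short transition window, rather than cutting
# directly from one sign to the next.

def lerp(a: list[float], b: list[float], t: float) -> list[float]:
    """Linear interpolation between two equal-length pose vectors."""
    return [x + (y - x) * t for x, y in zip(a, b)]

def blend_transition(end_pose: list[float], start_pose: list[float],
                     steps: int = 10) -> list[list[float]]:
    """Generate intermediate poses from the end of one sign to the start of the next."""
    return [lerp(end_pose, start_pose, i / steps) for i in range(1, steps)]

# Example: a short transition between two toy three-value poses.
transition = blend_transition([0.0, 1.0, 0.5], [0.8, 0.2, 0.4])
```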

The eventual plan is for the real-time generation of this computer animation to take place in the viewer's set-top digital TV decoder box, although early models are yet to accommodate this. In the meantime, the virtual signing service is likely to be broadcast in a closed system direct to viewers via a separate digital TV channel - the advantage being that only deaf viewers would see the signing.

"It's still early days," Dr Lodge admits. "But we are now discussing when the first generation of this service might launch, and we envisage a phased roll out over the next few years."

The first step, which could come in late 1999, is likely to be the introduction of digital TV signing using a live signer. This would then be upgraded to the full automated system when work is complete.

Cautious of criticism from the deaf community of the decision to go for virtual rather than live real-time signers, Dr Lloyd adds: "We are fully aware that deaf people would like to see more deaf employed by TV and working in TV studios. But if this is the way we can ensure that 50 per cent or even 80 per cent of programmes are signed, then wouldn't that be great?"

Besides, he adds, once complete, the Simon system will also have potential uses in non-broadcast areas - as a signing typewriter wherever anyone wants to communicate with the deaf. The ITC, which owns the rights to the technology, has already licensed its use by the Post Office, which is developing an in-store system with UEA for use by counter staff.
