By Craig Chamberlain

It's called lip-reading, but the best of those who do it are reading much more than lips, says Charissa Lansing, a UI researcher. Using the latest technology and an interdisciplinary approach, she is learning their secrets.

What she and her colleagues find could benefit anyone who deals with a hearing impairment. It also could improve hearing-aid design, rehabilitation techniques and even voice-recognition systems.

"For a long time, the literature has suggested that lip-reading is very difficult, and people can't be expected to identify more than 30 to 50 percent of spoken words," said Lansing, a professor of speech and hearing science. "But we have seen people who are profoundly hearing impaired, in some cases since birth, who can understand 80 percent or better from unrelated sentences."

Lansing, working with colleagues at the university's Beckman Institute for Advanced Science and Technology, is trying to find out what makes these skilled lip-readers successful. The technique the researchers have developed uses sophisticated equipment to track the eye movements of subjects as they attempt to comprehend the silent words spoken by different people on a recorded video disc.

The research involves people with and without hearing impairments, and with different levels of lip-reading ability. The researchers include specialists in computer vision and engineering; in the psychology of human perception and performance; and in speech pathology and audiology.

"We believe that tracking a subject's eyes, and where they're directing their attention on a speaker's face, will give us important information about the language processing that the subject is doing," she said.

The tracking equipment can find the center of a person's pupil, thereby identifying where he or she is focused, and then track the eyes' movements with checks every four milliseconds.
By linking the video image and tracking information together in a computer program, Lansing can tell precisely where a subject is looking at any given time and track their eye movement through an entire sequence.

With only small groups of subjects tested so far, "What we think right now is that people who are more proficient [at lip-reading] are doing more than just looking at the mouth," Lansing said. "They are in fact almost scanning for information, looking at different areas of the face. In the same period of time, they are making many more gazes around the face than someone who is less proficient."

The research is still in its early stages, but it already has earned Lansing a five-year grant of $500,000 from the National Institutes of Health.

"For many years, people have taught lip-reading without knowing anything about its effectiveness," she said. "For a very long time, people have said we need to work on where people are looking, where they're directing their attention ... but no one has ever established if that's really going to make any difference."
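For readers curious how a stream of gaze samples every four milliseconds becomes a record of "gazes around the face," the standard idea is to group consecutive samples that stay within a small spatial window into a single fixation. The sketch below is purely illustrative: the dispersion threshold, duration cutoff, and sample data are invented for demonstration and are not details of Lansing's system.

```python
# Illustrative sketch of dispersion-based fixation grouping.
# All thresholds and sample data here are invented for demonstration;
# they are NOT parameters from the research described in the article.

def detect_fixations(samples, max_dispersion=20.0, min_duration_ms=100, sample_ms=4):
    """Group (x, y) gaze samples into fixations.

    A fixation is a run of samples whose bounding-box dispersion
    (width + height) stays under max_dispersion pixels for at least
    min_duration_ms milliseconds, given one sample every sample_ms ms.
    """
    min_len = min_duration_ms // sample_ms
    fixations = []
    i, n = 0, len(samples)
    while i < n:
        j = i + 1
        # Grow the window while the points stay tightly clustered.
        while j <= n:
            xs = [p[0] for p in samples[i:j]]
            ys = [p[1] for p in samples[i:j]]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                break
            j += 1
        run = j - 1 - i  # length of the last window under the threshold
        if run >= min_len:
            window = samples[i:i + run]
            cx = sum(p[0] for p in window) / run
            cy = sum(p[1] for p in window) / run
            fixations.append({"center": (cx, cy), "duration_ms": run * sample_ms})
            i += run
        else:
            i += 1  # too brief to count as a fixation; move on
    return fixations

# Example: 50 jittered samples near one face region, then 50 near another.
samples = [(100 + k % 3, 100 + k % 2) for k in range(50)]
samples += [(200 + k % 3, 50 + k % 2) for k in range(50)]
fixations = detect_fixations(samples)
# Yields two fixations of 200 ms each, one per region.
```

Counting and timing fixations in this way is what would let researchers compare, for example, how many distinct gazes a proficient lip-reader makes across a speaker's face versus a less proficient one in the same period.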