COMPUTE! ISSUE 133 / SEPTEMBER 1991 / PAGE 28

Breaking Communications Barriers: Computers for the Physically Handicapped
by Gail Dutton

Remember writing term papers? It seemed like a major chore in high school and college. Just imagine writing one blindfolded or without touching your PC, and you'll have an idea of what it's like for blind and quadriplegic students and professionals. Imagine writing it in, say, Chinese, a language foreign to you, and you'll understand the challenge aphasic patients -- those who've lost the ability to use and process language -- face when trying to communicate even simple requests.

Fortunately, software and hardware solutions are available, although they aren't widely known. Often the solution is simply a matter of locating the right pieces and integrating them into a computer.

Scanning and Voice Synthesis for the Visually Impaired

One system pieced together by two Yale University students relies upon a voice synthesizer and a scanner to let visually impaired students and staff have full access to the information stored in Yale's Sterling Library (where the system is housed) and any other written resources available. Built by Matthew Weed, a blind political science and history major, and Victor Grigorieff, a computer science and psychology major, the system is based on a Macintosh IIfx, although it can run on earlier models, since each Mac program has a similar interface. It uses only commercially available software and hardware.

Because the Macintosh interface has remained consistent, visually impaired users only have to learn one set of concepts to run several different programs. The Mac also has the flexibility Weed and Grigorieff require. With it, they can convert from Mac to IBM text files as needed so users can copy files for use on their own computers.

In addition to the Macintosh IIfx, the system uses the Hewlett-Packard ScanJet Plus, OmniPage, and outSPOKEN software for scanning and voice synthesis, inLARGE for magnifying text, a word processing package, and a 19-inch monochrome monitor. With outSPOKEN, the visually impaired can use graphical interfaces 95 percent as effectively as sighted users. And system glitches are minor; for example, the ScanJet Plus sees the number 2 as a tilde and the letter l as an n, but it's about 99.5-percent accurate.
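One simple way to clean up such confusions, sketched here in modern Python purely for illustration, is to try swapping known look-alike characters whenever a scanned word isn't in a word list. The word list, confusion table, and function names are assumptions, not part of OmniPage or the Yale system.

    # Illustrative sketch only: a post-OCR cleanup pass for the kinds of
    # confusions described above (the scanner reads "2" as "~" and "l" as "n").
    CONFUSIONS = {"~": "2", "n": "l"}  # scanner output -> likely intended character

    def correct_word(word, dictionary):
        """Return a corrected word if swapping one confused character
        produces a dictionary word; otherwise return the word unchanged."""
        if word.lower() in dictionary:
            return word
        for i, ch in enumerate(word):
            if ch in CONFUSIONS:
                candidate = word[:i] + CONFUSIONS[ch] + word[i + 1:]
                if candidate.lower() in dictionary:
                    return candidate
        return word

    def correct_text(text, dictionary):
        return " ".join(correct_word(w, dictionary) for w in text.split())

    # Example: "onny" becomes "only" if "only" is in the word list.
    print(correct_text("onny", {"only"}))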

The ScanJet Plus is used to scan books, research reports, journal articles, and other printed documents into the Macintosh at a rate of two side-by-side pages every 40 seconds. The text is then converted to sound using OmniPage and outSPOKEN. To listen to the file, the user opens the menu with a mouse or keystrokes and selects options from the choices spoken by the voice synthesizer.

When the file appears, the voice synthesizer reads it aloud either one line at a time or one word at a time as the user cursors from line to line or word to word. Either method can become tiresome, so Weed often instructs the computer to speak faster -- up to twice as fast as the average human reads aloud. With outSPOKEN, the user can also control type fonts, vocal pitch, and volume, and it offers a word dictionary for user-defined pronunciation, a graphics dictionary for identifying common signs and symbols, and a Find command for locating information on the screen.
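The reading behavior itself is easy to picture as a loop. Here is a minimal Python sketch of a word-at-a-time reader with a user-controlled rate and a pronunciation dictionary; the speak() stand-in and the rate handling are illustrative assumptions, not outSPOKEN's actual code.

    import time

    PRONUNCIATIONS = {"Yale": "yayl", "IIfx": "two f x"}  # user-defined overrides

    def speak(word, words_per_minute=180):
        """Stand-in for a speech synthesizer: print the word and pause
        roughly as long as speaking it at the given rate would take."""
        print(word, end=" ", flush=True)
        time.sleep(60.0 / words_per_minute)

    def read_aloud(text, words_per_minute=180):
        for word in text.split():
            speak(PRONUNCIATIONS.get(word, word), words_per_minute)
        print()

    # Doubling the rate, as the article describes Weed doing:
    read_aloud("The text is then converted to sound", words_per_minute=360)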

This system eliminates Weed's need for hundreds of audiotapes and the hours it took to search them for specific quotes. He's cut the time required to write a term paper from four or five hours per page to about 35 minutes per page.

The system is as advantageous for dyslexics as it is for blind users, Grigorieff says. With inLARGE software, individual letters, words, and lines can be enlarged up to three inches in height on the system's 19-inch monitor. Words can still be spelled out and spoken, as can whole sentences, making it easier to read new words. To help users keep their places, the system speaks the word the cursor is on and presents text with a ragged right edge and a serif typeface. inLARGE also offers a full-screen crosshairs cursor that's easier to locate. Grigorieff says the system's potential is limitless.

Visually impaired users can access networks such as ARPAnet, Internet, and Bitnet -- invaluable aids in technical work. Eventually, Weed and Grigorieff hope an interuniversity electronic library will be established so scanned versions of references can be loaned just like printed versions of documents. Right now, though, Weed says copyright laws are a problem. At Yale, there are only about a dozen potential users, and the possibility that any one book that's scanned will be used again is slim, he says. So to save computer memory, he's spending part of his summer erasing the books that have already been scanned into the system.

Yale's system was built last fall with a $15,000 grant from Yale University. Because costs are dropping, Weed estimates the same system could be built today for a little more than $10,000.

Design by Voice and Movement

All the way across the country, Jeff Burnett, an architecture professor at Washington State University, and Technical Applications Group colleagues have built a system that allows quadriplegics to work on electronic CAD projects with the same levels of expertise as their able-bodied colleagues. This system, Burnett says, also works with anything graphically oriented, including spreadsheets.

The project, as yet unnamed, transparently links a DOS machine to the powerful UNIX systems that are needed for CAD and to a telephone. That configuration can then be booted automatically and controlled by speech recognition technology and an infrared headpointer. The system is "glued" together with custom software.

Users can boot up the machine by triggering a sensor -- either a pressure pad or a special reflector -- that can only be triggered by their wheelchairs when they roll up to the PC. Once the machine is booted, the menu comes up and can be used either by issuing voice commands or by using a headpointer as a mouse.

The software was written specifically for a headpointer made by Millenium Stride Computers, although others can be used. Because the headpointer uses infrared sensors, users don't have to be tethered to their computers with electronic cables.

The pointer is actually a reflective tape mounted on eyeglasses or even on a pencil tucked behind one ear. It's tracked with an infrared device mounted atop the computer -- just the opposite of a TV remote control.

To select a menu function, users move their heads so the tape's reflection hits the desired icon; then they speak a word that's the equivalent of clicking a mouse button. The adaptive interface lets users move the window around rather than having to hold their heads in awkward positions.
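The basic pointer logic can be sketched simply: map the tracked reflection onto the screen and treat a spoken keyword as the click. The coordinates, screen size, and the word "select" below are illustrative assumptions, not the actual Washington State software.

    SCREEN_WIDTH, SCREEN_HEIGHT = 640, 480

    def tracker_to_screen(x_norm, y_norm):
        """Map normalized (0..1) tracker coordinates onto screen pixels."""
        return int(x_norm * (SCREEN_WIDTH - 1)), int(y_norm * (SCREEN_HEIGHT - 1))

    def handle_input(x_norm, y_norm, spoken_word, icons):
        """If the user says the click word while pointing at an icon, select it."""
        x, y = tracker_to_screen(x_norm, y_norm)
        if spoken_word == "select":
            for name, (left, top, right, bottom) in icons.items():
                if left <= x <= right and top <= y <= bottom:
                    return name
        return None

    icons = {"ZOOM": (0, 0, 99, 49), "ROTATE": (100, 0, 199, 49)}
    print(handle_input(0.1, 0.05, "select", icons))  # -> "ZOOM"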

The system has a small vocabulary, oriented toward CAD work, that isn't context sensitive. Individual users can load a personal vocabulary or label documents by spelling the needed words with the phonetic alphabet. To load the word angle, for example, the user would say, "Alpha, November, Golf, Lima, Echo." Burnett's system uses a Votan voice recognition board, one of the most functional available.
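Decoding such a spelling is straightforward. The short Python sketch below turns a sequence of phonetic-alphabet words into the spelled word; the table covers only the letters in the example and is purely illustrative.

    # Letters needed for the "angle" example; a real system would carry
    # the full phonetic alphabet.
    PHONETIC = {"alpha": "a", "november": "n", "golf": "g", "lima": "l", "echo": "e"}

    def decode_spelling(utterances):
        """Join the letters represented by each spoken code word."""
        return "".join(PHONETIC[word.lower()] for word in utterances)

    print(decode_spelling(["Alpha", "November", "Golf", "Lima", "Echo"]))  # angle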

When the phone rings, an answering machine or the computer picks up the call, stopping the CAD program in its place. The user can converse using either a microphone and speaker combination or, for more privacy, a headset. To hang up the phone, the computer's voice recognition system listens for the words hang up and a confirming utterance. Upon hang-up, the user can resume CAD work instantly.
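The two-step hang-up can be pictured as a tiny bit of state. The sketch below arms the command on "hang up" and acts only when a confirming word follows; the word "confirm", the callbacks, and the structure are assumptions, not Burnett's actual software.

    def make_hang_up_listener(hang_up, resume_cad):
        awaiting_confirmation = False

        def on_phrase(phrase):
            nonlocal awaiting_confirmation
            if phrase == "hang up":
                awaiting_confirmation = True     # first utterance: arm the command
            elif awaiting_confirmation and phrase == "confirm":
                hang_up()                        # second utterance: actually hang up
                resume_cad()                     # CAD work resumes immediately
                awaiting_confirmation = False
            else:
                awaiting_confirmation = False    # anything else cancels the request
        return on_phrase

    listener = make_hang_up_listener(lambda: print("phone hung up"),
                                     lambda: print("CAD resumed"))
    listener("hang up")
    listener("confirm")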

"If users are familiar with CAD, they can be functional on this system within one day, and in only a few weeks, after creating macros and editing the vocabulary, can compete in the same arena and at the same level as their able-bodied colleagues," Burnett says. In practice, success can depend very much upon the user's personal motivation.

Users are now being trained on this system at the University of Washington Center for the Handicapped in Seattle. After the training, they leave with hardware and software tailored specifically to their own work environment.

Images Instead of Alphabet

Researchers at Tufts University School of Medicine in Boston are using computers to tackle a different problem: how to help patients who have lost the ability to use language -- usually as the result of a stroke. The type of brain damage called aphasia affects the portion of the brain where words and speech are processed, leaving patients with the ability to comprehend much of what others say but unable to reply. They can't formulate thoughts into coherent phrases or sentences. Roughly one-fourth of the half-million people who suffer strokes each year also develop aphasia, according to Cheryl Goodenough-Trepagnier, associate professor of rehabilitation medicine at Tufts.

Aphasic patients can, however, learn to organize symbols into a coherent order to form thoughts and sentences. In the 1970s, patients learned to use cards with symbols to express their thoughts. Now those symbols have been expanded and loaded onto an Apple computer, simplifying their use.

Trepagnier's system, called NewVic, features hundreds of symbols -- still called cards and decks -- arranged in categories of people, actions, objects, modifiers, and prepositions. Eight symbols are displayed per computer screen. Patients use a mouse to select cards, scroll through the screens, and move from screen to screen. Decks are flexible enough that they can be designed to allow speed and vocabulary size to match a patient's abilities.
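The card-and-deck arrangement is easy to sketch. Below, a small Python example pages a category's cards eight to a screen and strings selected cards into a message; the card names and data layout are illustrative assumptions, not the actual NewVic software.

    CARDS_PER_SCREEN = 8

    deck = {
        "people":  ["I", "you", "nurse", "doctor", "daughter", "son", "friend", "visitor"],
        "actions": ["want", "see", "eat", "drink", "go", "call", "help", "rest"],
        "objects": ["water", "coffee", "glasses", "phone", "book", "chair", "bed", "TV"],
    }

    def screens(category):
        """Split a category's cards into screens of eight."""
        cards = deck[category]
        return [cards[i:i + CARDS_PER_SCREEN] for i in range(0, len(cards), CARDS_PER_SCREEN)]

    def build_message(selections):
        """A message is just the selected cards in the order they were chosen."""
        return " ".join(selections)

    print(screens("actions")[0])                  # the first screen of action cards
    print(build_message(["I", "want", "water"]))  # "I want water"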

Some people pick up the system almost immediately, while others take a few weeks and still do very well with it. Although the researchers don't yet know the system's limits, Trepagnier says the patients most likely to benefit are those who take to it immediately and who are functionally impaired. "We're just beginning to be able to develop an appropriate communication medium for people with severe aphasia. The big problem is slowness," she says, "because people are trying to lay out messages through very impaired motor abilities. I haven't clocked it, but it's faster than three words per minute [for patients who are fluent with NewVic]. One of our major concerns is finding a way to communicate at a rate other people can tolerate, so users actually get to engage in conversation."

Another difficulty is designing symbols to represent verbs, since aphasics often have more trouble conceptualizing verbs than nouns. Trepagnier currently uses pictures to suggest verbs but wants to develop an approach where patients can animate a figure through an action, actually setting the images in motion. For example, eating an apple could be shown by choosing a hand, an apple, and a head; putting the hand and apple together; dragging them to the head; and clicking the mouse button. To say, "The girl is running," a user could choose a picture of a girl, click the mouse at her feet, and move the mouse rapidly across the pad. The computer would then show a girl running across the screen.

Of course, aphasics can only use NewVic if they have it with them. Hopefully someday a true portable machine with a touchscreen will be available, similar to some of the lightweight portables that have surfaced in recent months.

Trepagnier plans to make her system and basic documentation available to the public this year. It uses the NewVic software she developed and runs on a Macintosh SE or SE/30, or any other Macintosh machine with at least 512K of RAM.

What About Tomorrow?

Great strides have been made in recent years to bring the challenges of the physically impaired to the forefront of the American consciousness. In fact, many other products geared toward the disabled, in addition to the ones mentioned in this article, are actually on the market now, but most are known only within small circles.

Computer technology promises to make life easier for the disabled. Personal fulfillment and overcoming stumbling blocks have always been the key goals of the personal computer. Now the technology that has leveraged our productivity and filled our leisure hours is helping the silent to speak, the blind to see, and the paralyzed to manipulate their worlds, and it's enabling technicians and research laboratories to perform computer-aided miracles.

Perhaps tomorrow, finding special hardware and software for the physically challenged will be as simple as checking out a disk at the local library or heading to the nearest electronics store for the latest equipment.

Gail Dutton is an independent writer specializing in science and technology. Her articles have appeared in Science, Sea Frontiers, The World & I, IEEE Software, and other publications. She is based in Southern California.