Brain Power

How new brain-aided technologies are expanding the reach of medicine and transforming lives

People with disabilities face a range of challenges every day. Those with spinal cord injuries, for example, often need help eating, bathing, and dressing. And those with hearing loss struggle to understand and communicate. But imagine those same people now being able to pick up a cup of coffee or hear the voices of their loved ones again.

Transformative changes like these are happening at the Miller School of Medicine, where researchers and clinicians are linking the power of technology with the capabilities of the human brain to help people overcome the limitations they face because of paralysis or deafness.

From Thought to Action

In a clinical trial at The Miami Project to Cure Paralysis, researchers are collaborating with scientists from the Miller School’s Department of Neurological Surgery and the College of Engineering’s Department of Biomedical Engineering to restore hand and arm movement in a 22-year-old man who was injured in a motor vehicle accident several years ago. The accident left him unable to walk or to use his hands, and he requires nursing care for many daily activities.

“Having limited function in the upper extremities is a huge quality-of-life issue,” said W. Dalton Dietrich, Ph.D., scientific director of The Miami Project and the Kinetic Concepts Distinguished Chair in Neurosurgery. “We investigate strategies directed at getting people to walk again, but hand and arm function is also very important in everyday life.”

The Miami Project team is testing a novel way to use a computer to interface with the brain and allow a person with a spinal cord injury to pick up objects like a pen or a cup. Researchers are tapping into the intact capabilities of the brain and the arm muscles, and using technology to replace the damaged signal pathway in the spinal cord. So, when the participant thinks, “I want to move my hand,” he can initiate movement.

The process starts with a strip of electrical contacts that are implanted on the surface of the participant’s brain in the area that controls movement. Those contacts pick up brain activity.

“Once we’ve identified that he’s thinking about opening his hand, there’s a unique signal we can acquire,” said Jonathan Jagid, M.D., the neurological surgeon who implanted the electrical contacts in the participant’s brain in November 2018.

Researchers use a laptop to collect and process that signal. Machine-learning algorithms then decode it and send impulses that activate the paralyzed muscles, enabling the participant to open and close his hand.
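The team’s software isn’t described in detail here, but the general idea can be sketched in a few lines of Python: acquire a window of cortical signal, classify the intended movement with a trained model, and trigger muscle stimulation when movement is intended. Everything in the sketch below (the simulated recordings, the simple power feature, the stand-in stimulator) is an illustrative assumption rather than the actual system.

# A minimal, self-contained sketch of the decode-and-stimulate loop: classify
# short windows of simulated motor-cortex activity and trigger a stand-in
# stimulator when an "open hand" intent is detected. All of it is illustrative;
# it is not the Miami Project team's actual software.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
N_CONTACTS, N_SAMPLES = 8, 200          # electrode contacts x samples per window

def band_power(window):
    """Mean signal power per contact, a crude stand-in for the band-power
    features often used to decode movement intent from cortical recordings."""
    return (window ** 2).mean(axis=1)

def simulate_window(intent):
    """Fake recording: 'open' windows carry extra power on a few contacts."""
    window = rng.normal(size=(N_CONTACTS, N_SAMPLES))
    if intent == "open":
        window[:3] += rng.normal(1.0, 0.2, size=(3, N_SAMPLES))
    return window

# Train a simple decoder offline on labeled windows ("rest" vs. "open").
labels = ["rest", "open"] * 100
features = np.array([band_power(simulate_window(lbl)) for lbl in labels])
decoder = LogisticRegression().fit(features, labels)

def send_stimulation(pattern):
    """Stand-in for the hardware that activates the paralyzed forearm muscles."""
    print(f"stimulating muscles: {pattern}")

# Online loop: decode each new window and act on the predicted intent.
for true_intent in ["rest", "open", "open", "rest"]:
    window_features = band_power(simulate_window(true_intent)).reshape(1, -1)
    if decoder.predict(window_features)[0] == "open":
        send_stimulation("hand_open")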

“When he thinks about it, it triggers an action,” Dr. Jagid said. “He is able to lift objects, feed himself, and write — all things he was unable to do after his injury.” He has also been able to control a robotic walking system.


Signals Bring Back Sound

About one in 25,000 people has neurofibromatosis type 2 (NF2), a condition in which tumors or their treatments damage the auditory nerve and cause profound hearing loss. Most people with NF2 aren’t candidates for cochlear implants, a common treatment for deafness, because cochlear implants require a working auditory nerve.

Auditory brainstem implants (ABIs) can provide some access to sound. With an ABI, surgeons place electrodes on the brainstem. Patients wear an external processor that picks up sounds with a microphone, converts those sounds to electrical signals, and sends the signals to the electrodes on the brainstem. Those signals can then be perceived as sound and pitch.
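As a rough illustration of that processing chain, the sketch below splits a short audio frame into frequency bands and maps each band’s energy onto a stimulation level for one electrode. The sampling rate and band layout are assumptions made for the example; nothing here reflects the actual ABI processor.

# An illustrative sketch (not the actual ABI firmware) of the chain described
# above: a short microphone frame is split into frequency bands and each band's
# energy becomes a stimulation level for one brainstem electrode. The sampling
# rate and band edges are assumptions; the 21-channel count matches the device
# described later in this story.

import numpy as np

SAMPLE_RATE = 16_000     # Hz, assumed microphone sampling rate
N_ELECTRODES = 21        # number of contacts on the implanted array

def encode_frame(frame):
    """Return one stimulation level (0 to 1) per electrode for an audio frame."""
    spectrum = np.abs(np.fft.rfft(frame)) ** 2
    freqs = np.fft.rfftfreq(len(frame), d=1 / SAMPLE_RATE)
    # Split 100 Hz to 8 kHz into N_ELECTRODES logarithmically spaced bands.
    edges = np.logspace(np.log10(100), np.log10(8000), N_ELECTRODES + 1)
    levels = np.array([
        spectrum[(freqs >= lo) & (freqs < hi)].sum()
        for lo, hi in zip(edges[:-1], edges[1:])
    ])
    return levels / (levels.max() + 1e-12)   # normalize to the stimulation range

# Example: a 50 ms frame of a 440 Hz tone lights up the low-frequency channels.
t = np.arange(0, 0.05, 1 / SAMPLE_RATE)
print(np.round(encode_frame(np.sin(2 * np.pi * 440 * t)), 2))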

The multidisciplinary team at the UHealth Ear Institute was well positioned to launch its auditory brainstem implant program, one of just a handful in the United States.

“We’ve had a cochlear implant program since 1990, so we have a lot of familiarity with implanting and programming electronic hearing devices,” said Fred F. Telischi, M.E.E., M.D. ’85, the James R. Chandler Chair in Otolaryngology and chairman of the Department of Otolaryngology.

“Part of having a successful ABI program is having a successful NF2 program,” said Christine T. Dinh, M.D. ’08, assistant professor of otolaryngology. “And part of having a successful NF2 program is not just having clinicians to treat it, but also researchers to identify better solutions. We’re looking for new therapies for our NF2 patients. Although we use many basic science techniques, what’s special is we’re able to take the tumors from our patients, culture them, study them, implant them in animal models, and test different therapies.”

The skull base surgery team performed the Miller School’s first implant surgery in March 2019, on a woman in her 60s whose NF2 was causing her hearing to decline.

“She was nearly deaf,” Dr. Dinh said. “She only really understood about 20 percent of what was going into one ear.”

During the surgery, state-of-the-art care and monitoring are crucial — hitting the wrong area of the brainstem could trigger dizziness or facial or vocal cord paralysis. After recovering for about a month, the patient returned to have her device turned on.

“The device has 21 metal contacts that are placed along the brainstem,” Dr. Dinh said. “On each contact we were able to obtain hearing responses, so we were very happy with the results.”

The ABI team also included Sandra Velandia, Au.D., and Diane Martinez, Au.D., bilingual audiologists who were able to communicate with the patient in Spanish, her native language. “She did remarkably well at the initial stimulation,” Dr. Telischi said.

Programming an ABI is more challenging than programming a cochlear implant. That’s because the cochlea has a predictable pattern of high and low frequencies, while the pitch map along the brainstem varies from patient to patient.

“With an ABI, we may not know where the high and low frequencies are,” Dr. Dinh said, because as volume increases, pitch can also change. “We rely a lot on the audiologist to program and reprogram so the patient achieves the best hearing outcome. With multiple programming sessions and time to practice, patients do better.”

People with cochlear implants can get excellent speech recognition, but ABIs are less precise. Combined with other techniques like lip reading, though, they can make a big difference in someone’s ability to understand and communicate.

“We did initial testing after activation, then a six-month test to see how well she’s doing,” Dr. Dinh said. “She hears well with it. She can distinguish different sounds like a barking dog or a fire truck.”

The patient understands very little speech with the implant alone, and with lip reading alone she understands about 25 percent. Combining the two, however, produces a synergy that raises her understanding of words and sentences to about 90 to 100 percent.

The team has since performed implant surgery on two more patients, one by Dr. Telischi and Jacques Morcos, M.D., and another by Dr. Dinh and Michael Ivan, M.D., M.B.S.

A key factor in the ABI program’s success is its multidisciplinary approach, and the surgery is just the beginning.

“It also depends on audiologists, electrophysiologists, psychologists, and family support,” Dr. Dinh said.

UNIVERSITY OF MIAMI MEDICINE
SPRING 2020