New computer program predicts cochlear implant success in hearing-impaired children
News Oct 13, 2015
A new computer program that analyzes functional brain MRIs of hearing-impaired children can predict whether they will develop effective language skills within two years of cochlear implant surgery, according to a study in the journal Brain and Behavior.
Researchers at Cincinnati Children's Hospital Medical Center say their computer program determines how specific regions of the brain respond to auditory stimulus tests that hearing-impaired infants and toddlers receive before surgical implantation.
With additional research and development, the authors suggest their computer model could become a practical tool that allows clinicians to more effectively screen patients with sensorineural hearing loss before surgery. This could reduce the number of children who undergo the invasive and costly procedure only to be disappointed when implants do not deliver hoped-for results.
"This study identifies two features from our computer analysis that are potential biomarkers for predicting cochlear implant outcomes," says Long (Jason) Lu, PhD, a researcher in the Division of Biomedical Informatics at Cincinnati Children's. "We have developed one of the first successful methods for translating research data from functional magnetic resonance imaging (fMRI) of hearing-impaired children into something with potential for practical clinical use with individual patients."
When analyzing results from pre-surgical auditory tests, the researchers identified elevated activity in two regions of the brain that effectively predict which children benefit most from implants, making them possible biomarkers. One is in the speech-recognition and language-association areas of the brain's left hemisphere, in the superior and middle temporal gyri. The second is in the brain's right cerebellar structures. The authors say the second finding is surprising and may provide new insights about neural circuitry that supports language and auditory development in the brain.
Lu's laboratory focuses on designing computer algorithms that interpret structural and functional MRIs of the human brain. His team uses this information to identify image biomarkers that can improve diagnosis and treatment options for children with brain and related neurological disorders.
Along with Scott Holland, PhD, a scientist in the Pediatric Neuroimaging Consortium at Cincinnati Children's, and other collaborators from Cincinnati Children's and the University of Cincinnati College of Medicine, the researchers were able to blend human biology and computer technology in their current study. The mix produced a model in which computers learn how to extract and interpret data from pre-surgery functional MRIs that measure blood flow in infant brains during auditory tests.
After data are collected from the functional MRIs, the computer algorithm uses a process called Bag-of-Words to project each scan onto a feature vector, which is then used to predict which children are good candidates for cochlear implants.
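The general shape of such a pipeline can be sketched in a few lines. The code below is a minimal illustration with simulated data, not the authors' actual method: the patch dimensions, cluster count, and outcome labels are all placeholder assumptions, and the study used a semi-supervised SVM rather than the plain supervised classifier shown here.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Simulated local activation "patches" from each child's fMRI.
# Shapes are arbitrary placeholders, not the study's real dimensions.
n_children, n_patches, patch_dim = 44, 50, 27
patches = rng.normal(size=(n_children, n_patches, patch_dim))

# 1. Learn a codebook: cluster all patches into k "visual words".
k = 10
codebook = KMeans(n_clusters=k, n_init=10, random_state=0)
codebook.fit(patches.reshape(-1, patch_dim))

# 2. Bag-of-Words: represent each child as a normalized histogram
#    of codeword occurrences across that child's patches.
def bow_vector(child_patches):
    words = codebook.predict(child_patches)
    return np.bincount(words, minlength=k) / len(words)

X = np.array([bow_vector(p) for p in patches])

# 3. Train a classifier on children whose two-year language
#    outcomes are known (random placeholder labels here).
y = rng.integers(0, 2, size=n_children)
clf = SVC(kernel="linear").fit(X, y)
pred = clf.predict(X)
```

Each child's scan is thus reduced to a fixed-length vector regardless of brain size or scan length, which is what makes a standard classifier such as an SVM applicable.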
The study included 44 infants and toddlers between the ages of 8 months and 67 months. Twenty-three of the children were hearing impaired and underwent auditory exams and functional MRIs prior to cochlear implant surgery. Twenty-one children had normal hearing and participated in the study as control subjects, undergoing standardized hearing, speech and cognition tests.
Two years after cochlear implant surgery, language performance was measured in the implant recipients; these scores served as the gold-standard benchmark for the computational analysis.
The authors report that during pre-surgical testing they used two types of auditory stimuli designed to stimulate blood flow and related activity in different areas of the brain: natural-language speech and narrow-band noise tones. After analyzing functional MRI data from the pre-surgery auditory tests and the two-year post-surgery language tests, the researchers determined that the brain activation patterns evoked by natural-language speech have greater predictive ability.
Note: Material may have been edited for length and content. For further information, please contact the cited source.
Lu LJ et al. A semi-supervised Support Vector Machine model for predicting the language outcomes following cochlear implantation based on pre-implant brain fMRI imaging. Brain and Behavior, published online October 12, 2015. doi: 10.1002/brb3.391