
Before symptoms appear, AI can spot early signs of Alzheimer's in speech patterns.

New technologies that record subtle changes in a patient's voice may help physicians diagnose cognitive impairment and Alzheimer's disease before symptoms appear, according to a UT Southwestern Medical Center researcher who led a study published in the Alzheimer's Association journal Diagnosis, Assessment & Disease Monitoring.

“Our focus was on identifying subtle language and audio changes that are present in the very early stages of Alzheimer’s disease but not easily recognizable by family members or an individual’s primary care physician,” said Ihab Hajjar, M.D., Professor of Neurology at UT Southwestern’s Peter O’Donnell Jr. Brain Institute.

The researchers used advanced machine learning and natural language processing (NLP) methods to analyze the speech patterns of 206 participants, 114 of whom had mild cognitive impairment and 92 of whom did not. They then linked these results to commonly used biomarkers to assess how well the voice measures could gauge impairment.

Study participants, who were enrolled in a research program at Emory University in Atlanta, completed a number of standard cognitive tests before recording a spontaneous 1- to 2-minute description of a piece of artwork.

“The recorded descriptions of the picture provided us with an approximation of conversational abilities that we could study via artificial intelligence to determine speech motor control, idea density, grammatical complexity, and other speech features,” Dr. Hajjar said.

To determine how well the digital voice biomarkers identified the status and progression of both mild cognitive impairment and Alzheimer's disease, the research team compared the subjects' speech analytics to their cerebrospinal fluid samples and MRI scans.
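The study's actual pipeline is not published in this article, but the kinds of transcript-level measures it describes can be illustrated with a minimal sketch. The snippet below computes simple stand-ins for such features from a transcript: type-token ratio (a rough proxy for lexical richness), mean sentence length (a rough proxy for grammatical complexity), and a filler-word rate. The feature names and the filler list are assumptions for illustration, not the researchers' method.

```python
import re

def transcript_features(transcript: str) -> dict:
    """Compute simple text-based proxies for speech features.

    Illustrative only: these hand-rolled metrics are common,
    simplified stand-ins for measures like idea density and
    grammatical complexity; the study used more advanced
    machine-learning and NLP methods.
    """
    # Split into sentences on terminal punctuation, dropping empties.
    sentences = [s for s in re.split(r"[.!?]+", transcript) if s.strip()]
    # Lowercased word tokens (letters and apostrophes only).
    words = re.findall(r"[a-zA-Z']+", transcript.lower())
    fillers = {"um", "uh", "er"}  # hypothetical filler-word list
    n_words = len(words)
    return {
        "n_words": n_words,
        # Unique words / total words: a crude lexical-richness proxy.
        "type_token_ratio": len(set(words)) / n_words if n_words else 0.0,
        # Words per sentence: a crude grammatical-complexity proxy.
        "mean_sentence_length": n_words / len(sentences) if sentences else 0.0,
        # Fraction of tokens that are fillers (hesitation markers).
        "filler_rate": sum(w in fillers for w in words) / n_words if n_words else 0.0,
    }

# Example on a short picture-description transcript:
features = transcript_features("The boy is reaching for a cookie. Um, the jar is falling.")
print(features)
```

In a real screening tool, features like these would be fed, alongside acoustic measures, into a classifier trained against biomarker-confirmed labels.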
“Prior to the development of machine learning and NLP, the detailed study of speech patterns in patients was extremely labor intensive and often not successful because the changes in the early stages are frequently undetectable to the human ear,” Dr. Hajjar said. “This novel method of testing performed well in detecting those with mild cognitive impairment and more specifically in identifying patients with evidence of Alzheimer’s disease – even when it cannot be easily detected using standard cognitive assessments.”

Recording a patient's voice took the researchers less than 10 minutes, whereas administering conventional neuropsychological tests often takes several hours.

“If confirmed with larger studies, the use of artificial intelligence and machine learning to study vocal recordings could provide primary care providers with an easy-to-perform screening tool for at-risk individuals,” Dr. Hajjar said. “Earlier diagnoses would give patients and families more time to plan for the future and give clinicians greater flexibility in recommending promising lifestyle interventions.”

By Awanish Kumar

I keep abreast of the latest technological developments to bring you unfiltered information about gadgets.

