Interpreting Alzheimer Speech Part 2: Artificial Intelligence Devices?

CC0 geralt@pixabay

Interpreting Alzheimer Speech Part 1 was about trying to understand “garbled” communications from a person with dementia, especially as speech becomes progressively impaired. Part 2 looks at ideas for using artificial intelligence (AI) and machine learning in this field.

Technologies with human-interface programs are making machines more “conversational.” Right, Alexa? Right, Siri? Right, Google Assistant? Some feel such devices may be helpful in aspects of caregiving. Specific sub-programs have been developed for dementia, but they still require a certain amount of cognitive skill, speech elaboration, and comprehension from the user, so there are problems and limitations.

On May 24, 2021, The New Yorker published “Robots… for the Old and Lonely,” personal perspectives on this subject. It’s a fascinating account, written by K. Engelhart, of robots meeting some emotional needs, as expressed by cognitively intact folks [clicking on the link may take you there].

One striking quote: “My last husband was a robot, but he wasn’t as good as her,” Deanna said, with a thin smile [she’s an 80-something who lives alone, describing her desktop robot]. “I know she can’t feel emotions, but that’s O.K. I feel enough for the both of us.” Not only is Deanna speaking elegantly, with fine diction and cognitive subtlety, but she’s a comedian too!

Persons living with dementia (PLWD) are not usually as adept as Deanna. And let’s set aside the everyday issue of important meanings carried in emotion-laden, non-verbal vocal communication. Within families (and organizations), misunderstandings seem to be the fodder of much cinematic comedy and/or angst (“What we’ve got here is failure to communicate”).

Language researchers have used AI and statistical methods to study texts ranging from Shakespeare (could he have had a ghostwriter?) to the Dead Sea Scrolls, to even earlier texts. But interpreting live speech has been harder.
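
For flavor, here is a minimal sketch (Python, with invented snippet text, not real corpus data) of the kind of statistical comparison behind such authorship studies: tallying how often each text leans on common “function words.”

```python
# Minimal stylometry sketch: compare two texts by their function-word
# frequencies, a classic statistical cue in authorship studies.
# The sample texts are invented placeholders, not real corpus data.
from collections import Counter

FUNCTION_WORDS = ["the", "and", "of", "to", "in", "that", "it", "with"]

def function_word_profile(text: str) -> list[float]:
    """Relative frequency of each function word in the text."""
    words = text.lower().split()
    counts = Counter(words)
    total = len(words) or 1
    return [counts[w] / total for w in FUNCTION_WORDS]

def profile_distance(a: list[float], b: list[float]) -> float:
    """Manhattan distance between profiles; smaller suggests similar habits."""
    return sum(abs(x - y) for x, y in zip(a, b))

text_a = "the play within the play is the thing that catches the king"
text_b = "to be or not to be that is the question"
print(profile_distance(function_word_profile(text_a),
                       function_word_profile(text_b)))
```

Real stylometry uses hundreds of features and far more careful statistics; this only shows the shape of the computation.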

Let’s start with Google Translate, which evidently began when a Google founder received this machine translation of an email in an Asian language: “The sliced raw fish shoes it wishes. Google green onion thing!” Remind you of “Colorless green ideas sleep furiously”? Yes, that’s the grammatically correct but meaningless sentence the linguist Noam Chomsky used to make a point in his book Syntactic Structures (1957).

Google Translate technology is fascinating, especially its development. The app found on most smartphones can do limited live speech interpreting. A Spanish programmer evidently adapted it to accommodate Alzheimer speech in 2018, but reviews and follow-up are hard to find. For the curious, a minimal transcription sketch appears below.
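
For readers who want to tinker, here is that sketch in Python. It assumes the third-party SpeechRecognition package and a hypothetical WAV file; it is emphatically not the Alzheimer adaptation mentioned above.

```python
# Minimal speech-to-text sketch using the third-party SpeechRecognition
# package (pip install SpeechRecognition), which wraps Google's free
# web speech API. The audio file name below is a placeholder.
import speech_recognition as sr

recognizer = sr.Recognizer()

# Load a short WAV clip (hypothetical file) and capture its audio data.
with sr.AudioFile("sample_utterance.wav") as source:
    audio = recognizer.record(source)

try:
    text = recognizer.recognize_google(audio, language="en-US")
    print("Heard:", text)
except sr.UnknownValueError:
    # Heavily impaired or garbled speech often lands here: the
    # recognizer simply cannot produce a transcript at all.
    print("Speech was unintelligible to the recognizer.")
except sr.RequestError as err:
    print("Could not reach the recognition service:", err)
```

The UnknownValueError branch is the crux for our purposes: today’s off-the-shelf recognizers tend to give up on exactly the speech we most want interpreted.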

Some researchers use AI to detect changes in speech patterns as a person exhibits cognitive decline (Engelhart, above, mentions a Canadian robot). There are already enough articles for a 2020 review paper on early or “automatic detection” of dementia… perhaps another smart-assistant privacy issue?
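
To make the idea concrete, here is a toy sketch (not any specific published system) of two speech features that recur in this detection literature: lexical diversity and filler rate, computed from a transcript.

```python
# Toy illustration of "automatic detection" features drawn from a
# speech transcript; not any specific published system.
import re

FILLERS = {"um", "uh", "er", "ah"}  # deliberately simplistic list

def speech_features(transcript: str) -> dict[str, float]:
    """Two features common in the literature: lexical diversity
    (type-token ratio) and the fraction of filler words."""
    words = re.findall(r"[a-z']+", transcript.lower())
    if not words:
        return {"type_token_ratio": 0.0, "filler_rate": 0.0}
    return {
        "type_token_ratio": len(set(words)) / len(words),
        "filler_rate": sum(w in FILLERS for w in words) / len(words),
    }

sample = "um I went to the the store and um bought some some things"
print(speech_features(sample))
# Falling diversity and rising filler rates over time *may* flag
# decline; real systems combine dozens of features in a classifier.
```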

A January 2021 review paper of nursing-staff communication techniques for PLWD did not find anything that works better than individualized adaptations. Another 2021 review, specifically of technology and PLWD, concludes that the field is still in an “explorative phase.”

A very comprehensive review of 20 years of published research, along with considerable editorial context, was published last year. It displays the variability in the field but calls it promising. A more conceptual review was also published last year.

Augmentative and Alternative Communication (AAC) devices assist folks with language impairments; a 2019 review covers them from an engineering perspective.

At USC, the Signal Analysis and Interpretation Laboratory, led by Shrikanth Narayanan, uses AI to study speech, along with gestures and other cues, in an engineering approach to communication issues for persons with conditions such as autism spectrum disorder. But they don’t seem to have a current dementia project!

CaregivingOldGuy was hoping to find some AI approach to the seemingly random phonemes produced in AD vocal output, as if some residual meaning were still detectably coded in them. Too many expectations, perhaps. Still, a toy version of that hunt is sketched below.
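
In that spirit, here is a hedged sketch of one way such a search might start: compare the bigram entropy of a phoneme stream with a shuffled copy of itself, since lower entropy in the real stream would hint at sequential structure. The phoneme data below is invented.

```python
# Toy "residual structure" probe on a phoneme stream (invented data):
# if the real sequence's bigram entropy is lower than a shuffled copy's,
# the stream is less random than it sounds.
import math
import random
from collections import Counter

def bigram_entropy(seq: list[str]) -> float:
    """Shannon entropy (bits) of the sequence's bigram distribution."""
    bigrams = list(zip(seq, seq[1:]))
    if not bigrams:
        return 0.0
    counts = Counter(bigrams)
    total = len(bigrams)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical phoneme transcription of impaired vocal output.
phonemes = "b a b a m a m a d a d a b a m a".split()

shuffled = phonemes[:]
random.shuffle(shuffled)

print("real    :", round(bigram_entropy(phonemes), 3))
print("shuffled:", round(bigram_entropy(shuffled), 3))
# A consistently lower "real" entropy across many samples would hint
# at structure worth chasing; a single run like this proves nothing.
```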

So, plenty of folks are working on it, but nothing is off the shelf yet; those with the most communication impairment will continue to need caregiving with empathy, “understanding,” substituted judgment, and ersatz “mind-reading.”

CC0 pixabay