Will simple absent-mindedness soon be classified as pre-Alzheimer’s?
I’m not one to worry much about whether or when my brain is going to turn to mush. I’ve made it this far with a middling IQ and sputtering synapses, so I figure I’ll adjust to whatever intellectual challenges await me. As I remarked to My Lovely Wife the other day when I couldn’t quite remember our daughter’s birthday, “It’s just going to get more entertaining.”
What I do worry about, though, is the effect ongoing dementia research may have on my geezer compatriots as they stumble into the life stage when Alzheimer’s begins to make its presence known. Each new study seems to promise either new tools to predict the disease or various strategies to slow — or even stop — its advance. The sheer volume of information, much of it conflicting and tentative, can be brain-numbing and may inspire folks to make decisions they later regret.
While much of this research invites skepticism, I am particularly struck by the work of some Canadian neuroscientists who say they have managed to bundle a screening test with a deceptively simple cure.
Scientists have been struggling for years to identify the biomarkers that would lead to early-stage diagnosis of Alzheimer’s, so when Patrick McGeer, PhD, and his team at the University of British Columbia in 2015 revealed a simple saliva test that seemed to do the trick, it caused some commotion.
“Though research is still in its infancy,” CNN reported, “the saliva test represents the exciting future of diagnostic tools in development for the detection of the neurodegenerative disease.”
Dean Sherzai, MD, PhD, director of the Alzheimer’s Disease Prevention Program at Cedars-Sinai in Los Angeles, called the results “extremely preliminary,” citing the small sample size — fewer than 100 people. And Shraddha Sapkota, PhD, a member of McGeer’s research team, admitted that the study needed to be replicated in a larger population. Nothing I’ve seen since then suggests that has occurred.
Meanwhile, that same team last month declared that it now not only has the tools to tell you — at any age — whether you’re likely to develop the disease, but it can also prescribe a cure: a daily dose of ibuprofen.
“Our discovery is a game-changer,” McGeer boasts in a statement. “We now have a simple test that can indicate if a person is fated to develop Alzheimer’s disease long before it begins to develop. Individuals can prevent that from happening through a simple solution that requires no prescription or visit to a doctor. This is a true breakthrough since it points in a direction where AD can eventually be eliminated.”
These developments raise a couple of intriguing questions: First, would you really want to know at the age of, say, 55, that your brain is going to gradually turn into mush? And, second, would you welcome that early diagnosis if you also knew the only possible way to prevent your brain from turning into mush would involve taking a couple of Advil each day for the rest of your life — a protocol the FDA recently warned may lead to heart attacks and strokes?
Life is full of tough choices, but still . . .
McGeer’s theories may be every bit as sound as any of the others making the rounds of peer-reviewed journals these days. Indeed, the Alzheimer’s Association last week released a report promoting a new research framework for scientists: Stop looking for symptoms and start focusing on biomarkers that may predict the disease.
This determination to diagnose and begin treating Alzheimer’s in its earliest stages is certainly shared by many in the field. But predicting the disease seems to defy all efforts at scientific consensus. Researchers have pointed to everything from an unawareness of memory problems and a poor sense of balance to impaired speech processing and an unfortunate cellular malfunction. Amyloid plaque may be the problem, except when it isn’t; inflammation destroys neurons, except when it doesn’t. (My colleague Michael Dregni explored these and other mysteries more elegantly than I ever could in this Experience Life piece from last year.)
And even if you could predict your dementia destiny, the question still remains: With no proven cure for the disease, would you really want to know? I’m not a big fan of disease screening of any sort, to be honest, but I can certainly understand why folks my age might want to start putting their affairs in order if they were convinced Alzheimer’s was going to be a definite part of their future.
Trouble is, before actual symptoms appear you can never be entirely certain they’re going to arrive; and even if they do arrive, they may not lead to dementia. As consumer advocate Hilda Bastian, a founder of the Cochrane Collaboration, notes in Scientific American, only 20 percent to 40 percent of those with mild cognitive impairment will develop the disease — and as many as 40 percent of those will see their cognition return to normal.
Early detection, Bastian argues, threatens to stigmatize simple forgetfulness as a sign of mental incapacity, triggering a “panic-blame-hype” cycle that does little to serve those trapped by the diagnosis. “The push to redefine cognitive problems as ‘predementia’ is part of a broader problem of expanding disease by creating ‘prediseases,’” she writes. “This variant, though, is one of the worst.”
My late father-in-law had a seemingly endless supply of war stories, which he shared with anyone willing to listen. When he gradually began to lose bits and pieces of the narrative, he knew something was up, but it never dampened his spirits or peeled away any of the layers of optimism that enclosed his daily life. His inexorable slide into Alzheimer’s continued throughout his 70s, and by the end the words that came out of his mouth were no longer related to recognizable ideas.
It’s a terrifying condition, especially for caregivers. But I suspect it would’ve been worse for him — and his loved ones — had we all known a decade earlier how things would end up. Life is best lived moment by moment, after all, without fears of the future or regrets of the past.
So, I’m determined to stay blissfully clueless about the future functionality of my gray matter, despite the ongoing dementia chatter in the scientific community. Maybe my mind will remain relatively lucid for as many years as I have left; maybe I’ll end up like my father-in-law. One thing is certain, though: I’m going to make a note about my daughter’s birthday in my calendar. I may be sanguine, but I’m not stupid.