
Neuroethics: Whose Mind Is It Anyway?

If you lost a limb, would you think twice about getting a mechanical prosthesis that could be operated with your thoughts? If you lost hearing or vision, you might want a cochlear implant or a visual prosthetic. What if you lost a chunk of memory, or lost your ability to tell reality from delusion, or were burdened with a relentless psychic trauma? Would you want a neuroprosthesis to restore (or enhance) brain function, or a smart pill to boost memory, or a microchip implanted in your brain?

From Frankenstein to The Terminator, stories abound that show how man's creative and innovative attempts to overcome his limitations through technology, no matter how well-intentioned, have the potential to backfire. Destructive consequences can issue as readily from acts of altruism as from those born of evil. Far more often, sadly, misfortune is the unintended result of hasty, shallowly considered action.

In a recent post, entitled "Neurotechnology: Science Fiction or Applied Science?" (Part 1), we noted several emerging neurotechnologies primed to transform the diagnosis and management of neurological and mental illnesses, yet also poised to challenge society's ethical norms and standards. Reading a person's thoughts, implanting machinery into man, and augmenting our neural processing powers with cognitive enhancers are all matters of neuroethics. They bring us face to face with questions about who has access to powerful new technologies and for what purposes. Neuroethics, however, has few laws, uncharted territory, and no sheriff.

Let's consider memory-enhancing treatments. Those with Alzheimer's disease or traumatic brain injury (TBI) need medications or other neurotechnologies to restore lost brain functions. Those scarred by psychological trauma seek a medicine or procedure to erase the disabling memories that sear into the brain after a disaster, in combat, or from torture or abuse. But beyond such clearly defined therapeutic applications, imagine a military or global corporation seeking drugs or devices that sharpen focus and turbocharge mental operations. What is fair play -- and who decides?

There already is a growing market for off-label "cogniceuticals" -- drugs that promise to "fix" mediocre mental performance or mood. In the 2011 movie Limitless (see Lloyd's HuffPost piece "Limitless: Would the FDA Approve?"), Bradley Cooper's character achieves bountiful success from the fictional drug NZT, but not without near-disastrous consequences for those who take it, and for society at large.

Should doctors prescribe cognitive enhancers like NZT to boost a healthy brain, or implant artificial neurons or stem cells to further the performance of the healthy? Even before doctors get their chance, will the FDA be able to tell the difference between snake oil and the next magic bullet? What about those individuals who actually have an impairment but cannot afford treatment? Will there be a black market?

Ethicists consider these quandaries in terms of "distributive justice," or the principles used to guide the allocation of resources and services in society. What will it mean if market forces of supply and demand determine who will get what is needed and who will not? Will rationing of health care include questions of who should have access to a genius-level I.Q. or be spared the mental agony of trauma?

Other applications of neurotechnology tempt us: What about brain scans that could assess a defendant's veracity? Might there be a scientific method to prove the insanity defense? Today's cutting-edge DNA fingerprinting already gives us an example of an advanced technology that's rife with legal ambiguities.

And what about your mind being read in your local shopping center or global market? Product "neuromarketing" -- corporate consumer psychoanalysis, if you will -- has advertisers using functional brain scanning to determine a buyer's unconscious reactions to the latest widget. The subliminal pitch that results clinches the deal. A Nielsen brain scan? Yes, and yours for only $299 from Emotiv, which already retails this neurotechnology to consumers. What should be the responsible conduct of researchers on retainer to global corporations or even to rogue governments -- and who will judge?

There is also the matter of an individual's privacy. Imagine if there were spyware that downloads your brain's account as if it were a Web browser's history. Figuring out how the mind works, reading the brain's thoughts, and determining future intentions are the ingredients of social control. This is not that far-fetched an idea.

This brings us, inescapably, to questions of human responsibility, personhood, and individual agency. In his 2005 book The Singularity Is Near: When Humans Transcend Biology, computer scientist Ray Kurzweil described how man and machine will enmesh: computer and neural technologies intermixed with the "plastic" human brain to further human evolution. This is the premise of transhumanism.

Neural implants, from computer chips to optic probes, are already used for epilepsy and Parkinson's disease. Deep brain stimulators are being studied for the treatment of obsessive-compulsive disorder, refractory depression, Tourette's syndrome, and addictions, and even to curb obesity. Brain-computer interfaces are in development to "fix" the dysfunctional neural circuits that produce human psychopathology and suffering. As cerebral implant technologies reprogram the brain, will they remake our identity? Will we manufacture the Cylons of Battlestar Galactica? The HAL of 2001: A Space Odyssey? This is Asimov's "Frankenstein complex," where we fear (yet create) the mechanical man.

We know by now that the hubris of science will not be stopped. The allure is too great, the benefits too seductive, and the rewards potentially astronomical -- all making the risks easy to dismiss. But risk must always be weighed against benefit. Progress, as has been said before, is science's "most important product." Yet means of mitigating risk can be built in as an equal part of our forward motion. There are counterforces just as powerful: human kindness, altruism, objectivity, and defiance against the myriad forces bent on manipulation and control. These are the old-fashioned protections that, if not lost, can save us from a future where the machines win.

The opinions expressed here are solely those of Drs. Erlich and Sederer, as physicians and public health advocates. Neither receives support from any pharmaceutical or medical device company.

Visit Dr. Sederer's website for questions you want answered, reviews, and stories.

Originally published in The Huffington Post on April 23, 2012.