Military Mind Wars

How neuroscience research can inform military counterintelligence tactics, and the moral responsibilities that accompany such research

By Jonathan D. Moreno | November 1, 2012

Think of a question or technology of interest to neuroscience and there is an application with military or counterintelligence potential. Brain-machine interfacing can make drones (unmanned vehicles) more efficient; anti-sleep medication could prevent combat errors by men and women at war; calmative aerosols could defuse a tense hostage situation; and new imaging devices could improve detection of deception, while a boost of certain natural neurohormones could aid in interrogation.

Earlier this year I published an update of my 2006 book Mind Wars: Brain Research and National Defense, in which I had described the implications of neuroscience for military and counterintelligence technologies. The publication of the new version, Mind Wars: Brain Science and the Military in the 21st Century, was more than justified both by the favorable reception the first edition enjoyed (surprisingly, it is still the only book on the topic), and the subsequent burst of interest in the issues raised in the first book. Both within the neuroscience community and among various governmental and nongovernmental policy organizations, the potential applications of brain research to national security are no longer being ignored.

Among neuroscientists there has been a lively discussion about scientists’ responsibility for the ultimate purposes to which their work is put. Of course, this quandary is not limited to neuroscientists, but spans a wide range of disciplines, highlighted by the recent lab-evolved strains of the H5N1 avian flu virus that were transmissible between ferrets. (See “Deliberating Over Danger,” The Scientist, April 2012.) An analogy to the early atomic physicists immediately comes to mind: the theoretical and experimental work of leading physics researchers led to the development of one of the most destructive weapons mankind has ever known. But unlike the bomb itself, which was developed for an explicit purpose under the pressures of national security, discoveries in neuroscience and other life science fields are plagued by a more challenging philosophical problem: how can we hold an individual scientist morally responsible for the ultimate applications of her or his work when such applications are often difficult to predict?

A classic example is Einstein’s 1905 paper on special relativity, which formed the theoretical basis for the subsequent work on the atomic bomb. Should Einstein therefore be held responsible for the devastation at Hiroshima and Nagasaki? Although the distinction between “basic” and “applied” science is fuzzy at the edges, drawing such a line in the sand may be necessary in order to determine direct ethical responsibility for the application of one’s work. (Note: His later regrets notwithstanding, Einstein himself was prepared to support the bomb project for fear that Germany would get there first.)

Some within the neuroscience community have urged that their colleagues simply pledge not to work on projects that could be of interest to the national security establishment. But one practical problem with this proposal is that researchers don’t always know the precise source, let alone the ultimate purpose, of the funds associated with a request for proposals. In the 1950s, for example, the CIA created a front organization called the Society for the Study of Human Ecology that funded research on hallucinogens and other “brainwashing” experiments. Researchers funded by this Society never knew the CIA was behind it.

And then there is the obvious fact that not all neuroscientists will agree that doing work for a national security agency poses ethical problems. Some may consider it an act of patriotism—for example, some scientists may view their work with such agencies as central to the fight against theocratic regimes—or simply of professional self-interest. Finally, if neuroscientists worried about the unpredictable uses their work might be put to, what it might mean in practice is refusing to accept funding from certain government agencies altogether.

All this might seem like a lot of rationalization were it not for the fact that I am a philosopher and historian, not a neuroscientist, so in that sense I have no dog in this hunt. However, there is one related effort that I can see no reasonable objection to neuroscientists’ vigorous involvement in: the formulation of international conventions to establish limits on the use of neuroscience-based innovations in international conflict. Some drugs are already covered by the treaties against chemical and biological weapons, and international human rights law generically prohibits many practices that exploit knowledge of the brain and central nervous system. But, for example, the convergence of neuroscience with cyber- and drone technology raises new questions that have not been previously contemplated.

Neuroscience is moving along swiftly, and experimental applications are proceeding at a dizzying pace. But it can take decades for sovereign states to reach binding multilateral agreements. Given the consequences of having been slow to implement rules governing the use of atomic weapons, future generations might well wonder: What are we waiting for now?

Jonathan D. Moreno is the David and Lyn Silfen University Professor at the University of Pennsylvania and author of Mind Wars: Brain Science and the Military in the 21st Century.
 

Comments

November 13, 2012

Jonathan Moreno is clearly disingenuous when he argues (Military Mind Wars, 11.2012, 25) that the ethics of military neuroscientists is as uncontroversial as Einstein’s was in the production of his 1905 paper on special relativity. Einstein’s work was theoretical and unsupported financially; its later weaponizing raised ethical issues for him and for others, and it was not foreseen in 1905.

The work Moreno analogizes to it is not simply theoretical but perceived as applicable, and it is funded handsomely by the military for that reason. Einstein did not foresee the bomb in 1905; neuroscientists under military contract work with the weapon in mind.

Nor is Moreno correct when he says that as a philosopher he is without a stake in the debate. A PhD in philosophy, or a career in bioethics, is no guarantee against bias and influence. This is especially true of those hyping and promoting their books in a field (for example, the military use of neuroscience) that is ethically contentious and at best fraught with moral uncertainties.

Tom Koch, PhD
Toronto, Canada.

Mary Finelli

November 13, 2012

Well said, Dr. Koch. Thank you!!

Kathy Barker

November 14, 2012

 "Finally, if neuroscientists worried about the unpredictable uses their work might be put to, what it might mean in practice is refusing to accept funding from certain government agencies altogether."

That's right. That is one possible action if the ethical use of scientific discoveries bothers one. People of courage and conviction really can take extreme stands to stop wrongs. But there is so much more one can do, and this is really a flimsy and very sad excuse for scientists, and ethicists, to not make moral and ethical decisions about their funding and the use of their research.

Kathy Barker

November 14, 2012

"The publication of the new version, Mind Wars: Brain Science and the Military in the 21st Century, was more than justified both by the favorable reception the first edition enjoyed (surprisingly, it is still the only book on the topic), and the subsequent burst of interest in the issues raised in the first book."

Wow, popular mandate! I had to do it! Where's the ethics?

Ed M.

November 14, 2012

"All tools can be weapons,

Not all weapons can be tools,

Unless Death is considered

Well-built for fools."

 

 
