Opinion: The Dark Side of Science

Scientists are responsible for the foreseeable consequences of their research—good and bad.

By Heather E. Douglas | November 16, 2011

Image credit: Flickr, Glen Edelson

Within the burgeoning field of synthetic biology, teams of biologists and engineers are making great strides in understanding the cell and its functioning. (See The Scientist's recent feature on the topic.) However, there is more to discuss than the triumphs. There are also the dark purposes to which science (and synthetic biology in particular) can be put. Worries range from the development of pathogenic bioweapons to the potential contamination of native gene pools in our environment. The question is: are scientists responsible for the potentially negative impacts of their work?

Some have argued that the answer to this question is no—that it is not researchers' responsibility how science gets used in society. But that is sophistry. Scientists are responsible both for the impacts they intend and for some of the impacts they do not intend, if those are readily foreseeable in specific detail. These are the standards to which we are all held as moral agents. If I were to negligently throw a used match into a dry field (merely because I wanted to dispose of it), for example, I would be responsible for the resulting wildfire. In contrast, Einstein was not responsible for the use of his E=mc^2 equation to build an atomic bomb, or for its use in wartime, though the scientists at Los Alamos were.

Of course, impacts (whether harmful or beneficial) are not solely scientists' responsibility—others involved will also bear responsibility for their actions. If scientific knowledge is used in a biological attack, the terrorists are first and foremost responsible for their heinous act. But the researchers who generated the knowledge may also be partly responsible. Consider, for example, the knowledge of how to build a virus like smallpox from the ground up, or how to create other pathogenic, tailored organisms—targeted either at humans or at the foods on which we depend. If it is readily foreseeable that such knowledge could be used for nefarious purposes, the scientists who introduce such new technological capacities are partially responsible for an attack that could ultimately cause millions of deaths.

Scientists can no longer hope naively that people will only use science for the public good.  The world will always have the mentally unbalanced, the delusional, the vicious, and the sociopathic members of society, some of whom will also be intelligent enough to use the results of science.  Recognizing this should be part of the everyday backdrop of science, the assessment of its potential, and the desirability of the pursuit of a particular project.

As scientists plumb the depths of the cell, they must be particularly cognizant of the potentially harmful uses of their work, in addition to all its intended benefits.  For example, knowledge of how to generate specific strings of nucleotides with high precision greatly aids research by providing particular and accurate DNA sequences with which scientists can assess cell functioning and design new living systems.  But such knowledge can also produce the raw materials for building known pathogens from scratch, as has already been done (for research purposes) with the polio virus and the Spanish flu virus.  As scientists develop ways to generate sequences of base-pairs ever more cheaply and efficiently, the opportunity for the malicious or the simply unreflective to play with pathogens to see what kind of traits arise looms larger.  And it is not just technological know-how that can be problematic.  The detailed knowledge of cellular or genetic functioning can have worrisome implications as well.  Knowledge of what makes a virus more transmissible can assist us in detecting when a virus might be more prone to producing an epidemic, but it could also be used to make viruses more virulent.

In sum, scientists are responsible for both what they intend to achieve and that which is readily foreseeable, as we all are.  There is nothing inherent in becoming a scientist that removes this burden of responsibility.  The burden can be shared—scientists can come together to decide how to proceed, or ask for input from ethicists, social scientists, even the broader public.  Alternatively, scientists could decide (and it has been proposed) that some forms of regulation—either in the selection of projects or in the control and dissemination of results—be imposed on the field of synthetic biology, to reduce the risks.  The more oversight scientists submit to, the less responsibility they bear, but it comes at the cost of the freedom to choose the type of work they can do and how they do it.  This is the essential tension:  as long as there is freedom of research, there is the responsibility that comes with it.

Heather E. Douglas is the Waterloo Chair in Science and Society at the University of Waterloo.  She earned her PhD in History and Philosophy of Science in 1998 from the University of Pittsburgh, where she is currently a visiting associate professor.  Her book, Science, Policy, and the Value-Free Ideal, published in 2009, examines the moral responsibilities of scientists. 

Comments

maurizio | November 16, 2011

Please do mind your language... what is the meaning of "readily" in "readily foreseeable"? I'd rather opt for "obviously foreseeable," where "obvious" is in the sense used in patent law.

Anything non-obvious is...obviously not something the scientist should have thought of as a matter of course.

heatherEdouglas | November 16, 2011

Exactly. I am happy with the amendment.

CraigLeslie | November 16, 2011

In today's global electronic marketplace of ideas, the concept of regulation just seems silly. Asking scientists and their peers to bear responsibility for the darker uses of their research, in an era where retractions following peer review are becoming increasingly common, doesn't make sense. Perhaps a more robust peer review process that included discussion of the nefarious potentials might expose issues earlier. But then you might have to mandate scoundrels on the review boards.

Daniel Dvorkin | November 16, 2011

I'm curious as to whether the author is willing to apply her arguments to herself.  If a politician uses the contents of the article as justification for shutting down, say, a useful line of infectious disease research (a consequence which I'd say is certainly foreseeable) does the person who wrote those words bear responsibility?  Should Dr. Douglas have censored herself to prevent such an outcome?  Or do only scientists have the responsibility to withhold dangerous information, with those who observe and comment on science being immune?

Daniel Dvorkin | November 16, 2011

Well, I'm less concerned than you are about biological warfare -- it's a danger, obviously, and we should be prepared for it, but speaking both as a medic and as a scientist I'm more worried about attacks from nature than from nurture, if you see what I mean.  Research funds are limited, and we should IMO concentrate on diseases which cause widespread morbidity and mortality right now in preference to those which might possibly be used as weapons in the future.  But these areas of research are by no means mutually exclusive; it's all important, and none of it could proceed under the author's proposed restrictions.

Eric_AEG | November 16, 2011

The arguments made are old and deal with the foundation of scientific exploration and human advancement. With any discovery there are potential pluses and minuses. The more powerful the discovery, the greater the chance of misuse. However, if blind restraints are placed on discovery (and this includes limiting the sharing of knowledge), the result will be to limit the chance of achieving true advancement. Instead, the issue should be that society is responsible for how it seeks to apply knowledge. A scientist is part of society and thus cannot hide his or her head in the sand. But equally, society in general cannot hide its head. Instead, it is society's responsibility to make itself and its inhabitants the type that do not pursue destruction but instead pursue advancement (presumably due to being happy with existence in the society).

Paul | November 16, 2011

There are currently many examples of things that went wrong in research and were found out, but instead of being permanently shelved were later put to good use, e.g. thalidomide. The field of "History of Science" is vitally important for a knowledgeable public and working cadre.

heatherEdouglas | November 16, 2011

Please note: I do not ask scientists to be responsible for "unknown" potential uses. I am quite clear that uses must be clearly foreseeable. Stay on target, people!

jawex | November 16, 2011

An old and endlessly vapid thesis. We could just run through its logical errors and convenient assumptions, but this is not worth that much time. Just begin by asking, "How much of the future is foreseeable?" Then, let's follow that with, "When and how will you know?"

Brian Hanley | November 16, 2011

Utter nonsense. Your thesis depends upon multiple false premises:

A. That this is the ONLY person who will EVER figure out this piece of science or technology. That is laughable.

B. That all persons who figure out a science or technology are "good" and intend to do "good" according to your idealisms. No, Virginia, not everyone is Santa Claus. You acknowledge this in the past about Russia and the Nazis. But then presume, implicitly, that something has changed in humanity. At the same time the whole point of the article is to somehow withhold from those in the present who want to do "bad" (by your roadmap) information they might need to do so. This position further presumes that only "good" (by your roadmap) people are scientists.

C. That this scientist is endowed with perfect powers to forecast uses, both good and bad, for the science or technology. That therefore, they can weigh the good and bad potentials, and come to a conclusion. That is ridiculous. If scientists could do that, then they would all be billionaires because they could foresee what will happen. Scientists are not able to foresee and weigh good and bad. In fact, you are posting this on the internet. The internet is a technology that was developed for war. That was its foreseen use. So by your thesis, the scientists who worked on it should have systematically withheld information.

D. That this particular horse (bioweapons) is still in the barn. That horse isn't just out of the barn, it's on the next continent, quite literally. You propose to lock the barn and burn it down in order to control the horse that isn't even in it anymore. There are facilities all over the Middle East available for implementing bioweapons with the highest level of technology. There are people there (remember bin Laden was from a very good family and well educated, and that Zawahiri is a physician) who want to use those weapons to kill you. They have said so.

heatherEdouglas | November 16, 2011

My argument depends on none of these things.  

A. It matters not just that a new piece of knowledge arises, but when and in what contexts. Such shifts alter what the foreseeable uses are. As that happens, responsibilities shift as well. That another person may, in another context, discover the same thing is irrelevant to the responsibilities of a person at their own time and place.

B.  If scientists intend harmful things, of course that matters to their responsibilities as well.  I was, for the sake of this particular argument, assuming the general good will of scientists.  If we reject this assumption, we can roundly condemn scientists who intend to do awful things.  It seems that (happily) this is relatively rare.

C.  I do not require scientists to be perfect forecasters.  Scientists are not responsible for what is not foreseeable.  They are responsible for what is foreseeable.  

D.  I have made no specific recommendations on any particular barns to burn down or lock (to borrow your metaphor).  The argument is about responsibilities that scientists have.  How scientists choose to deal with those responsibilities-- individually or collectively-- remains their decision.

George Huang | November 16, 2011

Science per se is defined as a human endeavor to understand the world we live in and to improve our lives. No one in their right mind should use science with the goal of harming lives. Sadly, science has always been in the wrong minds, used more as a tool to harm people than to help them. When it comes to war, scientists can get whatever they need to come up with the most superior weapons. When it comes to understanding the world and helping people, they get nickel-and-dimed.
The wrong minds are those politicians throughout our entire human history. And we approved these wrong minds all along. Sigh.
The CIA and FBI now come up with this ridiculous Dual Use Research Concept. Another showcase of those wrong minds at work.

Brian Hanley | November 16, 2011

And anyone who has obtained patents knows that what is "obvious" is in the mind of the examiner.

Spengler47 | November 16, 2011

Actually, Einstein did share responsibility for the A-bomb. He wrote a letter to Roosevelt encouraging him to build it.

Ted Howard NZ | November 16, 2011

Heather
As soon as Einstein saw the equation E=mc^2, the possibility of a bomb must have occurred to him.
He may have been completely ignorant of the technology involved, but the implication would have been instantaneously obvious.

Any piece of technology can be a weapon.

Look at what was done on 9/11 with domestic air transport devices.

Anything at all, can be used for any purpose at all.

All it takes is a little imagination to turn anything into a weapon.
It really isn't that difficult.

That we haven't yet destroyed ourselves is a clear testament to the moral integrity of many in the systems that exist today. Certainly there are some few who lack any restraint, and they are a minority - a very dangerous minority.

There is enough stuff in any western kitchen to kill thousands of people - yet very few use it in such fashion.

And I agree in a sense that there are some things that are better left undone - AI is one such, at this point in time.   We need to be treating every human being with respect, and ensuring everyone has what they need to survive, and freedom to learn, communicate and travel, before we build any sort of artificial intelligence, otherwise it is going to see us as the greatest threat to its survival, and what happens after that is not likely to be pretty.

And there are no guarantees in life.

We each need to be aware, and responsible.

If we see someone doing something dangerous, we need to respond appropriately - whatever the status or "authority" of that individual.

Rich Engler | November 16, 2011

If we follow your argument, we should never invent anything that could ever cause harm.  Following your match example, not only are you responsible for throwing the match (I agree), but the inventor of the match, and perhaps also the manufacturer and vendor of the match, are also responsible.  By extension, we never should have invented the axe because it is so easily used to harm others.

heatherEdouglas | November 16, 2011

I am not equating legal responsibility with moral responsibility.  The two overlap but are not equivalent.  And yes, science is serendipitous, but I am only suggesting scientists are responsible for foreseeable consequences.

Daniel Dvorkin | November 16, 2011

"to unleash new capabilities without thinking about such issues is what I am arguing is deeply irresponsible"

It would be, if it were actually happening.

Daniel Dvorkin | November 16, 2011

The author seems to think we live in a world of wild, unregulated research in which unlimited time and money are available for latter-day Frankensteins to create monstrosities in their labs, hidden from public view until the horror is unleashed.  In reality, the opposite is true.  The primary ethical concern in biomedical science is with curing disease, saving lives, and reducing suffering -- and progress toward these goals is increasingly hindered by philosophers, theologians, and politicians who inject themselves into a process they refuse to understand.

I can't help but wonder if their remote ancestors during the Paleolithic were rubbing their chins and muttering about the dangers of this new flint-chipping technology.  Of course, once the hand axe was established as part of everyday life, they were happy enough to use it, all the while warning that tying a smaller, sharper piece of flint to the end of a stick was Going Too Far ...

Daniel Dvorkin | November 16, 2011

Nor do you get responsibility without freedom, or you shouldn't.  My point is that "too much freedom" is not a problem that scientists have, or are likely to have any time in the foreseeable future.

Daniel Dvorkin | November 16, 2011

I don't think anyone is arguing that scientists aren't members of society, or that we can't benefit from an understanding of the history and philosophy of science.  What we're saying (or what I'm saying, anyway, and I think a lot of other posters are too) is that we already are very aware of the ethical consequences of our work, and that a workable code of scientific ethics requires an understanding of the problems science _actually_ faces, not the Frankensteinian straw man presented in the article.

Raul Valedor Valencia | November 16, 2011

So should we think of the harm Wikipedia is doing? You can get enough knowledge from there to build a small bomb, or to extract cholera toxin, spider venom, snake venom, etc. Should Wikipedia be held responsible for this?

agelbert | November 16, 2011

The main problem that (ethical) scientists face is that they are EMPLOYEES. They sign corporate boilerplate legalese crapola that leaves them with a very narrow window open for working for the good of mankind, and several cameras looking through a giant window at all their work. Most of the conscienceless greedballs with PhDs looking over the scientist's shoulder will militarize a discovery (i.e., use it to kill humans and/or animals more efficiently), bury any discovery that would make obsolete the use of expensive medical imaging technology, ensure a wonder drug is priced to serve only a few while the rest of humanity goes without, misuse new surveillance technology discoveries to decrease privacy and increase thought control for profit, etc.

It is a bad time for ethical scientists because our entire system is unsustainable. The only legitimate status quo for a human society is one that absorbs as many toxic waste products as it produces. Anything else leads logically to extinction.

In their zeal to externalize the costs of an industrial society onto those without power or influence, the corporations have forgotten what any scientist knows too well: shitting where you eat is a death sentence.

http://www.heatisonline.org/vi...

heatherEdouglas | November 16, 2011

The 1939 letter was written by Szilard and Wigner.  Einstein merely signed it.  And it had little impact on the direction of bomb research. (The NDRC and the MAUD report were far more important.)  But the point is this:  when Einstein conceived of his equation in the first decade of the 20th century, the bomb was not foreseeable, and so he was not responsible via that piece of work.

Balazs | November 16, 2011

I wholly agree. I think it's wrong that scientists and researchers should have to shoulder responsibility for the misuse of their work. If this is the case, then it will greatly disincentivise people from taking up a career in science or research. If a scientist has to constantly worry about the threat of being sued or prosecuted for the potential misuse of their work, then we really have reached the pits of knowledge. If anything, we should be affording scientists more protection, so that they're not afraid to research and study, for in today's litigious society plenty of us have become disillusioned.
So much scientific work has been serendipitous, and uses found that weren't the original intention, yet has provided the public and the world a great benefit. We cannot and should not be discouraging people from undertaking research based on the conjecture that their potential work may be used inappropriately, for it is not possible to foresee the results of research.

heatherEdouglas | November 16, 2011

Recall that scientists are only responsible for the foreseeable impacts and uses.  And if there is such a threat of harm, it is the scientist's responsibility to mitigate it, perhaps by withholding key pieces of information from publication, perhaps by altering what is being researched so it is less amenable to nefarious use.  But to unleash new capabilities without thinking about such issues is what I am arguing is deeply irresponsible.

Brian Hanley | November 16, 2011

Yes. The metaphor is simply wrong. The correct metaphor is to ask whether the alchemist, long ago, who invented the match should have made his invention public. It is obvious to anyone that matches would be used to deliberately set fires. According to this article, he has responsibility for all the arsonists who have used them since.

This article is an outstanding example of fuzzy thinking and mixed metaphors.

heatherEdouglas | November 16, 2011

Of course the primary concerns are about what a scientist intends to achieve. But that does not remove responsibility for what is not intended and clearly foreseeable. That responsibility remains. We all bear it in general, and scientists have no exemption. Whether to proceed in the face of the risks is the choice scientists have, as part of their freedom to do research. You do not get freedom without responsibility.

primativewriter | November 16, 2011

There was a time when many great scientists were Bible-believing persons.

Spengler47 | November 16, 2011

Granted, the atomic bomb was not foreseen at the time Einstein published his equation. Nevertheless, Einstein provided encouragement for the development of the A-bomb when he wrote to Roosevelt. Although Szilard and Wigner authored the letter, Einstein signed it. He lent the prestige of his name to the letter and, therefore, to the resulting research. Doubtless, other people had greater responsibility, but Einstein shared responsibility. Given that other countries were also working on atomic weapons research (Nazi Germany, Soviet Russia), the decision of the US government to pursue such research was probably inevitable. But, if blame is to be assigned, Einstein must share the blame.

Brian Hanley | November 16, 2011

This is, in fact, what happens. Well-meaning people like Ms. Douglas convince politicians to make it so expensive and onerous to work that our ability to function is hugely slowed down.

Slowing down our ability to do research results in:
A. Loss of life, because we cannot move faster.
B. Loss of the ability to respond to a biological attack, because we don't understand it well enough.
C. Allowing the competition, who have cultural mores that are NOT restricted as ours are, to get closer and closer. Eventually, those parties will surpass us. And then we will really be in trouble.

heatherEdouglas | November 16, 2011

Of course I do apply the same arguments to myself.  They are reflexive, and I bear the responsibilities for foreseeable consequences.  Note as well that the arguments in the article do not call for external regulation of science.  However, if scientists are unwilling to bear the burden of responsibility themselves, they need to structure institutions that will do so for them.  

Daniel Dvorkin | November 16, 2011

This.  This, exactly.

I'm particularly disturbed by the author's argument that relevant information should be withheld from publication.  "These are my results but I can't tell you how I got them" isn't science, it's alchemy.

kwinner | November 16, 2011

Weapons technology was built long ago and is continually refined, but, properly, we never hold the builders responsible. Death and injury are the responsibility of the person who uses a weapon, even if they do so in ignorance. If you want people to use technology responsibly, what you want to spend money on is education on ethics and on the physical repercussions of using each technology (explosives blow things apart; viruses spread at a certain rate in certain species; etc.).

Scientists should be held responsible for the direct repercussions of their actions, just like anyone else; in the case of synthetic biology, engineered organisms should be isolated appropriately until there is reliable proof that the organism cannot harm humans or their environment.  But, forcing scientists to shoulder an amorphous responsibility for all the potential uses of their work, known or unknown, would paralyze science.

Kathy Barker | November 16, 2011

I wish I believed that most scientists were aware of the ethical consequences of their work. What I have seen more of is a belief that because they have chosen to work in infectious diseases, or in public health, or in academia, or in other places or fields often assumed to be inherently ethical, no further thought or action is necessary.

For example, there are many ethical decisions in funding. If you were against war, would you accept funding from the DoD? Many scientists do, rationalizing their opposing philosophies to keep the project funded.

"First, do no harm" would be a good place to start in a code of ethics.
Harm means different things to different people- but some guide to counter daily denial and lack of introspection would be grand.

heatherEdouglas | November 16, 2011

Responsibility is about how to apportion praise and blame.  It is a quite clear concept, with a wide range of potential implications, from verbal praise and blame to monetary rewards and fines to more coercive consequences.  The article was merely about how to apportion praise and blame, not what else should follow from that.

Brian Hanley | November 16, 2011

I have been wrestling for years with whether I should write a definitive book on biological warfare. Ms. Douglas has just made up my mind. The book needs to be written. The public needs to know exactly what the threats are, how they work in the ecosystem of war, and exactly what we have to do to deal with them.

Shoving our heads deep into the sand is not it.

Ted Howard NZ | November 16, 2011

Based on this argument, any scientist who works for a social paradigm that includes concepts like patriotism and "respect for authority" ought to be able to figure out that whatever they do will be used, at best, to coercively control someone at some point, and at worst to cause death and mayhem.

On that basis no scientist ought to accept employment in any state run institution, any defense establishment (because they know that the best defense is a strong offense), nor any corporation connected to either.

How many employed scientists do you think there would be?

No.

One has to be responsible for one's own actions.
For some that may mean resignation, or change of employment at some time; and for others it means willful ignorance.

It is not knowledge itself, but the paradigms within which it is organised, that lead to outcomes.   Context is king.

It is what we choose to do with our knowledge, and how we choose to relate it to other knowledge, and how that affects our choices.

Does anyone seriously think that nationalism is there for the benefit of the majority?

Does any scientist seriously believe that any group of individuals is really significantly different from any other group on a genetic basis?

Aren't we all just one species?

Aren't we capable of organising ourselves in ways that ensure the security and freedom of everyone?

Are we not capable of clearly seeing that money is only a measure of exchange value, which value includes a scarcity measure, and as such, does not value abundance at all (consider oxygen, abundant, valuable, but no monetary value)?

Are we not capable of seeing that societal needs require abundance of some basic commodities, and that money and free markets cannot (ever, even in theory) provide such abundance to all?

Money can be a useful tool, but it is a very poor director.

We as scientists need to be responsible for our choices, certainly.

Who or what are you working for, and what is the likely consequence of doing so?

primativewriter | November 16, 2011

I think the author is trying to say what most people say is common sense. Your actions have reactions and you should think about what impact you will have not only on yourself but the world as a whole. My father refused to teach us kids how to make snares because we might kill our own pets. Cause and effect.
As an engineer I see the world is complicated and governed by laws. You break them and, well, even a child knows stuff happens. So think ahead. The reality is you don't always know, but sometimes you should. Life is interconnected.

The Bible simply says "everyone will give an account."

DavidPBeck | November 16, 2011

I suppose what I cannot understand about Dr. Douglas' article is what she means by "bear responsibility." Very many of the misuses of the research results in the area in which I used to work were/are foreseeable, at least in concept. Although I do not doubt that there are "scientists" in governmental or terrorist agencies working on evil science, research that can lead to understanding or contributing to a cure for XYZ disease cannot and should not be abandoned because of the potential for misuse. So for what, exactly, should the scientist be "responsible" in addition to the development of a new diagnostic or therapeutic? Should all the scientists who worked on gene splicing, for example, have bad dreams, or be sentenced to serving as test subjects for the latest WMD? Is "responsibility" some sort of mystical group guilt?

meho0606dj | November 16, 2011

Scientists/researchers are responsible for all aspects of their work. They take great gratitude and joy in all the good outcomes; they can accept responsibility for all the really bad outcomes, whether caused by them or by somebody else. It is their discovery, and they know exactly the negative outcomes and what could be done with their discovery, good or bad, by them or anyone else; they know the potential of each discovery. If you can't accept blame, then give up what you're doing and do something you can live with accepting the blame for.

Brian Hanley | November 16, 2011

Yes, your argument does depend on these undeclared assumptions. You are studiously avoiding dealing with the arguments. You are cherry-picking your data (ignoring the example of the effect of the US bioweapons research shutdown and other matters, and invoking the name of the sainted Einstein).

A. You conflate my point about the implicit presumption that Scientist X in our bailiwick will be the only one to discover something with your follow-on conclusion that by withholding information Scientist X will accomplish good as intended. While these points are related, you do imply that this is the case.

However, if I accept that you do not think Scientist X will be the only one, then your thesis is even more flawed. Because how can Scientist X know if Nefarious Y has also discovered it? And if Nefarious Y has made such a discovery, then won't withholding information definitionally damage the society of Scientist X?

And if, as you seem to be possibly suggesting, Scientist X is only responsible to withhold information for as long as he/she knows nobody else has publicized it, that adds yet another layer of impossibility, perfect information.

B. It appears from your response that you propose that what is awful is an absolute standard? You, a philosopher, suggest that? If so, exactly how far do you carry this? Obviously, by your standard, all the scientists who worked on nuclear weapons would be, in your thesis, intending awful things. Therefore, your contention that intending awfulness is relatively rare is rubbish. By your standards, it is as common as grass. And yet, you say it isn't. But in your article you specifically use the example of nuclear scientists? Eh?

C. Yes, you do require scientists to be perfect forecasters. You cannot get away from this, because according to you, they must know that by withholding information they will do more good than they do by disseminating it. Or, if they don't know it, then you do.

"But", you say, over and over, "they are only responsible for what they can foresee." So maybe you don't actually think they will necessarily do good by withholding. And yet, you tell them to do so.

Dear god. A philosopher, and a chair no less, who has never heard of the road to perdition being paved with good intentions? You have seriously never entertained so much as a smidgen of thought about unintended consequences?

Ah, I can almost hear you say, "Of course I have. But only the foreseeable matters." Silliness. Even the law recognizes responsibility for results of acts that are unintended.

D. Yes you have made recommendations. You have said that what happens with the work of a scientist is on the shoulders of that person. You have specifically mentioned withholding information. And your general position is the foundation upon which the laws that Congress passed to lock the barn door were made. So don't play games. You know exactly what you are pushing on the world. You think it is right, and you think it is so obvious that everyone should agree with you.

Come, Ms. Douglas. You are penning sophistries worthy of cartoons! This is like having a debate with a freshman.

Kathy Barker | November 16, 2011

Wonderful article, puzzling comments. Membership in society doesn't disappear when one becomes a scientist, and it is that part of us that should consider the usefulness of our work.

A history of science class might go a long way towards helping scientists see where their work might be used and misused. With more introspection and discussion about projects, fields of study, funding and career choices, and the use and misuse of science, scientists may be able to look a little further downstream.

Of course there are unintended consequences that go along with amazing discoveries made with the best of intentions. But if they happen, and scientist/citizens don't speak up, then they bear at least the same responsibility that every citizen has to prevent harm.

rusty94114 | November 16, 2011

What does it actually mean to say that scientists are "responsible for the foreseeable consequences of their research"? Does it mean that they should be punished if the consequences are bad? Or does it merely mean that historians should note their willing participation in the causal process leading to those consequences?

The concept of "responsibility" -- nearly as vague as the notion of "sin" -- is easily applied to anything one happens to dislike. It is therefore almost useless as a guide for making decisions in the real world.

Anonymous | November 16, 2011

I wholly agree. I think it's wrong that scientists and researchers should have to shoulder responsibility for the misuse of their work. If this is the case, then it will greatly disincentivise people from pursuing a career in science or research. If a scientist has to constantly worry about the threat of being sued or prosecuted for the potential misuse of their work, then we really have reached the pits of knowledge. If anything, we should be affording scientists more protection, so that they're not afraid to research and study; in today's litigious society, plenty of us have become disillusioned.
So much scientific work has been serendipitous, with uses found that weren't the original intention, yet it has provided the public and the world a great benefit. We cannot and should not discourage people from undertaking research based on the conjecture that their work may be used inappropriately, for it is not possible to foresee the results of research.

Anonymous | November 16, 2011

Of course I do apply the same arguments to myself.  They are reflexive, and I bear the responsibilities for foreseeable consequences.  Note as well that the arguments in the article do not call for external regulation of science.  However, if scientists are unwilling to bear the burden of responsibility themselves, they need to structure institutions that will do so for them.  

Anonymous | November 16, 2011

In today's global electronic marketplace of ideas, the concept of regulation just seems silly. Asking scientists and their peers to bear responsibility for the darker uses of their research in an era where retractions following peer review are becoming increasingly common doesn't make sense. Perhaps a more robust peer review process that included discussion of the nefarious potentials might expose issues earlier. But then you might have to mandate scoundrels on the review boards.

Anonymous | November 16, 2011

Heather
As soon as Einstein saw the equation E=mc^2, the possibility of a bomb must have occurred to him.
He may have been completely ignorant of the technology involved, but the implication would have been instantaneously obvious.

Any piece of technology can be a weapon.

Look at what was done on 9/11 with domestic air transport devices.

Anything at all, can be used for any purpose at all.

All it takes is a little imagination to turn anything into a weapon.
It really isn't that difficult.

That we haven't yet destroyed ourselves is clear testament to the moral integrity of many in the systems that exist today.   Certainly there are some few who lack any restraint, and they are a minority - a very dangerous minority.

There is enough stuff in any western kitchen to kill thousands of people - yet very few use it in such fashion.

And I agree in a sense that there are some things that are better left undone - AI is one such, at this point in time.   We need to be treating every human being with respect, and ensuring everyone has what they need to survive, and freedom to learn, communicate and travel, before we build any sort of artificial intelligence, otherwise it is going to see us as the greatest threat to its survival, and what happens after that is not likely to be pretty.

And there are no guarantees in life.

We each need to be aware, and responsible.

If we see someone doing something dangerous, we need to respond appropriately - whatever the status or "authority" of that individual.

Anonymous | November 16, 2011

Of course the primary concerns are about what a scientist intends to achieve.  But that does not remove responsibility for what is not intended and clearly foreseeable.  That responsibility remains.  We all bear it in general, and scientists have no exemption.  Whether to proceed in the face of the risks is the choice scientists have, as part of their freedom to do research.  You do not get freedom without responsibility.

Anonymous | November 16, 2011

Utter nonsense. Your thesis depends upon multiple false premises:

A. That this is the ONLY person who will EVER figure out this piece of science or technology. That is laughable.

B. That all persons who figure out a science or technology are "good" and intend to do "good" according to your idealisms. No, Virginia, not everyone is Santa Claus. You acknowledge this in the past about Russia and the Nazis. But then presume, implicitly, that something has changed in humanity. At the same time the whole point of the article is to somehow withhold from those in the present who want to do "bad" (by your roadmap) information they might need to do so. This position further presumes that only "good" (by your roadmap) people are scientists.

C. That this scientist is endowed with perfect powers to forecast uses, both good and bad, for the science or technology. That therefore, they can weigh the good and bad potentials, and come to a conclusion. That is ridiculous. If scientists could do that, then they would all be billionaires because they could foresee what will happen. Scientists are not able to foresee and weigh good and bad. In fact, you are posting this on the internet. The internet is a technology that was developed for war. That was its foreseen use. So by your thesis, the scientists who worked on it should have systematically withheld information.

D. That this particular horse (bioweapons) is still in the barn. That horse isn't just out of the barn, it's on the next continent, quite literally. You propose to lock the barn and burn it down in order to control the horse that isn't even in it anymore. There are facilities all over the middle east available for implementing bioweapons with the highest level of technology. There are people there (remember bin Laden was from a very good family and well educated and that Zawahiri is a physician) who want to use those weapons to kill you. They have said so.

Anonymous | November 16, 2011

Actually, Einstein did share responsibility for the A-bomb. He wrote a letter to Roosevelt encouraging him to build it.

Anonymous | November 16, 2011

The arguments made are old and deal with the foundation of scientific exploration and human advancement.  With any discovery there are potential pluses and minuses.  The more powerful the discovery, the greater chance of misuse.  However, if blind restraints are placed on discovery (and this includes limiting the sharing of knowledge), the result will be to limit the chance of achieving true advancement.  Instead, the issue should be that society is responsible for how it seeks to apply knowledge.  A scientist is part of society and thus can not hide his or her head in the sand.  But equally, society in general can not hide its head.  Instead, it is society's responsibility to make itself and its inhabitants the type that do not pursue destruction but instead pursue advancement (presumable due to being happy with existence in the society).

Anonymous | November 16, 2011

Please note: I do not ask scientists to be responsible for "unknown" potential uses. I am quite clear that uses must be clearly foreseeable.  Stay on target, people!

Anonymous | November 16, 2011

I have been wrestling for years with whether I should write a definitive book on biological warfare. Ms. Douglas has just made up my mind. The book needs to be written. The public needs to know exactly what the threats are, how they work in the ecosystem of war, and exactly what we have to do to deal with them.

Shoving our heads deep into the sand is not it.

Anonymous | November 16, 2011

This is, in fact, what happens. Well-meaning people like Ms. Douglas convince politicians to make it so expensive and onerous to work that our ability to function is hugely slowed down.

Slowing down our ability to do research results in:
A. Loss of life because we cannot move faster.
B. Loss of ability to respond to a biological attack because we don't understand it well enough.
C. Allowing the competition, whose cultural mores are NOT as restrictive as ours, to get closer and closer. Eventually, those parties will surpass us. And then we will really be in trouble.

Anonymous | November 16, 2011

Exactly.  I am happy with the amendment.

Anonymous | November 16, 2011

If we follow your argument, we should never invent anything.

Anonymous | November 16, 2011

"to unleash new capabilities without thinking about such issues is what I am arguing is deeply irresponsible"

It would be, if it were actually happening.

Anonymous | November 16, 2011

So we should think of the harm Wikipedia is doing? You can get enough knowledge from there to build a small bomb, or to extract cholera toxin, spider venom, snake venom, etc. Should Wikipedia be held responsible for this?

Anonymous | November 16, 2011

Science per se is defined as a human endeavor to understand the world we live in and to improve our lives. No one in their right mind should use science with the goal of harming lives. Sadly, science has always been in the wrong minds, used as a tool to harm people more than to help them. When it comes to war, scientists can get whatever they need to come up with the most superior weapons. When it comes to understanding the world and helping people, they get nickel-and-dimed.
The wrong minds are those politicians throughout our entire human history. And we approved these wrong minds all along. Sigh.
The CIA and FBI now come up with this ridiculous Dual Use Research Concept. Another showcase of those wrong minds at work.

Anonymous | November 16, 2011

I am not equating legal responsibility with moral responsibility.  The two overlap but are not equivalent.  And yes, science is serendipitous, but I am only suggesting scientists are responsible for foreseeable consequences.

Anonymous | November 16, 2011

Weapons technology was built long ago and is continually refined, but, properly, we never hold the builders responsible.  Death and injury are the responsibility of the person who uses a weapon, even if they do so in ignorance. If you want people to use technology responsibly, what you want to spend money on is education on ethics and the physical repercussions of using each technology (explosives blow things apart; viruses spread at a certain rate in certain species; etc.).

Scientists should be held responsible for the direct repercussions of their actions, just like anyone else; in the case of synthetic biology, engineered organisms should be isolated appropriately until there is reliable proof that the organism cannot harm humans or their environment.  But, forcing scientists to shoulder an amorphous responsibility for all the potential uses of their work, known or unknown, would paralyze science.

Anonymous | November 16, 2011

Yes. The metaphor is simply wrong. The correct metaphor is to ask whether the alchemist, long ago, who invented the match should have made his invention public. It is obvious to anyone that matches would be used to deliberately set fires. According to this article, he has responsibility for all the arsonists who have used them since.

This article is an outstanding example of fuzzy thinking and mixed metaphors.

Anonymous | November 16, 2011

An old and endlessly vapid thesis. We could just run through its logical errors and convenient assumptions, but this is not worth that much time. Just begin by asking, "How much of the future is foreseeable?" Then, let's follow that with, "When and how will you know?"

Anonymous | November 16, 2011

I wish I believed that most scientists were aware of the ethical consequences of their work. What I have seen more of is a belief that because they have chosen to work in infectious diseases, or in public health, or in academia, or other places or fields often assumed to be inherently ethical, that no further thought or action is necessary.

For example, there are many ethical decisions in funding. If you were against war, would you accept funding from the DoD? Many scientists do, rationalizing their opposing philosophies to keep the project funded.

"First, do no harm" would be a good place to start in a code of ethics.
Harm means different things to different people, but some guide to counter daily denial and lack of introspection would be grand.

Anonymous | November 16, 2011

And anyone who has obtained patents knows that what is "obvious" is in the mind of the examiner.

Anonymous | November 16, 2011

There are currently many examples of things that went wrong in research that were found out, but instead of being permanently shelved were then later used for good, e.g. thalidomide.  The field of "History of Science" is vitally important for a knowledgeable public and working cadre.

Anonymous | November 16, 2011

Responsibility is about how to apportion praise and blame.  It is a quite clear concept, with a wide range of potential implications, from verbal praise and blame to monetary rewards and fines to more coercive consequences.  The article was merely about how to apportion praise and blame, not what else should follow from that.

Anonymous | November 16, 2011

Nor do you get responsibility without freedom, or you shouldn't.  My point is that "too much freedom" is not a problem that scientists have, or are likely to have any time in the foreseeable future.

Anonymous | November 16, 2011

My argument depends on none of these things.  

A.  It matters not just that a new piece of knowledge arises, but when and in what contexts.  Such shifts alter what the foreseeable uses are.  As that happens, responsibilities shift as well.  So that another person may in another context discover the same thing is irrelevant to the responsibilities of a person at their time and place.

B.  If scientists intend harmful things, of course that matters to their responsibilities as well.  I was, for the sake of this particular argument, assuming the general good will of scientists.  If we reject this assumption, we can roundly condemn scientists who intend to do awful things.  It seems that (happily) this is relatively rare.

C.  I do not require scientists to be perfect forecasters.  Scientists are not responsible for what is not foreseeable.  They are responsible for what is foreseeable.  

D.  I have made no specific recommendations on any particular barns to burn down or lock (to borrow your metaphor).  The argument is about responsibilities that scientists have.  How scientists choose to deal with those responsibilities-- individually or collectively-- remains their decision.

Anonymous | November 16, 2011

The main problem that (ethical) scientists face is that they are EMPLOYEES. They sign corporate boilerplate legalese crapola that leaves them a very narrow window open for working for the good of mankind, and several cameras looking through a giant window at all their work. Most of the conscienceless greedballs with PhDs looking over the scientist's shoulder will militarize a discovery (i.e., use it to kill humans and/or animals more efficiently), bury any discovery that would make expensive medical imaging technology obsolete, ensure a wonder drug is priced to serve only a few while the rest of humanity goes without, misuse new surveillance technology discoveries to decrease privacy and increase thought control for profit, etc.

It is a bad time for ethical scientists because our entire system is unsustainable. The only legitimate status quo for a human society is one that absorbs as many toxic waste products as it produces. Anything else leads logically to extinction.

In their zeal to externalize the costs of an industrial society onto those without power or influence, the corporations have forgotten what any scientist knows too well: shitting where you eat is a death sentence.

http://www.heatisonline.org/vi...

Anonymous | November 16, 2011

Granted, the atomic bomb was not foreseen at the time Einstein published his equation. Nevertheless, Einstein provided encouragement for the development of the A-bomb when he wrote to Roosevelt. Although Szilard and Wigner authored the letter, Einstein signed it. He lent the prestige of his name to the letter and, therefore, to the resulting research. Doubtless, other people had greater responsibility, but Einstein shared responsibility. Given that other countries were also working on atomic weapons research (Nazi Germany, Soviet Russia), the decision of the US government to pursue such research was probably inevitable. But, if blame is to be assigned, Einstein must share the blame.

Anonymous | November 16, 2011

Based on this argument, any scientist who works within a social paradigm that includes concepts like patriotism and "respect for authority" ought to be able to figure out that whatever they do will be used, at best, to coercively control someone at some point, and at worst to cause death and mayhem.

On that basis no scientist ought to accept employment in any state run institution, any defense establishment (because they know that the best defense is a strong offense), nor any corporation connected to either.

How many employed scientists do you think there would be?

No.

One has to be responsible for one's own actions.
For some that may mean resignation, or change of employment at some time; and for others it means willful ignorance.

It is not knowledge itself, but the paradigms within which it is organised, that lead to outcomes.   Context is king.

What matters is what we choose to do with our knowledge, how we choose to relate it to other knowledge, and how that affects our choices.

Does anyone seriously think that nationalism is there for the benefit of the majority?

Does any scientist seriously believe that any group of individuals is really significantly different from any other group on a genetic basis?

Aren't we all just one species?

Aren't we capable of organising ourselves in ways that ensure the security and freedom of everyone?

Are we not capable of clearly seeing that money is only a measure of exchange value, a value that includes a scarcity measure, and as such does not value abundance at all (consider oxygen: abundant, valuable, but with no monetary value)?

Are we not capable of seeing that societal needs require abundance of some basic commodities, and that money and free markets cannot (ever, even in theory) provide such abundance to all?

Money can be a useful tool, but it is a very poor director.

We as scientists need to be responsible for our choices, certainly.

Who or what are you working for, and what is the likely consequence of doing so?

November 16, 2011

There was a time when many great scientists were Bible-believing persons.

November 16, 2011

Well, I'm less concerned than you are about biological warfare -- it's a danger, obviously, and we should be prepared for it, but speaking both as a medic and as a scientist I'm more worried about attacks from nature than from nurture, if you see what I mean.  Research funds are limited, and we should IMO concentrate on diseases which cause widespread morbidity and mortality right now in preference to those which might possibly be used as weapons in the future.  But these areas of research are by no means mutually exclusive; it's all important, and none of it could proceed under the author's proposed restrictions.

November 16, 2011

I don't think anyone is arguing that scientists aren't members of society, or that we can't benefit from an understanding of the history and philosophy of science.  What we're saying (or what I'm saying, anyway, and I think a lot of other posters are too) is that we already are very aware of the ethical consequences of our work, and that a workable code of scientific ethics requires an understanding of the problems science _actually_ faces, not the Frankensteinian straw man presented in the article.

November 16, 2011

If we follow your argument, we should never invent anything that could ever cause harm.  Following your match example, not only are you responsible for throwing the match (I agree), but the inventor of the match, and perhaps also the manufacturer and vendor of the match, are also responsible.  By extension, we never should have invented the axe because it is so easily used to harm others.

November 16, 2011

I think the author is trying to say what most people call common sense. Your actions have reactions, and you should think about the impact you will have not only on yourself but on the world as a whole. My father refused to teach us kids how to make snares because we might kill our own pets. Cause and effect.
As an engineer, I see that the world is complicated and governed by laws. Break them and, well, even a child knows stuff happens. So think ahead. In reality you don't always know, but sometimes you should. Life is interconnected.

The Bible simply says "everyone will give an account."

November 16, 2011

Utter nonsense. Your thesis depends upon multiple false premises:

A. That this is the ONLY person who will EVER figure out this piece of science or technology. That is laughable.

B. That all persons who figure out a science or technology are "good" and intend to do "good" according to your idealisms. No, Virginia, not everyone is Santa Claus. You acknowledge this about the past, about Russia and the Nazis, but then presume, implicitly, that something has changed in humanity. At the same time, the whole point of the article is to somehow withhold from those in the present who want to do "bad" (by your roadmap) the information they might need to do so. This position further presumes that only "good" (by your roadmap) people are scientists.

C. That this scientist is endowed with perfect powers to forecast uses, both good and bad, for the science or technology. That therefore, they can weigh the good and bad potentials, and come to a conclusion. That is ridiculous. If scientists could do that, then they would all be billionaires because they could foresee what will happen. Scientists are not able to foresee and weigh good and bad. In fact, you are posting this on the internet. The internet is a technology that was developed for war. That was its foreseen use. So by your thesis, the scientists who worked on it should have systematically withheld information.

D. That this particular horse (bioweapons) is still in the barn. That horse isn't just out of the barn, it's on the next continent, quite literally. You propose to lock the barn and burn it down in order to control the horse that isn't even in it anymore. There are facilities all over the middle east available for implementing bioweapons with the highest level of technology. There are people there (remember bin Laden was from a very good family and well educated and that Zawahiri is a physician) who want to use those weapons to kill you. They have said so.

November 16, 2011

This.  This, exactly.

I'm particularly disturbed by the author's argument that relevant information should be withheld from publication.  "These are my results but I can't tell you how I got them" isn't science, it's alchemy.

November 16, 2011

The 1939 letter was written by Szilard and Wigner.  Einstein merely signed it.  And it had little impact on the direction of bomb research. (The NDRC and the MAUD report were far more important.)  But the point is this:  when Einstein conceived of his equation in the first decade of the 20th century, the bomb was not foreseeable, and so he was not responsible via that piece of work.

November 16, 2011

Nor do you get responsibility without freedom, or you shouldn't.  My point is that "too much freedom" is not a problem that scientists have, or are likely to have any time in the foreseeable future.

November 16, 2011

Wonderful article, puzzling comments. Membership in society doesn't disappear when one becomes a scientist, and it is that part of us that should consider the usefulness of our work.

A history of science class might go a long way towards helping scientists see where their work might be used and misused. With more introspection and discussion about project, field of study, funding and career choices, and the use and misuse of science, scientists may be able to look a little further downstream. 

Of course there are unintended consequences that go along with amazing discoveries made with the best of intentions. But if they happen, and scientist/citizens don't speak up, then they bear at least the same responsibility that every citizen has to prevent harm.
