Productive Policy Depends On Public's Understanding Of Scientific Issues

John Doble
Jun 23, 1996

In our democratic society, in which scientific and technological controversies crowd the public agenda, the American people often are asked to make judgments about unfamiliar, complex issues. On many such issues, the scientific community cannot say with certainty what the likely outcomes of particular trends or policies will be. When experts disagree, how can the public acquire enough knowledge to participate thoughtfully in developing policy responses to such controversies? And if the public is not involved, what are the implications for society?

This question is often debated in the scholarly community. Some maintain that reasonable involvement is impossible for a public that is, on the whole, woefully lacking in scientific literacy. Jon Miller, a professor of political science at Northern Illinois University, argues that only the roughly 20 percent of the public who are comparatively well-informed about science, the "scientifically literate," have the requisite background, understanding, and level of interest to participate effectively.

But this conclusion may be overly pessimistic. A study conducted by New York-based Public Agenda in conjunction with the Kettering Foundation of Dayton, Ohio (two nonpartisan research organizations working to increase citizen involvement in public affairs), suggests that people's considered judgments, the views they reach after learning more and taking time to deliberate, may be qualitatively different from their initial, top-of-the-head views.

A team of researchers from Public Agenda examined public opinion about two issues laced with technical intricacy and scientific uncertainty: the threat of global warming and the disposal of solid waste (J. Doble, Public Understanding of Science, 4:95-118, 1995). Each issue involves a considerable degree of risk and solutions that require changes in behavior or higher costs, yet on both, government leaders and scientists have found it difficult to communicate with the public.

A previous study conducted by Harvard University and Public Agenda on the public's priorities for scientific research (G. Holton et al., Science Policy Priorities and the Public, New York, Public Agenda, 1982) led Public Agenda researchers to ask how the general population could make more informed judgments about scientifically complex issues. Rather than conduct a traditional telephone poll, Public Agenda's researchers used a citizen panel. This method combines the strength of polling (a large, representative sample fills out a questionnaire) with that of focus groups, in which people come together for several hours to deliberate.

The panel consisted of 402 people chosen to reflect a national cross-section in terms of age, gender, ethnicity, and education level. They met in groups of about 40 at four sites: Hartford, Conn.; Chicago; Nashville, Tenn.; and Los Angeles. Each group filled out a questionnaire (the pretest) and then watched a 15-minute video. The tape, produced by Public Agenda, gave a balanced description of the issues surrounding solid waste disposal and the options for dealing with it. Participants then broke up into groups of about a dozen to deliberate about the issue for approximately 30 minutes under the direction of an impartial, professional Public Agenda moderator. After the discussion, the groups reconvened to watch a second video, about global warming, then broke up again to deliberate for 45 minutes about that issue and its possible solutions, each with an accompanying tradeoff. At the end of the evening, participants filled out a second questionnaire (the post-test) that repeated the questions from the pretest. In addition, 418 of the United States' leading scientists were surveyed by mail and asked the same questions.
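At its core, the analysis is a matter of tabulating levels of support question by question and seeing whether the gap between the panel and the scientists narrows after deliberation. The sketch below is purely illustrative and is not drawn from the study's data or analysis; the responses, coding, and function name are hypothetical, showing only how pretest, post-test, and scientist answers to a single proposal might be compared.

# Illustrative only: hypothetical yes/no responses to a single proposal,
# coded 1 (favor) or 0 (oppose), for the panel's pretest, the panel's
# post-test, and the mail survey of scientists.
pretest    = [1, 0, 0, 1, 0, 0, 1, 0, 0, 0]
posttest   = [1, 1, 0, 1, 1, 0, 1, 1, 0, 1]
scientists = [1, 1, 1, 0, 1, 1, 1, 0, 1, 1]

def pct_favor(responses):
    """Return the percentage of respondents who favor the proposal."""
    return 100.0 * sum(responses) / len(responses)

# Does the panel-scientist gap narrow between pretest and post-test?
gap_before = abs(pct_favor(pretest) - pct_favor(scientists))
gap_after  = abs(pct_favor(posttest) - pct_favor(scientists))

print(f"Panel support, pretest:   {pct_favor(pretest):.0f}%")
print(f"Panel support, post-test: {pct_favor(posttest):.0f}%")
print(f"Scientist support:        {pct_favor(scientists):.0f}%")
print(f"Panel-scientist gap: {gap_before:.0f} points before, {gap_after:.0f} after")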

We drew five main conclusions from our "Science and the Public" study:

1. The public has the ability to assess thoughtfully even highly complex scientific issues marked by substantial expert uncertainty.

The panel's judgment, as expressed in the post-test, was a logically consistent, reasonable prescription for dealing with each issue, and it differed significantly from the panel's initial opinions.

When the opinions of the panel and the scientists were compared, a pattern emerged: In the pretest, before people learned about the issues, there was a significant gap between their thinking and the scientists'. But in the post-test, with some important exceptions, the gap narrowed and the panel's views generally aligned with the scientists'.

2. After people learn more about technical issues and deliberate, their thinking shifts so that inconsistencies are minimized and guidelines for acceptable policy emerge.

Especially on global warming, respondents' views shifted from the pretest to the post-test. In the pretest, majorities favored seven of the 25 proposals; in the post-test, majorities favored 13 proposals, including 10 that would mean higher prices or higher taxes. The panel also flatly rejected 12 proposals, including drilling for natural gas in national parks and building more nuclear power plants.

3. The public's views generally will come into alignment with the consensus among experts, but differences that persist may stem from different values rather than from differing levels of technical expertise.

The shift in people's thinking tended to be in the direction of the scientists' preferences. On global warming, both the scientists and the panel (in the post-test) favored implementing 11 of the 25 measures to reduce greenhouse gas emissions while rejecting eight others. However, there were a few notable differences, including four proposals the scientists favored that the panel did not: a "gas guzzler" tax, greater use of nuclear power, more aid to countries that stop destroying the rainforest, and a 25-cent-per-gallon increase in the gasoline tax.

In each case, the discrepancy stemmed from different values, not scientific expertise. For example, 68 percent of the scientists advocated building more nuclear plants to reduce dependence on fossil fuels, a result that might lead some to conclude that if only the public understood the issue, it would reach the same conclusion. But the findings suggest otherwise. In the post-test, 46 percent opposed using more nuclear power, compared with 45 percent in the pretest.

Even after learning about the threat of global warming, and even after learning that nuclear power, alone among the major energy alternatives, does not pollute the air or contribute to the problem in any way, opponents did not budge.

The panel's opposition to nuclear power was rooted not in questions of technical expertise but in what it saw as a history of bad faith. In their deliberations, people's concerns centered on the disposal of radioactive waste, the threat of an accident, the cost of construction, and low confidence in those who design, build, manage, and regulate nuclear facilities. Opponents felt that the nuclear power industry and its regulatory agencies have demonstrated an inability to manage this technology safely. The results therefore suggest that opponents of nuclear power are not likely to change their minds, no matter how much they learn about the "greenhouse effect."

4. A lack of expertise is not necessarily what blocks the public from thoughtfully considering scientific issues. Having a framework that spells out the choices and the tradeoffs would seem to be more important than mere command of technical information.

We gave people a framework: We briefly explained what the greenhouse effect is, noting that there are many questions scientists have not yet answered, including when global warming might start and what its effects will be. We told people that there are no easy answers, no technological miracles on the horizon, no scapegoats, and no villains. The problem, we said, stems from how we live: Most greenhouse gases come from driving cars, using home heating oil, and buying electricity from utility companies. We gave people 25 proposals to reduce carbon dioxide emissions and pointed out that each has an accompanying tradeoff. We asked them to imagine themselves as a citizens' committee that advises policymakers about these issues, likening their role to that of jurors: nonexperts who, faced with conflicting expert testimony, must make the ultimate decision about guilt or innocence. The fact that the majority endorsed costly proposals for the greater good suggests that the way technical information is organized and presented may be more important to making a judgment than the information itself.

Consider this analogy: A patient asks a doctor whether a certain treatment, which is expensive and risky, will relieve a painful symptom. There are two ways the doctor can present the information. Patient A's doctor strongly recommends the treatment but urges him to get a second opinion. Suppose the second doctor strongly opposes the treatment because he believes it will make the pain worse. In such a case, it is easy to imagine Patient A becoming frustrated while running from doctor to doctor, counting expert opinions the way we count votes, trying to learn which expert is correct. When scientific issues are presented like this, "expert uncertainty" will lead to gridlock, with the public finding it impossible to decide among competing experts.

But suppose instead that Patient B's doctor simply says, "Medical science does not know whether you should get the treatment. Most doctors would say, 'Yes'; a minority would say, 'The treatment will make the pain worse'; and a fair number simply aren't sure. We don't yet know whether this treatment is a good idea. And so the decision is up to you. Now, here are the costs, and here are the risks and benefits."

The difference is that Patient A, like the general public when evaluating scientific issues, is asked to judge which expert is correct, a question that, by definition, requires him to be an expert. On the other hand, Patient B, like the panelists in the study, is given the responsibility to decide after learning about the areas and extent of expert uncertainty.

5. Uncertainty among experts need not produce political gridlock.

Panel members were not paralyzed in the face of expert uncertainty, in large part because of how that uncertainty was presented. For example, participants were told that most scientists believe global warming will cause slow, incremental changes in the environment over the next 50 years if present trends continue; that a minority thinks the changes will be catastrophic; and that a "fair number" are not sure what will happen. In light of this information, the panelists decided on policies that reflected sound scientific judgment.

This study suggests that the public can make well-grounded judgments when presented with all sides of an issue. In addition, several conditions must be met. First, people must understand both the extent of expert uncertainty and what the specific uncertainties are. They also must see that uncertainty within the broader framework of the issue: if they do not understand how it relates to the policy options and the tradeoffs, their ability to reach a sound judgment will suffer.

Perhaps most important, people must have the opportunity to deliberate, especially about scientifically and technologically complex issues. Open deliberation is the principle on which our government rests; without it, our democratic system would have failed years ago. As issues become more complex, deliberation becomes even more essential to instituting thoughtful policy.

John Doble, formerly research director at Public Agenda in New York, is founder of Englewood, N.J.-based Doble Research Associates Inc., which specializes in analyzing public opinion about policy issues from a nonpartisan perspective. He acknowledges the coauthors of the "Science and the Public" study: Jean Johnson, Amy Richardson, and, in particular, the late Allen Danks.