In 2009, I contributed a chapter to the edited volume Communicating Biological Sciences: Ethical and Metaphorical Dimensions that was a first attempt on my part to dig deeper into the normative and ethical dimensions of science communication, particularly many of the questions raised by the growing attention to communication research over the decade and the correlated attention to the role of scientists in high-profile political debates over stem cell research, climate change, evolution, and other policy controversies.  In doing so, I drew on my experience discussing and giving talks on the role of framing in science policy debates.  I also drew on the contributions of science policy scholars, most notably Roger Pielke Jr.’s and Daniel Sarewitz’s work on “politicization” and the different roles that scientists and their organizations can play in managing policy conflict.

Three years later, attention to the valuable contributions of communication research and the policy sciences to understanding the social dimensions of science controversies has only continued to grow, as evidenced by last year’s very successful “Science of Science Communication” Sackler Colloquium organized by the National Academies.  Moreover, as policy debates continue over climate change, the teaching of evolution, biomedical research, food biotechnology, and other issues, the role of scientists remains front and center, most notably raised last week in the pages of Nature by Dan Sarewitz and further discussed by Roger Pielke at his blog. Both argue that as scientists are increasingly viewed not as honest brokers but as advocates aligned with the goals of the Democratic Party, scientists and their organizations risk losing public trust and likely only contribute to polarization on hot-button issues like climate change.

For all the attention that science communication research has deservedly received, what is still missing from this discussion is careful analysis, understanding, and application of normative and ethical principles to how scientists and their organizations can both effectively and ethically apply this research to their engagement of the public and policy makers.  As I showed in a recent study with John Besley, scientists’ own role conceptions relative to the media, the public, and public affairs remain relatively out of step with the findings of communication research and much of the work done in a variety of social science fields over the past thirty years.  With this in mind, as a way to hopefully jump-start more conversation and discussion, I have posted the final draft of the full text of the 2009 chapter below; you can also read the chapter at Google Books.

Nisbet, M.C. (2009). The Ethics of Framing Science. In B. Nerlich, B. Larson, & R. Elliott (Eds.), Communicating Biological Sciences: Ethical and Metaphorical Dimensions (pp. 51-74). London: Ashgate. [Google Books]

Download PDF of the Chapter

THE ETHICS OF FRAMING SCIENCE

Matthew C. Nisbet

Over the past decade, among an avant-garde of U.S. scientists, journalists, and affiliated organizations, there has been a growing recognition that scientific knowledge alone does not compel public perceptions or policymaker decisions. Instead, these innovators understand that effective communication involves addressing an intended audience’s values, interests, and worldviews. This approach requires scientists and journalists to not only draw upon audience-based research to tailor messages but also to actively sponsor public dialogue and the exchange of perspectives. In this new light, science communication is no longer defined as a process of transmission, but rather as an active and ongoing conversation with a range of stakeholders.

Research in the area of framing has been a central driver of this paradigm shift. “Frames” is the conceptual term for the interpretative storylines that communicate what is at stake in a science-related debate and why the issue matters (Gamson & Modigliani, 1989). At a theoretical and descriptive level, framing research offers a rich explanation for how various actors in society define science-related issues in politically strategic ways, how journalists from various beats selectively cover these issues, and how diverse publics differentially perceive, understand, and participate in these debates (Pan & Kosicki, 1993; Scheufele, 1999; Nisbet, 2009a).  For each group, frames help simplify complex issues by lending greater weight to certain considerations and arguments over others, translating why an issue might be a problem, who or what might be responsible, and what should be done (Ferree et al., 2002). In this manner, frames provide common points of reference and meaning between science, the media, and key publics (Hellsten & Nerlich, 2008).

Perhaps even more importantly, at an applied level, this basic research can serve as an innovative public communication technology that can be harnessed by scientists, journalists, and their organizations. On issues such as climate change, evolution, and nanotechnology, studies are examining what specific groups in society want to know, their political interpretations, the perceived implications for their daily lives, the resonance or conflict with their values and social identities, where they are most likely to receive information, and who or what they are looking to for answers. When specific intended audiences have been carefully researched and understood, the resulting tailored messages can be true to the science, but also personally relevant and meaningful to a diverse array of publics. Government agencies, nongovernmental organizations, and science institutions can use the results of this audience research to design and plan their communication initiatives, thereby promoting public learning, empowering public participation, or moving discussion beyond polarization and gridlock. Journalists can also use this information to craft novel, accessible, and relevant narratives for nontraditional audiences across media formats, expanding their journalistic reach and impact (see Labov & Pope, 2008; Leiserowitz, Maibach, & Roser-Renouf, 2008; Nisbet 2009b; Scheufele, 2006 for examples).

Yet, as these research-based approaches to public communication move forward, critics of this paradigm shift have argued that these innovations imperil the perceived objectivity, neutrality, and independence of scientists and journalists, while reinforcing a tradition of “top down” communication from experts to the public (see for example Holland et al., 2007). In this chapter, I address these concerns by outlining the ethical implications of framing as applied to science-related policy debates, focusing specifically on the normative obligations of scientists, journalists, and their affiliated organizations. Importantly, I note the key differences in ethical imperatives between these groups and other communicators in science-related policy debates, notably social critics and partisan strategists.

FOUR GUIDING PRINCIPLES

To begin the chapter, I briefly review how past research in political communication and sociology describes a lay public that makes sense of science-related policy debates by drawing upon a mental toolkit of cognitive shortcuts and easily applied criteria. This research shows that science literacy has only a limited influence on perceptions; instead, public judgments are based on an interaction between the social background of an audience and the frames most readily available by way of the news, popular culture, social networks, and/or conversations.

Surveys indicate that Americans strongly believe in the promise of science to improve life, deeply admire scientists, and hold science in higher esteem than almost any other institution. Scientists therefore enjoy tremendous communication capital; the challenge is to understand how to use this resource effectively and wisely. Importantly, in terms of ethical obligations, one of the conclusions of this body of research is that whenever possible, dialogue should be a focus of science communication efforts, rather than traditional top-down and one-way transmission approaches.

I then briefly describe a deductive set of frames that apply consistently across science-related debates. Breaking “the frame,” so to speak, is very difficult to do, since the interpretative resources that society draws upon to collectively make sense of science are based on shared identities, traditions, history, and culture. I also review the important differences between “science,” “policy,” and “politics,” arguing that there are few cases, if any, where science points decisively to a clear policy path or where policy decisions are free from politics. In this context, scientists and journalists can be either “issue advocates” or “honest brokers,” and in each role, framing is central to communication effectiveness.

Yet, no matter their chosen role, scientists and journalists should always emphasize the values-based reasons for a specific policy action. As I discuss, when a policy choice is simplistically defined as driven by “sound science” or as a matter of “inconvenient truths,” it only serves to get in the way of public engagement and consensus-building. Science becomes just another political resource for competing interest groups, with accuracy often sacrificed in favor of political victory.

Indeed, accuracy is a third ethical imperative. No matter their role as issue advocate or honest broker, both scientists and journalists must respect the uncertainty that is inherent to any technical question and resist engaging in hyperbole. If these groups stray from accurately conveying what is conventionally known about an issue, they risk losing public trust.

Finally, for scientists and journalists, a fourth ethical imperative is to avoid using framing to denigrate, stereotype, or attack a particular social group or to use framing in the service of partisan or electoral gains. As I review, this is particularly relevant to communicating about issues such as evolution, where pundits such as Richard Dawkins use their authority as scientists to argue their personal opinion that science undermines the validity of religion and even respect for the religious. The ethical norm also applies to the use by partisans of stem cell research—and science generally—as a political wedge strategy in recent elections. Framing will always be an effective and legitimate part of social criticism and electoral politics, but for scientists and journalists to simplistically define critiques of religion or opposition to a candidate as a “matter of science” only further fuels polarization, alienating key publics and jeopardizing the perceived legitimacy of science.

FRAMING AND SCIENCE POLICY DEBATES

A prevailing assumption historically has been that ignorance is at the root of social conflict over science. As a solution, after formal education ends, science media and other communication methods should be used to educate the public about the technical details of the matter in dispute. Once citizens are brought up to speed on the science, they will be more likely to judge scientific issues as scientists do and controversy will go away. In this decades-old “deficit” model, communication is defined as a process of transmission. The facts are assumed to speak for themselves and to be interpreted by all citizens in similar ways. If the public does not accept or recognize these facts, then the failure in transmission is blamed on journalists, “irrational” public beliefs, or both (for more on the deficit model, see Nisbet & Scheufele, under review).

Yet as communication researchers will recognize, the deficit model ignores a number of realities about audiences and how they use the media to make sense of public affairs and policy debates. First, individuals are naturally “cognitive misers” who rely heavily on mental shortcuts, values, and emotions to make sense of a science-related issue.  These shortcuts work in place of paying close attention to news coverage of science debates and in lieu of scientific or policy-related knowledge (see Downs, 1957; Popkin, 1991). Second, as part of this miserly nature, individuals are drawn to news sources that confirm and reinforce their pre-existing beliefs. This tendency, of course, has been facilitated by the fragmentation of the media and the rise of ideologically slanted news outlets (Mutz, 2006). Third, in a media environment with many choices, if individuals lack a strong preference or motivation for quality science coverage, then they can completely avoid such content, instead focusing narrowly on their preferred news topics or entertainment and infotainment (Prior, 2005).

Finally, survey evidence counters deficit model claims that science has lost its position of respect and authority in American society. Consider that more than 85% of Americans agree that “even if it brings no immediate benefits, scientific research that advances the frontiers of knowledge is necessary and should be supported by the federal government.” On the specific issues of climate change, stem cell research, and food biotechnology, respondents believe scientists hold greater expertise, are less self-interested, and should have greater say in decisions than industry leaders, elected officials, and/or religious leaders. Moreover, during the past twenty years, as public trust in Congress, the presidency, industry, religious institutions, and the media has plummeted, public faith in science has remained virtually unchanged. In fact, among American institutions, only the military enjoys more trust (NSB, 2008).

Given these realities, to focus on science literacy as both the cause and the solution to conflict remains a major distraction for scientists, journalists, and advocates alike. Moreover, if scientists in particular had a better understanding of the complex factors that shape public preferences and policy decisions, they would be less likely to define every debate in terms of “crisis” or “politicization,” interpretations that distract from building consensus around shared values and common goals and that may actually alienate key publics (Goldston, 2008; Nisbet, 2009b).

Alternatives to the Deficit Model

Serious critiques of the deficit model first gained prominence in the early 1990s as sociologists used ethnographic approaches to study how particular social groups made sense of scientific expertise and authority (see Irwin & Michael, 2003 for an overview). Among these studies, Brian Wynne and colleagues proposed a set of mental rules that lay publics are likely to use in evaluating scientific advice and expertise (CSE, 2001; Marris, 2001; Wynne, 1992). These common-sense heuristics fit closely with the conclusions from the quantitative public opinion research reviewed earlier (see Nisbet & Goidel, 2007; Bauer, 2008 for more). Specifically, lay publics are likely to apply the following criteria in reaching judgments:

  • Does scientific knowledge work? Do public predictions by scientists fail or prove to be true?
  • Do scientific claims pay attention to other available knowledge? For example, in understanding the risks to the food supply from the 1986 Chernobyl disaster, did scientists consult with farmers on how to best monitor grazing habits and take samples from livestock?
  • Are scientists open to criticism? Are they willing to admit errors and oversights?
  • What are the social and institutional affiliations of scientists? In other words, do they have a historical track record of trustworthiness? Similarly, do they have perceived conflicts of interest relative to their associations with industry, government, universities, or advocacy groups?
  • What other issues overlap or connect to a public’s immediate perception of the scientific issue? In the UK debate over genetically modified food, both Chernobyl and mad cow disease served as recent events that undermined public trust in government claims about risk.
  • Specific to risks, have potential long-term and irreversible consequences of science been seriously evaluated, and by whom? And do regulatory authorities have sufficient powers to effectively regulate organizations and companies who wish to develop the science? Who will be held responsible in cases of unforeseen harm?

In 2000, drawing upon this emerging body of work, a UK House of Lords report urged science institutions to move beyond a one-way transmission model of science communication towards a new focus on deliberative contexts where a variety of stakeholders could participate in a dialogue and exchange of views about science policy. Over the past decade, in the U.K., Europe, and Canada there has been a wave of consensus conferences, deliberative forums, and town meetings on a number of issues. In these initiatives, recruited lay participants receive background materials in advance, provide input on the types of questions they would like addressed at the meeting, and then provide direct input on recommendations about what should be done in terms of policy. Each initiative, however, varies by how participants are asked for feedback, how much their feedback matters, and exactly when in the development of a scientific debate consultation occurs (Einsiedel, 2008).

Through these initiatives, studies find that participants not only learn directly about the technical aspects of the science involved, but perhaps more importantly, they also learn about the social, ethical, and economic implications of the scientific topic. Participants also feel more confident and efficacious about their ability to participate in science decisions, perceive relevant institutions as more responsive to their concerns, and say that they are motivated to become active on the issue if provided a future opportunity to do so (Besley, Kramer, Yao, & Toumey, 2008; Powell & Kleinman, 2008). Just as importantly, deliberative forums, if carefully organized, shape perceptions of scientists as open to feedback and respectful of public concerns, perceptions that predict eventual acceptance of and satisfaction with a policy outcome, even if the decision is contrary to an individual’s original preference (Besley & McComas, 2005; Borchelt & Hudson, 2008).

Top-down and Bottom-up Framing

From a normative standpoint, empowering citizens to participate in collective decisions about science-related policy is central to the functioning of a democracy, especially when citizens are expected to bear both the costs and the risks of a policy decision. Yet the major limitation of public dialogue initiatives is their small scale and scope. Unless intensive resources are spent on recruiting a diverse set of participants, the individuals most likely to turn out are those already opinion-intense, well informed, and emotionally committed to an issue (Goidel & Nisbet, 2006).  As a result, in combination with traditional science media and public consultation efforts, scientists and journalists must also learn to focus on “framing” their messages in ways that engage wider and more diverse publics, while discovering new media platforms for reaching audiences and sponsoring dialogue.

Frames work by connecting the mental dots for the public. They suggest a connection between two concepts, issues, or things, such that after exposure to the framed message, audiences accept or are at least aware of the connection. An issue has been successfully framed when there is a fit between the line of reasoning a message or news story suggests on an issue and the presence of those existing mental associations within a particular audience (Scheufele & Tewksbury, 2007).  For example, as I will review later on climate change, by emphasizing the religious and moral dimensions of the issue, several scientists have convinced religious leaders that understanding the science of climate change is directly applicable to questions of faith. The normatively desirable outcome has been to bridge ideological divides on the issue while sponsoring learning and dialogue.

Complementing these psychological accounts, sociologists such as William Gamson have promoted a “social constructivist” explanation of framing. According to this research, in order to make sense of political issues, citizens use as resources the frames available in media coverage, but integrate these packages with the frames forged by way of personal experience or conversations with others. Frames might help set the terms of the debate among citizens, but rarely, if ever, do they exclusively determine public opinion. Instead, as part of a “frame contest,” one interpretative package might gain influence because it resonates with popular culture or a series of events, fits with media routines or practices, and/or is heavily sponsored by elites (Gamson, 1992; Price, Nir, & Cappella, 2005).

As Wynne (1992) has argued, many members of the public hold their own applicable lay knowledge about a science-related debate that is based on personal experience, culture, or conventional wisdom.  Moreover, in combination with media coverage, these lay theories enable people to reason and talk about a complex science debate in their own familiar terms and to participate in consultation exercises such as deliberative forums (Pan & Kosicki, 2007). In other words, motivated citizens, when given the opportunity, can actively participate in a “bottom up” framing of issues. Social movements, for example, have historically used frames to mobilize members and connect groups into advocacy coalitions (see Croteau, Hoynes, & Ryan, 2005 for an overview).

With new forms of user-centered and user-controlled digital media such as blogs, online video, and social media sites, “bottom up” alternative frames may be gaining greater influence in the discursive contests that surround issues such as climate change or stem cell research. Therefore, one way to effectively and ethically use framing to sponsor dialogue among a wider public is to invest at the local and regional level in “participatory digital media infrastructures” for science and environmental issues.

This type of investment may be particularly important for the U.S., where local newspapers have cut meaningful coverage of science and the environment. As a result, many communities lack the type of relevant news and information that is needed to adapt to environmental challenges or to reach collective choices about issues such as nanotechnology and biomedical research. As one possible way forward, government agencies and foundations can fund public television and radio organizations as community science information hubs. These “public media 2.0” initiatives would partner with universities, museums, and other local media outlets to share digital content that is interactive and user-focused.  The digital portals would feature in-depth reporting, blogs, podcasts, shared video, news aggregation, user recommendations, news games, social networking, and commenting. Via a mix of “top-down” and “bottom-up” framing of issues, these new models for non-profit science media would be an integral part of the infrastructure that local communities need to adapt to climate change, to move forward with sustainable economic development, and to participate in the governance of science, medicine, and technology (see Clark & Aufderheide, 2009 for a discussion of these new media models).

The Anatomy of Frames

The identification and application of frames as general organizing devices—whether as used in advocacy campaigns, in a news story, or in a digital discussion—should not be confused with specific policy positions. As Gamson and his colleagues describe, individuals can disagree on an issue but share the same interpretative frame (cf. Gamson & Modigliani, 1989), which means that any frame can include pro, anti, and neutral arguments (see Feree et al., 2002; Tankard, 2001). For example, as I will review, a dominant frame applied to stem cell research is that the issue is fundamentally a matter of “morality/ethics.”  Both sides use this frame to argue their case in the debate.  Research opponents say it is morally wrong to destroy embryos, since they constitute human life. Research supporters say it is morally wrong to hold back on research that could lead to important cures.

The latent meaning of any frame is often translated instantaneously by specific types of framing devices such as catchphrases, metaphors, sound bites, graphics, and allusions to history, culture, and/or literature (Gamson, 1992).  Many studies confuse frames and frame devices.  For example, they might track a slogan such as former U.S. vice president Al Gore’s “climate crisis” in news coverage, or test it in an experiment, but never carefully consider the underlying interpretative meaning (“runaway, impending disaster”), of which the slogan is just one among many possible triggers.

Identifying the frames that apply to a science-related policy debate should be approached both deductively and inductively. Drawing on previous work, studies usually start from a set of frames that appear to recur across policy debates. Originally identified by Gamson and Modigliani (1989) in a framing study of nuclear energy, the typology of frames, which includes for example public accountability and social progress, was further adapted in studies of food and medical biotechnology in Europe and the United States (Dahinden, 2002; Durant, Bauer, & Gaskell, 1998; Nisbet & Lewenstein, 2002). In Table 4.1, I outline this generalizable typology of frames, defining the latent meanings of each interpretation. (With the reader in mind, throughout the rest of the chapter, when discussing the framing of a specific issue, references to frames from the typology are italicized and frame devices are in quotes.)

Table 4.1 A Typology of Frames Applicable to Climate Change

Frame: Defines Science-Related Issue As…

Social progress …improving quality of life, or solution to problems. Alternative interpretation as harmony with nature instead of mastery, “sustainability.”
Economic development/competitiveness …economic investment, market benefits or risks; local, national, or global competitiveness.
Morality/ethics …in terms of right or wrong; respecting or crossing limits, thresholds, or boundaries.
Scientific/technical uncertainty …a matter of expert understanding; what is known versus unknown; either invokes or undermines expert consensus, calls on the authority of “sound science,” falsifiability, or peer-review.
Pandora’s box / Frankenstein’s monster / runaway science …a call for precaution in the face of possible impacts or catastrophe; science as out of control, a Frankenstein’s monster, or as fatalism, i.e. action is futile, the path is chosen, there is no turning back.
Public accountability/governance …research in the public good or serving private interests; a matter of ownership, control, and/or patenting of research, or responsible use or abuse of science in decision-making; “politicization.”
Middle way/alternative path …around finding a possible compromise position, or a third way between conflicting/polarized views or options.
Conflict/strategy …a game among elites; who is ahead or behind in winning the debate; a battle of personalities or groups (usually a journalist-driven interpretation).

Note: Frame typology derived from previous analyses of nuclear energy, food and medical biotechnology, and recently applied to climate change and evolution (Dahinden, 2002; Durant, Bauer, & Gaskell, 1998; Nisbet & Lewenstein, 2002; Nisbet 2009a; Nisbet 2009b).

COMMUNICATING VALUES RATHER THAN INCONVENIENT TRUTHS

“Science,” broadly speaking, can be defined as the systematic pursuit of knowledge, whereas “policy” refers to a specific decision or course of action, and “politics” is the process of bargaining, conflict, negotiation, and compromise that determines who gets what, when, and how (Pielke, 2007). These distinctions matter to public communication and framing. The tendency by scientists, journalists, advocates, and elected officials to define policy debates as exclusively about science and not in terms of either politics or values is often at the root of conflict and puts at risk public trust in scientific research (Bipartisan Policy Center, 2009).

In particular, if science becomes the perceived dominant reason for a course of action, then as a matter of strategy and politics, competing interests will have the incentive to claim that scientific evidence is on their side. As a consequence, an inevitable part of the framing of an issue will involve a contest over uncertainty, with each side potentially hyping or distorting the objective state of expert consensus.

Arguing that a policy debate is simply a matter of “sound science” reduces scientific knowledge to just another resource that interest groups can draw upon in political battles, threatening the perceived integrity of science. Each time an exaggerated scientific claim is proven false or inaccurate, it risks further alienating publics already distrustful of the science and scientists (see Pielke, 2007 for more).

This tendency to reduce science policy decisions down to debates over science rather than values is perhaps principally responsible in the United States for lingering political gridlock over climate change. To date, so-called climate skeptics continue to successfully downplay public concern by narrowly framing the issue in terms of scientific uncertainty. In contrast, Al Gore, many environmentalists, and even some scientists have attempted to counter the uncertainty frame with their own message that climate science in fact compels action, dramatizing this science by way of a Pandora’s box emphasis on a looming “climate crisis.”

Publicity for Gore’s An Inconvenient Truth led with this storyline, including a movie poster with the frame device of a hurricane-shaped plume spewing from a smoke stack and a trailer that told audiences to expect “the most terrifying film you will ever see.” With an accent on the visual and the dramatic, the catastrophe strategy triggered similarly framed news coverage. For example, a much talked about Time magazine cover from 2006 featured the image of a polar bear on melting ice with the tagline: “Be worried, be VERY worried” (See Nisbet, 2009b for an overview.)

Yet these claims are effectively challenged by climate skeptics as liberal “alarmism,” putting the issue quickly back into the mental box of scientific uncertainty and partisanship. Polls suggest that the American public has picked up on these claims of “climate exaggeration,” likely filtering them back through their preferred partisan lens and their existing views on liberal media bias. The result is that many otherwise well-informed Americans increasingly discount the climate change problem, while also believing that the mainstream news media is exaggerating the issue (See Nisbet, 2009b).

A Values-Based Second Premise

An overwhelming majority of scientists have concluded that human activities are contributing to climate change and that this presents a major risk to society. Yet this scientific research–this first premise about climate change–does not offer an explicit normative framework that might guide decision-making, helping individuals decide whether action is worth the costs and trade-offs or which policy might be most in line with their values, whether religiously or secularly based. In short, the science does not speak for itself, and as survey research continues to show, this first premise of scientific certainty remains selectively interpreted by the American public based on their values and partisan identity.[1]

Many scientists and environmental advocates, of course, do offer an explicit second premise, though these values are probably not strongly shared by a majority of Americans. As a matter of social progress and environmental ethics, advocates such as Al Gore and best-selling writers such as Bill McKibben (2008) argue that we should take action on climate change because human activities have shifted the planet into “dangerous disequilibrium,” altering the natural order of things. Not only is it morally wrong to violate and imperil nature, but our actions threaten future generations of humans.

The challenge on climate change is to identify the moral framework–or second premise–that works for specific segments of the public and to effectively frame the significance of climate change as in line with that framework. For example, a 2006 “Evangelical Call to Action” succinctly lays out the first and second premise for a Christian public.[2] The document asserts that “human induced climate change is real” and that “the consequences of climate change will be significant.” The document then frames the second premise, or the reason why Christians should care:

  • Christians must care about climate change because we love God the Creator and Jesus our Lord, through whom and for whom the creation was made. This is God’s world, and any damage that we do to God’s world is an offense against God Himself (Gen. 1; Ps. 24; Col. 1:16).
  • Christians must care about climate change because we are called to love our neighbors, to do unto others as we would have them do unto us, and to protect and care for the least of these as though each was Jesus Christ himself (Mt. 22:34-40; Mt. 7:12; Mt. 25:31-46).

Every science policy debate, no matter how certain the expert agreement, falls into the “second premise” category. In fact, as President Barack Obama’s March 2009 stem cell decision and speech make clear, while there is often conflict and distortion over what “consensus” might be in the scientific community, most political battles over science revolve around the normative frameworks that are the grounds for action.

Obama opened his speech with the established framing playbook among stem cell advocates, defining the issue in terms of social progress while taking care not to overstate scientific certainty, avoiding past exaggerations over the realistic timeline for discoveries:

At this moment, the full promise of stem cell research remains unknown, and it should not be overstated. But scientists believe these tiny cells may have the potential to help us understand, and possibly cure, some of our most devastating diseases and conditions. To regenerate a severed spinal cord and lift someone from a wheelchair. To spur insulin production and spare a child from a lifetime of needles. To treat Parkinson’s, cancer, heart disease and others that affect millions of Americans and the people who love them.

Obama also argued the economic competitiveness frame that has been frequently applied by funding proponents, asserting the U.S. risked losing scientists to other countries if research did not move forward.

Yet perhaps most importantly, Obama was explicit in acknowledging that science alone did not drive policy choices and decisions. The President, in fact, was careful to articulate the second premise that lay behind his decision. First, he defined his decision in terms of a moral and ethical duty to help those in need. Second, he cited his public accountability duty to be in line with the wishes of a majority of Americans. Notice specifically how Obama referenced his religious beliefs as compelling action:

As a person of faith, I believe we are called to care for each other and work to ease human suffering. I believe we have been given the capacity and will to pursue this research – and the humanity and conscience to do so responsibly… The majority of Americans – from across the political spectrum, and of all backgrounds and beliefs – have come to a consensus that we should pursue this research. That the potential it offers is great, and with proper guidelines and strict oversight, the perils can be avoided. That is a conclusion with which I agree. That is why I am signing this Executive Order, and why I hope Congress will act on a bi-partisan basis to provide further support for this research.

In stating the religious reasoning behind his decision, Obama is no different from former President George W. Bush, who was equally open about the values that guided his decision to limit embryonic stem cell funding. As a direct parallel to Obama’s religious reasoning, consider this statement from Bush’s August 2001 speech announcing his compromise funding for stem cell research:

My position on these issues is shaped by deeply held beliefs. I’m a strong supporter of science and technology, and believe they have the potential for incredible good — to improve lives, to save life, to conquer disease…. I also believe human life is a sacred gift from our creator. I worry about a culture that devalues life, and believe as your president I have an important obligation to foster and encourage respect for life in America and throughout the world.

Or consider this September 2006 Bush speech announcing his decision to veto a Congressional bill that would have expanded funding for embryonic stem cell research. The speech was delivered at a press conference where Bush was surrounded by “snowflake babies,” children born from adopted embryos that otherwise would have been discarded by their biological parents and the respective fertility clinic:

This bill would support the taking of innocent human life… Each of these human embryos is a unique human life with inherent dignity and matchless value…These boys and girls are not spare parts.

A recognition of the need to frame both the first and second premise also comes through in Obama’s statement on scientific integrity delivered as part of his stem cell announcement. In short, Obama’s directive to his science advisors to “develop a strategy for restoring scientific integrity to government decision-making” is about protecting the ability of scientists to establish the first premise, to be free to accurately identify through research various potential opportunities and risks to society.

This, in fact, is the significant difference between the Bush and Obama administration, at least at this early part of the latter president’s term. When it comes to the second premise, both openly apply their own set of values in deciding how to take policy action on the conclusions of science. As a matter of governing there is no way to avoid applying values to craft science policy. Scientists, journalists, and elected officials need to transparently articulate this reality. Where Obama and Bush appear to differ is that the Bush administration was also willing to move into the territory of the first premise. On issues such as climate change, as a number of investigations have revealed, the Bush administration actually shaped, re-framed, or even obstructed what scientists had concluded about climate change-related risks (See Brainard, 2009 for a recent discussion).

In sum, when science policy debates are simplistically reduced down to a “debate over the science” or a matter of “inconvenient truths,” with discussion of values and politics lost in the translation, framing is most likely to be applied unethically, violating the norm of accuracy, and used to hype, exaggerate, or distort scientific evidence. Indeed, if scientists, journalists, and a range of political actors were more open and transparent about the values that guide their preferred policy actions, public engagement and dialogue are likely to benefit.

In spring 2007, The Scientist magazine sponsored an online discussion of framing and its implications for science communication. In one posting, scientist-turned-environmental advocate Mark Powell succinctly summarized this key distinction between communicating the first and second premise:

I left active science to work as an environmental advocate. I learned quickly that values trump hard facts—for instance, people in a logging town had a hard time believing that logging could cause harm, because their value structure was threatened by such a claim. If I started my communications with “logging can harm forest eco-systems,” I mostly got denial and dismissal.  Instead, if I started with “do you care about deer and salmon?” then people would say yes and engage in conversation. Later, I could get to my science about logging effects on salmon. Is it lying? No, it’s framing and it’s smart (The Scientist, 2007, p. 42).

TRUTH-TELLING, ISSUE ADVOCATES AND HONEST BROKERS

As professionals, both scientists and journalists share a deep ethical commitment to truth-telling and accuracy. For example, scientists have developed a prescribed and socially shared set of rules for translating research questions into their testable form, collecting and evaluating data, and communicating the results. These rules are used to ensure inter-subjectivity and the replication of observations and conclusions, allowing scientists to figure out what is approximately true about the world and to do so while minimizing social biases and value-laden observations. Across many policy debates there usually exists expert agreement–or at least an emerging body of scientific knowledge–by which first premise truth claims can be evaluated.

Similarly in journalism, methods have been developed for achieving accuracy. These methods include fact-checking and the reliance on multiple and credible sources. In the United States historically, the deep professional emphasis on accuracy derives from a belief that focusing on “who, what, where, when, and how” is the best means for capturing a broad-based audience while avoiding political and legal conflicts (Christians, 2008; Danielian, 2008).

Though there is little question that scientists have developed an unrivaled institutional ability to arrive at approximately true observations about the world, as political scientist Roger Pielke (2007) argues, there are really only two communication roles that scientists can play in policy-related debates, and the reality of these roles might in fact be at odds with how scientists prefer to define themselves. As he describes, many scientists prefer to think of themselves as creating knowledge that can be drawn upon by policymakers but not entering into policy debates themselves. Though this self-defined role has great appeal, in reality even so-called pure scientists often engage in strategic communication as a means to promote their careers or to ensure continued government funding, framing research heavily in terms of social progress, societal benefits, breakthroughs and economic competitiveness (Hellsten & Nerlich, 2008).

A second imagined policy role among scientists is as a neutral “science arbiter,” providing science-related information, expertise, testimony, and reports when called upon by policymakers. However, as Pielke (2007) details with the example of the food pyramid and U.S. dietary guidelines, advisory committees and expert panels often engage in implicit normative considerations when providing scientific advice or provide input on a narrow set of predetermined policy options (See also Hilgartner, 2000). A result is that the imagined role of science arbiter shifts into stealth issue advocacy.

Still, according to Pielke (2007), issue advocate is one of the authentic communication roles that scientists do assume in policy debates. Indeed, there is nothing ethically wrong with scientists serving as issue advocates, as long as they follow the normative imperatives outlined so far, namely that they are open and transparent about their advocacy, communicate the values that shape their policy preferences, and are true to what is conventionally known about the related science.[3] After all, scientists are citizens too and have their own self-interests and values at stake in many policy debates. On evolution, for example, leading science organizations advocate in a bi-partisan way for a clear policy outcome: teaching only evolution in public school science classes. Similarly, on embryonic stem cell research, most scientists favor unrestricted government funding for research (stopping short, of course, of human cloning).

While scientists are often–and justifiably–issue advocates, they can also serve as what Pielke calls honest brokers. In this role, a diversity of scientists, operating for example as an interdisciplinary National Academies panel, seeks to “place scientific understandings in the context of a smorgasbord of policy options” (p. 17). When scientists communicate from the position of honest broker, they openly acknowledge that science alone will not resolve political differences over policy.

Instead, scientists use their expertise to expand the scope and diversity of policy options under consideration. For example, on climate change, scientists serving as honest brokers would provide input on the feasibility of cap and trade legislation to reduce greenhouse gas emissions, but they would also highlight alternative actions such as the potential of alternative energy technology to reduce emissions or the ability of technology to capture and remove carbon dioxide from the atmosphere (see Tierney, 2009 for a recent discussion).

Pielke’s typology of issue advocate and honest broker can also be applied to the communication role of journalists in science policy debates, especially in an era when journalists seek new financial models for news production and delivery. For example, many veteran science journalists have been forced to leave their jobs at major news organizations while early career journalists encounter limited job prospects. As an alternative career path, some science reporters have joined with universities or foundations to work at the type of emerging participatory digital media outlets described earlier. The focus at these outlets is not only to inform but also to engage and mobilize the public around problems such as climate change.

Yet whether a journalist is playing the role of issue advocate or honest broker, the same ethical imperative of accuracy and truth-telling applies. Journalists should not engage in the false balancing of first premise claims on issues such as climate change where there is clear expert agreement in an area. Nor should they exaggerate the implications of expert consensus as a way to dramatize a complex topic such as climate change. Otherwise, journalists risk their own credibility and do further harm to public trust in the media as a conveyer of reliable information about science and public affairs (See Revkin, 2008; 2009 for discussions).

COMMUNICATION AS CONSENSUS OR AS CONFLICT?

In January 2008, the National Academies issued a revised edition of Science, Evolution, and Creationism, a report intentionally framed in a manner that would more effectively engage audiences who remain uncertain about evolution and its place in the public school curriculum. To guide their efforts, the Academies commissioned focus groups and a national survey to gauge the extent of lay citizens’ understanding of the processes, nature, and limits of science. They also specifically wanted to test various frames that explained why alternatives to evolution were inappropriate for science class (Labov & Pope, 2008). The National Academies’ use of audience research in structuring their report is worth reviewing, since it stands as a leading example of how to ethically employ framing to move beyond polarization and to promote public dialogue on historically divisive issues.

The Academies’ committee had expected that a convincing storyline for the public on evolution would be a public accountability frame, emphasizing past legal decisions and the doctrine of church-state separation. Yet the data revealed that audiences were not persuaded by this framing of the issue. Instead, somewhat surprisingly, the research pointed to the effectiveness of a social progress frame that defined evolutionary science as the modern building block for advances in medicine and agriculture. The research also underscored the effectiveness of a middle-way/compromise frame, reassuring the public that evolution and religious faith can be fully compatible. Taking careful note of this feedback, the National Academies decided to structure and then publicize the final version of the report around these core frames.

To reinforce these messages, the National Academies report was produced in partnership with the Institute of Medicine, with the authoring committee chaired by Francisco Ayala, a leading biologist who had once trained for the Catholic priesthood. The report opens with a compelling “detective story” narrative of the supporting evidence for evolution, yet placed prominently in the first few pages is a call out box titled “Evolution in Medicine: Combating New Infectious Diseases,” featuring an iconic picture of passengers on a plane wearing SARS masks. On subsequent pages, other social progress examples are made prominent in call out boxes titled “Evolution in Agriculture: The Domestication of Wheat” and “Evolving Industry: Putting Natural Selection to Work.” Lead quotes in the press release feature a similar emphasis.

To engage religious audiences, at the end of the first chapter, following a definition of science, there is a prominent three-page special color section that features testimonials from religious scientists, religious leaders and official church position statements, all endorsing the view that religion and evolution are compatible. Both the report and the press release state that: “The evidence for evolution can be fully compatible with religious faith. Science and religion are different ways of understanding the world. Needlessly placing them in opposition reduces the potential of each to contribute to a better future.” In a subsequent journal editorial, these core themes as featured in the report were endorsed by twenty professional science societies and organizations (FASEB 2008).

The Richard Dawkins School of Communication

For the National Academies and these professional societies, political conflicts over evolution have been a lesson learned as to the importance of connecting with diverse audiences and building consensus around commonly shared values. Yet what continues to be the loudest science-affiliated voice on the matter of evolution takes a decidedly different framing strategy. Several scientist authors and pundits, led by the biologist Richard Dawkins (2006), argue that the implications of evolutionary science undermine not only the validity of religion but also respect for all religious faith. Their claims help fuel the conflict frame in the news media, generating journalistic frame devices that emphasize “God vs. Science,” or “Science versus religion.” These maverick communicators, dubbed “The New Atheists,” also reinforce deficit model thinking, consistently blaming conflict over evolution on public ignorance and irrational religious beliefs.

Dawkins, for example, argues as a scientist that religion is comparable to a mental virus or “meme” that can be explained through evolution, that religious believers are delusional, and that in contrast, atheists are representative of a healthy, independent, and pro-science mind. In making these claims, not only does Dawkins use his authority as the “Oxford University Professor of the Public Understanding of Science” to denigrate various social groups, but he gives resonance to the false narrative of social conservatives that the scientific establishment has an anti-religion agenda.

The conflict narrative is powerfully employed in the 2008 anti-evolution documentary Expelled: No Intelligence Allowed.  By relying almost exclusively on interviews with outspoken atheist scientists such as Dawkins and the blogger PZ Myers, Expelled reinforces the false impression that evolution and faith are inherently incompatible and that scientists are openly hostile to religion. In the film, the comedic actor Ben Stein plays the role of a conservative Michael Moore, taking viewers on an investigative journey into the realm of “Big Science,” an institution where Stein concludes that “scientists are not allowed to even think thoughts that involve an intelligent creator.”

Stein and the film’s producers employ a public accountability narrative to suggest that scientists have been denied tenure and that research has been suppressed, all in the service of an atheist agenda to hide the supposedly fatal flaws in evolutionary theory. As central frame devices, the film uses historic footage of the Berlin Wall and emphasizes freedom as a central American value. The sinister message is that “Darwinism” has led to atheism, fascism, and communism. As a corollary, if Americans can join Stein in tearing down the wall of censorship in science, it would open the way to religious freedom and cultural renewal.

One leading example from the film is an interview with Myers, a professor of biology at the University of Minnesota-Morris and author of the Pharyngula blog. Myers’ comments in the film reflect much of the content of his blog, which is estimated to receive more than 1 million readers per month. Interviewed in his laboratory, against a backdrop of microscopes and scientific equipment, Myers offers the following view of religion:

Religion is naivete that gives some people comfort and we don’t want to take it away from them. It’s like knitting, people like to knit. We are not going to take their knitting needles away, we are not going to take away their churches, but we have to get it to a place where religion is treated at a level that it should be treated. That is something fun that people get together and do on the weekend, and really doesn’t affect their life as much as it has been so far.

In a follow up, when prompted to discuss how he believes this goal might be accomplished, Myers offers a line of reasoning that reflects the deficit model paradigm, arguing that science literacy is in direct conflict with religious belief:

Greater science literacy, which is going to lead to the erosion of religion, and then we will get this nice positive feedback mechanism going where as religion slowly fades away we will get more and more science to replace it, and that will displace more and more religion which will allow more and more science in and we will eventually get to that point where religion has taken that appropriate place as a side dish rather than a main course.

By the end of its spring 2008 run in theaters, Expelled ranked as one of the top grossing public affairs documentaries in U.S. history. Even more troubling have been the advanced screenings of Expelled for policymakers, interest groups, and other influentials. These screenings have been used to promote “Academic Freedom Acts” in several states, legislation that would encourage teachers (as a matter of “academic freedom”) to discuss the alleged flaws in evolutionary science. In June 2008, an Academic Freedom bill was successfully passed into law in Louisiana with similar legislation under consideration in other states (See Nisbet, 2008; 2009a for more).

As social critics and pundits, there is nothing ethically wrong with Dawkins, Myers, and other so-called New Atheists arguing their personal views on religion, using as exclamation points carefully framed comparisons to fairies, hobgoblins, knitting, and child abuse. Similar to the feminist movement of the 1960s, Dawkins describes his communication goal as “consciousness raising” among the non-religious and those skeptical of religion.

Yet when Dawkins and other New Atheists also use the trust granted them as scientists to argue that religion is a scientific question, that science undermines even respect for religious publics, they employ framing unethically, drawing upon the rhetorical authority of science to stigmatize and attack various social groups. In the process, New Atheists turn what normatively should be a public dialogue about science and religion into a shouting match and media spectacle.

Partisan Soldiers with Science on their Side

As described earlier, a significant difference between the Bush and Obama administration, at least at this early stage in the latter’s presidency, is that the Bush administration appeared willing to distort, obstruct, and re-frame for political gain the “first premise” conclusions of scientific experts and agencies, especially on scientific research related to climate change and the environment.

In response, during the Bush administration, many scientists, journalists, elected officials, and political strategists focused on public accountability as a call-to-arms “to defend science.” These advocates accused the George W. Bush administration of putting politics ahead of science and expertise on a number of issues, including climate change. For example, in the 2004 election, Democratic presidential candidate U.S. Senator John Kerry (D-MA) made strategic use of the public accountability frame, comparing distortions on climate change to the administration’s use of intelligence to invade Iraq: “What I worry about with the president is that he’s not acknowledging what’s on the ground, he’s not acknowledging the realities of North Korea, he’s not acknowledging the truth of the science of stem-cell research or of global warming and other issues.”

In 2005, journalist Chris Mooney’s best-selling The Republican War on Science helped crystallize the public accountability train of thought, turning the “war on science” into a partisan rallying cry.  In 2007, Hillary Clinton, in a speech marking the 50th anniversary of Sputnik, promised to end the “war on science” in American politics, highlighting the emergent prominence of this frame device.

The public accountability frame has outraged and intensified the commitment of many Democrats, environmental advocates, and scientists, motivating them to label Republican and conservative political figures as “deniers” on climate change and to engage in sharp rhetorical attacks on political opponents in other policy disputes. Yet for many members of the public, “war on science” claims are likely ignored as just more elite rancor or only further alienate Republicans on the issue.

Framing will always be a part of electoral politics and scientists as citizens should actively participate in political campaigns. Yet similar to the case of New Atheists, if scientists speak from their authority and institutional position as trusted experts, using framing to claim that a specific political party or a candidate is either “pro-science” or “anti-science,” the result is likely to be both normatively and strategically undesirable.

First, claims of a “war on science” or a “rising anti-science culture” are inaccurate and, similar to the New Atheist movement, reinforce deficit model assumptions. In Congress, for example, on the great majority of issues there is widespread bi-partisan support for science, a reality reflected in Federal spending on basic research and bi-partisan boosterism in areas such as food biotechnology (see Nisbet & Huge, 2006 for a review). Even members of Congress who personally believe in creationism are likely to vote for broad-based funding of scientific research, since they perceive science generally in terms of social progress and economic competitiveness. Moreover, in terms of the general public, as detailed at the beginning of this chapter, public opinion research shows that science and scientists enjoy widespread admiration, trust, and support among Americans, no matter their political identification or religious views.

The unintended consequence of “war on science” claims is that given the miserly nature of the public, the framing strategy easily reinforces the partisan divide on issues such as stem cell research and climate change while promoting a false narrative that science is for Democrats and not for Republicans. Since 2004, when the Democratic Party began to use stem cell research and climate change as part of an electoral “wedge strategy,” public perceptions have predictably followed. With these partisan messages as a strong heuristic, polls show that the differences between Democrats and Republicans in views of embryonic stem cell research and climate change have widened to more than thirty percentage points respectively (Dunlap & McCright 2008; Pew 2008; VCU Life Sciences, 2008).

In fact, this persistent and widening gap in perceptions over the past decade suggests that climate change and stem cell research have joined a short list of issues such as gun control or taxes that define what it means to be a partisan in the United States. So like the New Atheists, while “war on science” claimants believe they are defending the integrity of science, they are more likely to be part of the communication problem, reinforcing partisan divisions across key issues.

CONCLUSION

In this chapter, I have argued that in communicating about science-related policy issues, scientists and journalists can adopt one of two roles, either serving as honest brokers or as issue advocates. In either role, the use of framing is unavoidable since it is a natural part of the communication process. In fact, past research points to a set of frames that apply consistently across science-related debates, serving as interpretative resources that society draws upon to collectively make sense of complex and uncertain policy choices.

Even though it is very difficult to “break the frame” on a science-related topic, this deductive typology for identifying the meanings and interpretations surrounding a policy debate can be very useful. Audience-based research can and should inform communication planning and strategy, leading to a range of potential outcomes. Yet in applying framing research to public engagement efforts, there are four key ethical imperatives to keep in mind. These include:

* Emphasizing dialogue and the exchange of perspectives, rather than traditional top-down approaches to communication. This imperative can be promoted through face-to-face deliberative forums, through new models of digital participatory media, or, as in the case of the National Academies, by using research to identify frames that emphasize common ground and promote dialogue.

* Effectively and transparently communicating the values—or the second premise—that guide a policy decision rather than simplistically defining a policy debate as a matter of “sound science” or “driven by science.” In a policy debate, when scientists or journalists focus exclusively on these types of first premise claims, they create the incentives for interest groups to turn science into just another political resource, leading to distortion and exaggerations over scientific evidence and uncertainty.

* No matter their role as issue advocate or honest broker, accuracy in communication needs to be maintained. Both scientists and journalists must respect the uncertainty that is inherent to any technical question, resisting the tendency to engage in either false balance or exaggeration. As in the case of climate change, each time a scientific claim is proven false or inaccurate, it risks further alienating publics already distrustful of the science and scientists.

* Finally, scientists and journalists should avoid using framing to denigrate or attack religion or to define political parties and leaders as either “anti-science” or “pro-science.” Framing will always be an effective and legitimate part of social criticism and electoral politics, but for scientists and journalists to simplistically define critiques of religion or opposition to a political candidate as a “matter of science and reason” is not only inaccurate, but also alienates key publics, impairing efforts at dialogue and consensus-building.

REFERENCES

Bauer, M. (2008). Survey research and the public understanding of science. In M. Bucchi & B. Smart (Eds.), Handbook of Public Communication on Science and Technology. London: Routledge.

Besley, J. C., Kramer, V. L., Yao, Q., & Toumey, C. P. (2008). Interpersonal discussion following citizen engagement on emerging technology. Science Communication, 30 (4), 209-235.

Besley, J. C., & McComas, K. A. (2005). Framing justice: Using the concept of procedural justice to advance political communication research. Communication Theory, 4, 414-436.

Borchelt, R. & Hudson, K. (2008). Engaging the scientific community with the public. Science Progress, Spring/Summer, 78–81.

CSEC (2001). Public attitudes to agricultural biotechnologies in Europe: Final report of the PABE project. Lancaster: CSEC, Lancaster University. (http://www.lancs.ac.uk/users/csec/).

Croteau, D., Hoynes, W., & Ryan, C. (2005). Rhyming hope and history: Activists, academics, and social movement scholarship. Minneapolis, MN: University of Minnesota Press.

Dahinden, U. (2002). Biotechnology in Switzerland: Frames in a Heated Debate. Science Communication, 24, 184-197.

Dawkins, R. (2006). The God Delusion. New York: Houghton Mifflin.

Downs, A. (1957). An economic theory of democracy. New York: Harper.

Dunlap, R.E. & McCright, A.M. (2008). A Widening Gap: Republican and Democratic Views on Climate Change. Environment, 50 (5), 26-35.

Durant, J., Bauer, M.W., & Gaskell, G. (1998). Biotechnology in the Public Sphere: A European Sourcebook. East Lansing, MI: Michigan State University Press.

Einsiedel, E. (2008). Public engagement and dialogue: A research review. In M. Bucchi & B. Trench (Eds.), Handbook of Public Communication of Science and Technology. London: Routledge.

Ferree, M.M., Gamson, W.A., Gerhards, J., & Rucht, D. (2002). Shaping Abortion Discourse: Democracy and the Public Sphere in Germany and the United States. New York, NY: Cambridge University Press.

Gamson, W.A. & Modigliani, A. (1989). Media Discourse and Public Opinion on Nuclear Power: A Constructionist Approach. American Journal of Sociology, 95, 1-37.

Goidel, K. & Nisbet, M.C. (2006). Exploring the Roots of Public Participation in the Controversy over Stem Cell Research and Cloning. Political Behavior, 28 (2), 175-192.

Goldston, D. (2008). The Sputnik Myth. Nature  456, 561.

Hellsten, I. & Nerlich, B. (2008). Genetics and genomics: The politics and ethics of metaphorical framing. In M. Bucchi & B. Trench (Eds.), Handbook of Public Communication of Science and Technology. London: Routledge.

Hilgartner, S. (2000). Science on Stage: Expert Advice as Public Drama. Stanford, CA: Stanford University Press.

Holland, E.A., Pleasant, A., Quatrano, S., Gerst, R., Nisbet, M.C. & Mooney, C. (2007). Letters: The Risks and Advantages of Framing Science. Science 317 (5842), 1168b.

Irwin, A. & Michael, M. (2003). Science, social theory, and public knowledge. Philadelphia: Open University Press.

Labov, J. & Pope, B.K. (2008). Understanding Our Audiences: The Design and Evolution of Science, Evolution, and Creationism. CBE-Life Sciences Education.

Leiserowitz, A., Maibach, E., & Roser-Renouf, C. (2008). Global warming’s six Americas. Yale University Project on Climate Change. Available at http://research.yale.edu/environment/uploads/SixAmericas.pdf.

Marris, C. (2001). Public views on GMOs: Deconstructing the myths. EMBO Reports, 2 (7), 545-548.

McKibben, B. (2008). The Bill McKibben Reader. New York: Holt Paperbacks.

Mooney, C. (2005). The Republican War on Science. New York: Basic Books.

National Academies of Science and Institute of Medicine (2008). Science, Evolution, and Creationism. Washington, DC: National Academies Press.

National Science Board (2008). Science and Engineering Indicators 2008 (NSB 08-01; NSB 08-01A). Arlington, VA: National Science Foundation.

Nisbet, M.C. (2008, Sept./Oct.). Ben Stein’s Trojan Horse: Mobilizing the state house and local news agenda. Skeptical Inquirer, 32 (5), 16-18.

Nisbet, M.C. (2009a). Framing Science: A New Paradigm in Public Engagement. In L. Kahlor & P. Stout (eds.), Understanding Science: New Agendas in Science Communication. New York: Taylor & Francis.

Nisbet, M.C. (2009b). Communicating Climate Change: Why Frames Matter to Public Engagement. Environment, 51 (2), 12-23.

Nisbet, M.C. & Goidel, K. (2007). Understanding citizen perceptions of science controversy: Bridging the ethnographic-survey research divide. Public Understanding of Science, 16 (4), 421-440.

Nisbet, M.C. & Huge, M. (2006). Attention cycles and frames in the plant biotechnology debate: Managing power and participation through the press/policy connection. Harvard International Journal of Press/Politics, 11 (2), 3-40.

Nisbet, M.C. & Lewenstein, B.V. (2002). Biotechnology and the American media: The policy process and the elite press, 1970 to 1999. Science Communication, 23 (4) 359-391.

Pan, Z. & Kosicki, G. M. (1993). Framing analysis: An approach to news discourse. Political Communication, 10, 55-75.

Pew (2008, May 8). A Deeper Partisan Divide on Global Warming. Pew Research Center for the People and the Press. Retrieved on November 15, 2008, from http://people-press.org/report/417/a-deeper-partisan-divide-over-global-warming.

Pielke, R. (2007). The Honest Broker: Making Sense of Science in Policy and Politics. Cambridge: Cambridge University Press.

Popkin, S.L. (1991). The Reasoning Voter. Chicago, IL: University of Chicago Press.

Powell, M. & Kleinman, D. (2008). Building citizen capacities for participation in nanotechnology decision-making. Public Understanding of Science, 17 (3), 329-348.

Price, V., Nir, L., & Cappella, J.N. (2005). Framing public discussion of gay civil unions. Public Opinion Quarterly, 69 (2), 179-212.

Prior, M. (2005). News v. Entertainment: How Increasing Media Choice Widens Gaps in Political Knowledge and Turnout. American Journal of Political Science, 49 (3), 577-592.

Revkin, A. (2007). Climate change as news: Challenges in communicating environmental science. In J. C. DiMento & P. M. Doughman (Eds.), Climate change: What it means for us, our children, and our grandchildren (pp. 139-160). Cambridge, MA: MIT Press.

Revkin, A. (2009, Feb. 24). In climate debate, exaggeration is a pitfall. New York Times. Available at http://www.nytimes.com/2009/02/25/science/earth/25hype.html.

Scheufele, D.A. (1999). Framing as a Theory of Media Effects. Journal of Communication, 49 (1), 103-122.

Scheufele, D. A. (2006). Messages and heuristics: How audiences form attitudes about emerging technologies. In J. Turney (Ed.), Engaging science: Thoughts, deeds, analysis and action (pp. 20-25). London: The Wellcome Trust.

Scheufele, D. A., & Tewksbury, D. (2007). Framing, agenda setting, and priming: The evolution of three media effects models. Journal of Communication, 57(1), 9-20.

Time. (2006, April 3). Global warming: Be worried. Be very worried. Retrieved on November 15, 2008, from http://www.time.com/time/covers/0,16641,20060403,00.html.

The Scientist (2007). Here’s what you think. The Scientist, 21 (10), 42.

Wynne, B. (1992). Misunderstood misunderstanding: Social identities and public uptake of science. Public Understanding of Science, 1, 281-304.


[1] I owe the comparison of “communicating the first and second premise” to Oregon State University philosopher Kathleen Dean Moore, who organized a March 2009 workshop bringing together humanists, artists, social scientists, and scientists to strategize new ways of communicating “the second premise” on climate change, or a values-based reason for action. See the Web site of The Spring Creek Project at http://springcreek.oregonstate.edu/.

[2] The call to action and affiliated Web project can be found at http://christiansandclimate.org/learn/call-to-action/.

[3] In the next section, I discuss two other important imperatives, namely to avoid denigrating or stereotyping rival social groups and to avoid defining one political party or political candidate as either “pro-science” or “anti-science.”