
Credibility slips according to US survey on public perceptions of scientists

Figure 1. Perceptions of scientists’ credibility. [downloaded from https://www.annenbergpublicpolicycenter.org/annenberg-survey-finds-public-perceptions-of-scientists-credibility-slips/]

A June 26, 2024 news item on ScienceDaily describes the research, which resulted in the graphic you see above,

New analyses from the Annenberg Public Policy Center find that public perceptions of scientists’ credibility — measured as their competence, trustworthiness, and the extent to which they are perceived to share an individual’s values — remain high, but their perceived competence and trustworthiness eroded somewhat between 2023 and 2024. The research also found that public perceptions of scientists working in artificial intelligence (AI) differ from those of scientists as a whole.

A June 26, 2024 Annenberg Public Policy Center of the University of Pennsylvania news release (also on EurekAlert and also received by email), which originated the news item, describes a series of surveys and how the information was gathered, and offers more detail about the results. Note 1: All links removed; Note 2: You can find links and citations for the papers mentioned in the news release at the end of this posting.

From 2018-2022, the Annenberg Public Policy Center (APPC) of the University of Pennsylvania relied on national cross-sectional surveys to monitor perceptions of science and scientists. In 2023-24, APPC moved to a nationally representative empaneled sample to make it possible to observe changes in individual perceptions.

The February 2024 findings, released today to coincide with the address by National Academy of Sciences President Marcia McNutt on “The State of the Science,” come from an analysis of responses from an empaneled national probability sample of U.S. adults surveyed in February 2023 (n=1,638 respondents), November 2023 (n=1,538), and February 2024 (n=1,555).

Drawing on the 2022 cross-sectional data, in an article titled “Factors Assessing Science’s Self-Presentation model and their effect on conservatives’ and liberals’ support for funding science,” published in Proceedings of the National Academy of Sciences (September 2023), Annenberg-affiliated researchers Yotam Ophir (State University of New York at Buffalo and an APPC distinguished research fellow), Dror Walter (Georgia State University and an APPC distinguished research fellow), and Patrick E. Jamieson and Kathleen Hall Jamieson of the Annenberg Public Policy Center isolated factors that underlie perceptions of scientists (Factors Assessing Science’s Self-Presentation, or FASS). These factors predict public support for increased funding of science and support for federal funding of basic research.

The five factors in FASS are whether science and scientists are perceived to be credible and prudent, whether they are perceived to overcome bias and to correct error (self-correcting), and whether their work is seen to benefit people like the respondent and the country as a whole (beneficial). In a 2024 publication titled “The Politicization of Climate Science: Media Consumption, Perceptions of Science and Scientists, and Support for Policy” (May 26, 2024) in the Journal of Health Communication, the same team showed that these five factors mediate the relationship between exposure to media sources such as Fox News and outcomes such as belief in anthropogenic climate change, perception of the threat it poses, and support for climate-friendly policies such as a carbon tax.
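The release doesn’t say how the mediation was tested (the published paper presumably uses a formal structural model), but for readers unfamiliar with the idea, here is a minimal regression-based sketch of the logic, exposure → perceived credibility → climate belief, on synthetic data with hypothetical variable names.

```python
# Minimal sketch of the mediation logic described above (exposure -> FASS
# factor -> climate attitude). Data are synthetic and variable names are
# hypothetical; this is not the published analysis.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2000
exposure = rng.normal(size=n)                        # e.g., consumption of a given news source
credibility = -0.4 * exposure + rng.normal(size=n)   # mediator: perceived credibility of scientists
belief = 0.6 * credibility - 0.1 * exposure + rng.normal(size=n)  # outcome: climate belief

df = pd.DataFrame({"exposure": exposure, "credibility": credibility, "belief": belief})

# Path a: exposure -> mediator
a = smf.ols("credibility ~ exposure", data=df).fit().params["exposure"]
# Paths b and c': mediator and exposure -> outcome
model_b = smf.ols("belief ~ credibility + exposure", data=df).fit()
b = model_b.params["credibility"]
c_prime = model_b.params["exposure"]

print(f"indirect effect (a*b) = {a * b:.2f}, direct effect (c') = {c_prime:.2f}")
```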

Speaking about the FASS model, Jamieson, director of the Annenberg Public Policy Center and director of the survey, said that “because our 13 core questions reliably reduce to five factors with significant predictive power, the ASK survey’s core questions make it possible to isolate both stability and changes in public perception of science and scientists across time.” (See the appendix for the list of questions.)
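The 13 items and their loadings aren’t reproduced in the release (they’re in the FASS paper), but as a rough illustration of what “reduce to five factors” means in practice, here’s a minimal exploratory factor analysis sketch on synthetic, hypothetical Likert-type responses.

```python
# Illustrative only: reducing 13 Likert-type items to 5 factors, echoing the
# FASS framing (credible, prudent, overcoming bias, self-correcting,
# beneficial). Item names and responses are synthetic, not the APPC items.
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)

# Hypothetical 5-point agreement responses (1 = strongly disagree ... 5 = strongly agree)
items = [f"item_{i + 1}" for i in range(13)]
responses = pd.DataFrame(rng.integers(1, 6, size=(1000, 13)), columns=items)

fa = FactorAnalysis(n_components=5, rotation="varimax", random_state=0)
scores = fa.fit_transform(responses)        # per-respondent factor scores
loadings = pd.DataFrame(
    fa.components_.T,                       # item-by-factor loadings
    index=items,
    columns=[f"factor_{k + 1}" for k in range(5)],
)
print(loadings.round(2))
```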

The new research finds that while scientists are held in high regard, two of the three dimensions that make up credibility – perceptions of competence and trustworthiness – showed a small but statistically significant drop from 2023 to 2024, as did both measures of beneficial. The 2024 survey data also indicate that the public considers AI scientists less credible than scientists in general, with notably fewer people saying that AI scientists are competent and trustworthy and “share my values” than scientists generally.

“Although confidence in science remains high overall, the survey reveals concerns about AI science,” Jamieson said. “The finding is unsurprising. Generative AI is an emerging area of science filled with both great promise and great potential peril.”

The data are based on Annenberg Science Knowledge (ASK) waves of the Annenberg Science and Public Health (ASAPH) surveys conducted in 2023 and 2024. The findings labeled 2023 are based on a February 2023 survey, and the findings labeled 2024 are based on combined ASAPH survey half-samples surveyed in November 2023 and February 2024.

For further details, download the toplines and a series of figures that accompany this summary.

Perceptions of scientists overall

In the FASS model, perceptions of scientists’ credibility are assessed through perceptions of whether scientists are competent, trustworthy, and “share my values.” The first two of those values slipped in the most recent survey. In 2024, 70% of those surveyed strongly or somewhat agree that scientists are competent (down from 77% in 2023) and 59% strongly or somewhat agree that scientists are trustworthy (down from 67% in 2023). (See figure 1 [see the first item in this post], and figs. 2-4 for other findings.)

The survey also found that in 2024, fewer people felt that scientists’ findings benefit “the country as a whole” and “benefit people like me.” In 2024, 66% strongly or somewhat agreed that findings benefit the country as a whole (down from 75% in 2023). Belief that scientists’ findings “benefit people like me” also declined, to 60% from 68%. Taken together, those two questions make up the beneficial factor of FASS. (See fig. 5.)

The findings follow sustained attacks on climate and Covid-19-related science and, more recently, public concerns about the rapid development and deployment of artificial intelligence.

Comparing perceptions of scientists in general with climate and AI scientists

Credibility: When asked about the three dimensions underlying scientists’ credibility, AI scientists score lower on all three. (See fig. 6.)

  • Competent: 70% strongly/somewhat agree that scientists are competent, but only 62% for climate scientists and 49% for AI scientists.
  • Trustworthy: 59% agree scientists are trustworthy, 54% agree for climate scientists, 28% for AI scientists.
  • Share my values: More respondents agree that climate scientists “share my values” (38%) than say the same of scientists in general (36%) or AI scientists (15%). More people disagree with this statement for AI scientists (35%) than for the others.

Prudence: Asked whether they agree or disagree that science by various groups of scientists “creates unintended consequences and replaces older problems with new ones,” over half of those surveyed (59%) agree that AI scientists create unintended consequences and just 9% disagree. (See fig. 7.)

Overcoming bias: Just 42% of those surveyed agree that scientists “are able to overcome human and political biases,” but only 21% feel that way about AI scientists. In fact, 41% disagree that AI scientists are able to overcome human and political biases. In another area, just 23% agree that AI scientists provide unbiased conclusions in their area of inquiry and 38% disagree. (See fig. 8.)

Self-correction: Self-correction, or “organized skepticism expressed in expectations sustaining a culture of critique,” as the FASS paper puts it, is considered by some as a “hallmark of science.” AI scientists are seen as less likely than scientists or climate scientists to take action to prevent fraud; take responsibility for mistakes; or to have mistakes that are caught by peer review. (See fig. 9.)

Benefits: Asked about the benefits from scientists’ findings, 60% agree that scientists’ “findings benefit people like me,” though just 44% agree for climate scientists and 35% for AI scientists. Asked about whether findings benefit the country as a whole, 66% agree for scientists, 50% for climate scientists and 41% for AI scientists. (See fig. 10.)

Your best interest: The survey also asked respondents how much trust they have in scientists to act in the best interest of people like you. (This specific trust measure is not a part of the FASS battery.) Respondents have less trust in AI scientists than in others: 41% have a great deal/a lot of trust in medical scientists; 39% in climate scientists; 36% in scientists; and 12% in AI scientists. (See fig. 11.)

The data from ASK surveys have been used to date in four peer-reviewed papers:

  • Using 2019 ASK data: Jamieson, K. H., McNutt, M., Kiermer, V., & Sever, R. (2019). Signaling the trustworthiness of science. Proceedings of the National Academy of Sciences, 116(39), 19231-19236.
  • Using 2022 ASK data: Ophir, Y., Walter, D., Jamieson, P. E., & Jamieson, K. H. (2023). Factors Assessing Science’s Self-Presentation model and their effect on conservatives’ and liberals’ support for funding science. Proceedings of the National Academy of Sciences, 120(38), e2213838120.
  • Using 2024 ASK data: Lupia, A., Allison, D. B., Jamieson, K. H., Heimberg, J., Skipper, M., & Wolf, S. M. (2024). Trends in US public confidence in science and opportunities for progress. Proceedings of the National Academy of Sciences, 121(11), e2319488121.
  • Using Nov 2023 and Feb 2024 ASK data: Ophir, Y., Walter, D., Jamieson, P. E., & Jamieson, K. H. (2024). The politicization of climate science: Media consumption, perceptions of science and scientists, and support for policy. Journal of Health Communication, 29(sup1): 18-27.
     

APPC’s ASAPH survey

The survey data come from the 17th and 18th waves of a nationally representative panel of U.S. adults, first empaneled in April 2021, conducted for the Annenberg Public Policy Center by SSRS, an independent market research company. These waves of the Annenberg Science and Public Health (ASAPH) knowledge survey were fielded February 22-28, 2023, November 14-20, 2023, and February 6-12, 2024, and have margins of sampling error (MOE) of ± 3.2, 3.3 and 3.4 percentage points at the 95% confidence level. In November 2023, half of the sample was asked about “scientists” and the other half “climate scientists.” In February 2024, those initially asked about “scientists” were asked about “scientists studying AI” and the other half “scientists.” This provided two half samples addressing specific areas of study, while all panelists were asked about “scientists” generally. All figures are rounded to the nearest whole number and may not add to 100%. Combined subcategories may not add to totals in the topline and text due to rounding.
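As a quick arithmetic check on those margins of error: the textbook simple-random-sampling formula at p = 0.5 gives roughly ±2.4 to ±2.5 points for samples of this size, so the reported ±3.2 to ±3.4 presumably fold in a design effect from weighting; that last part is my assumption, not something the release states.

```python
# Quick check of the margin-of-error figures quoted above. The simple
# random-sampling formula at p = 0.5 is a lower bound; the gap to the
# reported MOEs is attributed here to a design effect, which is an
# assumption on my part.
from math import sqrt

Z_95 = 1.96  # critical value for a 95% confidence level

def moe_srs(n: int, p: float = 0.5) -> float:
    """Margin of error (percentage points) under simple random sampling."""
    return 100 * Z_95 * sqrt(p * (1 - p) / n)

for n, reported in [(1638, 3.2), (1538, 3.3), (1555, 3.4)]:
    srs = moe_srs(n)
    implied_deff = (reported / srs) ** 2  # design effect implied by the reported MOE
    print(f"n={n}: SRS MOE ≈ ±{srs:.1f} pts, reported ±{reported}, implied design effect ≈ {implied_deff:.1f}")
```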

The policy center has been tracking the American public’s knowledge, beliefs, and behaviors regarding vaccination, Covid-19, flu, maternal health, climate change, and other consequential health issues through this survey panel for over three years. In addition to Jamieson, the APPC team includes Shawn Patterson Jr., who analyzed the data; Patrick E. Jamieson, director of the Annenberg Health and Risk Communication Institute, who developed the questions; and Ken Winneg, managing director of survey research, who supervised the fielding of the survey.

Here are links to and citations for the papers listed above in the June 26, 2024 news release,

Using 2019 ASK data: Signaling the trustworthiness of science by Kathleen Hall Jamieson, Marcia McNutt, Veronique Kiermer, and Richard Sever. Proceedings of the National Academy of Sciences (PNAS), 116 (39), 19231-19236, September 23, 2019. DOI: https://doi.org/10.1073/pnas.1913039116

Using 2022 ASK data: Factors Assessing Science’s Self-Presentation model and their effect on conservatives’ and liberals’ support for funding science by Yotam Ophir, Dror Walter, Patrick E. Jamieson, and Kathleen Hall Jamieson. Proceedings of the National Academy of Sciences (PNAS), 120 (38), e2213838120, September 11, 2023. DOI: https://doi.org/10.1073/pnas.2213838120

Using 2024 ASK data: Trends in US public confidence in science and opportunities for progress by Arthur Lupia, David B. Allison, Kathleen Hall Jamieson, Jennifer Heimberg, Magdalena Skipper, and Susan M. Wolf. Proceedings of the National Academy of Sciences (PNAS), 121 (11), e2319488121, March 4, 2024. DOI: https://doi.org/10.1073/pnas.2319488121

Using Nov 2023 and Feb 2024 ASK data: The politicization of climate science: Media consumption, perceptions of science and scientists, and support for policy by Yotam Ophir, Dror Walter, Patrick E. Jamieson & Kathleen Hall Jamieson. Journal of Health Communication, 29 (sup1): 18-27. DOI: https://doi.org/10.1080/10810730.2024.2357571 Published online: 26 May 2024

The 2019 paper ‘Signaling …’ has been featured here before in a September 30, 2019 posting, “Do you believe in science?” In addition to some of my comments, I embedded Adam Lambert’s version of Cher’s song ‘Believe,’ where you’ll see Cher brush away a few tears as she listens to her dance hit remade as a love ballad.

The 2024 paper ‘Trends …’ has also been featured here, albeit briefly, in an April 8, 2024 posting, “Trust in science remains high but public questions scientists’ adherence to science’s norms.”

Trust in science remains high but public questions scientists’ adherence to science’s norms

A March 4, 2024 Annenberg Public Policy Center of the University of Pennsylvania news release (also on EurekAlert and received via email) announces research into public trust in science in the US,

Science is one of the most highly regarded institutions in America, with nearly three-quarters of the public expressing “a great deal” or “a fair amount” of confidence in scientists. But confidence in science has nonetheless declined over the past few years, since the early days of the Covid-19 pandemic, as it has for most other major social institutions.

In a new article, members of the Strategic Council of the National Academies of Sciences, Engineering, and Medicine [NASEM] examine what has happened to public confidence in science, why it has happened, and what can be done to elevate it. The researchers write that while there is broad public agreement about the values that should underpin science, the public questions whether scientists actually live up to these values and whether they can overcome their individual biases.

The paper, published in the Proceedings of the National Academy of Sciences (PNAS), relies in part on new data being released in connection with this article by the Annenberg Public Policy Center (APPC) of the University of Pennsylvania. The data come from the Annenberg Science Knowledge (ASK) survey conducted February 22-28, 2023, with an empaneled, nationally representative sample of 1,638 U.S. adults who were asked about their views on scientists and science. The margin of error for the entire sample is ± 3.2 percentage points at the 95% confidence level. (See the paper for the findings.) The survey is directed by APPC director Kathleen Hall Jamieson, a member of the Strategic Council and a co-author of the PNAS paper.

Decline in confidence comparable to other institutions

The researchers also examine trends in public confidence in science dating back 20 years from other sources, including the Pew Research Center and the General Social Survey of National Opinion Research at the University of Chicago. These show a recent decline consistent with the decline seen for other institutions.

“We’re of the view that trust has to be earned,” said lead author Arthur Lupia, a member of the NASEM’s Strategic Council for Research Excellence, Integrity, and Trust, and associate vice president for research at the University of Michigan. “We wanted to understand how trust in science is changing, and why, and is there anything that the scientific enterprise can do to regain trust?”

Highlights

“Confidence in science is high relative to nearly all other civic, cultural, and government institutions…,” the article states. In addition:

  • The public has high levels of confidence in scientists’ competence, trustworthiness, and honesty – 84% of survey respondents in February 2023 are very or somewhat confident that scientists provide the public with trustworthy information in the scientists’ area of inquiry.
  • Many in the public question whether scientists share their values and whether scientists can overcome their own biases. For instance, when asked whether scientists will or will not publish findings if a study’s results run counter to the interests of the organization running the study, 70% said scientists will not publish the findings.
  • The public has “consistent beliefs about how scientists should act and beliefs that support their confidence in science despite their concerns about scientists’ possible biases and distortive incentives.” For example, 84% of U.S. adults say it is somewhat or very important for scientists to disclose their funders and 92% say it is somewhat or very important that scientists be open to changing their minds based on new evidence.
  • However, when asked about scientists’ biases, just over half of U.S. adults (53%) say scientists provide the public with unbiased conclusions about their area of inquiry and just 42% say scientists generally are “able to overcome their human and political biases.”

Beyond measurements of trust in science

The Annenberg Public Policy Center’s ASK survey in February 2023 asked U.S. adults more nuanced questions about attitudes toward scientists.

“We’ve developed measures beyond trust or confidence in science in order to understand why some in the public are less supportive of science and scientists than others,” said Jamieson, who is also a professor of communication at the University of Pennsylvania’s Annenberg School for Communication. “Perceptions of whether scientists share one’s values, overcome their human and political biases, and correct mistakes are important as well.”

The ASK survey of U.S. adults found, for instance, that 81% regard scientists as competent, 70% as trustworthy, and 68% as honest, but only 42% say scientists “share my values.”

A more detailed analysis of the variables and effects seen in Annenberg’s surveys was published in September 2023 in PNAS in the paper “Factors Assessing Science’s Self-Presentation model and their effect on conservatives’ and liberals’ support for funding science.”

Confidence in science and Covid-19 vaccination status

The research published in PNAS was initiated by members of the NASEM’s Strategic Council for Research Excellence, Integrity, and Trust, which was established in 2021 to advance the integrity, ethics, resilience, and effectiveness of the research enterprise.

Lupia said the Strategic Council’s conversations about whether trust in science was declining and if so, why, began during the pandemic. “There was great science behind the Covid-19 vaccine, so why was the idea of people taking it so controversial?” he asked. “Covid deaths were so visible and yet the controversy over the vaccine was also so visible – kind of an icon of the public-health implications of declining trust in science.”

The article cites research from the Annenberg Public Policy Center that found important relationships between science-based forms of trust and the willingness to take a Covid-19 vaccine. Data from waves of another APPC survey of U.S. adults in five swing states during the 2020 campaign season – reported in a 2021 article in PNAS – showed that from July 2020 to February 2021, U.S. adults’ trust in health authorities was a significant predictor of the reported intention to get the Covid-19 vaccine. See the article “The role of non-COVID-specific and COVID-specific factors in predicting a shift in willingness to vaccinate: A panel study.”

How to raise confidence in science

Raising public confidence in science, the researchers write, “should not be premised on the assumption that society would be better off with higher levels of uncritical trust in the scientific community. Indeed, uncritical trust in science would violate the scientific norm of organized skepticism and be antithetical to science’s culture of challenge, critique, and self-correction.”

“Instead,” they propose, “researchers, scientific organizations, and the scientific community writ large need to redouble their commitment to conduct, communicate, critique, and – when error is found or misconduct detected – correct the published record in ways that both merit and earn public confidence.”

The data cited in the paper, they conclude, “suggest that the scientific community’s commitment to core values such as the culture of critique and correction, peer review, acknowledging limitations in data and methods, precise specification of key terms, and faithful accounts of evidence in every step of scientific practice and in every engagement with the public may help sustain confidence in scientific findings.”

“Trends in U.S. Public Confidence in Science and Opportunities for Progress” was published March 4, 2024, in PNAS. In addition to Jamieson and Lupia, the authors are David B. Allison, dean of the School of Public Health, Indiana University; Jennifer Heimberg, of the National Academies of Sciences, Engineering, and Medicine; Magdalena Skipper, editor-in-chief of the journal Nature; and Susan M. Wolf, of the University of Minnesota Law and Medical Schools. Allison is co-chair of the National Academies’ Strategic Council; Lupia, Jamieson, Skipper, and Wolf are members of the Council, and Heimberg is the director of the Council.

Here’s a link to and a citation for the paper,

Trends in U.S. public confidence in science and opportunities for progress by Arthur Lupia, David B. Allison, Kathleen Hall Jamieson, Jennifer Heimberg, Magdalena Skipper, and Susan M. Wolf. PNAS March 4, 2024 121 (11) e2319488121 DOI: https://doi.org/10.1073/pnas.2319488121

This paper is open access.

Curiosity may not kill the cat but, in science, it might be an antidote to partisanship

I haven’t stumbled across anything from the Cultural Cognition Project at Yale Law School in years so before moving onto their latest news, here’s more about the project,

The Cultural Cognition Project is a group of scholars interested in studying how cultural values shape public risk perceptions and related policy beliefs. Cultural cognition refers to the tendency of individuals to conform their beliefs about disputed matters of fact (e.g., whether global warming is a serious threat; whether the death penalty deters murder; whether gun control makes society more safe or less) to values that define their cultural identities. Project members are using the methods of various disciplines — including social psychology, anthropology, communications, and political science — to chart the impact of this phenomenon and to identify the mechanisms through which it operates. The Project also has an explicit normative objective: to identify processes of democratic decisionmaking by which society can resolve culturally grounded differences in belief in a manner that is both congenial to persons of diverse cultural outlooks and consistent with sound public policymaking.

It’s nice to catch up with some of the project’s latest work, from a Jan. 26, 2017 Yale University news release (also on EurekAlert),

Disputes over science-related policy issues such as climate change or fracking often seem as intractable as other politically charged debates. But in science, at least, simple curiosity might help bridge that partisan divide, according to new research.

In a study slated for publication in the journal Advances in Political Psychology, a Yale-led research team found that people who are curious about science are less polarized in their views on contentious issues than less-curious peers.

In an experiment, they found out why: Science-curious individuals are more willing to engage with surprising information that runs counter to their political predispositions.

“It’s a well-established finding that most people prefer to read or otherwise be exposed to information that fits rather than challenges their political preconceptions,” said research team leader Dan Kahan, Elizabeth K. Dollard Professor of Law and professor of psychology at Yale Law School. “This is called the echo-chamber effect.”

But science-curious individuals are more likely to venture out of that chamber, he said.

“When they are offered the choice to read news articles that support their views or challenge them on the basis of new evidence, science-curious individuals opt for the challenging information,” Kahan said. “For them, surprising pieces of evidence are bright shiny objects — they can’t help but grab at them.”

Kahan and other social scientists previously have shown that information based on scientific evidence can actually intensify — rather than moderate — political polarization on contentious topics such as gun control, climate change, fracking, or the safety of certain vaccines. The new study, which assessed science knowledge among subjects, reiterates the gaping divide separating how conservatives and liberals view science.

Republicans and Democrats with limited knowledge of science were equally likely to agree or disagree with the statement that “there is solid evidence that global warming is caused by human activity.” However, the most science-literate conservatives were much more likely to disagree with the statement than less-knowledgeable peers. The most knowledgeable liberals almost universally agreed with the statement.

“Whatever measure of critical reasoning we used, we always observed this depressing pattern: The members of the public most able to make sense of scientific evidence are in fact the most polarized,” Kahan said.
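For anyone curious how that “depressing pattern” is usually quantified, here’s a minimal sketch of a logistic regression with a science-literacy × ideology interaction on synthetic data; it’s illustrative only, not the specification Kahan’s team actually used.

```python
# Minimal sketch of the kind of interaction the passage describes: agreement
# with the climate statement modeled on ideology, science literacy, and their
# interaction. Data are synthetic; variable names and coding are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 3000
conservative = rng.integers(0, 2, size=n)   # 1 = conservative, 0 = liberal (hypothetical coding)
literacy = rng.normal(size=n)               # standardized science-literacy score

# Build in the pattern described above: literacy moves liberals toward
# agreement and conservatives away from it (the interaction term).
logit_p = 0.3 + 0.8 * literacy - 0.5 * conservative - 1.5 * literacy * conservative
agree = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

df = pd.DataFrame({"agree": agree, "conservative": conservative, "literacy": literacy})
fit = smf.logit("agree ~ literacy * conservative", data=df).fit(disp=False)
print(fit.params.round(2))  # a negative literacy:conservative term means the gap widens with literacy
```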

But knowledge of science, and curiosity about science, are not the same thing, the study shows.

The team became interested in curiosity because of its ongoing collaborative research project to improve public engagement with science documentaries involving the Cultural Cognition Project at Yale Law School, the Annenberg Public Policy Center of the University of Pennsylvania, and Tangled Bank Studios at the Howard Hughes Medical Institute.

They noticed that the curious — those who sought out science stories for personal pleasure — not only were more interested in viewing science films on a variety of topics but also did not display political polarization associated with contentious science issues.

The new study found, for instance, that a much higher percentage of curious liberals and conservatives chose to read stories that ran counter to their political beliefs than did their non-curious peers.

“As their science curiosity goes up, the polarizing effects of higher science comprehension dissipate, and people move the same direction on contentious policies like climate change and fracking,” Kahan said.

It is unclear whether curiosity applied to other controversial issues can minimize the partisan rancor that infects other areas of society. But Kahan believes that the curious from both sides of the political and cultural divide should make good ambassadors to the more doctrinaire members of their own groups.

“Politically curious people are a resource who can promote enlightened self-government by sharing scientific information they are naturally inclined to learn and share,” he said.

Here’s my standard link to and citation for the paper,

Science Curiosity and Political Information Processing by Dan M. Kahan, Asheley R Landrum, Katie Carpenter, Laura Helft, and Kathleen Hall Jamieson. Political Psychology, Volume 38, Issue Supplement S1, February 2017, Pages 179–199. DOI: 10.1111/pops.12396 First published: 26 January 2017

This paper is open access and it can also be accessed here.

I last mentioned Kahan and The Cultural Cognition Project in an April 10, 2014 posting (scroll down about 45% of the way) about responsible science.