Social Prejudice – Watts Up With That?

Reposted by Dr. Judith Curry’s Climate Etc.

Posted on Apr 25th, 2021 by curryja

by Judith Curry

“Is the road to scientific hell paved with good intentions?” – political psychologist Philip Tetlock (1994)

Part I of this series dealt with logical errors. Part II dealt with biases related to a consensus-building process. Part III deals with the role of social conflict and bias.

Additional biases are sparked by conflicts between an individual’s responsibility for the responsible conduct of research and the larger ethical issues related to public and environmental welfare. Social biases are also triggered by career goals, loyalty to colleagues, and institutional loyalties.

Scientists are responsible for adhering to the principles of ethical research and professional standards. But what happens when other responsibilities conflict with these professional standards? These may include responsibilities to their conscience, their colleagues, institutions, the public, and/or the environment. One can imagine many different conflicts among these responsibilities that can bias the scientific process. For example, scientists who have been heavily involved with the IPCC may have an interest in preserving the importance of the IPCC and its consensus, which is central to their professional success, funding, and influence.

Arguably the most important of these are conflicts between the responsible conduct of research and larger ethical issues related to the well-being of the public and the environment. Fuller and Mosher’s book Climategate: The CRUtape Letters argued that “noble cause corruption” was a major motivation behind the Climategate deceptions. Noble cause corruption occurs when the end of protecting the climate (noble) is taken to justify the means of sabotaging one’s scientific opponents (ignoble).

University of Virginia psychologist Brian Nosek argues that the most common and problematic bias in science is “motivated reasoning.” People who have a “dog in the fight” (reputational, financial, ideological, or political) interpret observations to fit a particular idea that supports their particular “dog.” The term “motivated reasoning” is usually reserved for political motivations, but preserving one’s reputation or funding is also a powerful motivator among scientists.

Political values become embedded in science when value statements or ideological claims are wrongly treated as objective truth. Scientists hold a range of attitudes toward the environment; the problem arises from the presumption that a narrow range of attitudes is correct, and that those who disagree are in denial. This effectively converts a prevailing political ideology about climate change into “reality.”

Confirmation bias can become even worse when people are confronted with questions that trigger moral emotions and concerns about group identity. People’s beliefs become more extreme when they are surrounded by like-minded colleagues. They come to assume that their opinions are not only the norm but also the truth, creating what social psychologist Jonathan Haidt calls a “tribal moral community,” with its own sacred values about what is worth studying and what is taboo. Such biases can lead to widely accepted claims that reflect the blind spots of the scientific community more than legitimate scientific conclusions.

Psychologists Cusimano and Lombrozo noted that people faced with a dilemma between believing what an impartial assessment of the evidence supports and believing what would better fulfill a moral obligation often believe in line with the latter. Cusimano and Lombrozo found that morally good beliefs require less evidence than morally bad beliefs. They also found that people sometimes treat the moral value of a belief as an independent justification for holding it.

Motivated biases become particularly problematic when they are institutionalized, with endorsements from professional societies, editorials by journal editors, and public statements from the IPCC leadership.

