How Easy Can It Be to Impress Consumers With Scientific Bullshit?

Consumers occasionally encounter statements in different forms of media that seem impressive, scientific and knowledgeable. Scientific statements are often phrased as rules about relationships between entities, or law-like arguments about the effects of some entities or processes on others. However, while the content of ‘bullshit’ statements may sound convincing, they are false and may be essentially meaningless. Not only can such statements impress consumers; they may also misguide them into adopting false beliefs and attitudes or taking damaging actions. In some areas, such as medicine and pharmaceuticals, the consequences can be especially harmful.

Science-based statements may be included, in particular, in brands’ advertising claims about products or services, as a way to corroborate their capabilities or benefits. The expected advantage of adding scientific evidence is to increase the persuasive power of the message and enhance consumers’ confidence in buying and using the product or service. Yet, scientific statements may also appear in other kinds of marketing and political campaigns. When such statements are suspected to be false, they may be described as scientific bullshit, and in relevant cases of public issues they may be linked with fake news. Additionally, science and technology are often closely tied: the development of technological products relies on scientific progress and discoveries, and hence science-oriented statements in technological contexts should be subject to similar scrutiny.

A claim that is actually bullshit can sound important, clever and reasonable even though it is really false and may mean nothing; yet, after reading it more carefully, one may realise it is odd, self-contradictory, or not so reasonable after all. In The Pocket Oxford Dictionary, the noun ‘bullshit’ (in coarse slang) is defined as “nonsense; pretended knowledge”; as a verb it means “talk nonsense or as if one has specialist knowledge”. Evans, Sleegers and Mlakar (2020) studied the receptivity of people to two types of bullshit statements, scientific bullshit and pseudo-profound bullshit: does receptivity to scientific bullshit correlate with receptivity to pseudo-profound bullshit (i.e., do they co-occur), and how similar or different are the conditions in which individuals are receptive to each (e.g., background factors)?

According to the definition by Evans et al., scientific bullshit is “a form of communication that relies on obtuse scientific jargon to convey a false sense of importance or significance” ([1] p. 402). They compare this kind of bullshit with pseudo-profound bullshit, which gives the impression of saying something profound (e.g., by using New Age language and profound-sounding terms) while being inherently unclear and impossible to clarify. Pseudo-profound bullshit statements touch superficially on matters related to the nature of the universe and existence; they may sound philosophical or spiritual. What pseudo-profound and scientific bullshit have in common is that the statements are usually “syntactically coherent, but impossible to verify as either true or false”. However, scientific bullshit uses scientific (rather than New Age) terminology, and it aims to sound true (not profound).

Evans and his colleagues found that scientific bullshit receptivity correlates positively (r=0.60) with receptivity to pseudo-profound bullshit. This supports the proposition that bullshit receptivity is a human tendency that manifests across content domains (as opposed to receptivity that depends on the particular domain concerned). Thus, consumers who are receptive to pseudo-profound bullshit are also more likely to be receptive to scientific bullshit. However, the researchers identified factors more specific to receptivity to scientific bullshit (while other factors are more characteristic of receptivity to pseudo-profound bullshit).

It was found that stronger belief in science correlates more highly with receptivity to scientific bullshit than with receptivity to pseudo-profound bullshit. This suggests, as might be expected, that people who have greater trust in science, and tend to rely on scientific information more extensively and confidently, are at the same time more susceptible to falling into the ‘traps’ of scientific bullshit. People who are more suspicious of science may be less swayed by scientific bullshit as well. Yet, those suspicious of science still tend to be receptive to scientific bullshit for other reasons, such as a greater reliance on intuitive thinking.

  • Note: This result seems somewhat less convincing because the correlations of belief in science with both types of receptivity are relatively weak (r=0.07 with pseudo-profound, r=0.12 with scientific), and while the difference in a Z-transform test for comparing the strength of correlations is statistically significant, the confidence intervals of those correlations largely overlap (see the sketch after this note).
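For readers unfamiliar with this kind of test, the following is a minimal Python sketch of the Fisher r-to-z machinery behind it. The sample size n is a hypothetical placeholder (not the study’s actual n), and the comparison function shown is the simpler independent-samples form; the study’s correlations come from the same respondents, which strictly calls for a dependent-correlations variant (e.g., Steiger’s test), so the printed numbers are illustrative only.

```python
import math

def fisher_z(r):
    """Fisher r-to-z transformation."""
    return 0.5 * math.log((1 + r) / (1 - r))

def ci_for_r(r, n, z_crit=1.96):
    """Approximate 95% confidence interval for a correlation r from n cases."""
    z, se = fisher_z(r), 1 / math.sqrt(n - 3)
    return math.tanh(z - z_crit * se), math.tanh(z + z_crit * se)

def compare_independent_rs(r1, n1, r2, n2):
    """Z statistic for the difference between two independent correlations."""
    se_diff = math.sqrt(1 / (n1 - 3) + 1 / (n2 - 3))
    return (fisher_z(r1) - fisher_z(r2)) / se_diff

n = 400  # hypothetical sample size, for illustration only
print(ci_for_r(0.07, n))   # CI for belief in science vs. pseudo-profound receptivity
print(ci_for_r(0.12, n))   # CI for belief in science vs. scientific receptivity
print(compare_independent_rs(0.12, n, 0.07, n))  # simplified comparison statistic
```

With a sample of this hypothetical size, the two confidence intervals overlap substantially, which is the point made in the note above.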

Where we do see a more striking difference in the strength of relationships is with regard to social conservatism. Those who are more politically conservative (with emphasis on social aspects, e.g., abortion, limited government) are more inclined to be receptive to pseudo-profound bullshit than to scientific bullshit. In relation to what was said above, more ideologically conservative individuals tend to believe less in science (e.g., by rejecting scientific claims). Similarly, though less strongly, belief in free markets is also more associated with receptivity to pseudo-profound bullshit than with receptivity to scientific bullshit (where the correlation is not even statistically significant).

Evans and colleagues found evidence for an important relationship of cognitive (thinking) styles, particularly intuitive thinking, with receptivity to bullshit. Faith in intuition is positively correlated with receptivity to both pseudo-profound bullshit and scientific bullshit. Yet, the relationship of intuition with receptivity to pseudo-profound bullshit (r=0.39) is stronger than with scientific bullshit (r=0.25; substantiated with a Z-transform comparison test). Notably, no significant relationships of bullshit receptivity were found with need for cognition, that is, the propensity to think more deliberately and analytically. One might expect need for cognition to be negatively and significantly correlated at least with receptivity to scientific bullshit, but the negative r coefficients for the two types are very close to each other and non-significant. Nevertheless, need for cognition is relevant in a different way, presented below. We do learn from these results that heavier reliance on intuition is more strongly associated with adhering to pseudo-profound bullshit, which sounds intuitively logical.

  • Those who prefer to rely on intuition could be less inclined to accept scientific bullshit because it does not match the more rational and deliberate reasoning associated with science. But it could also be that intuition is simply not the appropriate yardstick for assessing scientific statements, and thus it yields less consistent conclusions.

Side by side, and no less important, is the moderating role of scientific literacy, that is, people’s explicit knowledge and understanding of the subject matter of scientific fields. It is shown that as the level of scientific literacy rises (from low through medium to high), the correlation between pseudo-profound bullshit receptivity and scientific bullshit receptivity falls. This means that scientific literacy moderates the correspondence between receptivity to scientific bullshit and receptivity to pseudo-profound bullshit: those with higher scientific literacy are better able to separate scientific bullshit from the other kind of bullshit, and are less receptive to it and less easily fooled by it.
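A common way to test this kind of moderation, sketched here on synthetic data (this is not the authors’ analysis or data), is to regress scientific-bullshit receptivity on pseudo-profound receptivity, scientific literacy, and their interaction; a negative interaction coefficient indicates that the association weakens as literacy rises.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Synthetic data for illustration only (standardised scores).
literacy = rng.normal(size=n)   # scientific literacy
pseudo = rng.normal(size=n)     # pseudo-profound bullshit receptivity
# Construct scientific-bullshit receptivity so its link to 'pseudo' weakens as literacy rises.
scientific = (0.6 - 0.2 * literacy) * pseudo + rng.normal(scale=0.8, size=n)

# Moderation model: scientific ~ pseudo + literacy + pseudo*literacy
X = np.column_stack([np.ones(n), pseudo, literacy, pseudo * literacy])
coefs, *_ = np.linalg.lstsq(X, scientific, rcond=None)
print(dict(zip(["intercept", "pseudo", "literacy", "interaction"], coefs.round(3))))
# A negative 'interaction' coefficient mirrors the reported pattern:
# the association between the two receptivities shrinks at higher literacy.
```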

Evans and his colleagues also address in their research the sensitivity of people to scientific bullshit, that is, their ability to discriminate between statements of scientific fact and statements of scientific bullshit. They find that greater faith in intuition is negatively associated with sensitivity to scientific bullshit (r=-0.12), whereas need for cognition is positively correlated with this sensitivity (r=0.12). It is here that we find that turning to the ‘cognitive’, more reflective and systematic, thinking style can help in detecting and treating scientific bullshit more critically. The weakness of their test, of which the researchers are aware, is that it is based on judging statements in the specific field of physics. This can make it truly difficult for someone without physics literacy to detect whether a statement is truthful (i.e., a fact) or not (i.e., bullshit), and the results in this regard seem less convincing. Nevertheless, the researchers point out that their test of sensitivity is sufficient for distinguishing the value of reliance on reflective thinking versus intuitive thinking when discriminating between ‘fact’ and ‘bullshit’ scientific statements.
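One simple way to operationalise such sensitivity (a sketch only, not necessarily the exact scoring used in the paper) is as the gap between a respondent’s ratings of genuine scientific statements and of scientific-bullshit items:

```python
import statistics

def bullshit_sensitivity(fact_ratings, bullshit_ratings):
    """Discrimination score: how much higher genuine scientific statements
    are rated than scientific-bullshit items (higher = better discrimination)."""
    return statistics.mean(fact_ratings) - statistics.mean(bullshit_ratings)

# Hypothetical 1-5 truthfulness ratings from one respondent.
fact_items = [4, 5, 4, 4, 5]       # genuine physics facts
bullshit_items = [3, 4, 3, 4, 3]   # scientific-bullshit statements
print(bullshit_sensitivity(fact_items, bullshit_items))  # positive gap (about 1.0)
```

A respondent who rates bullshit items about as highly as genuine facts would score near zero, indicating low sensitivity.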

The issue discussed just above raises the question of whether the ability to discern statements of scientific bullshit, and the tendency to believe them, may vary across even more specific domains of science (e.g., physics, biology, medicine, or alternatively fields like economics and psychology). This also concerns the generality of the tendency of bullshit receptivity. Perhaps the skills that matter more for judging scientific statements are those of scientific thinking, applicable across scientific fields. Yet, following Evans et al., explicit knowledge in those fields should also help individuals make better evidence-based decisions; intuition is less likely to be adequate. It appears, as the researchers also suggest, that “conservatives’ general mistrust of science only partially inoculates them from the allure of scientific bullshit” (p. 410).

Consumers are likely to encounter scientific claims more often in the areas of medical treatments and non-prescription medications (e.g., pain relief). Companies make promises about the capabilities and benefits of treatments (e.g., hair implants) and elective surgeries based on findings from scientific studies; however, they do not necessarily explain the limitations and risks revealed in those studies. The statements may therefore be inaccurate or exaggerated. Magical powers are sometimes attributed in advertising to medications, and especially to natural supplements (e.g., vitamins, plant substances), for their healing effects and health improvement. Consumers need to seek advice and examine such promises more carefully before using the products.

Statements about evidence and concepts derived from the neurosciences tend to contribute even more strongly these days to impressing and persuading people. Explanations driven by neuroscience have an allure that may be attributed to their perceived sophistication and the insights about brain functions they can provide (and possibly also to the advanced technologies utilised). The brain and mind, and how they are inter-related, are topics that fascinate many people. For example, explanations of psychological phenomena that include neuroscientific evidence are judged by non-expert individuals as more satisfying than explanations without neuroscience information. However, non-experts are also more satisfied by flawed explanations that include neuroscience information that is irrelevant to the phenomenon and does not contribute to the logic of the explanation, cases which experts can detect [2].

Statements used in the context of psychology, consumer behaviour and marketing often concern neuro-correlates between brain regions or structures and mental processes (e.g., cognitive, emotional). Yet, claims of this type that are false or not truly related to the behavioural phenomenon at issue may be deemed ‘bullshit’. Research and knowledge in neuroscience have to be applied responsibly and ethically for marketing purposes, in the context known as neuromarketing; it is wrong, for instance, to argue on the basis of the activation of a particular brain region that an expected activity occurred without providing evidence at another level (e.g., verbal, behavioural) [3].

In the context of technology, the area of artificial intelligence (AI) presents in this era the greatest concern for potentially misguiding consumers. Many consumers are confused, even baffled, by the methods and applications of AI while understanding only modestly how they work. Science is involved in the development of analytical algorithms and machine learning, sensors, and the variety of virtual assistants powered by AI. Consider, for example, the great interest, even enthusiasm, about the capabilities of natural language generators (e.g., GPT-4, ChatGPT). Along with helpful guidance, consumers might also be overwhelmed by misleading information, including scientific bullshit.

Consumers are not difficult to impress (and mislead). They can take measures to mitigate their mistakes in believing scientific bullshit statements or claims; however, there is hardly a guarantee that they can completely avoid being misled or fooled by them. Belief in science and political conservatism seem to play a role in receptivity to scientific bullshit, but the respective generalisations may be weak or difficult to make. It seems more practical instead to focus on factors of reflective versus intuitive thinking styles, scientific literacy and scientific thinking in relation to scientific bullshit receptivity, as these may be crucial for consumers learning how to think more deliberately and critically about scientific statements. They will need this kind of scrutiny when challenged by impressive information about products or services that could be scientific bullshit.

Ron Ventura, Ph.D. (Marketing)

Notes:

[1] Individual Differences in Receptivity to Scientific Bullshit; Anthony Evans, Willem Sleegers, & Zan Mlakar, 2020; Judgment and Decision Making, 15 (3), pp. 401-412 (available open access)

[2] The Seductive Allure of Neuroscience Explanations; Deena Skolnick Weisberg, Frank C. Keil, Joshua Goodstein, Elizabeth Rawson, & Jeremy R. Gray, 2008; Journal of Cognitive Neuroscience, 20 (3), pp. 470-477; an author manuscript is available for reading online

[3] Brains and Brands: Developing Mutually Informative Research in Neuroscience and Marketing; Tyler K. Perrachione & John R. Perrachione, 2008; Journal of Consumer Behaviour, 7, pp. 303-318
