Critical thinking, science, and pseudoscience : why we can't trust our brains

Available online.

  • EBSCO Academic Comprehensive Collection

Description


  • Section I: Understanding Critical Thinking, Science, and Pseudoscience
  • Chapter 1: Why Do We Need Critical Thinking?
  • Chapter 2: What is Science?
  • Chapter 3: What is Pseudoscience?
  • Chapter 4: What is Critical Thinking?
  • Chapter 5: How Do We Use Critical Thinking?
  • Chapter 6: Why Can't We Trust Our Brains?
  • Chapter 7: Why Can't We Trust Our World?
  • Section II: Applying Critical Thinking to Pseudoscience Claims
  • Chapter 8: Extraterrestrial Life and Alien Abductions
  • Chapter 9: Ghosts and Hauntings
  • Chapter 10: ESP and Mediumship
  • Chapter 11: Cryptozoology
  • Chapter 12: Alternative Medicine for Physical Illness
  • Chapter 13: Alternative Treatments for Mental Illness
  • Chapter 14: Historical Revisionism
  • Chapter 15: Conspiracy Theories
  • Chapter 16: Conclusion.
  • (source: Nielsen Book Data)


Critical Thinking, Science, and Pseudoscience

Why we can't trust our brains: publisher description.

This unique text for undergraduate courses teaches students to apply critical thinking skills across all academic disciplines by examining popular pseudoscientific claims through a multidisciplinary lens. Rather than merely focusing on critical thinking grounded in philosophy and psychology, the text incorporates the perspectives of biology, physics, medicine, and other disciplines to reinforce different categories of rational explanation. The book is also distinguished by its respectful approach to individuals whose ideas are, according to the authors, deeply flawed. Accessible and engaging, it describes what critical thinking is, why it is important, and how to learn and apply skills, using scientific methods, that promote it. The text also examines why critical thinking can be difficult to engage in and explores the psychological and social reasons why people are drawn to and find credence in extraordinary claims. From alien abductions and psychic phenomena to strange creatures and unsupported alternative medical treatments, the text uses examples from a wide range of pseudoscience fields and brings evidence from diverse disciplines to critically examine these erroneous claims. Particularly timely is the text's examination of how, using the narrative of today's "culture wars," religion and culture impact science. The authors focus on how the human brain, rife with natural biases, does not process information in a rational fashion, and the social factors that prevent individuals from gaining an unbiased, critical perspective on information. Authored by a psychologist and a philosopher who have extensive experience teaching and writing on critical thinking and skeptical inquiry, this work will help students to strengthen their skills in reasoning and debate, become intelligent consumers of research, and make well-informed choices as citizens.
Key Features:

  • Addresses the foundations of critical thinking and how to apply it through the popular activity of examining pseudoscience
  • Explains why humans are vulnerable to pseudoscientific claims and how critical thinking can overcome fallacies and biases
  • Reinforces critical thinking through multidisciplinary analyses of pseudoscience
  • Examines how religion and culture impact science
  • Enlightens using an engaging, entertaining approach
  • Written by experienced and innovative scholar/educators well known in the skeptic community
  • Features teaching resources including an Instructor's Guide and PowerPoint slides


Science and Pseudo-Science

The demarcation between science and pseudoscience is part of the larger task of determining which beliefs are epistemically warranted. This entry clarifies the specific nature of pseudoscience in relation to other categories of non-scientific doctrines and practices, including science denial(ism) and resistance to the facts. The major proposed demarcation criteria for pseudoscience are discussed and some of their weaknesses are pointed out. There is much more agreement on particular cases of demarcation than on the general criteria that such judgments should be based upon. This is an indication that there is still much important philosophical work to be done on the demarcation between science and pseudoscience.

1. The purpose of demarcations


Demarcations of science from pseudoscience can be made for both theoretical and practical reasons (Mahner 2007, 516). From a theoretical point of view, the demarcation issue is an illuminating perspective that contributes to the philosophy of science in much the same way that the study of fallacies contributes to our knowledge of informal logic and rational argumentation. From a practical point of view, the distinction is important for decision guidance in both private and public life. Since science is our most reliable source of knowledge in a wide range of areas, we need to distinguish scientific knowledge from its look-alikes. Due to the high status of science in present-day society, attempts to exaggerate the scientific status of various claims, teachings, and products are common enough to make the demarcation issue pressing in many areas. The demarcation issue is therefore important in practical applications such as the following:

Climate policy: The scientific consensus on ongoing anthropogenic climate change leaves no room for reasonable doubt (Cook et al. 2016; Powell 2019). Science denial has considerably delayed climate action, and it is still one of the major factors that impede efficient measures to reduce climate change (Oreskes and Conway 2010; Lewandowsky et al. 2019). Decision-makers and the public need to know how to distinguish between competent climate science and science-mimicking disinformation on the climate.

Environmental policies: In order to be on the safe side against potential disasters, it may be legitimate to take preventive measures when there is valid but not yet sufficient evidence of an environmental hazard. This must be distinguished from taking measures against an alleged hazard for which there is no valid evidence at all. Therefore, decision-makers in environmental policy must be able to distinguish between scientific and pseudoscientific claims.

Healthcare: Medical science develops and evaluates treatments according to evidence of their effectiveness and safety. Pseudoscientific activities in this area give rise to ineffective and sometimes dangerous interventions. Healthcare providers, insurers, government authorities and – most importantly – patients need guidance on how to distinguish between medical science and medical pseudoscience.

Expert testimony: It is essential for the rule of law that courts get the facts right. The reliability of different types of evidence must be correctly determined, and expert testimony must be based on the best available knowledge. Sometimes it is in the interest of litigants to present non-scientific claims as solid science. Therefore courts must be able to distinguish between science and pseudoscience. Philosophers have often had prominent roles in the defence of science against pseudoscience in such contexts (Pennock 2011).

Science education: The promoters of some pseudosciences (notably creationism) try to introduce their teachings in school curricula. Teachers and school authorities need to have clear criteria of inclusion that protect students against unreliable and disproved teachings.

Journalism: When there is scientific uncertainty, or relevant disagreement in the scientific community, this should be covered and explained in media reports on the issues in question. Equally importantly, differences of opinion between legitimate scientific experts on the one hand and proponents of scientifically unsubstantiated claims on the other should be described as what they are. Public understanding of topics such as climate change and vaccination has been considerably hampered by organised campaigns that succeeded in making media portray standpoints that have been thoroughly disproved in science as legitimate scientific standpoints (Boykoff and Boykoff 2004; Boykoff 2008). The media need tools and practices to distinguish between legitimate scientific controversies and attempts to peddle pseudoscientific claims as science.

2. The “science” of pseudoscience

Attempts to define what we today call science have a long history, and the roots of the demarcation problem have sometimes been traced back to Aristotle’s Posterior Analytics (Laudan 1983). Cicero’s arguments for dismissing certain methods of divination in his De divinatione have considerable similarities with modern criteria for the demarcation of science (Fernandez-Beanato 2020). However, it was not until the 20th century that influential definitions of science contrasted it against pseudoscience. Philosophical work on the demarcation problem seems to have waned after Laudan’s (1983) much-noted death certificate, according to which there is no hope of finding a necessary and sufficient criterion of something as heterogeneous as scientific methodology. In more recent years, the problem has been revitalized. Philosophers attesting to its vitality maintain that the concept can be clarified by other means than necessary and sufficient criteria (Pigliucci 2013; Mahner 2013) or that such a definition is indeed possible, although it has to be supplemented with discipline-specific criteria in order to become fully operative (Hansson 2013).

The Latin word “pseudoscientia” was already in use in the first half of the 17th century in discussions about the relationship between religion and empirical investigations (Guldentops 2020, 288n). The oldest known use of the English word “pseudoscience” dates from 1796, when the historian James Pettit Andrew referred to alchemy as a “fantastical pseudo-science” (Oxford English Dictionary). The word has been in frequent use since the 1880s (Thurs and Numbers 2013). Throughout its history the word has had a clearly defamatory meaning (Laudan 1983, 119; Dolby 1987, 204). It would be as strange for someone to proudly describe her own activities as pseudoscience as to boast that they are bad science. Since the derogatory connotation is an essential characteristic of the word “pseudoscience”, an attempt to extricate a value-free definition of the term would not be meaningful. An essentially value-laden term has to be defined in value-laden terms. This is often difficult since the specification of the value component tends to be controversial.

This problem is not specific to pseudoscience, but follows directly from a parallel but somewhat less conspicuous problem with the concept of science. The common usage of the term “science” can be described as partly descriptive, partly normative. When an activity is recognized as science this usually involves an acknowledgement that it has a positive role in our strivings for knowledge. On the other hand, the concept of science has been formed through a historical process, and many contingencies influence what we call and do not call science. Whether we call a claim, doctrine, or discipline “scientific” depends both on its subject area and its epistemic qualities. The former part of the delimitation is largely conventional, whereas the latter is highly normative, and closely connected with fundamental epistemological and metaphysical issues.

Against this background, in order not to be unduly complex, a definition of science has to go in either of two directions. It can focus on the descriptive contents, and specify how the term is actually used. Alternatively, it can focus on the normative element, and clarify the more fundamental meaning of the term. The latter approach has been the choice of most philosophers writing on the subject, and will be the focus here. It involves, of necessity, some degree of idealization in relation to common usage of the term “science”, in particular concerning the delimitation of the subject-area of science.

The English word “science” primarily refers to the natural sciences and other fields of research that are considered to be similar to them. Hence, political economy and sociology are counted as sciences, whereas studies of literature and history are usually not. The corresponding German word, “Wissenschaft”, has a much broader meaning and includes all the academic specialties, including the humanities. The German term has the advantage of more adequately delimiting the type of systematic knowledge that is at stake in the conflict between science and pseudoscience. The misrepresentations of history presented by Holocaust deniers and other pseudo-historians are very similar in nature to the misrepresentations of natural science promoted by creationists and homeopaths.

More importantly, the natural and social sciences and the humanities are all parts of the same human endeavour, namely systematic and critical investigations aimed at acquiring the best possible understanding of the workings of nature, people, and human society. The disciplines that form this community of knowledge disciplines are increasingly interdependent. Since the second half of the 20th century, integrative disciplines such as astrophysics, evolutionary biology, biochemistry, ecology, quantum chemistry, the neurosciences, and game theory have developed at dramatic speed and contributed to tying together previously unconnected disciplines. These increased interconnections have also linked the sciences and the humanities closer to each other, as can be seen for instance from how historical knowledge relies increasingly on advanced scientific analysis of archaeological findings.

The conflict between science and pseudoscience is best understood with this extended sense of science. On one side of the conflict we find the community of knowledge disciplines that includes the natural and social sciences and the humanities. On the other side we find a wide variety of movements and doctrines, such as creationism, astrology, homeopathy, and Holocaust denialism that are in conflict with results and methods that are generally accepted in the community of knowledge disciplines.

Another way to express this is that the demarcation problem has a deeper concern than that of demarcating the selection of human activities that we have for various reasons chosen to call “sciences”. The ultimate issue is “how to determine which beliefs are epistemically warranted” (Fuller 1985, 331). In a wider approach, the sciences are fact-finding practices, i.e., human practices aimed at finding out, as far as possible, how things really are (Hansson 2018). Other examples of fact-finding practices in modern societies are journalism, criminal investigations, and the methods used by mechanics to search for the defect in a malfunctioning machine. Fact-finding practices are also prevalent in indigenous societies, for instance in the forms of traditional agricultural experimentation and the methods used for tracking animal prey (Liebenberg 2013). In this perspective, the demarcation of science is a special case of the delimitation of accurate fact-finding practices. The delimitation between science and pseudoscience has much in common with other delimitations, such as that between accurate and inaccurate journalism and between properly and improperly performed criminal investigations (Hansson 2018).

3. The “pseudo” of pseudoscience

3.1 Non-, un-, and pseudoscience

The phrases “demarcation of science” and “demarcation of science from pseudoscience” are often used interchangeably, and many authors seem to have regarded them as equal in meaning. In their view, the task of drawing the outer boundaries of science is essentially the same as that of drawing the boundary between science and pseudoscience.

This picture is oversimplified. Not all non-science is pseudoscience, and science has non-trivial borders to other non-scientific phenomena, such as metaphysics, religion, and various types of non-scientific systematized knowledge. (Mahner (2007, 548) proposed the term “parascience” to cover non-scientific practices that are not pseudoscientific.) Science also has the internal demarcation problem of distinguishing between good and bad science.

A comparison of the negated terms related to science can contribute to clarifying the conceptual distinctions. “Unscientific” is a narrower concept than “non-scientific” (not scientific), since the former but not the latter term implies some form of contradiction or conflict with science. “Pseudoscientific” is in its turn a narrower concept than “unscientific”. The latter term differs from the former in covering inadvertent mismeasurements and miscalculations and other forms of bad science performed by scientists who are recognized as trying but failing to produce good science.

3.2 Non-science posing as science

Etymology provides us with an obvious starting-point for clarifying what characteristics pseudoscience has in addition to being merely non- or un-scientific. “Pseudo-” (ψευδο-) means false. In accordance with this, the Oxford English Dictionary (OED) defines pseudoscience as follows:

“A pretended or spurious science; a collection of related beliefs about the world mistakenly regarded as being based on scientific method or as having the status that scientific truths now have.”

Many writers on pseudoscience have emphasized that pseudoscience is non-science posing as science. The foremost modern classic on the subject (Gardner 1957) bears the title Fads and Fallacies in the Name of Science. According to Brian Baigrie (1988, 438), “[w]hat is objectionable about these beliefs is that they masquerade as genuinely scientific ones.” These and many other authors assume that to be pseudoscientific, an activity or a teaching has to satisfy the following two criteria (Hansson 1996):

(1) it is not scientific, and
(2) its major proponents try to create the impression that it is scientific.

The former of the two criteria is central to the concerns of the philosophy of science. Its precise meaning has been the subject of important controversies among philosophers, to be discussed below in Section 4. The second criterion has been less discussed by philosophers, but it needs careful treatment not least since many discussions of pseudoscience (in and out of philosophy) have been confused due to insufficient attention to it. Proponents of pseudoscience often attempt to mimic science by arranging conferences, journals, and associations that share many of the superficial characteristics of science, but do not satisfy its quality criteria. Naomi Oreskes (2019) called this phenomenon “facsimile science”. Blancke and coworkers (2017) called it “cultural mimicry of science”.

3.3 The doctrinal component

An immediate problem with the definition based on (1) and (2) is that it is too wide. There are phenomena that satisfy both criteria but are not commonly called pseudoscientific. One of the clearest examples of this is fraud in science. This is a practice that has a high degree of scientific pretence and yet does not comply with science, thus satisfying both criteria. Nevertheless, fraud in otherwise legitimate branches of science is seldom if ever called “pseudoscience”. The reason for this can be clarified with the following hypothetical examples (Hansson 1996).

Case 1 : A biochemist performs an experiment that she interprets as showing that a particular protein has an essential role in muscle contraction. There is a consensus among her colleagues that the result is a mere artefact, due to experimental error.
Case 2 : A biochemist goes on performing one sloppy experiment after the other. She consistently interprets them as showing that a particular protein has a role in muscle contraction not accepted by other scientists.
Case 3 : A biochemist performs various sloppy experiments in different areas. One is the experiment referred to in case 1. Much of her work is of the same quality. She does not propagate any particular unorthodox theory.

According to common usage, 1 and 3 are regarded as cases of bad science, and only 2 as a case of pseudoscience. What is present in case 2, but absent in the other two, is a deviant doctrine. Isolated breaches of the requirements of science are not commonly regarded as pseudoscientific. Pseudoscience, as it is commonly conceived, involves a sustained effort to promote standpoints different from those that have scientific legitimacy at the time.

This explains why fraud in science is not usually regarded as pseudoscientific. Such practices are not in general associated with a deviant or unorthodox doctrine. On the contrary, the fraudulent scientist is usually anxious that her results be in conformity with the predictions of established scientific theories. Deviations from these would lead to a much higher risk of disclosure.

The term “science” has both an individuated and an unindividuated sense. In the individuated sense, biochemistry and astronomy are different sciences, one of which includes studies of muscle proteins and the other studies of supernovae. The Oxford English Dictionary (OED) defines this sense of science as “a particular branch of knowledge or study; a recognized department of learning”. In the unindividuated sense, the study of muscle proteins and that of supernovae are parts of “one and the same” science. In the words of the OED, unindividuated science is “the kind of knowledge or intellectual activity of which the various ‘sciences’ are examples”.

Pseudoscience is an antithesis of science in the individuated rather than the unindividuated sense. There is no unified corpus of pseudoscience corresponding to the corpus of science. For a phenomenon to be pseudoscientific, it must belong to one or the other of the particular pseudosciences. In order to accommodate this feature, the above definition can be modified by replacing (2) by the following (Hansson 1996):

(2′) it is part of a non-scientific doctrine whose major proponents try to create the impression that it is scientific.

Most philosophers of science, and most scientists, prefer to regard science as constituted by methods of inquiry rather than by particular doctrines. There is an obvious tension between (2′) and this conventional view of science. This, however, may be as it should since pseudoscience often involves a representation of science as a closed and finished doctrine rather than as a methodology for open-ended inquiry.

3.4 A wider sense of pseudoscience

Sometimes the term “pseudoscience” is used in a wider sense than that which is captured in the definition constituted of (1) and (2′). Contrary to (2′), doctrines that conflict with science are sometimes called “pseudoscientific” in spite of not being advanced as scientific. Hence, Grove (1985, 219) included among the pseudoscientific doctrines those that “purport to offer alternative accounts to those of science or claim to explain what science cannot explain.” Similarly, Lugg (1987, 227–228) maintained that “the clairvoyant’s predictions are pseudoscientific whether or not they are correct”, despite the fact that most clairvoyants do not profess to be practitioners of science. In this sense, pseudoscience is assumed to include not only doctrines contrary to science proclaimed to be scientific but doctrines contrary to science tout court, whether or not they are put forward in the name of science. Arguably, the crucial issue is not whether something is called “science” but whether it is claimed to have the function of science, namely to provide the most reliable information about its subject-matter. To cover this wider sense of pseudoscience, (2′) can be modified as follows (Hansson 1996, 2013):

(2″) it is part of a doctrine whose major proponents try to create the impression that it represents the most reliable knowledge on its subject matter.

Common usage seems to vacillate between the definitions (1)+(2′) and (1)+(2″); and this in an interesting way: In their comments on the meaning of the term, critics of pseudoscience tend to endorse a definition close to (1)+(2′), but their actual usage is often closer to (1)+(2″).

The following examples serve to illustrate the difference between the two definitions and also to clarify why clause (1) is needed:

  • (a) A creationist book gives a correct account of the structure of DNA.
  • (b) An otherwise reliable chemistry book gives an incorrect account of the structure of DNA.
  • (c) A creationist book denies that the human species shares common ancestors with other primates.
  • (d) A preacher who denies that science can be trusted also denies that the human species shares common ancestors with other primates.

(a) does not satisfy (1), and is therefore not pseudoscientific on either account. (b) satisfies (1) but neither (2′) nor (2″) and is therefore not pseudoscientific on either account. (c) satisfies all three criteria, (1), (2′), and (2″), and is therefore pseudoscientific on both accounts. Finally, (d) satisfies (1) and (2″) and is therefore pseudoscientific according to (1)+(2″) but not according to (1)+(2′). As the last two examples illustrate, pseudoscience and anti-science are sometimes difficult to distinguish. Promoters of some pseudosciences (notably homeopathy) tend to be ambiguous between opposition to science and claims that they themselves represent the best science.
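The case analysis in this paragraph is, in effect, a pair of conjunctions evaluated over three truth values. As a minimal illustrative sketch (the encoding and all names are mine, not from the entry), the four examples can be coded against criteria (1), (2′), and (2″):

```python
# Toy encoding (illustrative only) of the two composite definitions of
# pseudoscience discussed above: (1)+(2') and (1)+(2'').
from dataclasses import dataclass

@dataclass
class Claim:
    name: str
    not_scientific: bool           # criterion (1): the claim is not scientific
    posed_as_science: bool         # criterion (2'): presented as scientific
    posed_as_most_reliable: bool   # criterion (2''): presented as the most reliable knowledge

def pseudoscientific_narrow(c: Claim) -> bool:
    """Definition (1)+(2')."""
    return c.not_scientific and c.posed_as_science

def pseudoscientific_wide(c: Claim) -> bool:
    """Definition (1)+(2'')."""
    return c.not_scientific and c.posed_as_most_reliable

examples = [
    Claim("(a) creationist book, correct DNA account", False, True, True),
    Claim("(b) reliable chemistry book, incorrect DNA account", True, False, False),
    Claim("(c) creationist book denying common ancestry", True, True, True),
    Claim("(d) anti-science preacher denying common ancestry", True, False, True),
]

for c in examples:
    print(c.name, pseudoscientific_narrow(c), pseudoscientific_wide(c))
```

On this encoding, only (c) comes out pseudoscientific on both definitions, and (d) only on the wider one, matching the verdicts above.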

3.5 The objects of demarcation

Various proposals have been put forward on exactly what elements in science or pseudoscience criteria of demarcation should be applied to. Proposals include that the demarcation should refer to a research program (Lakatos 1974a, 248–249), an epistemic field or cognitive discipline, i.e. a group of people with common knowledge aims, and their practices (Bunge 1982, 2001; Mahner 2007), a theory (Popper 1962, 1974), a practice (Lugg 1992; Morris 1987), a scientific problem or question (Siitonen 1984), and a particular inquiry (Kuhn 1974; Mayo 1996). It is probably fair to say that demarcation criteria can be meaningfully applied on each of these levels of description. A much more difficult problem is whether one of these levels is the fundamental level to which assessments on the other levels are reducible. However, it should be noted that appraisals on different levels may be interdefinable. For instance, it is not an unreasonable assumption that a pseudoscientific doctrine is one that contains pseudoscientific statements as its core or defining claims. Conversely, a pseudoscientific statement may be characterized in terms of being endorsed by a pseudoscientific doctrine but not by legitimate scientific accounts of the same subject area.

Derksen (1993) differs from most other writers on the subject in placing the emphasis in demarcation on the pseudoscientist, i.e. the individual person conducting pseudoscience. His major argument for this is that pseudoscience has scientific pretensions, and such pretensions are associated with a person, not a theory, practice or entire field. However, as was noted by Settle (1971), it is the rationality and critical attitude built into institutions, rather than the personal intellectual traits of individuals, that distinguishes science from non-scientific practices such as magic. The individual practitioner of magic in a pre-literate society is not necessarily less rational than the individual scientist in modern Western society. What she lacks is an intellectual environment of collective rationality and mutual criticism. “It is almost a fallacy of division to insist on each individual scientist being critically-minded” (Settle 1971, 174).

3.6 A time-bound demarcation

Some authors have maintained that the demarcation between science and pseudoscience must be timeless. If this were true, then it would be contradictory to label something as pseudoscience at one but not another point in time. Hence, after showing that creationism is in some respects similar to some doctrines from the early 18th century, one author maintained that “if such an activity was describable as science then, there is a cause for describing it as science now” (Dolby 1987, 207). This argument is based on a fundamental misconception of science. It is an essential feature of science that it methodically strives for improvement through empirical testing, intellectual criticism, and the exploration of new terrain. A standpoint or theory cannot be scientific unless it relates adequately to this process of improvement, which means as a minimum that well-founded rejections of previous scientific standpoints are accepted. The practical demarcation of science cannot be timeless, for the simple reason that science itself is not timeless.

Nevertheless, the mutability of science is one of the factors that renders the demarcation between science and pseudoscience difficult. Derksen (1993, 19) rightly pointed out three major reasons why demarcation is sometimes difficult: science changes over time, science is heterogeneous, and established science itself is not free of the defects characteristic of pseudoscience.

4. Alternative demarcation criteria

Philosophical discussions on the demarcation of pseudoscience have usually focused on the normative issue, i.e. the missing scientific quality of pseudoscience (rather than on its attempt to mimic science). One option is to base the demarcation on the fundamental function that science shares with other fact-finding processes, namely to provide us with the most reliable information about its subject-matter that is currently available. This could lead to the following specification of criterion (1) from Section 3.2:

(1′) it suffers from such a severe lack of reliability that it cannot at all be trusted.

This definition has the advantages of (i) being applicable across disciplines with highly different methodologies and (ii) allowing for a statement to be pseudoscientific at present although it was not so in an earlier period (or, although less commonly, the other way around) (Hansson 2013). At the same time, it removes the practical determination of whether a statement or doctrine is pseudoscientific from the purview of armchair philosophy to that of scientists specialized in the subject-matter that the statement or doctrine relates to. Philosophers have usually opted for demarcation criteria that appear not to require specialized knowledge in the pertinent subject area.

4.1 The logical positivists

Around 1930, the logical positivists of the Vienna Circle developed various verificationist approaches to science. The basic idea was that a scientific statement could be distinguished from a metaphysical statement by being at least in principle possible to verify. This standpoint was associated with the view that the meaning of a proposition is its method of verification (see the section on Verificationism in the entry on the Vienna Circle). This proposal has often been included in accounts of the demarcation between science and pseudoscience. However, this is not historically quite accurate since the verificationist proposals had the aim of solving a distinctly different demarcation problem, namely that between science and metaphysics.

Karl Popper described the demarcation problem as the “key to most of the fundamental problems in the philosophy of science” (Popper 1962, 42). He rejected verifiability as a criterion for a scientific theory or hypothesis to be scientific, rather than pseudoscientific or metaphysical. Instead he proposed as a criterion that the theory be falsifiable, or more precisely that “statements or systems of statements, in order to be ranked as scientific, must be capable of conflicting with possible, or conceivable observations” (Popper 1962, 39).

Popper presented this proposal as a way to draw the line between statements belonging to the empirical sciences and “all other statements – whether they are of a religious or of a metaphysical character, or simply pseudoscientific” (Popper 1962, 39; cf. Popper 1974, 981). This was both an alternative to the logical positivists’ verification criteria and a criterion for distinguishing between science and pseudoscience. Although Popper did not emphasize the distinction, these are of course two different issues (Bartley 1968). Popper conceded that metaphysical statements may be “far from meaningless” (1974, 978–979) but showed no such appreciation of pseudoscientific statements.

Popper’s demarcation criterion has been criticized both for excluding legitimate science (Hansson 2006) and for giving some pseudosciences the status of being scientific (Agassi 1991; Mahner 2007, 518–519). Strictly speaking, his criterion excludes the possibility that there can be a pseudoscientific claim that is refutable. According to Larry Laudan (1983, 121), it “has the untoward consequence of countenancing as ‘scientific’ every crank claim which makes ascertainably false assertions”. Astrology, rightly taken by Popper as an unusually clear example of a pseudoscience, has in fact been tested and thoroughly refuted (Culver and Ianna 1988; Carlson 1985). Similarly, the major threats to the scientific status of psychoanalysis, another of his major targets, do not come from claims that it is untestable but from claims that it has been tested and failed the tests.

Defenders of Popper have claimed that this criticism relies on an uncharitable interpretation of his ideas. They claim that he should not be interpreted as meaning that falsifiability is a sufficient condition for demarcating science. Some passages seem to suggest that he takes it as only a necessary condition (Feleppa 1990, 142). Other passages suggest that for a theory to be scientific, Popper requires (in addition to falsifiability) that energetic attempts are made to put the theory to test and that negative outcomes of the tests are accepted (Cioffi 1985, 14–16). A falsification-based demarcation criterion that includes these elements will avoid the most obvious counter-arguments to a criterion based on falsifiability alone.

However, in what seems to be his last statement of his position, Popper declared that falsifiability is both a necessary and a sufficient criterion. “A sentence (or a theory) is empirical-scientific if and only if it is falsifiable.” Furthermore, he emphasized that the falsifiability referred to here “only has to do with the logical structure of sentences and classes of sentences” (Popper [1989] 1994, 82). A (theoretical) sentence, he says, is falsifiable if and only if it logically contradicts some (empirical) sentence that describes a logically possible event that it would be logically possible to observe (Popper [1989] 1994, 83). A statement can be falsifiable in this sense although it is not in practice possible to falsify it. It would seem to follow from this interpretation that a statement’s status as scientific or non-scientific does not shift with time. On previous occasions he seems to have interpreted falsifiability differently, maintaining that “what was a metaphysical idea yesterday can become a testable scientific theory tomorrow; and this happens frequently” (Popper 1974, 981, cf. 984).

Logical falsifiability is a much weaker criterion than practical falsifiability. However, even logical falsifiability can create problems in practical demarcations. Popper once adopted the view that natural selection is not a proper scientific theory, arguing that it comes close to only saying that “survivors survive”, which is tautological. “Darwinism is not a testable scientific theory, but a metaphysical research program” (Popper 1976, 168). This statement has been criticized by evolutionary scientists who pointed out that it misrepresents evolution. The theory of natural selection has given rise to many predictions that have withstood tests both in field studies and in laboratory settings (Ruse 1977; 2000).

In a lecture at Darwin College in 1977, Popper retracted his earlier view that the theory of natural selection is tautological. He now admitted that it is a testable theory, although “difficult to test” (Popper 1978, 344). However, in spite of his well-argued recantation, his previous standpoint continues to be propagated in defiance of the accumulating evidence from empirical tests of natural selection.

Thomas Kuhn is one of many philosophers for whom Popper’s view on the demarcation problem was a starting-point for developing their own ideas. Kuhn criticized Popper for characterizing “the entire scientific enterprise in terms that apply only to its occasional revolutionary parts” (Kuhn 1974, 802). Popper’s focus on falsifications of theories led to a concentration on the rather rare instances when a whole theory is at stake. According to Kuhn, the way in which science works on such occasions cannot be used to characterize the entire scientific enterprise. Instead it is in “normal science”, the science that takes place between the unusual moments of scientific revolutions, that we find the characteristics by which science can be distinguished from other activities (Kuhn 1974, 801).

In normal science, the scientist’s activity consists in solving puzzles rather than testing fundamental theories. In puzzle-solving, current theory is accepted, and the puzzle is indeed defined in its terms. In Kuhn’s view, “it is normal science, in which Sir Karl’s sort of testing does not occur, rather than extraordinary science which most nearly distinguishes science from other enterprises”, and therefore a demarcation criterion must refer to the workings of normal science (Kuhn 1974, 802). Kuhn’s own demarcation criterion is the capability of puzzle-solving, which he sees as an essential characteristic of normal science.

Kuhn’s view of demarcation is most clearly expressed in his comparison of astronomy with astrology. Since antiquity, astronomy has been a puzzle-solving activity and therefore a science. If an astronomer’s prediction failed, then this was a puzzle that he could hope to solve for instance with more measurements or adjustments of the theory. In contrast, the astrologer had no such puzzles since in that discipline “particular failures did not give rise to research puzzles, for no man, however skilled, could make use of them in a constructive attempt to revise the astrological tradition” (Kuhn 1974, 804). Therefore, according to Kuhn, astrology has never been a science.

Popper disapproved thoroughly of Kuhn’s demarcation criterion. According to Popper, astrologers are engaged in puzzle solving, and consequently Kuhn’s criterion commits him to recognize astrology as a science. (Contrary to Kuhn, Popper defined puzzles as “minor problems which do not affect the routine”.) In his view Kuhn’s proposal leads to “the major disaster” of a “replacement of a rational criterion of science by a sociological one” (Popper 1974, 1146–1147).

Popper’s demarcation criterion concerns the logical structure of theories. Imre Lakatos described this criterion as “a rather stunning one. A theory may be scientific even if there is not a shred of evidence in its favour, and it may be pseudoscientific even if all the available evidence is in its favour. That is, the scientific or non-scientific character of a theory can be determined independently of the facts” (Lakatos 1981, 117).

Instead, Lakatos (1970; 1974a; 1974b; 1981) proposed a modification of Popper’s criterion that he called “sophisticated (methodological) falsificationism”. On this view, the demarcation criterion should not be applied to an isolated hypothesis or theory, but rather to a whole research program that is characterized by a series of theories successively replacing each other. In his view, a research program is progressive if the new theories make surprising predictions that are confirmed. In contrast, a degenerating research program is characterized by theories being fabricated only in order to accommodate known facts. Progress in science is only possible if a research program satisfies the minimum requirement that each new theory that is developed in the program has a larger empirical content than its predecessor. If a research program does not satisfy this requirement, then it is pseudoscientific.

According to Paul Thagard (1978, 228), a theory or discipline is pseudoscientific if it satisfies two criteria. One of these is that the theory fails to progress, and the other that “the community of practitioners makes little attempt to develop the theory towards solutions of the problems, shows no concern for attempts to evaluate the theory in relation to others, and is selective in considering confirmations and disconfirmations”. A major difference between this approach and that of Lakatos is that Lakatos would classify a nonprogressive discipline as pseudoscientific even if its practitioners work hard to improve it and turn it into a progressive discipline. (In later work, Thagard abandoned this approach and instead promoted a form of multi-criterial demarcation (Thagard 1988, 157–173).)

In a somewhat similar vein, Daniel Rothbart (1990) emphasized the distinction between the standards to be used when testing a theory and those to be used when determining whether a theory should at all be tested. The latter, the eligibility criteria, include that the theory should encapsulate the explanatory success of its rival, and that it should yield testable implications that are inconsistent with those of the rival. According to Rothbart, a theory is unscientific if it is not testworthy in this sense.

George Reisch proposed that demarcation could be based on the requirement that a scientific discipline be adequately integrated into the other sciences. The various scientific disciplines have strong interconnections that are based on methodology, theory, similarity of models etc. Creationism, for instance, is not scientific because its basic principles and beliefs are incompatible with those that connect and unify the sciences. More generally speaking, says Reisch, an epistemic field is pseudoscientific if it cannot be incorporated into the existing network of established sciences (Reisch 1998; cf. Bunge 1982, 379).

Paul Hoyningen-Huene (2013) identifies science with systematic knowledge, and proposes that systematicity can be used as a demarcation criterion. However, as Naomi Oreskes has shown, this is a problematic criterion, not least since some pseudosciences seem to satisfy it (Oreskes 2019).

A different approach, namely to base demarcation criteria on the value base of science, was proposed by sociologist Robert K. Merton ([1942] 1973). According to Merton, science is characterized by an “ethos”, i.e. spirit, that can be summarized as four sets of institutional imperatives. The first of these, universalism, asserts that whatever their origins, truth claims should be subjected to preestablished, impersonal criteria. This implies that the acceptance or rejection of claims should not depend on the personal or social qualities of their protagonists.

The second imperative, communism, says that the substantive findings of science are the products of social collaboration and therefore belong to the community, rather than being owned by individuals or groups. This is, as Merton pointed out, incompatible with patents that reserve exclusive rights of use to inventors and discoverers. The term “communism” is somewhat infelicitous; “communality” probably captures better what Merton aimed at.

His third imperative, disinterestedness, imposes a pattern of institutional control that is intended to curb the effects of personal or ideological motives that individual scientists may have. The fourth imperative, organized scepticism, implies that science allows detached scrutiny of beliefs that are dearly held by other institutions. This is what sometimes brings science into conflict with religions and ideologies.

Merton described these criteria as belonging to the sociology of science, and thus as empirical statements about norms in actual science rather than normative statements about how science should be conducted (Merton [1942] 1973, 268). His criteria have often been dismissed by sociologists as oversimplified, and they have only had limited influence in philosophical discussions on the demarcation issue (Dolby 1987; Ruse 2000). Their potential in the latter context does not seem to have been sufficiently explored.

Popper’s method of demarcation consists essentially of the single criterion of falsifiability (although some authors have wanted to combine it with the additional criteria that tests are actually performed and their outcomes respected, see Section 4.2). Most of the other criteria discussed above are similarly mono-criterial, of course with Merton’s proposal as a major exception.

Most authors who have proposed demarcation criteria have instead put forward a list of such criteria. A large number of lists have been published that consist of (usually 5–10) criteria that can be used in combination to identify a pseudoscience or pseudoscientific practice. This includes lists by Langmuir ([1953] 1989), Gruenberger (1964), Dutch (1982), Bunge (1982), Radner and Radner (1982), Kitcher (1982, 30–54), Grove (1985), Thagard (1988, 157–173), Glymour and Stalker (1990), Derksen (1993, 2001), Vollmer (1993), Ruse (1996, 300–306) and Mahner (2007). Many of the criteria that appear on such lists relate closely to criteria discussed above in Sections 4.2 and 4.4. One such list reads as follows:

  • Belief in authority : It is contended that some person or persons have a special ability to determine what is true or false. Others have to accept their judgments.
  • Unrepeatable experiments : Reliance is put on experiments that cannot be repeated by others with the same outcome.
  • Handpicked examples : Handpicked examples are used although they are not representative of the general category that the investigation refers to.
  • Unwillingness to test : A theory is not tested although it is possible to test it.
  • Disregard of refuting information : Observations or experiments that conflict with a theory are neglected.
  • Built-in subterfuge : The testing of a theory is so arranged that the theory can only be confirmed, never disconfirmed, by the outcome.
  • Explanations are abandoned without replacement : Tenable explanations are given up without being replaced, so that the new theory leaves much more unexplained than the previous one.

Some of the authors who have proposed multicriterial demarcations have defended this approach as being superior to any mono-criterial demarcation. Hence, Bunge (1982, 372) asserted that many philosophers have failed to provide an adequate definition of science since they have presupposed that a single attribute will do; in his view the combination of several criteria is needed. Dupré (1993, 242) proposed that science is best understood as a Wittgensteinian family resemblance concept. This would mean that there is a set of features that are characteristic of science, but although every part of science will have some of these features, we should not expect any part of science to have all of them. Irzik and Nola (2011) proposed the use of this approach in science education.

However, a multicriterial definition of science is not needed to justify a multicriterial account of how pseudoscience deviates from science. Even if science can be characterized by a single defining characteristic, different pseudoscientific practices may deviate from science in widely divergent ways.

Some forms of pseudoscience have as their main objective the promotion of a particular theory of their own, whereas others are driven by a desire to fight down some scientific theory or branch of science. The former type of pseudoscience has been called pseudo-theory promotion, and the latter science denial(ism) (Hansson 2017). Pseudo-theory promotion is exemplified by homeopathy, astrology, and ancient astronaut theories. The term “denial” was first used about the pseudoscientific claim that the Nazi Holocaust never took place. The phrase “Holocaust denial” was already in use in the early 1980s (Gleberzon 1983). The term “climate change denial” became common around 2005 (e.g. Williams 2005). Other forms of science denial are relativity theory denial, tobacco disease denial, HIV denialism, and vaccination denialism.

Many forms of pseudoscience combine pseudo-theory promotion with science denialism. For instance, creationism and its skeletal version “intelligent design” are constructed to support a fundamentalist interpretation of Genesis. However, as practiced today, creationism has a strong focus on the repudiation of evolution, and it is therefore predominantly a form of science denialism.

The most prominent difference between pseudo-theory promotion and science denial is their different attitudes to conflicts with established science. Science denialism usually proceeds by producing false controversies with legitimate science, i.e. claims that there is a scientific controversy when there is in fact none. This is an old strategy, applied already in the 1930s by relativity theory deniers (Wazeck 2009, 268–269). It has been much used by tobacco disease deniers sponsored by the tobacco industry (Oreskes and Conway 2010; Dunlap and Jacques 2013), and it is currently employed by climate science denialists (Boykoff and Boykoff 2004; Boykoff 2008). However, whereas the fabrication of fake controversies is a standard tool in science denial, it is seldom if ever used in pseudo-theory promotion. To the contrary, advocates of pseudosciences such as astrology and homeopathy tend to describe their theories as conformable to mainstream science.

6. Some related terms

The term scepticism (skepticism) has at least three distinct usages that are relevant for the discussion on pseudoscience. First, scepticism is a philosophical method that proceeds by casting doubt on claims usually taken to be trivially true, such as the existence of the external world. This has been, and still is, a highly useful method for investigating the justification of what we in practice consider to be certain beliefs. Secondly, criticism of pseudoscience is often called scepticism. This is the term most commonly used by organisations devoted to the disclosure of pseudoscience. Thirdly, opposition to the scientific consensus in specific areas is sometimes called scepticism. For instance, climate science deniers often call themselves “climate sceptics”.

To avoid confusion, the first of these notions can be specified as “philosophical scepticism”, the second as “scientific scepticism” or “defence of science”, and the third as “science denial(ism)”. Adherents of the first two forms of scepticism can be called “philosophical sceptics” and “science defenders”, respectively. Adherents of the third form can be called “science deniers” or “science denialists”. Torcello (2016) proposed the term “pseudoscepticism” for so-called climate scepticism.

Unwillingness to accept strongly supported factual statements is a traditional criterion of pseudoscience. (See for instance item 5 on the list of seven criteria cited in Section 4.6.) The term “fact resistance” or “resistance to facts” was already in use in the 1990s, for instance by Arthur Krystal (1999, 8), who complained about a “growing resistance to facts”, consisting in people being “simply unrepentant about not knowing things that do not reflect their interests”. The term “fact resistance” can refer to unwillingness to accept well-supported factual claims whether or not that support originates in science. It is particularly useful in relation to fact-finding practices that are not parts of science. (Cf. Section 2.)

Generally speaking, conspiracy theories are theories according to which there exists some type of secret collusion for any type of purpose. In practice, the term mostly refers to implausible such theories, used to explain social facts that have other, considerably more plausible explanations. Many pseudosciences are connected with conspiracy theories. For instance, one of the difficulties facing anti-vaccinationists is that they have to explain the overwhelming consensus among medical experts that vaccines are effective. This is often done by claims of a conspiracy:

At the heart of the anti-vaccine conspiracy movement [lies] the argument that large pharmaceutical companies and governments are covering up information about vaccines to meet their own sinister objectives. According to the most popular theories, pharmaceutical companies stand to make such healthy profits from vaccines that they bribe researchers to fake their data, cover up evidence of the harmful side effects of vaccines, and inflate statistics on vaccine efficacy. (Jolley and Douglas 2014)

Conspiracy theories have peculiar epistemic characteristics that contribute to their pervasiveness (Keeley 1999). In particular, they are often associated with a type of circular reasoning that allows evidence against the conspiracy to be interpreted as evidence for it.

The term “bullshit” was introduced into philosophy by Harry Frankfurt, who first discussed it in a 1986 essay (in Raritan Quarterly Review) and later developed the discussion into a book (2005). Frankfurt used the term to describe a type of falsehood that does not amount to lying. A person who lies deliberately chooses not to tell the truth, whereas a person who utters bullshit is not interested in whether what (s)he says is true or false, only in its suitability for his or her purpose. Moberger (2020) has proposed that pseudoscience should be seen as a special case of bullshit, understood as “a culpable lack of epistemic conscientiousness”.

Epistemic relativism is a term with many meanings; the meaning most relevant in discussions on pseudoscience is denial of the common assumption that there is intersubjective truth in scientific matters, which scientists can and should try to approach. Epistemic relativists claim that (natural) science has no special claim to knowledge, but should be seen “as ordinary social constructions or as derived from interests, political-economic relations, class structure, socially defined constraints on discourse, styles of persuasion, and so on” (Buttel and Taylor 1992, 220). Such ideas have been promoted under different names, including “social constructivism”, the “strong programme”, “deconstructionism”, and “postmodernism”. The distinction between science and pseudoscience has no obvious role in epistemic relativism. Some academic epistemic relativists have actively contributed to the promotion of doctrines such as AIDS denial, vaccination denial, creationism, and climate science denial (Hansson 2020; Pennock 2010). However, the connection between epistemic relativism and pseudoscience is controversial. Some proponents of epistemic relativism have maintained that relativism “is almost always more useful to the side with less scientific credibility or cognitive authority” (Scott et al. 1990, 490). Others have denied that epistemic relativism facilitates or encourages standpoints such as denial of anthropogenic climate change or other environmental problems (Burningham and Cooper 1999, 306).

Kuhn observed that although his own and Popper’s criteria of demarcation are profoundly different, they lead to essentially the same conclusions about what should be counted as science and what as pseudoscience (Kuhn 1974, 803). This convergence of theoretically divergent demarcation criteria is a quite general phenomenon. Philosophers and other theoreticians of science differ widely in their views on what science is. Nevertheless, there is virtual unanimity in the community of knowledge disciplines on most particular issues of demarcation. There is widespread agreement, for instance, that creationism, astrology, homeopathy, Kirlian photography, dowsing, ufology, ancient astronaut theory, Holocaust denialism, Velikovskian catastrophism, and climate change denialism are pseudosciences. There are a few points of controversy, for instance concerning the status of Freudian psychoanalysis, but the general picture is one of consensus rather than controversy on particular issues of demarcation.

It is in a sense paradoxical that so much agreement has been reached in particular issues in spite of almost complete disagreement on the general criteria that these judgments should presumably be based upon. This puzzle is a sure indication that there is still much important philosophical work to be done on the demarcation between science and pseudoscience.

Philosophical reflection on pseudoscience has brought forth other interesting problem areas in addition to the demarcation between science and pseudoscience. Examples include related demarcations such as that between science and religion, the relationship between science and reliable non-scientific knowledge (for instance everyday knowledge), the scope for justifiable simplifications in science education and popular science, the nature and justification of methodological naturalism in science (Boudry et al. 2010), and the meaning or meaninglessness of the concept of a supernatural phenomenon. Several of these problem areas have as yet not received much philosophical attention.

  • Agassi, Joseph, 1991. “Popper’s demarcation of science refuted”, Methodology and Science, 24: 1–7.
  • Baigrie, B.S., 1988. “Siegel on the Rationality of Science”, Philosophy of Science, 55: 435–441.
  • Bartley III, W.W., 1968. “Theories of demarcation between science and metaphysics”, pp. 40–64 in Imre Lakatos and Alan Musgrave (eds.), Problems in the Philosophy of Science, Proceedings of the International Colloquium in the Philosophy of Science, London 1965 (Volume 3), Amsterdam: North-Holland Publishing Company.
  • Blancke, Stefaan, Maarten Boudry and Massimo Pigliucci, 2017. “Why do irrational beliefs mimic science? The cultural evolution of pseudoscience”, Theoria, 83(1): 78–97.
  • Boudry, Maarten, Stefaan Blancke, and Johan Braeckman, 2010. “How not to attack intelligent design creationism: Philosophical misconceptions about methodological naturalism”, Foundations of Science, 15(3): 227–244.
  • Boykoff, M.T., 2008. “Lost in translation? United States television news coverage of anthropogenic climate change, 1995–2004”, Climatic Change, 86: 1–11.
  • Boykoff, M.T. and J.M. Boykoff, 2004. “Balance as bias: global warming and the U.S. prestige press”, Global Environmental Change, 14: 125–136.
  • Bunge, Mario, 1982. “Demarcating Science from Pseudoscience”, Fundamenta Scientiae, 3: 369–388.
  • –––, 2001. “Diagnosing pseudoscience”, in Mario Bunge, Philosophy in Crisis: The Need for Reconstruction, Amherst, NY: Prometheus Books, pp. 161–189.
  • Burningham, K., and G. Cooper, 1999. “Being constructive: Social constructionism and the environment”, Sociology, 33(2): 297–316.
  • Buttel, Frederick H. and Peter J. Taylor, 1992. “Environmental sociology and global environmental change: A critical assessment”, Society and Natural Resources, 5(3): 211–230.
  • Carlson, Shawn, 1985. “A Double Blind Test of Astrology”, Nature, 318: 419–425.
  • Cioffi, Frank, 1985. “Psychoanalysis, pseudoscience and testability”, pp. 13–44 in Gregory Currie and Alan Musgrave (eds.), Popper and the Human Sciences, Dordrecht: Martinus Nijhoff Publishers.
  • Cook, John, Naomi Oreskes, Peter T. Doran, William R.L. Anderegg, Bart Verheggen, Ed W. Maibach, J. Stuart Carlton, et al., 2016. “Consensus on consensus: A synthesis of consensus estimates on human-caused global warming”, Environmental Research Letters, 11: 048002.
  • Culver, Roger and Ianna, Philip, 1988. Astrology: True or False, Buffalo: Prometheus Books.
  • Derksen, A.A., 1993. “The seven sins of pseudoscience”, Journal for General Philosophy of Science, 24: 17–42.
  • –––, 2001. “The seven strategies of the sophisticated pseudoscience: a look into Freud’s rhetorical tool box”, Journal for General Philosophy of Science, 32: 329–350.
  • Dolby, R.G.A., 1987. “Science and pseudoscience: the case of creationism”, Zygon, 22: 195–212.
  • Dunlap, Riley E., and Peter J. Jacques, 2013. “Climate change denial books and conservative think tanks: exploring the connection”, American Behavioral Scientist, 57(6): 699–731.
  • Dupré, John, 1993. The Disorder of Things: Metaphysical Foundations of the Disunity of Science, Cambridge, MA: Harvard University Press.
  • Dutch, Steven I., 1982. “Notes on the nature of fringe science”, Journal of Geological Education, 30: 6–13.
  • Feleppa, Robert, 1990. “Kuhn, Popper, and the Normative Problem of Demarcation”, pp. 140–155 in Patrick Grim (ed.), Philosophy of Science and the Occult, 2nd edition, Albany: State University of New York Press.
  • Fernandez-Beanato, Damian, 2020. “Cicero’s demarcation of science: A report of shared criteria”, Studies in History and Philosophy of Science (Part A), 83: 97–102.
  • Frankfurt, Harry G., 2005. On Bullshit, Princeton: Princeton University Press; see also the essay with the same title in Raritan Quarterly Review, 6(2): 81–100.
  • Fuller, Steve, 1985. “The demarcation of science: a problem whose demise has been greatly exaggerated”, Pacific Philosophical Quarterly, 66: 329–341.
  • Gardner, Martin, 1957. Fads and Fallacies in the Name of Science, New York: Dover; expanded version of his In the Name of Science, 1952.
  • Gleberzon, William, 1983. “Academic freedom and Holocaust denial literature: Dealing with infamy”, Interchange, 14(4): 62–69.
  • Glymour, Clark and Stalker, Douglas, 1990. “Winning through Pseudoscience”, pp. 92–103 in Patrick Grim (ed.), Philosophy of Science and the Occult, 2nd edition, Albany: State University of New York Press.
  • Grove, J.W., 1985. “Rationality at Risk: Science against Pseudoscience”, Minerva, 23: 216–240.
  • Gruenberger, Fred J., 1964. “A measure for crackpots”, Science, 145: 1413–1415.
  • Guldentops, Guy, 2020. “Nicolaus Ellenbog’s ‘Apologia for the Astrologers’: A Benedictine’s View on Astral Determinism”, Bulletin de Philosophie Médiévale, 62: 251–334.
  • Hansson, Sven Ove, 1996. “Defining Pseudoscience”, Philosophia Naturalis, 33: 169–176.
  • –––, 2006. “Falsificationism Falsified”, Foundations of Science, 11: 275–286.
  • –––, 2013. “Defining pseudoscience and science”, pp. 61–77 in Pigliucci and Boudry (eds.) 2013.
  • –––, 2017. “Science denial as a form of pseudoscience”, Studies in History and Philosophy of Science, 63: 39–47.
  • –––, 2018. “How connected are the major forms of irrationality? An analysis of pseudoscience, science denial, fact resistance and alternative facts”, Mètode Science Study Journal, 8: 125–131.
  • –––, 2020. “Social constructivism and climate science denial”, European Journal for Philosophy of Science, 10: 37.
  • Hoyningen-Huene, Paul, 2013. Systematicity: The Nature of Science, Oxford: Oxford University Press.
  • Irzik, Gürol, and Robert Nola, 2011. “A family resemblance approach to the nature of science for science education”, Science and Education, 20(7): 591–607.
  • Jolley, Daniel, and Karen M. Douglas, 2014. “The effects of anti-vaccine conspiracy theories on vaccination intentions”, PloS One, 9(2): e89177.
  • Keeley, Brian L., 1999. “Of Conspiracy Theories”, The Journal of Philosophy, 96(3): 109–126.
  • Kitcher, Philip, 1982. Abusing Science: The Case Against Creationism, Cambridge, MA: MIT Press.
  • Krystal, Arthur, 1999. “At Large and at Small: What Do You Know?”, American Scholar, 68(2): 7–13.
  • Kuhn, Thomas S., 1974. “Logic of Discovery or Psychology of Research?”, pp. 798–819 in P.A. Schilpp (ed.), The Philosophy of Karl Popper (The Library of Living Philosophers, Volume 14, Book 2), La Salle: Open Court.
  • Lakatos, Imre, 1970. “Falsification and the Methodology of Scientific Research Programmes”, pp. 91–197 in Imre Lakatos and Alan Musgrave (eds.), Criticism and the Growth of Knowledge, Cambridge: Cambridge University Press.
  • –––, 1974a. “Popper on Demarcation and Induction”, pp. 241–273 in P.A. Schilpp (ed.), The Philosophy of Karl Popper (The Library of Living Philosophers, Volume 14, Book 1), La Salle: Open Court.
  • –––, 1974b. “Science and pseudoscience”, Conceptus, 8: 5–9.
  • –––, 1981. “Science and pseudoscience”, pp. 114–121 in S. Brown et al. (eds.), Conceptions of Inquiry: A Reader, London: Methuen.
  • Langmuir, Irving, [1953] 1989. “Pathological Science”, Physics Today, 42(10): 36–48.
  • Laudan, Larry, 1983. “The demise of the demarcation problem”, pp. 111–127 in R.S. Cohen and L. Laudan (eds.), Physics, Philosophy, and Psychoanalysis, Dordrecht: Reidel.
  • Lewandowsky, Stephan, Toby D. Pilditch, Jens K. Madsen, Naomi Oreskes, and James S. Risbey, 2019. “Influence and seepage: An evidence-resistant minority can affect public opinion and scientific belief formation”, Cognition, 188: 124–139.
  • Liebenberg, L., 2013. The Origin of Science: The evolutionary roots of scientific reasoning and its implications for citizen science, Cape Town: CyberTracker.
  • Lugg, Andrew, 1987. “Bunkum, Flim-Flam and Quackery: Pseudoscience as a Philosophical Problem”, Dialectica, 41: 221–230.
  • –––, 1992. “Pseudoscience as nonsense”, Methodology and Science, 25: 91–101.
  • Mahner, Martin, 2007. “Demarcating Science from Non-Science”, pp 515-575 in Theo Kuipers (ed.) Handbook of the Philosophy of Science: General Philosophy of Science – Focal Issues , Amsterdam: Elsevier.
  • –––, 2013. “Science and pseudoscience. How to demarcate after the (alleged) demise of the demarcation problem”, pp. 29–43 in Pigliucci and Boudry (eds.) 2013.
  • Mayo, Deborah G., 1996. “Ducks, rabbits and normal science: Recasting the Kuhn’s-eye view of Popper’s demarcation of science”, British Journal for the Philosophy of Science , 47: 271–290.
  • Merton, Robert K., [1942] 1973. “Science and Technology in a Democratic Order”, Journal of Legal and Political Sociology , 1: 115–126, 1942; reprinted as “The Normative Structure of Science”, in Robert K. Merton, The Sociology of Science. Theoretical and Empirical Investigations , Chicago: University of Chicago Press, pp. 267–278.
  • Moberger, Victor, 2020. “Bullshit, Pseudoscience and Pseudophilosophy”, Theoria , 86(5): 595–611.
  • Morris, Robert L., 1987. “Parapsychology and the Demarcation Problem”, Inquiry , 30: 241–251.
  • Oreskes, Naomi, 2019. “Systematicity is necessary but not sufficient: on the problem of facsimile science”, Synthese , 196(3): 881–905.
  • Oreskes, Naomi and Erik M. Conway, 2010. Merchants of doubt: how a handful of scientists obscured the truth on issues from tobacco smoke to global warming , New York: Bloomsbury Press.
  • Pennock, Robert T., 2010. “The postmodern sin of intelligent design creationism” Science and Education , 19(6–8): 757–778.
  • –––, 2011. “Can’t philosophers tell the difference between science and religion?: Demarcation revisited”, Synthese , 178(2): 177–206.
  • Pigliucci, Massimo, 2013. “The demarcation problem. A (belated) response to Laudan”, in Pigliucci and Boudry (eds.) 2013, pp. 9–28.
  • Pigliucci, Massimo and Maarten Boudry (eds.), 2013. Philosophy of Pseudoscience. Reconsidering the demarcation problem. Chicago: Chicago University Press.
  • Popper, Karl, 1962. Conjectures and refutations. The growth of scientific knowledge , New York: Basic Books.
  • –––, 1974 “Reply to my critics”, in P.A. Schilpp, The Philosophy of Karl Popper (The Library of Living Philosophers, Volume XIV, Book 2), La Salle: Open Court, pp. 961–1197.
  • –––, 1976. Unended Quest London: Fontana.
  • –––, 1978. “Natural Selection and the Emergence of the Mind”, Dialectica , 32: 339–355.
  • –––, [1989] 1994. “Falsifizierbarkeit, zwei Bedeutungen von”, pp. 82–86 in Helmut Seiffert and Gerard Radnitzky, Handlexikon zur Wissenschaftstheorie , 2 nd edition, München: Ehrenwirth GmbH Verlag.
  • Powell, James, 2019. “Scientists reach 100% consensus on anthropogenic global warming”, Bulletin of Science, Technlogy and Society , 37(4): 183–184.
  • Radner, Daisie and Michael Radner, 1982. Science and Unreason , Belmont CA: Wadsworth.
  • Reisch, George A., 1998. “Pluralism, Logical Empiricism, and the Problem of Pseudoscience”, Philosophy of Science , 65: 333–348.
  • Rothbart, Daniel, 1990 “Demarcating Genuine Science from Pseudoscience”, in Patrick Grim, ed, Philosophy of Science and the Occult , 2nd edition, Albany: State University of New York Press, pp. 111–122.
  • Ruse, Michael, 1977. “Karl Popper’s Philosophy of Biology”, Philosophy of Science , 44: 638–661.
  • –––, 2000. “Is evolutionary biology a different kind of science?”, Aquinas , 43: 251–282.
  • Ruse, Michael (ed.), (1996). But is it science? The philosophical question in the creation/evolution controversy , Amherst, NY: Prometheus Books.
  • Scott, P., Richards, E., and Martin, B., 1990. “Captives of controversy. The Myth of the Neutral Social Researcher in Contemporary Scientific Controversies”, Science, Technology, and Human Values , 15(4): 474–494.
  • Settle, Tom, 1971. “The Rationality of Science versus the Rationality of Magic”, Philosophy of the Social Sciences , 1: 173–194.
  • Siitonen, Arto, 1984. “Demarcation of science from the point of view of problems and problem-stating”, Philosophia Naturalis , 21: 339–353.
  • Thagard, Paul R., 1978. “Why Astrology Is a Pseudoscience”, Philosophy of Science Association ( PSA 1978 ), 1: 223–234.
  • –––, 1988. Computational Philosophy of Science , Cambridge, MA: MIT Press.
  • Thurs, Daniel P. and Ronald L. Numbers, 2013. “Science, pseudoscience and science falsely so-called”, in Pigliucci and Boudry (eds.) 2013, pp. 121–144.
  • Torcello, Lawrence, 2016. “The ethics of belief, cognition, and climate change pseudoskepticism: implications for public discourse”, Topics in Cognitive Science , 8: 19–48.
  • Vollmer, Gerhard, 1993. Wissenschaftstheorie im Einsatz, Beiträge zu einer selbstkritischen Wissenschaftsphilosophie Stuttgart: Hirzel Verlag.
  • Wazeck, Milena, 2009. Einsteins Gegner. Die öffentliche Kontroverse um die Relativitätstheorie in den 1920er Jahren . Frankfurt: campus.
  • Williams, Nigel, 2005. “Heavyweight attack on climate-change denial”, Current Biology , 15(4): R109–R110.

Anthroposophy

  • Hansson, Sven Ove, 1991. “Is Anthroposophy Science?”, Conceptus 25: 37–49.
  • Staudenmaier, Peter, 2014. Between Occultism and Nazism. Anthroposophy and the Politics of Race in the Fascist Era , Leiden: Brill.
  • James, Edward W, 1990. “On Dismissing Astrology and Other Irrationalities”, in Patrick Grim (ed.) Philosophy of Science and the Occult , 2nd edition, State University of New York Press, Albany, pp. 28–36.
  • Kanitscheider, Bernulf, 1991. “A Philosopher Looks at Astrology”, Interdisciplinary Science Reviews , 16: 258–266.

Climate science denialism

  • McKinnon, Catriona, 2016. “Should We Tolerate Climate Change Denial?”, Midwest Studies in Philosophy , 40(1): 205–216.
  • Torcello, Lawrence, 2016. “The Ethics of Belief, Cognition, and Climate Change Pseudoskepticism: Implications for Public Discourse”, Topics in Cognitive Science , 8(1): 19–48.

Creationism

  • Lambert, Kevin, 2006. “Fuller’s folly, Kuhnian paradigms, and intelligent design”, Social Studies of Science , 36(6): 835–842.
  • Pennock, Robert T., 2010. “The postmodern sin of intelligent design creationism”, Science and Education , 19(6–8): 757–778.
  • Ruse, Michael (ed.), 1996. But is it science? The philosophical question in the creation/evolution controversy , Prometheus Books.
  • Matthews, Michael R., 2019. Feng Shui: Teaching about science and pseudoscience , Springer.

Holocaust denial

  • Lipstadt, Deborah E., 1993. Denying the Holocaust: the growing assault on truth and memory , New York : Free Press.

Parapsychology

  • Edwards, Paul, 1996. Reincarnation: A Critical Examination , Amherst NY: Prometheus.
  • Flew, Antony, 1980. “Parapsychology: Science or Pseudoscience”, Pacific Philosophical Quarterly , 61: 100–114.
  • Hales, Steven D., 2001. “Evidence and the afterlife”, Philosophia , 28(1–4): 335–346.

Psychoanalysis

  • Boudry, Maarten, and Filip Buekens, 2011. “The epistemic predicament of a pseudoscience: Social constructivism confronts Freudian psychoanalysis”, Theoria , 77(2): 159–179.
  • Cioffi, Frank, 1998. Freud and the Question of Pseudoscience . Chigago: Open Court.
  • –––, 2013. “Pseudoscience. The case of Freud’s sexual etiology of the neuroses”, in Pigliucci and Boudry (eds.) 2013, pp. 321–340.
  • Grünbaum, Adolf, 1979. “Is Freudian psychoanalytic theory pseudoscientific by Karl Popper’s criterion of demarcation?”, American Philosophical Quarterly , 16: 131–141.

Quackery and non–scientific medicine

  • Jerkert, Jesper, 2013. “Why alternative medicine can be scientifically evaluated. Countering the evasions of pseudoscience”, in Pigliucci and Boudry (eds.) 2013, pp. 305–320.
  • Smith, Kevin, 2012a. “Against homeopathy–a utilitarian perspective”, Bioethics , 26(8): 398–409.
  • –––, 2012b. “Homeopathy is unscientific and unethical”, Bioethics , 26(9): 508–512.
How to cite this entry . Preview the PDF version of this entry at the Friends of the SEP Society . Look up topics and thinkers related to this entry at the Internet Philosophy Ontology Project (InPhO). Enhanced bibliography for this entry at PhilPapers , with links to its database.
  • The Skeptic’s Dictionary , contains information, links and references about a wide variety of contested claims and phenomena.
  • Committee for Skeptical Inquiry , the major international organization promoting scientific investigations of contested phenomena.
  • Quackwatch , devoted to critical assessment of scientifically unvalidated health claims.
  • Views of modern philosophers , a summary of the views that modern philosophers have taken on astrology, expanded from an article published in Correlation: Journal of Research into Astrology , 14/2 (1995): 33–34.

creationism | evolution | -->Freud, Sigmund --> | Kuhn, Thomas | Lakatos, Imre | -->logical positivism --> | natural selection | Popper, Karl | skepticism | Vienna Circle

Copyright © 2021 by Sven Ove Hansson <soh@kth.se>


The Stanford Encyclopedia of Philosophy is copyright © 2023 by The Metaphysics Research Lab , Department of Philosophy, Stanford University

Library of Congress Catalog Data: ISSN 1095-5054


Pseudoscience examples for critical thinking skills

Snake oil, grapefruit diets, flat-earth theories—pseudoscience is something to be ignored, right? Not in science class! Studying pseudoscience is actually a great way to help students think like scientists, not to mention savvy citizens. We guarantee it, or your money back! (Kidding.)


MIRACLE HAIR GROWTH! 

Quantum hair activation technology: This groundbreaking innovation goes beyond conventional science, delving into the realm of quantum energy to stimulate hair growth at the subatomic level. Blended with rare botanicals from ancient civilizations for luster and shine. Limited-time offer: Act now and receive a vial of stardust-infused hair serum!

Effective product…or pseudoscience? We’ll bet you guessed it. (Sorry, no stardust serum for you!)

While this hair product itself sounds like junk, reading about it can be a valuable experience for science students.

Teaching your students to identify pseudoscience in the world around them helps them learn to protect themselves from false claims that can be money-wasting at best, dangerous at worst.

And as they learn to discern, they also develop lifelong critical thinking skills!

“We say knowledge is power but it's not enough to know things, and there's too much to know. Being able to think and not fall for someone's bunk is my goal for my students.”  —Melanie Trecek-King, biology professor and guest in Science Connections podcast Season 3, Episode 5: Thinking is power

Let’s explore how educators can use examples of pseudoscience to develop critical thinking skills—and incorporate NGSS (Next Generation Science Standards) science and engineering practices into their approach.

What’s the difference between science and pseudoscience?

Science is grounded in empirical evidence, rigorous testing, and the scientific method. Pseudoscience presents itself as scientific but lacks the fundamental elements of genuine scientific inquiry: evidence, peer review, and the capacity to generate accurate predictions.

Though its claims may be vague, pseudoscience has clear identifying characteristics. When something is pseudoscience, it:

  • Can’t be proven wrong: Makes claims that are unobservable or too vague to test.
  • Professes “proof” without presenting actual evidence: Presents only anecdotal evidence, if any.
  • Uses technobabble: See: “Quantum hair activation technology.”

For more characteristics of pseudoscience, check out Melanie Trecek-King’s episode of Science Connections!

To be sure, not all pseudoscience is harmful—pursuits and activities such as aromatherapy and astrology can be positive experiences in people’s lives—they just should not be considered science.

How addressing pseudoscience encourages critical thinking

When you teach students to identify pseudoscience, you are teaching them to use an evidence- and research-based approach when analyzing claims. Which is…science!

You are also:

  • Teaching them to engage in thoughtful and educational argument/debate.
  • Encouraging them to use their knowledge of science in the real world.
  • Creating real-world impact.

When students learn to identify pseudoscience—faulty products, myths, and disprovable “discoveries”—they’ll be prepared and informed when making real-world decisions.

Critical thinking exercises inspired by pseudoscience

We’ve talked about “miracle” hair growth treatments, which are typically targeted at adults. Students are more likely to have encountered claims about or ads for alkaline water or detox diets, conspiracy theories and instances of science denial, astrology, and more. These examples offer great opportunities to discuss how to determine the difference between science and pseudoscience.

Suggested activities:

  • Pseudoscience Sherlock: Ask students to find real-life examples of pseudoscience on social media, on the internet, or among products sold in stores. Tell them to pay close attention to “articles” that are really ads.
  • Pseudoscience lab: Prompt students to back up their claim that a given example represents pseudoscience with evidence: e.g., lack of empirical evidence, controlled experiments, or unbiased sample; absence of peer-reviewed research; reliance on anecdotes; hyperbolic and unprovable claims.
  • Snake oil! Ask students to practice identifying pseudoscience by creating their own advertisements, commercials, or news segments for fake products or scientific “advancements.”
  • Spread the word: Ask students to create flyers, PSAs, or articles on how to identify the characteristics of pseudoscience.

Other activities that incorporate the NGSS while also sniffing out pseudoscience:

  • Asking questions: Encourage students to ask probing questions about pseudoscientific claims. How does this claim defy our current understanding of the natural world? What empirical evidence is missing?
  • Developing and using models: Have students create models that illustrate the differences between a pseudoscientific claim and a well-established scientific concept. This visual representation supports understanding and critical analysis.
  • Engaging in argument from evidence: Arrange debates where students argue for or against a pseudoscientific claim using evidence-based reasoning. This practice sharpens their ability to critically evaluate information.
  • Obtaining, evaluating, and communicating information: Ask students to research the history and impact of a specific pseudoscientific belief. Have them present their findings, highlighting how critical thinking could have prevented widespread acceptance of the claim.

Using examples of pseudoscience in your science classroom can help students learn to not only think like scientists, but navigate the real world, too.

Bertha Vasquez, former teacher and current director of education at the Center for Inquiry, has used these approaches with her students. As she shared on Season 3, Episode 6 of Science Connections: “I guarantee you that those students, when they walked into a store with their parents and they saw a product [with] a money-back guarantee [that] cures way too many things, and it’s based on ‘ancient plant wisdom’ and has ‘scientific’ language on the box, they may go, ‘Mom, I think these people are trying to sell you some pseudoscience.’”

More to explore

Science Connections

  • Season 3, Episode 5: Thinking is power
  • Season 3, Episode 6: Identifying and addressing pseudoscience

Back-to-school science toolkit for administrators, teachers, and caregivers



What’s trending in the world of pseudoscience



Our office’s mission is to separate sense from nonsense, which may well be a Sisyphean task. There is a lot of pseudoscience—meaning ideas and interventions that look scientific but that are not—especially around health. Having a mental map of what is trending right now can help us better understand the landscape so that we can intervene more effectively. I have been interested in health-related pseudoscience for over a decade now. Here is what I see being popular at the moment and who is pushing (and often profiting from) these narratives.

Trend #1: Scienceploitation

A term coined by Timothy Caulfield from the University of Alberta, scienceploitation is the abuse of real but preliminary findings in an emerging field of research in order to sell you a product or service that is not ready for primetime. It’s bundling hype into a package that can make money.

The microbiome, for example, fits squarely in this niche. The fact that our body has been colonized by a myriad of microorganisms is not pseudoscience; it is a scientific fact. These bacteria have a real impact on our health, and the effectiveness of fecal transplants for C. difficile infections shows that we can devise science-based treatments that modify the microbiome. But the fervour around this field of study has outpaced the evidence, and we can now buy probiotics, prebiotics and postbiotics that carry little proof they do anything to our bodies, as well as testing kits that promise insight.

Likewise, scienceploitation has touched disciplines as diverse as stem cells, cannabis, and nutrigenomics, churning theoretical knowledge and laboratory data into profitable cure-alls. The people marketing scienceploitation tend not to be solo influencers on social media so much as companies built around these early scientific promises. Beware of products that leverage the hype around red light therapy, psychedelics, epigenetics, and cancer immunotherapy. There may be justified applications, and more to come in the future, but these concepts are ripe for exploitation of both real science and consumers.

Trend #2: Body optimization

The epicentre of the body optimization movement seems to be Silicon Valley in California. In the land of the tech entrepreneur, science can be used to “hack” human biology and increase productivity and longevity. Two of the main ambassadors for body optimization are podcasters: Joe Rogan and Andrew Huberman.

With body optimization, the goal is to beat the lab mice to it and jump on early findings not to treat disease (as was the case for scienceploitation) but to fulfill the possibilities of the human body in day-to-day life. Nootropics (so-called smart drugs) are popular in this space: they are alleged to help you think faster and more clearly. The problem is that the evidence so far does not live up to the hype, and there are less sexy but more effective ways of improving your thinking, like getting a good night’s sleep.

Be cautious around the masculine certainty with which body optimization tools like intermittent fasting and assorted anti-aging interventions are endorsed. The interventions themselves are often based on in vitro or animal studies, and the certainty has not been earned.

Trend #3: Integrative medicine

Pushed by advocates like Andrew Weil and adopted by hospitals all over the world (and even by the World Health Organization), the latest rebranding of alternative medicine is alive and well. Its tenet is that conventional medicine is imperfect (which is true) and that complementary and alternative medical practices like Reiki, homeopathy, and reflexology have something valuable to offer. To get the best of both worlds, they must be integrated. But as Dr. Mark Crislip of Science-Based Medicine famously wrote, “if you mix cow pie with apple pie, it does not make the cow pie taste better; it makes the apple pie worse.” The integration of disproven and unproven ideas into medicine does not make it stronger.

And yet, large medical institutions have fallen for the siren call of integrative medicine, often because patients clamour for it and wealthy donors who believe in it are willing to finance it. Many of these treatment modalities have been poorly studied and rely on pre-scientific ideas about how the human body works, often hinging on a single cause for all diseases and a miraculous panacea. When the intervention is convincingly shown in rigorous trials to be no better than placebo, this invalidation is reinterpreted in a positive light: it works through the placebo effect, which is redefined as the power of the mind to convince the body to heal itself. Don’t fall for this bit of magical thinking.

Trend #4: Wellness

As we make our way through these trends, you will notice a shift away from a very pro-science and pro-technology attitude toward one that becomes more and more anti-science and anti-technology. With wellness, we see the embrace of a lifestyle that is sold as all-natural and that is meant to prevent disease in the first place. Wellness figures like Joe Mercola and Gwyneth Paltrow will tell you that if you fill your body with natural substances, you will never need a doctor.

The central dogma of wellness is that modernity has polluted our world with ill-defined toxins. To be healthy, we need to detox. This is a gross exaggeration, bordering on a falsehood, and the toxin-flushing remedies sold within this lucrative industry carry no real benefits but very disturbing risks, such as tearing the intestine during a colon cleanse.

Public health interventions are of very little interest to the wellness crowd; instead, wellness zeroes in on a hyper-individualized form of health, in which improvements are credited to a person’s adherence to the right wellness regime… and any health decline can similarly be blamed on them for not doing enough to keep themselves thriving.

While wellness authorities decry the pharmaceuticals pushed by physicians, they deceptively act similarly with dietary supplements, binders and adaptogens, piles of poorly regulated pills that are marketed as natural and therefore salutary.

Trend #5: The fear of genetic modifications

Anxieties over genetically modified organisms (GMOs) have been transposed onto messenger RNA (mRNA) vaccines. The public in general has a poor understanding of genetics and molecular biology. This vulnerability can be exploited by people like Joe Mercola and Peter McCullough, who fuel the fear that these new technologies might violate the integrity and sanctity of a person’s DNA.

With GMOs, fearmongers were quick to spread Frankenstein-inspired imagery online, as genetically engineered food products (real and imagined) were painted as unnatural and potentially harmful. The mRNA vaccines against COVID-19, meanwhile, were vilified as potentially introducing mutations in our DNA (an event with such low probability as to be practically impossible). Now, it’s the mRNA vaccines used on livestock, like pigs (and soon in shrimp too), that provide fodder for food anxieties: will eating pork damage our own DNA? The answer is no, and the safety of these technologies has been well documented. Activists fanning the flames of this genetic anxiety, however, remain mum on this documentation, preferring to point to the long shadow of potential long-term side effects. Within these communities, gene-based technologies have never been studied long enough and will never be safe.

We should be ready for similar arguments to be made if/when self-amplifying RNA vaccines, in which the gene introduced by the vaccine comes with its own replication machinery that can make copies of it for weeks, are rolled out.

Trend #6: Anti-vaccination

The COVID-19 pandemic could have been a stark reminder to anti-vaccine activists and to people who are vaccine hesitant of the importance of vaccines: the threat was significant in both scope and severity. Vaccines can be victims of their own success when the diseases they prevent are no longer on people’s minds. COVID-19 was much harder to ignore.

Despite the reality of the pandemic, the anti-vaccine movement was reenergized. The creed of its adherents is that vaccines are dangerous and ineffective, while the diseases they prevent are mild. The movement has many prominent and vocal figures, like Robert F. Kennedy Jr., Del Bigtree, Stew Peters, and Steve Kirsch, and has enticed new recruits during the pandemic, such as the evolutionary biologist Bret Weinstein and the oncologist Vinay Prasad. The playbook is simple: deny official statistics on safety and efficacy and trawl vaccine safety databases like VAERS to extract so-called vaccine injuries like myocarditis in order to delineate a conspiracy by authorities to suppress the truth.

Some anti-vaxxers oppose all vaccines, while newer acolytes in the movement simply draw the line at the COVID-19 vaccines. Once they start networking with others in the movement and the distrust in institutions builds up, however, the leap from COVID anti-vaxxer to general anti-vaxxer becomes very tempting.

As an alternative to vaccines, supplements and off-patent drugs (like ivermectin and hydroxychloroquine) are often promoted as safer treatments and preventatives. The flawed argument here is that these older drugs must be effective precisely because pharmaceutical companies cannot rake in profits from them; Pharma then suppresses this knowledge in order to market more profitable and more toxic drugs to the masses.

From pseudoscience to science denial to conspiracy thinking

Believing in a pseudoscience often goes hand-in-hand with denying actual science . If homeopathy, the absurd dilution of natural substances to make them stronger and cure disease, is real, then there is little need for modern medicine. If this is true, then how can we explain why mainstream experts disagree with us on this? The answer often found within these communities is that the experts know that homeopathy works. They are silencing the truth because corporate interests cannot make money off of it, or because they want to cause harm to the world at large and keep us all sick. Therefore, as we scratch the surface of a pseudoscience, we commonly find conspiracy theories thriving in the dark.

Which conspiracy theories are blooming right now under the veneer of pseudoscience? The Great Reset Initiative (a plan put forth by the World Economic Forum and interpreted by conspiracists as the arrival of a world government that will take away ownership of goods and currency), the Great Replacement Theory (the idea that white people are being replaced by immigrants of colour), the concept of 15-minute cities (in which improved urban planning is framed as a totalitarian restriction on movements), transhumanism (the fear that powerful people want to fuse us with machines so that we lose our soul), the fiction that the COVID-19 pandemic was planned by governments, the panic around fake meat and edible insects and what it means for our future dietary choices, and of course Big Pharma (the accusation that the pharmaceutical industry is so big and powerful, it now controls academia, governments, and the media).

Within this conspiracy-mongering discourse, you will hear mentions of “medical freedom” (the idea that the institutionalized practice of medicine is hopelessly corrupt and that doctors must be sought outside the pharmaceutical and health insurance system) and of “sovereign citizenship” (the claim that human laws do not really exist and that we should only obey God’s laws). The echoes of the QAnon political movement can still be heard, especially its more feminine, pastel-coloured version’s slogan of “Save the Children,” which dovetails with the current wave of transphobia. And, as with most conspiracy theories, when we dig all the way to the bottom, what we often find is old-fashioned antisemitism.

It is important to note that the above trends are not discrete entities: they often overlap. The fear of genetic modifications feeds our contemporary anti-vaccine movement, and dietary supplements are endorsed in most of these trends. Yet as we go down the list, we do observe worsening attitudes toward science and technology and a wider embrace of conspiratorial thinking. This inventory is also not comprehensive: there is a lot of pseudoscience out there , but these six trends represent particularly popular types of health-related pseudoscientific beliefs and interventions at the moment.

We could devise an entirely different classification based on activity: food pseudoscience, sleep pseudoscience, fitness pseudoscience, etc. I chose instead to highlight trends based on their underlying philosophy, because I think it better exposes the beliefs that spread within these communities and it helps us understand why people gravitate toward them.

The good news is that many of those who fall under the spell of these pseudoscientific trends have simply been misinformed. Explaining to them why these interventions are not based on good evidence, why they are implausible, and why they may appear to work (through placebo effects, for example, or through a reliance on carefully curated testimonials) can often lead them to change their mind and to make better-informed decisions about their health.

But there are those who strongly identify with these trends and who may fully buy into their underlying conspiracy theories. Facts, in these cases, will not be enough. If these people are close to you, try to keep the lines of communication open, in case they ever feel ready to question their beliefs.

With any criticism of pseudoscientific activities, being kind is more likely to be productive than calling people “idiots.”

I hope that this survey is more instructive than overwhelming. The first step in addressing pseudoscience is making sure we have a good lay of the land.

@CrackedScience

