
What is fake news in science?

There seems to be a growing concern about ‘fake news’, but it is not always clear what that means. Strictly speaking, ‘fake’ implies invention, with synonyms such as ‘sham’, ‘forgery’, ‘imitation’, and ‘pretend’. ‘False’ is similar, but is more concerned with deliberate distortion of evidence that actually exists, and means ‘faulty’, ‘wrong’, or ‘misinformed’. For a review of some of the more widely discussed issues around ‘fake news’, the easiest way is to go to: . Here, however, we wish to emphasise some complexities that are missing from the current debates.

It may be clear, and very damaging, if someone deliberately deletes, changes, or suppresses factual, material evidence, or invents it. Other situations are not so clear-cut as to allow any individual or group to set themselves up as arbiters of what is ‘true’ and what is not.

There are few truths in life that are indisputable. Science proceeds through the main stages of first hypothesis/discovery, then analysis, and finally interpretation of data, using those three activities to produce the evidence that we act upon. Data can be checked, but the discovery methods, the selection of the data (which in turn relies on descriptions and definitions), and the aims and context of its discovery are often significant, and are not always available, or easy for a third party to come by and evaluate. How the findings are analysed depends on the tools used, such as experimental methods and statistics, which must be appropriate and applied properly. Interpretation of the results involves value judgements, which are affected by the researchers’ experience, knowledge, biases, and more.

In many situations, decision makers rely on experts to guide them on scientific matters; for them to say “We are listening to the scientists” has become synonymous with “We’re doing the right thing”. But are experts always right? They may be biased in favour of their own preconceptions, they may not be expert in all the circumstances relevant to a particular project, and no one is free from error. Being part of a panel of experts takes care of some of these issues, though we usually cannot be sure how any disagreements were resolved, nor what compromises were made.

There is a further risk in relying on the wisdom of expert groups to decide on falsehood. The argument that something is false is often based upon prior knowledge. Prior knowledge is used in many epidemiological studies, though prospective data are often sought and can be analysed continuously. Not only can data suffer from being out of date, but they can also be out of context in relation to data obtained in a different place or in different circumstances.

The Erice Declaration, from a meeting in 1997, contains the advice that all communications on scientific matters must make clear that ‘…Facts, hypotheses and conclusions should be distinguished, uncertainty acknowledged, and information provided in ways that meet both general and individual needs’ [1]. These are the basic responsibilities of scientific communication, allowing for a critical review of anyone’s work: being careful and critical is the essence of determining what one should throw out and what might be useful.

Peer review of scientific publications is the obvious way of seeking to eliminate misleading information and its subsequent use for actions that may be harmful. But here again, the evaluations are based on the assessors’ experience and other values. Are they valid? To make that judgment, the reader must be given clearly articulated reasoning and discussion of why the assessors take their stance. That includes a self-assessment by peer assessors of their own biases and experience, the aims and context of their own work, and more general conflicts of interest in the field.

We can now consider some of the less obvious implications of amassing information and knowledge from the multitude of sources available to us, reasoning from a well-known quote from Frank Zappa: “Without deviation from the norm, progress is not possible” [2].

As a current example, during the COVID-19 pandemic the use of face masks was the subject of debate, with considerable doubt expressed as to their effectiveness up to about April 2020 [3], some of that doubt arising from poor scientific evidence [4].

Since then, there have been a great number of studies on the effectiveness of masks, based on technical studies of particle filtration [5,6], as well as an editorial and a paper [7,8], and a continuously updated Science Brief from the US Centers for Disease Control and Prevention [9] summarizing the experimental and epidemiological evidence and informing the public clearly about the value of masks in the epidemic. The evidence is much more extensive than the references above suggest, but our intention is to give an idea, with examples that one can check, of the rigour of the evidence on the efficacy of masks, obtained progressively and summarized expertly for public consideration in less than a year.

Science and medicine are progressive and cumulative disciplines in every respect, evolving all the time. The reliable and tested from the past may still be true, but dogmatically sticking to ‘we’ve always done it this way!’ will be a barrier to progress. The attitude ‘we’ve never done it this way!’ is equally detrimental. Good scientists know that there will always be some degree of uncertainty about existing knowledge, and that some level of risk is involved in pursuing new ideas [10].

Perhaps most important is that new ideas and insights are the initiators of hypotheses that, with the accumulation of various data, will become increasingly solid evidence. We must not fall into the trap of dismissing opinions and ‘gut feelings’, so long as they are identified as such. Nor must we allow ourselves to be swayed too strongly by expert interpretations of evidence, or by extrapolations, modeling, and predictions, unless we are convinced of the origins of the evidence and the reasoning behind any conclusions.

In the end, there is an obligation on scientists to do the best, honest work they can, being open about any assumptions made in its development and about limitations in the value of their work and its use in practice. Both researchers and the readers and evaluators of research must use ‘phronesis’, translated as ‘practical wisdom’, in doing and critically evaluating research.

I. Ralph Edwards and Marie Lindquist




References

1. The Erice Declaration: on communicating drug safety information. Prescrire Int. 1998;7(38):191.
2. Zappa F, Occhiogrosso P. The Real Frank Zappa (new edition). Pan Macmillan; 1990.
3. Lifesaver or false protection: do face masks stop coronavirus? Financial Times. Available from: . Accessed 21 June 2021.
4. Bae S, Kim M-C, Kim JY, et al. Effectiveness of surgical and cotton masks in blocking SARS-CoV-2 [Notice of Retraction]. Ann Intern Med. 2020;173(1):79.
5. Eikenberry SE, Mancuso M, Iboi E, et al. To mask or not to mask: modeling the potential for face mask use by the general public to curtail the COVID-19 pandemic. Infect Dis Model. 2020;5:293–308.
6. Sickbert-Bennett EE, Samet JM, Clapp PW, et al. Filtration efficiency of hospital face mask alternatives available for use during the COVID-19 pandemic. JAMA Intern Med. 2020;180(12):1607–12.
7. Steinbrook R. Filtration efficiency of face masks used by the public during the COVID-19 pandemic. JAMA Intern Med. 2021;181(4):470.
8. Clapp PW, Sickbert-Bennett EE, Samet JM, et al.; US Centers for Disease Control and Prevention Epicenters Program. Evaluation of cloth masks and modified procedure masks as personal protective equipment for the public during the COVID-19 pandemic. JAMA Intern Med. 2021;181(4):463–9.
9. US Centers for Disease Control and Prevention. Scientific brief: community use of cloth masks to control the spread of SARS-CoV-2. Available from: . Accessed 21 June 2021.
10. Knapton S. How doom-filled predictions were based on out-of-date data. Available from: !preferred/0/package/634/pub/634/page/4/article/185632. Accessed 17 June 2021.