Misinformation is incorrect or misleading information presented as fact without deliberate intent to deceive. This distinguishes it from disinformation, which is deliberately deceptive, and from rumor, which is unverified speculation. Even if later retracted, misinformation can continue to influence actions and memory.
Early examples include the insults and smears spread among political rivals in Imperial and Renaissance Italy in the form of "pasquinades." These are anonymous and witty verses named for the Pasquino piazza and "talking statue" in Rome. An example from pre-revolutionary France is "canards", or printed broadsides that sometimes included an engraving to help convince readers to take their wild tales seriously.
The first recorded large-scale misinformation campaign was the "Great Moon Hoax," published in 1835 in the New York Sun: a series of six articles claiming to describe life on the Moon, "complete with illustrations of humanoid bat-creatures and bearded blue unicorns". The fast pace and sometimes contentious work of mass-producing news broadsheets also led to editions containing factual errors, a famous example being the Chicago Tribune's 1948 headline "Dewey Defeats Truman".
Identification and correction
According to Anne Mintz, editor of Web of Deception: Misinformation on the Internet, one of the best ways to determine whether information is factual is to use common sense. Mintz advises readers to check whether the information makes sense and whether those spreading it are biased or have an agenda. Professional journalists and researchers consult other sites (particularly verified sources such as news channels), since information there is more likely to have been reviewed by multiple people or heavily researched, providing more reliable detail.
Martin Libicki, author of Conquest in Cyberspace: National Security and Information Warfare, noted that working with misinformation requires readers to strike a balance: they should not be gullible, but neither should they become so paranoid that they dismiss all information as incorrect. Even readers who strike this balance may sometimes believe an error to be true, or a truth to be an error.
A person's formal education level and information or media literacy correlate with their ability to recognize misinformation: someone who is more familiar with how information is researched and presented, or who is better at critically evaluating information from any source, is more likely to identify misinformation correctly. However, increasing literacy does not always improve detection, as a certain level of literacy can instead be used to "justify belief in misinformation." Further research reveals that content descriptors can have varying effects on people's ability to detect misinformation.
Prior research suggests it can be very difficult to undo the effects of misinformation once individuals believe it to be truthful, and that fact-checking can even backfire. Individuals may have a desire to reach a certain conclusion, causing them to accept information that supports that conclusion. This is known as motivated reasoning. Individuals create mental models and schemas to understand their physical and social environments. Misinformation that becomes incorporated into a mental model, especially for long periods of time, will be more difficult to address as individuals prefer to have a complete mental model. In this instance, it is necessary to correct the misinformation by both refuting it and providing accurate information that can function in the mental model. When attempting to correct misinformation, it is important to consider previous research which has identified effective and ineffective strategies. Simply providing the corrected information is insufficient to correct the effects of misinformation, and it may even have a negative effect. Due to the familiarity heuristic—information that is familiar is more likely to be believed to be true—corrective messages which contain a repetition of the original misinformation may result in an increase in familiarity and cause a backfire effect.
Factors that contribute to the effectiveness of a corrective message include an individual's mental model or worldview beliefs, repeated exposure to the misinformation, time-lag between misinformation and correction, credibility and reliability of the sources, and relative coherency of the misinformation and corrective message. Corrective messages will be more effective when they are coherent and/or consistent with the target audience's worldview beliefs. They will be less effective when misinformation is believed to come from a credible source, is repeated prior to correction (even if the repetition occurs in the process of debunking), and/or when there is a time lag between the misinformation exposure and corrective message. Additionally, corrective messages delivered by the original source of the misinformation tend to be more effective.
A suggested solution that would focus on primary prevention of misinformation is the use of a distributed consensus mechanism to validate the accuracy of claims, with appropriate flagging or removal of content determined to be false or misleading. Another approach is to "inoculate" against misinformation by delivering it in a weakened form: warning of its dangers and including counterarguments that expose the misleading techniques at work. One way to apply this approach is parallel argumentation, in which the flawed logic is transferred to a parallel situation (e.g., shared extremity or absurdity). This approach exposes bad logic without the need for complicated explanations.
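The flagging logic of such a consensus mechanism can be illustrated with a minimal sketch in Python (the quorum size, threshold, and vote labels below are hypothetical illustrations, not taken from any deployed system):

```python
from collections import Counter

def consensus_verdict(votes, quorum=5, flag_threshold=0.7):
    """Aggregate independent reviewers' votes ('false' or 'accurate') on a claim.

    A claim is flagged only when at least `quorum` reviewers have voted and
    a supermajority of them deems it false; both parameters are illustrative.
    """
    if len(votes) < quorum:
        return "insufficient votes"
    tally = Counter(votes)
    if tally["false"] / len(votes) >= flag_threshold:
        return "flag"
    return "keep"

# Seven independent reviewers assess one claim; five call it false.
print(consensus_verdict(["false", "false", "accurate", "false",
                         "false", "false", "accurate"]))  # -> flag
```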
Flagging or eliminating false statements in news media using algorithmic fact-checkers is becoming the front line in the battle against the spread of misinformation. Computer programs that automatically detect misinformation are still just beginning to emerge, but similar algorithms are already in place at Facebook and Google. Google provides supplemental information pointing to fact-checking websites when users search for controversial terms, and Facebook's algorithms warn users when something they are about to share is likely false, in hopes of reducing the chance that they share it.
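At their simplest, such detectors are text classifiers trained on labeled examples. The sketch below shows the general technique, not any platform's actual system; the tiny training set and its labels are invented for illustration, and scikit-learn is assumed to be available:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented toy training data: headlines labeled by hand.
headlines = [
    "Miracle cure eliminates virus overnight, doctors stunned",
    "Secret study proves the media is hiding the truth",
    "Health agency publishes vaccine trial results in peer-reviewed journal",
    "Statistics office reports quarterly growth figures",
]
labels = ["misinformation", "misinformation", "credible", "credible"]

# Bag-of-words features weighted by TF-IDF, fed to a linear classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(headlines, labels)

# Likely labeled 'misinformation' given the toy training set above.
print(model.predict(["Secret miracle cure the media is hiding"]))
```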
A related concern is over-censorship by platforms such as Facebook and Twitter. Many free-speech activists argue that their voices are not being heard and that their rights are being taken away. To combat the spread of misinformation, social media platforms must strike a balance between allowing free speech and preventing conspiracy theories from spreading across their platforms.
Websites have been created to help people discern fact from fiction. For example, FactCheck.org has a mission to fact-check the media, especially political speeches and stories going viral on the Internet. The site also includes a forum where people can openly ask whether information they have encountered in the media or on the Internet is true. Similar sites let individuals copy and paste misinformation into a search engine, after which the site investigates the truthfulness of the input. Prominent online services such as Facebook and Google have added automatic fact-checking programs to their sites and given users the option to flag information they believe is false. One way fact-checking programs find misinformation is by analyzing the language and syntax of news stories; another is to search for existing information on the subject and compare it to the news being put online. Other sites, such as Wikipedia and Snopes, are also widely used resources for verifying information.
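The "search and compare" strategy can be sketched as a similarity lookup against a store of previously fact-checked claims. In this minimal sketch, the database entries, the similarity threshold, and the `lookup` helper are all hypothetical:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical store of already fact-checked claims and their verdicts.
fact_checks = {
    "5G networks spread the coronavirus": "False",
    "The Great Moon Hoax articles ran in the New York Sun in 1835": "True",
}

def lookup(claim, threshold=0.3):
    """Return the verdict of the most similar known claim, if similar enough."""
    known = list(fact_checks)
    vectorizer = TfidfVectorizer().fit(known + [claim])
    scores = cosine_similarity(vectorizer.transform([claim]),
                               vectorizer.transform(known))[0]
    best = scores.argmax()
    return fact_checks[known[best]] if scores[best] >= threshold else "no match"

print(lookup("5G networks cause coronavirus"))  # -> False
```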
Historically, people have relied on journalists and other information professionals to relay facts. Many different factors cause miscommunication, but an underlying one is information literacy: because information is distributed by so many means, it is often hard for users to assess the credibility of what they see. Many online sources of misinformation use techniques to fool users into thinking their sites are legitimate and their content factual. Misinformation is often politically motivated; for example, websites such as USConservativeToday.com have posted false information for political and monetary gain. Misinformation can also serve to distract the public from negative information about a given person or policy issue, which can then go unremarked while public attention is preoccupied with fake news. Beyond being shared for political and monetary gain, misinformation is also spread unintentionally.
Misinformation is sometimes an unintended side effect of bias. Misguided opinions can lead to its unintentional spread: individuals who do not intend to spread false propaganda may share false information that they have not checked or referenced. At the same time, there are plenty of instances where information is intentionally skewed or omits major defining details and facts, causing people to make irrational decisions. Misinformation is thus not simply false information; it is, more broadly, misleading information.
Another reason for the recent spread of misinformation is the lack of consequences. With little to no repercussions, there is nothing to stop people from posting misleading information: the gain from influencing other people's minds outweighs the impact of a removed post or a temporary ban on Twitter. This leaves individual companies to set the rules and policies for when people's "free speech" impinges on other users' quality of life.
Social media can contribute to the spread of misinformation when users share information without first checking the legitimacy of the information they have found.
In the Information Age, social networking sites have become a notable agent for the spread of misinformation, fake news, and propaganda. Misinformation on social media spreads quickly in comparison to traditional media because of the lack of regulation and examination required before posting. These sites provide users with the capability to spread information quickly to other users without requiring the permission of a gatekeeper such as an editor who might otherwise require confirmation of the truth before allowing publication. Journalists today are criticized for helping to spread false information on these social platforms, but research shows they also play a role in curbing the spread of misinformation on social media through debunking and denying false rumors.
Social media platforms offer rich ground for the spread of misinformation, though the exact mechanisms and motivations behind why it spreads so easily remain unknown. A 2018 study of Twitter determined that, compared to accurate information, false information spread significantly faster, further, deeper, and more broadly. Similarly, a research study of Facebook found that misinformation was more likely to be clicked on than factual information. Combating its spread is difficult for two reasons: the profusion of information sources and the generation of "echo chambers". The profusion of sources makes the reader's task of weighing the reliability of information more challenging, heightened by the untrustworthy social signals that accompany such information. Echo chambers and filter bubbles arise from people's inclination to follow or support like-minded individuals; with no differing information to counter untruths, and with general agreement within isolated social clusters, some writers argue the outcome is a dearth, or even the complete absence, of a collective reality. Although social media sites have changed their algorithms to prevent the spread of fake news, the problem persists. Furthermore, research has shown that while people may know what the scientific community has established as fact, they may still refuse to accept it as such.
Misinformation thrives in a social media landscape heavily used by college students; scholars such as Ghosh and Scott have indicated that misinformation is "becoming unstoppable." It has also been observed that misinformation and disinformation return repeatedly to social media sites: one research study tracked thirteen rumors appearing on Twitter and found that eleven of them resurfaced multiple times long after their first appearance.
Another reason misinformation spreads on social media is the users themselves. One study found that the most common reasons Facebook users shared misinformation were socially motivated rather than a matter of taking the information seriously; even when users do not spread false information maliciously, the misinformation still circulates. Research also shows that misinformation introduced through a social format influences individuals drastically more than misinformation delivered non-socially. Facebook's handling of misinformation has become a hot topic with the spread of COVID-19, as some reports indicated that Facebook recommended pages containing health misinformation: a user who likes an anti-vaccination Facebook page, for example, is automatically recommended more and more anti-vaccination pages. Some go further, arguing that Facebook's inconsistent censorship of misinformation has led to deaths from COVID-19.
Twitter is one of the most concentrated platforms for engagement with political fake news: 80% of fake news sources are shared by 0.1% of users, the "super-sharers". Over 70% of adults in the United States have Facebook accounts, and 70% of those with accounts visit the site daily. Older, more conservative social media users are also more likely to interact with fake news; on Facebook, adults older than 65 were seven times more likely to share fake news than adults aged 18–29. Another source of misinformation on Twitter is bot accounts: much misinformation, especially surrounding climate change, is driven by bots sharing stories, and the use of bots to spread wilful misinformation has been a persistent problem for social media platforms to address. Facebook estimated that up to 60 million troll bots were actively spreading misinformation on its platform and has taken measures to stop the spread; the misinformation appearing on Facebook has dropped but remains present.
Spontaneous spread of misinformation on social media usually occurs when users share posts from friends or mutual followers, often posts from someone the sharer believes they can trust. Other misinformation is created and spread with malicious intent, sometimes to cause anxiety and other times to deceive audiences; rumors created with malicious intent are often shared by unknowing users.
With the large audiences that can be reached and the experts on various subjects who are present there, social media could also be key to correcting misinformation.
Lack of peer review
Due to the decentralized nature and structure of the Internet, content creators can easily publish content without being required to undergo peer review, prove their qualifications, or provide backup documentation. While library books have generally been reviewed and edited by an editor, publishing company, etc., Internet sources cannot be assumed to be vetted by anyone other than their authors. Misinformation may be produced, reproduced, and posted immediately on most online platforms.
Social media sites such as Facebook and Twitter have found themselves defending against accusations of censorship for removing posts they have deemed to be misinformation. Social media censorship policies that rely on government-issued guidance to determine information validity have garnered criticism for their unintended effect of stifling dissent and criticism of government positions and policies. Most recently, social media companies have faced criticism over allegedly prematurely censoring discussion of the SARS-CoV-2 lab leak hypothesis. Other cases of censorship appear aimed at preventing social media consumers from harming themselves through the use of unproven COVID-19 treatments. For example, in July 2020 a video went viral showing Dr. Stella Immanuel claiming hydroxychloroquine was an effective cure for COVID-19. In the video, Immanuel suggested that there was no need for masks, school closures, or any kind of economic shutdown, attesting that her alleged cure was highly effective in treating those infected with the virus. The video was shared 600,000 times and received nearly 20 million views on Facebook before it was taken down for violating community guidelines on spreading misinformation. It was also taken down on Twitter overnight, but not before former president Donald Trump shared it to his page, which was followed by over 85 million Twitter users. NIAID director Dr. Anthony Fauci and members of the World Health Organization (WHO) quickly discredited the video, citing larger-scale studies of hydroxychloroquine showing it is not an effective treatment for COVID-19, and the FDA cautioned against using it to treat COVID-19 patients following evidence of serious heart problems in patients who had taken the drug.
Another prominent example of misinformation cited as an example of censorship was the New York Post's report on the Hunter Biden laptop, which was used to promote the Biden–Ukraine conspiracy theory. Social media companies quickly removed the report, and the Post's Twitter account was temporarily suspended. Over 50 former intelligence officials said the disclosure of emails allegedly belonging to Joe Biden's son had all the "classic earmarks of a Russian information operation". Later evidence emerged that at least some of the laptop's contents were authentic. Because the laptop's emails were used to promote the false narrative that Joe Biden misused his decades of public service to enrich himself and his family by taking kickbacks for securing jobs for Hunter, the laptop story is an example of facts used to mislead.
Inaccurate information from media sources
A Gallup poll made public in 2016 found that only 32% of Americans trust the mass media "to report the news fully, accurately and fairly", the lowest number in the history of that poll. An example of bad information from media sources that led to the spread of misinformation occurred in November 2005, when Chris Hansen on Dateline NBC claimed that law enforcement officials estimate 50,000 predators are online at any moment. Afterward, the U.S. attorney general at the time, Alberto Gonzales, repeated the claim. However, the number that Hansen used in his reporting had no backing. Hansen said he received the information from Dateline expert Ken Lanning, but Lanning admitted that he made up the number 50,000 because there was no solid data on the number. According to Lanning, he used 50,000 because it sounds like a real number, not too big and not too small, and referred to it as a "Goldilocks number". Reporter Carl Bialik says that the number 50,000 is used often in the media to estimate numbers when reporters are unsure of the exact data.
Competition in news and media
Because news organizations and websites compete for viewers, stories must be released to the public with great efficiency. The news media landscape of the 1970s offered American consumers access to a limited but overall consistent and trusted selection of news offerings, whereas today consumers are confronted with an overabundance of voices online. This explosion of consumer choice allows consumers to pick a news source that matches their preferred agenda, which consequently increases the likelihood that they are misinformed; in 2017, 47% of Americans reported social media as their main news source, as opposed to traditional news sources. News media companies broadcast stories 24 hours a day and break the latest news in hopes of taking audience share from their competitors. News is also produced at a pace that does not always allow for fact-checking, or for all of the facts to be collected or released at one time, letting readers or viewers insert their own opinions and possibly leading to the spread of misinformation.
Misinformation and disinformation have often been associated with the concept of fake news, which some scholars define as "fabricated information that mimics news media content in form but not in organizational process or intent." Intentional misinformation has become normalized in politics and around topics of great import to the public, such as climate change and the COVID-19 pandemic, and has caused irreversible damage to public understanding and trust. Egelhofer et al. argued that the media's wide adoption of the term "fake news" has served to normalize the concept and stabilize its use as a buzzword in everyday language. Goldstein (2021) discussed the need for government agencies and organizations to increase the transparency of their practices and services by using social media, which companies can then use to bring full transparency to the public. Used strategically, social media can offer an agency or agenda (e.g., a political campaign or a vaccination effort) a way for the public to feel connected and a place for people to keep up to date with real-time news and developments.
Disinformation is not just a problem in the United States. In the United Kingdom, many people followed and believed a conspiracy theory linking the coronavirus to the 5G network, a very popular theory that arose from a series of hashtags on Twitter originating in the UK. This is just one example of how misinformation is a global threat. Misinformation can also be used to deflect accountability: Syria's repeated use of chemical weapons, for example, was the subject of a disinformation campaign intended to prevent accountability. In Defending Weapons Inspections from the Effects of Disinformation (2021), Stewart shows how disinformation was used to conceal and purposely misinform the public about Syria's violations of international law, with the intention of creating plausible deniability, so that discussion of possible violations would be dismissed as untruthful rumor. Because such disinformation campaigns have been so effective and normalized, the opposing side has also started relying on disinformation to prevent repercussions for unfavorable behavior by those pushing a counter-narrative.
Freeze et al. (2020) explored whether people can accurately recollect an event when presented with misinformation after the event occurred, and their work suggests the unfortunate truth that in most cases the damage of misinformation can be irreparable. They found that an individual's recollection of political events can be altered by misinformation about the event, and that even people who can identify warning signs of misinformation have a hard time retaining which pieces of information are accurate and which are not. Furthermore, their results showed that people may discard accurate information entirely if they incorrectly deem a news source "fake news" or untrustworthy, disregarding completely credible information. Damstra et al. (2021) remind us that misinformation has been around since the establishment of the press, leaving little room to wonder how it has become normalized today.
Lanoszka (2019) argued that fake news does not have to be treated as an unwinnable war. Misinformation can create a sense of chaos and anarchy in society: if people mistrust one another at that level, no single idea can successfully move forward. There must be trust in other people and their processes for agendas to progress, an important consideration given the very active and real intentional efforts to misinform and cause harm.
Misinformation was a major talking point during the 2016 American presidential election, centering on whether social media sites were allowing "fake news" to spread across their platforms. Social media became polarized and political: some argued that misinformation about COVID-19 had been circulating, creating skepticism of vaccines and of Dr. Fauci, while others argued that platforms such as Facebook had been unconstitutionally censoring conservative voices and spreading misinformation to persuade voters.
This polarization on social media platforms has caused people to question the source of their information, and skepticism of news platforms has created widespread distrust of the news. Misinformation is often blended with accurate information so as to seem true, and it does not simply mean false information: social media platforms make it easy to skew and manipulate facts to present a different view of a topic, often to cast events in a negative light.
Misinformation can affect all aspects of life. Allcott, Gentzkow, and Yu concur that the diffusion of misinformation through social media is a potential threat to democracy and broader society. Misinformation can degrade the accuracy of information and of the details of an event: a person eavesdropping on a conversation can gather facts that may not always be true, or may hear the message incorrectly and spread it to others. On the Internet, one can read content claimed to be factual that may be unchecked or erroneous. In the news, companies may emphasize the speed at which they receive and relay information at the expense of accuracy. These developments contribute to the way misinformation will continue to complicate the public's understanding of issues and to serve as a source for belief and attitude formation.
With regard to politics, some view being a misinformed citizen as worse than being an uninformed one: misinformed citizens can state their beliefs and opinions with confidence and in turn affect elections and policies. This type of misinformation arises when speakers are not upfront and straightforward yet appear both "authoritative and legitimate" on the surface. When information is presented as vague, ambiguous, sarcastic, or partial, receivers are forced to piece it together and make assumptions about what is correct. Aside from political propaganda, misinformation can also be employed in industrial propaganda: using tools such as advertising, a company can undermine reliable evidence or influence belief through a concerted misinformation campaign. For instance, tobacco companies employed misinformation in the second half of the twentieth century to diminish the reliability of studies demonstrating the link between smoking and lung cancer. In the medical field, misinformation can immediately endanger lives, as seen in the public's negative perception of vaccines or the use of herbs instead of medicines to treat diseases. With regard to the COVID-19 pandemic, the spread of misinformation has proven to cause confusion as well as negative emotions such as anxiety and fear, and misinformation about proper safety measures that contradicts information from legitimate institutions such as the World Health Organization can lead to inadequate protection and place individuals at risk of exposure.
Misinformation has the power to sway public elections and referendums if it gains enough momentum in public discourse. Leading up to the 2016 United Kingdom European Union membership referendum, for example, a figure widely circulated by the Vote Leave campaign claimed the UK would save £350 million a week by leaving the EU and that the money would be redirected to the British National Health Service. The claim was later deemed a "clear misuse of official statistics" by the UK Statistics Authority: the advert famously displayed on the side of London's renowned double-decker buses did not take into account the UK's budget rebate, and the idea that 100% of the money saved would go to the NHS was unrealistic. A poll published in 2016 by Ipsos MORI found that nearly half of the British public believed this misinformation to be true. Even when information is proven to be misinformation, it may continue to shape attitudes towards a given topic, meaning misinformation can swing political decisions if it gains enough traction in public discussion.
Some scholars and activists are pioneering a movement to eliminate the mis/disinformation and information pollution in the digital world. The theory they are developing, "information environmentalism," has become a curriculum in some universities and colleges.
- Big lie
- List of common misconceptions
- List of fake news websites
- List of satirical news websites
- Character assassination
- Defamation (libel and slander)
- Counter Misinformation Team
- List of fallacies
- Junk science
- Flat earth
- Social engineering (in political science and cybercrime)
- "Misinformation". Merriam-Webster Dictionary. Retrieved 19 August 2020.
- "Disinformation". Merriam-Webster Dictionary. Retrieved 19 August 2020.
- Woolley, Samuel C.; Howard, Philip N. (2016). "Political Communication, Computational Propaganda, and Autonomous Agents". International Journal of Communication. 10: 4882–4890. Archived from the original on 2019-10-22. Retrieved 2019-10-22.
- Caramancion, Kevin Matthe (March 2020). "An Exploration of Disinformation as a Cybersecurity Threat". 2020 3rd International Conference on Information and Computer Technologies (ICICT). IEEE: 440–444. doi:10.1109/icict50521.2020.00076. ISBN 978-1-7281-7283-5. S2CID 218651389.
- Ecker, Ullrich K.H.; Lewandowsky, Stephan; Cheung, Candy S.C.; Maybery, Murray T. (November 2015). "He did it! She did it! No, she did not! Multiple causal explanations and the continued influence of misinformation" (PDF). Journal of Memory and Language. 85: 101–115. doi:10.1016/j.jml.2015.09.002.
- "The True History of Fake News". The New York Review of Books. 2017-02-13. Archived from the original on 2019-02-05. Retrieved 2019-02-24.
- "A short guide to the history of 'fake news' and disinformation". International Center for Journalists. Archived from the original on 2019-02-25. Retrieved 2019-02-24.
- Mintz, Anne. "The Misinformation Superhighway?". PBS. Archived from the original on 2 April 2013. Retrieved 26 February 2013.
- Jain, Suchita; Sharma, Vanya; Kaushal, Rishabh (September 2016). "Towards automated real-time detection of misinformation on Twitter". 2016 International Conference on Advances in Computing, Communications and Informatics (ICACCI). IEEE Conference Publication. pp. 2015–2020. doi:10.1109/ICACCI.2016.7732347. ISBN 978-1-5090-2029-4. S2CID 17767475.
- Libicki, Martin (2007). Conquest in Cyberspace: National Security and Information Warfare. New York: Cambridge University Press. pp. 51–55. ISBN 978-0521871600.
- Khan, M. Laeeq; Idris, Ika Karlina (2019-02-11). "Recognise misinformation and verify before sharing: a reasoned action and information literacy perspective". Behaviour & Information Technology. 38 (12): 1194–1212. doi:10.1080/0144929x.2019.1578828. ISSN 0144-929X. S2CID 86681742.
- Lazer, David M. J.; Baum, Matthew A.; Benkler, Yochai; Berinsky, Adam J.; Greenhill, Kelly M.; Menczer, Filippo; Metzger, Miriam J.; Nyhan, Brendan; Pennycook, Gordon; Rothschild, David; Schudson, Michael; Sloman, Steven A.; Sunstein, Cass R.; Thorson, Emily A.; Watts, Duncan J.; Zittrain, Jonathan L. (2018). "The science of fake news". Science. 359 (6380): 1094–1096. Bibcode:2018Sci...359.1094L. doi:10.1126/science.aao2998. PMID 29590025. S2CID 4410672.
- Vraga, Emily K.; Bode, Leticia (December 2017). "Leveraging Institutions, Educators, and Networks to Correct Misinformation: A Commentary on Lewandosky, Ecker, and Cook". Journal of Applied Research in Memory and Cognition. 6 (4): 382–388. doi:10.1016/j.jarmac.2017.09.008. ISSN 2211-3681.
- Caramancion, Kevin Matthe (September 2020). "Understanding the Impact of Contextual Clues in Misinformation Detection". 2020 IEEE International IOT, Electronics and Mechatronics Conference (IEMTRONICS): 1–6. doi:10.1109/IEMTRONICS51293.2020.9216394. ISBN 978-1-7281-9615-2. S2CID 222297695.
- Ecker, Ullrich K. H.; Lewandowsky, Stephan; Chadwick, Matthew (2020-04-22). "Can Corrections Spread Misinformation to New Audiences? Testing for the Elusive Familiarity Backfire Effect". Cognitive Research: Principles and Implications. 5 (1): 41. doi:10.31219/osf.io/et4p3. PMC 7447737. PMID 32844338.
- Busselle, Rick (2017), "Schema Theory and Mental Models", The International Encyclopedia of Media Effects, American Cancer Society, pp. 1–8, doi:10.1002/9781118783764.wbieme0079, ISBN 978-1-118-78376-4, retrieved 2021-03-28
- Plaza, Mateusz; Paladino, Lorenzo (2019). "The use of distributed consensus algorithms to curtail the spread of medical misinformation". International Journal of Academic Medicine. 5 (2): 93–96. doi:10.4103/IJAM.IJAM_47_19. S2CID 201803407.
- "Supplemental Material for The Role of Familiarity in Correcting Inaccurate Information". Journal of Experimental Psychology: Learning, Memory, and Cognition. 2017. doi:10.1037/xlm0000422.supp. ISSN 0278-7393.
- Walter, Nathan; Tukachinsky, Riva (2019-06-22). "A Meta-Analytic Examination of the Continued Influence of Misinformation in the Face of Correction: How Powerful Is It, Why Does It Happen, and How to Stop It?". Communication Research. 47 (2): 155–177. doi:10.1177/0093650219854600. ISSN 0093-6502. S2CID 197731687.
- Cook, John (May–June 2020). "Using Humor And Games To Counter Science Misinformation". Skeptical Inquirer. Vol. 44 no. 3. Amherst, New York: Center for Inquiry. pp. 38–41. Archived from the original on 31 December 2020. Retrieved 31 December 2020.
- Lewandowsky, Stephan; Ecker, Ullrich K.H.; Cook, John (December 2017). "Beyond Misinformation: Understanding and Coping with the "Post-Truth" Era". Journal of Applied Research in Memory and Cognition. 6 (4): 353–369. doi:10.1016/j.jarmac.2017.07.008. hdl:1983/1b4da4f3-009d-4287-8e45-a0a1d7b688f7. ISSN 2211-3681.
- "Facebook exposed over its handling of COVID - ProQuest". www.proquest.com.ProQuest 2553642687. Retrieved 2021-10-07.
- "When Misinformation is Misinformation - ProQuest". www.proquest.com.ProQuest 2477885938. Retrieved 2021-10-10.
- "Ask FactCheck". www.factcheck.org. Archived from the original on 2016-03-31. Retrieved 2016-03-31.
- Fernandez, Miriam; Alani, Harith (2018). "Online Misinformation" (PDF). Companion of the Web Conference 2018 on the Web Conference 2018 – WWW '18. New York: ACM Press: 595–602. doi:10.1145/3184558.3188730. ISBN 978-1-4503-5640-4. S2CID 13799324. Archived (PDF) from the original on 2019-04-11. Retrieved 2020-02-13.
- Zhang, Chaowei; Gupta, Ashish; Kauten, Christian; Deokar, Amit V.; Qin, Xiao (December 2019). "Detecting fake news for reducing misinformation risks using analytics approaches". European Journal of Operational Research. 279 (3): 1036–1052. doi:10.1016/j.ejor.2019.06.022. ISSN 0377-2217. S2CID 197492100.
- Calvert, Philip (December 2002). "Web of Deception: Misinformation on the Internet". The Electronic Library. 20 (6): 521. doi:10.1108/el.2002.20.6.521.7. ISSN 0264-0473.
- Klepper, David (1 January 2022). "Conspiracy theories paint fraudulent reality of Jan. 6 riot". AP News. Quotes Dustin Carnahan, a Michigan State University professor who studies political misinformation: "Conspiracy theories have long lurked in the background of American history."
- Marwick, Alice E. (2013-01-31), "Online Identity", in John Hartley; Jean Burgess; Axel Bruns (eds.), A Companion to New Media Dynamics, Wiley-Blackwell, pp. 355–364, doi:10.1002/9781118321607.ch23, ISBN 978-1-118-32160-7
- Chen, Xinran; Sin, Sei-Ching Joanna (2013). "'Misinformation? What of it?' Motivations and individual differences in misinformation sharing on social media". Proceedings of the American Society for Information Science and Technology. 50 (1): 1–4. doi:10.1002/meet.14505001102. ISSN 1550-8390.
- Stawicki, Stanislaw; Firstenberg, Michael; Papadimos, Thomas. "The Growing Role of Social Media in International Health Security: The Good, the Bad, and the Ugly". Global Health Security. 1 (1): 341–357.
- Vosoughi, Soroush; Roy, Deb; Aral, Sinan (2018-03-09). "The spread of true and false news online" (PDF). Science. 359 (6380): 1146–1151. Bibcode:2018Sci...359.1146V. doi:10.1126/science.aap9559. PMID 29590045. S2CID 4549072. Archived from the original (PDF) on 2019-04-29. Retrieved 2019-08-21.
- Tucker, Joshua A.; Guess, Andrew; Barbera, Pablo; Vaccari, Cristian; Siegel, Alexandra; Sanovich, Sergey; Stukal, Denis; Nyhan, Brendan. "Social Media, Political Polarization, and Political Disinformation: A Review of the Scientific Literature". Hewlett Foundation White Paper. Archived from the original on 2019-03-06. Retrieved 2019-03-05.
- Machado, Caio; Kira, Beatriz; Narayanan, Vidya; Kollanyi, Bence; Howard, Philip (2019). "A Study of Misinformation in WhatsApp groups with a focus on the Brazilian Presidential Elections". Companion Proceedings of the 2019 World Wide Web Conference on – WWW '19. New York: ACM Press: 1013–1019. doi:10.1145/3308560.3316738. ISBN 978-1450366755. S2CID 153314118.
- Chen, Xinran; Sin, Sei-Ching Joanna; Theng, Yin-Leng; Lee, Chei Sian (September 2015). "Why Students Share Misinformation on Social Media: Motivation, Gender, and Study-level Differences". The Journal of Academic Librarianship. 41 (5): 583–592. doi:10.1016/j.acalib.2015.07.003.
- Caramancion, Kevin Matthe (2021), "The Role of Information Organization and Knowledge Structuring in Combatting Misinformation: A Literary Analysis", Computational Data and Social Networks, Cham: Springer International Publishing, pp. 319–329, ISBN 978-3-030-91433-2, retrieved 2021-12-19
- Starbird, Kate; Dailey, Dharma; Mohamed, Owla; Lee, Gina; Spiro, Emma (2018). "Engage Early, Correct More: How Journalists Participate in False Rumors Online during Crisis Events". Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI '18). doi:10.1145/3173574.3173679. S2CID 5046314. Retrieved 2019-02-24.
- Arif, Ahmer; Robinson, John; Stanck, Stephanie; Fichet, Elodie; Townsend, Paul; Worku, Zena; Starbird, Kate (2017). "A Closer Look at the Self-Correcting Crowd: Examining Corrections in Online Rumors" (PDF). Proceedings of the 2017 ACM Conference on Computer Supported Cooperative Work and Social Computing (CSCW '17): 155–169. doi:10.1145/2998181.2998294. ISBN 978-1450343350. S2CID 15167363. Archived (PDF) from the original on 26 February 2019. Retrieved 25 February 2019.
- Swire-Thompson, Briony; Lazer, David (2020). "Public Health and Online Misinformation: Challenges and Recommendations". Annual Review of Public Health. 41: 433–451. doi:10.1146/annurev-publhealth-040119-094127. PMID 31874069.
- Dwoskin, Elizabeth. "Misinformation on Facebook got six times more clicks than factual news during the 2020 election, study says". The Washington Post.
- Messerole, Chris (2018-05-09). "How misinformation spreads on social media – And what to do about it". Brookings Institution. Archived from the original on 25 February 2019. Retrieved 24 February 2019.
- Benkler, Y. (2017). "Study: Breitbart-led rightwing media ecosystem altered broader media agenda". Archived from the original on 4 June 2018. Retrieved 8 June 2018.
- Allcott, Hunt (October 2018). "Trends in the Diffusion of Misinformation on Social Media" (PDF). Stanford Education. arXiv:1809.05901. Bibcode:2018arXiv180905901A. Archived (PDF) from the original on 2019-07-28. Retrieved 2019-05-10.
- Krause, Nicole M.; Scheufele, Dietram A. (2019-04-16). "Science audiences, misinformation, and fake news". Proceedings of the National Academy of Sciences. 116 (16): 7662–7669. doi:10.1073/pnas.1805871115. ISSN 0027-8424. PMC 6475373. PMID 30642953.
- Allcott, Hunt; Gentzkow, Matthew; Yu, Chuan (2019-04-01). "Trends in the diffusion of misinformation on social media". Research & Politics. 6 (2): 2053168019848554. doi:10.1177/2053168019848554. ISSN 2053-1680. S2CID 52291737.
- Shin, Jieun; Jian, Lian; Driscoll, Kevin; Bar, François (June 2018). "The diffusion of misinformation on social media: Temporal pattern, message, and source". Computers in Human Behavior. 83: 278–287. doi:10.1016/j.chb.2018.02.008. ISSN 0747-5632. S2CID 41956979.
- Chen, Xinran; Sin, Sei-Ching Joanna; Theng, Yin-Leng; Lee, Chei Sian (2015). "Why Do Social Media Users Share Misinformation?". Proceedings of the 15th ACM/IEEE-CE on Joint Conference on Digital Libraries – JCDL '15. New York: ACM Press: 111–114. doi:10.1145/2756406.2756941. ISBN 978-1-4503-3594-2. S2CID 15983217.
- Gabbert, Fiona; Memon, Amina; Allan, Kevin; Wright, Daniel B. (September 2004). "Say it to my face: Examining the effects of socially encountered misinformation" (PDF). Legal and Criminological Psychology. 9 (2): 215–227. doi:10.1348/1355325041719428. ISSN 1355-3259.
- "Revealed: a quarter of all tweets about climate crisis produced by bots". The Guardian. 2020-02-21. Retrieved 2021-04-20.
- Milman, Oliver (2020-02-21). "Revealed: quarter of all tweets about climate crisis produced by bots". The Guardian. ISSN 0261-3077. Archived from the original on 2020-02-22. Retrieved 2020-02-23.
- Massey, Douglas S.; Iyengar, Shanto (2019-04-16). "Scientific communication in a post-truth society". Proceedings of the National Academy of Sciences. 116 (16): 7656–7661. doi:10.1073/pnas.1805868115. ISSN 0027-8424. PMC 6475392. PMID 30478050.
- Thai, My T.; Wu, Weili; Xiong, Hui (2016-12-01). Big Data in Complex and Social Networks. CRC Press. ISBN 978-1-315-39669-9.
- Bode, Leticia; Vraga, Emily K. (2018-09-02). "See Something, Say Something: Correction of Global Health Misinformation on Social Media". Health Communication. 33 (9): 1131–1140. doi:10.1080/10410236.2017.1331312. ISSN 1041-0236. PMID 28622038. S2CID 205698884.
- Stapleton, Paul (2003). "Assessing the quality and bias of web-based sources: implications for academic writing". Journal of English for Academic Purposes. 2 (3): 229–245. doi:10.1016/S1475-1585(03)00026-2.
- "Facebook's Lab-Leak About-Face". WSJ.
- "Covid origin: Why the Wuhan lab-leak theory is being taken seriously". BBC News. 27 May 2021.
- "Hydroxychloroquine: Why a video promoted by Trump was pulled on social media". BBC News. 2020-07-28. Retrieved 2021-11-24.
- "Stella Immanuel - the doctor behind unproven coronavirus cure claim". BBC News. 2020-07-29. Retrieved 2020-11-23.
- Bertrand, Natasha (October 19, 2020). "Hunter Biden story is Russian disinfo, dozens of former intel officials say". Politico. Archived from the original on October 20, 2020. Retrieved October 20, 2020.
- Lizza, Ryan (September 21, 2021). "POLITICO Playbook: Double Trouble for Biden". Politico.
- Marwick, Alice; Lewis, Rebecca (2017). Media Manipulation and Disinformation Online. New York: Data & Society Research Institute. pp. 40–45.
- Gladstone, Brooke (2012). The Influencing Machine. New York: W. W. Norton & Company. pp. 49–51. ISBN 978-0393342468.
- Shearer, Elisa; Gottfried, Jeffrey (2017-09-07). "News Use Across Social Media Platforms 2017". Pew Research Center's Journalism Project. Retrieved 2021-03-28.
- Croteau, David; Hoynes, William; Milan, Stefania. "Media Technology" (PDF). Media Society: Industries, Images, and Audiences. pp. 285–321. Archived (PDF) from the original on January 2, 2013. Retrieved March 21, 2013.
- "Misinformation - ProQuest". www.proquest.com. Retrieved 2021-12-16.
- Egelhofer, Jana Laura; Aaldering, Loes; Eberl, Jakob-Moritz; Galyga, Sebastian; Lecheler, Sophie (2020-03-30). "From Novelty to Normalization? How Journalists Use the Term "Fake News" in their Reporting". Journalism Studies. 21 (10): 1323–1343. doi:10.1080/1461670x.2020.1745667. ISSN 1461-670X.
- Stewart, Mallory (2021). "Defending Weapons Inspections from the Effects of Disinformation". AJIL Unbound. 115: 106–110. doi:10.1017/aju.2021.4. ISSN 2398-7723.
- Damstra, Alyt; Boomgaarden, Hajo G.; Broda, Elena; Lindgren, Elina; Strömbäck, Jesper; Tsfati, Yariv; Vliegenthart, Rens (2021-09-29). "What Does Fake Look Like? A Review of the Literature on Intentional Deception in the News and on Social Media". Journalism Studies. 22 (14): 1947–1963. doi:10.1080/1461670x.2021.1979423. ISSN 1461-670X.
- Lanoszka, Alexander (June 2019). "Disinformation in international politics". European Journal of International Security. 4 (2): 227–248. doi:10.1017/eis.2019.6. ISSN 2057-5637.
- "Clarifying misinformation Clarifying - ProQuest". www.proquest.com.ProQuest 1771695334. Retrieved 2021-10-10.
- Bodner, Glen E.; Musch, Elisabeth; Azad, Tanjeem (2009). "Reevaluating the potency of the memory conformity effect". Memory & Cognition. 37 (8): 1069–1076. doi:10.3758/mc.37.8.1069. ISSN 0090-502X. PMID 19933452.
- Southwell, Brian G.; Thorson, Emily A.; Sheble, Laura (2018). Misinformation and Mass Audiences. University of Texas Press. ISBN 978-1477314586.
- Barker, David (2002). Rushed to Judgement: Talk Radio, Persuasion, and American Political Behavior. New York: Columbia University Press. pp. 106–109.
- O'Connor, Cailin; Weatherall, James Owen (2019). The Misinformation Age: How False Beliefs Spread. New Haven: Yale University Press. p. 10. ISBN 978-0300234015.
- Sinha, P.; Shaikh, S.; Sidharth, A. (2019). India Misinformed: The True Story. Harper Collins. ISBN 978-9353028381.
- Bratu, Sofia (May 24, 2020). "The Fake News Sociology of COVID-19 Pandemic Fear: Dangerously Inaccurate Beliefs, Emotional Contagion, and Conspiracy Ideation". Linguistic and Philosophical Investigations. 19: 128–134. doi:10.22381/LPI19202010.
- Gayathri Vaidyanathan (22 July 2020). "News Feature: Finding a vaccine for misinformation". Proceedings of the National Academy of Sciences of the United States of America. 117 (32): 18902–18905. Bibcode:2020PNAS..11718902V. doi:10.1073/PNAS.2013249117. ISSN 0027-8424. PMC 7431032. PMID 32699146. Wikidata Q97652640.
- "Misinformation on coronavirus is proving highly contagious". AP NEWS. 2020-07-29. Retrieved 2020-11-23.
- "The misinformation that was told about Brexit during and after the referendum". The Independent. 2018-07-27. Retrieved 2020-11-23.
- "Info-Environmentalism: An Introduction". Archived from the original on 2018-07-03. Retrieved 2018-09-28.
- "Information Environmentalism". Digital Learning and Inquiry (DLINQ). 2017-12-21. Archived from the original on 2018-09-28. Retrieved 2018-09-28.
- Allcott, H.; Gentzkow, M. (2017). "Social Media and Fake News in the 2016 Election". Journal of Economic Perspectives. 31 (2): 211–236. doi:10.1257/jep.31.2.211. S2CID 32730475.
- Baillargeon, Normand (4 January 2008). A short course in intellectual self-defense. Seven Stories Press. ISBN 978-1-58322-765-7. Retrieved 22 June 2011.
- Bakir, V.; McStay, A. (2017). "Fake News and The Economy of Emotions: Problems, causes, solutions". Digital Journalism. 6: 154–175. doi:10.1080/21670811.2017.1345645. S2CID 157153522.
- Cerf, Christopher; Navasky, Victor (1984). The Experts Speak: The Definitive Compendium of Authoritative Misinformation. Pantheon Books.
- Cook, John; Stephan Lewandowsky; Ullrich K. H. Ecker (2017-05-05). "Neutralizing misinformation through inoculation: Exposing misleading argumentation techniques reduces their influence". PLOS One. 12 (5): e0175799. Bibcode:2017PLoSO..1275799C. doi:10.1371/journal.pone.0175799. PMC 5419564. PMID 28475576.
- Murphy, Christopher (2005). Competitive Intelligence: Gathering, Analysing and Putting It to Work. Gower Publishing, Ltd. pp. 186–189. ISBN 0-566-08537-2. A case study of misinformation arising from simple error.
- O'Connor, Cailin, and James Owen Weatherall, "Why We Trust Lies: The most effective misinformation starts with seeds of truth", Scientific American, vol. 321, no. 3 (September 2019), pp. 54–61.
- Strässler, Jürg (1982). Idioms in English: A Pragmatic Analysis. Gunter Narr Verlag. pp. 43–44. ISBN 3-87808-971-6.
- Comic: Fake News Can Be Deadly. Here's How To Spot It (audio tutorial, graphic tutorial)