Misinformation is incorrect or misleading information. It differs from disinformation, which is ''deliberately'' deceptive. Rumors are information not attributed to any particular source, and so are unreliable and often unverified, but can turn out to be either true or false. Even when later retracted, misinformation can continue to influence actions and memory. People may be more prone to believe misinformation when they are emotionally connected to what they are listening to or reading. Social media has made information readily available at any time and connects vast groups of people along with their information. Advances in technology have changed the way we communicate and the way misinformation spreads. Misinformation affects societies' ability to receive accurate information, which in turn influences communities, politics, and medicine.


History

Early examples include the insults and smears spread among political rivals in Imperial and Renaissance Italy in the form of pasquinades: anonymous and witty verses named for the Pasquino piazza and talking statues in Rome. In pre-revolutionary France, "canards", or printed broadsides, sometimes included an engraving to convince readers to take them seriously. According to writers Renée DiResta and Tobias Rose-Stockwell, in 1588 false news of a victory of the Spanish Armada over the English (which had been expected) spread throughout Europe, and news of the actual English victory arrived many days later. The first recorded large-scale disinformation campaign was the Great Moon Hoax, published in 1835 in the New York ''The Sun'': a series of articles claimed to describe life on the Moon, "complete with illustrations of humanoid bat-creatures and bearded blue unicorns". The challenges of mass-producing news on a short deadline can also lead to factual errors, as in the ''Chicago Tribune'''s infamous 1948 headline "Dewey Defeats Truman". The advent of the Internet changed the traditional ways that misinformation spreads. During the 2016 United States presidential election, content from websites deemed untrustworthy reached up to 40% of Americans, even though such content made up only about 6% of overall news media. Later, during the COVID-19 pandemic, both intentional and unintentional misinformation, combined with a general lack of literacy regarding health science and medicine, proliferated and generated further misinformation. What makes people susceptible to misinformation, however, remains debated.


Identification and correction

According to Anne Mintz, editor of ''Web of Deception: Misinformation on the Internet'', one of the best ways to determine whether information is factual is to use common sense. Mintz advises readers to check whether the information makes sense and whether its founders or reporters are biased or have an agenda. Journalists and researchers look at other sites (particularly verified sources like news channels) for information, as such information is more likely to have been reviewed by multiple people or heavily researched, providing more reliable details. Martin Libicki, author of ''Conquest in Cyberspace: National Security and Information Warfare'', noted that readers must strike a balance: they cannot be gullible, but neither should they be paranoid that all information is incorrect. Even readers who strike this balance may sometimes believe an error to be true, or a truth to be an error. A person's formal education level and media literacy correlate with their ability to recognize misinformation: someone who is more familiar with how information is researched and presented, or who is better at critically evaluating information from any source, is more likely to identify misinformation correctly. However, increasing literacy may not improve detection, as a certain level of literacy can be used to "justify belief in misinformation". Further research reveals that content descriptors can have varying effects on people's ability to detect misinformation. Based on work by Scheufele and Krause, misinformation has social roots at the individual, group, and sociostructural levels.
At the individual level, efforts have focused on citizens' ability to recognize disinformation or misinformation and to correct their views accordingly. Proposed solutions for these cases range from algorithms that trace the origin of fake news to fact-checking of individual sites. One concern is that framing the problem as an "inability to recognize misinformation" assumes all citizens are misinformed and unable to discern and logically evaluate information that emerges from social media. The larger threat is a lack of evaluation skill: many individuals cannot identify sources that are biased, dated, or exploitative. Pew Research reports that approximately one in four American adults admit to having shared misinformation on their social media platforms. Poor media literacy also contributes to misinformation at the individual level, hence calls to improve media literacy in order to educate individual citizens about fake news. Other individual-level factors include motivations and emotions that drive motivated reasoning.

The second root is at the group level. People's social networks have changed as the social media environment has evolved, producing a different web of social networks in which individuals can "selectively disclose" information, often in a biased form. As in the telephone game played with a large group, the beliefs that are most widespread become the most repeated. Debunking misinformation can also backfire, because people tend to rely on the familiar information to which they were just exposed.
Homogeneous social groups pose a further problem: they nurture a misinformation mindset in which falsehoods are accepted as a social "norm", because contradictory information is scarce. These social networks create a "clustering" effect that can produce "specific rumor variations". Such variations lead beliefs to be perceived as more popular than they actually are, causing a rumor cascade across the network.

The third level of misinformation is the societal level, which is influenced by both the individual and group levels. Common figures associated with misinformation include politicians and other political actors who attempt to shape public opinion in their favor. The role of the mass media is to act as a corrective agent against misinformation, but American media has often lacked the objectivity needed to play that role. As print media evolved into radio, television, and now the internet, paid commercial actors have generated tailored content to attract viewers. To reach target audiences, platforms such as Facebook use data collection and "profiling" tools that track each user's preferences and deliver hypertargeted ads. These ads also compete for younger audiences' attention on social media, limiting the number of news sources they view daily. Axios cofounder Jim VandeHei described the resulting climate: "Survival...depends on giving readers what they really want, how they want it, when they want it, and on not spending too much money producing what they don't want." These changed news realities are attributed to "social mega trends" that have contributed heavily to the misinformation problem in the United States, alongside declining social capital, political polarization, widening economic inequality, declining trust in science, and the susceptibility of political parties themselves to misinformation.


Cognitive factors

Prior research suggests it can be difficult to undo the effects of misinformation once individuals believe it to be true, and that fact-checking can backfire. Individuals may desire to reach a certain conclusion, causing them to accept information that supports that conclusion, and they are more likely to retain and share information that emotionally resonates with them. Individuals create mental models and schemas to understand their physical and social environments. Misinformation that becomes incorporated into a mental model, especially for long periods of time, is more difficult to address, because individuals prefer a complete mental model. In this instance, it is necessary to correct the misinformation both by refuting it and by providing accurate information that can take its place in the mental model. When attempting to correct misinformation, it is important to consider previous research identifying effective and ineffective strategies. Simply providing the corrected information is insufficient, and may even have a negative effect: due to the familiarity heuristic, familiar information is more likely to be believed true, so corrective messages that repeat the original misinformation may increase its familiarity and cause a backfire effect. Factors that contribute to the effectiveness of a corrective message include an individual's mental model or worldview, repeated exposure to the misinformation, the time between misinformation and correction, the credibility of the sources, and the relative coherence of the misinformation and the corrective message. Corrective messages are more effective when they are coherent and/or consistent with the audience's worldview. They are less effective when the misinformation is believed to come from a credible source, is repeated prior to correction (even if the repetition occurs in the process of debunking), or when there is a time lag between the misinformation exposure and the corrective message. Additionally, corrective messages delivered by the original source of the misinformation tend to be more effective.


Countering misinformation

One suggested solution for preventing misinformation is a distributed consensus mechanism to validate the accuracy of claims, with appropriate flagging or removal of content determined to be false or misleading. Another approach is to "inoculate" against misinformation by delivering a weakened form of it together with warnings about its dangers, including counterarguments and demonstrations of the techniques used to mislead. One way to apply this is parallel argumentation, in which the flawed logic is transferred to a parallel situation (e.g. one of shared extremity or absurdity); this exposes bad logic without the need for complicated explanations. Flagging or eliminating false statements in media using algorithmic fact-checkers is an increasingly common tactic in the fight against misinformation. Computer programs that automatically detect misinformation are still emerging, but similar algorithms are already in place on Facebook and Google. Google provides supplemental information pointing to fact-checking websites when users search controversial terms, and Facebook's algorithms alert users that what they are about to share is likely false. A related issue is over-censorship on platforms like Facebook and Twitter: many free-speech activists argue that their voices are not being heard and their rights are being taken away. To combat the spread of misinformation, social media platforms must find a balance between allowing free speech and preventing misinformation from spreading across their platforms. Websites have been created to help people discern fact from fiction. For example, FactCheck.org aims to fact-check the media, especially viral political stories, and includes a forum where people can openly ask questions about the information. Similar sites allow individuals to copy and paste misinformation into a search engine to have it investigated. Some sites address misinformation about specific topics, such as climate change: DeSmog, formerly The DeSmogBlog, publishes factually accurate information to counter the well-funded disinformation campaigns of motivated climate-change deniers. Facebook and Google added automatic fact-checking programs to their sites and created options for users to flag information they think is false. One way fact-checking programs find misinformation is by analyzing the language and syntax of news stories; another is for fact-checkers to search for existing information on the subject and compare it to the news broadcasts being put online. Other sites such as Wikipedia and Snopes are also widely used resources for verifying information.
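The language-and-syntax approach mentioned above can be illustrated with a toy text classifier. The following sketch is a minimal bag-of-words Naive Bayes model; the training headlines and the "reliable"/"misleading" labels are invented for illustration, and real fact-checking systems use far richer features and far larger corpora.

```python
# Toy sketch of language-based misinformation detection: a tiny
# bag-of-words Naive Bayes classifier over invented example headlines.
import math
import re
from collections import Counter

def tokenize(text):
    return re.findall(r"[a-z']+", text.lower())

class NaiveBayes:
    def __init__(self):
        self.word_counts = {"reliable": Counter(), "misleading": Counter()}
        self.doc_counts = Counter()

    def train(self, text, label):
        self.doc_counts[label] += 1
        self.word_counts[label].update(tokenize(text))

    def predict(self, text):
        vocab = len(set().union(*(set(c) for c in self.word_counts.values())))
        scores = {}
        for label, counts in self.word_counts.items():
            total = sum(counts.values())
            # log prior + log likelihoods with add-one smoothing
            score = math.log(self.doc_counts[label] / sum(self.doc_counts.values()))
            for word in tokenize(text):
                score += math.log((counts[word] + 1) / (total + vocab))
            scores[label] = score
        return max(scores, key=scores.get)

clf = NaiveBayes()
clf.train("SHOCKING miracle cure doctors don't want you to know", "misleading")
clf.train("You won't believe this one weird trick", "misleading")
clf.train("Senate passes budget bill after lengthy debate", "reliable")
clf.train("Study finds modest decline in regional unemployment", "reliable")

print(clf.predict("Miracle trick doctors don't want you to see"))  # misleading
```

Even this crude model shows why the approach is attractive: sensationalist vocabulary is a strong surface signal, which is also why it is easy for determined actors to evade.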


Causes

Historically, people have relied on journalists and other information professionals to relay facts and truths about certain topics. Many different things cause miscommunication, but the underlying factor is information literacy. Because information is distributed by so many different means, it is often hard for users to assess its credibility. Many online sources of misinformation use techniques to fool users into thinking their sites are legitimate and their information factual. Misinformation is often politically motivated; websites such as USConservativeToday.com, for example, have posted false information for political and monetary gain. Misinformation can also serve to distract the public eye from negative information about a given person or policy issue. Aside from political and financial gain, misinformation can be spread unintentionally, which can cause problems and ignorance in large populations when people do not verify what they consume. Misinformation cited with hyperlinks has been found to increase readers' trust. Trust is even higher when the hyperlinks point to scientific journals, and higher still when readers do not click through to the sources to investigate for themselves; trusting a source can thus lead to spreading misinformation unintentionally. A good way to check a claim is against sources that are widely agreed to be reliable, such as academic research or organizations without an agenda or bias (.org, .edu, and .gov sites, specifically). Misinformation is sometimes an unintended side effect of bias: misguided opinions can lead to its unintentional spread, where individuals do not intend to spread false propaganda, yet the false information they share goes unchecked and unreferenced. In other instances, information is intentionally skewed, or leaves out major defining details and facts.
Misinformation can be misleading rather than outright false. Research documents "the role political elites play in shaping both news coverage and public opinion around science issues". Another reason for the recent spread of misinformation may be the lack of consequences: with little to no repercussions, there is nothing to stop people from posting misleading information, and the power gained from influencing other people's minds outweighs the impact of a removed post or a temporary ban on Twitter. This leaves individual companies to set the rules and policies for when one person's "free speech" impedes other users' quality of life.


Online misinformation

Digital and social media can contribute to the spread of misinformation, for instance when users share information without first checking its legitimacy. People are more likely to encounter online information selected by personalized algorithms: Google, Facebook, and Yahoo News all generate newsfeeds based on what they know about our devices, our location, and our online interests. Two people searching for the same thing at the same time are very likely to get different results, factual or not, based on what the platform deems relevant to their interests. An emerging trend in the online information environment is "a shift away from public discourse to private, more ephemeral, messaging", which poses a challenge for countering misinformation.
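The personalization effect described above, where identical queries yield different results, can be sketched with a toy ranking function. The articles, interest profiles, and overlap-based scoring below are all invented for illustration; production recommender systems use learned models rather than simple set overlap.

```python
# Toy illustration of interest-based personalization: two users issue the
# same query but receive differently ranked results. All data is invented.
ARTICLES = [
    {"title": "Vaccine study results", "topics": {"health", "science"}},
    {"title": "Election polling update", "topics": {"politics"}},
    {"title": "Climate report released", "topics": {"science", "politics"}},
]

def rank(articles, interests):
    """Order articles by topic overlap with the user's interest profile."""
    return sorted(
        articles,
        key=lambda a: len(a["topics"] & interests),
        reverse=True,
    )

alice = {"health", "science"}   # one hypothetical interest profile
bob = {"politics"}              # another

print([a["title"] for a in rank(ARTICLES, alice)])
print([a["title"] for a in rank(ARTICLES, bob)])
```

The same three articles reach both users, but in a different order, which over time is one mechanism by which two people can come to inhabit different information environments.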


Countermeasures

A report by the Royal Society lists potential or proposed countermeasures:
* Automated detection systems (e.g. to flag or add context and resources to content)
* An emerging anti-misinformation sector (e.g. organizations combating scientific misinformation)
* Provenance-enhancing technology (i.e. better enabling people to determine the veracity of a claim, image, or video)
* APIs for research (i.e. for use in detecting, understanding, and countering misinformation)
* Active bystanders
* Community moderation (usually by unpaid, untrained, and often independent volunteers)
* Anti-virals (e.g. limiting the number of times a message can be forwarded in privacy-respecting encrypted chats)
* Collective intelligence (examples being Wikipedia, where multiple editors refine encyclopedic articles, and question-and-answer sites where outputs are evaluated by others, similar to peer review)
* Trustworthy institutions and data
* Media literacy (increasing citizens' ability to use ICTs to find, evaluate, create, and communicate information, an essential skill for citizens of all ages)
** Media literacy has been taught in Estonian public schools – from kindergarten through high school – since 2010 and is "accepted 'as important as maths or writing or reading'"
Broadly, the report recommends building resilience to scientific misinformation and a healthy online information environment, rather than having offending content removed. It cautions that censorship could, for example, drive misinformation and its associated communities "to harder-to-address corners of the internet". Online misinformation about climate change can be counteracted through different measures at different points in time: prior to exposure, education and inoculation are proposed; technological solutions, such as early detection of bots and ranking and selection algorithms, are suggested as ongoing mechanisms; after exposure, corrective and collaborator messaging can be used to counter climate change misinformation. Fines and imprisonment have also been suggested.
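The "anti-virals" countermeasure, capping how many times a message can be forwarded, can be sketched as a simple counter check. The `Message` class and the limit of 5 below are hypothetical; real messaging apps enforce such caps inside the client or protocol, not in application code like this.

```python
# Hypothetical sketch of an "anti-viral" forwarding cap, as in the
# countermeasure list above. The class and limit are invented for
# illustration; real apps enforce this in the messaging client itself.
FORWARD_LIMIT = 5

class Message:
    def __init__(self, text, forward_count=0):
        self.text = text
        self.forward_count = forward_count

    def forward(self):
        """Return a forwarded copy, or refuse once the cap is reached."""
        if self.forward_count >= FORWARD_LIMIT:
            raise PermissionError("forwarding limit reached")
        return Message(self.text, self.forward_count + 1)

msg = Message("breaking news!?")
for _ in range(FORWARD_LIMIT):
    msg = msg.forward()          # the first five forwards succeed
try:
    msg.forward()                # the sixth is blocked
except PermissionError as e:
    print(e)
```

The point of the design is friction rather than prohibition: a message can still travel, but each hop is rate-limited, slowing viral cascades without inspecting (or decrypting) the content.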


The role of social media

In the Information Age, social networking sites have become a notable agent for the spread of misinformation, fake news, and propaganda. Misinformation spreads on social media more quickly than in traditional media because of the lack of regulation and examination required before posting. These sites let users spread information quickly to other users without requiring the permission of a gatekeeper, such as an editor, who might otherwise require confirmation of its truth before allowing publication. Journalists today are criticized for helping to spread false information on these platforms, but research shows they also play a role in curbing it by debunking and denying false rumors. During the COVID-19 pandemic, social media was one of the main propagators of misinformation about symptoms, treatments, and long-term health problems, which has spurred a significant effort to develop automated detection methods for misinformation on social media platforms.
Social media platforms allow misinformation to spread easily, though the specific reasons why remain unknown. A 2018 study of Twitter determined that, compared to accurate information, false information spread significantly faster, further, deeper, and more broadly. Similarly, a research study of Facebook found that misinformation was more likely to be clicked on than factual information. Combating misinformation's spread is difficult for three reasons: the profusion of misinformation sources makes the reader's task of weighing reliability more challenging; social media's propensity for culture wars embeds misinformation in identity-based conflict; and the proliferation of echo chambers forms an epistemic environment in which participants encounter only beliefs and opinions that coincide with their own, moving the entire group toward more extreme positions. Echo chambers and filter bubbles arise from people's inclination to follow or support like-minded individuals. With no differing information to counter the untruths, and with general agreement within isolated social clusters, some argue the outcome is an absence of a collective reality. Although social media sites have changed their algorithms to prevent the spread of fake news, the problem persists. Furthermore, research has shown that even when people know what the scientific community has established as fact, they may still refuse to accept it, and researchers fear that misinformation on social media is "becoming unstoppable". Misinformation and disinformation also reappear on social media: one study that tracked thirteen rumors on Twitter found that eleven of them resurfaced multiple times after time had passed.
The social media app Parler has also caused considerable chaos. Right-wing Twitter users who were banned moved to Parler after the Capitol Hill riots, and the app was used to plan and facilitate further illegal and dangerous activities; Google and Apple later pulled the app from their app stores. The app enabled a great deal of misinformation and bias in the media, allowing for more political mishaps. Misinformation also spreads through users themselves: one study found that the most common reason Facebook users shared misinformation was socially motivated rather than a matter of taking the information seriously. Although such users may not spread false information maliciously, the misinformation still spreads, and research shows that misinformation introduced through a social format influences individuals drastically more than misinformation delivered non-socially.
Facebook's handling of misinformation became a hot topic with the spread of COVID-19, as some reports indicated that Facebook recommended pages containing health misinformation: when a user likes an anti-vaccine Facebook page, for example, more and more anti-vaccine pages are automatically recommended. Others point to Facebook's inconsistent censorship of misinformation leading to deaths from COVID-19. Larry Cook, creator of the "Stop Mandatory Vaccination" organization, made money posting anti-vaccine false news on social media: more than 150 posts aimed at women drew over 1.6 million views and earned him money on every click and share. Twitter is one of the most concentrated platforms for engagement with political fake news: 80% of fake news sources are shared by 0.1% of users, the "super-sharers". Older, more conservative users are also more likely to interact with fake news; on Facebook, adults older than 65 were seven times more likely to share fake news than adults ages 18–29. Another source of misinformation on Twitter is bot accounts, especially surrounding climate change. Misinformation spread by bots has been difficult for social media platforms to address: Facebook estimated up to 60 million troll bots actively spreading misinformation on its platform and has taken measures to stop the spread, resulting in a decrease, though misinformation continues to exist there. A research report by NewsGuard found a very high level of online misinformation (roughly 20% in its probes of videos on relevant topics) delivered to a mainly young user base on TikTok, whose essentially unregulated usage is increasing as of 2022.
Spontaneous spread of misinformation on social media usually occurs when users share posts from friends or mutually followed pages, often from someone the sharer believes they can trust. Other misinformation is created and spread with malicious intent, sometimes to cause anxiety and sometimes to deceive audiences, and rumors created with malicious intent can also be shared by unknowing users. Given the large audiences that can be reached and the subject-matter experts present on social media, some believe social media could also be the key to correcting misinformation.
Agent-based models and other computational models have been used by researchers to explain how false beliefs spread through networks. Epistemic network analysis is one example of a computational method for evaluating connections in data shared in a social media network or similar network. In ''The Misinformation Age: How False Beliefs Spread'', a trade book by philosopher Cailin O'Connor and physicist James Owen Weatherall, the authors used a combination of case studies and agent-based models to show how false beliefs spread on social media and in scientific networks. The book analyzes the social nature of scientific research; the flow of information between scientists, propagandists, and politicians; and the spread of false beliefs among the general population.
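The agent-based modeling approach described above can be sketched with a toy simulation. The ring network, the majority-adoption rule, and all parameters below are invented for illustration and are not taken from O'Connor and Weatherall's models, which are considerably more sophisticated.

```python
# Toy agent-based model of false-belief spread on a ring network, in the
# spirit of the models described above. Network shape, adoption rule, and
# parameters are all invented for illustration.
N = 50          # number of agents arranged in a ring
NEIGHBORS = 2   # each agent sees 2 neighbors on each side
STEPS = 20

# One initial "seed" cluster of believers (True = holds the false belief).
beliefs = [i < 5 for i in range(N)]

def neighbors(i):
    return [(i + d) % N for d in range(-NEIGHBORS, NEIGHBORS + 1) if d != 0]

for _ in range(STEPS):
    new_beliefs = beliefs[:]
    for i in range(N):
        share = sum(beliefs[j] for j in neighbors(i)) / (2 * NEIGHBORS)
        if share >= 0.5:
            new_beliefs[i] = True   # conformity: adopt the local majority view
    beliefs = new_beliefs           # synchronous update of all agents

print(f"believers after {STEPS} steps: {sum(beliefs)} of {N}")  # 45 of 50
```

Starting from 5 believers, the belief front advances by one agent on each side per step, reaching 45 of 50 agents after 20 steps. Even this crude model reproduces the qualitative point in the text: clustered beliefs propagate outward through purely local social influence, with no agent ever checking the claim itself.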


Lack of peer review

Due to the decentralized nature and structure of the Internet, content creators can easily publish content without being required to undergo peer review, prove their qualifications, or provide backup documentation. While library books have generally been reviewed and edited by an editor, publishing company, etc., Internet sources cannot be assumed to be vetted by anyone other than their authors. Misinformation may be produced, reproduced, and posted immediately on most online platforms.


Censorship accusations

Social media sites such as Facebook and Twitter have found themselves defending against accusations of
censorship
for removing posts they have deemed to be misinformation. Social media censorship policies relying on government agency-issued guidance to determine information validity have garnered criticism that such policies have the unintended effect of stifling dissent and criticism of government positions and policies. Most recently, social media companies have faced criticism over allegedly prematurely censoring discussion of the SARS-CoV-2 lab leak hypothesis. Other accusations of censorship appear to stem from attempts to prevent social media consumers from self-harm through the use of unproven COVID-19 treatments. For example, in July 2020, a video went viral showing Dr. Stella Immanuel claiming
hydroxychloroquine
was an effective cure for COVID-19. In the video, Immanuel suggested that there was no need for masks, school closures, or any kind of economic shutdown, asserting that her alleged cure was highly effective in treating those infected with the virus. The video was shared 600,000 times and received nearly 20 million views on Facebook before it was taken down for violating community guidelines on spreading misinformation. The video was also taken down on Twitter overnight, but not before former president Donald Trump shared it to his page, which was followed by over 85 million Twitter users.
NIAID
director Dr. Anthony Fauci and members of the
World Health Organization (WHO)
quickly discredited the video, citing larger-scale studies of hydroxychloroquine that showed it is not an effective treatment for COVID-19, and the FDA cautioned against using it to treat COVID-19 patients following evidence of serious heart problems in patients who had taken the drug. Another prominent removal of alleged misinformation, criticized by some as censorship, was the ''
New York Post'''
s report on the Hunter Biden laptops, which was used to promote the
Biden–Ukraine conspiracy theory
. Social media companies quickly removed this report, and the ''Post'''s Twitter account was temporarily suspended. Over 50 former intelligence officials found the disclosure of emails allegedly belonging to Joe Biden's son had all the "classic earmarks of a Russian information operation". Later evidence emerged that at least some of the laptop's contents were authentic.


Mass media, trust, and transparency


Competition in news and media

Because news organizations and websites compete for viewers, there is a need for efficiency in releasing stories to the public. The news media landscape of the 1970s offered American consumers access to a limited but often consistent selection of news offerings, whereas today consumers are confronted with an abundance of voices online. This growth of consumer choice allows people to choose a news source that aligns with their biases, which consequently increases the likelihood that they are misinformed. In 2017, 47% of Americans reported social media as their main news source, as opposed to traditional news sources. News media companies often broadcast stories 24 hours a day and break the latest news in hopes of taking audience share from their competitors. News can also be produced at a pace that does not always allow for fact-checking, or for all of the facts to be collected or released to the media at one time, letting readers or viewers insert their own opinions and possibly leading to the spread of misinformation.


Inaccurate information from media sources

A Gallup poll made public in 2016 found that only 32% of Americans trust the mass media "to report the news fully, accurately and fairly", the lowest number in the history of that poll. An example of bad information from media sources that led to the spread of misinformation occurred in November 2005, when
Chris Hansen
on ''
Dateline NBC
'' claimed that law enforcement officials estimate 50,000 predators are online at any moment. Afterward, the U.S. attorney general at the time, Alberto Gonzales, repeated the claim. However, the number that Hansen used in his reporting had no backing. Hansen said he received the information from ''Dateline'' expert Ken Lanning, but Lanning admitted that he made up the figure because there was no solid data on the number. According to Lanning, he used 50,000 because it sounds like a real number, neither too big nor too small, and referred to it as a "Goldilocks number". Reporter Carl Bialik says that the number 50,000 is used often in the media to estimate numbers when reporters are unsure of the exact data. The Novelty Hypothesis was developed by Soroush Vosoughi, Deb Roy and Sinan Aral, who wanted to learn more about what attracts people to false news. In their study, they compared false and true rumors shared on Twitter, looking at both the users and the content they spread. They found that people engage through emotion: false rumors elicited more surprise and disgust, which got people hooked, while true rumors attracted more sadness, joy, and trust. The study showed which emotions are more likely to drive the spread of false news.


Distrust

Misinformation has often been associated with the concept of fake news, which some scholars define as "fabricated information that mimics news media content in form but not in organizational process or intent." Intentional misinformation, called disinformation, has become normalized in politics and in topics of great importance to the public, such as climate change and the COVID-19 pandemic, and has caused irreversible damage to public understanding and trust. Egelhofer et al. (2021) argued that the media's wide adoption of the term “fake news” has served to normalize the concept and to stabilize the use of this buzzword in our everyday language. Goldstein (2021) discussed the need for government agencies and organizations to increase the transparency of their practices or services by using social media. Companies can then utilize the platforms offered by social media and bring forth full transparency to the public. If used in strategic ways, social media can offer an agency or agenda (e.g., political campaigns or vaccines) a way to connect with the public and offer a place for people to track news and developments. Despite many popular examples being from the US, misinformation is prevalent worldwide. In the United Kingdom, many people followed and believed a conspiracy theory that the coronavirus was linked to the 5G network, a popular idea that arose from a series of hashtags on Twitter. Misinformation can also be used to deflect accountability. For example, Syria's repeated use of chemical weapons was the subject of a disinformation campaign intended to prevent accountability. In his paper ''Defending Weapons Inspections from the Effects of Disinformation'', Stewart (2021) shows how disinformation was used to conceal and purposely misinform the public about Syria's violations of international law.
The intention was to create plausible deniability of the violations, causing discussion of possible violations to be dismissed as untruthful rumors. Because such disinformation campaigns have been so effective and normalized, the opposing side has also begun relying on disinformation to prevent repercussions for unfavorable behavior by those pushing a counter-narrative. According to Melanie Freeze (Freeze et al., 2020), in most cases the damage done by misinformation can be irreparable. Freeze explored whether people can recollect an event accurately when presented with misinformation after the event occurred. Findings showed that an individual's recollection of political events could be altered when presented with misinformation about the event. The study also found that even when people can identify warning signs of misinformation, they still struggle to retain which pieces of information are accurate and which are not. Furthermore, the results showed that people can discard accurate information entirely if they incorrectly deem a news source “fake news” or untrustworthy, potentially disregarding completely credible information. Alyt Damstra (Damstra et al., 2021) states that misinformation has been around since the establishment of the press, leaving little room to wonder how it has become normalized today. Alexander Lanoszka (2019) argued that fake news does not have to be treated as an unwinnable war. Misinformation can create a sense of societal chaos and anarchy; with deep mistrust, no single idea can successfully move forward. Given the existence of malicious efforts to misinform, desired progress may rely on trust in people and their processes. Misinformation was a major talking point during the
2016 American Presidential Election
with claims of social media sites allowing "
fake news
" to be spread. It has been found that exposure to misinformation is associated with an overall rise in political trust by those siding with the government in power or those who self-define as politically moderate. Social media became polarized and political during the 2020 United States Presidential Election, with some arguing that misinformation about COVID-19 had been circulating, creating skepticism of topics such as vaccines and figures such as Dr. Fauci. Others argued that platforms such as Facebook had been unconstitutionally censoring conservative voices, spreading misinformation to persuade voters. Polarization on social media platforms has caused people to question the source of their information. Skepticism in news platforms created a large distrust of the news media. Often, misinformation is blended to seem true. Misinformation does not simply imply false information. Social media platforms can be an easy place to skew and manipulate facts to show a different view on a topic, often trying to paint a bad picture of events.


Impact

Misinformation can affect all aspects of life. Allcott, Gentzkow, and Yu concur that the diffusion of misinformation through social media is a potential threat to democracy and broader society. The effects of misinformation can include a decline in the accuracy of information and of event details. When eavesdropping on conversations, one can gather facts that may not always be true, or the receiver may hear the message incorrectly and spread the information to others. On the Internet, one can read content that is stated to be factual but that may not have been checked or may be erroneous. In the news, companies may emphasize the speed at which they receive and send information but may not always be correct in the facts. These developments contribute to the way misinformation may continue to complicate the public's understanding of issues and to serve as a source for belief and attitude formation. With regard to politics, some view being a misinformed citizen as worse than being an uninformed citizen. Misinformed citizens can state their beliefs and opinions with confidence and thus affect elections and policies. This type of misinformation occurs when a speaker appears "authoritative and legitimate" while spreading misinformation. When information is presented as vague, ambiguous, sarcastic, or partial, receivers are forced to piece the information together and make assumptions about what is correct. Misinformation has the power to sway public elections and referendums if it gains enough momentum. Leading up to the 2016 United Kingdom European Union membership referendum, for example, a figure widely circulated by the
Vote Leave
campaign claimed the UK would save £350 million a week by leaving the EU, and that the money would be redistributed to the British National Health Service. This was later deemed a "clear misuse of official statistics" by the UK statistics authority. The advert, infamously shown on the side of London's double-decker buses, did not take into account the UK's budget rebate, and the idea that 100% of the money saved would go to the NHS was unrealistic. A poll published in 2016 by
Ipsos MORI
found that nearly half of the British public believed this misinformation to be true. Even when information is proven to be misinformation, it may continue to shape attitudes towards a given topic, meaning it has the power to swing political decisions if it gains enough traction. A study conducted by Soroush Vosoughi, Deb Roy and Sinan Aral examined Twitter data covering 126,000 posts shared by 3 million people more than 4.5 million times. They found that political news traveled faster than any other type of information, and that false news about politics reached more than 20,000 people three times faster than all other types of false news. Aside from political propaganda, misinformation can also be employed in industrial propaganda. Using tools such as advertising, a company can undermine reliable evidence or influence belief through a concerted misinformation campaign. For instance, tobacco companies employed misinformation in the second half of the twentieth century to diminish the reliability of studies that demonstrated the link between smoking and
lung cancer
. In the medical field, misinformation can immediately lead to life endangerment, as seen in the public's negative perception of vaccines or the use of herbs instead of medicines to treat diseases. With regard to the
COVID-19 pandemic
, the spread of misinformation has proven to cause confusion as well as negative emotions such as anxiety and fear. Misinformation regarding proper safety measures for the prevention of the virus that goes against information from legitimate institutions like the World Health Organization can also lead to inadequate protection and possibly place individuals at risk of exposure. Some scholars and activists are heading movements to eliminate mis- and disinformation and information pollution in the digital world. One theory, "information environmentalism," has become a curriculum in some universities and colleges. The general study of misinformation and disinformation is by now also common across various academic disciplines, including sociology, communication, computer science, and political science, leading to the emerging field being described loosely as "Misinformation and Disinformation Studies". However, various scholars and journalists have criticized this development, pointing to problematic normative assumptions, a varying quality of output and lack of methodological rigor, as well as a too strong impact of mis- and disinformation research on shaping public opinion and policymaking. Summarizing the most frequent points of critique, communication scholars Chico Camargo and Felix Simon wrote in an article for the Harvard Kennedy School Misinformation Review that "mis-/disinformation studies has been accused of lacking clear definitions, having a simplified understanding of what it studies, a too great emphasis on media effects, a neglect of intersectional factors, an outsized influence of funding bodies and policymakers on the research agenda of the field, and an outsized impact of the field on policy and policymaking."


See also


References


Further reading

* Baillargeon, Normand (4 January 2008). ''A Short Course in Intellectual Self-Defense''. Seven Stories Press. Retrieved 22 June 2011. * Christopher Cerf and
Victor Navasky
, ''The Experts Speak: The Definitive Compendium of Authoritative Misinformation'', Pantheon Books, 1984. * Helfand, David J., ''A Survival Guide to the Misinformation Age: Scientific Habits of Mind''. Columbia University Press, 2016. * Christopher Murphy (2005). ''Competitive Intelligence: Gathering, Analysing and Putting It to Work''. Gower Publishing, Ltd. pp. 186–189. A case study of misinformation arising from simple error. * O'Connor, Cailin, and James Owen Weatherall, ''The Misinformation Age: How False Beliefs Spread''. Yale University Press, 2019. * Persily, Nathaniel, and Joshua A. Tucker, eds. ''Social Media and Democracy: The State of the Field and Prospects for Reform''. Cambridge University Press, 2020. * Jürg Strässler (1982). ''Idioms in English: A Pragmatic Analysis''. Gunter Narr Verlag. pp. 43–44.


External links


Comic: Fake News Can Be Deadly. Here's How To Spot It
(audio tutorial, graphic tutorial)
Management and Strategy Institute, Free Misinformation and Disinformation Training online