Summary
Disinformation is false information deliberately spread to deceive people. Disinformation is an orchestrated adversarial activity in which actors employ strategic deceptions and media manipulation tactics to advance political, military, or commercial goals. Disinformation is implemented through attacks that weaponize multiple rhetorical strategies and forms of knowing—including not only falsehoods but also truths, half-truths, and value judgements—to exploit and amplify culture wars and other identity-driven controversies.
- The ‘About’ section of this post provides an overview of the issues and challenges, potential solutions, and web links. Other sections cover relevant legislation, committees, agencies, and programs, as well as the judiciary, nonpartisan and partisan organizations, and a Wikipedia entry.
- To participate in ongoing forums, ask the post’s curators questions, and make suggestions, scroll to the ‘Discuss’ section at the bottom of each post or select the “comment” icon.
The Disinformation category includes related posts on government agencies, departments, and committees and their chairs.
(10:28)
Half of U.S. adults say they sometimes get their news from social media. However, almost two-thirds of adults say they view social media as a bad thing for democracy. This raises the question of what responsibility social media companies bear for our increasingly divided political climate. Judy Woodruff explores that more for her ongoing series, America at a Crossroads.
OnAir Post: Disinformation
News
PBS NewsHour, July 30, 2024 – 11:00 am (ET)
When acting Secret Service Director Ronald Rowe Jr. visited the site of the campaign rally where a gunman attempted to assassinate former President Donald Trump, he went up to the roof, lying flat on his stomach, to evaluate the shooter’s line of sight that mid-July day.
“What I saw made me ashamed,” Rowe told U.S. senators Tuesday. “As a career law enforcement officer and a 25-year-veteran with the Secret Service, I cannot defend why that roof was not better secured.”
Rowe recounted his trip to Butler, Pennsylvania, which he said took place after being named acting director July 23, at a joint hearing of the Senate Judiciary and Homeland Security and Governmental Affairs committees examining the security failures that led to the assassination attempt. Deputy FBI Director Paul Abbate also testified.
PBS NewsHour – February 13, 2024 (05:45)
From robocalls to deep fakes, artificial intelligence is already playing a role in the 2024 election. Last week the Federal Communications Commission made AI-generated voice calls illegal. Laura Barrón-López has been covering AI’s impact on the upcoming election and discussed the latest with Amna Nawaz.
About
Check the Public Safety post for party positions, committees, and government agencies related to disinformation issues.
Challenges
1. Identifying and Verifying False Information:
- Massive volume of online content makes it difficult to manually identify false information.
- Sophisticated techniques used to create and spread disinformation, making it hard to distinguish from legitimate news.
2. Amplification and Spread:
- Social media platforms and algorithms unknowingly amplify disinformation, reaching millions of users.
- Echo chambers and filter bubbles further reinforce false narratives within specific groups.
3. Lack of Accountability:
- Anonymous creators and paid influencers spread disinformation without facing consequences.
- Platform policies and enforcement mechanisms are often insufficient or ineffective.
4. Polarizing Society:
- Disinformation exploits societal divisions, fueling distrust and hostility between different groups.
- It can lead to increased political polarization, social unrest, and violence.
5. Erosion of Public Trust:
- Repeated exposure to false information undermines trust in institutions, including media, government, and science.
- This can lead to apathy, cynicism, and a decline in civic engagement.
6. Damage to National Security:
- Disinformation can be used to influence elections, promote foreign propaganda, and sow economic instability.
- It can weaken the United States’ standing on the global stage.
7. First Amendment Considerations:
- Balancing the need to combat disinformation with the protection of free speech is a complex issue.
- Government regulation and censorship must be carefully calibrated to avoid infringing on First Amendment rights.
8. Limited Technical Solutions:
- Automated fact-checking tools have limitations and can struggle to keep up with evolving disinformation tactics.
- Artificial intelligence and other technologies have potential but are still in their development stages.
9. Public Awareness and Education:
- Raising public awareness of disinformation and teaching critical thinking skills are crucial.
- Empowered citizens can better identify and resist false information.
10. Collaboration and Cooperation:
- Addressing disinformation requires multi-faceted efforts involving governments, platforms, media organizations, fact-checkers, and the public.
- Collaboration and coordination are essential to effectively combat this complex issue.
Source: Google Search + Gemini + onAir curation
Solutions
1. Enhance Media Literacy and Critical Thinking Skills:
- Implement comprehensive media literacy education programs in schools and universities to equip individuals with the skills to identify, evaluate, and respond to disinformation.
- Promote critical thinking training to foster skepticism, source verification, and bias recognition.
2. Strengthen Content Moderation and Fact-Checking:
- Establish clear guidelines and policies for social media platforms to identify and remove disinformation.
- Support independent fact-checking organizations to identify and debunk false claims.
- Enhance coordination between platforms and fact-checkers to ensure timely and accurate responses.
3. Improve Transparency and Accountability:
- Enact laws that require platforms to disclose their algorithms and content moderation practices.
- Hold platforms accountable for promoting or amplifying disinformation.
- Implement systems for users to report and challenge disinformation.
4. Promote Diversity of Information Sources:
- Support independent journalism and diverse media outlets to provide alternative and credible perspectives.
- Encourage the creation of platforms that prioritize factual information.
- Foster a culture of healthy skepticism and questioning of dominant narratives.
5. Address Political Polarization:
- Encourage dialogue and cross-partisan collaboration to reduce division and prevent the spread of disinformation.
- Promote understanding of different perspectives and encourage respectful communication.
- Implement policies that foster inclusivity and prevent the creation of information silos.
6. Leverage Technological Solutions:
- Develop tools and algorithms to detect and analyze disinformation.
- Explore the use of artificial intelligence for fact-checking and identifying malicious actors.
- Promote the adoption of secure messaging platforms to prevent the spread of misinformation.
7. Encourage Collaboration and Partnerships:
- Foster partnerships between academia, industry, government, and civil society organizations to address disinformation.
- Establish multi-stakeholder initiatives to develop and implement effective strategies.
- Share best practices and coordinate efforts to maximize impact.
8. Enhance Public Trust:
- Build trust in institutions, fact-checkers, and journalism through transparency and accountability.
- Engage the public in efforts to combat disinformation.
- Empower individuals to become active participants in promoting factual information.
Source: Google Search + Gemini + onAir curation
Websites
Government Agencies:
- Federal Bureau of Investigation (FBI) Cyber Crime Unit
- Federal Communications Commission (FCC) Media Bureau
- National Counterintelligence and Security Center (NCSC)
- Department of Homeland Security (DHS) Cybersecurity and Infrastructure Security Agency (CISA)
Nonprofit Organizations:
- Brennan Center for Justice
- Center for American Progress
- Committee to Protect Journalists (CPJ)
- Freedom House
- New America Foundation
- Pew Research Center
Academic Institutions:
- Annenberg Public Policy Center
- Berkman Klein Center for Internet & Society, Harvard University
- Center for Strategic and International Studies (CSIS)
- Shorenstein Center on Media, Politics and Public Policy, Harvard University
- Stanford Internet Observatory
Industry Groups:
- Internet Association
- National Association of Broadcasters (NAB)
- News Media Alliance
- Social Media Exchange
Other Resources:
- American Library Association Office for Intellectual Freedom
- Disinformation Index
- First Draft
- Global Engagement Center (GEC)
- Media Literacy Now
Source: Google Search + Gemini + onAir curation
Legislation
Laws
Source: Google Search + Gemini + onAir curation
1. Countering Foreign Influence in General Elections Act (2020)
- Requires social media platforms to label and provide metadata on content originating from foreign governments or agents engaged in election interference.
- Establishes a central repository for such content to facilitate investigation and public awareness.
2. National Defense Authorization Act (2021)
- Creates a Disinformation Governance Board within the Department of Homeland Security to monitor and combat disinformation campaigns.
- Authorizes funding for research and initiatives to address disinformation.
3. Secure and Fair Elections Act (2021)
- Strengthens protections against voter suppression and election administration disinformation.
- Provides grants to states and localities to enhance election security and combat disinformation.
4. TRUTH Act (2023)
- Requires social media platforms to implement policies to address deepfakes (manipulated or fabricated images or videos).
- Establishes a working group to develop best practices and standards for deepfake detection and mitigation.
5. Countering Russian Influence in Elections Act (2018)
- Imposes sanctions on Russia for interfering in the 2016 US presidential election.
- Requires the Executive Branch to take measures to counter Russian disinformation campaigns.
6. Honest Ads Act (2018)
- Requires online platforms to implement transparency measures for political advertising.
- Mandates the disclosure of information about the advertiser, targeting, and spending.
7. Stopping Harmful Interference in Elections for a Lasting Democracy Act (2022)
- Prohibits foreign entities from using digital or social media platforms to interfere in US elections.
- Requires social media companies to take steps to prevent foreign election interference.
8. Preventing Cyberwarfare Act (2022)
- Addresses foreign influence operations by authorizing sanctions and other measures against actors involved in cyberwarfare or disinformation campaigns.
9. Election Infrastructure and Security Enhancement Act (2022)
- Provides funding and technical assistance to states and localities to enhance election security and combat disinformation.
- Includes measures to address cybersecurity risks and protect voting systems.
10. Secure Technology Act (2023)
- Establishes a program to monitor and address foreign interference in information and communications technologies.
- Requires the development of standards to protect against disinformation and other threats.
New Bills in 2023-2024
Source: Google Search + Gemini + onAir curation
H.R. 3327 – Protecting American Democracy and Public Interest from Online Disinformation Act
- Requires social media platforms to label content that spreads misinformation or is funded by foreign actors.
- Mandates transparency reporting on political advertising.
- Establishes a task force to study and prevent the spread of online disinformation.
S. 4408 – TRUTH Act
- Amends the Federal Election Campaign Act to prohibit the spread of false or misleading information in political advertising.
- Creates a new agency to enforce the ban.
- Imposes penalties on individuals and organizations that violate the law.
H.R. 4005 – Preventing Advancing Disinformation through Online Advertising Act
- Prohibits the use of targeted advertising to spread misinformation or amplify foreign interference campaigns.
- Requires platforms to implement measures to prevent the spread of disinformation through advertising.
- Levies fines on platforms that fail to comply.
S. 4425 – Stop the Online Predatory Targeting of Kids Act
- Protects children from online disinformation by requiring platforms to:
  - Verify users’ ages before allowing them to use certain services.
  - Prohibit the targeting of children with advertising that contains misinformation.
  - Provide educational resources to children about disinformation.
H.R. 4704 – Stopping Harmful Algorithms and Reducing Tactics in Social Media Act
- Requires platforms to remove harmful content, such as disinformation, hate speech, and child sexual abuse material.
- Prevents platforms from using algorithms that amplify harmful content.
- Gives users the right to appeal the removal of their content.
S. 4435 – Disinformation Governance Act
- Creates a new federal agency to oversee the regulation of disinformation.
- Gives the agency authority to investigate and prosecute individuals and organizations involved in spreading disinformation.
- Establishes standards for fact-checking and information disclosure.
H.R. 4832 – Digital Transparency and Accountability Act
- Requires platforms to provide transparency into their content moderation practices.
- Mandates platforms to disclose the sources of their funding and the algorithms they use.
- Establishes a new privacy standard for user data collected by platforms.
These bills represent a significant effort by Congress to address the growing challenge of disinformation in the United States. They aim to protect consumers, safeguard elections, and ensure the integrity of online information.
Committees, Agencies, & Programs
Committees
Source: Google Search + Gemini + onAir curation
House of Representatives
- House Select Committee on Intelligence (HSCI): Oversees intelligence and counterintelligence activities, including foreign disinformation campaigns.
- House Committee on Energy and Commerce: Jurisdiction over communications and technology, which play a role in the spread of disinformation.
- House Committee on Foreign Affairs: Responsible for foreign policy and international diplomacy, including efforts to combat disinformation from foreign adversaries.
- House Committee on Homeland Security: Focuses on domestic security issues, including disinformation’s impact on elections and national security.
- House Committee on Science, Space, and Technology: Examines the role of technology in disinformation and develops policy recommendations.
- House Judiciary Committee: Oversees civil rights, privacy, and technology, and has investigated disinformation’s impact on the justice system.
Senate
- Senate Select Committee on Intelligence (SSCI): Similar responsibilities as HSCI, with a focus on intelligence analysis and counterintelligence.
- Senate Committee on Commerce, Science, and Transportation: Jurisdictional oversight over communications, technology, and infrastructure, including disinformation’s dissemination and impact.
- Senate Committee on Foreign Relations: Responsible for foreign policy and diplomacy, and plays a role in addressing international disinformation campaigns.
- Senate Committee on Homeland Security and Governmental Affairs: Examines domestic security issues, including disinformation’s threats to elections and national security.
- Senate Judiciary Committee: Focuses on civil rights, privacy, and technology, and has investigated disinformation’s impacts on the rule of law.
Other Committees
- House and Senate Committees on Appropriations: Provide funding for agencies responsible for counter-disinformation efforts, such as the Department of Homeland Security and the Intelligence Community.
- House Permanent Select Committee on Investigations: Can investigate disinformation campaigns and potential threats to national security.
- Senate Homeland Security and Governmental Affairs Permanent Subcommittee on Investigations: Similar investigative powers to the House Investigations Committee, focusing on homeland security and governmental matters.
Government Agencies
Source: Google Search + Gemini + onAir curation
Department of Homeland Security (DHS)
- Cybersecurity and Infrastructure Security Agency (CISA): Coordinates cybersecurity efforts, provides guidance on disinformation threats, and offers threat hunting services.
- Federal Emergency Management Agency (FEMA): Supports public information and emergency response efforts, including combating disinformation during disasters.
Department of Justice (DOJ)
- National Security Division (NSD): Investigates and prosecutes foreign disinformation campaigns that threaten national security.
- Federal Bureau of Investigation (FBI): Investigates domestic disinformation campaigns and provides training to state and local law enforcement.
Department of Defense (DOD)
- Defense Intelligence Agency (DIA): Analyzes and reports on foreign disinformation threats and their potential impact on US national security.
- Cyber Command (CYBERCOM): Conducts operations to defend against and disrupt disinformation campaigns.
Department of State (DOS)
- Bureau of Global Public Affairs (GPA): Supports international cooperation and disseminates accurate information to counter disinformation.
- Global Engagement Center (GEC): Works with foreign partners to expose and counter disinformation campaigns.
Federal Communications Commission (FCC)
- Media Bureau: Enforces regulations related to broadcasting and telecommunications, including those designed to prevent the spread of disinformation.
- Wireline Competition Bureau: Provides guidance on net neutrality and other issues related to the availability and dissemination of information.
Federal Trade Commission (FTC)
- Bureau of Consumer Protection: Investigates and takes action against businesses that engage in deceptive or fraudulent practices, including the spreading of disinformation.
Government Accountability Office (GAO)
- Cybersecurity and Information Technology Audit Division: Provides oversight and accountability for government efforts to address disinformation.
Intelligence Community
- Office of the Director of National Intelligence (ODNI): Coordinates intelligence efforts and provides assessments on disinformation threats.
- Central Intelligence Agency (CIA): Analyzes and reports on foreign disinformation campaigns.
- National Security Agency (NSA): Provides technical support and surveillance capabilities to counter disinformation threats.
Programs & Initiatives
Source: Google Search + Gemini + onAir curation
1. Countering Foreign Influence and Disinformation Act (CIFIDA)
- Authorized in the FY23 National Defense Authorization Act
- Establishes a new authority for the State Department to address foreign disinformation and propaganda
- Creates a “Center for Foreign Influence” within the State Department to lead efforts
2. Cybersecurity and Infrastructure Security Agency (CISA)
- Lead federal agency responsible for protecting critical infrastructure from cyber threats
- Has a dedicated team focused on countering disinformation related to elections and other national security threats
- Provides resources and training to state and local election officials on disinformation threats
3. Election Assistance Commission (EAC)
- Nonpartisan commission that provides guidance and assistance to state and local election officials
- Provides funding for election security efforts, including technology and training to address disinformation
- Has developed a “Disinformation Playbook” for election officials
4. Federal Bureau of Investigation (FBI)
- Primary federal law enforcement agency responsible for investigating foreign influence operations and disinformation campaigns
- Has established a dedicated task force to counter foreign interference in elections
5. National Counterintelligence and Security Center (NCSC)
- Coordinates federal efforts to identify and counter foreign threats, including disinformation
- Produces intelligence reports on disinformation threats and provides guidance to government agencies and the private sector
6. Office of the Director of National Intelligence (ODNI)
- Oversees the US intelligence community and provides assessments on foreign disinformation threats
- Has established a National Intelligence Manager (NIM) for Counterintelligence to lead coordination on disinformation issues
7. State Department’s Global Engagement Center (GEC)
- Leads the US government’s efforts to counter state-sponsored disinformation and propaganda
- Partners with foreign governments, civil society organizations, and the private sector to combat disinformation
8. US Agency for International Development (USAID)
- Provides funding and technical assistance to foreign governments and organizations to strengthen their capacity to counter disinformation and promote media literacy
More Information
Judiciary
Source: Bard AI + onAir curation
Disinformation, the intentional spread of false or misleading information, poses a significant threat to the integrity and public trust in the U.S. judiciary. This issue has become increasingly prevalent in recent years, with the rise of social media and other digital platforms that can rapidly disseminate false or misleading information.
Key Impacts of Disinformation on the Judiciary
- Erosion of Public Trust: Misinformation can sow doubt about the impartiality and fairness of the judiciary, undermining public confidence in the legal system. This can lead to increased polarization and distrust in government institutions.
- Threats to Judicial Independence: Disinformation can target individual judges or the judiciary as a whole, seeking to influence their decisions or undermine their authority. This can compromise the independence of the judiciary and erode the rule of law.
- Challenges to Judicial Proceedings: False or misleading information can be introduced into legal proceedings, potentially influencing the outcome of cases. This can lead to wrongful convictions, unjust sentences, or other negative consequences.
- Increased Polarization: Disinformation can exacerbate political polarization by spreading false or misleading information about legal issues, such as election results or judicial appointments. This can lead to increased tension and conflict within society.
Strategies to Combat Disinformation in the Judiciary
- Education and Awareness: Raising awareness among the public, legal professionals, and judges about the dangers of disinformation is essential. This can be achieved through educational programs, public outreach initiatives, and the development of resources to help people identify and counter misinformation.
- Fact-Checking and Verification: Promoting fact-checking and verification efforts can help to combat the spread of false or misleading information. This can involve working with independent fact-checking organizations, developing tools and resources for verifying information, and encouraging the use of reliable sources.
- Transparency and Accountability: Increasing transparency and accountability within the judiciary can help to build public trust and counter misinformation. This can include measures such as public disclosure of judicial decisions, ethical guidelines for judges, and mechanisms for holding judges accountable for misconduct.
- Collaboration and Partnerships: Collaborating with other institutions, such as media organizations, academic institutions, and technology companies, can help to address the challenges posed by disinformation. This can involve sharing information, developing joint initiatives, and coordinating efforts to combat misinformation.
Nonpartisan Organizations
Source: Google Search + Gemini + onAir curation
Research Institutions:
- RAND Corporation: Develops nonpartisan research and analysis on disinformation, including its impact on national security and democratic institutions.
- Brennan Center for Justice: Conducts legal and policy research on disinformation and its implications for elections and civil liberties.
- Brookings Institution: Provides independent research and analysis on disinformation, focusing on its political and social impacts.
Nonprofit Organizations:
- First Draft: A coalition of journalists, technologists, and researchers dedicated to combating disinformation.
- Media Matters for America: A watchdog group that monitors and exposes conservative media bias and disinformation campaigns.
- Snopes: A fact-checking website that investigates urban legends, hoaxes, and false information.
Government Agencies:
- Department of Homeland Security (DHS): Collaborates with other agencies and private sector partners to counter disinformation and protect national security.
- Federal Bureau of Investigation (FBI): Investigates and prosecutes individuals and organizations involved in disinformation campaigns that threaten the United States.
- National Intelligence Council (NIC): Produces intelligence assessments on disinformation and its implications for US interests.
Intergovernmental Organizations:
- National Security Commission on Artificial Intelligence (NSCAI): Examines the potential national security risks and opportunities posed by artificial intelligence, including its role in disinformation.
- Organization for Security and Co-operation in Europe (OSCE): Promotes dialogue and cooperation among countries on issues related to disinformation and media freedom.
Other Key Stakeholders:
- Technology companies: Platforms such as Facebook, Twitter, and YouTube have implemented various measures to combat disinformation on their networks.
- Journalism organizations: The Society of Professional Journalists and other professional associations advocate for ethical reporting and the dissemination of accurate information.
- Educational institutions: Universities and colleges play a crucial role in educating students about disinformation and promoting critical thinking skills.
Partisan Organizations
Source: Google Search + Gemini + onAir curation
Republican Organizations
- Media Research Center: Focuses on exposing liberal media bias and promoting conservative viewpoints.
- American Conservative Union: Promotes conservative principles and opposes government overreach.
- The Heritage Foundation: A think tank that advocates for conservative policies in various areas, including media and technology.
Democratic Organizations
- Media Matters for America: Monitors conservative media outlets for inaccuracies and bias.
- Center for American Progress: A progressive think tank that promotes policies based on evidence and research.
- Tech Transparency Project: Investigates how technology companies spread disinformation and the impact on democracy.
“Disinformation” (Wiki)
Disinformation is misleading content deliberately spread to deceive people,[1][2] or to secure economic or political gain and which may cause public harm.[3] Disinformation is an orchestrated adversarial activity in which actors employ strategic deceptions and media manipulation tactics to advance political, military, or commercial goals.[4] Disinformation is implemented through attacks that “weaponize multiple rhetorical strategies and forms of knowing—including not only falsehoods but also truths, half-truths, and value judgements—to exploit and amplify culture wars and other identity-driven controversies.”[5]
In contrast, misinformation refers to inaccuracies that stem from inadvertent error.[6] Misinformation can be used to create disinformation when known misinformation is purposefully and intentionally disseminated.[7] “Fake news” has sometimes been categorized as a type of disinformation, but scholars have advised not using these two terms interchangeably or using “fake news” altogether in academic writing since politicians have weaponized it to describe any unfavorable news coverage or information.[8]
Etymology
The English word disinformation comes from the application of the Latin prefix dis- to information, giving the meaning “reversal or removal of information”. The rarely used word had appeared with this usage in print at least as far back as 1887.[11][12][13][14]
Some consider it a loan translation of the Russian дезинформация, transliterated as dezinformatsiya,[15][1][2] apparently derived from the title of a KGB black propaganda department.[16][1][17][15] Soviet planners in the 1950s defined disinformation as “dissemination (in the press, on the radio, etc.) of false reports intended to mislead public opinion.”[18]
Disinformation first made an appearance in dictionaries in 1985, specifically, Webster’s New College Dictionary and the American Heritage Dictionary.[19] In 1986, the term disinformation was not defined in Webster’s New World Thesaurus or New Encyclopædia Britannica.[15] After the Soviet term became widely known in the 1980s, native speakers of English broadened the term as “any government communication (either overt or covert) containing intentionally false and misleading material, often combined selectively with true information, which seeks to mislead and manipulate either elites or a mass audience.”[2]
By 1990, use of the term disinformation had fully established itself in the English language within the lexicon of politics.[20] By 2001, the term disinformation had come to be known as simply a more civil phrase for saying someone was lying.[21] Stanley B. Cunningham wrote in his 2002 book The Idea of Propaganda that disinformation had become pervasively used as a synonym for propaganda.[22]
Operationalization
The Shorenstein Center at Harvard University defines disinformation research as an academic field that studies “the spread and impacts of misinformation, disinformation, and media manipulation,” including “how it spreads through online and offline channels, and why people are susceptible to believing bad information, and successful strategies for mitigating its impact.”[23] According to a 2023 research article published in New Media & Society,[4] disinformation circulates on social media through deception campaigns implemented in multiple ways including: astroturfing, conspiracy theories, clickbait, culture wars, echo chambers, hoaxes, fake news, propaganda, pseudoscience, and rumors.
Activities that operationalize disinformation campaigns online[4]

| Term | Description |
|---|---|
| Astroturfing | A centrally coordinated campaign that mimics grassroots activism by making participants pretend to be ordinary citizens |
| Clickbait | The deliberate use of misleading headlines and thumbnails to increase online traffic for profit or popularity |
| Conspiracy theories | Rebuttals of official accounts that propose alternative explanations in which individuals or groups act in secret |
| Culture wars | A phenomenon in which multiple groups of people, who hold entrenched values, attempt to steer public policy contentiously |
| Doxxing | A form of online harassment that breaches privacy boundaries by releasing information intending physical and online harm to a target |
| Echo chamber | An epistemic environment in which participants encounter beliefs and opinions that coincide with their own |
| Fake news | Genre: the deliberate creation of pseudo-journalism; Label: the instrumentalization of the term to delegitimize news media |
| Greenwashing | Deceptive communication that makes people believe a company is environmentally responsible when it is not |
| Hoax | News in which false facts are presented as legitimate |
| Propaganda | Organized mass communication, on a hidden agenda, with a mission to conform belief and action by circumventing individual reasoning |
| Pseudoscience | Accounts that claim the explanatory power of science and borrow its language and legitimacy but diverge substantially from its quality criteria |
| Rumors | Unsubstantiated news stories that circulate while not corroborated or validated |
| Trolling | Networked groups of digital influencers that operate ‘click armies’ designed to mobilize public sentiment |
| Urban legends | Moral tales featuring durable stories of intruders incurring boundary transgressions and their dire consequences |

Note: This is an adaptation of Table 2 from Disinformation on Digital Media Platforms: A Market Shaping Approach, by Carlos Diaz Ruiz, used under CC BY 4.0.
To distinguish between similar terms, including misinformation and malinformation, scholars broadly agree on the following definitions: (1) disinformation is the strategic dissemination of false information with the intention to cause public harm;[24] (2) misinformation is the unintentional spread of false information; and (3) malinformation is factual information disseminated with the intention to cause harm.[25][26] Collectively, these terms are abbreviated ‘DMMI’.[27]
In 2019, Camille François devised the “ABC” framework for understanding different modalities of online disinformation:
- Manipulative Actors, who “engage knowingly and with clear intent in viral deception campaigns” that are “covert, designed to obfuscate the identity and intent of the actor orchestrating them.” Examples include personas such as Guccifer 2.0, Internet trolls, state media, and military operatives.
- Deceptive Behavior, which “encompasses the variety of techniques viral deception actors may use to enhance and exaggerate the reach, virality and impact of their campaigns.” Examples include troll farms, Internet bots, astroturfing, and “paid engagement“.
- Harmful Content, which includes health misinformation, manipulated media such as deepfakes, online harassment, violent extremism, hate speech or terrorism.[28]
In 2020, the Brookings Institution proposed amending this framework to include Distribution, defined by the “technical protocols that enable, constrain, and shape user behavior in a virtual space”.[29] Similarly, the Carnegie Endowment for International Peace proposed adding Degree (“distribution of the content … and the audiences it reaches”) and Effect (“how much of a threat a given case poses”).[30]
Comparisons with propaganda
Whether and to what degree disinformation and propaganda overlap is subject to debate. Some (like the U.S. Department of State) define propaganda as the use of non-rational arguments to either advance or undermine a political ideal, and use disinformation as an alternative name for undermining propaganda,[31] while others consider them separate concepts altogether.[32] One popular distinction holds that disinformation also describes politically motivated messaging designed explicitly to engender public cynicism, uncertainty, apathy, distrust, and paranoia, all of which disincentivize citizen engagement and mobilization for social or political change.[18]
Practice
Disinformation is the label often given to foreign information manipulation and interference (FIMI).[33][34] Studies on disinformation are often concerned with the content of activity whereas the broader concept of FIMI is more concerned with the “behaviour of an actor” that is described through the military doctrine concept of tactics, techniques, and procedures (TTPs).[33]
Disinformation is primarily carried out by government intelligence agencies, but has also been used by non-governmental organizations and businesses.[35] Front groups are a form of disinformation, as they mislead the public about their true objectives and who their controllers are.[36] Most recently, disinformation has been deliberately spread through social media in the form of “fake news”: disinformation masked as legitimate news articles and meant to mislead readers or viewers.[37] Disinformation may include the distribution of forged documents, manuscripts, and photographs, or the spreading of dangerous rumours and fabricated intelligence. Use of these tactics can lead to blowback, however, causing unintended consequences such as defamation lawsuits or damage to the dis-informer’s reputation.[36]
Worldwide
Soviet disinformation
Soviet use of disinformation as a tactical weapon began in 1923,[39] when it became part of the program of Soviet political warfare known as active measures.[40]
Russian disinformation
Russian disinformation campaigns have occurred in many countries.[41][42][43][44] For example, disinformation campaigns led by Yevgeny Prigozhin have been reported in several African countries.[45][46] Russia, however, denies that it uses disinformation to influence public opinion.[47]
Russian campaigns often aim to disrupt domestic politics within Europe and the United States in an attempt to weaken the West, reflecting Russia’s long-standing commitment to fighting “Western imperialism” and shifting the balance of world power toward Russia and its allies. According to the Voice of America, Russia seeks to promote American isolationism, border security concerns, and racial tensions within the United States through its disinformation campaigns.[48][49][50]
Chinese disinformation
American disinformation
The United States Intelligence Community appropriated use of the term disinformation in the 1950s from the Russian dezinformatsiya, and began to use similar strategies[61][62] during the Cold War and in conflict with other nations.[17] The New York Times reported in 2000 that during the CIA’s effort to substitute Mohammed Reza Pahlavi for then-Prime Minister of Iran Mohammad Mossadegh, the CIA placed fictitious stories in the local newspaper.[17] Reuters documented how, subsequent to the 1979 Soviet Union invasion of Afghanistan during the Soviet–Afghan War, the CIA put false articles in newspapers of Islamic-majority countries, inaccurately stating that Soviet embassies had “invasion day celebrations”.[17] Reuters noted a former U.S. intelligence officer said they would attempt to gain the confidence of reporters and use them as secret agents, to affect a nation’s politics by way of their local media.[17]
In October 1986, the term gained increased currency in the U.S. when it was revealed that two months previously, the Reagan Administration had engaged in a disinformation campaign against then-leader of Libya, Muammar Gaddafi.[63] White House representative Larry Speakes said reports of a planned attack on Libya as first broken by The Wall Street Journal on August 25, 1986, were “authoritative”, and other newspapers including The Washington Post then wrote articles saying this was factual.[63] U.S. State Department representative Bernard Kalb resigned from his position in protest over the disinformation campaign, and said: “Faith in the word of America is the pulse beat of our democracy.”[63]
The executive branch of the Reagan administration kept watch on disinformation campaigns through three yearly publications by the Department of State: Active Measures: A Report on the Substance and Process of Anti-U.S. Disinformation and Propaganda Campaigns (1986); Report on Active Measures and Propaganda, 1986–87 (1987); and Report on Active Measures and Propaganda, 1987–88 (1989).[61]
According to a report by Reuters, the United States ran a propaganda campaign to spread disinformation about the Sinovac Chinese COVID-19 vaccine, including using fake social media accounts to spread the disinformation that the Sinovac vaccine contained pork-derived ingredients and was therefore haram under Islamic law.[64] Reuters said the ChinaAngVirus disinformation campaign was designed to “counter what it perceived as China’s growing influence in the Philippines” and was prompted by the “[fear] that China’s COVID diplomacy and propaganda could draw other Southeast Asian countries, such as Cambodia and Malaysia, closer to Beijing”.[64] The campaign was also described as “payback for Beijing’s efforts to blame Washington for the pandemic”.[65] The campaign primarily targeted people in the Philippines and used a social media hashtag for “China is the virus” in Tagalog.[64] The campaign ran from 2020 to mid-2021.[64] The primary contractor for the U.S. military on the project was General Dynamics IT, which received $493 million for its role.[64]
Response
Responses from cultural leaders
Pope Francis condemned disinformation in a 2016 interview, after being made the subject of a fake news website during the 2016 U.S. election cycle which falsely claimed that he supported Donald Trump.[66][67][68] He said spreading disinformation was the worst thing the news media could do, and called the act a sin,[69][70] comparing those who spread disinformation to individuals who engage in coprophilia.[71][72]
Ethics in warfare
In a contribution to the 2014 book Military Ethics and Emerging Technologies, writers David Danks and Joseph H. Danks discuss the ethical implications of using disinformation as a tactic during information warfare.[73] They note there has been a significant degree of philosophical debate over the issue as it relates to the ethics of war and the use of the technique.[73] The writers describe a position whereby the use of disinformation is occasionally allowed, but not in all situations.[73] Typically the ethical test to consider is whether the disinformation was performed out of a motivation of good faith and is acceptable according to the rules of war.[73] By this test, the World War II tactic of placing fake inflatable tanks in visible locations on the Pacific Islands, to falsely convey the impression that larger military forces were present, would be considered ethically permissible.[73] Conversely, disguising a munitions plant as a healthcare facility in order to avoid attack would be outside the bounds of acceptable use of disinformation during war.[73]
Research
Research related to disinformation studies is increasing as an applied area of inquiry.[74][75] Advocates have called for disinformation to be formally classified as a cybersecurity threat, citing its rise on social networking sites.[76] Despite the proliferation of social media websites, Facebook and Twitter showed the most activity in terms of active disinformation campaigns. Reported techniques included the use of bots to amplify hate speech, the illegal harvesting of data, and paid trolls to harass and threaten journalists.[77]
Whereas disinformation research has focused primarily on how actors orchestrate deceptions on social media, chiefly via fake news, newer research investigates how people take what started as deceptions and circulate them as their personal views.[5] This research conceptualizes disinformation as a program that encourages engagement in oppositional fantasies (i.e., culture wars), through which disinformation circulates as rhetorical ammunition for never-ending arguments.[5] As disinformation entangles with culture wars, identity-driven controversies become the vehicle through which it disseminates on social media. Disinformation thus thrives not despite raucous grudges but because of them: controversy provides fertile ground for never-ending debates that solidify points of view.[5]
Scholars have pointed out that disinformation is not only a foreign threat, as domestic purveyors of disinformation also leverage traditional media outlets such as newspapers, radio stations, and television news media to disseminate false information.[78] Current research suggests right-wing online political activists in the United States may be more likely to use disinformation as a strategy and tactic.[79] Governments have responded with a wide range of policies to address concerns about the potential threats that disinformation poses to democracy. However, there is little agreement in elite policy discourse or the academic literature as to what it means for disinformation to threaten democracy, or how different policies might help counter its negative implications.[80]
Consequences of exposure to disinformation online
There is broad consensus amongst scholars that a high degree of disinformation, misinformation, and propaganda exists online; however, it is unclear what effect such disinformation has on political attitudes in the public and, therefore, on political outcomes.[81] The conventional wisdom that these effects are large has come mostly from investigative journalists, with a particular rise during the 2016 U.S. election: some of the earliest work came from Craig Silverman at BuzzFeed News.[82] Cass Sunstein supported this view in #Republic, arguing that the internet would become rife with echo chambers and informational cascades of misinformation, leading to a highly polarized and ill-informed society.[83]
Research after the 2016 election found: (1) for 14 percent of Americans, social media was their “most important” source of election news; (2) known false news stories “favoring Trump were shared a total of 30 million times on Facebook, while those favoring Clinton were shared 8 million times”; (3) the average American adult saw fake news stories, “with just over half of those who recalled seeing them believing them”; and (4) people are more likely to “believe stories that favor their preferred candidate, especially if they have ideologically segregated social media networks.”[84] Correspondingly, while there is wide agreement that the digital spread and uptake of disinformation during the 2016 election was massive and very likely facilitated by foreign agents, there is ongoing debate about whether it had any actual effect on the election. For example, a double-blind, randomized controlled experiment by researchers from the London School of Economics (LSE) found that exposure to online fake news about either Trump or Clinton had no significant effect on intentions to vote for those candidates. Researchers who examined the influence of Russian disinformation on Twitter during the 2016 U.S. presidential campaign found that exposure to disinformation was (1) concentrated among a tiny group of users, (2) primarily among Republicans, and (3) eclipsed by exposure to legitimate political news media and politicians. Finally, they found “no evidence of a meaningful relationship between exposure to the Russian foreign influence campaign and changes in attitudes, polarization, or voting behavior.”[85] As such, despite its mass dissemination during the 2016 presidential election, online fake news or disinformation probably did not cost Hillary Clinton the votes needed to secure the presidency.[86]
Research on this topic remains inconclusive; for example, misinformation appears not to significantly change the political knowledge of those exposed to it.[87] There seems to be a higher level of diversity of news sources that users are exposed to on Facebook and Twitter than conventional wisdom would dictate, as well as a higher frequency of cross-spectrum discussion.[88][89] Other evidence has found that disinformation campaigns rarely succeed in altering the foreign policies of the targeted states.[90]
Research is also challenging because disinformation is meant to be difficult to detect and some social media companies have discouraged outside research efforts.[91] For example, researchers found disinformation made “existing detection algorithms from traditional news media ineffective or not applicable…[because disinformation] is intentionally written to mislead readers…[and] users’ social engagements with fake news produce data that is big, incomplete, unstructured, and noisy.”[91] Facebook, the largest social media company, has been criticized by analytical journalists and scholars for preventing outside research of disinformation.[92][93][94][95]
Alternative perspectives and critiques
Researchers have criticized the framing of disinformation as being limited to technology platforms, removed from its wider political context and inaccurately implying that the media landscape was otherwise well-functioning.[96] “The field possesses a simplistic understanding of the effects of media technologies; overemphasizes platforms and underemphasizes politics; focuses too much on the United States and Anglocentric analysis; has a shallow understanding of political culture and culture in general; lacks analysis of race, class, gender, and sexuality as well as status, inequality, social structure, and power; has a thin understanding of journalistic processes; and, has progressed more through the exigencies of grant funding than the development of theory and empirical findings.”[97]
Alternative perspectives have been proposed:
- Moving beyond fact-checking and media literacy to study disinformation as a pervasive phenomenon that involves more than news consumption.
- Moving beyond technical solutions, including AI-enhanced fact-checking, to understand the systemic basis of disinformation.
- Developing a theory that goes beyond Americentrism toward a global perspective, understanding cultural imperialism and Third World dependency on Western news,[98] and understanding disinformation in the Global South.[99]
- Developing market-oriented disinformation research that examines the financial incentives and business models that nudge content creators and digital platforms to circulate disinformation online.[4][100]
- Including a multidisciplinary approach involving history, political economy, ethnic studies, feminist studies, and science and technology studies.
- Developing understandings of gender-based disinformation (GBD), defined as “the dissemination of false or misleading information attacking women (especially political leaders, journalists and public figures), basing the attack on their identity as women.”[101][102]
Strategies for spreading disinformation
Disinformation attack
The research literature on how disinformation spreads is growing.[81] Studies show that the spread of disinformation on social media can be classified into two broad stages: seeding and echoing.[5] “Seeding” is when malicious actors strategically insert deceptions, such as fake news, into a social media ecosystem; “echoing” is when the audience disseminates disinformation argumentatively as their own opinions, often by incorporating it into a confrontational fantasy.
Internet manipulation
Studies show four main methods of seeding disinformation online:[81]
- Selective censorship
- Manipulation of search rankings
- Hacking and releasing
- Directly sharing disinformation
Exploiting online advertising technologies
Disinformation is amplified online due to malpractice in online advertising, especially the machine-to-machine interactions of real-time bidding systems.[112] Online advertising technologies have been used to amplify disinformation because of the financial incentives around monetizing user-generated content and fake news.[100] Lax oversight of the online advertising market further enables this amplification, including through dark money used for political advertising.[113]
See also
- Active Measures Working Group
- Ad hominem
- Agitprop
- Artificial intelligence and elections
- Black propaganda
- Censorship
- Chinese information operations and information warfare
- Counter Misinformation Team
- COVID-19 misinformation
- Declinism
- Deepfakes
- Demoralization (warfare)
- Denial and deception
- Disinformation attack
- Disinformation in the Russian invasion of Ukraine
- Fake news
- False flag
- Fear, uncertainty and doubt
- Gaslighting
- Internet censorship
- Internet manipulation
- Knowledge falsification
- Kompromat
- Manufacturing Consent
- Media manipulation
- Military deception
- Post-truth politics
- Propaganda in the Soviet Union
- Sharp power
- Social engineering (political science)
- State-sponsored Internet propaganda
- The Disinformation Project
Notes
References
- ^ a b c Bittman, Ladislav (1985), The KGB and Soviet Disinformation: An Insider’s View, Pergamon-Brassey’s, pp. 49–50, ISBN 978-0-08-031572-0
- ^ a b c Shultz, Richard H.; Godson, Roy (1984), Dezinformatsia: Active Measures in Soviet Strategy, Pergamon-Brassey’s, pp. 37–38, ISBN 978-0-08-031573-7
- ^ European Commission (16 June 2022). “The Strengthened Code of Practice on Disinformation 2022”. digital-strategy.ec.europa.eu. p. 1. Retrieved 25 November 2024.
- ^ a b c d Diaz Ruiz, Carlos (2023). “Disinformation on digital media platforms: A market-shaping approach”. New Media & Society. Online first: 1–24. doi:10.1177/14614448231207644. S2CID 264816011. This article incorporates text from this source, which is available under the CC BY 4.0 license.
- ^ a b c d e f Diaz Ruiz, Carlos; Nilsson, Tomas (16 May 2022). “Disinformation and Echo Chambers: How Disinformation Circulates in Social Media Through Identity-Driven Controversies”. Journal of Public Policy & Marketing. 42: 18–35. doi:10.1177/07439156221103852. S2CID 248934562. Archived from the original on 20 June 2022. Retrieved 20 June 2022.
- ^ “Ireton, C & Posetti, J (2018) “Journalism, fake news & disinformation: handbook for journalism education and training” UNESCO”. Archived from the original on 6 April 2023. Retrieved 7 August 2021.
- ^ Golbeck, Jennifer, ed. (2008), Computing with Social Trust, Human-Computer Interaction Series, Springer, pp. 19–20, ISBN 978-1-84800-355-2
- ^ Freelon, Deen; Wells, Chris (3 March 2020). “Disinformation as Political Communication”. Political Communication. 37 (2): 145–156. doi:10.1080/10584609.2020.1723755. ISSN 1058-4609. S2CID 212897113. Archived from the original on 17 July 2023. Retrieved 17 July 2023.
- ^ Newman, Hadley (2022). “Information Warfare: Leveraging the DMMI Matrix Cube for Risk Assessment”. Journal of Information Warfare. 21 (3): 84–102. ISSN 1445-3312. JSTOR 27199985.
- ^ Hadley, Newman (2022). “Author”. Journal of Information Warfare. Archived from the original on 28 December 2022. Retrieved 28 December 2022.
Strategic communications advisor working across a broad range of policy areas for public and multilateral organisations. Counter-disinformation specialist and published author on foreign information manipulation and interference (FIMI).
- ^ “City & County Cullings (Early use of the word “disinformation” 1887)”. Medicine Lodge Cresset. 17 February 1887. p. 3. Archived from the original on 24 May 2021. Retrieved 24 May 2021.
- ^ “Professor Young on Mars and disinformation (1892)”. The Salt Lake Herald. 18 August 1892. p. 4. Archived from the original on 24 May 2021. Retrieved 24 May 2021.
- ^ “Pure nonsense (early use of the word disinformation) (1907)”. The San Bernardino County Sun. 26 September 1907. p. 8. Archived from the original on 24 May 2021. Retrieved 24 May 2021.
- ^ “Support for Red Cross helps U.S. boys abroad, Rotary Club is told (1917)”. The Sheboygan Press. 18 December 1917. p. 4. Archived from the original on 24 May 2021. Retrieved 24 May 2021.
- ^ a b c Ion Mihai Pacepa and Ronald J. Rychlak (2013), Disinformation: Former Spy Chief Reveals Secret Strategies for Undermining Freedom, Attacking Religion, and Promoting Terrorism, WND Books, pp. 4–6, 34–39, 75, ISBN 978-1-936488-60-5
- ^ Garth Jowett; Victoria O’Donnell (2005), “What Is Propaganda, and How Does It Differ From Persuasion?”, Propaganda and Persuasion, Sage Publications, pp. 21–23, ISBN 978-1-4129-0898-6,
In fact, the word disinformation is a cognate for the Russian dezinformatsia, taken from the name of a division of the KGB devoted to black propaganda.
- ^ a b c d e Taylor, Adam (26 November 2016), “Before ‘fake news,’ there was Soviet ‘disinformation’“, The Washington Post, archived from the original on 14 May 2019, retrieved 3 December 2016
- ^ a b Jackson, Dean (2018), Distinguishing Disinformation from Propaganda, Misinformation, and ‘Fake News’ (PDF), National Endowment for Democracy, archived (PDF) from the original on 7 April 2022, retrieved 31 May 2022
- ^ Bittman, Ladislav (1988), The New Image-Makers: Soviet Propaganda & Disinformation Today, Brassey’s Inc, pp. 7, 24, ISBN 978-0-08-034939-8
- ^ Martin, David (1990), The Web of Disinformation: Churchill’s Yugoslav Blunder, Harcourt Brace Jovanovich, p. xx, ISBN 978-0-15-180704-8
- ^ Barton, Geoff (2001), Developing Media Skills, Heinemann, p. 124, ISBN 978-0-435-10960-8
- ^ Cunningham, Stanley B. (2002), “Disinformation (Russian: dezinformatsiya)”, The Idea of Propaganda: A Reconstruction, Praeger, pp. 67–68, 110, ISBN 978-0-275-97445-9
- ^ “Disinformation”. Shorenstein Center. Archived from the original on 30 October 2023. Retrieved 30 October 2023.
- ^ Center for Internet Security (3 October 2022). “Essential Guide to Election Security: Managing Mis-, Dis-, and Malinformation”. CIS website. Archived 18 December 2023 at the Wayback Machine. Retrieved 18 December 2023.
- ^ Baines, Darrin; Elliott, Robert J. R. (April 2020). “Defining misinformation, disinformation and malinformation: An urgent need for clarity during the COVID-19 infodemic”. Discussion Papers. Archived from the original on 14 December 2022. Retrieved 14 December 2022.
- ^ “Information disorder: Toward an interdisciplinary framework for research and policy making”. Council of Europe Publishing. Archived from the original on 14 December 2022. Retrieved 14 December 2022.
- ^ Newman, Hadley. “Understanding the Differences Between Disinformation, Misinformation, Malinformation and Information – Presenting the DMMI Matrix”. Draft Online Safety Bill (Joint Committee). UK: UK Government. Archived from the original on 4 January 2023. Retrieved 4 January 2023.
- ^ François, Camille (20 September 2019). “Actors, Behaviors, Content: A Disinformation ABC – Highlighting Three Vectors of Viral Deception to Guide Industry & Regulatory Responses” (PDF). Archived from the original (PDF) on 21 March 2023. Retrieved 17 May 2024.
- ^ Alaphilippe, Alexandre (27 April 2020). “Adding a ‘D’ to the ABC disinformation framework”. Brookings Institution. Archived from the original on 27 October 2023. Retrieved 18 May 2024.
- ^ Pamment, James (2020). The ABCDE Framework (PDF) (Report). Carnegie Endowment for International Peace. pp. 5–9. Archived from the original on 18 March 2024.
- ^ Can public diplomacy survive the internet? (PDF), May 2017, archived from the original (PDF) on 30 March 2019
- ^ The Menace of Unreality: How the Kremlin Weaponizes Information, Culture and Money (PDF), Institute of Modern Russia, 2014, archived from the original (PDF) on 3 February 2019
- ^ a b Newman, Hadley (2022). Foreign information manipulation and interference defence standards: Test for rapid adoption of the common language and framework ‘DISARM’ (PDF). NATO Strategic Communications Centre of Excellence. Latvia. p. 60. ISBN 978-952-7472-46-0. Archived from the original on 28 December 2022. Retrieved 28 December 2022 – via European Centre of Excellence for Countering Hybrid Threats.
{{cite book}}
: CS1 maint: location missing publisher (link) - ^ European Extrernal Action Service (EEAS) (27 October 2021). “Tackling Disinformation, Foreign Information Manipulation & Interference”.
- ^ Goldman, Jan (2006), “Disinformation”, Words of Intelligence: A Dictionary, Scarecrow Press, p. 43, ISBN 978-0-8108-5641-7
- ^ a b Samier, Eugene A. (2014), Secrecy and Tradecraft in Educational Administration: The Covert Side of Educational Life, Routledge Research in Education, Routledge, p. 176, ISBN 978-0-415-81681-6
- ^ Tandoc, Edson C; Lim, Darren; Ling, Rich (7 August 2019). “Diffusion of disinformation: How social media users respond to fake news and why”. Journalism. 21 (3): 381–398. doi:10.1177/1464884919868325. ISSN 1464-8849. S2CID 202281476.
- ^ Taylor, Adam (26 November 2016), “Before ‘fake news,’ there was Soviet ‘disinformation’“, The Washington Post, retrieved 3 December 2016
- ^ Martin J. Manning; Herbert Romerstein (2004), “Disinformation”, Historical Dictionary of American Propaganda, Greenwood, pp. 82–83, ISBN 978-0-313-29605-5
- ^ Nicholas John Cull; David Holbrook Culbert; David Welch (2003), “Disinformation”, Propaganda and Mass Persuasion: A Historical Encyclopedia, 1500 to the Present, ABC-CLIO, p. 104, ISBN 978-1610690713
- ^ Stukal, Denis; Sanovich, Sergey; Bonneau, Richard; Tucker, Joshua A. (February 2022). “Why Botter: How Pro-Government Bots Fight Opposition in Russia” (PDF). American Political Science Review. 116 (1). Cambridge and New York: Cambridge University Press on behalf of the American Political Science Association: 843–857. doi:10.1017/S0003055421001507. ISSN 1537-5943. LCCN 08009025. OCLC 805068983. S2CID 247038589. Retrieved 10 March 2022.
- ^ Sultan, Oz (Spring 2019). “Tackling Disinformation, Online Terrorism, and Cyber Risks into the 2020s”. The Cyber Defense Review. 4 (1). West Point, New York: Army Cyber Institute: 43–60. ISSN 2474-2120. JSTOR 26623066.
- ^ Anne Applebaum; Edward Lucas (6 May 2016), “The danger of Russian disinformation”, The Washington Post, retrieved 9 December 2016
- ^ “Russian state-sponsored media and disinformation on Twitter”. ZOiS Spotlight. Retrieved 16 September 2020.
- ^ “Russian Disinformation Is Taking Hold in Africa”. CIGI. 17 November 2021. Retrieved 3 March 2022.
The Kremlin’s effectiveness in seeding its preferred vaccine narratives among African audiences underscores its wider concerted effort to undermine and discredit Western powers by pushing or tapping into anti-Western sentiment across the continent.
- ^ “Leaked documents reveal Russian effort to exert influence in Africa”. The Guardian. 11 June 2019. Retrieved 3 March 2022.
The mission to increase Russian influence on the continent is being led by Yevgeny Prigozhin, a businessman based in St Petersburg who is a close ally of the Russian president, Vladimir Putin. One aim is to ‘strong-arm’ the US and the former colonial powers the UK and France out of the region. Another is to see off ‘pro-western’ uprisings, the documents say.
- ^ MacFarquharaug, Neil (28 August 2016), “A Powerful Russian Weapon: The Spread of False Stories”, The New York Times, p. A1, retrieved 9 December 2016,
Moscow adamantly denies using disinformation to influence Western public opinion and tends to label accusations of either overt or covert threats as ‘Russophobia.’
- ^ “How Russia’s disinformation campaign seeps into US views”. Voice of America. 11 April 2024. Retrieved 13 May 2024.
- ^ Belton, Catherine (17 April 2024). “Secret Russian foreign policy document urges action to weaken the U.S.” The Washington Post. Retrieved 6 June 2024.
- ^ Stone, Peter (16 March 2023). “Russia disinformation looks to US far right to weaken Ukraine support”. The Guardian. ISSN 0261-3077. Retrieved 13 May 2024.
- ^ Bing, Christopher; Paul, Katie; Bing, Christopher (3 September 2024). “US voters targeted by Chinese influence online, researchers say”. Reuters. Retrieved 6 September 2024.
- ^ a b O’Sullivan, Donie; Devine, Curt; Gordon, Allison (13 November 2023). “China is using the world’s largest known online disinformation operation to harass Americans, a CNN review finds”. CNN. Archived from the original on 14 November 2023. Retrieved 6 May 2024.
- ^ “How Microsoft names threat actors”. Microsoft Learn. 17 October 2024. Retrieved 24 October 2024.
- ^ a b Gilbert, David (29 April 2024). “Why China Is So Bad at Disinformation”. Wired. Archived from the original on 9 May 2024. Retrieved 9 May 2024.
- ^ a b Milmo, Dan (5 April 2024). “China will use AI to disrupt elections in the US, South Korea and India, Microsoft warns”. The Guardian. ISSN 0261-3077. Archived from the original on 25 May 2024. Retrieved 7 April 2024.
- ^ “China-linked bots targeting Republicans including Marco Rubio in run-up to election, Microsoft says”. The Guardian. Reuters. 24 October 2024. ISSN 0261-3077. Retrieved 25 October 2024.
The group allegedly responsible is known as Taizi Flood, which has been previously associated with China’s Ministry of Public Security, researchers say.
- ^ Hsu, Tiffany; Myers, Steven Lee (1 April 2024). “China’s Advancing Efforts to Influence the U.S. Election Raise Alarms”. The New York Times. Archived from the original on 3 April 2024. Retrieved 1 April 2024. “The accounts sometimes amplified or repeated content from the Chinese influence campaign Spamouflage, which was first identified in 2019 and linked to an arm of the Ministry of Public Security.”
- ^ a b Yang, Lin (8 April 2024). “Chinese nationalist trolls pretend to be Trump supporters ahead of US elections”. Voice of America. Archived from the original on 9 May 2024. Retrieved 9 May 2024.
- ^ Milmo, Dan; Hawkins, Amy (18 May 2024). “How China is using AI news anchors to deliver its propaganda”. The Guardian. ISSN 0261-3077. Archived from the original on 25 May 2024. Retrieved 20 May 2024.
- ^ Nimmo, Ben; Hubert, Ira; Yang, Cheng (February 2021). Spamouflage Breakout: Chinese Spam Network Finally Starts to Gain Some Traction (PDF) (Report). Graphika. Archived (PDF) from the original on 5 March 2021. Retrieved 9 May 2024.
- ^ a b Martin J. Manning; Herbert Romerstein (2004), “Disinformation”, Historical Dictionary of American Propaganda, Greenwood, pp. 82–83, ISBN 978-0-313-29605-5
- ^ Murray-Smith, Stephen (1989), Right Words, Viking, p. 118, ISBN 978-0-670-82825-8
- ^ a b c Biagi, Shirley (2014), “Disinformation”, Media/Impact: An Introduction to Mass Media, Cengage Learning, p. 328, ISBN 978-1-133-31138-6
- ^ a b c d e Bing, Chris; Schechtman, Joel (14 June 2024). “Pentagon Ran Secret Anti-Vax Campaign to Undermine China during Pandemic”. Reuters.
- ^ Toropin, Konstantin (14 June 2024). “Pentagon Stands by Secret Anti-Vaccination Disinformation Campaign in Philippines After Reuters Report”. Military.com. Archived from the original on 14 June 2024. Retrieved 19 June 2024.
- ^ “Pope Warns About Fake News-From Experience”, The New York Times, Associated Press, 7 December 2016, archived from the original on 7 December 2016, retrieved 7 December 2016
- ^ Alyssa Newcomb (15 November 2016), “Facebook, Google Crack Down on Fake News Advertising”, NBC News, NBC News, archived from the original on 6 April 2019, retrieved 16 November 2016
- ^ Schaede, Sydney (24 October 2016), “Did the Pope Endorse Trump?”, FactCheck.org, archived from the original on 19 April 2019, retrieved 7 December 2016
- ^ Pullella, Philip (7 December 2016), “Pope warns media over ‘sin’ of spreading fake news, smearing politicians”, Reuters, archived from the original on 23 November 2020, retrieved 7 December 2016
- ^ “Pope Francis compares fake news consumption to eating faeces”, The Guardian, 7 December 2016, archived from the original on 7 March 2021, retrieved 7 December 2016
- ^ Zauzmer, Julie (7 December 2016), “Pope Francis compares media that spread fake news to people who are excited by feces”, The Washington Post, archived from the original on 4 February 2021, retrieved 7 December 2016
- ^ Griffin, Andrew (7 December 2016), “Pope Francis: Fake news is like getting sexually aroused by faeces”, The Independent, archived from the original on 26 January 2021, retrieved 7 December 2016
- ^ a b c d e f Danks, David; Danks, Joseph H. (2014), “The Moral Responsibility of Automated Responses During Cyberwarfare”, in Timothy J. Demy; George R. Lucas Jr.; Bradley J. Strawser (eds.), Military Ethics and Emerging Technologies, Routledge, pp. 223–224, ISBN 978-0-415-73710-4
- ^ Spies, Samuel (14 August 2019). “Defining ‘Disinformation’, V1.0”. MediaWell, Social Science Research Council. Archived from the original on 30 October 2020. Retrieved 9 November 2019.
- ^ Tandoc, Edson C. (2019). “The facts of fake news: A research review”. Sociology Compass. 13 (9): e12724. doi:10.1111/soc4.12724. ISSN 1751-9020. S2CID 201392983.
- ^ Caramancion, Kevin Matthe (2020). “An Exploration of Disinformation as a Cybersecurity Threat”. 2020 3rd International Conference on Information and Computer Technologies (ICICT). pp. 440–444. doi:10.1109/ICICT50521.2020.00076. ISBN 978-1-7281-7283-5. S2CID 218651389.
- ^ “Samantha Bradshaw & Philip N. Howard. (2019) The Global Disinformation Disorder: 2019 Global Inventory of Organised Social Media Manipulation. Working Paper 2019.2. Oxford, UK: Project on Computational Propaganda” (PDF). comprop.oii.ox.ac.uk. Archived (PDF) from the original on 25 May 2022. Retrieved 17 November 2022.
- ^ Miller, Michael L.; Vaccari, Cristian (July 2020). “Digital Threats to Democracy: Comparative Lessons and Possible Remedies”. The International Journal of Press/Politics. 25 (3): 333–356. doi:10.1177/1940161220922323. ISSN 1940-1612. S2CID 218962159. Archived from the original on 14 December 2022. Retrieved 14 December 2022.
- ^ Freelon, Deen; Marwick, Alice; Kreiss, Daniel (4 September 2020). “False equivalencies: Online activism from left to right”. Science. 369 (6508): 1197–1201. Bibcode:2020Sci...369.1197F. doi:10.1126/science.abb2428. PMID 32883863. S2CID 221471947. Archived from the original on 21 October 2021. Retrieved 2 February 2022.
- ^ Tenove, Chris (July 2020). “Protecting Democracy from Disinformation: Normative Threats and Policy Responses”. The International Journal of Press/Politics. 25 (3): 517–537. doi:10.1177/1940161220918740. ISSN 1940-1612. S2CID 219437151. Archived from the original on 14 December 2022. Retrieved 14 December 2022.
- ^ a b c Tucker, Joshua; Guess, Andrew; Barbera, Pablo; Vaccari, Cristian; Siegel, Alexandra; Sanovich, Sergey; Stukal, Denis; Nyhan, Brendan (2018). “Social Media, Political Polarization, and Political Disinformation: A Review of the Scientific Literature”. SSRN Working Paper Series. doi:10.2139/ssrn.3144139. ISSN 1556-5068. Archived from the original on 21 February 2021. Retrieved 29 October 2019.
- ^ “This Analysis Shows How Viral Fake Election News Stories Outperformed Real News On Facebook”. BuzzFeed News. 16 November 2016. Archived from the original on 17 July 2018. Retrieved 29 October 2019.
- ^ Sunstein, Cass R. (14 March 2017). #Republic: Divided Democracy in the Age of Social Media. Princeton University Press. ISBN 978-0691175515. OCLC 958799819.
- ^ Allcott, Hunt; Gentzkow, Matthew (May 2017). “Social Media and Fake News in the 2016 Election”. Journal of Economic Perspectives. 31 (2): 211–236. doi:10.1257/jep.31.2.211. ISSN 0895-3309. S2CID 32730475.
- ^ Eady, Gregory; Paskhalis, Tom; Zilinsky, Jan; Bonneau, Richard; Nagler, Jonathan; Tucker, Joshua A. (9 January 2023). “Exposure to the Russian Internet Research Agency Foreign Influence Campaign on Twitter in the 2016 US Election and its Relationship to Attitudes and Voting Behavior”. Nature Communications. 14 (62): 62. Bibcode:2023NatCo..14...62E. doi:10.1038/s41467-022-35576-9. PMC 9829855. PMID 36624094.
- ^ Leyva, Rodolfo (2020). “Testing and unpacking the effects of digital fake news: on presidential candidate evaluations and voter support”. AI & Society. 35 (4): 970. doi:10.1007/s00146-020-00980-6. S2CID 218592685.
- ^ Allcott, Hunt; Gentzkow, Matthew (May 2017). “Social Media and Fake News in the 2016 Election”. Journal of Economic Perspectives. 31 (2): 211–236. doi:10.1257/jep.31.2.211. ISSN 0895-3309.
- ^ Bakshy, E.; Messing, S.; Adamic, L. A. (5 June 2015). “Exposure to ideologically diverse news and opinion on Facebook”. Science. 348 (6239): 1130–1132. Bibcode:2015Sci...348.1130B. doi:10.1126/science.aaa1160. ISSN 0036-8075. PMID 25953820. S2CID 206632821.
- ^ Wojcieszak, Magdalena E.; Mutz, Diana C. (1 March 2009). “Online Groups and Political Discourse: Do Online Discussion Spaces Facilitate Exposure to Political Disagreement?”. Journal of Communication. 59 (1): 40–56. doi:10.1111/j.1460-2466.2008.01403.x. ISSN 0021-9916. S2CID 18865773.
- ^ Lanoszka, Alexander (2019). “Disinformation in international politics”. European Journal of International Security. 4 (2): 227–248. doi:10.1017/eis.2019.6. ISSN 2057-5637. S2CID 211312944.
- ^ a b Shu, Kai; Sliva, Amy; Wang, Suhang; Tang, Jiliang; Liu, Huan (1 September 2017). “Fake News Detection on Social Media: A Data Mining Perspective”. ACM SIGKDD Explorations Newsletter. 19 (1): 22–36. arXiv:1708.01967. doi:10.1145/3137597.3137600. ISSN 1931-0145. S2CID 207718082. Archived from the original on 5 February 2022. Retrieved 1 February 2022.
- ^ Edelson, Laura; McCoy, Damon. “How Facebook Hinders Misinformation Research”. Scientific American. Archived from the original on 2 February 2022. Retrieved 1 February 2022.
- ^ Edelson, Laura; McCoy, Damon (14 August 2021). “Facebook shut down our research into its role in spreading disinformation”. The Guardian. Archived from the original on 24 March 2022. Retrieved 1 February 2022.
- ^ Krishnan, Nandita; Gu, Jiayan; Tromble, Rebekah; Abroms, Lorien C. (15 December 2021). “Research note: Examining how various social media platforms have responded to COVID-19 misinformation”. Harvard Kennedy School Misinformation Review. doi:10.37016/mr-2020-85. S2CID 245256590. Archived from the original on 3 February 2022. Retrieved 1 February 2022.
- ^ “Only Facebook knows the extent of its misinformation problem. And it’s not sharing, even with the White House”. Washington Post. ISSN 0190-8286. Archived from the original on 5 February 2022. Retrieved 1 February 2022.
- ^ Kuo, Rachel; Marwick, Alice (12 August 2021). “Critical disinformation studies: History, power, and politics”. Harvard Kennedy School Misinformation Review. doi:10.37016/mr-2020-76. Archived from the original on 15 October 2023.
- ^ “What Comes After Disinformation Studies?”. Center for Information, Technology, & Public Life (CITAP), University of North Carolina at Chapel Hill. Archived from the original on 3 February 2023. Retrieved 16 January 2024.
- ^ Tworek, Heidi (2 August 2022). “Can We Move Beyond Disinformation Studies?”. Centre for International Governance Innovation. Archived from the original on 1 June 2023. Retrieved 16 January 2024.
- ^ Wasserman, Herman; Madrid-Morales, Dani, eds. (12 April 2022). Disinformation in the Global South (1 ed.). Wiley. doi:10.1002/9781119714491. ISBN 978-1-119-71444-6. Archived from the original on 4 March 2024. Retrieved 4 March 2024.
- ^ a b Diaz Ruiz, Carlos A. (30 October 2024). “Disinformation and fake news as externalities of digital advertising: a close reading of sociotechnical imaginaries in programmatic advertising”. Journal of Marketing Management: 1–23. doi:10.1080/0267257X.2024.2421860. ISSN 0267-257X.
- ^ Sessa, Maria Giovanna (4 December 2020). “Misogyny and Misinformation: An analysis of gendered disinformation tactics during the COVID-19 pandemic”. EU DisinfoLab. Archived from the original on 19 September 2023. Retrieved 16 January 2024.
- ^ Sessa, Maria Giovanna (26 January 2022). “What is Gendered Disinformation?”. Heinrich Böll Foundation. Archived from the original on 21 July 2022. Retrieved 16 January 2024.
- ^ Woolley, Samuel; Howard, Philip N. (2019). Computational Propaganda: Political Parties, Politicians, and Political Manipulation on Social Media. Oxford University Press. ISBN 978-0190931414.
- ^ Diaz Ruiz, Carlos (30 October 2023). “Disinformation on digital media platforms: A market-shaping approach”. New Media & Society. doi:10.1177/14614448231207644. ISSN 1461-4448. S2CID 264816011.
- ^ Marchal, Nahema; Neudert, Lisa-Maria (2019). “Polarisation and the use of technology in political campaigns and communication” (PDF). European Parliamentary Research Service.
- ^ Kreiss, Daniel; McGregor, Shannon C (11 April 2023). “A review and provocation: On polarization and platforms”. New Media & Society. 26: 556–579. doi:10.1177/14614448231161880. ISSN 1461-4448. S2CID 258125103.
- ^ Diaz Ruiz, Carlos; Nilsson, Tomas (2023). “Disinformation and Echo Chambers: How Disinformation Circulates on Social Media Through Identity-Driven Controversies”. Journal of Public Policy & Marketing. 42 (1): 18–35. doi:10.1177/07439156221103852. ISSN 0743-9156. S2CID 248934562.
- ^ Di Domenico, Giandomenico; Ding, Yu (23 October 2023). “Between Brand attacks and broader narratives: how direct and indirect misinformation erode consumer trust”. Current Opinion in Psychology. 54: 101716. doi:10.1016/j.copsyc.2023.101716. ISSN 2352-250X. PMID 37952396. S2CID 264474368.
- ^ Castells, Manuel (4 June 2015). Networks of Outrage and Hope: Social Movements in the Internet Age. John Wiley & Sons. ISBN 9780745695792. Retrieved 4 February 2017.
- ^ “Condemnation over Egypt’s internet shutdown”. Financial Times. Retrieved 4 February 2017.
- ^ “Net neutrality wins in Europe – a victory for the internet as we know it”. ZME Science. 31 August 2016. Retrieved 4 February 2017.
- ^ Braun, Joshua A.; Eklund, Jessica L. (2 January 2019). “Fake News, Real Money: Ad Tech Platforms, Profit-Driven Hoaxes, and the Business of Journalism”. Digital Journalism. 7 (1): 1–21. doi:10.1080/21670811.2018.1556314. ISSN 2167-0811.
- ^ Nadler, Anthony; Donovan, Joan; Crane, Matthew (17 October 2018). “Weaponizing the Digital Influence Machine”. Data & Society. Retrieved 21 November 2024.
Further reading
- Bittman, Ladislav (1985), The KGB and Soviet Disinformation: An Insider’s View, Pergamon-Brassey’s, ISBN 978-0-08-031572-0
- Boghardt, Thomas (26 January 2010), “Operation INFEKTION – Soviet Bloc Intelligence and Its AIDS Disinformation Campaign” (PDF), Studies in Intelligence, 53 (4), retrieved 9 December 2016
- Golitsyn, Anatoliy (1984), New Lies for Old: The Communist Strategy of Deception and Disinformation, Dodd, Mead & Company, ISBN 978-0-396-08194-4
- O’Connor, Cailin; Weatherall, James Owen, “Why We Trust Lies: The most effective misinformation starts with seeds of truth”, Scientific American, vol. 321, no. 3 (September 2019), pp. 54–61.
- Ion Mihai Pacepa and Ronald J. Rychlak (2013), Disinformation: Former Spy Chief Reveals Secret Strategies for Undermining Freedom, Attacking Religion, and Promoting Terrorism, WND Books, ISBN 978-1-936488-60-5
- Schoen, Fletcher; Lamb, Christopher J. (1 June 2012), “Deception, Disinformation, and Strategic Communications: How One Interagency Group Made a Major Difference” (PDF), Strategic Perspectives, 11, retrieved 9 December 2016
- Shultz, Richard H.; Godson, Roy (1984), Dezinformatsia: Active Measures in Soviet Strategy, Pergamon-Brassey’s, ISBN 978-0080315737
- Taylor, Adam (26 November 2016), “Before ‘fake news,’ there was Soviet ‘disinformation’”, The Washington Post, retrieved 3 December 2016
- Legg, Heidi; Kerwin, Joe (1 November 2018), The Fight Against Disinformation in the U.S.: A Landscape Analysis, Harvard Kennedy School, Shorenstein Center, retrieved 10 August 2020
External links
- Disinformation Archived 25 October 2007 at the Wayback Machine – a learning resource from the British Library including an interactive movie and activities.
- MediaWell – an initiative of the nonprofit Social Science Research Council seeking to track and curate disinformation, misinformation, and fake news research.