
General Assembly Issue #1: The question of the spread of disinformation through social media and its effect on minors.

Reprinted from the www.iberianmun.org website

Disinformation is defined as false information spread by people who know it is not true. The term is commonly and wrongly conflated with misinformation, which is “false information that is spread by people who think it is true” (“What is”). Delegates should refrain from using vaguer, less precise terms synonymous with these, such as ‘fake news’, as such terms actively set back productive debate. It should be noted that discussion of misinformation, while not named in the question, remains highly pertinent to topics surrounding disinformation.

The United Nations High Commissioner for Refugees (UNHCR) defines ten types of misinformation and disinformation: fabricated content, manipulated content, imposter content, misleading content, false context, satire and parody, false connections, sponsored content, propaganda, and error (“Factsheet 4”). These are spread online through algorithms, bots, individuals, and groups (Howard et al.). Debate should be centered on the effects of these on minors, as people under eighteen are considered highly vulnerable to false information of any type. Children are a major focus of the discussion surrounding false information because they are more likely to become vectors of misinformation, spreading it among their peers and families. Surveys by the United Nations International Children’s Emergency Fund (UNICEF) show that three quarters of young people feel unable to judge the truthfulness of online information (Howard et al.).

The spread of disinformation is a globally relevant issue, with multiple examples of it inciting real-world conflict. In Myanmar, for example, disinformation was blamed for “inciting violence and crime targeted at ethnic minorities, which has resulted in deaths and displacement” (Howard et al.), including of children (“Report of”). The detrimental effects on children of disinformation evolving into misinformation were also seen during the COVID-19 pandemic, with anti-vaccine conspiracy theories “shaping health-related behaviours, such as reducing parental intentions to vaccinate their children” (Howard et al.).

The current online environment is largely unregulated, and it falls to UN member states to attempt to impose regulations or otherwise address the issue. Resolutions can approach the issue in multiple ways, such as integrating media literacy into school curricula, running public awareness campaigns, and imposing stricter age restrictions. Given recent developments in AI, developing machine-learning algorithms to detect misinformation is also a valid avenue for a solution. Resolutions could also seek collaboration with social media companies to raise current industry standards for the transparency and accountability of their algorithms and content moderation.

Works Cited

“Factsheet 4: Types of Misinformation and Disinformation.” UNHCR, https://www.unhcr.org/innovation/wp-content/uploads/2022/02/Factsheet-4.pdf. Accessed 9 June 2024.

Howard, Philip, et al. Digital Misinformation / Disinformation and Children. 2021.

“Report of the Detailed Findings of the Independent International Fact-Finding Mission on Myanmar (A/HRC/39/CRP.2) - Myanmar.” ReliefWeb, 18 Sept. 2018, www.reliefweb.int/report/myanmar/report-detailed-findings-independent-international-fact-finding-mission-myanmar. Accessed 9 June 2024.

“What Is Misinformation and Fake News?” Internet Matters, 18 Jan. 2024, www.internetmatters.org/issues/fake-news-and-misinformation-advice-hub/learn-about-fake-news-to-support-children/. Accessed 9 June 2024.