Research Paper
Today, platforms like TikTok, Instagram, YouTube, and X (formerly Twitter) have replaced traditional news outlets as the primary source of political information for millions of Americans. This shift has given rise to a new ecosystem in which fringe voices, activists, and hyperpartisan influencers can reach massive audiences with minimal oversight.
Introduction
Social media has transformed how Americans gather information, form political opinions, and participate in public discourse. Platforms such as TikTok, YouTube, Instagram, Facebook, and X allow ordinary users and political activists to reach audiences at a scale once limited to news organizations and political institutions. Because emotionally charged and sensational content travels faster online than verified information, these platforms often amplify influencers who produce the most provocative content (Kaur & Gupta, 2023). Research on political TikTok also shows that playful performance, visual trends, and remix culture can embed political messages into entertainment, making persuasion feel subtle and natural (Sánchez-Querubín et al., 2023).
This paper examines how platform design, emotional manipulation, fringe ecosystems, and cross-platform dynamics enable influencers and activists to manipulate public opinion in the United States.
Emotional Manipulation and the Power of Performance
Influencers succeed largely because their audiences perceive them as authentic. As Sánchez-Querubín et al. (2023) note, TikTok users rely on “performance, humor, and aesthetic trends” to communicate political messages in ways that feel organic and entertaining. This emotional style encourages viewers to accept and share content without fully questioning its accuracy.
Studies of misinformation consistently show that emotionally arousing content spreads faster than factual information because it activates rapid, intuitive thinking (Ku et al., 2026). Platforms reward this behavior because emotionally intense posts drive engagement, which keeps users online longer. This cycle makes emotional manipulation one of the most effective tools for political influence online.
Algorithmic Amplification and Platform Design
Digital platforms rely on algorithmic systems that filter, rank, and recommend content. These systems are designed to maximize engagement, not accuracy. As the Congressional Research Service explains, platforms “disseminate information quickly to billions” and rely on engagement-based ranking systems that determine what users see first (Cho & Zhu, 2023, p. 1). This creates an environment where fringe influencers who produce dramatic or sensational content can gain disproportionate visibility.
Research on misinformation interventions shows that users often lack both the skills and motivation to evaluate content critically (Ku et al., 2026). Algorithms exploit this gap by prioritizing content that captures attention, even when it is misleading. Kaur and Gupta (2023) found that “false information spreads faster than verified information,” which explains why algorithms frequently elevate misinformation.
Fringe-to-Mainstream Pipelines
Many misleading narratives begin in fringe online communities before entering mainstream platforms. In their analysis of political TikTok, Sánchez-Querubín et al. (2023) describe how political messages often emerge from small, highly active communities and then spread widely once influencers remix or repackage them. Similarly, studies of misinformation spread show that fringe narratives often move from smaller platforms into more widely used ones through meme culture, visual remixing, and influencer amplification (Kaur & Gupta, 2023).
This pipeline normalizes extreme ideas. Once a narrative appears across multiple platforms, repetition makes it feel legitimate. Influencers may cite unverified claims from fringe sources as “evidence,” which reinforces the illusion of credibility.
Cross-Platform Spread and Viral Misinformation
Different platforms enhance misinformation in different ways. TikTok adds visual and musical trends. Instagram adds graphic design. YouTube provides commentary and long-form framing. Facebook amplifies emotional engagement in group settings. According to Kaur and Gupta (2023), the “virality of content” and low barriers to dissemination create conditions where misinformation spreads rapidly and widely across networks.
When a narrative spreads across multiple platforms, it creates what researchers call an “illusion of consensus.” Seeing the same idea repeatedly makes users believe it is widely accepted, even when it originates from a fringe source.
Why Audiences Are Vulnerable
Periods of uncertainty make people more susceptible to influencers who present confident, emotionally compelling explanations. Medical misinformation during COVID-19 demonstrated this vulnerability. Khullar (2025) explains that online misinformation resulted in “nonadherence to mitigation measures” and contributed to public confusion as opposing narratives circulated widely (p. 1). He also notes that a small number of accounts, the “disinformation dozen,” produced the vast majority of false claims (Khullar, 2025, p. 2).
This concentration of influence mirrors political misinformation, where a small network of hyperpartisan influencers can steer opinions across platforms. Psychological factors such as fear, identity validation, and social belonging increase vulnerability, especially when users trust influencers more than institutions.
Platform Moderation and the Limits of Deplatforming
Moderation efforts remain inconsistent across platforms. Cho and Zhu (2023) note that platforms have “wide latitude” in enforcing policies because they are private companies, not government agencies (p. 2). While deplatforming can reduce the reach of harmful influencers, it also creates opportunities for them to claim censorship. Many migrate to smaller platforms where moderation is weaker, allowing misinformation to spread unchecked.
Kaur and Gupta (2023) emphasize that misinformation harms businesses, individuals, and society by eroding trust and deepening polarization. Their review found that misinformation often persists because detection systems are limited and platform policies differ widely.
Conclusion
Social media influencers and political activists can manipulate public opinion by exploiting emotional storytelling, algorithmic visibility, fringe communities, and cross-platform narrative flows. Research across multiple domains, including political communication, misinformation studies, and public health, shows that emotionally charged and sensational content spreads more rapidly than verified information. Algorithmic systems amplify posts that maximize engagement, giving fringe influencers disproportionate power.
To counter this trend, digital literacy and critical thinking interventions must become central to education and public policy. As Ku et al. (2026) demonstrate, even simple “boost and nudge” techniques can help users make more informed decisions. Understanding how digital manipulation works is essential for protecting democratic processes and creating healthier information ecosystems.