Navigating the Algorithm: Active Learning in Political Communication[1]
Sang Jung Kim, University of Iowa
Addis Dennie; Aj Dolan; Bertrand Jay; Courtney Fournier; Emma Cutsforth; Holland Smith; Maura De Cicco (Students who participated in the Political Communication project – Alphabetical Order)
https://nbn-resolving.org/urn:nbn:de:0168-ssoar-103951-5
As social media platforms have become central arenas for political communication, scholars have increasingly turned their attention to how the infrastructures of these platforms—particularly algorithmic systems—shape patterns of information consumption and perceptions of political issues (Gillespie, 2014; Bucher, 2018). Among the most widely studied concepts in this context are selective exposure, echo chambers, and filter bubbles, which collectively describe the tendency of individuals to encounter information that reinforces their preexisting beliefs (Stroud, 2008; Pariser, 2011). While these ideas remain influential in public and scholarly debates, recent empirical research suggests a more nuanced understanding of algorithmic curation, showing that its effects on political perceptions are highly context-dependent and often overstated (Flaxman et al., 2016; Guess et al., 2018).
Nevertheless, these conceptual frameworks continue to serve as critical entry points for students to understand how digital platforms shape political realities. In an era when political actors, institutions, and citizens increasingly rely on algorithm-curated media for information, students must develop the analytical tools to interrogate these systems. This is especially crucial given the growing role of social media platforms in shaping political perceptions among younger audiences. Without this foundation, students risk becoming passive consumers of political information, unable to recognize the subtle but powerful ways in which algorithmic governance influences what is seen, believed, and acted upon.
While traditional lectures effectively introduce undergraduate students to the theoretical foundations of algorithmic influence, fully grasping the subtle yet pervasive effects of algorithms on political attitudes requires active observation and hands-on experience. To complement traditional instructional methods, we designed and implemented a semester-long final project in our undergraduate political communication course, focused on directly observing YouTube's algorithmic recommendations within an experimental setting. This collaborative, project-based approach offered two notable advantages. First, it allowed students to directly experience how algorithms shape information exposure and political perceptions, moving theory into practice through structured observations on platforms like YouTube. Second, by designing and conducting the experiments together, students gained a deeper critical understanding of research methodologies, actively developing insights into methodological rigor, ethical considerations, and the complexities involved in empirically studying digital platforms and political phenomena.
Project Design: Investigating Algorithmic Influence
Before conducting the final project, students were introduced to foundational concepts in digital political communication, specifically selective exposure, filter bubbles, and echo chambers. To ensure students deeply understood these phenomena, we first engaged in structured discussions where students reflected on their personal experiences and observations. By sharing their own encounters with algorithmic recommendations and online information environments, students began connecting theoretical concepts to real-life contexts, laying a foundation for the subsequent experimental project.
Next, students were divided into two groups, each receiving a dedicated “burner laptop”—a clean device specifically prepared for the project. These laptops had no prior browsing history, cookies, or user data, ensuring a neutral starting point free from algorithmic biases or personalization. By using these devices, students could monitor how YouTube’s recommendation algorithms evolved solely based on their experimental actions. This methodological precaution enabled students to isolate and accurately observe the impact of specific viewing behaviors and content interactions on the platform’s algorithmic recommendations.
After setting up these devices, each student group created a new Google profile specifically for the experiment. To systematically observe how ideological preferences influenced YouTube’s algorithmic recommendations, one group subscribed exclusively to channels associated with conservative ideology, while the other subscribed exclusively to channels aligned with liberal ideology. These channels included not only traditional news sources but also prominent social media commentators and influencers known for ideologically oriented content. To ensure thoughtful and representative selection, we held collaborative classroom-based discussions in which students proposed, debated, and finalized the channel lists for each ideological category. By diversifying the subscription lists beyond mainstream news to encompass various types of political content creators, students better reflected the broader, real-world consumption patterns shaping algorithmic recommendation behaviors.
Once the subscriptions were finalized, students began systematically engaging with their assigned content. Specifically, they watched the most recent videos and Shorts posted by the selected channels, thereby establishing a viewing history that allowed YouTube’s recommendation algorithm to generate personalized suggestions. Starting the following week and continuing throughout the semester, students documented the top three videos recommended by YouTube on their homepage feeds each time they logged in. Students recorded details for each recommended video, including the channel name, the overall political or ideological stance conveyed, and the specific political or social issue discussed. This detailed approach allowed students to not only observe broad patterns in algorithmic recommendation but also to analyze more subtle dynamics, such as the types of topics or ideological framing that tended to appear more frequently over time. Through this systematic categorization and documentation, students gained nuanced insights into the relationship between initial viewing preferences, algorithmic behavior, and ongoing political information exposure.
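For instructors wishing to replicate this observation protocol, the documentation step lends itself to a simple structured log. The following Python sketch is our own illustration rather than the class's actual instrument; the field names (e.g., "stance," "issue") and the file name recommendations.csv are hypothetical stand-ins for the details students recorded (channel name, ideological stance, and issue discussed).

import csv
from dataclasses import dataclass, asdict, fields
from datetime import date

# Minimal sketch of the observation log described above. Field names are
# illustrative, not the class's actual codebook.
@dataclass
class Recommendation:
    observed_on: str   # date the homepage was checked, ISO format
    group: str         # "conservative" or "liberal" subscription group
    rank: int          # position among the top three homepage recommendations
    channel: str       # channel that posted the recommended video
    stance: str        # coded ideological stance (e.g., "conservative", "neutral")
    issue: str         # political or social issue the video discusses

def log_recommendations(path: str, records: list[Recommendation]) -> None:
    """Append one CSV row per observed recommendation."""
    with open(path, "a", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=[fl.name for fl in fields(Recommendation)])
        if f.tell() == 0:  # write a header only when the file is new
            writer.writeheader()
        writer.writerows(asdict(r) for r in records)

# Hypothetical example: one login's top three recommendations for one group.
log_recommendations("recommendations.csv", [
    Recommendation(str(date.today()), "conservative", 1, "Fox News", "conservative", "immigration"),
    Recommendation(str(date.today()), "conservative", 2, "Brett Cooper", "conservative", "campus politics"),
    Recommendation(str(date.today()), "conservative", 3, "Amir Odom", "neutral", "pop culture"),
])

Logging each observation as one row per recommended video keeps the semester's data in a single flat file that students can later analyze qualitatively (by reading the coded stances and issues) or quantitatively (by counting them).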
Finally, as the final task of this project, students analyzed the collected data using both qualitative and quantitative methods to identify trends and patterns in YouTube’s algorithmic recommendations over the semester. Qualitatively, students performed thematic analysis, categorizing the content of recommended videos based on recurring topics, ideological stances, and framing techniques. Quantitatively, they conducted proportion comparisons, systematically assessing the frequency and distribution of recommended channels, ideological leanings, and political issues. Each group examined how their initial ideological subscriptions influenced these patterns. The course concluded with an interactive classroom discussion in which students compared and contrasted findings across groups, highlighting key differences in how YouTube’s algorithm responded to conservative versus liberal viewing histories. This reflective session enabled students to critically interpret their combined qualitative and quantitative findings, deepening their understanding of the broader implications of algorithmic curation on political information consumption and polarization.
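As an illustration of the quantitative step, the short Python sketch below tallies the share of each coded ideological stance within one group's logged recommendations, assuming the hypothetical CSV format from the earlier sketch. It is a minimal proportion comparison, not the students' actual analysis script.

import csv
from collections import Counter

def stance_proportions(path: str, group: str) -> dict[str, float]:
    """Share of each coded stance among one group's logged recommendations."""
    counts: Counter[str] = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if row["group"] == group:
                counts[row["stance"]] += 1
    total = sum(counts.values())
    return {stance: n / total for stance, n in counts.items()} if total else {}

# Compare how often each ideological stance appeared in the two groups' feeds.
for group in ("conservative", "liberal"):
    print(group, stance_proportions("recommendations.csv", group))

The same counting logic extends to channels or issues by swapping the column name, which is essentially the comparison the groups performed when contrasting their feeds in the concluding discussion.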
Project Results: Echo Chamber and Filter Bubble Effects, Less Pronounced Than Anticipated
Our analysis revealed clear yet nuanced patterns regarding how YouTube’s recommendation algorithm responded to initial ideological alignments established by channel subscriptions. Both groups provided evidence of echo chambers and filter bubbles; however, these phenomena were less pronounced than we originally anticipated.
In the group subscribed to conservative-oriented YouTube channels, social media commentary channels substantially outnumbered traditional news channels in the platform's recommendations. Channels such as "Brett Cooper" (@bbrettcooper) and "Amir Odom" (@amirxodom) appeared prominently and frequently in students' homepage feeds. Although videos from Fox News were regularly recommended, these predominantly consisted of segments from "Fox & Friends," which is primarily a commentary-driven program. This suggests a notable algorithmic preference for recommending opinion-based or commentary-focused content over traditional news reporting, even within mainstream conservative media.
Interestingly, we observed that over the semester, the recommended social commentary videos gradually shifted toward greater neutrality, exhibiting fewer instances of explicitly conservative rhetoric. This shift became particularly pronounced as more political events unfolded during the first 100 days of the Trump administration. While our findings cannot be broadly generalized without additional evidence, this suggests two potential interpretations: either the conservative-oriented social commentators adjusted their rhetoric in response to ongoing political developments, or YouTube’s algorithm itself adapted recommendations toward content with broader appeal or less explicitly partisan framing.
In the group subscribed to liberal-oriented YouTube channels, students initially found that YouTube’s recommendations included several videos unrelated to politics, featuring topics such as science or food. The group informally labeled this initial pattern as the “PBS[2] effect,” noting that even though PBS (@PBS) is classified as left-center by Media Bias Fact Check, it predominantly produces educational and general-interest videos rather than explicitly political content. Because the group had subscribed to PBS, YouTube’s algorithm initially prioritized recommending educational and general-interest videos. This highlighted a notable difference in the algorithm’s initial response compared to the conservative-oriented subscriptions, indicating that subscribing to liberal channels did not immediately or exclusively trigger strongly partisan content recommendations.
To investigate if this initial pattern would persist or shift toward a clearer ideological echo chamber or filter bubble, the group intentionally adjusted their viewing habits by selecting more explicitly political videos recommended on their YouTube homepage. This deliberate shift led the algorithm to increasingly recommend explicitly political social commentary videos, primarily from channels such as Secular Talk (@SecularTalk).[3] These recommendations predominantly featured reactions to events involving Donald Trump and Elon Musk, characterized by a strongly emotional and reactionary tone.
Across both ideological groups, our findings provide evidence of echo chambers and filter bubbles influenced by initial channel subscriptions. Nonetheless, the magnitude of these effects appeared weaker and less consistent than initially expected. Rather than strongly reinforcing ideological isolation, the algorithm showed moderate tendencies toward ideological reinforcement, accompanied by periodic shifts toward broader or more neutral content. These observations collectively suggest a more nuanced understanding of YouTube’s algorithmic behavior, indicating the platform may not foster ideological isolation or extreme polarization as aggressively as commonly perceived.
Learning Beyond Lectures: Students’ Reflections on Experiential Learning
When students were asked how this experiment might be improved in future replications, they suggested including a "control group" that would subscribe exclusively to YouTube channels unrelated to political content, thereby providing a baseline for comparison. Students further suggested that examining YouTube's recommendations in the sidebar, rather than on the homepage, would better reflect how users actually watch videos. Students also made clear that the results of our project are not generalizable, given the small size of the data. These suggestions demonstrate that students clearly understood the importance of including a control condition in experimental design, as well as the significance of ensuring ecological validity by closely reflecting real-world viewing behaviors. Although these methodological principles were not explicitly taught through traditional lectures, students effectively learned and internalized them through the process of conducting the experiment itself.
Students also reflected deeply on how the project shaped their perspectives on future interactions with social media. Many reported heightened awareness of algorithmic influence and developed a more critical mindset toward online political content. As Courtney expressed, “The knowledge I’ve gained about algorithms and echo chambers will help me in the future to understand the videos that come up on my recommended page and understand why certain videos pop up on my social media.” Similarly, Holland reflected, “I learned how what you watch matters—it heavily affects how you get your news and learn information. This project showed me the importance in being impartial when looking for news and how content creators are not always the correct source, even if their job is citizen journalism.” These student reflections highlight the project’s significant impact, not only in terms of conceptual understanding but also in shaping their future approaches to digital media consumption and engagement.
This project underscores the significant educational value of experiential learning in political communication, particularly when teaching complex phenomena such as the effects of social media algorithms. Students' reflections further demonstrated the effectiveness of this hands-on approach, showing that experiential activities can significantly shape how students critically interact with and interpret social media content in their daily lives. Moving forward, educators should increasingly adopt such active, inquiry-driven methods, equipping students as critically engaged citizens capable of thoughtfully navigating the digital political environment.
References
Britannica. (2025). PBS. Retrieved from https://www.britannica.com/topic/Public-Broadcasting-Service
Bucher, T. (2018). If… Then: Algorithmic power and politics. Oxford University Press.
Flaxman, S., Goel, S., & Rao, J. M. (2016). Filter bubbles, echo chambers, and online news consumption. Public Opinion Quarterly, 80(S1), 298–320. https://doi.org/10.1093/poq/nfw006
Gillespie, T. (2014). The relevance of algorithms. In T. Gillespie, P. J. Boczkowski, & K. A. Foot (Eds.), Media technologies: Essays on communication, materiality, and society (pp. 167–194). MIT Press.
Guess, A. M., Nyhan, B., & Reifler, J. (2018). Selective exposure to misinformation: Evidence from the consumption of fake news during the 2016 U.S. presidential campaign. European Research Council Working Paper.
Pariser, E. (2011). The filter bubble: What the Internet is hiding from you. Penguin Press.
Secular Talk. (2025). Retrieved from https://www.youtube.com/user/SecularTalk
Stroud, N. J. (2008). Media use and political predispositions: Revisiting the concept of selective exposure. Political Behavior, 30(3), 341–366.
Sang Jung Kim is an assistant professor in the School of Journalism and Mass Communication at the University of Iowa. Her research focuses on the interaction between technology, politics, and social identity, with particular attention to the mediating role of social media platforms and the spread of information to the public.
[1] Copyright © 2025 Sang Jung Kim. Licensed under the Creative Commons Attribution Non-commercial No Derivatives (by-nc-nd). Available at https://politicalcommunication.org.
[2] PBS, or the Public Broadcasting Service, is a private, nonprofit American corporation composed of public television stations across the United States that provides educational and cultural content to the public (Britannica, 2025).
[3] Secular Talk is a progressive political commentary channel hosted by Kyle Kulinski (Secular Talk, 2025).