Author: Wanchen Li

  • blog3

    Blog 3: Mobilising for Change: How Digital Media Powers Modern Activism

    INTRODUCTION: CONNECTIVE ACTION IN THE DIGITAL AGE

    As a climate justice activist, I would harness the power of digital media to raise awareness, mobilise support, and pressure decision-makers. Digital technologies have redefined how movements operate, allowing decentralised participation and personalised engagement. According to Bennett and Segerberg (2012), “connective action” replaces hierarchical structures with flexible networks driven by shared digital content. Movements like Fridays for Future and #StopCambo exemplify how individuals coalesce around emotionally resonant frames rather than rigid organisations. These movements thrive not because of charismatic leaders but because they allow participants to act on personal terms while contributing to a collective cause (Bennett & Segerberg, 2012). In this way, digital media shift activism from centralised coordination to bottom-up storytelling, often mediated by hashtags, memes, and user-generated content.

    This personalised and loosely structured model of participation also helps sustain engagement over time. Unlike traditional top-down campaigns, connective action allows individuals to tailor their participation, whether through reposting, attending protests, or sharing stories, making it easier to maintain long-term involvement without burnout. As Tufekci (2013) argues, this kind of digitally enabled engagement blends emotional appeal with practical mobilisation, offering a resilient alternative to older models of movement building.

    TELLING STORIES, BUILDING COMMUNITIES

    Social media enables storytelling that personalises large-scale issues. For instance, TikTok videos showing flooding or forest loss offer emotional appeals that statistics alone cannot match. By amplifying personal narratives, movements build emotional resonance and a sense of shared urgency (Ozkula, 2021). This narrative-driven strategy increases engagement and encourages peer-to-peer diffusion.

    Platforms like Instagram have further enabled visual documentation of climate strikes and frontline activism. A single viral story, such as a teen activist filming plastic pollution in their town, can motivate thousands. Emotional proximity creates a feeling of global solidarity. As seen in the #AllEyesOnRafah campaign, visual storytelling galvanises public empathy and transnational support.

    Vox’s (2021) short documentary on social media activism highlights how real-world change can begin with online narratives, showing young activists using platforms to link local experiences to global issues and making the abstract effects of climate change emotionally real for global audiences.

    MICRO-ACTIONS AND HASHTAG CAMPAIGNS

    To capitalise on this, I would launch micro-actions that require minimal commitment but offer high visibility: for example, a “Plastic Free Day” challenge on Twitter or a TikTok trend using the hashtag #GreenerNow. These allow participants to engage on their own terms while embedding them in a shared cause. The role of hashtags here is crucial: they not only classify content but also create ephemeral publics around which participants gather (Bruns & Burgess, 2011). These publics can transform from passive observers into active contributors when campaign messaging aligns with existing online identities and values.

    Such tactics also make activism accessible to those who are time-constrained or physically limited. This democratisation of participation is central to the logic of connective action. However, it requires careful design to prevent message dilution and to ensure that actions retain substance.

    VISUAL CULTURE AND PLATFORM LOGIC

    Digital activism is increasingly visual. Infographics, memes, and short-form videos condense complex arguments into shareable formats. I would collaborate with visual designers to ensure content is accessible and culturally resonant. However, activists must also work with platform logic. TikTok’s For You Page, for instance, privileges certain content forms, so understanding algorithmic preferences is crucial for visibility (Castillo Esparcia et al., 2023). Algorithmic literacy is therefore essential.

    If activists want their message to reach beyond the choir, they must adapt formats for speed, engagement, and emotional impact. Platform-specific strategies, such as duets on TikTok or carousel posts on Instagram, can increase reach.

    In this context, the aesthetics of activism matter. Studies have shown that colourful, emotionally resonant, and minimalist visuals are more likely to be reshared, especially among younger audiences (León & Bourgeois, 2020). As a movement grows, maintaining visual coherence therefore becomes a form of branding, essential to message recall and campaign longevity.

    CAUTION AGAINST CLICKTIVISM

    While digital tools lower entry barriers, they risk encouraging “clicktivism”—performative acts with minimal offline impact. To counter this, I would embed digital participation in a broader strategy including petitions, in-person protests, and policy lobbying. According to Tufekci (2013), sustained movements require a hybrid model blending digital reach with institutional pressure.

    In the UK, Extinction Rebellion has shown how online awareness must translate into roadblocks, court cases, and government negotiations. Clicks do not equal change, but they can be catalysts if strategically leveraged.

    COUNTERING MISINFORMATION AND BACKLASH

    Online activism faces backlash, trolling, and misinformation. Pre-emptive moderation strategies and content verification are essential. Creating alliances with credible influencers and using platform tools to report abuse can help mitigate risks. Transparency about funding, goals, and methods also builds credibility.

    For example, Fridays for Future ensures its messaging includes citations, partnerships with climate scientists, and frequent Q&As to educate new followers. Verified content pinned to profiles increases legitimacy and helps newcomers distinguish activism from disinformation.

    Another effective strategy is the co-creation of content with scientists, artists, and public intellectuals. This not only diversifies the campaign’s voice but also reduces the burden on activists to “educate” on every topic. As Christiansen (2009) highlights, coalition-building within movements enhances legitimacy and extends reach beyond activist circles into policy, media, and academia.

    CONCLUSION: DESIGNING RESILIENT ACTIVISM

    Digital media provide unprecedented tools for activism, but their power lies in strategic, thoughtful deployment. A successful campaign combines storytelling, participation, and hybrid tactics to create momentum and resilience. The aim is not virality for its own sake but meaningful, collective transformation.

    As ethical activists, we must not only ride the algorithmic wave but question who owns the surfboard. Sustaining change requires us to balance platform optimisation with values of transparency, inclusion, and justice. Only then can hashtags become headlines, and headlines become policies.

    Ultimately, the success of digital activism hinges on reflexivity: the ability to critique the very tools being used. Platforms are not neutral; they are governed by economic incentives, algorithmic bias, and uneven access. Sustainable activism therefore includes media literacy education for participants, equipping them not only to campaign but to understand and reshape the infrastructure they operate within (Bennett & Segerberg, 2012).

    References (Harvard Style):

    Bennett, W.L. and Segerberg, A. (2012) ‘The Logic of Connective Action: Digital media and the personalization of contentious politics’, Information, Communication & Society, 15(5), pp. 739–768. doi: 10.1080/1369118X.2012.670661.

    Bruns, A. and Burgess, J. (2011) ‘The use of Twitter hashtags in the formation of ad hoc publics’, Proceedings of the 6th European Consortium for Political Research (ECPR) General Conference, pp. 1–9.

    Castillo Esparcia, A., Caro Castaño, L. and Almansa-Martínez, A. (2023) ‘Evolution of digital activism on social media: opportunities and challenges’, Revista Latina de Comunicación Social, 81, pp. 191–208. doi: 10.4185/RLCS-2023-1575.

    Christiansen, J. (2009) ‘Four stages of social movements’, Theories of Social Movements, pp. 14–24.

    León, B. and Bourgeois, C. (2020) ‘Visualising climate change: An exploration of infographics and data visualisation for climate communication’, Journal of Science Communication, 19(6). doi: 10.22323/2.19060201.

    Ozkula, S.M. (2021) ‘What is digital activism anyway? Social constructions of the “digital” in contemporary activism’, Journal of Digital Social Research, 3(3), pp. 60–84. doi: 10.33621/jdsr.v3i3.91.

    Tufekci, Z. (2013) ‘Not this one: Social movements, the attention economy, and microcelebrity networked activism’, American Behavioral Scientist, 57(7), pp. 848–870. doi: 10.1177/0002764213479367.

    Vox (2021) How social media activism is changing the world [Video]. YouTube. Available at: https://www.youtube.com/watch?v=8KY7_runERg (Accessed: 29 May 2025).

    Word count: 994

  • blog2

    Shaped by the Feed: The Personal Costs of Data-Driven Marketing

    INTRODUCTION: FROM CONSUMERS TO TARGETS
    Data-driven digital advertising transforms how individuals are perceived and shaped online. Rather than neutral commerce, personalised ads leverage behaviour, demographics, and psychometrics to nudge decisions and reinforce identities (Nadler & McGuigan, 2021). This practice, often invisible, reinforces stereotypes, manipulates behaviour, and compromises autonomy. Users are no longer just audiences; they are algorithmically segmented subjects whose emotional vulnerabilities and cognitive shortcuts are monetised and exploited for profit (Zuboff, 2019; Büchi, 2021; Thomas & Docherty, 2021). This transformation reshapes not only how people shop but also how they construct their sense of self and social belonging.

    Platforms don’t just respond to user preferences; they predict and influence them. As our online choices are incorporated into machine learning systems, users become more predictable, reducing individual agency and reinforcing feedback loops that limit exploration.

    PERSONALISATION OR PROFILING?
    Algorithmic personalisation promises relevance. However, these systems rely on tracking everything from likes to scrolling speed, turning users into predictable behavioural profiles (Zuboff, 2019). Eubanks (2018) notes that such profiling reinforces inequalities when it categorises people by gender, race, or income. This is particularly visible in credit, housing, and job recruitment platforms, where automated decision-making has led to systemic exclusions (Eubanks, 2018; Angwin et al., 2016). When personalisation crosses into predictive profiling, it risks determining which options individuals even see, removing their ability to choose freely.

    STEREOTYPING THROUGH ALGORITHMS
    Studies demonstrate that ad delivery replicates existing biases. Ali et al. (2019) found that job ads for higher-paid positions are more likely to be shown to men. Facial recognition-based targeting can further racial discrimination, amplifying inequalities (Noble, 2018). This reflects what Noble calls ‘algorithmic oppression’, where the logic of optimisation embeds social prejudice into everyday digital interactions (Noble, 2018).

    Such profiling not only limits opportunity but also normalises discriminatory norms under the guise of efficiency.

    MICROTARGETING AND POLITICAL INFLUENCE
    Cambridge Analytica exemplified how psychological profiling manipulates voters through emotional targeting (Cadwalladr, 2018). These techniques undermine democratic processes by encouraging emotional reactions instead of rational deliberation. The logic of microtargeting is inherently asymmetrical: it grants campaigners granular insight into individuals while denying the public a shared informational ground. As a result, citizens are offered fragmented, polarised versions of political reality, often shaped by behavioural triggers rather than civic engagement (Bennett & Segerberg, 2012).

    EMOTIONAL EXPLOITATION AND FAKE NEWS
    Emotionally charged ads spread misinformation because engagement, not truth, drives algorithms. During COVID-19, anti-vaccine conspiracy theories proliferated because of their provocative nature, causing public harm (Frenkel et al., 2020). Emotionally manipulative content, such as fear-based messages or fabricated claims, enjoys algorithmic preference due to its virality (Pennycook & Rand, 2018). This undermines users’ epistemic agency, their ability to filter what is credible, by constantly prioritising affect over evidence.

    CONSENT WITHOUT CONTROL
    Consent becomes meaningless when dark patterns push users towards data sharing without informed understanding (Mathur et al., 2019). Power asymmetry prevents meaningful user control. Even when users are given options, they are often buried in lengthy policies or nudged through misleading interface designs. “Accept All” buttons are prominently placed, while opt-out mechanisms are hidden or require extra effort. This architecture exploits cognitive fatigue, not informed choice.

    For instance, cookie banners often use misleading design to nudge users into accepting tracking, a phenomenon termed “privacy Zuckering” (Brignull, 2020). Furthermore, the complexity of privacy policies leaves users unable to comprehend what they are agreeing to (Nouwens et al., 2020).

    RESISTANCE AND REGULATION
    Despite regulatory frameworks like the GDPR, enforcement is weak. Structural reform alongside digital literacy is required for meaningful resistance. Beyond institutional change, user-led resistance is emerging in the form of ad blockers, privacy-focused browsers, and ethical tech movements. While these actions are limited in scope, they signal a growing awareness that behavioural data should not be exploited unchecked.

    As Gillespie (2018) argues, platforms act as “custodians of the public sphere” yet operate under corporate imperatives that often conflict with public accountability. Without global coordination and transparent oversight, legal measures struggle to keep pace with technological innovation. Civil society and educators play a crucial role in equipping users to resist manipulation.

    CONCLUSION: AUTONOMY IN THE AGE OF ALGORITHMS
    Data-driven advertising shapes how we shop, vote, and think. Protecting autonomy demands critical examination and accountability from those who profit from shaping digital identities. While technology itself is not inherently harmful, its current implementation prioritises commercial gain over human dignity. Reclaiming digital autonomy involves not just resisting invasive advertising but reimagining a digital economy where users are citizens, not just data points.

    True reform requires not only corporate transparency but also participatory design approaches that foreground user agency. Until then, the digital marketplace remains an uneven battlefield, one where personal data is currency and attention is the product.

    References (Harvard Style):

    Ali, M., Sapiezynski, P., Bogen, M., Korolova, A., Mislove, A. and Rieke, A., 2019. Discrimination through optimization: How Facebook’s ad delivery can lead to skewed outcomes. Proceedings of the ACM on Human-Computer Interaction, 3(CSCW), pp.1–30.

    Angwin, J., Larson, J., Mattu, S. and Kirchner, L., 2016. Machine bias. ProPublica. Available at: https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing

    Bennett, W.L. and Segerberg, A., 2012. The logic of connective action. Information, Communication & Society, 15(5), pp.739–768.

    Brignull, H., 2020. Deceptive by Design: The Dark Patterns in Modern UI. Darkpatterns.org.

    Cadwalladr, C., 2018. The great British Brexit robbery: how our democracy was hijacked. The Guardian.

    Eubanks, V., 2018. Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. St. Martin’s Press.

    Frenkel, S., Alba, D. and Zhong, R., 2020. Surge of Virus Misinformation Stumps Facebook and Twitter. The New York Times.

    Gillespie, T., 2018. Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media. Yale University Press.

    Mathur, A., Acar, G., Friedman, M.G., Lucherini, E., Mayer, J., Chetty, M. and Narayanan, A., 2019. Dark patterns at scale: Findings from a crawl of 11K shopping websites. Proceedings of the ACM on Human-Computer Interaction, 3(CSCW), pp.1–32.

    Nadler, A. and McGuigan, L., 2021. An impulse to exploit: The behavioral turn in data-driven marketing. New Media & Society, 23(8), pp.2129–2148.

    Noble, S.U., 2018. Algorithms of Oppression: How Search Engines Reinforce Racism. NYU Press.

    Nouwens, M., Liccardi, I., Veale, M., Karger, D. and Sax, M., 2020. Dark patterns after the GDPR: Scraping consent pop-ups and demonstrating their influence. CHI Conference on Human Factors in Computing Systems, pp.1–13.

    Pennycook, G. and Rand, D.G., 2018. The Implied Truth Effect: Attaching Warnings to a Subset of Fake News Stories Increases Perceived Accuracy of Stories Without Warnings. Management Science, 66(11), pp.4944–4957.

    Zuboff, S., 2019. The Age of Surveillance Capitalism. PublicAffairs.

    Word count: 892
     

  • blog1

    Scrolling into Stress? Rethinking Social Media’s Impact on Well-Being

    INTRODUCTION: THE HIDDEN COST OF ALWAYS ONLINE

    Social media platforms promise connection and creativity, but are they quietly undermining our well-being? Digital well-being is defined as ‘the feeling of fulfilment, utility and control when using digital technologies’ (Büchi, 2021). This blog argues that the design and business models of social media platforms often work against users’ digital well-being, promoting addictive behaviours and exacerbating psychological stress.

    In a hyper-connected world where notifications are constant and feeds refresh endlessly, users feel pressure to remain visible online. This “always-on” culture leads to cognitive overload and a blurring of the line between rest and stimulation (Carmi et al., 2020). As more aspects of life move online, understanding how platforms shape these behaviours becomes crucial.

    PLATFORM DESIGN: BUILT TO BE ADDICTIVE

    Features such as infinite scrolling, push notifications, and auto-playing videos are purposely designed to maximise a user’s time on a platform. These are persuasive techniques that exploit cognitive biases (Eyal, 2014). For example, TikTok’s “For You” page constantly serves up emotionally charged content, reducing users’ ability to self-regulate their usage and increasing digital fatigue (Montag et al., 2021). Whilst these tools may increase engagement, they diminish users’ control over their digital habits.

    According to research by Alter (2017), the psychological mechanisms behind these designs resemble those used in gambling machines: reward loops, unpredictability, and anticipation. Users may intend to spend five minutes on an app but instead lose hours to content they never searched for. These manipulative designs are not flaws; they are features intended to maximise data collection and monetisation.

    These patterns are particularly concerning for younger users, who are developmentally more vulnerable to compulsive usage. Kuss and Griffiths (2017) argue that social media addiction is linked to anxiety, poor sleep, and academic underperformance, especially among adolescents. Their study shows that compulsive checking of notifications becomes a form of self-soothing, ironically worsening mental health over time. Design, then, is a matter of public health, not just usability.


    BUSINESS MODELS: SURVEILLANCE CAPITALISM

    Behind the friendly user interface is a system of data extraction. Platforms such as Instagram and Facebook monetise attention by collecting behavioural data to sell targeted advertising (Zuboff, 2019). This economic logic drives platform design choices that prioritise user retention over user well-being. Even features like ‘screen time reminders’ are optional, largely ineffective, and more PR than protection (Ranadive and Ginsberg, 2018).

    These models, often called surveillance capitalism, raise ethical concerns around informed consent. Most users are unaware of the extent to which their personal data is harvested, inferred, and commodified. This lack of transparency erodes trust and places users in a passive, extractable role rather than positioning them as autonomous participants in digital space (Hödl and Myrach, 2023).

    CONTENT AND EMOTIONAL HARM

    Social media can also expose users to toxic content, from body image ideals to misinformation. Visual platforms in particular contribute to a culture of comparison. Young women often compare themselves to idealised images, which can lower self-esteem and heighten anxiety (Fardouly et al., 2015). This psychological impact is amplified when platforms algorithmically promote engagement with such content.

    The phenomenon of doomscrolling, particularly amid global crises, intensifies negative emotions. Research undertaken during the COVID-19 pandemic demonstrated that extended exposure to upsetting content worsened anxiety, loneliness, and depressive symptoms, especially among young people (Gao et al., 2020). Despite this, algorithms persist in promoting such material because of its elevated engagement rates.

    In addition, the illusion of control created by curated feeds and algorithmic personalisation often deceives users into believing they are making conscious choices. Docherty (2021) warns that the automation of emotional experience on platforms, through targeted content that mimics empathy, can lead to false intimacy and digital manipulation. This has implications for our emotional autonomy and contributes to longer-term disconnection from genuine human interaction.

    SUPPORTIVE COMMUNITIES: A MULTIFACETED LANDSCAPE
    It is essential to acknowledge that platforms can also provide peer support, especially around mental health. Online communities offer spaces for affirmation and the exchange of experiences, potentially improving well-being (Naslund et al., 2016). Nonetheless, these beneficial uses often persist despite, rather than because of, platform incentives.

    Platforms like Reddit or Twitter host subcommunities where users share coping strategies or discuss struggles openly. Yet even these spaces face moderation challenges, with some communities hijacked by harmful advice or misinformation. Without robust safeguarding tools, such spaces can become double-edged swords, offering both help and harm.

    CONCLUSION: DIGITAL DESIGN REQUIRES TRANSFORMATION
    Although social media fosters friendships, its structural design frequently harms users’ emotional well-being. To advance authentic digital health, platforms must emphasise ethical design, transparency, and algorithmic accountability. Until then, users remain confined within systems engineered for profit rather than well-being.

    As users, we also have a voice. Through digital literacy, screen-time awareness, and platform pressure campaigns, we can influence how these systems develop. However, long-term solutions require regulatory intervention and ethical leadership from the tech industry. Only then will digital spaces become places of empowerment rather than exhaustion.

    References (Harvard Style)


    Alter, A. (2017). Irresistible: The Rise of Addictive Technology and the Business of Keeping Us Hooked. New York: Penguin Press.

    Büchi, M. (2021). Digital Wellbeing. Cambridge: Polity Press.

    Carmi, E., Yates, S., Lockley, E. and Pawluczuk, A. (2020). Digital Wellbeing: Developing a New Metric for the Digital Age. Palgrave Macmillan.

    Docherty, N. (2021). The automation of empathy in platform design. International Journal of Communication, 15, pp. 2730–2748. https://ijoc.org/index.php/ijoc/article/view/17721

    Eyal, N. (2014). Hooked: How to Build Habit-Forming Products. Penguin.

    Fardouly, J., Diedrichs, P.C., Vartanian, L.R. and Halliwell, E. (2015). Social comparisons on social media: The impact of Facebook on young women’s body image concerns and mood. Body Image, 13, pp.38–45.

    Gao, J., Zheng, P., Jia, Y. et al. (2020). Mental health problems and social media exposure during COVID-19 outbreak. PLoS ONE, 15(4), e0231924.

    Hödl, T. and Myrach, T. (2023). Content creators between platform control and user autonomy: the role of algorithms and revenue sharing. Business & Information Systems Engineering, 65(5), pp.497–519.

    Kuss, D.J. and Griffiths, M.D. (2017). Social networking sites and addiction: Ten lessons learned. International Journal of Environmental Research and Public Health, 14(3), 311.

    Montag, C., Lachmann, B., Herrlich, M. and Zweig, K. (2021). Digital detox: An effective solution or just another digital myth? Addictive Behaviors Reports, 13, 100339.

    Naslund, J.A., Aschbrenner, K.A., Marsch, L.A. and Bartels, S.J. (2016). The future of mental health care: peer-to-peer support and social media. Epidemiology and Psychiatric Sciences, 25(2), pp.113–122.

    Ranadive, A. and Ginsberg, D. (2018). New Tools to Manage Your Time on Facebook and Instagram. [online] Meta. Available at: https://about.fb.com/news/2018/08/manage-your-time/ (Accessed: 20 May 2025).

    Zuboff, S. (2019). The Age of Surveillance Capitalism. PublicAffairs.

    Word count: 819