The Male Gaze in Algorithmic Media

Algorithms, often regarded as impartial tools, are essentially sets of instructions programmed to process data. However, they inherit the biases of their creators and of the datasets they are trained on. This becomes problematic when those biases reproduce harmful stereotypes, such as the male gaze, in digital spaces. As Erving Goffman (1959) argues in The Presentation of Self in Everyday Life, the ways in which individuals present themselves in public are influenced by the expectations and norms imposed by society, a dynamic that is now reflected in how algorithms shape online identities.

Take social media platforms like TikTok, where algorithms dictate the content users see. These platforms rely on engagement metrics such as likes, shares, and comments to determine what is promoted. Research has shown that these algorithms tend to surface hypersexualized and idealized images of women, favoring narrow beauty standards, slim body types, and youthfulness. For example, a 2021 study by the Algorithmic Justice League found that TikTok's For You feed disproportionately promoted content featuring conventionally attractive women in revealing clothing, sidelining diverse creators who did not fit that mold. This phenomenon is consistent with the work of Miller (1995), who discussed how digital platforms encourage self-presentation that conforms to societal ideals, reinforcing visibility for those who align with these norms.
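
To make the mechanism concrete, here is a minimal sketch of engagement-based ranking. The post fields and weights are hypothetical, invented purely for illustration rather than drawn from TikTok's actual system, but they show how a feed that scores only on likes, shares, and comments will keep surfacing whatever already attracts interaction.

```python
# Hypothetical engagement-based ranker. The fields and weights are
# invented for illustration and do not describe any real platform.
from dataclasses import dataclass

@dataclass
class Post:
    creator: str
    likes: int
    shares: int
    comments: int

def engagement_score(post: Post) -> float:
    # A weighted sum of interaction counts: whatever already attracts
    # clicks outranks everything else, regardless of who made it.
    return 1.0 * post.likes + 3.0 * post.shares + 2.0 * post.comments

def rank_feed(posts: list[Post]) -> list[Post]:
    # Highest engagement first; nothing here rewards diversity.
    return sorted(posts, key=engagement_score, reverse=True)
```

Notice that nothing in `rank_feed` encodes a stereotype directly; the bias enters through which posts accumulate engagement in the first place.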

This bias is not accidental. It is the result of historical and cultural trends embedded in the datasets on which these algorithms are trained. Many image recognition systems are trained on datasets dominated by Eurocentric and male visual preferences, which reinforce a narrow view of beauty. Similarly, platforms optimize for engagement, and content that aligns with the male gaze performs better because it attracts more clicks and interactions. According to James Allen-Robertson (2016), the materiality of digital media, including the algorithms that drive it, reflects the broader societal structures and inequalities in which it is embedded.
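
The feedback loop described above can be sketched in a few lines. This toy simulation, with entirely invented numbers, gives one group of posts a small initial engagement advantage, promotes only the top half of the feed each round, and lets promoted posts earn more engagement; the gap compounds every round.

```python
# Toy feedback-loop simulation; every number is invented. One group of
# posts starts with a small engagement edge, the top half of the feed
# is promoted each round, and promoted posts earn more engagement.
conforming = [110.0] * 10   # posts matching the dominant aesthetic
diverse = [100.0] * 10      # posts that do not

for _ in range(20):
    cutoff = sorted(conforming + diverse, reverse=True)[9]
    conforming = [s * 1.10 if s >= cutoff else s for s in conforming]
    diverse = [s * 1.10 if s >= cutoff else s for s in diverse]

# The initial 10% advantage has compounded into a roughly 7x gap.
print(round(sum(conforming) / sum(diverse), 1))
```

Even under these generous assumptions, a 10% starting advantage becomes a sevenfold visibility gap, which is why small dataset biases matter so much once an engagement loop amplifies them.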

The implications of this bias are far-reaching. Women creators often feel pressured to conform to these narrow standards to gain visibility and grow their audiences. A 2020 survey of Instagram influencers revealed that 67% of female respondents felt compelled to post photos that align with these algorithmic preferences to maintain their follower count and brand collaborations. Miller (1995) explores how digital environments facilitate self-presentation that conforms to prevailing ideals of beauty, placing pressure on women to craft their online personas to fit algorithmic preferences.

The bias also impacts intersectional identities. Women of color, plus-size women, and non-binary individuals often face greater invisibility. For example, creators from marginalized communities have reported being shadowbanned, a practice in which their content is deprioritized or hidden by algorithms, often without explanation. This disproportionately affects posts about social justice, LGBTQ+ rights, or body positivity that challenge dominant norms. Allen-Robertson (2016) also highlights how algorithmic systems are designed around dominant cultural narratives, sidelining content that challenges or diverges from these norms.
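
Shadowbanning is straightforward to express mechanically. The sketch below is hypothetical: the flagged-topic list and penalty factor are invented for illustration and are not drawn from any platform's documentation. It silently multiplies a post's ranking score down when the post matches a flagged topic, and the creator receives no signal that anything has changed.

```python
# Hypothetical shadowban step; the flagged-topic list and the penalty
# factor are invented for illustration only.
FLAGGED_TOPICS = {"social justice", "lgbtq", "body positivity"}

def adjusted_score(score: float, topics: set[str]) -> float:
    # Silently downrank flagged content: no error, no notification,
    # the post simply stops surfacing in recommendations.
    if topics & FLAGGED_TOPICS:
        return score * 0.05
    return score
```

From the creator's side this is indistinguishable from simply being unpopular, which is part of why shadowbans are so difficult to document or contest.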

Ultimately, algorithmic bias is a cultural issue that perpetuates social inequalities, requiring efforts to diversify datasets, enforce ethical oversight, and prioritize inclusivity over engagement in tech development.

References 

Goffman, E. (1959) The Presentation of Self in Everyday Life.

Miller, H. (1995) The Presentation of Self in Electronic Life: Goffman on the Internet.

Allen-Robertson, J. (2016) The Materiality of Digital Media.
