Why Can't Black Women Just Be?: Black Femme Content Creators Navigating Algorithmic Monoliths

Gianna Williams, Khoury College, Northeastern University, Boston, Massachusetts, USA, williams.gia@northeastern.edu
Natalie Chen, Northeastern University, Boston, Massachusetts, USA, chen.nat@northeastern.edu
Michael Ann DeVito, Khoury College of Computer Sciences, Northeastern University, Boston, Massachusetts, USA, m.devito@northeastern.edu
Alexandra To, Art + Design and Khoury College of Computer Science, Northeastern University, Boston, Massachusetts, USA, a.to@northeastern.edu

Content creation allows many social media users to support themselves financially through creativity. The “creator economy” empowers individuals to create content (e.g., lifestyle, fitness, beauty) about their interests, hobbies, and daily lives. Social media platforms in turn moderate content (e.g., banning accounts, flagging and reporting videos) to create safer online communities. However, Black women, femme, and non-binary content creators have seen their content disproportionately suppressed, limiting their success on these platforms. In this paper, we investigate Black femme content creators’ (BFCCs’) theories about how their identities shape both how they create content and how that content is subsequently moderated. In our findings, we share the narrow perceptions to which participants felt the algorithm constrains Black creators. We build upon Critical Technocultural Discourse Analysis and algorithmic folk theories articulated by Black women and non-binary content creators to explore how Black joy can be prioritized online to resist algorithmic monoliths.

CCS Concepts: • Human-centered computing → Empirical studies in collaborative and social computing; • Social and professional topics → Race and ethnicity; • Social and professional topics → Gender

Keywords: Blackness and the Internet, Online Communities, Critical Algorithmic Studies, Black femmes, Content Creators, Algorithmic Folk Theory

ACM Reference Format:
Gianna Williams, Natalie Chen, Michael Ann DeVito, and Alexandra To. 2025. Why Can't Black Women Just Be?: Black Femme Content Creators Navigating Algorithmic Monoliths. In CHI Conference on Human Factors in Computing Systems (CHI '25), April 26--May 01, 2025, Yokohama, Japan. ACM, New York, NY, USA, 14 pages. https://doi.org/10.1145/3706598.3713842

1 Introduction

Social media has grown over the past ten years, with many people using it as a place to express their creativity and passions. TikTok, specifically, became popular in 2019 after the Chinese company ByteDance acquired the popular short-form video app Musical.ly, and during the COVID-19 pandemic in 2020 it grew into one of the most popular social media platforms. The platform came to rely on the creator economy, in which a content creator is defined as “someone who creates entertaining or educational material to be expressed through any medium or channel” [56]. Paul Saffo's early work on the creator economy argued that economic recession led people to find ways to commodify their daily activities [73]. Now, TikTok and other social media platforms create opportunities for creators through “creator funds” [62], which are allocated based on the creator's ability to meet the platform's requirements for followers and content views. This enables content creators to produce videos across a wide range of genres, empowering individuals to turn their hobbies and passions into sustainable online careers.

Machine learning recommendation systems shape the trends we see on social media platforms [83, 89]. These models are tuned using metadata gathered from users’ likes, comments, and hashtags [8, 89]. However, AI fairness research has revealed significant biases in these systems [15, 48, 66]. In particular, marginalized people have experienced inequities in the creator economy [41, 42, 50], including an increased risk of harassment, aggressive content moderation [39, 60, 79], and unequal pay [17]. These issues reflect a broader pattern of unequal treatment of marginalized groups by social media platforms.
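
To make the role of this metadata concrete, the sketch below shows one simple way engagement signals and hashtag overlap could be combined into a ranking score. It is a minimal illustration of the general idea, not TikTok's or any platform's actual system; all field names and weights are our own assumptions.

```python
# Minimal illustration (not any platform's actual system): combining engagement
# metadata and hashtag overlap into a single ranking score.
# Weights and field names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Video:
    creator: str
    hashtags: set
    likes: int = 0
    comments: int = 0
    shares: int = 0

def engagement_score(video, weights=(1.0, 2.0, 3.0)):
    """Weighted sum of engagement signals; costlier actions weigh more."""
    w_like, w_comment, w_share = weights
    return w_like * video.likes + w_comment * video.comments + w_share * video.shares

def rank_feed(videos, user_interests):
    """Boost videos whose hashtags overlap the user's inferred interests."""
    def score(v):
        overlap = len(v.hashtags & user_interests)
        return engagement_score(v) * (1 + overlap)
    return sorted(videos, key=score, reverse=True)

# Example: a video matching two of a user's interests can outrank a slightly
# more-liked video that matches none of them.
feed = rank_feed(
    [Video("a", {"beauty", "hair"}, likes=120), Video("b", {"fitness"}, likes=150)],
    user_interests={"beauty", "hair"},
)
```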

Algorithmic bias on social media platforms results in unfair and inaccurate predictions and classifications that affect marginalized users. This leads to over-moderation, increased online harassment, and mistreatment of marginalized content creators, who must work around these challenges to succeed in the creator economy [34]. BIPOC content creators, in particular, have voiced concerns about their experiences on TikTok [41]. Researchers are beginning to explore this phenomenon through interview studies with the online communities who make their living on these platforms.

In this study, we explored how Black women and non-binary content creators (hereafter referred to as femme1) use TikTok, highlighting their experiences with content moderation and investigating how their intersecting identities affect their experiences as content creators. We conducted a semi-structured interview study with 10 Black femme content creators (BFCCs). We use Black Feminist Theory, digital Black feminism, and algorithmic folk theory as lenses to interpret our participants’ experiences. The results illustrate how the platform places BFCCs into controlling narratives, as described by Patricia Hill Collins [43], and how these narratives extend Nadia Karizat's identity strainer algorithmic folk theory [47]. We conclude our analysis with a discussion of how designers and engineers can center Black joy in social media feeds, rather than restricting narratives to those that fail to fully represent the Black experience. We explored this through the research questions (RQs) below:

  • How do Black femme content creators perceive their experiences with content moderation on social media platforms?
  • How do Black femme content creators resist content moderation?
  • What folk theories do Black femme content creators hold about TikTok?

2 Related Work

In this section, we provide context for the significance of the content creator economy, including how creators are impacted by content moderation. We review prior research on the specific experiences of Black users on social media and examine how marginalized people more broadly use folk theories to make sense of their experiences online.

2.1 Content Creation

In tandem with the rise of social media, new pathways have emerged for creating video content for monetary gain [87]. In this section, we explore the history of content creation and what the creator economy affords content creators. Vlogging (video blogging), fashion, books, gaming, and other interests have allowed content creators to explore their personalities and interests. Content creation typically falls into two categories: (1) those who pursue it for full-time financial support and (2) those who use it casually for their own enjoyment [9]. Both options can grant financial compensation; however, those who take content creation seriously often create schedules to prioritize their content, investing their own time and money to maintain engagement. During the early 2010s, social media platforms such as YouTube and Instagram created an expansive market for influencers to advertise products, fashion, and ideas to adolescents [45], and this content tended to be popular with Gen-Z and millennial viewers. YouTube in particular became a popular platform for content due to its ability to reach a wide audience across various social media channels and its support for long-form videos. Crystal Abidin defines influencers as "everyday, ordinary internet users who accumulate a relatively large following on blogs and social media through the textual and visual narration of their personal lives and lifestyles," who monetize their activity through the promotion of commercial brands in their content [1]. Content creators differ from mainstream celebrities, who are established in other popular entertainment domains (such as modeling, music, or acting). The term ‘microcelebrity’ reflects the broadening of content creators’ audiences through social networking sites: a microcelebrity is a social media user with a relatively large following and a niche audience [58, 59]. Although they may not have the status of mainstream media celebrities, their fashion, beauty, and lifestyle content on social media serves a similar purpose. On TikTok, the creator fund is allocated to content creators who meet the platform's eligibility criteria, allowing everyday people to get paid for sharing their daily lives [62].

While being able to monetize different aspects of one's life can be rewarding, many researchers have reported negative effects associated with being a content creator. The stigma surrounding influencers often paints content creators as either self-absorbed or contributing to issues of over-consumption [84]. Additionally, influencers are vulnerable to harassment, unfair content moderation, and doxing. Brooke Duffy explores the psychological impacts on content creators who work within the creator economy [27], and more specifically the stress and boundary conflicts afforded by social media. She argues that these issues arise when creators try to differentiate their personal and professional lives online. Duffy indicates that the demands of increased social media engagement can lead to significant emotional labor, which in turn contributes to feelings of exhaustion and burnout among those who contribute to the creator economy. Researchers have specifically explored how LGBTQ+ content creator communities have been affected by these practices. For example, many marginalized content creators feel that they need to prove that their content is being moderated in order to justify their low engagement [23]. In our research, we examine the experiences of Black femme content creators to shed light on the growing issue of unfair content moderation while exploring their unique experiences as content creators. In addition, we explore TikTok through Black online communities to better understand the inequities affecting content creation in marginalized communities.

2.2 Black Experiences on Social Media

There exists a substantial body of digital humanities literature investigating the experiences of Black online communities on social media. Here we review Black experiences in online spaces and briefly introduce Black Feminist Theory and digital Black feminism as key frameworks for interpreting these experiences. To start, many digital humanities and race scholars have argued that race and ethnicity are inextricable from the "culture” of the Internet. For example, André Brock's work utilizes Critical Technocultural Discourse Analysis (CTDA) to explore the relationships of Black users on Black Twitter [11]. He argues that Black Twitter is a collective of Black users who come together in discourse bounded by shared cultural experiences. Brock and Sarah Florini explain that Black Twitter's "live tweeting” works as storytelling [11, 32], where hashtags like #ThanksgivingforBlackFamilies and #BlackBoyJoy create a digital common space of shared stories that are archival [13] due to the finiteness of the internet. This emergence of Black technoculture relates to the Black Feminist tradition of storytelling and signifying [43] through preservation. Black online communities have long played a significant role in digital humanities and in the culture, and even preservation, of the internet.

Online spaces also give way to new negotiations of Black identity, both providing space for identity formation and presenting new challenges in racial bias and oppression. For example, Maryann Erigha and Ashley Crooks-Allen researched Black girls’ use of online platforms to engage with identity formation, empowerment, and community through three digital campaigns: Black Girls Rock!, Well-Read Black Girl, and SayHerName [29]. In these campaigns, they show how online spaces enable Black girls to create care networks, assert their voices, and challenge dominant narratives [43] of race, gender, and media representation.

However, at the same time, the racial oppression of our offline world is often remade and perpetuated through digital systems [67]. In the most extreme, yet often still "everyday,” cases, Black online communities face persistent problems including racist harassment, infiltration of Black spaces, theft of Black cultural production, the digital commodification of Blackness, and excessive forms of content moderation. For example, the online recommendation systems that proliferate across social media reinforce racialized biases against Black users [38]. Camille Harris et al. used Twitter datasets of African American English (AAE) dialect and hate speech classifiers to investigate the fine-grained relationship between the specific language of AAE, such as word choice and grammatical features, and hate speech predictions. Harris's work pushes for more inclusive training data to reduce biases in automated content moderation tools, which disproportionately affect Black social media users [40]. Similarly, Mutale Nkonde argues that racial bias is embedded in facial recognition technologies, impacting Black communities. She explores how these systems are deployed in heavily surveilled urban areas such as Brooklyn, disproportionately misidentifying Black people due to the surveillance technologies’ reliance on biased datasets. Nkonde goes on to argue that this exacerbates racial profiling, over-policing in Black neighborhoods, and wrongful arrests [65]. Allison Koenecke et al. studied how racial disparities in mainstream, widely used automated speech recognition technologies produce significant performance gaps, particularly for AAE speakers [54]. Along with this, Tyler Musgrave et al. demonstrate that online harassment and unwarranted behavior toward Black users lead to distrust in social media platforms, resulting in negative relationships with self and others [64]. Gabriela Richard and Kishonna Gray explore how video games reproduce the social structures of race and gender, thus perpetuating exclusionary practices, particularly toward marginalized groups [72]. Hibby Thach et al. explore the often unseen processes of content moderation on Reddit, where moderation practices are disproportionately experienced by marginalized communities [79]. We seek to add to this extensive scholarship by exploring Black femme content creators’ experiences with content moderation on TikTok and the perceptions TikTok imposes on them.

We primarily utilize Black Feminist Theory (BFT) [18, 21, 43] and digital Black feminism [76] to position our work and understand its context. BFT engages the complexity of identity and incorporates an understanding of systems of power and pathways for resistance, common threads throughout the Black experiences online reviewed above. Specifically, we endorse and utilize tenets of BFT such as community building, resisting controlling narratives, and centering Black women's experiences [43], as well as complex and intersectional understandings of identity (i.e., how race, sexuality, and class oppression intersect in the identities of specifically marginalized peoples [18, 21]); our focus particularly attends to race and gender. More specifically, Patricia Hill Collins’ four domains of the matrix of domination illustrate how oppression operates through interconnected systems: structural domains, which organize and institutionalize oppression; disciplinary domains, which administer and enforce it; hegemonic domains, which sustain it through cultural and media narratives; and interpersonal domains, which manifest in individual experiences. Together, these domains reinforce the systemic marginalization of oppressed groups [43].

Catherine Knight Steele argues that the digital intersects with Black feminist tradition and practice through digital Black feminism [76]. Digital Black feminism prioritizes Black women's experiences, voices, and perspectives in digital spaces. It seeks to address the erasure and marginalization they often face in online discourse and to amplify their contributions (e.g., by recognizing trendsetting by Black creators on the internet [76]). Digital Black feminism seeks to confront and challenge online spaces where misogynoir (i.e., the confluence of racism and sexism specifically targeting Black women [3]) is perpetuated, advocating for platforms to address harassment, discrimination, and hate speech targeting Black women [76]. Digital Black feminism promotes the creation of supportive online communities where Black women can connect, share experiences, and build solidarity; these spaces are essential for fostering empowerment and mutual support [76]. Promoting the creation of diverse and accurate representations of Black women in digital content is also crucial, and involves challenging stereotypes, showcasing a variety of experiences, and fostering positive narratives that disrupt the idea that Blackness is monolithic [76]. Finally, digital Black feminism utilizes online platforms for activism and advocacy, leveraging social media and digital tools to raise awareness about issues impacting Black women and to mobilize for social change [76].

There remains a lack of exploration of the experiences of Black femme content creators specifically on TikTok, despite Black women content creators driving many of the trends on the platform [7, 37]. We see our work as a contribution toward challenging the online perceptions that affect Black femmes on social media, while also addressing the need for diverse representation of Black women online. Next, we investigate the relationship between critical algorithmic studies and marginalized communities.

2.3 Folk Theories and Marginalized Communities

Many scholars in critical algorithmic studies have introduced folk theorization as a way of understanding everyday users’ expectations and mental models of how digital platforms work [25, 30, 47, 60, 88]. More specifically, algorithmic folk theories help us interpret how users navigate and document perceived algorithmic unfairness and bias [88]. Brita Ytre-Arne and Hallvard Moe describe five main folk theories the average user believes about algorithms [88]:

  • Algorithms are confining - This belief holds that algorithms narrow the user's worldview by feeding them more and more of what the user is interested in [88].
  • Algorithms are practical - This belief puts algorithms in a positive light, where users rely on them to make technology more efficient and productive [88].
  • Algorithms are reductive - This belief holds that algorithms optimize towards stereotypes and make mistakes based on their lack of human experience and identity [88].
  • Algorithms are intangible - This belief is based on the notion that algorithms’ power is opaque and difficult to understand [88].
  • Algorithms are exploitative - This belief holds that algorithms use personal data that enables companies to sell users more products [88].

In the context of social media, algorithmic folk theories are used and created both by general users and content creators. TikTok, for example, uses recommender systems such as content-based filtering to generate their For You Page [8]. Investigations into the algorithm show that TikTok uses users’ metadata such as likes, comments, followers, and following counts to curate a user's For You Page [8]. Moreover, content creators are aware of this metadata collection and use and have strong folk theories or "assumptions” about how posting time, hashtags, comments, likes, and shares impact the likelihood their videos will be "pushed” to other users’ For You Page [53].
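
As a rough illustration of the content-based filtering idea described above (and not TikTok's proprietary pipeline), a recommender might represent each video and each user as a vector over tags inferred from this metadata and rank candidates by similarity. The tag vocabulary, data, and scoring below are assumptions made for the sake of the example.

```python
# Toy content-based filtering sketch: rank candidate videos for a "For You"-style
# feed by cosine similarity between a user's interest profile and each video's
# tag vector. The tag vocabulary is a hypothetical stand-in.
import numpy as np

TAGS = ["beauty", "fitness", "gradschool", "cosplay", "music"]

def tag_vector(tags):
    """One-hot encode a set of tags over the fixed vocabulary."""
    return np.array([1.0 if t in tags else 0.0 for t in TAGS])

def user_profile(liked_tag_sets):
    """Average the tag vectors of videos the user liked or commented on."""
    return np.mean([tag_vector(t) for t in liked_tag_sets], axis=0)

def cosine(a, b):
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b) / denom if denom else 0.0

def for_you_ranking(profile, candidates):
    """Rank candidate video ids by similarity of their tags to the user profile."""
    return sorted(candidates, key=lambda vid: cosine(profile, tag_vector(candidates[vid])), reverse=True)

# Example: a user who engages with beauty and music content sees those genres first.
profile = user_profile([{"beauty"}, {"beauty", "music"}])
print(for_you_ranking(profile, {"v1": {"fitness"}, "v2": {"beauty"}, "v3": {"music", "beauty"}}))
```

A pipeline like this also makes visible why creators theorize that early engagement "niches" their accounts: once a profile vector forms, similar content keeps getting reinforced.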

Folk theories are especially critical for marginalized users. We know that our offline world's systemic oppression is very often reproduced and remade through digital systems [67, 75]. For example, Michael Ann DeVito explains in her work that the unfair treatment of transfeminine hypervisibility online leads to adverse reactions on the TikTok platform, such as hate speech, harassment, and aggressive content moderation [24]. Motahhare Eslami et al.'s "eye of providence" theory argues that social media platforms, such as Facebook, act like panopticons2 that determine what content users see in their feeds [30]. Users believe that the platform watches everything, controls what is shown, and filters out or highlights content based on its own criteria; essentially, people see the platform as having total control over what appears in their social feeds. Nadia Karizat and colleagues developed "identity strainer theory," which argues that algorithms suppress certain identities while privileging others, allowing some identities to become prioritized [47]. Using this folk theory, Karizat et al. explore how social identities are suppressed.

2.4 Content Moderation

Throughout the growth of the internet, many instances of physical, emotional, and mental abuse have been perpetuated on social media [51, 61]. As a result, social media platforms have implemented content moderation to mitigate harm by removing content that is considered inappropriate [35]. Traditionally, companies have relied on human decision-making for content moderation, but this exposes moderators to excessive amounts of harmful content [55]. To reduce human contact with this content, social media platforms utilize machine learning models to classify harm [2]. However, HCI researchers have documented marginalized content creators reporting that these ML models moderate their content unfairly [35, 40].
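
For readers unfamiliar with how such classification works, the sketch below shows a bare-bones text classifier of the kind a platform might use to automate moderation; real systems are far larger and multimodal, and the toy data, features, and threshold here are our own assumptions rather than any platform's implementation.

```python
# Illustrative sketch of automated text moderation as a binary classifier,
# assuming a small labeled dataset of (text, is_violation) pairs.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny placeholder training data; a real system would use large labeled corpora.
texts = ["have a great day", "I will hurt you", "love this video", "you are worthless"]
labels = [0, 1, 0, 1]  # 1 = flagged as violating guidelines

moderation_model = make_pipeline(TfidfVectorizer(), LogisticRegression())
moderation_model.fit(texts, labels)

def should_flag(caption, threshold=0.5):
    """Flag content whose predicted violation probability exceeds the threshold."""
    return moderation_model.predict_proba([caption])[0][1] >= threshold
```

Crucially, whatever biases exist in the labeled training data are reproduced at the prediction step, which is one mechanism behind the disproportionate flagging of AAE and Black creators' content discussed above [40].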

Samuel Mayworm et al.'s work found that LGBTQIA users are regularly abused online, leading to negative perceptions of, and a lack of trust in, the social media platforms they use [60]. Many content removals directly contradicted the platforms’ stated goals of inclusion, safety, and freedom of self-expression [22]. Adding to this, Reina Gossett, Eric Stanley, and Johanna Burton delve into this issue in their book Trap Door by examining the visibility politics surrounding the transgender community [36]. The "Trap of the Visible" in transgender representation influences various aspects of transgender lives when such representation deviates from societal gender norms. Consequently, the representation of marginalized communities is often weaponized or their identities reduced to simplistic narratives. This has prompted discussions around the use of algorithmic folk theorization as a framework for evaluating the design of these platforms, as research shows that many marginalized communities do not trust social media platforms to treat them equitably, keep them safe from abuse, or fairly enforce content moderation policies.

Shadowbanning is another form of content moderation under scrutiny. Researchers in critical algorithmic studies have been working to better define shadowbanning. Content creators have defined it as a decrease in engagement on their profiles or the exclusion of their content from the recommendation algorithm, with the platform often using community guidelines as a justification for the suppression of content [20]. Throughout critical algorithmic research, content creators and everyday users have developed theories that support that definition. Our research contributes to the conversation on shadowbanning by exploring how content moderation affects Black femme content creators' relationships with social media.

3 Methods

Our research investigates Black femme content creators on TikTok to understand their perceptions of the algorithm and their experiences with content moderation and shadowbanning. We conducted semi-structured interviews with 10 of these content creators.

3.1 Recruitment

This project was approved by the authors’ Institutional Review Board. Once approved, we recruited participants from March 2024 to July 2024. We created a recruitment screening survey to filter applicants against the study criteria. We prioritized Black non-binary people and Black women who reported content moderation issues or potential shadowbanning. We used the term ‘femme’ to represent gender identities that include queer femininity [57], rather than adopting a cis-normative perspective.

We promoted the study by sending the survey to content creators from the authors’ personal TikTok following and using the TikTok API to filter through #BlackWomenTikTok and #BlackTikTok. We also promoted the study by searching Black content creators in the authors’ city using the TikTok search bar. Our last form of promotion was through social media platforms such as Instagram and TikTok. The majority of our success came from emailing content creators who listed their emails in their bios and incorporating snowball sampling techniques by asking participants to send our survey to others who would be interested in the study.

After participants filled out our recruitment survey, we selected interviewees according to our study criteria. Out of the 30 applicants, 10 were selected for interviews. The remaining 20 creators were not selected because they either had not experienced shadowbanning or had only started creating content within the last six months of the study; we wanted participants with more than six months of experience and a considerable following. We divided participants into two groups: those with 5k to 10k followers and those with more than 10k followers. We did this to ensure an even distribution of newer content creators who started within the past year and those with a more established content creation history.
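
As a minimal sketch of this screening step (with hypothetical field names standing in for our actual survey instrument), the eligibility and grouping logic described above could be expressed as:

```python
# Illustrative screening sketch; the record structure and field names are
# assumptions, not the actual survey data.
from dataclasses import dataclass

@dataclass
class Applicant:
    months_creating: int
    followers: int
    reported_moderation_issues: bool  # shadowbanning or other moderation concerns

def eligible(a):
    """Keep creators with more than six months of experience who reported
    content moderation issues or potential shadowbanning."""
    return a.months_creating > 6 and a.reported_moderation_issues

def follower_group(a):
    """Split eligible creators into the two follower bands used in the study."""
    return "5k-10k" if a.followers <= 10_000 else "10k+"
```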

After completing the survey screening, participants scheduled a one-hour interview. Upon completion, they received a $20 gift card.

3.2 Demographics

We conducted semi-structured interviews with 10 participants from various locations across the United States. Table 1 presents the demographics of our participants.

Table 1: Participant demographics and information including the TikTok genre the participants felt best represented their content and their follower count as of July 2024.
Participant TikTok Genre Follower Count Gender
1 Grad School TikTok 16.5K Female
2 Lifestyle TikTok 541 Cis-Female
3 Grad School TikTok 20.1K Woman
4 Fitness TikTok 22.4K Female
5 Sports/Lifestyle TikTok 19.6K Female
6 Current Events TikTok 94.6K Female
7 Beauty, Fashion and Hair TikTok 41.1K Female
8 Fashion TikTok 11.6K Non-Binary
9 Cosplay TikTok 35.8K Female
10 Music TikTok 8.2K Female

3.3 Interview Procedure

All interviews were conducted via Zoom video conferencing with only audio recordings retained. All recordings were stored in a password-protected folder and only accessible to the research team. Once interviewees consented to participate in the study, they were asked questions in five high-level categories: Content creator's account and audience dynamics, growth and performance, content moderation, identity and TikTok content, and future on the platform. Below, we define these five categories in more detail.

3.3.1 Content Creator Account and Audience Dynamics. This section focused on understanding the participant's TikTok account. Questions addressed who their target audiences were on TikTok. For example:

  • How would you describe the content you make online and the audiences you tend to stray away from when creating content?

3.3.2 Growth and Performance. This section focused on the strategies the content creator used to engage with their audience and grow a following. These questions were asked to address our interest in the folk theories used by these Black femme content creators. For example:

  • In what ways do you try to get users to your account?
  • What kind of content do you enjoy making the most?

3.3.3 Content Moderation. Content moderation was a central focus of our study on Black women and non-binary content creators. To address this, we asked participants about their experiences with content moderation. These questions helped us understand how content creators mitigate and resist content moderation. For example:

  • In your experience, do you believe content moderation systems have treated you fairly, or unfairly?
  • Are you familiar with shadowbanning? Could you give me your definition of it?

3.3.4 Identity and TikTok Content. This vital section deepens our understanding of Black femmes on TikTok and contributes to research in digital Black studies. For example:

  • How does your personal identity shape your experience as a creator on TikTok compared to others in your genre?
  • Do you find yourself expressing certain aspects of your identity more prominently than others?

3.3.5 Future on the Platform. Our final section examined participants’ future content creation strategies. Given the potential for TikTok to be banned in the United States in 2024, we explored how content creators were mitigating the risk of losing their content.

3.4 Data Analysis

We used reflexive thematic analysis to analyze our interview data [10, 14] with a mixed inductive and deductive coding approach. Our deductive analysis focused on content moderation experiences, resistance, and folk theories among Black femme content creators on TikTok, while inductive coding revealed broader experiences.

During the interview stage of the study, the first author shared summaries at weekly meetings for the team to review data, refine the approach, and discuss relevant themes. Once data collection was complete, we used Otter.ai to transcribe, copyedit, and clean our interviews. To address the potential biases of transcription AI services [69], the first author carefully reviewed the transcriptions generated against the original recordings. This additional step allowed us to identify and correct any misrepresented words or phrases, ensuring accuracy for our findings. Then the first author used the Atlas.ti software to manually open code all of the interviews. Next, the second author was invited in to further open code three of the interviews to lend a new perspective on the data and further develop themes. The last two authors reviewed some data informally to provide additional perspectives, especially where they had particular topic expertise or experience related to the creators’ content. Throughout the process, we discussed codes and themes in weekly research meetings. We iteratively developed the themes alongside data extracts until our findings were coherent, consistent, and distinctive.

4 Findings

Here we present the results of our study. We first show how Black femme content creators feel they are perceived by social feeds, then present the folk theories BFCCs hold about the TikTok platform. Next, we explain how content moderation played out for these content creators, and we conclude by highlighting the importance of community as a common need for Black femme content creators on the platform.

4.1 Folk Theories for Black femme content creators

This section addresses the theories Black femme content creators held about TikTok. We present how participants perceived themselves online as well as how they felt algorithms perceived them. We then present the emotional and invisible labor that participants felt was associated with being a Black femme content creator.

4.1.1 Sexuality Online. Participants alluded to their bodies being hypersexualized on TikTok. For example, a beauty content creator in our study commented that her dancing videos were flagged as ‘inappropriate’ by TikTok. She went on to say, “I'm Caribbean, in our culture, this is how we dance. But I also wasn't doing anything inappropriate. I was fully dressed and covered” (P07). This participant was confused as to why the algorithm flagged this content as inappropriate when it was so familiar to her. Other participants described being hyperaware of what they wear and how they look. One of the youngest participants said, “I'm not trying to attract, like, the weird guys... I kind of hate that side of TikTok when I see those comments” (P05). The comments in question mostly concerned her body and not the content she was promoting about her life as a student athlete. From this we observed that participants think critically before they post due to concerns about how the audience might perceive their bodies, whether the attention is warranted or unwarranted. P05 continued to make content about her life as a student athlete, but she noticed much of her content being fed to the “male gaze” unintentionally.

Another participant had two accounts, a family-friendly cosplay account and an adults-only cosplay account. She spoke about her sexuality being misrepresented by the algorithm, explaining that on her family-friendly account, her audience is primarily white women who tend to flag her content as inappropriate. “White women will just tag your content as dangerous on a whim,” she says (P09). She goes on to say that white women specifically label her family-friendly content as inappropriate due to "nudity," even though all of her content is family-friendly. She says, “it's disheartening that when you try to be family-friendly, and people see your mere existence as a problem” (P09). This was striking given that her adult content is more sexual: she explains that on her 18+ account, her audience is primarily male and less inclined to moderate her content. She then goes on to express her frustration at being a Black content creator who wants to embrace her sexuality with her 18+ content: “As a Black person, how hard it [expressing your sexuality] can be, when, you know, the Jezebel trope tends to pop up... I don't want people's husbands. I don't care. I just want the right to be a sexual person. A sexual adult” (P09). This contrasts with P05’s frustration about not wanting to look too sexual: “I don't want to thirst traps. But it's also hard because you can't control, how people perceive things” (P05). These findings highlight the complicated tension between avoiding negative perceptions on social media and the difficulty of maintaining full agency over one's perception online.

4.1.2 Diversify Representation of Blackness Online. In our study we observed a concurrent theme of participants being frustrated by the representation of Blackness on TikTok. A participant who defined herself as a “binge TikToker” (P03) says, “I think I'm constantly in stages of rebranding myself... I'll binge on TikTok for a month and then not be on it at all. And I think it's because I kind of changed so much that, I don't like the fact that there's a reminder, of who I was a month ago” (P03). The feeling of being confined to an aesthetic or niche within the algorithm was a common frustration among our participants. P02 theorized that for Black women to become popular on TikTok, they must come from some sort of controversy or drama; she goes on to say that this is what social media for Black women is built on. She describes her point through what she sees on beauty TikTok, where somebody is "making fun of a Black girl" (P02) either by criticizing her makeup or her life choices. She goes on to say that social media “likes to put us all in one box of a type of Black women” (P02) and that she is happy there has been a rise in more diverse representations of Black women online. The nicheness of their content is what many participants said brought joy to their work. A graduate school content creator explores this further by saying that doing "day in the life" content as a Black graduate student is easier content to film, because much graduate school content is affiliated with “Black excellence TikTok" (P03). However, she notes that the idea of Black excellence being pushed by the algorithm can be harmful because it can “isolate Black people or people of color who just want to be a person” (P03). This highlights the nuanced tension between celebrating Black excellence and the potential harm of reducing diverse identities to a single narrative, ultimately limiting authentic expressions of Blackness on platforms like TikTok.

Appearance also plays a role in this conversation, as participants expressed a desire not to be restricted to a one-size-fits-all portrayal of Black content creators. One participant explained the beauty standards associated with Black women on TikTok, defining “TikTok Pretty” (P05) as someone with clear skin, a tight bun, and a “clean girl” aesthetic. However, she observed that some of the aesthetics afforded to the “clean girl” aesthetic are very much connected to white women, “because when you're Black, and you have braids in your head, it's not like a slicked back bun clean girl look” (P05). She makes the observation that the clean girl aesthetic for white women is the "Black Barbie aesthetic" (P05) for Black women, providing an example: “I notice like when I have lashes, like I used to wear lashes a lot, I would get more engagement, I would get more followers, I would get more likes, just because I was adhering to like that beauty standard. Black Barbie Girl beauty standards like lashes, hair done, nails done, like everything done, basically. You know?” (P05). Another participant also notes this, saying, “I notice it [content] does better depending on, like, how I looked in the video. So like, if it's like I'm in a different background, or like my hair is especially done, like my lashes are freshly done, those videos tend to do better than when everything is not fully done” (P06). It is important to note that neither of these participants made beauty content on their accounts, yet both still needed to adhere to specific beauty standards to stay relevant online.

When asking participants what their content would look like if they didn't feel the pressure of the algorithm, many said that they would just want to talk freely or post “weirder things” (P08). One participant answered this question by saying that she is hesitant to showcase some of her hobbies because she knows that her interests—coffee and yoga—are widely “dominated by like, white boys in Brooklyn... and that yoga content has to like look a very specific way. And I'm kind of intimidated by that” (P03). In summary, although our study included a diverse pool of content creators, many felt that the ways they are perceived by the algorithm are very much the same.

4.1.3 The importance of Metadata and Consistency. Participants were very passionate about how to reach higher engagement. Throughout our interviews we observed a trend among Black femme content creators: they used common folk theories about hashtags, captions and trending sounds to stay relevant on TikTok. For example, one participant said “Yes, hashtags, but also, when you caption the video. There's a kind of a lot of things. I think about you wanting to immediately grab people's attention so you can't say too much” (P05). Some participants tended to use Black hashtags such as #BlackGirlTikTok, #BlackTikTok, to reach a Black audience. We noticed a split within our participants in using this strategy to gain more engagement. Some chose not to use Black hashtags in order to avoid being restricted on the platform and to reach a broader audience. For example, one participant expressed this point by saying “I don't typically use, like, Black hashtags, because I want my content to be served to everybody and them seeing me as Black doing it [her content] more as normality than an out the box thing” (P06).

Another trend emphasized by our participants for staying relevant was the need for consistency. When asked in what ways they try to attract people to their profile, one participant said, “So the first thing that you have to do is be consistent, which I haven't been great with. But yeah, you have to post regularly, at least once a day, maybe twice a day, if you're really churning out videos” (P05). This trend of posting regularly and creating a schedule was common among participants. A fitness content creator added to this theory by highlighting the importance of “just posting consistently, and always replying to comments, and being really engaged with my followers, because I like to consider the people that follow me as my friends because we kind of live a similar lifestyle” (P04). These responses suggest that content creators tend to rely on posting continually to be prioritized by the algorithm. Many agreed that, although consistency is important, it can become tiring and overwhelming when combined with their other demanding responsibilities outside of content creation. For example, when asked what video she would make if she could make any video without thinking about the algorithm or her audience, a content creator who dedicates her content to Black healing said the following: “Yeah, I would still post about the same thing. But I think I would just not be as stressed about the algorithm and timing. And what caption to post" (P07). She goes on to explain the questions she asks herself before posting: "Should I wait to post it at this time? Is the morning better?... I feel all of that creates this anxiety around posting” (P07). Here we see the negative effects that the need for consistency has on content creators; there are many factors involved in creating content beyond just posting a video.

4.1.4 Authenticity. The need to be authentic online was another common trend our participants reported when discussing how to improve engagement. Our participants came from a wide range of genres, but many believed their authenticity on the platform led to their success. One participant said, “I'm the same person online than as I am in real life. And I don't plan on ever trying to shift my personality or who I am just for more engagement... I feel like building your platform up slowly is better than quick followers... I feel like a lot of people like me just because I'm myself. So I don't want to change that. Because at the end of the day, once my content does get pushed to more people, I know that more people follow me because they like how genuine I am” (P04). In this interview, the participant expressed that because she did not change her personality for views, more people felt that her content was genuine. Another participant expressed a similar view, saying, “I feel that the message that I show every time I post I'm fully myself in every room and I think when people see me it's the same thing” (P07). Throughout the interviews, we observed that many participants who felt this way also felt more joyful about the platform, because they gain followers by being authentic rather than by putting on a personality that would appeal to the algorithm. During a conversation about the different platforms content creators have at their disposal, a participant expressed how the nature of TikTok can make it harder to be authentic. However, she says that on YouTube, "you see the most authentic version of me because it's just not cut down as much as it is on TikTok or Instagram” (P04). This sentiment emerged often when participants were asked what they would post if they didn't have to prioritize the algorithm. The common response was that they would speak freely or simply talk without needing to conform to a certain niche. "I would love to make more content where I'm just talking about what's going on today, my random thoughts, my weird kind of skits and things," says one participant (P10). She adds that she doesn't participate in that kind of content because the platform will "niche you really fast" (P10).

4.1.5 Appearance. Another common theme was the importance of beauty standards for increasing engagement on the platform. Participants noted that certain looks and aesthetics afforded them higher views. A participant whose main form of content was showcasing her life in grad school said, “I was convinced that some of my videos were doing better when I had, straight hair or I had wigs on versus, when I had braids, or I had my hair in, more coily styles. And I think that that's something that's just, under the hood... Because, you know, I don't always wear makeup... until I put makeup on or make sure my hair looks good before I get on the camera” (P01). This quote highlights the frustration of needing to think about appearance before even getting in front of the camera, as well as the common theme of how one's appearance affects engagement. This was also evident when a participant shared her experience of disliking a certain look on herself but noticing that her followers loved it: “And, I used to only get blonde braids, I've kind of, just let myself explore a different kind of look. I also notice, my followers just love that look. They just love the blonde hair and the lashes look, like they just love it” (P05). She goes on to say that the "lashes look" aligns with the Black Beauty Barbie aesthetic mentioned earlier in our findings; however, the aesthetics her followers found desirable are not common natural aesthetics for Black women (i.e., blonde hair and long lashes).

4.2 Content Moderation

Overmoderation of participants’ content was observed consistently throughout our study. In this section, we report how participants felt about their content being moderated. We split this section into two parts: the idea that their looks and ideas contribute to the heightened moderation they experience compared to non-Black femme content creators, and the ambiguity surrounding why their content was moderated.

4.2.1 Looks and Ideas. Participants’ appearances came up often within the context of how they feel perceived by the platform. We also observed this theme when asking participants about their experiences with content moderation. Before asking participants about their own experiences with being moderated by the platform, we asked them to define shadowbanning. Many of the participants we interviewed believed that shadowbanning was either a suppression of resources based on not abiding by community guidelines or an underhanded way of trying to get someone off the platform. "I guess shadowbanning, to me, is the way social media companies will kind of passive aggressively ban content for creators without necessarily kicking them off of the platform" (P03). As this conversation progressed, BFCCs expressed that the guidelines they believed led to their over-moderation or shadowbanning were hard to pin down due to the lack of transparency in accessing the community guidelines. Participants speculated that their content was overmoderated or shadowbanned because of their appearance. One participant even imagined what her profile would be like if she didn't reveal her Blackness, referencing another creator who only shows their hands covered in gloves (P10).

Participants in our study stated that the platform seemed to favor a certain look or to prioritize "TikTok Pretty" for Black femmes, which they defined as having their hair freshly styled in box braids, a silk press3, or extensions, and wearing makeup. Although this "Black Barbie" aesthetic worked for some BFCCs, participants noticed that engaging with it did not circumvent the effects of overmoderation and shadowbanning. A beauty content creator who made content to uplift Black women's natural features noticed that content involving her dancing was more closely scrutinized by the platform than her other content: "Once in a while, I'll post videos of me dancing when I'm in the mood, and they flagged it as inappropriate. And I'm like, people are d*mn near twerking on TikTok and they're not flagged. They're going viral" (P07). Here, we see how this content creator is frustrated by the over-moderation of her content compared to others'. Another participant said she stopped going live on her account to reduce the over-moderation and false reporting of her content: "I don't go live anymore. Because every time I go live when I'm working or something, there will be people who come in, and they will ban my live. Saying I am nude, or I'm doing something inappropriate, you know? And it's mainly white women, because my family friendly account is like 75% white women" (P09). This highlights how content moderation is shaped not only by algorithms but also by platform users, contributing to the broader conversation about what it means to be a Black femme content creator whose Blackness is hypervisible.

4.2.2 Ambiguity. When asked about their experiences with content moderation, participants highlighted a common theme of overmoderation of their content. The participants experienced unfair treatment from the platform's community guideline procedures. Throughout the interviews, participants expressed frustration and confusion as to why their content was moderated. A participant noted, “Literally, one of my videos just gets taken down. And I'm just like, I literally, did nothing... There's actually no reason for you to be like, this violated community guidelines or whatever. It just doesn't make sense, which is why I have tried to be more careful” (P05). This participant attempted to find out why her content was taken down, but she could not find a reason. This matters because TikTok does not provide reasons for moderating content, and doing so could greatly help users prevent their accounts from being taken down. Another participant expressed the hurt she feels at not knowing why her content isn't being pushed or is being moderated on the platform: “Sometimes you kind of get impostor syndrome, because sometimes you think, I feel like this video was good... why didn't it get pushed more? So that's when I'm like, there's something wrong with the algorithm, because there's so much engagement” (P04). These findings open a conversation about the effect that the lack of transparency in content moderation has on Black femme content creators.

4.3 Community

The importance of building a collaborative community online was frequently mentioned by Black femme content creators as a driving force for creating content. This theme emerged in response to questions about why these creators stayed on the platform and what brings them joy as creators. When asked how she measures success on the platform, one participant said, “I would say that kind of how I measure my success is through like the interpersonal relationships that I'm able to cultivate” (P05). She went on to explain that much of her success in growing a community online came from her followers personally reaching out to her for advice on graduate school applications. This relationship between the content creator and her followers was interesting because it extended beyond online interaction into real life. Another participant noted that she never really thought about leaving TikTok, going on to say, “even if I take a break. I always go back just because, I kind of feel like it's my little family. You know? I actually care for my followers” (P05). This participant, like many others, also noted that she tended to use the comment section to keep track of her online community.

Participants who had been on the platform for over two years expressed how hard it is to build community online. One said that “The hardest part when you start is building a community. First of all is figuring out exactly like what they want to see from you” (P05). Here, we see how the content creator’s understanding of her followers helped form a strong community. This was common for many of the content creators because of their under-representation in the content they create. A fitness content creator who had been making content since 2019 said, “I really believe in the building the community part, that's so important to me, because at the end of the day, maybe my content will switch up in a few years, and I want to explore something else, you know, but if I have people that follow me, because they just like who I am as a person, they'll ride with me for anything, you know, instead of, somebody that just followed me because they saw my posts on my 10 pound weight loss” (P04). The theme of authenticity and community building is apparent in this content creator’s observation: over her time building a following, her followers’ tendency to see who she was as a person beyond her fitness content went hand in hand with a community that appreciated her for herself.

We also see aspects of community in the ways participants navigated shadowbanning or overmoderation. When defining shadowbanning, participants described it as a passive-aggressive way of suppressing one's content. To counteract this tactic, BFCCs relied on their followers and friends in real life to check whether their content was being banned. One participant describes this strategy: "So I've asked my friends, like my other content creator friends to go on my page, and they don't see it [my content]. So that, to me, is shadowbanning, like some of my, users can see it, some people can't, even if they look me up" (P10). In this example, the content creator uses her network of other content creators to determine whether her content has been moderated or shadowbanned. P10 also noted that TikTok designed a campaign to give Black content creators a platform to report instances of overmoderation; however, the campaign's effectiveness was limited because it primarily benefited creators with larger followings, leaving those with smaller audiences with less influence and visibility.

5 Discussion

5.1 Algorithmic Confinement in Black femme Content Creators

“Everyone coming of age in the digital era has practiced this online performance of self. But Black women considered deviant and “other” in American society had extra practice in navigating their sense of self in stark contrast to societal expectations.” [76]

Our participants want freedom from the constraints imposed by the algorithm. They describe TikTok as a product of our larger societal inability to view Blackness as complex and diverse, which prefers instead to hold onto stereotypes of race, gender, and class and to view anyone outside of those stereotypes as anomalous. We connect these sentiments to André Brock's work in Critical Technocultural Discourse Analysis (CTDA) [12, 13] and, more specifically, to the notion of the “libidinal economy” [12, 13, 63, 86]. CTDA investigates the internet and online behaviors with a critical perspective on culture and has been increasingly used in HCI research (e.g., [16, 70, 81]). This framework draws focus away from the Western deficit perspective on minority technology use and instead emphasizes the knowledge and experiences of underrepresented groups of technology users. Following Brock, a libidinal economy emphasizes “the role of emotional and psychological intensities in driving anti-Blackness, rather than the more rationalist models of human behavior derived from political-economic approaches” [78]. Brock argues that the constant portrayal of Black suffering that is pervasive on the internet and in other forms of media reinforces the anti-Blackness we see today. Social media algorithms amplify images of Black suffering, thus perpetuating a narrow, one-dimensional, monolithic view of Blackness. Our participants clearly echo this sentiment, as they face challenges when their content does not fulfill monolithic representations. The datafication of injustices described by Ruha Benjamin normalizes the negative viewing of Blackness on the internet [4]. In this way, it is not just the users, but the algorithm on TikTok, optimizing Black femme content creators toward these confining margins. The TikTok algorithm prioritizes digital narratives [76] of harmful stereotypes of Blackness rather than optimizing toward joy and participants’ own nicheness.

Monolithic views of Blackness are prevalent on TikTok and most social media platforms [31]. Those with more algorithmic privilege [47] can be multidimensional on the internet. For example, TikTok facilitates microtrends [6] (e.g., Fleabag Aesthetic, Clean Girl Aesthetic, BarbieCore, and Coquette). Both content creators and general users can display different sides of their personality by engaging with microtrends across different posts, including different outfits, styles, sounds, and other creative choices. However, these microtrends, at best, only materially benefit cis white women and, at worst, explicitly exclude Black femmes. While on the surface microtrends are just a new form of social engagement online, such trends have always seriously impacted the public's identity exploration, formation, and feelings of representation and belonging [6, 26].

Reina Gossett, Eric Stanley, and Johanna Burton explore identity formation in their book Trap Door, where they discuss visibility politics for the transgender community [36]. The ’Trap of the Visible’ for transgender representation affects aspects of transgender lives when that representation does not fit into societal norms of gender. As a result of not fitting into these norms, marginalized communities’ representation is weaponized and/or their identities flattened. Identity flattening, coined by DeVito, is the minimizing of someone's identity, not allowing them to be multifaceted [24]. In the context of social media, the identities of Black femme content creators are confined to monoliths, preventing their content from being represented in the micro-trends afforded to those who are more algorithmically privileged. Black femme content creators’ visibility is scrutinized if they do not fall into the monoliths the algorithms have designed for them. Our participants’ "trendy" content was actively suppressed, and when it was not, they faced repeated rebuffs (e.g., surprise at Black femmes’ interest in the trend, feedback that they don't actually fit the trend, etc.). For social media algorithms to advertise the freedom to be oneself while quietly selecting a few to promote speaks to Ruha Benjamin's argument that the datafication of Black pain shapes how Blackness is seen through the internet [4]. This cycle can become dangerous due to social media's influence on identity formation, which affects Black online communities’ perception of self and can affect young Black users who are highly influenced by the internet.

In addition to CTDA, we observed participants’ experiences through their resistance to monoliths. Black feminist scholars have discussed the importance of pluralism when explaining Black people's experiences within the everyday world. In our study, participants expressed their intersectional identities through the nicheness of their content. They described leaning into their unique interests and experiences through their content and audience curation. They embody Patricia Hill Collins's emphasis on the multiplicity of Black women's experiences [43]. Collins argues against the notion of a single, universal experience of Black women, instead advocating for the recognition of the diverse ways in which Black women live and resist oppression.

bell hooks describes self-actualization as “aware[ness] of oneself and the influence one has in the transformation of the lives of others through the construction and dissemination of knowledge” [44]. We observe Black femme content creators self-actualizing via their online platforms by embracing their talents, interests, and lifestyles free of the assumptions perpetuated by racism. However, BFCCs' identities tend to be reduced by social media through the stereotypes of the jezebel4, mammy5, or sapphire6, making it hard to reach self-actualization. We argue that the algorithm encourages this rhetoric [85].

Computer scientists and HCI researchers alike have already found that algorithms are biased: they are not purely logical and rational, and instead perpetuate racial, gender, and other identity-based discrimination (e.g., [40, 66, 67]). Black femme content creators specifically navigate an algorithm that prioritizes the “Black Barbie Aesthetic” (P05) or a certain type of Black woman who is exceptional in what she does (P02, P03). Through Collins's matrix of domination, we see how social media algorithms hold these domains of algorithmic power by organizing, sustaining, enforcing, and manifesting aesthetics, thoughts, and trends on the internet and deciding who can be the face of them.

This networked circulation of what a Black femme content creator is supposed to be echoes Brock's argument of “racism without racists,” in which imagery of anti-Blackness is distributed via social media algorithms rather than through any individual person perpetuating the racism. Through this monolithic view of Black femme content creators online, pluralism and diversity are hindered. However, despite the algorithm prioritizing pain over joy, we see content creators resisting by building tight-knit communities of care among other content creators and their followers.

5.2 Hypervisibility of Blackness Necessitates Hypervigilance in Content Creation

Identity was formed both around participants' day-to-day lifestyles and around their role as Black content creators. However, our participants speculated that the algorithm made their physical appearance and Blackness hypervisible to everyday users. This contextualizes their over-moderation and shadowbanning, sparking conversation around what gets views and what does not. For example, P05 felt frustration with her predominantly male audience and felt very sexualized on the platform. She believed that, as a Black woman, her body was more sexualized than most due to the stereotypes associated with Black women's bodies. While this participant aimed to share content that would resonate with those interested in her journey as a student athlete and contribute to the representation of Black girls in higher education, she felt that much of her content unintentionally catered to the male gaze. At times, she even felt pressured to lean into the male gaze to gain more views.

Other participants noted how their appearance was made incredibly visible by the algorithm with respect to either their hair or their clothing. A non-binary content creator expressed that they tended to “play up femme” (P08) because they noticed that their content would be promoted more on the platform if they adhered to traditional forms of femininity. They later went on to say that offline they present more androgynously. Another participant, who created content about graduate school, resisted the “Black excellence” narrative she saw being pushed on TikTok for Black content creators pursuing post-graduate education. Building on this, we believe the Black exceptionalism narrative further complicates the experience for Black creators, as it imposes expectations of what it means to be a "good" Black person online. This leads us to ask: what is a "good" Black content creator in the context of social media feeds? Upward Black mobility, both online and offline, is indicative of Black people attaining positions of high power that disrupt negative perceptions of Blackness. However, these "negative" perceptions are applied to those who do not follow the social norms of success imposed by white hegemonic capitalist power structures, which results in the othering of Black people who are perceived as "different" or "ghetto". Scholars have explored the damage these narratives can create, as they push the idea that to be Black and successful, you must be extraordinary [19, 71]. The participants in our study experienced frustration regarding how they think the algorithm perceives them. The central theme that emerged from our study was: Why can't Black women just be? (P02, P03, P04, P05) We observed that the offline pressures and perceptions of what it means to be a successful Black woman have now extended to the online space, causing Black femme content creators to be acutely conscious of how they present themselves, what they say, and what they do before posting.

Our results regarding algorithmic folk theories for Black femme content creators reflect past research on folk theories of social media algorithms, such as Karizat's strainer theory [47] as well as Brooke Duffy's work on the labor of social media consumption [27]. Karizat argues that social media algorithms recognize, classify, sort, and suppress social identities due to the social construction of these identities. In this folk theory, the algorithm is represented as a strainer that impacts which types of identities are prioritized on social feeds. Building upon strainer theory, Karizat recognizes algorithmic privilege, which refers to the advantages that arise from algorithms prioritizing certain identities over others. We argue that stereotypical views associated with Black women are what these feeds are optimized toward. The unwarranted hypervisibility driven by the algorithm forced the content creators in our study to become highly aware of their Blackness. This hypervigilance manifested in their clothes, hair, gender expression, sexuality, and career. In these instances, we see how content creators who lack algorithmic privilege must perform additional labor compared to other content creators. Building on Duffy's work [27], we see how Black femme content creators navigate the dual pressures of offline societal norms imposed on Black femmes and the online stereotypes they must either resist or embrace within social media spaces.

As our study revealed, the mental and emotional hurdles that marginalized individuals must consider before posting online are significant. We build on the work of HCI scholars like Brock, DeVito, and Klassen in trauma-informed computing. At the same time, other scholars have described how TikTok's interface empowers users to engage in a Black feminist praxis (e.g., through the green screen feature that enables users to teach and critique each other's content [68]). We also advocate for a shift in how Black communities are viewed online, moving away from a deficit mindset toward a more joyful perspective [81]. In the following sections, we outline the benefits of building social feeds rooted in joy rather than struggle.

5.3 Designing to Center Black Joy

In our research, we observed that Black femme content creators navigate the platform's over-moderation by consciously adjusting their appearance or lifestyle, often aware that they are being monitored and moderated. This hypervigilance can be exhausting, but these content creators are inspired by the relationships they have formed online. For example, some participants described being more comfortable with their followers because they knew those followers were interested in the creator's authenticity, while others created their own measures of success (e.g., the number of followers who reach out for advice). Those who felt their content was being shadowbanned or overly moderated would rely on their followers for feedback, posting videos to ask if their content was being shown on their feeds or, in extreme cases, asking followers to follow them on other platforms. These are care networks the creators cultivated through their followers online and offline. Despite the negative experiences BFCCs had while navigating TikTok, we saw BFCCs relying on their followers (e.g., having meaningful conversations through comments, giving or taking advice, and sharing inside jokes relevant to the Black experience) and showcasing their diverse cultures online to cultivate joy (e.g., dancing, cooking, makeup, and haircare). We propose that these meaningful connections could help with exploring the implications of social feeds that are rooted in Black joy:

“Black joy allows us space to stretch our imaginations beyond what we previously thought possible and allows us to theorize a world in which white supremacy does not dictate our everyday lives.” [46]

In prioritizing Black joy as the ultimate goal in the design of future social technologies, we look to the many interrelated approaches proposed by HCI and technology researchers and designers for centering BIPOC and specifically Black experiences. First, trauma-informed design encourages us to confront head-on the fact that many users have histories of trauma that are often re-perpetuated or aggravated by social technologies [70]. To that end, Randazzo and colleagues discussed tangible ways for researchers and designers to build for marginalized communities in ways informed by their traumas, such as adopting principles of trustworthiness, transparency, and direct engagement with social and cultural differences [70]. In the case of Black femme content creators, the hypersexualization and other stereotyping imposed on Black women offline translate to algorithmic constraints and over-moderation online. Here, transparency in the user interface for creators (specifically, sharing more about how the algorithm reads BFCCs' content) could mitigate the physiological and psychological harm associated with uncertainty regarding digital racial microaggressions [82].

Second, To, Smith, and colleagues suggest moving beyond a deficits framing to design that focuses on the flourishing and joy of people of color [81]. They integrate value-based design with transformative justice and other design protocols, allowing care networks for marginalized users to be centered around joy rather than struggle. We encourage designers to focus on building tools for the care communities already curated by content creators. For example, allowing followers to opt-in to creator notifications for new content (as YouTube does with the notification bell) and emphasizing the visibility of the creators that users already follow could support creator-community relationships.
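
As a minimal illustrative sketch (our assumption, not any platform's actual ranking system), a feed re-ranking step along these lines could boost content from creators a user already follows, and boost further the creators a user has explicitly opted into notifications for. The FeedItem structure, rerank_for_community function, and relevance scores below are all hypothetical:

from dataclasses import dataclass

@dataclass
class FeedItem:
    creator_id: str
    relevance: float  # score from an upstream recommender, assumed given

def rerank_for_community(items: list[FeedItem],
                         followed: set[str],
                         notify_opt_in: set[str],
                         follow_boost: float = 1.5,
                         notify_boost: float = 3.0) -> list[FeedItem]:
    """Boost items from followed creators, and even more from creators the user has
    explicitly opted into notifications for, before sorting by the adjusted score."""
    def adjusted(item: FeedItem) -> float:
        if item.creator_id in notify_opt_in:
            return item.relevance * notify_boost
        if item.creator_id in followed:
            return item.relevance * follow_boost
        return item.relevance
    return sorted(items, key=adjusted, reverse=True)

if __name__ == "__main__":
    feed = [FeedItem("viral_creator", 0.9), FeedItem("followed_bfcc", 0.7), FeedItem("opted_in_bfcc", 0.4)]
    ranked = rerank_for_community(feed, followed={"followed_bfcc", "opted_in_bfcc"}, notify_opt_in={"opted_in_bfcc"})
    print([item.creator_id for item in ranked])  # opted-in and followed creators surface first

In this toy example, the opted-in and followed creators surface above a higher-scored but unfollowed creator, which is the kind of visibility emphasis we suggest designers could explore for supporting creator-community relationships.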

Finally, focusing on Black experiences, researchers have proposed Afrofuturism as a framework for exploring ways to help Black women and non-binary users have joyful experiences with technologies [52, 75]. In our research, we saw Black joy represented through dancing, shared experiences, and care networks between followers and creators. We believe that focusing on these pluralistic Black experiences on the internet will create a positive user experience online for BIPOC users. Many researchers have been laying the foundation for social feeds informed by Black joy rather than constant stereotypes, struggle, and trauma. Lisa Egede and colleagues have explored how to design technology for Black experiences that is “For Us By Us” by studying Black technologists who curate resources to support lived Black experiences rooted in joy [28]. Through their study, they recognized that Black culture is shared through social media. They also observed that the monetization of Blackness on social media is often paired with a failure to center Blackness in the design of these applications. With our participants, we observed that BFCCs' Blackness was hypervisible, often resulting in consequences like over-moderation or shadowbanning. This prompts a discussion about designing for Black users on social media platforms like TikTok, where their physical appearance is prominently displayed, as opposed to platforms like Twitter, which has been more extensively explored in Black techno-studies. This sentiment was evident when one participant (P10) reflected on how her content might be perceived differently if her Blackness were not so visibly expressed. The dimming of one's Blackness should not be the solution to addressing injustices on the internet, as it undermines the push in HCI research toward building for joy rather than struggle for marginalized communities.

Egede and colleagues also argue for the need to be intentional about the recruitment of minoritized groups. We learned from one participant that TikTok created a resource for Black content creators in 2020 to help with over-moderation and shadowbanning. However, the resource has not been maintained, leaving Black content creators to rely on their care networks. Egede et al. observe that this lack of full support for Blackness in design is likely due to non-Black participants in their study lacking an understanding of the nuanced needs of Black communities. We also believe that the push for diversity and inclusion tactics within the United States is often used as a one-size-fits-all approach to combating racial tensions. However, without naming and understanding the intersectional identities and histories that explain why these technologies are racialized, a cyclical pattern will occur, making more space for unsustainable design recommendations for BIPOC users. We advocate for increased efforts to develop sustainable support systems for content creators, with a particular emphasis on including Black designers and engineers in building these social feeds.

6 Limitations and Future Work

While this work provides an important window into the experiences of Black women content creators, our findings regarding monolithic views of what it means to be Black suggest that future work in this area should actively focus on and sample from more diverse Black populations. For example, it is crucial to further explore Black LGBTQ+ online communities: while we have seen an increase in research exploring LGBTQ+ online communities [77], Black LGBTQ+ online communities are still understudied. We believe further exploring Black LGBTQ+ online spaces could expand our argument for pluralism within BIPOC online communities, disrupting monolithic views of what it means to be Black.

Finally, while our results indicate that content creators both believe in and are basing their behavior upon their perceptions of how the TikTok algorithm operates, there would also be significant utility in confirming the technical realities. As such, we call for an algorithmic audit, such as a sock-puppet audit of the TikTok algorithm, that specifically examines how the algorithm is optimized with respect to BIPOC communities; such an audit would help establish evidence of the phenomena our participants describe.
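
Purely as an illustration of what such an audit could entail (and not an audit we conducted), the sketch below assumes a set of sock-puppet personas that behave identically except for the identity signals of the creators they initially engage with, a hypothetical fetch_feed_for function standing in for platform-approved data collection, and synthetic placeholder data; only the distribution comparison is concrete:

from collections import Counter
import math

# Hypothetical sock-puppet personas: identical behavior scripts that differ only in the
# identity signals of the creators each account initially engages with.
PERSONAS = ["seeded_with_black_femme_creators", "seeded_with_white_femme_creators"]

def fetch_feed_for(persona: str, n_items: int = 200) -> list[str]:
    """Placeholder for real data collection (e.g., instrumented browser sessions run by
    each sock-puppet account). Returns content-category labels for the first n_items
    recommended to this persona. The data below is synthetic, for illustration only."""
    synthetic = {
        "seeded_with_black_femme_creators": ["dance"] * 90 + ["struggle_narrative"] * 70 + ["lifestyle"] * 40,
        "seeded_with_white_femme_creators": ["lifestyle"] * 110 + ["dance"] * 60 + ["beauty"] * 30,
    }
    return synthetic[persona][:n_items]

def category_distribution(items: list[str]) -> dict[str, float]:
    """Convert a list of recommended-content categories into a probability distribution."""
    counts = Counter(items)
    total = sum(counts.values())
    return {category: count / total for category, count in counts.items()}

def jensen_shannon(p: dict[str, float], q: dict[str, float]) -> float:
    """Symmetric divergence between two category distributions (0.0 means identical feeds)."""
    categories = set(p) | set(q)
    mixture = {c: 0.5 * (p.get(c, 0.0) + q.get(c, 0.0)) for c in categories}
    def kl(a: dict[str, float]) -> float:
        return sum(a[c] * math.log2(a[c] / mixture[c]) for c in categories if a.get(c, 0.0) > 0)
    return 0.5 * kl(p) + 0.5 * kl(q)

if __name__ == "__main__":
    distributions = {persona: category_distribution(fetch_feed_for(persona)) for persona in PERSONAS}
    for persona, distribution in distributions.items():
        print(persona, distribution)
    print("Jensen-Shannon divergence between personas:", round(jensen_shannon(*distributions.values()), 3))

A real audit would replace the synthetic data with recommendations logged from live sock-puppet sessions and would repeat the comparison across many persona pairs and time windows before drawing conclusions about differential optimization.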

7 Conclusion

In our work, we conducted semi-structured interviews with 10 Black femme content creators on TikTok to better understand their experiences on the platform and with its over-moderation. We found that TikTok's perception of Black women is optimized toward misogynoir via algorithmic prioritization of monolithic representations of Blackness (e.g., the mammy, the sapphire, the jezebel). This algorithmic confinement can be extremely labor-intensive for Black femme content creators. However, it is clear that Black joy can also be represented in social feeds, and there is great potential for a future where social feeds are optimized toward joyful narratives of BIPOC communities instead of struggle and stereotypes.

Acknowledgments

The authors would like to thank the reviewers for their feedback, as well as Josie Zvelebilova, who helped with copyediting the paper. We are also thankful to Soni Rusagara, who provided helpful citations that strengthened our argument in the discussion.

References

  • Crystal Abidin. 2018. Internet celebrity: understanding fame online. Emerald Publishing Limited, Bingley, U.K.
  • Tomas Apodaca. 2024. How Automated Content Moderation Works (Even When It Doesn't) – The Markup. https://themarkup.org/automated-censorship/2024/03/01/how-automated-content-moderation-works-even-when-it-doesnt-work
  • Moya Bailey. 2021. Misogynoir transformed: Black women's digital resistance. In Misogynoir transformed. New York University Press.
  • Ruha Benjamin. 2022. Viral justice: How we grow the world we want. Princeton University Press.
  • Jeremy Bentham. 2022. Panopticon versus New South Wales and other writings on Australia. UCL Press. http://www.jstor.org/stable/j.ctv1jq5qhg
  • Sara Bimo and Aparajita Bhandari. 2023. ALGORITHMS, AESTHETICS AND THE CHANGING NATURE OF CULTURAL CONSUMPTION ONLINE. AoIR Selected Papers of Internet Research (2023).
  • John Blake. 2023. What's ‘digital blackface?’ And why is it wrong when White people use it? CNN (Mar 2023). https://www.cnn.com/2023/03/26/us/digital-blackface-social-media-explainer-blake-cec/index.html
  • Maximilian Boeker and Aleksandra Urman. 2022. An empirical investigation of personalization factors on TikTok. In Proceedings of the ACM web conference 2022. 2298–2309.
  • David R. Brake. 2014. Are We All Online Content Creators Now? Web 2.0 and Digital Divides*. Journal of Computer-Mediated Communication 19, 3 (Apr 2014), 591–609. https://doi.org/10.1111/jcc4.12042
  • Virginia Braun and Victoria Clarke. 2021. One size fits all? What counts as quality practice in (reflexive) thematic analysis? Qualitative Research in Psychology 18, 3 (2021), 328–352. https://doi.org/10.1080/14780887.2020.1769238
  • André Brock. 2012. From the Blackhand Side: Twitter as a Cultural Conversation. Journal of Broadcasting & Electronic Media 56, 4 (2012), 529–549. https://doi.org/10.1080/08838151.2012.732147
  • André Brock. 2018. Critical technocultural discourse analysis. New media & society 20, 3 (2018), 1012–1030.
  • André L. Brock. 2020. Distributed Blackness: African American Cybercultures. NYU Press, New York.
  • Emeline Brulé. 2020. Thematic analysis in HCI. https://sociodesign.hypotheses.org/555
  • Joy Buolamwini and Timnit Gebru. 2018. Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification. In Proceedings of the 1st Conference on Fairness, Accountability and Transparency(Proceedings of Machine Learning Research, Vol. 81), Sorelle A. Friedler and Christo Wilson (Eds.). PMLR, 77–91. https://proceedings.mlr.press/v81/buolamwini18a.html
  • Janet X. Chen, Allison McDonald, Yixin Zou, Emily Tseng, Kevin A Roundy, Acar Tamersoy, Florian Schaub, Thomas Ristenpart, and Nicola Dell. 2022. Trauma-Informed Computing: Towards Safer Technology Experiences for All. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems (New Orleans, LA, USA) (CHI ’22). Association for Computing Machinery, New York, NY, USA, Article 544, 20 pages. https://doi.org/10.1145/3491102.3517475
  • Angèle Christin and Yingdan Lu. 2023. The influencer pay gap: Platform labor meets racial capitalism. New Media & Society (2023). https://doi.org/10.1177/14614448231164995
  • Cheryl Clarke, Demita Frazier, Gloria Hull, Audre Lorde, Chirlane McCray, Margo Okazawa-Rey, Barbara Smith, and Beverly Smith. 2012. The Combahee River Collective Statement. https://www.blackpast.org/african-american-history/combahee-river-collective-statement-1977/
  • Black Feminist Collective. 2024. The Black Bourgeoisie, Black Capitalism, and the Myth of Black Excellence. https://blackfeministcollective.com/2024/05/04/the-black-bourgeoisie-black-capitalism-and-the-myth-of-black-excellence/
  • Kelley Cotter, Julia R DeCook, Shaheen Kanthawala, and Kali Foyle. 2022. In FYP we trust: The divine force of algorithmic conspirituality. International Journal of Communication 16, 2022 (2022), 1–23.
  • Kimberle Crenshaw. 1991. Mapping the Margins: Intersectionality, Identity Politics, and Violence against Women of Color. Stanford Law Review 43, 6 (1991), 1241–1299. http://www.jstor.org/stable/1229039
  • Julie de Bailliencourt. 2024. TikTok Guidelines. https://www.tiktok.com/community-guidelines/en
  • Daniel Delmonaco, Samuel Mayworm, Hibby Thach, Josh Guberman, Aurelia Augusta, and Oliver L Haimson. 2024. " What are you doing, TikTok?": How Marginalized Social Media Users Perceive, Theorize, and" Prove" Shadowbanning. Proceedings of the ACM on Human-Computer Interaction 8, CSCW1 (2024), 1–39.
  • Michael Ann DeVito. 2022. How transfeminine TikTok creators navigate the algorithmic trap of visibility via folk theorization. Proceedings of the ACM on Human-Computer Interaction 6, CSCW2 (2022), 1–31.
  • Michael A. DeVito, Darren Gergle, and Jeremy Birnholtz. 2017. "Algorithms ruin everything": #RIPTwitter, Folk Theories, and Resistance to Algorithmic Change in Social Media. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (Denver, Colorado, USA) (CHI ’17). Association for Computing Machinery, New York, NY, USA, 3163–3174. https://doi.org/10.1145/3025453.3025659
  • Siddhali Doshi, Mehak Sirohi, Kimaya Rao, and Aashi Mishra. 2024. Gen-Z's engagement with micro-cores: Exploring aesthetics and identity in contemporary times. Fashion, Style & Popular Culture (2024).
  • BROOKE ERIN DUFFY. 2017. (Not) Just for the Fun of It: The Labor of Social Media Production. Yale University Press, 45–97. http://www.jstor.org/stable/j.ctt1q31skt.6
  • Lisa Egede, Leslie Coney, Brittany Johnson, Christina Harrington, and Denae Ford. 2024. " For Us By Us": Intentionally Designing Technology for Lived Black Experiences. In Proceedings of the 2024 ACM Designing Interactive Systems Conference. 3210–3224.
  • Maryann Erigha and Ashley Crooks-Allen. 2020. Digital communities of Black girlhood: New media technologies and online discourses of empowerment. The Black Scholar 50, 4 (2020), 66–76.
  • Motahhare Eslami, Karrie Karahalios, Christian Sandvig, Kristen Vaccaro, Aimee Rickman, Kevin Hamilton, and Alex Kirlik. 2016. First I" like" it, then I hide it: Folk Theories of Social Feeds. In Proceedings of the 2016 cHI conference on human factors in computing systems. 2371–2382.
  • Sarah Florini. 2016. This week in blackness and the construction of blackness in independent digital media. In Race and Gender in Electronic Media. Taylor & Francis.
  • Sarah Florini. 2019. Beyond hashtags: Racial politics and Black digital networks. New York University Press.
  • Michel Foucault. 1977. Discipline and Punish: The Birth of the Prison. Pantheon Books, New York.
  • Catalina Goanta and Gerasimos Spanakis. 2022. Chapter 6: The commercial unfairness of recommender systems on social media. https://www.elgaronline.com/edcollchap/edcoll/9781839109966/9781839109966.00013.xml
  • Robert Gorwa, Reuben Binns, and Christian Katzenbach. 2020. Algorithmic content moderation: Technical and political challenges in the automation of platform governance. Big Data & Society 7, 1 (2020), 2053951719897945.
  • Reina Gossett, Eric A. Stanley, and Johanna Burton. 2017. Trap door: trans cultural production and the politics of visibility.
  • Joshua Lumpkin Green. 2006. Digital Blackface: The repackaging of the Black masculine image. Master's thesis. Miami University.
  • Uma Sushmitha Gunturi, Anisha Kumar, Xiaohan Ding, and Eugenia H. Rho. 2024. Linguistically Differentiating Acts and Recalls of Racial Microaggressions on Social Media. Proc. ACM Hum.-Comput. Interact. 8, CSCW1, Article 89 (April 2024), 36 pages. https://doi.org/10.1145/3637366
  • Oliver L. Haimson, Daniel Delmonaco, Peipei Nie, and Andrea Wegner. 2021. Disproportionate Removals and Differing Content Moderation Experiences for Conservative, Transgender, and Black Social Media Users: Marginalization and Moderation Gray Areas. Proc. ACM Hum.-Comput. Interact. 5, CSCW2, Article 466 (oct 2021), 35 pages. https://doi.org/10.1145/3479610
  • Camille Harris, Matan Halevy, Ayanna Howard, Amy Bruckman, and Diyi Yang. 2022. Exploring the Role of Grammar and Word Choice in Bias Toward African American English (AAE) in Hate Speech Classification. In Proceedings of the 2022 ACM Conference on Fairness, Accountability, and Transparency (Seoul, Republic of Korea) (FAccT ’22). Association for Computing Machinery, New York, NY, USA, 789–798. https://doi.org/10.1145/3531146.3533144
  • Camille Harris, Amber Gayle Johnson, Sadie Palmer, Diyi Yang, and Amy Bruckman. 2023. "Honestly, I Think TikTok has a Vendetta Against Black Creators": Understanding Black Content Creator Experiences on TikTok. Proc. ACM Hum.-Comput. Interact. 7, CSCW2, Article 320 (oct 2023), 31 pages. https://doi.org/10.1145/3610169
  • Sharon Heung, Lucy Jiang, Shiri Azenkot, and Aditya Vashistha. 2024. “Vulnerable, Victimized, and Objectified”: Understanding Ableist Hate and Harassment Experienced by Disabled Content Creators on Social Media. In Proceedings of the CHI Conference on Human Factors in Computing Systems (Honolulu, HI, USA) (CHI ’24). Association for Computing Machinery, New York, NY, USA, Article 744, 19 pages. https://doi.org/10.1145/3613904.3641949
  • Patricia Hill Collins. 2002. Black feminist thought: knowledge, consciousness, and the politics of empowerment (rev. 10th anniversary ed.). Routledge, New York.
  • bell hooks. 1994. Teaching to transgress : education as the practice of freedom. Routledge, New York.
  • Mingyi Hou. 2019. Social media celebrity and the institutionalization of YouTube. Convergence 25, 3 (2019), 534–553. https://doi.org/10.1177/1354856517750368
  • Javon Johnson. 2015. Black joy in the time of Ferguson. QED: A Journal in GLBTQ Worldmaking 2, 2 (2015), 177–183.
  • Nadia Karizat, Dan Delmonaco, Motahhare Eslami, and Nazanin Andalibi. 2021. Algorithmic folk theories and identity: How TikTok users co-produce Knowledge of identity and engage in algorithmic resistance. Proceedings of the ACM on human-computer interaction 5, CSCW2 (2021), 1–44.
  • Maximilian Kasy and Rediet Abebe. 2021. Fairness, Equality, and Power in Algorithmic Decision-Making. In Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency (Virtual Event, Canada) (FAccT ’21). Association for Computing Machinery, New York, NY, USA, 576–586. https://doi.org/10.1145/3442188.3445919
  • Banseka Kayembe. 2021. Why are we still depicting Black women as ‘Mammies’? https://shado-mag.com/opinion/why-are-we-still-depicting-black-women-as-mammies/
  • Sara Kingsley, Proteeti Sinha, Clara Wang, Motahhare Eslami, and Jason I. Hong. 2022. "Give Everybody [..] a Little Bit More Equity": Content Creator Perspectives and Responses to the Algorithmic Demonetization of Content Associated with Disadvantaged Groups. Proc. ACM Hum.-Comput. Interact. 6, CSCW2, Article 424 (nov 2022), 37 pages. https://doi.org/10.1145/3555149
  • Laurence J Kirmayer, Eugene Raikhel, and Sadeq Rahimi. 2013. Cultures of the Internet: Identity, community and mental health. 165–191 pages.
  • Shamika Klassen, Joanna Judith Elizabeth Mendy, Mikayla Buford, and Casey Fiesler. 2024. Black to the Future-The Power of Designing Afrofuturist Technology with Black Women, Femmes, and Non-Binary People. In Proceedings of the 2024 ACM Designing Interactive Systems Conference. 2156–2172.
  • Daniel Klug, Yiluo Qin, Morgan Evans, and Geoff Kaufman. 2021. Trick and Please. A Mixed-Method Study On User Assumptions About the TikTok Algorithm. In Proceedings of the 13th ACM Web Science Conference 2021 (Virtual Event, United Kingdom) (WebSci ’21). Association for Computing Machinery, New York, NY, USA, 84–92. https://doi.org/10.1145/3447535.3462512
  • Allison Koenecke, Andrew Nam, Emily Lake, Joe Nudell, Minnie Quartey, Zion Mengesha, Connor Toups, John R Rickford, Dan Jurafsky, and Sharad Goel. 2020. Racial disparities in automated speech recognition. Proceedings of the national academy of sciences 117, 14 (2020), 7684–7689.
  • Vivian Lai, Samuel Carton, Rajat Bhatnagar, Q Vera Liao, Yunfeng Zhang, and Chenhao Tan. 2022. Human-ai collaboration via conditional delegation: A case study of content moderation. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems. 1–18.
  • Erika Lenkert. 2020. What is a content creator and how to become one? https://www.adobe.com/express/learn/blog/content-creator
  • Heidi M Levitt and Kathleen M Collins. 2021. Making intelligible the controversies over femme identities: A functionalist approach to conceptualizing the subversive meanings of femme genders. In Feminizing Theory. Routledge, 122–139.
  • Rebecca Lewis. 2020. “This Is What the News Won't Show You”: YouTube Creators and the Reactionary Politics of Micro-celebrity. Television & New Media 21, 2 (2020), 201–217. https://doi.org/10.1177/1527476419879919
  • Alice E. Marwick. 2015. You May Know Me from YouTube: (Micro-)Celebrity in Social Media. John Wiley & Sons, Ltd, Chapter 18, 333–350. https://doi.org/10.1002/9781118475089.ch18
  • Samuel Mayworm, Michael Ann DeVito, Daniel Delmonaco, Hibby Thach, and Oliver L. Haimson. 2024. Content Moderation Folk Theories and Perceptions of Platform Spirit among Marginalized Social Media Users. Trans. Soc. Comput. 7, 1, Article 1 (mar 2024), 27 pages. https://doi.org/10.1145/3632741
  • Ian Mcculloh and Ben Cohen. 2024. Fragile Minds: Exploring the Link Between Social Media and Young Adult Mental Health. In Proceedings of the 2023 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining (Kusadasi, Turkiye) (ASONAM ’23). Association for Computing Machinery, New York, NY, USA, 475–479. https://doi.org/10.1145/3625007.3627490
  • Ryan McPhee. 2022. What Is the TikTok Creator Fund? Here's How to Join + Start Making Money. Backstage (Aug 2022). https://www.backstage.com/magazine/article/tiktok-creator-fund-explained-how-to-join-75090/
  • Fred Moten. 2003. In the Break: The Aesthetics of the Black Radical Tradition. University of Minnesota Press.
  • Tyler Musgrave, Alia Cummings, and Sarita Schoenebeck. 2022. Experiences of harm, healing, and joy among black women and femmes on social media. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems. 1–17.
  • Mutale Nkonde. 2019. Automated anti-blackness: facial recognition in Brooklyn, New York. Harvard Journal of African American Public Policy 20 (2019), 30–36.
  • Safiya Umoja Noble. 2018. Algorithms of oppression: how search engines reinforce racism. New York University Press, New York.
  • Ihudiya Finda Ogbonnaya-Ogburu, Angela DR Smith, Alexandra To, and Kentaro Toyama. 2020. Critical race theory for HCI. In Proceedings of the 2020 CHI conference on human factors in computing systems. 1–16.
  • Chelsea Peterson-Salahuddin. 2024. Teachable moments: TikTok social drama as a site of Black feminist intellectual production. Information, Communication & Society (2024), 1–18.
  • Sébastien Point and Yehuda Baruch. 2023. (Re)thinking transcription strategies: Current challenges and future research directions. Scandinavian Journal of Management 39, 2 (2023), 101272. https://doi.org/10.1016/j.scaman.2023.101272
  • Casey Randazzo, Carol F. Scott, Rosanna Bellini, Tawfiq Ammari, Michael Ann Devito, Bryan Semaan, and Nazanin Andalibi. 2023. Trauma-Informed Design: A Collaborative Approach to Building Safer Online Spaces. In Companion Publication of the 2023 Conference on Computer Supported Cooperative Work and Social Computing (Minneapolis, MN, USA) (CSCW ’23 Companion). Association for Computing Machinery, New York, NY, USA, 470–475. https://doi.org/10.1145/3584931.3611277
  • Janelle Raymundo. 2021. The burden of excellence: A Critical Race Theory analysis of perfectionism in Black students. The Vermont Connection 42, 1 (2021), 12.
  • Gabriela T Richard and Kishonna L Gray. 2018. Gendered play, racialized reality: Black cyberfeminism, inclusive communities of practice, and the intersections of learning, socialization, and resilience in online gaming. Frontiers: A Journal of Women Studies 39, 1 (2018), 112–148.
  • Paul Saffo. 2009. Get ready for a new economic era. (2009). https://static1.squarespace.com/static/660b48914fe4486aa3d2a1d7/t/66102b0014933445817a0b0d/1712335616955/McKinsey-Creator.pdf
  • Patricia Bell Scott. 1976. Debunking Sapphire: Toward a non-racist and non-sexist social science. J. Soc. & Soc. Welfare 4 (1976), 864.
  • Catherine Knight Steele. 2021. Black feminist pleasure on TikTok: An ode to Hurston's “Characteristics of Negro Expression”. Women's Studies in Communication 44, 4 (2021), 463–469.
  • Catherine Knight Steele. 2021. Digital Black Feminism. NYU Press, New York.
  • Jordan Taylor, Wesley Hanwen Deng, Kenneth Holstein, Sarah Fox, and Haiyi Zhu. 2024. Carefully Unmaking the “Marginalized User:” A Diffractive Analysis of a Gay Online Community. ACM Trans. Comput.-Hum. Interact. (jun 2024). https://doi.org/10.1145/3673229 Just Accepted.
  • tba. 2021. (dis)Info Studies: André Brock, Jr. on Why People Do What They Do on the Internet. https://logicmag.io/beacons/dis-info-studies-andre-brock-jr/
  • Hibby Thach, Samuel Mayworm, Daniel Delmonaco, and Oliver Haimson. 2024. (In)visible moderation: A digital ethnography of marginalized users and content moderation on Twitch and Reddit. New Media & Society 26, 7 (2024), 4034–4055. https://doi.org/10.1177/14614448221109804
  • Rubin Thomlinson. 2024. What Taylor and Travis’ relationship taught us about Misogynoir. https://rubinthomlinson.com/what-taylor-and-travis-relationship-taught-us-about-misogynoir/
  • Alexandra To, Angela D. R. Smith, Dilruba Showkat, Adinawa Adjagbodjou, and Christina Harrington. 2023. Flourishing in the Everyday: Moving Beyond Damage-Centered Design in HCI for BIPOC Communities. In Proceedings of the 2023 ACM Designing Interactive Systems Conference (Pittsburgh, PA, USA) (DIS ’23). Association for Computing Machinery, New York, NY, USA, 917–933. https://doi.org/10.1145/3563657.3596057
  • Alexandra To, Wenxia Sweeney, Jessica Hammer, and Geoff Kaufman. 2020. " They Just Don't Get It": Towards Social Technologies for Coping with Interpersonal Racism. Proceedings of the ACM on Human-Computer Interaction 4, CSCW1 (2020), 1–29.
  • Pengda Wang. 2022. Recommendation Algorithm in TikTok: Strengths, Dilemmas, and Possible Directions. International journal of social science studies 10, 5 (2022), 60–.
  • Philip Weber, Thomas Ludwig, Sabrina Brodesser, and Laura Grönewald. 2021. “It's a Kind of Art!”: Understanding Food Influencers as Influential Content Creators. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (Yokohama, Japan) (CHI ’21). Association for Computing Machinery, New York, NY, USA, Article 175, 14 pages. https://doi.org/10.1145/3411764.3445607
  • Carolyn M West. 1995. Mammy, Sapphire, and Jezebel: Historical images of Black women and their implications for psychotherapy. Psychotherapy: Theory, Research, Practice, Training 32, 3 (1995), 458.
  • Frank B. Wilderson. 2020. Afropessimism. Liveright Publishing Corporation.
  • Jonathan Hua Ye and Cecil Eng Huang Chua. 2024. Monetization for Content Generation and User Engagement on Social Media Platforms: Evidence from Paid Q&A. IEEE transactions on engineering management 71 (2024), 4022–4034.
  • Brita Ytre-Arne and Hallvard Moe. 2021. Folk theories of algorithms: Understanding digital irritation. Media, Culture & Society 43, 5 (2021), 807–824.
  • Min Zhang and Yiqun Liu. 2021. A commentary of TikTok recommendation algorithms in MIT Technology Review 2021. Fundamental Research 1, 6 (2021), 846–847. https://doi.org/10.1016/j.fmre.2021.11.015

Footnote

1We use femme to widely include the diversity of women's and other feminine experiences and to avoid an implicit bioessentialism or focus on cisgender heteronormative experiences. We recognize that this is an imperfect umbrella term. We discuss this further in our recruitment section.

2A theory developed by Jeremy Bentham and further studied by Michel Foucault, exploring how individuals can be controlled by creating the perception that they are being watched by an unseen authority [5, 33].

3An elevated blowdry that involves several passes of the hair with a flat iron.

4The jezebel caricature is centered around Black women being innately promiscuous and/or predatory [80].

5The mammy archetype is typically depicted as an older, overweight, dark-skinned woman. She embodies the idealized caregiver, characterized by traits such as warmth, loyalty, maternal instinct, and a non-threatening, obedient, and submissive demeanor [49].

6The sapphire caricature depicts Black women as rude, loud, malicious, stubborn, and overbearing [74].

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

CHI '25, Yokohama, Japan

© 2025 Copyright held by the owner/author(s).
ACM ISBN 979-8-4007-1394-1/25/04.
DOI: https://doi.org/10.1145/3706598.3713842