
Lenticular Logic and LGBTQ+ YouTube

A simplified model of YouTube's structure might look something like this: YouTube provides a platform for content creators to reach consumers, while advertisers purchase space from YouTube to reach those consumers through the creators' content. YouTube lets advertisers run their ads on certain types of content so that the ads do not appear alongside material inconsistent with their brand image, and YouTube's algorithms flag certain types of content on which ads simply do not run because the video is deemed controversial (i.e., demonetization). Creators receive a small portion of the ad revenue, along with the freedom to create whatever they wish "within reason". Consumers, in turn, are given a seemingly infinite amount of content and the ability to report anything that violates the community guidelines (guidelines that seem to ban only the most "extreme" types of content).


This structure, although it seems to open up creative possibilities and enable the wide distribution of new content and information, can be discriminatory in which content YouTube's algorithms regulate. As I mentioned in class, YouTuber Chase Ross found that including the words "FTM trans" in the title of his video caused it to be immediately demonetized. This algorithmic restriction of content, intended to benefit advertisers by placing their ads only on videos deemed "less extreme", simultaneously harms LGBTQ+ creators, discouraging the creation of LGBTQ+ content and stripping those creators of income.
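
To make that mechanism concrete, here is a purely hypothetical sketch of a naive keyword-blocklist filter of the kind this demonetization behavior suggests. YouTube has never published its actual system; every term, name, and threshold below is my own illustration of why a blunt blocklist sweeps up identity terms alongside genuinely advertiser-unsafe material.

```python
# Hypothetical sketch of a naive keyword-based demonetization filter.
# YouTube's real system is proprietary; this only illustrates how a
# blunt blocklist can treat identity terms the same way it treats
# genuinely advertiser-unsafe content.

# An imagined blocklist that mixes brand-unsafe topics with identity
# terms -- the core problem Chase Ross ran into.
ADVERTISER_UNSAFE_TERMS = {
    "graphic violence",
    "ftm trans",  # an identity term, not "unsafe" content
}

def is_demonetized(title: str) -> bool:
    """Return True if any blocklisted term appears in the title."""
    lowered = title.lower()
    return any(term in lowered for term in ADVERTISER_UNSAFE_TERMS)

# A title describing a creator's own identity trips the same filter
# as genuinely brand-unsafe material:
print(is_demonetized("My FTM Trans Transition Update"))  # True
```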


Beyond demonetization, there are other ways in which YouTube's system, intended to protect advertisers, consumers, and creators, simultaneously harms marginalized groups. Two stand out: the ability of consumers to report content that violates YouTube's guidelines, and the age restriction of certain videos (which also results in their demonetization). While reporting can be useful for discouraging violent or hateful content, it creates an interesting dynamic between consumers and creators. Although it is unclear exactly how reporting functions, it seems that algorithmic processes (rather than human review) issue demonetization, age restrictions, or strikes to videos or channels. For LGBTQ+ YouTubers, this can mean videos are removed, age-restricted, or demonetized because of false reports, generally claiming the presence of sexual content where there is none.

It is this tripartite interaction between the content creator, the consumer, and YouTube's algorithm that produces a lenticular logic. On the one hand, the reporting function offers users a form of protection against content that should not be on YouTube (users can report videos containing violence, hate speech, and even the promotion of terrorism). On the other hand, it allows marginalization of LGBTQ+ communities to proliferate, since their videos can be removed, demonetized, or age-restricted on the strength of false reports. LGBTQ+ creators are harmed by the loss of income, but the consumers of their content are harmed as well: the videos are hidden, and demonetization discourages creators from producing this type of content at all. The report function, while useful for curbing hateful content, creates an environment in which the marginalization of already marginalized groups can flourish.
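
As a second purely hypothetical sketch (the actual reporting pipeline is, as noted above, opaque), the snippet below shows how a threshold-based automatic rule with no human review step would let coordinated false "sexual content" reports penalize a video regardless of what it actually contains. The threshold, report reasons, and penalties are all invented for illustration.

```python
# Hypothetical threshold-based auto-moderation, illustrating how
# coordinated false reports could penalize a video no reviewer has seen.
from collections import Counter

REPORT_THRESHOLD = 50  # imagined cutoff; any real value is unknown

def apply_reports(reports: list[str]) -> str | None:
    """Map the most common report reason to an automatic penalty once
    reports exceed a threshold -- no human review involved."""
    if len(reports) < REPORT_THRESHOLD:
        return None
    reason, _ = Counter(reports).most_common(1)[0]
    penalties = {
        "sexual_content": "age_restrict_and_demonetize",
        "hate_speech": "strike",
        "violence": "demonetize",
    }
    return penalties.get(reason)

# Fifty false "sexual content" reports on an LGBTQ+ video containing no
# sexual content still trigger an automatic penalty:
print(apply_reports(["sexual_content"] * 50))  # age_restrict_and_demonetize
```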
