When YouTube launched in 2005, it was a simple, innocent video-sharing website. At the time, the people uploading videos had little sense of how far-reaching their audience would become. Soon enough, the site grew into one of the largest video-sharing platforms on the internet. Naturally, seeing that we live in a capitalist society, this growth also brought opportunities for monetization, which came in the form of ad revenue. “YouTubers” soon began to profit from the videos they posted, and nowadays being a YouTuber is a viable career. However, because the platform’s creators are so widespread and its videos are available to virtually any demographic, there is a need for regulation to ensure that the audience has a safe viewing experience. This regulation may initially seem to provide good and consistent control and monitoring over the vast platform, but in actuality, many of the rules and guidelines YouTube has put up in recent years have caused controversy and considerable strain for content creators on the platform.
In Paul du Gay’s explanation of the circuit of culture, the consumer forms various personal associations with the product and incorporates it into their life in various ways. However, this close connection between consumer and product would not be possible without careful distribution: how the product is presented to the consumer is crucial to forming those connections. Therefore, there must be insightful regulation of the product’s distribution to ensure that it is received correctly. The regulation of YouTube videos as “products” fits well within du Gay’s description of regulation. YouTube has the responsibility to ensure that viewers can form a connection to a video, and they would not be able to do so if the video were harmful or immoral. Therefore, YouTube has established community guidelines that prohibit certain content from being posted on the platform: spam and deceptive practices, sensitive content, violent or dangerous content, regulated goods, and misinformation. Because of these viewing restrictions, the audience experience is much more streamlined and managed. If these guidelines are violated, the content can be removed, the channel itself can be removed (in a three-strike system), or the content can be demonetized (Zappin et al. 2022). Demonetization means ad revenue can no longer be generated from the video, and it can apply on a video-by-video basis or to the whole channel (Zappin et al. 2022). Demonetization also does not stop with ad revenue: demonetized videos or channels are less likely to be promoted by the platform, effectively censoring them (Zappin et al. 2022). Given the prominence of monetization in content posted on YouTube and the fact that large numbers of people make a living from it, demonetization poses a serious threat to the platform’s content creators.
In recent years, YouTube has changed its regulation, advocating for a more kid-friendly platform and increasing the strictness of its guidelines, making the threat of demonetization even larger. In 2017, YouTube tightened its algorithm to minimize the amount of extremist content on the platform, leading to mass demonetization in what became known as the “Adpocalypse” (Kumar 2019). As Kumar eloquently puts it in his 2019 article “The Algorithmic Dance,” “The Adpocalypse, unlike any other event before, allows us to consider the policy implications of a scenario where financial interests compete with ensuring the viability of contrarian, risky, unpopular or non-mainstream ideas that are necessary to enable a robust debate…” There is nothing wrong with YouTube seeking to minimize extremist content; however, the new algorithm “automatically demonetize[s] and de-platform[s] numerous channels that simply covered sensitive and controversial issues,” as Piper Thomson puts it in their 2019 article “Understanding YouTube Demonetization and the Adpocalypse.” So even creators arguing against extremist content find their work automatically labeled as a violation of these newly implemented restrictions. Furthermore, especially in the last couple of years, YouTube has implemented many more policies catering to a younger audience. If content is made for kids, the obligation to follow these guidelines is even stricter. Yet the more kid-friendly the content, the more easily YouTube can promote it and the more ad revenue it can generate. The result is a stark disparity in the respect afforded to different areas of content. Some content may be more mature yet still contain insightful and valuable information. But because such content is less likely to abide by the strict guidelines YouTube keeps putting forth, it becomes less visible and less recommended to viewers.
Again, a similar pattern emerges: a confining space for content that strips much of the freedom from the creator. What began as YouTube merely ensuring viewer safety and targeting a more kid-friendly audience now borders on control over free speech and has become unproductive in pushing past these sensitive topics.
Although YouTube’s original intent behind the stricter restrictions seems noble, the details of the regulation are not clear enough. As previously stated, the algorithm automatically demonetizes content even when it is meant simply to educate the audience about controversial issues. Certain words are also still flagged as indicators of policy violations even when they are used in a totally different context. This ambiguity can lead to unfair punishments: some creators find their videos demonetized while, as Samantha Ferguson states in an article in Wyzowl, “similar videos uploaded by huge brands weren’t suffering the same fate.” The platform is thus becoming less of a personal, original creator space and more of a marketable, business-centric one. Given the vague technicalities of these strict guidelines, smaller creators who are not part of a bigger brand have even less of a voice with YouTube’s upper-level administrators. Because of the lack of feedback in this authoritative hierarchy, there are many instances in which content creators are mistakenly flagged by the demonetization policies and want to file an appeal, but there is no space for discussion. A channel called GamerSault gives a personal account of YouTube demonetization in one of his videos. He ensures that all of his content is suitable for a younger audience, yet he still received a demonetization strike from YouTube with little to no explanation. When he submitted an appeal, it went through an automated system with no further communication routes possible. His appeal was ultimately rejected with no further explanation, and there was still no option to discuss the policy violation with an actual human. YouTube is creating these stricter policies with no room for feedback, and this lack of a space for feedback is severing the connection between company and creator.
Because the recently tightened demonetization policies have weakened the link between creator and company, the structure YouTube is built upon—the public and collaborative video-sharing space—becomes more liable to collapse. After all, YouTube would not be possible without its creators, and if creators are not heard about the injustices and vagueness of the newly implemented demonetization policies, there may very well be a future with no more original content. While regulation is absolutely necessary to ensure the proper distribution of the product, YouTube is the perfect instance of too much regulation and too many restrictions creating a suffocating environment for the creators responsible for making the videos, or products, in the first place.
- Beser, James. “Our Responsibility Approach to Protecting Kids and Families on YouTube.” YouTube Official Blog. YouTube, October 25, 2021. https://blog.youtube/news-and-events/our-responsibility-approach-protecting-kids-and-families-youtube/.
- Ferguson, Samantha. “YouTube Demonetization: What to Do If You’re Penalized in 2022.” Wyzowl. Wyzowl, May 26, 2022. https://www.wyzowl.com/youtube-demonetization/#:~:text=The%20main%20reason%20given%20by,their%20advertiser%2Dfriendly%20content%20guidelines.&text=This%20list%20seems%20pretty%20straight,content%20associated%20with%20those%20topics.
- du Gay, Paul. “Introduction.” In Doing Cultural Studies: The Story of the Sony Walkman, 1–5. Los Angeles: Sage, 2013.
- GamerSault. YouTube Demonetization STORY EXPLAINED. YouTube. YouTube, 2021. https://www.youtube.com/watch?v=phynyJ0l9U&t=1s&ab_channel=GamerSault.
- Kumar, Sangeet. “The Algorithmic Dance: YouTube’s Adpocalypse and the Gatekeeping of Cultural Content on Digital Platforms.” Internet Policy Review 8, no. 2 (2019). DOI: 10.14763/2019.2.1417. https://policyreview.info/articles/analysis/algorithmic-dance-youtubes-adpocalypse-and-gatekeeping-cultural-content-digital.
- McFadden, Christopher. “YouTube’s History and Its Impact on the Internet.” IE. Interesting Engineering, May 20, 2021. https://interestingengineering.com/culture/youtubes-history-and-its-impact-on-the-internet.
- Thomson, Piper. “Understanding Youtube Demonetization and the Adpocalypse.” Learn Hub. G2, June 14, 2019. https://learn.g2.com/youtube-demonetization.
- Wille, Matt. “YouTube Warns Creators of Kids’ Content to Shape up or Face Demonetization.” Input. Input, October 26, 2021. https://www.inputmag.com/culture/youtube-warns-creators-of-kids-content-to-shape-up-or-face-demonetization.
- “YouTube Community Guidelines & Policies – How YouTube Works.” YouTube. YouTube. Accessed October 6, 2022. https://www.youtube.com/howyoutubeworks/policies/community-guidelines/.
- YouTube Homepage. Screenshot. YouTube, October 6, 2022. https://www.youtube.com/.
- Zappin, Anthony, Haroon Malik, Elhadi M. Shakshuki, and David A. Dampier. “YouTube Monetization and Censorship by Proxy: A Machine Learning Prospective.” Procedia Computer Science 198 (2022): 23–32.