Living an Algorithmic Life
No matter who you are or how active you are online, a content algorithm affects your daily life in some way. If you look up the answer to a question, an algorithm sorts through hundreds of thousands of pages and sources to find the one that best fits your query. If you watch a video on YouTube, an algorithm has spent time and processing power trying to read between the lines of your activity and bring you content you never even knew you wanted to watch. If you scroll through apps like Instagram, Twitter, or Facebook, an algorithm sifts through all the users on the site to surface the ones you might want to follow. Algorithms rule the way we experience many of the most popular internet activities, yet we know very little about these expensive, proprietary machines.
Before I get ahead of myself, I first want to go over what an algorithm, or content algorithm, is. The Digital Marketing Institute describes an algorithm as “a mathematical set of rules specifying how a group of data behaves.” It describes a social media algorithm, more specifically, as something that “helps maintain order and assist in ranking search results and advertisements.” In essence, an algorithm curates all the content a user sees on a platform, adjusting and tailoring that content to the user’s needs.
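To make that definition concrete, here is a toy sketch of “a set of rules specifying how a group of data behaves.” Everything in it is invented for illustration; no real search engine ranks pages this simply. It just counts how often a page mentions the words in a query and sorts pages by that count:

```python
def relevance(page_text: str, query: str) -> int:
    """A toy rule: a page's relevance is how often it mentions the query words."""
    words = query.lower().split()
    text = page_text.lower()
    return sum(text.count(w) for w in words)

def rank_pages(pages, query):
    """Apply the rule to the whole group of data: best match comes first."""
    return sorted(pages, key=lambda p: relevance(p, query), reverse=True)
```

Even at this toy scale, the shape is the same as the real thing: a fixed rule turns each piece of content into a score, and the scores decide what you see first.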
The reason I wanted to investigate this topic was TikTok. TikTok is one of the fastest-growing social media apps ever; users across generations heard something about it in the summer of 2020, when President Trump signed an executive order aimed at banning the app. Trump targeted TikTok on national security grounds and gave its parent company, ByteDance, a choice: sell TikTok to an American company or be shut down in the U.S. Selling would have meant handing an American company the rights to TikTok itself, as well as its algorithm. After a series of injunctions and business deals, however, the ban never took effect, and President Biden formally revoked the executive order in June 2021.
ByteDance has close ties to the Chinese government and is suspected of being partially held by government entities, which reportedly control one of the three seats on the board of directors of its main Chinese subsidiary. The Chinese government denies these claims, and Chinese citizens can’t even access TikTok itself; instead, they use the domestic version, Douyin. If the claims are accurate, they indicate a very high level of state influence over how ByteDance is run. Back when ByteDance was under pressure to sell TikTok, China had just updated its export rules to require government approval before companies could sell artificial intelligence (AI) content algorithms abroad.
The issue with that is that TikTok is the algorithm; without the AI process running in the background, TikTok is a worthless platform, no different from Instagram or Twitter. The TikTok algorithm is one of the most envied and revered entities in the social media and technology world. Ask any user of the app, and they will tell you about its ability to suck you in for 10 or 20 minutes, caught in the endless cycle of scrolling to the next video. What makes TikTok special is that it keeps track of everything: what you watch, how long you watch it, whether you visit the creator’s account after watching, whether you like or comment, what tags are on the video, how many times you watch it, and what sound is attached to it. It watches all of this in real time and edits the video queue accordingly, providing an utterly unbeatable level of personalization at the possible cost of user data. The algorithm is so powerful that it can know things about its users that the users don’t know themselves; traits like sexuality, or whether someone is depressed, are readable just from the data a user provides through app usage. Having that level of insight into its user base can be dangerous, creating an echo chamber of endless content based on the user’s perceived interests. An experiment from Vice showed that in just one day, the TikTok algorithm can take a completely vanilla TikTok front page to far-right conservative content.
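The mechanics of “watch the signals, then edit the queue” can be sketched in a few lines. To be clear, the signal names and weights below are invented for illustration; TikTok’s real system is a learned model, not a hand-weighted sum:

```python
def score_video(signals, weights=None):
    """Combine per-video engagement signals into one ranking score."""
    if weights is None:
        # Invented weights for illustration; a real system learns these.
        weights = {
            "watch_fraction": 3.0,   # portion of the video actually watched
            "rewatches": 2.0,        # times the clip was replayed
            "liked": 1.5,
            "commented": 2.5,
            "visited_creator": 2.0,  # tapped through to the creator's account
        }
    return sum(weights[k] * float(signals.get(k, 0)) for k in weights)

def rank_queue(videos):
    """Reorder the upcoming feed so higher-scoring videos play first."""
    return sorted(videos, key=lambda v: score_video(v["signals"]), reverse=True)
```

The echo-chamber effect falls straight out of this structure: whatever you engaged with last scores highest, so more of it gets queued, which generates more of the same signals on the next pass.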
TikTok also suffers from heavy content moderation and censorship. As content is uploaded, a team of bots and humans alike adds tags based on its content, sounds, and other attributes. Depending on the tags added, a video can be taken down quietly, with no notification given to the creator, a practice known as ‘shadow-banning.’ It’s not just unsightly content that’s being suppressed, but political content as well; videos ranging from Black Lives Matter protests to content about Tiananmen Square were given the same treatment. When teenager Feroza Aziz made a video drawing attention to China’s Uighur Muslim concentration camps, her account was banned. TikTok’s leaked guidelines cite “highly controversial topics, such as separatism, religion sects conflicts, [and] conflicts between ethnic groups, for instance exaggerating the Islamic sects conflicts” as grounds for content bans.
While a human most likely makes the choice to suppress activism-based videos, the TikTok algorithm is the tool used to carry it out. It is an unprecedentedly powerful piece of software that gets better with each passing second, and it’s only a matter of time before other major social media developers catch up. The question we as people and as users have to ask is: do we want to be managed by an omnipotent bot locked within China’s borders?
YouTube is one of the most prolific and commonly used sites on the internet, with over 122 million active users every day. Personally, it’s the site I spend the most time on; I’ve used YouTube as my primary form of entertainment for the better part of a decade, and I believe the reason I’ve stayed on the platform so long is the algorithm. YouTube has a unique ability to cater to you: it does the hard part, and all you have to do is watch. People have a unique relationship with YouTube and its algorithm because of YouTube’s easily monetized structure; entire companies have been built just to get corporations to the top of the YouTube charts, or the trending page, as it’s called on the site. Chances are you use YouTube at least a few times a week, whether for education, for entertainment, or maybe even as a creator. And even if you don’t use it, you should know how it is run.
The YouTube algorithm is less complex and multifaceted than TikTok’s, but it is still a feat of engineering and software development. It constantly processes the 500 hours of video uploaded every minute (720,000 hours every day). That alone is a feat, but it also catalogs the material, tags it, and cycles it into people’s recommended tabs. Out of all that content, something has to emerge from the pack, and that is tracked on the trending page. If you were to look up “how does the YouTube algorithm work?” or “how was the YouTube algorithm made?”, you would be bombarded with results from marketing and ad agencies on how to make the best, most popular video to promote your brand. YouTube’s algorithm is robust, but it can be gamed.
If you were to swap tabs as you read this and look at the YouTube trending page, you would see a few music videos, a movie or show trailer, sports highlights, several videos under a minute long, and maybe some current events or news channels. The trending page is something of an enigma: a trailer with ten million views uploaded yesterday can sit right next to a video with 150 thousand views uploaded two days ago. YouTube claims that it has no say in what goes on the trending page, and that it is all handled by the algorithm, which simply records data like watch time, likes, and view count. Something else that dominates the trending page is kids’ content: videos of certain games, or slime-making tutorials, from creators like MrBeast. The problem is that it’s not always content made for kids to watch.
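The enigma of a small two-day-old video sitting beside a fresh ten-million-view trailer makes more sense if trending weighs the rate of views since upload, and watch time, rather than raw totals. YouTube publishes no formula, so the one below is purely hypothetical, but it shows how such a ranking could behave:

```python
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    views: int
    hours_since_upload: float
    avg_watch_minutes: float

def trending_score(v: Video) -> float:
    """Hypothetical score: views per hour since upload, weighted by watch time."""
    view_velocity = v.views / max(v.hours_since_upload, 1.0)
    # Weighting by average watch time pushes clicked-and-abandoned videos down.
    return view_velocity * v.avg_watch_minutes

def trending_page(videos, n=10):
    """Return the top-n videos by the hypothetical trending score."""
    return sorted(videos, key=trending_score, reverse=True)[:n]
```

Under a rule like this, a slower-burning video with strong watch time can still earn a slot on the same page as a viral trailer, which matches what the trending page actually looks like.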
Despite all of that, the answer to the problem of harmful or otherwise malignant videos on YouTube is not getting kids off the platform. While in a perfect world that might be the solution, it’s simply not realistic or achievable in the world we live in. I believe the answer is more human intervention. Content algorithms are fantastic tools for moderating the incomprehensibly large internet, true feats of human engineering. But the same thing that makes them effective also brings them down: an algorithm doesn’t really have judgment. It can choose between a series of often complex options, but it can rarely create its own, and that is where humans have to step in. The only way to break this self-feeding cycle of content is for a team of people to look at the issue and, from the ground up, implement new policies and patches so their algorithm no longer recognizes these patterns in the same way.
These algorithms are the future of media and will only continue to grow into other aspects of our lives. But I don’t want this article to be foreboding; I want it to be a reminder of the ever-changing digital world we find ourselves in, and a celebration of what we have been able to build and accomplish as a species whose digital age began only recently in the span of humanity. It’s important, though, that people learn to recognize where their own wants and needs end and the algorithms begin.