The Negative Impact of Excellent Social Media Algorithms

TikTok, the Chinese-owned social media app, has grown exponentially over the last few years, becoming the world's fastest-growing social network. Beyond the inherently viral nature of its short videos, a major part of its success can be credited to its remarkably effective recommendation algorithm, which draws on a wide array of signals to keep users on the platform for as long as possible and show them exactly what they want to see, all while asking for very little explicit information up front.

So how does it manage to know so much about a user when it asks so little at sign-up? Well, TikTok claims it uses your shares, likes, and the content you consume to recommend "For You" videos, but is that it? The accuracy of the recommendations suggests the app may be drawing on much more than that.

An experiment carried out by the Wall Street Journal suggests that your watch time, not necessarily your likes, shares, or even which videos you tap on, may be far more valuable to the TikTok algorithm. Every time you hesitate before scrolling past a recommended video, rewatch one, or simply linger on it, the app feeds that watch time into its model and maps out the kind of content it should recommend next, often with mind-blowing accuracy.
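
To make the mechanism concrete, here is a minimal sketch in Python of how a watch-time signal could be folded into a per-topic interest profile. Everything here is hypothetical: the topic labels, the decay factor, and the completion-ratio weighting are illustrative assumptions, not TikTok's actual (proprietary) system.

```python
from collections import defaultdict

class WatchTimeProfile:
    """Toy model of a watch-time-driven interest profile.

    Topic labels, the decay factor, and the weighting scheme are all
    illustrative assumptions; TikTok's real recommender is proprietary.
    """

    def __init__(self, decay: float = 0.9):
        self.decay = decay                 # how quickly older interests fade
        self.scores = defaultdict(float)   # topic -> accumulated interest

    def record(self, topic: str, watched: float, length: float) -> None:
        # Completion ratio: lingering (or rewatching) pushes this toward,
        # or past, 1.0 -- the "hesitation" signal described above.
        completion = watched / length
        for t in self.scores:              # fade every existing interest a little
            self.scores[t] *= self.decay
        self.scores[topic] += completion   # then reinforce the topic just watched

    def top_topics(self, n: int = 3) -> list[str]:
        return sorted(self.scores, key=self.scores.get, reverse=True)[:n]
```

Notice that a profile like this never asks the user anything; it simply watches them watch. Feed it a session where someone lingers on sad videos and "sad" quickly floats to the top of top_topics().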

For example, say you are feeling emotionally low, so you open the app to watch a few videos. Remember, at no point did you tell the app how you are feeling. You scroll through the stream and start watching videos at random. When you come across a sad one, naturally, because it matches your current state of mind, you spend a bit more time on it than on the others. The app records this.

So as you keep scrolling and spending more time on sad videos, the app starts recommending more sad content, because from monitoring your watch time it assumes that is what you want to watch. After enough videos, it can reach a point where over 90% of what the app recommends is sad content. This is, of course, a problem.
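
That rabbit-hole effect can be reproduced with a tiny simulation. This is a deliberately crude sketch: it assumes just two topic labels and a user who watches sad videos to completion but skips halfway through everything else. The starting weights and reinforcement rule are assumptions chosen for illustration, not TikTok's actual mechanics.

```python
import random

random.seed(0)
scores = {"sad": 1.0, "other": 1.0}   # start with no preference either way

def recommend() -> str:
    # Recommend a topic with probability proportional to its current score
    return random.choices(list(scores), weights=list(scores.values()))[0]

for step in range(200):
    topic = recommend()
    # Simulated watch time: the downhearted user finishes sad videos
    # but skips half of everything else
    completion = 1.0 if topic == "sad" else 0.5
    scores[topic] += completion        # watch time reinforces the topic

sad_share = scores["sad"] / sum(scores.values())
print(f"Weight on 'sad' after 200 videos: {sad_share:.0%}")
```

Because sad videos earn double the reinforcement on every view, the feed's weight drifts steadily toward "sad" with each scroll, and running the loop longer pushes the share ever closer to 100%. That is exactly the cycle described above.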

TikTok's user base consists mostly of teenagers, who are at a stage where their emotions are volatile and they are still learning a great deal about themselves. The way the TikTok algorithm works can end up amplifying those sad emotions, pulling teens into a rabbit hole of sad content that is plainly not good for them. TikTok defends itself by stating that most harmful sad content, such as self-harm videos, is removed from the platform by its content moderation tools, but is that enough?

Even without explicitly harmful content, exposure to so much repetitive sad content can be enough to severely affect the mental well-being of these young people. Much more should be done to break such cycles and pull users out of these content rabbit holes.

From filter bubbles to polarization and groupthink, the adverse effects of the algorithms that made these apps so successful have been clear and well documented since the dawn of the social media age. Yet to this day, the platforms themselves have done little to curb the negative effects of their creations. Why not? Because those effects are what bring the profits home. Engagement is everything to these apps, and users spending as much time on them as possible is what is good for their bottom line, so expecting them to change their algorithms voluntarily is futile.

The least they can do is offer algorithmic transparency and education, so that users know exactly what they are getting into by engaging with certain types of content. If users understand how their usage habits shape the content they see, perhaps they can be more wary about what they consume. This transparency and education could come in the form of disclaimers on content and much clearer terms of use.

Social media has contributed positively to our lives over the years, but that does not mean its negative effects should be swept under the rug. Algorithms get better by being fed our information and habits, so why can't tech companies return the favor by helping users understand their modus operandi through education and transparency, so that we can use these apps more wisely? That sounds like a fair trade to me.
