
Extremist Recommendations — Unanticipated Consequences of AI

Babar M Bhatti
2 min read · Apr 11, 2019
Image by Pete Linforth — Pixabay

As technologists, it is hard for many of us to admit that our work may be used in negative ways. It is particularly hard to fess up if you are a company that generates $16 billion per year. We are talking about YouTube.

What is the problem? It has to do with the recommendation engine that suggests the next set of videos for users. It was originally designed to keep users on the site, i.e. keep them engaged. The more time users spend on YouTube, the more money it makes. So far so good.

The problem, noticed by many in the industry and openly discussed by a former Google engineer who worked on the recommendation algorithm, is that the recommendations push the user towards ever more extreme, intense, shocking and fringe content: conspiracy theories, right-wing radicals, white nationalists. Essentially, the engine surfaces long-tail content that is likely to keep the user on the site longer.
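To make the dynamic concrete, here is a minimal sketch of an engagement-only ranker. This is not YouTube's actual system; every name, class, and number below is hypothetical, and real systems weigh many more signals.

```python
# Toy illustration of an engagement-only recommender (hypothetical,
# not YouTube's actual system): candidates are ranked purely by a
# predicted-watch-time score.

from dataclasses import dataclass


@dataclass
class Video:
    title: str
    predicted_watch_minutes: float  # hypothetical engagement estimate


def recommend_next(candidates: list[Video], k: int = 3) -> list[Video]:
    # With watch time as the only objective, whatever content scores
    # highest on that single metric keeps rising to the top.
    return sorted(
        candidates,
        key=lambda v: v.predicted_watch_minutes,
        reverse=True,
    )[:k]


candidates = [
    Video("Mainstream news recap", 2.1),
    Video("How-to tutorial", 3.4),
    Video("Shocking conspiracy exposé", 7.8),  # long-tail, high engagement
]

for video in recommend_next(candidates, k=2):
    print(video.title)
```

Under this single objective the fringe video wins every comparison, and each click feeds back into the next round of recommendations, which is exactly the drift described above.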

To be fair, YouTube is not the only user-generated-content site that faces this problem. However, it is one of the biggest in terms of audience size and impact.

This situation is especially dangerous given how many people, particularly young people, turn to YouTube for information. (NYT)


Written by Babar M Bhatti

AI, Machine Learning for Executives, Data Science, Product Management. Co-Founder Dallas-AI.org. Speaker, Author. Former Co-founder @MutualMind
