Perhaps YouTube Fixed Its Algorithm. It Did Not Fix Its Extremism Problem

Cameron Ballard is Director of Research at Pluro Labs, a non-profit that harnesses AI to deepen and defend democracy in the United States and globally.

Recent research suggests that YouTube has substantially addressed the problem of online “rabbit holes” that lead individuals to extreme content and misinformation. The reality is that, whatever improvements have been made to its algorithms, YouTube remains a massive repository of dangerous content that spreads across other social media and messaging apps both organically and through recommendations, particularly in non-English-speaking communities. Too little is known about these phenomena, but what is clear is that YouTube is hardly without fault when it comes to the overall volume of hatred, conspiracy theories, and misinformation on social media.

Algorithms are not the entire story

Algorithms are undeniably influential in modern life. They affect not just the online content we consume, but access to credit, employment, medical treatment, judicial sentences, and more. The companies that make them present them as inscrutable systems, impossible for outsiders to understand. The supposed complexity of an algorithm is not just marketing; it also allows tech companies to shirk responsibility for their own policies and development priorities. When something goes wrong, a “bad algorithm” is blamed.

However, if you peel back the layers of statistical complexity, an algorithm is, at the end of the day, just a set of instructions: a recipe. If you go to a restaurant and are told the food is bad just because…