A Quick Note On The Hearing About Algorithmic Manipulation

A quick note from Fight for the Future on the hearing about algorithmic manipulation.

Algorithmic Manipulation Is A Poison

It is high time that legislators focused on the invisible forces that have been shaping the public’s experiences on major tech platforms for over a decade. Hopefully, this will be a first step toward accountability for the divisive, manipulative, and fear-mongering algorithmic practices that move like ghosts among us all. Opaque algorithms are one of Big Tech’s greatest harms to society, made all the more insidious because they are wholly invisible to the average person. By design, many algorithms on major platforms amplify the worst speech on the internet, with the intent of profiting by promoting this speech to the people most susceptible to it through micro-targeted ads. Algorithmic manipulation on the scale we’re talking about isn’t some special sauce that makes internet platforms great; it is a poison that distorts our views of ourselves and our world. Even children are being caught in this web of data collection and manipulation, served addictive content that harms their mental health and threatens their lifetime privacy.

Blowing up Section 230 won’t fix this. Constantly demanding that platforms remove more content faster won’t fix it. There is no silver-bullet solution that will address all harm while preserving freedom of expression and human rights. But one thing is clear: Big Tech has shown it is unwilling or unable to make responsible decisions about what content should be amplified or suppressed. These companies can’t have it both ways. If Big Tech doesn’t want to be the arbiter of truth, it shouldn’t be picking and choosing what goes viral either. We’re calling for an industry-wide moratorium on all algorithmic manipulation of content that is not wholly transparent and that cannot be easily controlled by the user.

YouTube's Role

It is a step forward that legislators are speaking with heads of public policy — the rule enforcers — rather than the CEOs themselves. It is also encouraging to see YouTube differentiated from Google at this hearing, as YouTube in particular has played a major role in spreading mis- and disinformation and is especially notorious for the ways it applies manipulative algorithms to children’s content. Additionally, while we are pleased to see representation from civil society and academia in Tristan Harris and Dr. Joan Donovan, we call on legislators to also include human rights organizations, experts in global freedom of expression, and representatives of marginalized groups, who are disproportionately impacted by changes in policy.