
Why you shouldn’t care about the Facebook experiment

Furore over the social network's happiness algorithm is mere moral panic.

By Cbr Rolling Blog

For some time the internet has proved to be a chaotic orchestra of sanctimonious clatter. Correctly pitched, an event reverberates around the echo chambers of social media, drawing out cranks of all kinds to add their wail to the din, shortly before the next event starts a second tune.

A slight tweak in Facebook’s filtering algorithm sending happy content to some and unhappy content to others was always likely to cause controversy, and unsurprisingly many obliged. "I wonder if Facebook killed anyone with their emotion manipulation stunt," said Lauren Weinstein, a privacy activist. "At their scale and with depressed people out there, it’s possible."

Joining him was Jim Sheridan, a Labour MP who equated the algorithm to thought control, adding that people needed to be protected from the evil manipulative corporations intent on infesting their minds with, er, whatever thoughts he believed Facebook was pushing.

Sheridan, like many others, seemed not to realise that all media relies on some sort of filter when it is aggregating content. Before computers we had editors, and now we have algorithms, choosing content that enrages, saddens or otherwise amuses us, because that’s what people enjoy. All Facebook did differently was to write up a paper in an academic journal at the end of it, in which it said the effect on users was minimal.

Controversy around the assumed consent of the tech industry predates the furore, with one software engineer from Washington DC even comparing the practice of pre-checked boxes for email subscriptions to rape culture, the alleged process by which society encourages or excuses sexual violence. She wants tech firms to stop manipulating users, and instead to leave the boxes unchecked so that people opt in themselves.

But if corporations are liable for the effects of everything they put before their audience, the information age is over. The internet has demonstrated the potential for the free flow of ideas more clearly than any other technology, and if people are upset with what they see, it is easier than ever to look elsewhere.

"Based on what Facebook does with their newsfeed all of the time and based on what we’ve agreed to by joining Facebook, this study really isn’t that out of the ordinary," said Katherine Sledge Moore, a psychology professor at Elmhurst College in Illinois. All it proved was that seeing positive content made you slightly happier, and negative content slightly sadder. Big deal.
