For some time the internet has proved to be a chaotic orchestra of sanctimonious clatter. Correctly pitched, an event reverberates around the echo chambers of social media, drawing out cranks of all kinds to add their wails to the din, shortly before the next event strikes up a new tune.

A slight tweak to Facebook’s filtering algorithm, sending happy content to some users and unhappy content to others, was always likely to cause controversy, and unsurprisingly many obliged. "I wonder if Facebook killed anyone with their emotion manipulation stunt," said Lauren Weinstein, a privacy activist. "At their scale and with depressed people out there, it’s possible."

Joining him was Jim Sheridan, a Labour MP who likened the algorithm to thought control, adding that people needed to be protected from evil, manipulative corporations intent on infesting their minds with, er, whatever thoughts he believed Facebook was pushing.

Sheridan, like many others, seemed not to realise that all media rely on some sort of filter when aggregating content. Before computers we had editors; now we have algorithms, choosing content that enrages, saddens or otherwise amuses us, because that is what people enjoy. All Facebook did differently was to write up the results in an academic journal, concluding that the effect on users was minimal.

Controversy around assumed consent in the tech industry predates this furore: one software engineer from Washington DC even compared the practice of pre-checked boxes for email subscriptions to rape culture, the alleged process by which society encourages or excuses sexual violence. She wants tech companies to drop the pre-checked boxes and instead ask users to tick them themselves.

But if corporations are liable for the effects of everything they put before their audience, the information age is over. The internet has demonstrated the potential for the free flow of ideas more clearly than any other technology, and if people are upset by what they see, it is easier than ever to look elsewhere.

"Based on what Facebook does with their newsfeed all of the time and based on what we’ve agreed to by joining Facebook, this study really isn’t that out of the ordinary," said Katherine Sledge Moore, a psychology professor at Elmhurst College in Illinois. All the study proved was that seeing positive content made users slightly happier, and negative content slightly sadder. Big deal.