Facebook users are constantly annoyed by the social network's ongoing tweaks to their newsfeed.
Now it has emerged that some of the social network's changes were part of a massive psychological experiment conducted with the cooperation of the company, The Atlantic reports.
This tinkering was just revealed as part of a new study, published in the prestigious Proceedings of the National Academy of Sciences. Many previous studies have used Facebook data to examine “emotional contagion,” as this one did. This study is different because, while other studies have observed Facebook user data, this one set out to manipulate it.
It turns out that, two years ago, Facebook allowed data scientists to skew what almost 700,000 Facebook users saw when they logged into its service.
Specifically, researchers skewed the number of positive or negative terms seen by randomly selected users. Facebook then analyzed the future postings of those users over the course of a week to see if people responded with increased positivity or negativity of their own, thus answering the question of whether emotional states can be transmitted across a social network. Result: They can.
The experiment is almost certainly legal, by The Atlantic's reckoning. In the company's current terms of service, Facebook users relinquish the use of their data for "data analysis, testing, [and] research."
But the broader question remains: Was it ethical?