I bet you thought Facebook tweaked your feed to provide you with “relevant” information you might find useful.
What if you found out that instead they “tweaked” it in an attempt to determine whether they could manipulate your emotional state?
Scientists at Facebook have published a paper showing that they manipulated the content seen by more than 600,000 users in an attempt to determine whether this would affect their emotional state. The paper, “Experimental evidence of massive-scale emotional contagion through social networks,” was published in the Proceedings of the National Academy of Sciences. It shows how Facebook data scientists tweaked the algorithm that determines which posts appear on users’ news feeds: specifically, the researchers skewed the number of positive or negative terms seen by randomly selected users. Facebook then analyzed those users’ subsequent posts over the course of a week to see whether they responded with increased positivity or negativity of their own, thus answering the question of whether emotional states can be transmitted across a social network. Result: they can.
There seems to be this idea that you can get “something for nothing,” or that the “something” you “pay” is in some way beneficial. But what if it’s not? What if it’s intended to manipulate how you think and respond to events around you? And what if that manipulation doesn’t end with an advertising message trying to get you to buy one product over another?
A rather liberal reading of Facebook’s terms would probably lead you to conclude that you agreed to this when you signed up for an account. But this much is certain: you weren’t told about it in advance, and you didn’t explicitly consent.
This particular instance came to light only because it led to a published research paper. How much of this is going on in other venues entirely in secret, covered by the same sort of “policy agreement,” without your consent?
My best guess: A whole lot of it.