Forbes: Facebook's Experiment Reveals A Much Deeper Problem With The Internet Today
When a technology company behaves badly, you hear one defense brought up repeatedly: they could have done so much worse. When Google decided it would use your face in its advertisements, you shouldn't have been outraged; you should have been relieved it didn't tell everyone your darkest secrets. The message is that, given what they know about you, you should be grateful they treat you as well as they do.
This line of thinking comes up frequently in discussions of Facebook's now-infamous 2012 "emotional contagion" study. There are legitimate questions as to whether Facebook crossed ethical lines, but the more important issue for users is how easily the company was able to do it. It's not hard to demonize Facebook for its actions, but the same problem exists with every major company on the internet today.
The current relationship between consumers and internet companies is unsustainable.
It's built on agreements that would take weeks to read, if anyone ever actually attempted to read them. We trust companies to provide us relevant information by "personalizing" our experience, but we give little thought to the downsides. While we are aware that websites are "optimizing" content for us, we don't think about how that constrains what we see. And the enormous amount of personal information collected about us over the last decade will continue to be acted upon - the Facebook experiment was just a glimpse of what is ahead.
The personalization of the internet can bring greater convenience, but it comes at a cost. On an individual level, we still don't really comprehend just how much of what we see is specifically tailored to us. Our Google searches often turn up different results than someone else's - while this may not seem like much, in aggregate it can have an unexpected impact. One study even suggested that Google's algorithms could sway the outcome of an election. A personalized internet is one dominated by intermediaries who decide what to show you and when.
The promise of the internet has always been the free exchange of ideas, but that potential is being undermined by intermediaries that filter everything we see and do.
Meanwhile, there is an arms race to delve deeper into your personal information to make it actionable. While the last ten years were focused on how to collect as much information as possible, the next will be focused on how to turn that information into action. Legal scholar Ryan Calo argues that we need to watch out for "digital market manipulation" here - where companies use your background, details, and emotional state to coerce you into buying products you don't need or paying higher prices than you normally would. He's got a point; knowing and influencing your emotional state can be a major advantage in getting your attention, a factor that influenced Facebook to undertake this study in the first place.
For as long as they've been around, the news media and advertisers have been trying to emotionally manipulate us, but the intimacy and specificity with which messages can be targeted to you online set this apart entirely. Grouping all of these things together and suggesting they're the same is not just disingenuous, it's unproductive. Facebook's investors and backers may see short-term gains in trivializing these discussions, but these issues need to be addressed. So much of the real-world success of science and technology depends on how, and how appropriately, it is applied.
The different forces at play here need to be taken apart and analyzed individually, each in its own context. A/B testing, for instance, can benefit users because it helps companies figure out how to get them what they're looking for. And the need for curation is simply a reality of the modern internet - Facebook, for example, has to choose from an average of 1,500 different stories to show you in your News Feed. Intention matters enormously here, but it is quite hard to measure, let alone regulate.
The reality is that in an environment where companies control all the information, decide when and how it is displayed, and don't have to discuss or disclose what they are doing, you will always be a "lab rat" - whether you are part of a Facebook study or not. The terms that consumers agree to today will be used to justify whatever companies want to do tomorrow; they will determine how you are treated. The status quo is broken, but with every battle consumers have the opportunity to shape a more mutually beneficial internet. Facebook may have done us all a favor by giving us a chance to say that we draw the line at emotional manipulation.