Get Out My Pocket: Facebook’s War on Free Will

Over the past five years, I have saved hundreds of long-form articles using Pocket.

My intent was to save articles I didn’t have time to read in the moment, but as time passed my eyes proved bigger than my attention span. The problem is that there is SO MUCH GREAT LONG-FORM WRITING out there.

Seriously. If you are on any social media platform, you will be exposed to at least 10 truly great articles, and those are just the ones that people in your network shared. Believe it or not, there is a wide world outside of that. So over these five years, I have found myself saving everything. As a result, I have hundreds of saved articles begging to be read, and I’m going to read them in random order.

The first article is Franklin Foer’s Guardian piece from September 19, 2017, called “Facebook’s war on free will.” To sum up the article, I’ll paraphrase Kanye West: “No one (corporation) should have all that power.”

So what power does Facebook have? Well, it all lies in the algorithm. We’ve all heard the word before, but Foer explains how it is actually used:

The essence of the algorithm is entirely uncomplicated. The textbooks compare them to recipes – a series of precise steps that can be followed mindlessly. This is different from equations, which have one correct result. Algorithms merely capture the process for solving a problem and say nothing about where those steps ultimately lead.

These recipes are the crucial building blocks of software. Programmers can’t simply order a computer to, say, search the internet. They must give the computer a set of specific instructions for accomplishing that task. These instructions must take the messy human activity of looking for information and transpose that into an orderly process that can be expressed in code. First do this … then do that. The process of translation, from concept to procedure to code, is inherently reductive. Complex processes must be subdivided into a series of binary choices. There’s no equation to suggest a dress to wear, but an algorithm could easily be written for that – it will work its way through a series of either/or questions (morning or night, winter or summer, sun or rain), with each choice pushing to the next.
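Foer’s dress example — a chain of either/or questions, each answer pushing to the next — can be sketched as a tiny decision procedure. This is purely illustrative; the function and its choices are invented for this post, not taken from any real recommender:

```python
# A toy "suggest a dress" algorithm: a chain of either/or questions,
# each binary choice narrowing the outcome. Purely illustrative.

def suggest_dress(time_of_day, season, weather):
    """Walk through binary choices and return a suggestion."""
    if time_of_day == "night":
        base = "evening dress"
    else:
        base = "day dress"
    fabric = "wool" if season == "winter" else "cotton"
    layer = "with a raincoat" if weather == "rain" else "no layers needed"
    return f"{fabric} {base}, {layer}"

print(suggest_dress("morning", "winter", "rain"))
# → "wool day dress, with a raincoat"
```

Notice there is no single “correct” answer being computed, only a fixed sequence of steps followed mindlessly — exactly the recipe-versus-equation distinction Foer draws.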

In short, algorithms are one of the many steps that will lead us to the singularity. When you log on, you are surrounded by them. They recommend a TV show on Netflix, a song or album you might like on Spotify, an item you might be interested in buying on Amazon, and that barely scratches the surface. Algorithms are capable of doing things that a room full of the brightest minds on Earth can’t even comprehend. Foer continues:

Algorithms can translate languages without understanding words, simply by uncovering the patterns that undergird the construction of sentences. They can find coincidences that humans might never even think to seek. Walmart’s algorithms found that people desperately buy strawberry Pop-Tarts as they prepare for massive storms.

So what does this have to do with Facebook? Everything.

Over a billion people have Facebook accounts. More people than you would like to admit use Facebook as a one-stop shop for socializing, news, entertainment, and memes. While you get all of this information on your timeline, it isn’t by chance that it ends up there. It is, you guessed it, algorithms that suggest posts and point you toward news that reflects your mindstate. It was this very situation that allowed Facebook to be exploited during the last presidential election.

If it were only the news you received, the videos you played, or the memes you saw, that would be bad enough. But Facebook’s power goes further: its enormous user base allows it to run experiments.

Facebook sought to discover whether emotions are contagious. To conduct this trial, Facebook attempted to manipulate the mental state of its users. For one group, Facebook excised the positive words from the posts in the news feed; for another group, it removed the negative words. Each group, it concluded, wrote posts that echoed the mood of the posts it had reworded. This study was roundly condemned as invasive, but it is not so unusual. As one member of Facebook’s data science team confessed: “Anyone on that team could run a test. They’re always trying to alter people’s behaviour.”
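The mechanic described above — excising positive words from one group’s feed and negative words from another’s — boils down to sentiment filtering. Here is a minimal sketch of that idea; the word lists and function are invented for illustration, not Facebook’s actual code:

```python
# Toy version of the experiment's mechanic: filter a feed so one group
# sees fewer positive posts and another sees fewer negative ones.
# Word lists and logic are hypothetical.

POSITIVE = {"happy", "great", "love", "wonderful"}
NEGATIVE = {"sad", "terrible", "hate", "awful"}

def filter_feed(posts, suppress):
    """Drop any post containing a word from the suppressed sentiment set."""
    kept = []
    for post in posts:
        words = set(post.lower().split())
        if not words & suppress:  # no overlap with suppressed words
            kept.append(post)
    return kept

feed = ["What a wonderful day", "Traffic was terrible", "Lunch was fine"]
print(filter_feed(feed, NEGATIVE))  # the group shown fewer negative posts
```

A few lines of code, applied to hundreds of thousands of feeds, is all it takes to tilt what an entire population sees.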

If this were a thought experiment, that would be one thing. The fact that this was an actual experiment run on real people is scary. What’s even scarier is that no one knew it was taking place besides the people running it. If someone had suggested this type of abuse of power was possible a few years ago, they would have been accused of being a conspiracy theorist.

The ability to influence that many people through suggestion is dangerous. One could argue there are positives in subconsciously nudging people away from negative behaviors. I would say that it impinges on someone’s free will, and it is a slippery slope destined to lead to abuses.

The many Facebook experiments add up. The company believes that it has unlocked social psychology and acquired a deeper understanding of its users than they possess of themselves. Facebook can predict users’ race, sexual orientation, relationship status and drug use on the basis of their “likes” alone. It’s Zuckerberg’s fantasy that this data might be analysed to uncover the mother of all revelations, “a fundamental mathematical law underlying human social relationships that governs the balance of who and what we all care about”.
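Predicting traits from “likes” alone can feel like magic, but the underlying idea is mundane: compare a user’s likes against patterns seen in labeled examples. This toy sketch uses invented data and a crude overlap score; the real research behind that claim used large-scale statistical models, not anything this simple:

```python
# Toy illustration of trait prediction from "likes": score a user's likes
# against labeled example profiles and pick the best match.
# All pages, labels, and the method itself are hypothetical.

TRAINING = {
    "single": {"page_a", "page_b", "page_c"},
    "in_a_relationship": {"page_c", "page_d", "page_e"},
}

def predict_status(user_likes):
    """Return the label whose example likes overlap most with the user's."""
    return max(TRAINING, key=lambda label: len(TRAINING[label] & user_likes))

print(predict_status({"page_a", "page_c"}))  # → "single"
```

The unsettling part is not the cleverness of the method but the scale of the data: with enough users and enough likes, even crude pattern-matching becomes revealing.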

Consider all the information we willingly give Facebook: with each status, each image, each comment, we give a little bit more of ourselves. Even that is not enough for Facebook. Its goal is not altruistic; it is to control us, to point us in the direction that benefits it the most. They strive to know us better than we know ourselves by keeping track of our likes, what we view, and more. It’s all enough to make me wish I could give it all up, but it might be too late for all of us.