I’ve been unhappy on Facebook for quite a while. When I started using it so many years ago, it was a lot of fun. I loved finding old friends, and even making new ones through the platform. But in the past few years, it seems more and more depressing to be there. I’ve tried managing my experience to make it more positive—posting upbeat, non-political things, and snoozing any friends who are posting too many rants, diatribes, and scoldings (yes, I do that). What I found when I snoozed people on my newsfeed was that there were fewer actual posts and more ads, so I gradually stopped reading it. But I did skim. Every day. I couldn’t seem to stop checking the newsfeed. Eventually I came to the conclusion (not for the first time) that Facebook was wasting my time and, worse, making me unhappy. So I decided to deactivate my account, stopping short of deleting it because, well, there is all this archived stuff. But also, at the back of my mind, I wonder: Isn’t this how it is with addiction? You always think you are in control, that you can have just one drink, one cigarette, one hit of heroin, and that you can stop any time you want. So, I dunno. Maybe I will delete it eventually.
I made this decision earlier in the week, and have started the process of tying up the loose ends on the platform. In the meantime, the book An Ugly Truth: Inside Facebook’s Battle for Domination was released and I read it yesterday. It is a comprehensive history of Facebook’s policies and actions regarding its growth as a platform for “connecting,” which they (meaning, Zuckerberg) consider the single most important reason for their existence. It seems an innocuous purpose, but as it turns out, it has had far-reaching effects that have been terribly destructive. I urge everyone to read the book. It is horrifying. Here are some takeaways for me:
1) “Connecting” is everything, the most important thing, the thing that drives all policy decisions, the thing that overrides all. “Connecting” people is more important than any other ethical consideration.
2) At one point Facebook conducted an experiment on us. They divided users into two groups and set the newsfeed algorithms so that one group got positive posts and the other got negative posts. Over a very short period of time, they found that the positive-newsfeed readers posted positive things in response, and the negative-newsfeed readers posted negative things. Some people found out about the experiment and objected, and Facebook apologized and said that they meant no harm by it. They promised to fix the problem, but as the book shows, over and over, what they actually did was fix the PR. It turns out that this is their standard response to criticism.
From the book: “Facebook was well aware of the platform’s ability to manipulate people’s emotions, as the rest of the world learned in early June 2014, when news of an experiment the company had secretly conducted was made public. The experiment laid bare both Facebook’s power to reach deep into the psyche of its users and its willingness to test the boundaries of that power without users’ knowledge. “Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness,” Facebook data scientists wrote in a research paper published in the Proceedings of the National Academy of Sciences. They described how, over the course of one week in 2012, they had tampered with what almost 700,000 Facebook users saw when they logged on to the platform. In the experiment, some Facebook users were shown content that was overwhelmingly “happy,” while others were shown content that was overwhelmingly “sad.” A happy post could be the birth of a baby panda at the zoo. A sad post could be an angry editorial about immigration. The results were dramatic. Viewing negative posts prompted users to express negative attitudes in their own posts. And the more positive content a user saw, the more likely that user would spread upbeat content.”
3) Facebook’s (Zuckerberg’s) primary drive is “connections,” which they measure by “sessions”: how often and how long people check their newsfeeds and engage. And, as it turns out, the more negative the newsfeed is, the longer and more frequent the “sessions” are. The algorithms are set accordingly.
From the book: “But Facebook’s engineers, who were not aware of the experiment before the paper was published, recognized that it revealed something more nefarious about the News Feed’s ability to influence people. Over the years, the platform’s algorithms had gotten more sophisticated at identifying the material that appealed most to individual users and were prioritizing it at the top of their feeds. The News Feed operated like a finely tuned dial, sensitive to that photograph a user lingered on longest, or the article they spent the most time reading. Once it had established that the user was more likely to view a certain type of content, it fed them as much of it as possible.”
That is, the algorithms are designed to give users more of whatever they engage with the most. So whether you click more on negative news links, weird-ass conspiracy theories involving pizza places, or pictures of cute cats, the algorithms move those types of posts to the top of your newsfeed, thus creating an ever-expanding vortex of “more of same.” This might not be so bad when it involves cute cats, but when it concerns dangerous conspiracy theories?
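If you want to picture that feedback loop in code, here is a toy sketch I made up myself—nothing to do with Facebook’s actual systems, and all the category names and weights are invented—just to show how ranking by past engagement serves up ever more of the same:

```python
# A toy, illustrative sketch (not Facebook's code) of an engagement-driven
# feed ranker. Category names and weights are invented for illustration.
from collections import defaultdict

class ToyFeedRanker:
    def __init__(self):
        # Learned affinity of one user for each kind of content.
        self.affinity = defaultdict(lambda: 1.0)

    def rank(self, posts):
        # Put the content the user has engaged with most at the top of the feed.
        return sorted(posts, key=lambda p: self.affinity[p["category"]], reverse=True)

    def record_engagement(self, post, dwell_seconds):
        # Every click or lingering view nudges that category's weight up,
        # so the next ranking serves even more of it: the vortex.
        self.affinity[post["category"]] += 0.1 * dwell_seconds


ranker = ToyFeedRanker()
feed = [{"category": "cute cats"}, {"category": "conspiracy"}, {"category": "baby pandas"}]
ranker.record_engagement(feed[1], dwell_seconds=30)   # user lingers on the conspiracy post
print([p["category"] for p in ranker.rank(feed)])     # conspiracy now sits at the top
```

One lingering view is enough to push that category above the pandas, and every further click widens the gap.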
And finally, this from the book, after Facebook began to realize how much the negativity had started to become dangerous and out of control, and made a half-hearted attempt to fix the problem: “For the past year, the company’s data scientists had been quietly running experiments that tested how Facebook users responded when shown content that fell into one of two categories: good for the world or bad for the world. The experiments, which were posted on Facebook under the subject line “P (Bad for the world),” had reduced the visibility of posts that people considered “bad for the world.” But while they had successfully demoted them in the News Feed, therefore prompting users to see more posts that were “good for the world” when they logged into Facebook, the data scientists found that users opened Facebook far less after the changes were made. The sessions metric remained the holy grail for Zuckerberg and the engineering teams. The group was asked to go back and tweak the experiment. They were less aggressive about demoting posts that were bad for the world, and slowly, the number of sessions stabilized. The team was told that Zuckerberg would approve a moderate form of the changes, but only after confirming that the new version did not lead to reduced user engagement. “The bottom line was that we couldn’t hurt our bottom line,” observed a Facebook data scientist who worked on the changes. “Mark still wanted people using Facebook as much as possible, as often as possible.”
That is, they discovered that negative news, though it makes people unhappy, engages them more, and so they adjusted the algorithms to place negative news (and disinformation) in your newsfeed in order to maximize your “sessions.” More angry rants, fewer baby pandas. So. It turns out that there is a reason I have become increasingly miserable over the years while using Facebook.
It boils down to this: They do this deliberately because it maximizes user engagement.
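To make that logic concrete, here is another toy sketch—again entirely my own invention, not Facebook’s code, with made-up numbers—of what “tweaking the experiment” looks like when sessions are the one thing you refuse to hurt: you dial back the demotion of “bad for the world” posts until the sessions metric stops dropping.

```python
# Toy sketch of tuning a "bad for the world" demotion against a sessions
# metric. All numbers, names, and relationships are invented for illustration.

def simulated_sessions(demotion_strength):
    # Stand-in for the measured effect described in the book: the more
    # aggressively negative content is demoted, the fewer sessions
    # (an assumed, made-up relationship).
    baseline = 100.0
    return baseline * (1.0 - 0.3 * demotion_strength)

def approved_demotion(max_sessions_loss=0.05):
    # Start with aggressive demotion, then walk it back until the drop
    # in sessions stays inside the tolerated loss.
    baseline = simulated_sessions(0.0)
    strength = 1.0
    while strength > 0.0:
        loss = (baseline - simulated_sessions(strength)) / baseline
        if loss <= max_sessions_loss:
            return strength
        strength -= 0.1
    return 0.0

print(f"approved demotion strength: {approved_demotion():.2f}")
# With these made-up numbers, only a weak demotion survives the
# "don't hurt sessions" constraint.
```

The point of the sketch is the constraint, not the numbers: whatever version of “good for the world” gets approved is the one that leaves engagement intact.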
4) Facebook makes its money off of your data. They make us feel better with our “privacy settings,” but every time you like a post, that is a tiny piece of data about yourself. That photo of your chicken coop? Data. Your picture of the deep dish pizza you ate on vacay. The groups you belong to. The news stories you click on. The printer or pair of shoes you search for on Google—that feeds straight to Facebook, who feeds it to advertisers. Data, and data, and data, and data. Oh, and it is “connected” (that word again) to Instagram and WhatsApp, so they mine the data from there, too. All of this is handed over to parties (for a fee) who want to sell you something, whether that is a pair of shoes or a political idea. Thus it is that when you post about playing tennis with your friends, in a matter of seconds you begin to see ads for T-shirts cautioning people to fear grandmas who play tennis.
Maybe you thought you were using Facebook and Instagram to let your friends see the real side of you (as a socially awkward introvert, this was a large part of its appeal to me), but the real entity you were revealing yourself to was Facebook and its algorithms. They have got it down to a very precise science. You only think you are in charge. They tell us they’ll change, then give us a new button or something to mollify us, show some feel-good ads on television, and proceed to keep doing what they do.
5) When the Russians started their disinformation campaign in the 2016 election, the Facebook security team watched it happen in real time, reported it to their higher-ups, and the company proceeded to do nothing about it. They knew what was happening—that scores of Americans were falling for misinformation planted by a foreign power using fake accounts and posing as Americans (all internet trails led back to Saint Petersburg)—and they did nothing about it. They also knew that the Russians had hacked the Democratic National Committee’s emails and were hawking them to journalists. They literally—literally—sat there and watched it happening in real time. It was only the tip of the iceberg, as it turned out. Read the book for the whole story. Here is my takeaway from this part of the sordid history: Facebook is less interested in fixing very real problems with security and disinformation than in covering them up and managing the PR surrounding them. Oh, and the head of the security team that uncovered how the Russians were using Facebook? They forced him out of the company.
So I had at first decided to deactivate because Facebook was making me unhappy, but now I am deactivating because I have come to realize that I am unhappy by design and manipulation. Facebook is an unprincipled company run by a manchild with a poorly developed ethical center, and I don’t think the company has the will to fix what is broken. Maybe it is impossible to fix at this point. In their (Zuckerberg’s) view, more and more people are connecting every day, so what’s the problem?
My reaction to the book is my own. You can read it yourself and make up your own mind. If you are like me, what you read will sicken you. Written by two New York Times reporters, Sheera Frenkel and Cecilia Kang, it is heavily researched and sourced.
I’ll deactivate in a few days. In the meantime, I’ll post something in my newsfeed letting people know that I am coming over here, to my blog. Blogs are sort of passé these days, and maybe it is an illusion to think that I can escape data mining, but at least I control my own newsfeed here. I am an algorithm of one. I will try not to abuse that.