facebook’s existential crisis

read randy david’s Politics in the age of big data

… Facebook knew the potential uses of its platform in electoral campaigns, and, indeed, its people actively promoted these in workshops they gave to campaign strategists. For Zuckerberg and his associates, bringing electoral discourse to Facebook would not only increase their traffic, it was also good for democracy.

Clearly, this was a naïve view. There were people who saw Facebook’s uses beyond these civic-minded intentions. One of them was Alexander Nix, CEO of Cambridge Analytica, who said: “If you know the personality of the people you’re targeting, you can nuance your messaging to resonate more effectively with those key groups.” His firm today stands accused of improperly using data it had obtained from Facebook on false pretenses in order to craft campaign software for its clients, including some from the Philippines. Facebook itself is accused of the unauthorized sharing of users’ accounts with Cambridge Analytica, including those of some 1,175,870 Filipino users.

The issues against Cambridge Analytica and Facebook have mainly centered on breaches of privacy.  I think these issues pale in comparison to what has become Facebook’s biggest danger — the intensification of bigotry and partisan resentments resulting from the micro-targeted manipulation of Facebook users’ media feeds. Just take a look at the normalization of hate speech on social media. Politics in the age of big data preys upon desires, hopes, and fears that often lie at the level of the unconscious. That is what makes it insidious and, quite often, deadly.

INSIDIOUS (adj., gradually and secretly causing harm) is the operant word.  and it is said that facebook CEO mark zuckerberg was aware of it and went ahead anyway.  read  Facebook Founder Warns “God Only Knows What It’s Doing To Kids’ Brains” (nov2017)

The 38-year-old founding president of Facebook, Sean Parker, was uncharacteristically frank about his creation in an interview with Axios. So much so, in fact, that he concluded Mark Zuckerberg would probably block his account after reading it.

Confirming every ‘big brother’ conspiracy theory there is about the social media giant, Parker explained how social networks purposely hook users and potentially hurt our brains:

“When Facebook was getting going, I had these people who would come up to me and they would say, ‘I’m not on social media.’ And I would say, ‘OK. You know, you will be.’ And then they would say, ‘No, no, no. I value my real-life interactions. I value the moment. I value presence. I value intimacy.’ And I would say, … ‘We’ll get you eventually.’

“I don’t know if I really understood the consequences of what I was saying, because [of] the unintended consequences of a network when it grows to a billion or 2 billion people and … it literally changes your relationship with society, with each other … It probably interferes with productivity in weird ways. God only knows what it’s doing to our children’s brains.

“The thought process that went into building these applications, Facebook being the first of them, … was all about: ‘How do we consume as much of your time and conscious attention as possible?‘”

“And that means that we need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever. And that’s going to get you to contribute more content, and that’s going to get you … more likes and comments.”

“It’s a social-validation feedback loop … exactly the kind of thing that a hacker like myself would come up with, because you’re exploiting a vulnerability in human psychology.

“The inventors, creators — it’s me, it’s Mark [Zuckerberg], it’s Kevin Systrom on Instagram, it’s all of these people — understood this consciously. And we did it anyway.”

and this.  ANOTHER FACEBOOK EXECUTIVE ISSUES WARNING ABOUT ITS DISASTROUS EFFECT ON PSYCHOLOGY AND SOCIETY (dec2017)

In a recent talk at the Stanford Graduate School of Business, Facebook’s former vice president of user growth, Chamath Palihapitiya, made some rather startling comments about the impact Facebook and social media are having on human culture. He acknowledged feeling ‘tremendous guilt’ about his involvement with Facebook, citing the fact that the technology is so widely used that it is actually affecting how human beings interact with one another, upending our entire cultural history of communication.

When asked what ‘soul-searching’ he is doing right now, Palihapitiya responded:

“I feel tremendous guilt… I think in the back deep, deep recesses of our minds, we kind of knew something bad could happen…
 
It literally is a point now where I think we have created tools that are ripping apart the social fabric of how society works. That is truly where we are.
 
It is a point in time where people need to hard break from some of these tools, and the things that you rely on.
 
The short-term, dopamine-driven feedback loops we’ve created are destroying how society works…
 
No civil discourse, no cooperation, misinformation, mistruth. And it’s not an American problem… this is a global problem. 
 
It is eroding the core foundations of how people behave by and between each other.”

jaron lanier, said to be the founding father of virtual reality, speaking at a 2018 TED conference, attributed all the troubles of facebook and other tech giants like google to a “globally tragic” mistake made in the late 1990s and early 2000s.

“Early digital culture had a sense of lefty socialist mission about it,” he said, noting that a common sentiment in Silicon Valley at the time held that everything on the internet must be purely public and free. At the same time, there was (and is) an ongoing love affair with tech entrepreneurship and industry titans like Steve Jobs.

“How do you celebrate entrepreneurship when everything is free? The solution is ads,” Lanier said. “[Services like Google and Facebook] were free with ads. In the beginning, it was cute. Then the customers and other entities who use this system became more experienced and clever. Advertisement turned into behavior modification.”

Negative stimuli tend to rise to the top of social networks, Lanier said. This is because negative emotions rise up faster than positive ones. And that ultimately makes it easier for misinformation and other manipulative pieces of information to take over.

At this point, Lanier said, we shouldn’t call companies like Facebook social networks. “Call them behavior modification empires.”

yes.  behaviour modification through operant conditioning, “discovered” (in a manner of speaking), as in demonstrated and defined, by psychologist b.f. skinner:

Behavior which is reinforced tends to be repeated (i.e., strengthened); behavior which is not reinforced tends to die out, or be extinguished (i.e., weakened).

Skinner (1948) studied operant conditioning by conducting experiments using animals which he placed in a ‘Skinner Box’.

read smithsonian.com’s B.F. Skinner: The Man Who Taught Pigeons to Play Ping-Pong and Rats to Pull Levers.  read psychology today’s The New Skinner Box: Web and Mobile Analytics

… successful social media companies like Facebook and Instagram are able to capture and keep our attention for as long as possible by tapping into the evolutionary reward systems in our brains. While it’s no secret to anyone that a web engineer’s job is to do whatever it takes to keep you on their site, few individuals are aware that their methods are founded in the classic operant conditioning experiments conducted by BF Skinner. Skinner put rats in a cage and varied the type, amount and timing of rewards to reinforce different types of behavior. He found that when he manipulated the rats’ schedule such that rewards came at random times, the rats became more engaged and attentive so that they would not miss an opportunity to receive the much-anticipated reward. But his findings hold true for humans as well. For example, it is precisely this phenomenon that allows casinos to make most of their money from slot machines.

… While no research exists yet on the brain’s response to social media, we can generalize the results of classical operant experiments to help us hypothesize why so many of us obsessively check our social media accounts. As Skinner revealed, when a reinforcer rewards us enough times, we learn to return to it out of anticipation of future rewards. When we post content on Facebook (e.g. a status update), we are primed to monitor how many “likes” or comments our post will yield. The number of likes and comments we receive for our post then reinforces how brilliant, interesting or witty we feel and how much people love us. It is therefore reasonable to assume that you might get a brief burst of dopamine each time you receive positive social feedback on Facebook. However, the true effect may be remarkably more sinister.
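the variable-reward mechanism the excerpt describes can be sketched in a few lines of python. this is only a toy illustration of skinner’s schedules of reinforcement, not anything from the article: the function names, the seed, and the mean ratio of 5 are my own assumptions.

```python
import random

def variable_ratio_schedule(mean_ratio, n_responses, rng):
    """Variable-ratio (VR) schedule: each response is rewarded with
    probability 1/mean_ratio, so rewards arrive unpredictably but
    average one per `mean_ratio` responses -- the schedule Skinner
    found produced the highest, most persistent response rates."""
    return [rng.random() < 1.0 / mean_ratio for _ in range(n_responses)]

def fixed_ratio_schedule(ratio, n_responses):
    """Fixed-ratio (FR) schedule: reward exactly every `ratio`-th response."""
    return [(i + 1) % ratio == 0 for i in range(n_responses)]

rng = random.Random(42)  # seeded for reproducibility
vr = variable_ratio_schedule(5, 10_000, rng)
fr = fixed_ratio_schedule(5, 10_000)

# Both schedules pay out roughly one reward per 5 responses on average...
print("VR rewards:", sum(vr), "FR rewards:", sum(fr))

# ...but only under VR is the *gap* between rewards unpredictable,
# which is what keeps the subject checking.
gaps, last = [], -1
for i, rewarded in enumerate(vr):
    if rewarded:
        gaps.append(i - last)
        last = i
print("VR gap between rewards: min", min(gaps), "max", max(gaps))
```

the slot machine, and the “likes” counter, are variable-ratio schedules in this sense: the average payout rate is fixed, but the moment of the next reward is never predictable, so the cheapest strategy for not missing it is to keep responding.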

insidious.  sinister.  to think that facebook was initially intended to be, and seen as, a force for “good”.  read zuckerberg’s Testimony released just before his back-to-back Q & A in US senate and house hearings on the data scandal.

Facebook is an idealistic and optimistic company. For most of our existence, we focused on all the good that connecting people can bring. As Facebook has grown, people everywhere have gotten a powerful new tool to stay connected to the people they love, make their voices heard, and build communities and businesses.

… But it’s clear now that we didn’t do enough to prevent these tools from being used for harm as well. That goes for fake news, foreign interference in elections, and hate speech, as well as developers and data privacy. We didn’t take a broad enough view of our responsibility, and that was a big mistake. It was my mistake, and I’m sorry. I started Facebook, I run it, and I’m responsible for what happens here.

… It’s not enough to just give people a voice, we have to make sure people aren’t using it to hurt people or spread misinformation. It’s not enough to give people control of their information, we have to make sure developers they’ve given it to are protecting it too. Across the board, we have a responsibility to not just build tools, but to make sure those tools are used for good.

as it turns out, facebook’s euphoric mission, i.e., to focus on “all the good that connecting people can bring,” is an epic fail.  facebook has not only intensified bigotry and partisan resentments in this corner of the third world, it has also enabled divisiveness of toxic proportions, much to the delight no doubt of trad-pols who know only how to divide and rule.
