
How Facebook is Programming Us to Be Magical, Manic, and Monstrous


What if you assembled a group of super-smart private-school and Ivy League Ph.D. graduates from fields ranging from statistics and behavioral science to marketing and psychology? Then strengthen the team with the lateral thinking of a maverick pink-haired high-school dropout, and hand them a rich trove of 5,000 data points on each of 90 million individuals to work with. Finally, get a computer-scientist-turned-hedge-fund-billionaire to finance them as an angel investor, and embrace the Facebook culture of moving fast and breaking things.

What results do you get?

Is it ethical to discourage opposition voters from voting in Nigerian elections, to incite and exploit ethnic tensions in Latvia, to manufacture a “graffiti youth campaign” in Trinidad and Tobago, to manipulate Brexit and an American Presidential Election?

Technology alone is not enough. Neither are education, advanced science degrees, or raw personal intellect. If you don’t believe me, consider Enron, WorldCom, Tyco, Freddie Mac, AIG, Lehman Brothers, or Arthur Andersen, all of which had far more human and other resources than the SCL Group or its offshoot Cambridge Analytica [CA]. Ultimately, their brilliance, money, power, and technology enabled them to inflict far more harm than good.

Narrative Weaponization and Story

The most powerful instrument in the world is story. For millennia, military and political leaders have recognized and employed story as a weapon. Because if you can shape and control people’s story, you can manipulate and control them.

This is not a new phenomenon. However, advances in science and technology have only recently made it possible to weaponize story on a global scale, with hitherto unimagined repercussions and catastrophic efficiency.

The following is how it works:

First, you “harvest” data. Then you analyze it for “vulnerabilities,” and finally you exploit the discovered vulnerabilities to influence people’s behavior. That behavior might include whether or not people buy a product, whether or not they resist a repressive regime or a foreign invasion, whether or not they vote, and so on.

So, in principle, if you reach people through a medium they trust, deliver the right story in the right voice, one that plays on their specific weaknesses, and do it enough times, you can eventually get them to believe, buy, and do almost anything you want.
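For the sake of illustration only, here is a minimal Python sketch of that three-step flow. Every function name, data shape, and “vulnerability” heuristic below is hypothetical; the sketch merely shows the shape of the harvest–analyze–exploit pipeline described above, not any real system.

```python
# Illustrative sketch only: toy stand-ins for the harvest -> analyze -> exploit flow
# described above. All names, data shapes, and heuristics here are hypothetical.

def harvest(users):
    """Collect whatever behavioral signals are available for each user (made-up sample data)."""
    return {u: {"likes": ["politics", "fitness"], "posts_per_day": 3} for u in users}

def find_vulnerabilities(profile):
    """Flag traits suggesting which emotional appeal a user might respond to (toy heuristic)."""
    flags = []
    if profile["posts_per_day"] > 2:
        flags.append("craves_validation")
    if "politics" in profile["likes"]:
        flags.append("politically_engaged")
    return flags

def exploit(flags):
    """Pick the message judged most likely to nudge behavior, given the flagged vulnerabilities."""
    if "politically_engaged" in flags:
        return "Your vote is being stolen -- act now."
    return "Everyone you know is already doing this."

for user, profile in harvest(["alice", "bob"]).items():
    print(user, "->", exploit(find_vulnerabilities(profile)))
```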

That, regardless of truth or reality, is what weaponizing story is all about. And we now know it is more than a hypothesis, because that is exactly what the SCL Group and its offshoots Cambridge Analytica and AggregateIQ [AIQ] set out to do during Brexit and the recent American presidential election. The evidence suggests they succeeded, since even a “shift” of 2% of targeted votes would have been enough to change the outcome.

Let’s take a closer look at each phase of the process:

Data Collection and Monetization

Data is the new oil because it powers everything. So you need data. Accurate data. And a lot of it – the more the better, since there is, of course, never enough. To get data, you must go where the data is: primarily Facebook, but also Google, Amazon, Netflix, YouTube, other social media platforms, Internet Service Providers, cable TV, security camera feeds, credit card transactions, phone, library, and medical records – and anything else that is or can be tracked.

Facebook, like most corporations, is driven by the fundamental goal of making money for its shareholders. But unlike most other organizations, Facebook faces a unique situation: it collects more data than anybody else while lacking a significant alternative income stream. It must therefore keep innovating in order to capture ever more data on its users and convert it into cash. Thus, the need to monetize becomes the imperative to monitor and, ultimately, the facilitator of weaponization. To be as profitable as possible, Facebook has to turn itself into a vast monitoring operation. And so it has.

In other words, the problem with Facebook, as well as Google and others, is not a few “bad actors,” as COO Sheryl Sandberg described them; rather, their whole business model centers on gathering and exploiting personal data, which is then sold to companies that want to send tailored messages to specific users. So it bears emphasizing that we are Facebook’s product, not its customers. The actual customers are firms such as the SCL Group and Cambridge Analytica, who pay Facebook millions of dollars. Consequently, the more people and data Facebook can capture, the more money it can make. It is a business model built on monetization through monitoring. The same can be said of Google and, to a lesser extent, Amazon, Netflix, and most other internet behemoths.

Psychographics and PsyOps

When the military talks about “winning hearts and minds” in Afghanistan, Iraq, or anywhere else, they are referring to psychological operations [PsyOps]. PsyOps, in essence, use psychological methods to influence people’s ideas and behaviors. [You may also refer to it as social engineering or social programming if you choose.]

Psychographics is a method for identifying and understanding personality traits and characteristics, which can then be used to manipulate people’s psyches and dramatically alter how they think, feel, and act on any given issue. This matters because personality drives behavior, and behavior is how one does everything – buying, voting, arguing, and so on.

Cambridge researcher Alexander Kogan, for example, ran a quiz app that deceived Facebook users into handing over all of their data, along with all of their friends’ data, and in this way Cambridge Analytica obtained information on 87 million Facebook users, including 61 million Americans. The company then used that trove of private information, obtained both illicitly and for free, to build a model that could predict the personality of every single American.

The model used is experimental psychology’s OCEAN model, where the acronym stands for:

Openness – your willingness to try new things.

Conscientiousness – whether you value order, planning, and routines in your life.

Extraversion – how sociable you are.

Agreeableness – whether you prioritize the needs of others and society over your own.

Neuroticism – how much you tend to worry about things.

While the concept may seem simple, it is important to note that this is not just an academic exercise or an educated guessing machine. It is, rather, a scientifically rigorous model that enables them to target, forecast, test, and fine-tune the message delivered to each individual voter or consumer, based on that person’s specific vulnerabilities, and to elicit the intended outcome or action. [Again, if you prefer, you can call it social engineering, programming, or good old-fashioned brainwashing.]
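To make the idea concrete, here is a minimal, purely hypothetical Python sketch of what an OCEAN profile looks like as data and how a crude score might be derived from harvested signals. The trait weights, the 0-to-1 scale, and the notion of a single “dominant” trait are simplifications invented for illustration; they are not Cambridge Analytica’s actual model.

```python
from dataclasses import dataclass

# Hypothetical page-like signals and hand-picked weights -- purely illustrative,
# not real coefficients from any actual psychographic model.
TRAIT_WEIGHTS = {
    "openness":          {"travel_pages": 0.6, "art_pages": 0.4},
    "conscientiousness": {"planner_apps": 0.7, "finance_pages": 0.3},
    "extraversion":      {"event_rsvps": 0.8, "tagged_photos": 0.2},
    "agreeableness":     {"charity_pages": 0.9},
    "neuroticism":       {"news_alarm_pages": 0.5, "late_night_posts": 0.5},
}

@dataclass
class OceanProfile:
    scores: dict  # trait name -> score in [0, 1]

    def dominant_trait(self) -> str:
        """The highest-scoring trait: a toy proxy for 'which button to push'."""
        return max(self.scores, key=self.scores.get)

def score_profile(signals: dict) -> OceanProfile:
    """Turn harvested signals (each normalized to 0..1) into five trait scores."""
    scores = {}
    for trait, weights in TRAIT_WEIGHTS.items():
        scores[trait] = sum(w * signals.get(feature, 0.0) for feature, w in weights.items())
    return OceanProfile(scores)

voter = score_profile({"news_alarm_pages": 0.9, "late_night_posts": 0.8, "travel_pages": 0.2})
print(voter.scores)
print(voter.dominant_trait())  # -> "neuroticism" for this made-up voter
```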

Micro-Targeting

Once each voter’s psychological profile has been determined, you can follow up with a micro-targeted message personalized to that individual, with the purpose of influencing their behavior.

This means that instead of shouting in the town square, you can now go around whispering in everyone’s ear, knowing everything there is to know about each person. Facts and truth are beside the point at this stage, since it is all about pushing people’s buttons by exploiting their weaknesses in order to get them to do your bidding. So you say one thing to one voter and something very different to another.
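Mechanically, “whispering a different message in each ear” can be pictured as a lookup from a voter’s strongest trait to an ad variant. The sketch below is hypothetical: the trait labels echo the OCEAN example above, and every message and mapping here is invented purely for illustration.

```python
# Hypothetical mapping from a voter's strongest OCEAN trait to a tailored ad variant.
MESSAGE_VARIANTS = {
    "neuroticism":       "Crime is rising in your area -- only one candidate will keep you safe.",
    "openness":          "Imagine what we could build together -- be part of something new.",
    "conscientiousness": "Read the candidate's detailed ten-point plan for restoring order.",
    "extraversion":      "Thousands are joining the rally this weekend -- bring your friends.",
    "agreeableness":     "This election is about protecting the people you love.",
}

def tailor_message(dominant_trait: str) -> str:
    """Return the ad copy matched to the trait believed to drive this particular voter."""
    return MESSAGE_VARIANTS.get(dominant_trait, "Make your voice heard -- vote.")

# Two voters, two very different whispers, one campaign.
print(tailor_message("neuroticism"))
print(tailor_message("openness"))
```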

For example, you can demoralize Bernie supporters to the point that they don’t vote at all. By manipulating their particular worries and emotions with fake news and other tailored content, you can push moderates toward the extremes. You can exploit existing cultural, religious, or racial divisions, or create new ones. It makes no difference what you do or whether it is true, as long as it serves your objective. As Mark Turnbull, managing director of Cambridge Analytica, explains:

“It’s pointless fighting an election campaign on facts, because it’s all about emotion. The biggest error political parties make is attempting to win the debate rather than discovering the emotional root of the problem – the worry – and appealing directly to it.”

In other words, regardless of truth or facts, it is all about influencing people’s emotions. Accordingly, Cambridge Analytica provides its clients with a scientifically fine-tuned, full-service, tailored propaganda engine designed to do precisely that. And they are more than willing to lease it to anybody who can afford it, including corrupt and dictatorial regimes around the world. However, the SCL Group, CA, and AIQ are only the visible tip of the iceberg. Hundreds of other corporations and organizations, from Palantir to the Russian Federal Security Service, are working on or have already achieved similar capabilities.

How Facebook is Programming Us to Be Magical, Manic, and Monstrous

As the well-known futurist Gerd Leonhard has remarked, Facebook began as pure magic. Then it became so addictive that we turned manic. Finally, it revealed itself to be monstrous. But if you assume this was an accidental outcome, unforeseen by its naive but idealistic creators, you would be wrong. Sean Parker, Facebook’s founding president, has stated publicly:

“The thought process that went into building these applications, with Facebook being the first to really understand it, was all about consuming as much of your time and conscious attention as possible.”

“That means we’ll have to give you a little dopamine hit every now and then, because someone liked or commented on a photo or a post or whatever. And that will encourage you to offer more material, which will result in… more likes and comments.”

“It’s a social-validation feedback loop … exactly the kind of thing that a hacker like myself would come up with, because you’re exploiting a vulnerability in human psychology.”

“The creators and inventors — me, Mark [Zuckerberg], Kevin Systrom on Instagram, all of these people — understood this consciously. And we still did it.”

In other words, Facebook set out from the start to create an addictive platform that exploited flaws in human nature. Facebook is now actually programming us.

It is conditioning us to log in, check in, click, tag, share, update, stay updated, go live, like, hate, get angry, be pleased, get depressed, friend, form a group, unfriend, give up privacy, and basically never leave the platform. It has conditioned us to rely on and trust it to the point that we cannot imagine living without it. So much so that the question “Do you have Facebook?” has become obsolete. The real question is “Does Facebook have you?” And, of course, the answer is “Yes, it does” – regardless of whether you have an account or not. [So I’m not certain that the #DeleteFacebook movement is the best long-term strategy, though I hope it has some short-term benefits.]

Democracy, Privacy, and Free Will

The truth is that Facebook has become the most popular venue for telling stories. On the one hand, it has told us all a story – the story of what Facebook is [or, rather, what Facebook purports to be]. On the other hand, it has succeeded so spectacularly because it has also given all of us a platform to record and share our own stories. So, in one way or another, the story of most individuals and most businesses ends up there.

However, it is Facebook’s storytelling platform, not ours. They control the medium, they control the message, they control the stories we see, and they increasingly control us. As Plato observed, “Those who tell the stories rule society.” Given Facebook’s success at telling stories and programming us to listen and respond, it should come as no surprise that a seemingly endless number of people and organizations want to jump on the Facebook bandwagon and program their own messages into our psyches. After all, what is personal data to us is money in Facebook’s bank account. [This is why, according to Roger McNamee, one of Facebook’s early investors, “the incentives are fundamentally stacked against user privacy.”]

So, when a mercenary contractor with a career in military intelligence and psychological warfare in the service of assorted authoritarian governments looks at Facebook, it’s like a wet dream come true. Because to them, Facebook does not stand for social; it stands for surveillance. Thus, the need to monetize becomes the enabler to weaponize. Of course, data harvesting, psychographics, and micro-targeting are not just part of Cambridge Analytica’s business strategy. First and foremost, they are Facebook’s business model. The difference is that, at least in the past, Facebook used them to show us advertisements that sold us goods. But the genie is out of the bottle, and political campaigns will never be the same again.

The next question is, “Why stop at elections?”

Soon, AI will be advising us not just on how to vote, but also on where to go to school, what sports to play, whom to hang out with, where to eat, what books to read, what movies to watch, whom to marry, and what to do with our lives. Do you believe Facebook isn’t doing everything possible to ensure that it is their AI whispering in our ears? [Of course, Google is no stranger to such aspirations, with Eric Schmidt revealing that Google is not a search engine but an AI company striving to know what you want to know before you want to know it.]

And now we’ve arrived at the crux of the issue:

Can free will or democracy survive in a world where the loss of personal privacy, big data, AI, and a wide range of actors – from ill-intentioned foreign governments to trillion-dollar corporations – have made brainwashing an exact science?

According to Cambridge Analytica whistleblower Christopher Wylie:

“We risk fragmenting society to the point where we no longer have shared experiences or shared understanding. How can we be a functional society if we don’t have any more shared understanding?”

Of course, the Silicon Valley lobby has pushed the US government into a hands-off approach, essentially allowing firms to do anything they want with our data. We are only now becoming aware of the consequences. But this is only the beginning. Things are likely to get a lot worse before they get any better.

Facebook’s initial motto was “move fast and break things,” and it has done both admirably: it has moved fast to reach more than 2 billion people, and it has broken significant things, such as democracy, in the cases of Brexit, Trump, and others. Furthermore, Chamath Palihapitiya, Facebook’s former vice president of user growth, has stated that social media in general, and Facebook in particular, is tearing apart the fabric of how society functions. [And for what?! Making money by exploiting psychological flaws and selling our personal information.]

We are only now becoming aware of the costs and risks that come with Facebook, but it is unlikely that the company will change its core business strategy and risk slower growth. According to Andrew Bosworth, a co-creator of Facebook’s News Feed and currently head of its virtual reality efforts, anything the company [Facebook] did to grow was “de facto” good and justified, even if it enabled bullying or terrorism and thereby cost lives.
