Recently, a team of international journalists uncovered an Israel-based group called “Team Jorge” that allegedly influenced elections in various countries through disinformation campaigns, blackmail, sabotage and hacking. This group, said to manipulate public opinion through disinformation and propaganda campaigns on social media, brought to mind the 2017 Cambridge Analytica scandal, which revealed that Facebook users’ data had been used in election campaigns.
How effective are the companies that claim to influence voters through social media? How is social media used in election campaigns?
Team Jorge
According to reports in The Guardian and Le Monde, three reporters posing as clients used hidden cameras to record online and in-person interviews with Tal Hanan, alias “Jorge”, the head of the Israeli private group Team Jorge.
Describing his team’s work as “black ops”, Hanan claims that they have worked for intelligence agencies, political campaigns and private companies in various countries in Africa, South America, Europe and the United States, manipulating public opinion.
The product Hanan is trying to peddle to the journalists is a software package called AIMS – Advanced Impact Media Solutions. The software allows for the control of thousands of fake accounts on Twitter, LinkedIn, Facebook, Telegram, Gmail, Instagram and YouTube. Hanan, who claims to have spread various fake news stories with these accounts, says he can hack Gmail and Telegram accounts, and amazes the reporters with demonstrations in which he breaks into the Gmail and Telegram accounts of previous clients’ associates.
What makes Hanan’s bots special is their sophistication. Some bots have linked accounts on multiple platforms, some even have credit card information. The bots, which Hanan claims are controlled by a special artificial intelligence, have been used in various operations on social media.
Cambridge Analytica
In a similar secret recording in 2018, Cambridge Analytica CEO Alexander Nix boasted to journalists posing as clients that they had influenced elections in several countries, describing the blackmail methods they used to eliminate their clients’ opponents and how they spread propaganda through fake accounts.
Cambridge Analytica (CA) is the American subsidiary of SCL (Strategic Communications Laboratory), an older and well-established British company. SCL, a behavioral science and strategic communications organization specializing in influencing mass behavior, was founded in 1993 by Nigel Oakes. Using data mining and data analysis methods, SCL worked on military disinformation and propaganda campaigns, social media branding and voter targeting, as well as campaigns to manipulate public opinion for various political figures and institutions, especially in underdeveloped countries.
CA was founded by SCL in 2012 to work on US elections and was involved in many campaigns. In 2017, CA’s inner workings were exposed, and a scandal erupted, through the confessions of a former employee, Christopher Wylie.
In silico: A computer model of society
Born in 1989, Christopher Wylie is a Canadian data analyst and a graduate of the London School of Economics. Since his student years, Wylie dreamed of creating a computer model that could predict the preferences and actions of individuals in a society using their personality profiles, and he wanted to work in a field where he could use this model in politics and political campaigns.
Building a virtual society that models the behavior of all individuals in a society is a long-standing dream of economists. Such a model, called an “in silico” copy of society, could enable various predictions and analyses of how a given community would behave. Data analysts, mathematicians, behavioral scientists and researchers in various fields have long been working on such a model.
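To make the idea concrete, here is a deliberately simplified sketch of what an “in silico” society can look like in code: a toy opinion-dynamics simulation in which each agent repeatedly nudges its opinion toward the average of a few randomly chosen peers. Everything here (agent counts, the update rule, the parameters) is invented for illustration and has no connection to any method actually used by SCL, CA or Team Jorge.

```python
import random

def simulate(n_agents=100, steps=50, peers=5, rate=0.3, seed=42):
    """Toy 'in silico' society: opinions in [0, 1] drift toward
    the average opinion of a small random sample of peers."""
    rng = random.Random(seed)
    opinions = [rng.random() for _ in range(n_agents)]
    for _ in range(steps):
        updated = []
        for op in opinions:
            sample = rng.sample(range(n_agents), peers)
            peer_avg = sum(opinions[j] for j in sample) / peers
            # move a fraction of the way toward the sampled peer average
            updated.append(op + rate * (peer_avg - op))
        opinions = updated
    return opinions

final = simulate()
print(f"opinion spread after simulation: {max(final) - min(final):.3f}")
```

Even this crude model reproduces a familiar qualitative result: repeated peer averaging drives a population toward consensus. The research programs described above aim at vastly richer versions of the same exercise, with agents modeled on real individuals.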
Wylie’s solution to this problem was “psychographics”, a method that models individuals in society using the Five Factor model, a psychological framework for describing personality. For this, Wylie needed test data on the personality traits of individuals, which was provided by an app built by Cambridge University researchers. The Facebook app paid users to take various personality tests and supplied the resulting data, for a fee, for use in “scientific research”. But the existing data was insufficient.
Do psychographic models work?
In 2013, Wylie was hired by Cambridge Analytica and commissioned a data analyst from Cambridge University to create a special app that conducted personality tests on Facebook. While collecting test data, however, the app asked users for permission to access all of their profile data and likes, as well as those of their friends. Through this app, CA obtained the profile data and likes of more than 87 million Facebook users. The claim was that a person’s personality could be predicted very accurately from their Facebook likes.
Unlike traditional microtargeting methods, which use publicly available data such as people’s previous votes and shopping history, CA’s method creates a “psychographic” profile specific to each person.
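In spirit, the claimed pipeline maps page likes to scores on the five personality traits. The sketch below shows that mapping in its most naive form: a hand-made table of per-page trait weights, averaged over a user’s likes. The page names and weights are entirely invented for illustration; the real models were statistical fits over millions of like/personality-test pairs, not lookup tables.

```python
# Big Five ("OCEAN") trait names
TRAITS = ["openness", "conscientiousness", "extraversion",
          "agreeableness", "neuroticism"]

# Hypothetical per-page trait weights, invented for this example.
LIKE_WEIGHTS = {
    "ModernArtDaily":   {"openness": 0.8, "extraversion": 0.1},
    "ExtremeCouponing": {"conscientiousness": 0.6, "neuroticism": 0.2},
    "PartyPlanners":    {"extraversion": 0.9, "agreeableness": 0.3},
}

def profile(likes):
    """Average the trait weights of known liked pages into a crude profile."""
    scores = {t: 0.0 for t in TRAITS}
    counted = [LIKE_WEIGHTS[p] for p in likes if p in LIKE_WEIGHTS]
    for weights in counted:
        for trait, w in weights.items():
            scores[trait] += w
    n = max(len(counted), 1)
    return {t: s / n for t, s in scores.items()}

p = profile(["ModernArtDaily", "PartyPlanners"])
print(max(p, key=p.get))  # the user's dominant trait under this toy model
```

The sketch also hints at why sceptics are unimpressed: the output is only as good as the weights, and likes are a noisy, sparse signal for personality.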
The effectiveness of the method is a matter of debate. The models caused a sensation when they first emerged, but experts who have since analyzed them say that personality profiles built from Facebook likes are at most as accurate as astrological signs.
Facebook and the Cambridge Analytica Scandal
In 2017, Christopher Wylie, who had left CA years prior, became a whistleblower through independent journalist Carole Cadwalladr, describing how Cambridge Analytica obtained this data from Facebook and used it to influence various elections.
The scandal raised serious questions about how social media companies use user data, while also providing fodder for rumors of a Russian connection to President Trump.
Wylie became an icon for the American and European left, exposing the company’s and Facebook’s questionable ethical practices, providing a tangible perpetrator for Trump’s election and Brexit, and even suggesting Russian agents were working in CA. Wylie wrote a book about his time at Cambridge Analytica and testified before Congress in the investigation into Trump’s Russia ties.
However, the British Information Commissioner’s Office, which examined the Facebook data in question and Wylie’s psychographic personality analysis method as part of its investigation into allegations that the company influenced the Brexit referendum, published a report concluding that the method was not very effective and that the allegations of a Russian connection were unsupported.
Christopher Wylie: The story of a whistleblower
Research by Anthony Mansuy of Société magazine shows that Wylie has been interested in analyzing and modeling societies through psychographics since his university years and had long wanted to create a company that would market these methods to political campaigns.
When Wylie left Cambridge Analytica in 2013, he took with him the analyses of 87 million people’s Facebook data. Using this data, which he later revealed had been obtained unethically, Wylie set up a company called Eunoia to do exactly what CA did, and contacted campaign officials for both Brexit and the 2016 Trump campaign. When Trump campaign officials informed CA CEO Alexander Nix that “another company from Cambridge that uses the same data” had contacted them, Nix put two and two together and sued Eunoia for taking the data and poaching his clients. Following the lawsuit, Eunoia ceased operations in 2017.
That same year, journalist Carole Cadwalladr’s investigation into Trump’s election campaign led her to Cambridge Analytica, the Facebook data and the psychographic methods developed by Wylie.
Forced to shut down Eunoia, and realizing journalists were onto him and his work in CA, Wylie blew the whistle.
The data analysis market in politics
In his book, Wylie describes himself as “the gay Canadian vegan who somehow ended up creating Steve Bannon’s psychological warfare mindfuck tool” and gives the impression that he was an unwilling participant in the acquisition and use of the Facebook data. But he does not mention the company he founded, Eunoia, or his efforts to work for the Trump campaign. In a 2020 interview, Mansuy asked Wylie why he left Eunoia and his Trump campaign overtures out of the book. Wylie was stunned by the question.
His reply: “So what?”
Since 2018, Wylie has been employed as a Research Director at H&M developing, in his own words, “ethical AI systems to help the company become more sustainable and more profitable and better to serve its customers’ needs without exploiting them in the process.” Myths about the efficacy of the psychographic method undoubtedly played a role in his employment.
There is a huge market for companies that provide data analysis and strategic communication methods for political campaigns. Companies that offer new and effective data analytics and social media campaigns to clearly identify target audiences are paid incredible amounts of campaign money. Every rumor or news report, positive or negative, on the efficacy of a certain method or company is an advertisement, increasing the market value of the “product”.
Team Jorge software package: Telemarketer vibes
Emails obtained by journalists at The Guardian show that Team Jorge tried to market their software (and access to fake social media accounts) to Cambridge Analytica in 2015 and 2017, but the company was not interested. There can be only one reason why CA, whose own CEO admitted they had no qualms about using unethical methods in their work, might not be interested in Team Jorge’s product: it doesn’t work.
Hanan, caught on hidden camera demonstrating hacks into various people’s Telegram and Gmail accounts, responds to the reporters’ question “Where do you get these pictures?” with “I don’t want to tell you this because then I’d have to kill you.” He behaves more like a telemarketer peddling an all-in-one cleaning product than an experienced covert operative.
The Guardian report says “Given their expertise in subterfuge, it is perhaps surprising that Hanan and his colleagues allowed themselves to be exposed by undercover reporters.”
Perhaps not.
The Guardian reports that “This week Meta, the owner of Facebook, took down Aims-linked bots on its platform after reporters shared a sample of the fake accounts with the company. On Tuesday, a Meta spokesperson connected the Aims bots to others that were linked in 2019 to another, now-defunct Israeli firm which it banned from the platform.”
Bots, fake accounts and disinformation
Disinformation, conspiracy theories and the spread of false news on social media is perhaps the biggest problem of our time. Russia, in particular, is known to use fake accounts on social media to bolster certain political movements in order to destabilize Western countries. But no matter how many bots or fake accounts you have, the process is far from accurate and almost impossible to control.
Russian bots spread various conspiracies and fake news stories through trial and error, and it is impossible to predict which of them will catch on and spread.
It is not surprising that there are those who try to make money by claiming to control and even direct these chaotic processes. While their polluting of the information landscape is indisputable, there are serious doubts as to how effective they are, or more precisely, to what extent they can shape the effect they produce.
“Team Jorge” is certainly not unique, but after this much publicity, we can expect to see a hike in the price of their software package and various “services”.