More than a year after Cambridge Analytica’s major data breach came to light, information warfare and psychographic targeting remain prominent tools that prey on users’ private data in order to manipulate them, with consequences yet to be fully seen.
According to the New York Times, Cambridge Analytica, a once-powerful data analytics firm that worked with both the Trump and Brexit campaigns in 2016, was able to illegally obtain and use 55 million Facebook users’ information.
The data were used to build models that categorized personalities and predicted political behavior, and the findings were then sold to political candidates and campaigns willing to pay the price.
According to whistleblower and former employee Christopher Wylie, speaking to The Guardian, these models were created through the following steps:
First, the firm paid 32,000 voters in the U.S. to take a lengthy and detailed personality and political test by logging in with their Facebook accounts.
Then, all participants’ and their friends’ data on Facebook were collected, including their personal information, likes and other social media activity.
Afterwards, the test results were matched with the participants’ personal data to identify predictive patterns.
Finally, the resulting algorithmic models were paired with voter records so that individuals could be targeted with “highly personalized advertising” based on the combined data.
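To make that pipeline concrete, the sketch below shows, in very rough terms, how survey scores and harvested profile features might be joined to train such a model and then score people who never took the test. It is a minimal illustration using invented, hypothetical field names and toy data, not Cambridge Analytica’s actual code or methods.

```python
# Hypothetical sketch of the survey-plus-profile pipeline described above.
# All column names and values are invented for illustration only.
import pandas as pd
from sklearn.linear_model import LinearRegression

# Steps 1-2: survey takers' personality scores, keyed by a (hypothetical) user id,
# plus harvested profile features (e.g., page likes) for them and their friends.
survey = pd.DataFrame({
    "user_id": [1, 2, 3, 4],
    "openness": [0.8, 0.3, 0.6, 0.2],        # trait score from the paid survey
})
profiles = pd.DataFrame({
    "user_id": [1, 2, 3, 4, 5, 6],
    "likes_politics": [12, 2, 7, 1, 9, 0],    # counts of liked pages by topic
    "likes_sports":   [3, 15, 5, 20, 1, 8],
})

# Step 3: match test results with profile data and learn the pattern.
train = survey.merge(profiles, on="user_id")
features = ["likes_politics", "likes_sports"]
model = LinearRegression().fit(train[features], train["openness"])

# Step 4: score everyone else (e.g., the survey takers' friends) and pair the
# predictions with voter records to choose advertising targets.
others = profiles[~profiles["user_id"].isin(survey["user_id"])].copy()
others["predicted_openness"] = model.predict(others[features])
voter_records = pd.DataFrame({"user_id": [5, 6], "district": ["A", "B"]})
targets = others.merge(voter_records, on="user_id")
print(targets[["user_id", "district", "predicted_openness"]])
```

The key point the sketch illustrates is leverage: a relatively small group of paid, consenting test takers is used to infer traits for the far larger pool of friends whose data was collected without their knowledge.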
Facebook claimed to be uninformed about the security breach; however, the swift fallout after Wylie exposed Cambridge Analytica suggested otherwise, as did evidence of earlier contact between the two companies. Documents seen by the Observer, which conducted the year-long undercover investigation alongside The Guardian to expose the firm, revealed that Facebook was aware of the data harvesting much earlier than the company admitted, yet failed to inform users about the issue at all.
Paul-Olivier Dehaye, a data protection specialist who investigated Facebook at the time, told The Guardian that the tech company had intentionally misled “politicians and congressional investigators, and it has failed in its duties to respect the law.” As new problems arise in the technology world, laws must adapt to this quick pace of change. Such laws are admittedly difficult to craft, and with an out-of-touch, old-fashioned Congress dealing with a very modern problem, the task becomes even more arduous.
Although it is difficult to know whether such a law would effect much change if implemented, the matter, left unregulated, will only build on itself, with deception increasing and transparency decreasing.
However, privacy protection also depends on the population that uses these social media outlets.
It is likely that users will not change their habits, either because they are not made aware of the full scope of the consequences or because they choose convenience over the greater risk.
With the upcoming 2020 elections, the issue will become even more apparent to the voting population, and will hopefully create a much-needed push for change.