How Facebook Emphasised the Need for Data Privacy

Cambridge Analytica - Lessons from the Facebook scandal

Just one year after news broke of the shocking relationship between Cambridge Analytica and Facebook, the scandal has lost none of its relevance for how organisations worldwide handle data on consumers and employees.

No stranger to public discontent, Facebook Inc. faced one of its biggest crises yet. The personal data of up to 87 million users, mostly in the U.S., was obtained by an analytics firm that, among its other work, allegedly helped elect President Donald Trump. In response to that revelation, lawmakers and regulators in the U.S. and U.K. increased their scrutiny of the social media giant, and one in ten American Facebook users have since deleted their accounts. The uproar has only added to the pressure on Facebook and Chief Executive Mark Zuckerberg over how the company was used during the 2016 presidential campaign to spread Russian propaganda and false headlines.

  1. Who are Cambridge Analytica?

Cambridge Analytica is a company that offers services to businesses and political parties that want to ‘change audience behaviour’. It claims to be able to analyse huge amounts of consumer data and combine it with behavioural science to identify people whom organisations can target with marketing material. It collects data from a wide range of sources, including social media platforms such as Facebook and its own polling. Headquartered in London, the firm was set up in 2013 as an offshoot of another company, SCL Group, which offers similar services around the world.

  2. Who took what from Facebook?

During the summer of 2014, the UK affiliate of US political consulting firm Cambridge Analytica hired a Soviet-born American researcher, Aleksandr Kogan, to gather basic profile information of Facebook users along with what they chose to ‘like’. About 300,000 Facebook users, most or all of whom were paid a small amount, downloaded Kogan’s app, called ‘This Is Your Digital Life’, which presented them with a series of surveys. Kogan collected data not just on those users but on their Facebook friends, if their privacy settings allowed it, a universe of people initially estimated at 50 million, later raised to 87 million. The app, in its terms of service, disclosed that it would collect data on users and their friends.

  3. Did Kogan have Facebook’s permission?

In a general sense, yes. Since 2007, Facebook has allowed outside developers to build and offer their own applications within its space. At the time Kogan offered his app, Facebook also allowed developers to collect information on the friends of those who chose to use their apps, provided the friends’ privacy settings allowed it. “We clearly stated that the users were granting us the right to use the data in broad scope, including selling and licensing the data,” Kogan wrote in an email obtained by Bloomberg.

  4. Then what’s the issue here?

Facebook says Kogan ‘lied to us’ by saying he was gathering the data for research purposes, and that he violated the company’s policies by passing the data to Cambridge Analytica. Kogan says his app’s terms and conditions specifically allowed ‘commercial use’. Facebook says that after it learned of the situation in 2015, it removed Kogan’s app and demanded that he, and all parties he had given the data to, destroy it.

  5. Has the data been destroyed?

The New York Times, which broke the story along with The Observer of London, reported on March 18th 2018 that emails and documents suggest the firm ‘still possesses most or all of the trove’. Cambridge Analytica has maintained that it deleted all the data Kogan provided and, at Facebook’s request, “carried out an internal audit to make sure that all data, all derivatives, and all backups had been deleted”. Facebook’s chief technology officer, Michael Schroepfer, said in an interview on April 5th 2018 that, pending the results of investigations, ‘we don’t know exactly what they have’.

  6. Why did Cambridge Analytica want the data?

It uses such data to target voters with hyper-specific appeals, including on Facebook and other online services, that go well beyond traditional messaging based on party affiliation alone. This is known as ‘psychographic’ targeting or modeling. Cambridge Analytica covered Kogan’s costs in creating his app, more than $800,000, and allowed him to keep a copy of the data for his own research, the Times reported, citing company emails and financial records.

  7. Did the Facebook data help Trump win the presidency?

Cambridge Analytica flatly denied using Facebook data from Kogan’s firm in the 2016 election or employing psychographic modeling techniques on behalf of Trump’s campaign. However, it’s not clear whether the firm used the Facebook data in other ways to better understand and target voters. Whether Cambridge Analytica’s models really work is itself a point of contention; even some of the firm’s clients have said they saw little value in them.

  8. Did this violate any rules?

That remains to be seen. The U.K. has data-protection laws that ban the sale or use of personal data without consent. Consent means giving people genuine choice and control over how you use their data; if an individual has no real choice, consent is not freely given and is invalid. And in 2011, Facebook settled privacy complaints by the U.S. Federal Trade Commission by agreeing to get clear consent from users before sharing their material. The FTC is now investigating whether Facebook violated the terms of that 2011 consent decree. The company would face millions of dollars in fines if it were found to have violated the pact. Lawmakers in the U.S. and U.K. are conducting their own inquiries.

  9. What’s been the fallout?

Facebook shares dropped almost 18% in the 10 days after the news broke on March 17th 2018. An online ‘#DeleteFacebook’ movement drew some high-profile support, though Zuckerberg says there has been no ‘meaningful impact’ on Facebook’s business. Facebook said it removed a feature that let users enter phone numbers or email addresses into Facebook’s search tool to find other people. The company will also make it easier for users to adjust their privacy settings.

  10. Has public trust declined?

The incident also raised awareness about the issue of personal data and privacy in general. As a result, consumer trust in tech companies like Facebook has declined sharply. This awareness has led to renewed calls from both users and regulators for greater control over how companies can use personal data, and for more transparency about how it is collected. However, despite this greater awareness, consumers’ attitudes and behaviour haven’t fundamentally changed.

Fast forward

The Cambridge Analytica scandal was a shock to the tech industry and the whole consumer market. It forced many companies to re-evaluate their data and privacy policies in reaction to increased regulations and demands from privacy-concerned consumers.

Facebook ended up taking most of the flak from the incident, perhaps unfairly; after all, it was Cambridge Analytica that harvested the data. However, it was the social media platform’s responsibility to protect its users’ information, and it has since come under immense political and economic pressure to change its ways. How earnest and effective this pivot to privacy will be remains to be seen. Facebook has built a business empire on utilising personal data, so significantly changing its model overnight will be a challenge.

If Facebook doesn’t change its ways and is hit by another data scandal of the same magnitude as last year’s, it may prove to be the final straw for its users.

Facebook and encryption?

Even with encryption in place, it would still technically be possible for Facebook to derive keywords from people’s messages and use them to extend Facebook’s ad-targeting model to the app. WhatsApp is an application like any other, made up of code that developers can manipulate as they see fit. Right now, WhatsApp’s code is programmed to take a message and send it encrypted, but there is no technical reason why the code couldn’t first identify key words in sentences and send them to Facebook’s servers to be processed for advertising, whilst separately sending the encrypted message.
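
To make that point concrete, here is a minimal, purely hypothetical sketch of a message-sending routine; it is not WhatsApp’s actual code. The end-to-end encryption step uses the PyNaCl library, and the `extract_keywords` helper and the returned ‘leaked’ list are invented for illustration, to show that the sending client handles the plaintext before it is encrypted and could analyse it.

```python
# Hypothetical illustration only, not WhatsApp's real code.
# The sending client sees the plaintext before encryption, so keyword
# extraction could happen alongside an end-to-end encrypted send.
from nacl.public import PrivateKey, PublicKey, Box  # pip install pynacl


def extract_keywords(plaintext: str) -> list:
    # Naive stand-in for any client-side analysis of the message text.
    return [word for word in plaintext.lower().split() if len(word) > 4]


def send_message(plaintext: str, sender: PrivateKey, recipient_public: PublicKey):
    # 1. End-to-end encrypt the message so only the recipient can read it.
    ciphertext = Box(sender, recipient_public).encrypt(plaintext.encode())
    # 2. Nothing technically stops the same code from also collecting
    #    keywords from the plaintext before it is encrypted.
    leaked = extract_keywords(plaintext)
    return ciphertext, leaked


alice, bob = PrivateKey.generate(), PrivateKey.generate()
ciphertext, leaked = send_message("meet me at the harbour tonight", alice, bob.public_key)
print(leaked)      # ['harbour', 'tonight'] -- visible to the app vendor
print(ciphertext)  # only Bob's device can decrypt this
```

The point is not that WhatsApp does this today, only that end-to-end encryption protects a message in transit; it does not constrain what the app itself does with the plaintext before encrypting.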

How to have privacy in the age of the cloud:

- Firstly, recognise that using cloud services can easily leak your data, even if legally it shouldn’t. A technological malfunction, such as a bug in the cloud provider’s systems, can cause a data breach, or a hacker can compromise your cloud account. Be extra careful about what information you put out there and with whom you share it.

- Take control of your own data by using a secure communications app, such as Salt Communications, which allows you to have end-to-end encrypted conversations between mobile devices in confidence, anywhere, any time; the sketch after this list illustrates the general principle.
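
For readers wondering what ‘end-to-end encrypted conversations’ means in practice, the short sketch below shows the general principle using the same PyNaCl library as the earlier example. It is a generic illustration, not Salt Communications’ actual implementation.

```python
# Generic end-to-end encryption sketch (not Salt Communications' implementation).
# Private keys live only on the two devices; a relaying server sees ciphertext.
from nacl.public import PrivateKey, Box

alice = PrivateKey.generate()   # generated and stored on Alice's device
bob = PrivateKey.generate()     # generated and stored on Bob's device

# Alice encrypts with her private key and Bob's public key.
ciphertext = Box(alice, bob.public_key).encrypt(b"quarterly figures attached")

# Only Bob, holding his private key, can decrypt what the server relayed.
message = Box(bob, alice.public_key).decrypt(ciphertext)
print(message.decode())  # quarterly figures attached
```

Because decryption requires the private key held on the recipient’s device, neither a cloud provider nor an attacker who intercepts the ciphertext can read the message.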

With more awareness of how our data can be used, individuals can protect themselves from these situations.

If you would like to protect your organisation, consider Salt Communications. We understand that the security of mobile communications in today’s global business environment is paramount, which is why Salt Communications is built with the features and technology to keep your communications private and compliant.

If you have any questions about this article, please contact us at marketing@saltcommunications.com and we’d be happy to assist you in any way.

About Salt Communications

Salt Communications, ranked in the top half of the Cybersecurity 500, provides a fully enterprise-managed software solution that enables absolute privacy in mobile communications. It is easy to deploy and uses multi-layered encryption techniques to meet the highest security standards. The Salt Communications Desktop and Mobile apps are intuitive and easy to install and use. Salt Communications’ Communication Manager provides a console for tight management of users and can be configured for the management of regulatory compliance. Salt Communications is headquartered in Belfast, Ireland. For more information, visit saltcommunications.com.
