Facebook Scandal Raises Data Privacy Concerns

Hilary Tuttle | May 1, 2018


In 2014, approximately 270,000 people used a Facebook app to take a personality test for academic research purposes. Because of Facebook’s terms of service and its application programming interface (API) at the time, however, the app’s developer, Aleksandr Kogan, was also allowed to collect information about those users’ Facebook friends (a capability Facebook shut down in 2015). Ultimately, Kogan obtained data from up to 87 million users and handed it to political consulting firm Cambridge Analytica, which used it to build profiles of individual voters and their political preferences in order to target advertising and sway voter sentiment. According to Cambridge Analytica whistleblower Christopher Wylie, the firm used this data to help Donald Trump’s campaign predict and influence voters in the 2016 presidential election.

While many have called this incident a data breach, that label is technically inaccurate and misleading, and risks obscuring the critical lessons for both individuals and businesses. This was neither a breach nor a bug: APIs are officially sanctioned ways to access data from a company’s databases, so while the episode may have breached user trust, it breached neither Facebook’s security measures nor its user database. Facebook sanctioned access to this breadth of data when it asked users to grant permissions to apps, and its terms of service asked users to acknowledge that their friends could authorize access to their data. The company knew a third party was collecting user data, but it did not know that the data would be passed to an additional party and used the way it was. The quiz-takers had to consent to the app’s access, but their friends had no way of knowing that their data was being harvested as well.
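To make those mechanics concrete, the sketch below shows how a third-party app might pull data through a platform’s sanctioned API once a user grants it permission. The endpoint mirrors Facebook’s public Graph API, but the token, field names and pagination handling are simplified placeholders, not the actual 2014-era permission flow.

```python
# Minimal sketch: a third-party app reading data through a sanctioned
# API after the user grants permission. No security control is being
# defeated; the API returns whatever the permission model allows.
import requests

ACCESS_TOKEN = "user-granted-oauth-token"  # hypothetical placeholder


def fetch_friends(token: str) -> list[dict]:
    """Return the app user's friend list, following pagination cursors."""
    url = "https://graph.facebook.com/me/friends"
    params = {"access_token": token, "fields": "id,name"}
    friends = []
    while url:
        resp = requests.get(url, params=params, timeout=10)
        resp.raise_for_status()
        payload = resp.json()
        friends.extend(payload.get("data", []))
        # The "next" URL, when present, already carries the query string.
        url = payload.get("paging", {}).get("next")
        params = {}
    return friends


if __name__ == "__main__":
    for friend in fetch_friends(ACCESS_TOKEN):
        print(friend["id"], friend["name"])
```

Nothing in such a request circumvents a security control, which is why the incident is better described as a failure of data governance than as a breach.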

The scandal broke the evening of Friday, March 16. The following Monday, Facebook shares fell 7%, wiping almost $40 billion off the company’s value, and the stock suffered further losses as the scandal continued to unfold. By the end of March, Facebook had lost more than $60 billion in market capitalization.

Class action lawsuits have been filed on behalf of both Facebook users whose data Cambridge Analytica obtained and investors looking to recoup losses from the stock hit. Regulators in the United States and Europe were quick to demand answers about the company’s handling of personal user data, eventually prompting CEO Mark Zuckerberg to testify before Congress. The U.S. Federal Trade Commission confirmed an official probe of Facebook’s privacy practices as a result of the scandal, and international authorities have opened their own inquiries, including investigations of Cambridge Analytica.

The FTC previously investigated the social network for deceptive privacy claims in 2011. As part of a settlement, Facebook ultimately promised to give users “clear and prominent notice” and get their consent before “enacting changes that override their privacy preferences.” While the company avoided fines in the wake of that investigation, it may face financial penalties from the Cambridge Analytica incident if the FTC finds it violated the consent decree. The maximum exposure could theoretically run into the trillions (at a rate of $40,000 per violation), but any penalty is more likely to be in line with last year’s record-setting $280 million judgment against Dish Network.
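The “trillions” figure follows directly from the per-violation rate: treating each of the up to 87 million affected users as a separate violation gives, roughly,

$$87{,}000{,}000 \times \$40{,}000 \approx \$3.5\ \text{trillion}.$$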

The regulatory risk implications also extend to renewed and amplified calls for the government to regulate social media platforms, a step Zuckerberg conceded for the first time may be necessary.

“The FTC takes the allegations that the data of millions of people were used without proper authorization very seriously,” FTC Commissioner Terrell McSweeny said in a statement. “The allegations also highlight the limited rights Americans have to their data. Consumers need stronger protections for the digital age such as comprehensive data security and privacy laws, transparency and accountability for data brokers, and rights to and control over their data.”

Facebook has begun what it pledges will be an intensive audit of app developers to look for any that have pulled large amounts of data or otherwise look suspicious, and it has begun to significantly restrict the use of its many APIs. The company also updated its terms of service for the first time since 2015, explaining how some of these changes further limit data access.

Navigating the Regulatory Risk Landscape


Facebook announced a number of data privacy and transparency changes in response to the Cambridge Analytica revelations and, in turn, the demands of outraged users and lawmakers worldwide. Some of these changes, however, are also measures the social network already needed to roll out soon to comply with the EU’s General Data Protection Regulation (GDPR).

While the circumstances of the Facebook and Cambridge Analytica case are certainly quite specific, the scandal offers critical lessons for other companies about data governance today. Had the incident occurred after the May 25 implementation of GDPR, many experts believe Facebook could have faced the maximum penalty under the fining schedule—4% of its global revenue (about $1.6 billion, based on the $40.65 billion in revenue reported for 2017). Identifying the misused data that pertained specifically to European users would be a challenge, but the regulation could have applied, as it is extremely likely that at least some of the users in question are European citizens or residents, not to mention that Cambridge Analytica is a UK-based company that currently falls under EU regulation.

The case poses a clear example of exactly the issues GDPR is meant to address: The personal information of millions of individuals was collected and used in ways they were not made aware of, did not consent to, and had no control over. GDPR requires companies to disclose and explain to users, in clear language, every use of their data. Users must then actively consent to each use, and they must be able to revoke that consent at any time, which in turn requires companies to have a procedure to identify and delete a user’s personal data “without delay” and to provide documentation to that effect. Further, entities that collect data are not only responsible for ensuring their own compliance with the 99 articles that make up GDPR, but must also ensure compliance by any third party with which they share data.
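As a rough sketch of what those obligations imply for a data controller’s systems (illustrative only; the class and method names below are hypothetical, and real compliance involves far more than code):

```python
# Illustrative sketch of per-purpose consent tracking with revocation
# and erasure, loosely mirroring GDPR's consent and "right to erasure"
# requirements. All names and structures here are hypothetical.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class ConsentLedger:
    # user_id -> {purpose -> currently granted?}
    consents: dict[str, dict[str, bool]] = field(default_factory=dict)
    # Append-only audit trail, supporting the documentation requirement.
    audit_log: list[tuple[datetime, str, str, str]] = field(default_factory=list)

    def _record(self, user_id: str, purpose: str, action: str) -> None:
        self.audit_log.append((datetime.now(timezone.utc), user_id, purpose, action))

    def grant(self, user_id: str, purpose: str) -> None:
        """User actively opts in to one specific, disclosed use."""
        self.consents.setdefault(user_id, {})[purpose] = True
        self._record(user_id, purpose, "granted")

    def revoke(self, user_id: str, purpose: str) -> None:
        """Consent must be revocable at any time."""
        self.consents.setdefault(user_id, {})[purpose] = False
        self._record(user_id, purpose, "revoked")

    def is_permitted(self, user_id: str, purpose: str) -> bool:
        """Every use of the data checks for an explicit, unrevoked grant."""
        return self.consents.get(user_id, {}).get(purpose, False)

    def erase(self, user_id: str) -> None:
        """Delete the user's personal data 'without delay', keeping only
        the audit record showing that the erasure took place."""
        self.consents.pop(user_id, None)
        self._record(user_id, "*", "erased")
```

The design point is that consent is recorded per purpose rather than inferred from a blanket agreement, and every grant, revocation and erasure leaves an auditable trace.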

For a company like Facebook, there are several forms of risk at play. Beyond the reputation crisis and the #DeleteFacebook movement, regulators worldwide are implementing new, often demanding measures with regard to data collection and privacy, and they are cracking down on violators with investigations and fines. In addition, some users are (hopefully) developing a better understanding of the use and misuse of data they entrust to companies, and some are demanding that companies do better.

Noncompliance clearly has a significant price tag, but compliance will also cost Facebook substantially: Beyond spending on the work required to achieve compliance, Facebook’s business model is centered on collecting and monetizing user data. Allowing users to opt out—a key provision of GDPR—will undoubtedly cut into the company’s advertising revenue, 24% of which comes from the EU. According to a January estimate from Deutsche Bank, if 30% of EU users opt out of targeted advertising, the efficacy and price of ads could drop 50%, resulting in an overall revenue loss of 4%.
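One way to see how those figures hang together (a back-of-the-envelope reconstruction, not Deutsche Bank’s published model): if the EU supplies 24% of revenue, 30% of those users opt out, and ads served to them lose half their value, the overall hit is

$$0.24 \times 0.30 \times 0.50 = 3.6\% \approx 4\%.$$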

The collection and use of user data is a cornerstone of the business model not only for Facebook, but for many other websites and apps as well. Ad-supported apps typically use third-party advertising libraries that often access far more data than is truly necessary, including deeper device functions like the calendar or photo library. In the process, they expose users to additional third-party risks, as these providers are usually not regulated or even monitored by the app developer. According to a study by app security firm Appthority, for example, out of more than two million iOS apps scanned, 24,000 explicitly asked users for access to deeper device functions specifically for advertising purposes. That is to say nothing of the potentially excessive data accessed, stored or processed by the app itself, or of the apps that are not so blatant about what they intend to do with user data.
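To make the permission issue concrete, here is a minimal sketch of the kind of check a scanner might perform against an iOS app’s Info.plist. The usage-description keys are Apple’s real Info.plist keys; the scanning logic is a simplified illustration, not Appthority’s methodology.

```python
# Minimal sketch: flag an iOS app whose Info.plist requests access to
# "deeper" device functions. The usage-description keys are Apple's
# real Info.plist keys; the rest is a simplified illustration.
import plistlib
import sys

SENSITIVE_KEYS = {
    "NSCalendarsUsageDescription": "calendar",
    "NSPhotoLibraryUsageDescription": "photo library",
    "NSContactsUsageDescription": "contacts",
    "NSMicrophoneUsageDescription": "microphone",
    "NSLocationWhenInUseUsageDescription": "location",
}


def flag_sensitive_permissions(plist_path: str) -> list[str]:
    """List deep-device permissions an app requests, with its stated reason."""
    with open(plist_path, "rb") as f:
        info = plistlib.load(f)
    return [
        f"{name}: {info[key]!r}"
        for key, name in SENSITIVE_KEYS.items()
        if key in info
    ]


if __name__ == "__main__":
    for finding in flag_sensitive_permissions(sys.argv[1]):
        print(finding)
```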

As country- and region-specific data governance requirements continue to develop, any company that collects or stores users’ personal information will have to navigate the increasing complexity of the regulatory risk landscape. This case should be a reminder for companies of the urgent need to understand and plan for the responsibilities and costs that come with being custodians of data, whether that information rests purely with them or goes on to third parties for storage, analysis or other perfectly legal use. A recent study by Globalscape and the Ponemon Institute found that compliance with data protection regulations cost multinational companies an average of $5.47 million in 2017, a 43% increase from 2011. Notably, that was a year when many companies were still working toward GDPR compliance and the regulation’s fines were not yet in force. These measures appear well worth the investment, though, as noncompliance with data protection regulations cost companies an average of $14.82 million.

“Like most cyber services where technical capabilities and convenience first drive adoption, security and consumer rights follow,” said Doug Howard, vice president of global services at security giant RSA. “Over the next five years, security and privacy rights will be built as foundational elements of a business, and existing cyber businesses—like the brick and mortar businesses of yesteryear—will either adapt or perish. The rapid rise of social media and building of mega-businesses based on use of consumer data as the primary currency will undergo massive change. How consumers adapt and provide consent, while businesses adapt to varying preferences of consumers and regulatory requirements, will create risk and threats to existing global businesses while simultaneously creating new opportunities for innovative and privacy-savvy entrepreneurs.”

According to Marc French, senior vice president and chief trust officer at data security firm Mimecast, the incident also highlights the importance of modeling privacy misuse cases as part of a privacy-by-design program, as required by GDPR. “The concept of misuse cases has existed in the security space for many years, but it is a concept we see many organizations miss in their privacy-by-design implementation,” French said. “A privacy misuse case is a scenario that involves using a perfectly customer-aligned/secure feature in an unintended way that affects an individual’s privacy. Facebook produced something that their customers wanted and, personally knowing some of the staff on Facebook’s security team, I am confident they ran the feature through the proverbial ‘security testing gauntlet.’ It is those unintended privacy uses that can trip people up—even the largest of us.”
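For teams implementing privacy by design, a misuse case can be captured as simply as a structured record reviewed alongside each feature. The sketch below is hypothetical (the field names come from no particular standard) and paraphrases the scenario French describes.

```python
# Hypothetical structure for recording a privacy misuse case during a
# privacy-by-design review. Field names are illustrative, not drawn
# from any standard; the example paraphrases the scenario above.
from dataclasses import dataclass


@dataclass(frozen=True)
class PrivacyMisuseCase:
    feature: str           # the legitimate, customer-aligned feature
    intended_use: str      # what the feature was built to do
    misuse_scenario: str   # the unintended use that harms privacy
    affected_parties: str  # who is exposed, including non-users
    mitigation: str        # how the design could limit the misuse


friend_data_case = PrivacyMisuseCase(
    feature="Friend-data API permissions",
    intended_use="Let apps personalize experiences using a user's network",
    misuse_scenario="An app harvests friends' profiles at scale and passes "
                    "them to an undisclosed third party",
    affected_parties="Friends of app users, who never consented",
    mitigation="Restrict friend-data scopes and audit high-volume API consumers",
)
```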

The crisis also offers a reminder of the risks posed by third parties. Clearly, Facebook’s policies with regard to third-party apps at the time were insufficient to mitigate data privacy risks, but the company largely functioned as it was designed to, collecting and monetizing data that users of the initial app provided. It is experiencing this crisis now because it lost control of that data and, whether by oversight or intent, did not monitor what the third party did with the information.

Article 17 of GDPR outlines the “right to erasure,” which can be a complicated process for any entity to facilitate, but poses particular challenges in situations involving third-party relationships. “As organizations increasingly rely on third parties, they can no longer claim to have a vice grip on the fate of digital assets entrusted to them. Consequently, they lose their ability to guarantee that any digital traces of your personal information will be eradicated should you withdraw consent,” explained Zulfikar Ramzan, chief technology officer at RSA. “This issue is especially problematic for organizations like Facebook that try to foster an open ecosystem of partners while maintaining a business model that is largely predicated on maximizing the value extracted from the data of its users.”
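The structural problem Ramzan describes can be sketched in a few lines: a controller can delete its own copy of a user’s data and notify downstream recipients, but it can only log their acknowledgments, not verify that every remote copy is gone. Everything below, including the names, is a hypothetical illustration.

```python
# Sketch of erasure propagation to third parties under Article 17.
# The notification mechanism here is a direct method call; in reality
# it is a contractual obligation the controller cannot enforce in code.
from datetime import datetime, timezone


class ThirdParty:
    """Stand-in for a downstream recipient of shared personal data."""

    def __init__(self, name: str):
        self.name = name
        self.records: set[str] = set()

    def receive(self, user_id: str) -> None:
        self.records.add(user_id)

    def handle_erasure_request(self, user_id: str) -> bool:
        self.records.discard(user_id)
        return True  # an acknowledgment, not proof every copy is gone


class Controller:
    def __init__(self, recipients: list[ThirdParty]):
        self.store: dict[str, dict] = {}
        self.recipients = recipients
        self.erasure_log: list[tuple[datetime, str, str, bool]] = []

    def erase(self, user_id: str) -> None:
        """Delete locally, then ask every downstream recipient to do the same."""
        self.store.pop(user_id, None)
        for party in self.recipients:
            acked = party.handle_erasure_request(user_id)
            self.erasure_log.append(
                (datetime.now(timezone.utc), user_id, party.name, acked)
            )
```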

Hilary Tuttle is managing editor of Risk Management.