Navigating Student Data Privacy Laws

Grant Waterfall, Carolyn Holcomb


September 1, 2015

student data privacy

Advances in technology have revolutionized the classroom experience. Information generated from the applications and tools running on students’ computers allows teachers to personalize lesson plans and provide real-time feedback tailored to individual academic needs. This boom in digital solutions for the classroom is not only revolutionary but arguably critical to student learning and development. In recent months, there has been increased legal and regulatory focus on how tech companies and educational organizations use student data. The challenge for these organizations is how they can continue to make progress in the classroom while complying with an abundance of new and changing laws, regulations and guidance.

The Regulatory Environment

Two primary federal laws govern the use of student data: the Family Educational Rights and Privacy Act (FERPA), which prohibits the unauthorized disclosure of education records, and the Children’s Online Privacy Protection Act (COPPA), which regulates the online collection of personal information from children under 13.

These laws are more than 40 and nearly 20 years old, respectively, so the challenge for businesses and organizations handling student data is figuring out how these legacy laws apply to modern classrooms.

In addition to federal laws, 36 states introduced a whopping 110 education-related bills in 2014. California distinguished itself as a leader in student data protection legislation by enacting the Student Online Personal Information Protection Act (SOPIPA), which prohibits technology companies from collecting student information for advertising and marketing purposes. To date, this is one of the more comprehensive student privacy laws enacted, and subsequent state laws have modeled their requirements on it. Even where state laws mirror one another, however, the volume of bills introduced, debated and enacted across the country has created a complex patchwork. Tech companies, vendors and educational organizations are finding it increasingly challenging to navigate the varying—and potentially conflicting—state laws.

In January 2015, President Obama visited the Federal Trade Commission (FTC) to propose legislative improvements on student privacy. During the State of the Union address the same month, he encouraged Congress to enact legislation to help protect student data. As a result, Congress proposed three different bills specifically addressing children’s educational data: the Student Privacy Protection Act (Sen. David Vitter, R-La.), the Student Digital Privacy & Parental Rights Act of 2015 (Reps. Luke Messer, R-Ind., and Jared Polis, D-Colo.), and the Protecting Student Privacy Act (Sens. Edward Markey, D-Mass., and Orrin Hatch, R-Utah). While it is unclear whether any of these bills will eventually establish a new compliance standard, student privacy is nevertheless a key focus for the White House and, in turn, something tech companies and educational organizations must keep in mind.

Protecting Student Data Beyond Legislation

While federal and state laws are in flux, President Obama publicly encouraged tech companies to sign and immediately adhere to the K-12 School Service Provider Pledge to Safeguard Student Privacy. Commonly known as the Student Privacy Pledge, this measure has quickly become the baseline standard by which tech companies can assess their compliance with generally accepted student privacy practices. While signing the pledge is currently voluntary, its commitments are enforceable by the FTC under Section 5 of the Federal Trade Commission Act, which prohibits unfair or deceptive practices. More than 150 companies have signed on to honor the pledge to date.

At a glance, the Student Privacy Pledge requires that education tech companies:

  • Not sell student information

  • Not use behavioral advertising

  • Use data for authorized education purposes only

  • Not change privacy policies without notice and choice

  • Enforce strict limits on data retention

  • Support parental access to, and correction of errors in, their children’s information

  • Develop comprehensive security standards

  • Be transparent about collection and use of data

Earlier this year, the U.S. Department of Education encouraged school districts, as part of the contract negotiation process, to check whether education tech providers had signed the pledge. A provider’s failure to sign could signal inadequate privacy and security governance. Without the pledge as a baseline, a provider might be forced to negotiate “one-off” assurances around security and privacy across multiple agreements with different customers. These assurances could be even more burdensome and more difficult to monitor, increasing the risk of non-compliance.

Education tech companies must weigh the potential risks and benefits of signing the pledge. In doing so, an organization will need to assess its practices around student data, though taking a more holistic view of all the personal information it handles may prove both beneficial and necessary.

Whether an organization is struggling to identify the laws and regulations that apply to it, or is debating whether to sign the Student Privacy Pledge, there are steps it can take to assess compliance with generally accepted privacy and security practices. By implementing these initial best practices, organizations will better understand their data-handling procedures and the resulting impact on student data. These include:

Identify regulatory compliance requirements. Creating a blueprint for regulatory compliance will help identify the specific state, federal and/or regulatory guidance that needs to be followed. This includes determining whether signing the pledge is appropriate.

Conduct a program maturity assessment. A privacy and security program maturity assessment can help an organization identify whether policies, procedures and training are in place to protect student data. A third-party assessor can determine an organization’s maturity as compared to its peers.

Create a data inventory. To understand how to protect information, it is essential to create an inventory to better understand how data is collected, used, shared and stored (including by vendors), as well as what risks are associated with the data flow.

Implement an information governance model. A strong governance model creates a necessary culture around protecting data. Defining roles for accountability and responsibility is the first step to establishing strong governance within an organization, especially if protecting student data has not been a focus before.

Implement a vendor management program. A robust vendor onboarding program that analyzes contracting processes can identify how data is being collected, used and shared based on defined purposes. For multi-year contracts, this process should include measures to periodically re-assess controls and re-validate contracts to ensure that controls at high-risk vendors remain operationally effective. Control assessments can be made by reviewing a vendor’s Service Organization Control (SOC) 2 report, or by performing an independent audit.

Implement privacy by design. Integrating privacy throughout a product’s lifecycle, including design, architecture and associated marketing campaigns, can increase the visibility of privacy in product development, as well as the likelihood that potential misuse of data is identified.

Require privacy training and promotion. Training employees on their obligations to protect and safeguard sensitive educational data highlights a commitment to compliance.

Continuously evaluate and improve. Organizations should view this effort not as a one-time project, but as an ongoing program, and implement oversight, monitoring and independent verification to identify and mitigate risks throughout the lifecycle of their student data.
Grant Waterfall is a partner on international assignment from PwC UK’s London office. He is based in New York and is PwC’s global cybersecurity, privacy & technology risk assurance leader.
Carolyn Holcomb is a partner and leader of the risk assurance data protection and privacy practice at PricewaterhouseCoopers.