
Countdown to 2021: Privacy & Data Security

Ring in the New Year with a Digital Audit to Reduce Online Privacy, Security, and Accessibility Risks

December 21, 2020

Overview

The COVID-19 outbreak has forced much of the world to shelter in place. As a result, the things we used to leave our homes to do—work, go to school, gather with friends and family, exercise, dine out, and even grocery shop—we are now doing from home, via internet-connected devices and services. On the one hand, it is remarkable to consider the sheer volume and convenience of the connected services and capabilities available to us compared to just a few years ago. On the other hand, while we shelter in place, there are more people of all ages online every day, and for longer sessions, than ever before.

The increased use of online and connected services amplifies areas of legal risk for businesses leveraging connected technologies, as well as for providers of internet content and services. Among the biggest risk factors are data privacy and security, along with internet content accessibility requirements.

Prior to the COVID-19 outbreak, the United States was in the midst of a privacy law evolution. A number of new state privacy and security laws either became effective, will begin enforcement, or are pending before legislative bodies this year. Further, all 50 states now have data breach notification laws, with many states updating existing breach laws to expand the scope of what is considered protected personal information. In addition to laws relating to the use and security of personal information, there have been updated guidelines regarding making website services more accessible to a wider range of people with disabilities. This has led to an increase in accessibility lawsuits against certain types of online businesses whose websites do not follow such guidelines or similar industry standards.

The current risk and compliance landscape related to internet content and data usage looks quite a bit different for U.S. companies than it did even just two years ago. For a business that may not have recently assessed its online assets from a privacy, security, or accessibility perspective, a digital audit can help mitigate associated risks and start fresh in the new year.

Below we discuss some online privacy, security, and accessibility risks that may be amplified by increased user activity while the world shelters in place, as well as some steps that businesses can take right now to reduce risk, lower costs, and have a more secure and efficient online presence when we return to normal life.

INFORMATION PRIVACY

Privacy—The Rise of Domestic Consumer Privacy Laws

In the summer of 2018, the California legislature passed the California Consumer Privacy Act, or “CCPA,” which provides California residents with enhanced rights of transparency and control over how businesses process their personal information, including a right to opt out of having their personal information sold. Under the CCPA, a sale is broadly defined, including nearly any disclosure to another business or a third party for monetary or other valuable consideration. The act requires businesses that process personal information to make much more detailed disclosures to consumers about the businesses’ data processing activities, as well as to implement procedures for responding to consumer requests.

The CCPA applies to for-profit businesses anywhere in the world that do business in California and meet any one of the following thresholds: (a) have gross annual revenues of $25 million or greater; (b) annually process the personal information of 50,000 or more California residents for commercial purposes; or (c) derive 50% or more of their annual revenue from the sale of personal information.
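As a rough illustration only, the "any one of the following thresholds" test above can be expressed as a simple check. This is a sketch with illustrative parameter names, not a substitute for legal analysis of whether the act applies to a particular business:

```python
def ccpa_may_apply(gross_annual_revenue_usd: float,
                   ca_residents_processed_annually: int,
                   share_of_revenue_from_pi_sales: float) -> bool:
    """Rough CCPA applicability screen for a for-profit business doing
    business in California. Meeting ANY one threshold is sufficient.
    Illustrative sketch only -- not legal advice."""
    return (
        gross_annual_revenue_usd >= 25_000_000          # threshold (a)
        or ca_residents_processed_annually >= 50_000    # threshold (b)
        or share_of_revenue_from_pi_sales >= 0.50       # threshold (c)
    )
```

Note that because the thresholds are disjunctive, even a small business can fall within the act's scope if, for example, it processes data from 50,000 or more California residents.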

The CCPA went into effect January 1, 2020, and the California Attorney General's Office began enforcing the act on July 1, 2020, despite industry calls for an enforcement delay due to COVID-19. And while the Attorney General is primarily responsible for CCPA enforcement, the act also provides a limited private right of action and the potential for an award of statutory damages in certain circumstances, such as a data breach. A number of cases alleging CCPA violations arising from unauthorized acquisitions of data are already pending in California.

A popular saying is: “As California goes, so goes the nation.” Accordingly, following the enactment of the CCPA, many other states began work on their own consumer privacy laws. Some of these are already in effect (though none are as robust as the CCPA), and others are making their way through the legislative process. There have also been a number of privacy bills proposed by members of Congress at the federal level. We anticipate that in the next few years, most organizations in the United States will have similar compliance obligations under either state or federal consumer privacy laws.

Privacy—Existing Business Practices Should Be Reviewed Against New Privacy Requirements

Complying with privacy laws is conceptually straightforward. A business should know what personal information it is collecting and processing and be able to disclose details of the same to the individuals to whom the information pertains. However, in practice this is not always so simple. Often the most cost-effective way to develop digital assets is to outsource certain functions to a vendor or other third-party provider. Services that are often outsourced include website development, site integrations (such as shopping carts, contact list management, and social media buttons), and search engine optimization and other online marketing tools.

Some of these services inherently involve processing personal data. Under modern privacy laws, a business is responsible for its downstream vendors’ handling and use of consumers’ personal information that is processed on the business’s behalf. However, this is a relatively new legal requirement in the United States, aside from a few sectors, such as healthcare, finance, or education, that have been subject to stronger privacy laws for decades. In the past, it was common for a vendor’s service agreement to state that it may use the personal information it processes on behalf of a business for its own purposes, and often without restriction. Many of these uses are predictable and even desirable, such as for threat analysis or product security, or for research and product development and improvement.

Other uses of personal information are less obvious, including selling or otherwise disclosing user data to downstream third parties for their own marketing or profiling use. Such uses have historically been the risk areas in the United States for privacy regulatory enforcement. Before the current push for state consumer privacy laws, the Federal Trade Commission (“FTC”) was the primary enforcement body for matters involving use of consumer data. In such matters, the FTC tends to look for unexpected uses or disclosures of data that may deceive, confuse, or surprise consumers.

Perhaps the most widely known example of such unexpected use is the Cambridge Analytica scandal. That incident involved a business permitting a third party to access the personal information of nearly 90 million individuals—unbeknownst to the individuals—for use in developing targeted political advertising. The fallout from the Cambridge Analytica scandal, including widely publicized revelations regarding corporate collection and use of personal information, is largely regarded as the triggering event for the current U.S. privacy law evolution.

As the privacy legal landscape takes shape in the United States (mirroring what already exists in much of the rest of the world), a business is responsible for controlling and limiting downstream use of the personal information processed on its behalf, and, upon request, must be able to provide details of such processing to the individuals whose information is being processed. In addition to transparency, modern privacy laws provide individuals with a greater ability to control how a business uses their data—including the right to opt out of having their data sold or disclosed in certain circumstances.

The rights and obligations under modern privacy laws create compliance challenges and risks for both businesses and their vendors. In the business-to-business context, there are many business-vendor relationships that pre-date the CCPA and whose contracts do not contemplate the legal requirements that the act imposes. The reality is that a business might not know all the ways its various vendors are using the personal information they process on behalf of the business, or even what data they are collecting. From a risk and compliance perspective, that is a problematic gap. Fortunately, there are steps a business can take immediately to mitigate its risks.

Privacy—Compliance Begins with Understanding Your Data Processing Practices

Data mapping is the first step that a business should take when evaluating its privacy compliance posture. This involves making an accounting of all of the personal information processing that the business conducts, as well as the reasons for processing and any vendors or third parties that are involved.

When a business understands its data processing landscape, it can then take steps to shore up its privacy program, working with its attorneys to understand its compliance obligations and legal risks. The business can then review existing services and contracts against the compliance analysis and, if necessary, engage with vendors to understand the scope and purposes of any data processing activities. Data mapping can also help a company evaluate whether it is collecting and storing more data than it ought to. Storing excess data can increase storage costs, as well as liability exposure and response costs in the event of a data breach. Since consumers and businesses are demonstrating a heightened awareness of data privacy practices, a business can differentiate itself from competitors with a robust privacy program and responsible data processing practices.
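The data-mapping exercise described above can be sketched as a minimal processing inventory. The record fields, category names, and gap check below are hypothetical illustrations of the kind of accounting involved, not a prescribed format:

```python
from dataclasses import dataclass, field

@dataclass
class ProcessingRecord:
    """One entry in a data-processing inventory (illustrative fields only)."""
    data_category: str                               # e.g., "email address"
    purpose: str                                     # why the business processes it
    vendors: list = field(default_factory=list)      # third parties with access
    retention: str = "unspecified"                   # how long the data is kept

inventory = [
    ProcessingRecord("email address", "newsletter delivery",
                     vendors=["email-service provider"], retention="until opt-out"),
    ProcessingRecord("purchase history", "order fulfillment",
                     vendors=["payments processor"]),
]

# Flag entries where a vendor has access but retention is undocumented --
# the kind of gap worth raising with counsel and with the vendor.
gaps = [r for r in inventory if r.vendors and r.retention == "unspecified"]
```

Even a simple inventory like this makes it easier to answer consumer requests, spot excess data collection, and check vendor contracts against actual processing.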

INFORMATION SECURITY

Security—Data Security Laws Are Evolving, Imposing Stricter Requirements—and Liability—on Businesses

While new data privacy laws and corporate data processing practices have garnered increased attention in recent years, cybersecurity continues to present the more immediate threat to most businesses. Businesses should consider the potential cybersecurity ramifications of having large portions of the workforce working from home during the COVID-19 pandemic, particularly since many people may elect to continue working from home even after shelter-in-place orders lift. Further, as privacy laws continue to evolve to address novel data processing practices, so do data security laws—often by requiring businesses to implement and maintain baseline security controls to protect the personal information they process. New York’s Stop Hacks and Improve Electronic Data Security, or SHIELD, Act is one such law. As of March 21, 2020, it requires businesses that process New Yorkers’ personal information to implement and maintain reasonable security measures, as defined under the act, and to conduct regular employee training. It also requires certain businesses to ensure that their service providers implement similar security safeguards.

The CCPA has also created considerable buzz about its security components. Although largely concerned with consumer privacy, it imposes a duty on businesses to implement a security program appropriate for the type of data the business handles. The data breach notification statutes in all 50 states impose a similar duty, either expressly or implicitly, and many of the state statutes permit individuals whose data was involved in a breach to pursue a private right of action to recover actual economic damages (which tend to be difficult to prove beyond a nominal amount). The CCPA also provides for a limited private right of action for violations of the security requirement, but it is unique in that it allows plaintiffs to recover statutory damages in the amount of $100 to $750 per consumer, per incident, in the event of a data breach. Considering that the average U.S. data breach involves more than 30,000 consumer records,[1] the cost of a breach under the CCPA could easily reach seven figures just in statutory damages, and that’s without factoring in costs associated with investigation, remediation, and notification to consumers.
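To see how quickly CCPA statutory damages can accumulate, consider the back-of-the-envelope arithmetic using the average breach size cited above:

```python
records = 32_434      # average U.S. breach size per the report cited in the footnote
low, high = 100, 750  # CCPA statutory damages per consumer, per incident

# Potential statutory exposure for an average-sized U.S. breach.
print(f"Statutory exposure: ${records * low:,} to ${records * high:,}")
```

That range—roughly $3.2 million to $24.3 million—is before investigation, remediation, and notification costs are even considered.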

There are factors that can mitigate—even avoid—damages under breach statutes, including the CCPA. Namely, these include implementing and maintaining reasonable security controls to protect personal data. As noted above, this is a requirement under the SHIELD Act. One of the strongest controls a business can implement to shield itself from liability in the event of a data breach is to make sure it is encrypting personal information in transit and at rest (and storing the encryption key separately from the data). If the data compromised in a cybersecurity incident is encrypted, redacted, or otherwise cannot be accessed by the bad actor, then it is not considered personal information under most breach notification statutes. In a case where no personal information is involved, the business is relieved of its notification obligations under the statute, and likely avoids significant damages as well.
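The encryption safe harbor described above can be sketched as a simple decision rule. This is a deliberate simplification—the statutes vary by state and turn on specific definitions—but it captures why encrypting data and safeguarding the key matters so much:

```python
def breach_likely_notifiable(data_encrypted: bool, key_also_compromised: bool) -> bool:
    """Simplified sketch of the encryption safe harbor under many state
    breach notification statutes: compromised data that is encrypted, with
    the encryption key kept separate and uncompromised, is generally not
    treated as personal information, so notification duties may not attach.
    Actual outcomes depend on the specific statute and facts."""
    if data_encrypted and not key_also_compromised:
        return False   # safe harbor likely applies; data unusable to the attacker
    return True        # unencrypted data, or key exposed: treat as notifiable
```

The rule also illustrates why storing the encryption key alongside the data defeats the purpose: if both are taken in the same incident, the safe harbor is lost.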

Security—Cybercriminals Are Exploiting COVID-19 with Increased Attacks Aimed at Remote Workers

The COVID-19 outbreak presents a new challenge for businesses under new and existing data security laws. Typically, a business designs its information security program to secure an office, data center, or other location that the business can control. But with shelter-in-place orders in effect, a great deal of business is now conducted from employees’ homes, presenting new threats not previously contemplated.

Chief among these threats may be the risk of exposing corporate information and conversations to third parties present in the home, such as housemates, family members, guests, and even connected smart devices. None of these third parties are subject to the business’s privacy or security programs or policies, but some of them may work for other businesses, including a competitor.

Cyberfraud attacks are another threat where risk increases for businesses operating with a remote workforce. Attacks such as phishing, spoofing, and other social engineering tactics are on the rise during the COVID-19 pandemic. Attackers are exploiting the global climate of FUD—fear, uncertainty, and doubt—to prey on people and businesses that are focused on navigating unprecedented circumstances. Some common attack strategies are for scammers to impersonate financial or medical relief providers, such as small business associations, government agencies, and even COVID-19 testing providers or centers, in order to prompt an action that would expose an individual or business to financial harm.

Unfortunately, a lot of people—and therefore many businesses—are more susceptible to these types of attacks right now. The ambient differences between office and home environments can present unexpected security risks. For most people, home is a safe place where they can let their guard down. It’s also an environment that demands attention to non-work related matters. Homes typically also do not have the on-site resources of an office, including colleagues and a help desk, which can be crucial to spotting and stopping cybercrime. The reality is that most people behave and think differently at home than they do at the office. It’s a personal space, so work-related threat awareness may be unintentionally relaxed, and bad actors are trying to exploit this.

Security—Maintaining a Remote Workforce Warrants a Fresh Look at Cybersecurity Controls

Despite the increased risk of certain cyber threats, businesses are generally experiencing that a remote workforce works. The result is that many individuals and businesses are planning to expand or continue remote work arrangements even after shelter-in-place orders lift. As such, organizations would be wise to reassess their information security programs to account for new risks associated with maintaining a large remote workforce. There are a number of steps that a business can take to harden its security posture. These range from deploying hardware and software solutions across the workforce, to implementing more robust and engaging training protocols.

As with privacy concerns, a business’s best first step is to understand its data security landscape, including obligations, risks, and potential gaps in its program. It should also implement regular reviews of vendor contracts and insurance policies into its risk management program. Cyber threats evolve quickly, and having awareness of the types of attacks the business is most susceptible to will enable it to develop better incident prevention, detection, and response programs. With this knowledge in hand, the business can consult with its attorneys, vendors, and insurers as needed about the actions to take to enhance its information security program and minimize associated risks.

ONLINE ACCESSIBILITY

Accessibility—ADA Litigation Related to Website Accessibility Is Rapidly Increasing

Online accessibility is relatively new in terms of risks associated with digital assets, but litigation related to this topic is growing at an alarming rate. As used in this context, accessibility refers to making website content more accessible to a wider range of people with disabilities.

Title III of the Americans with Disabilities Act of 1990 (“ADA”) prohibits discrimination on the basis of disability in places of public accommodation. Title III does not directly address whether places of public accommodation include websites, mobile applications, or other platforms available to the public. However, the Department of Justice (“DOJ”), the primary federal agency responsible for enforcing the ADA, has taken the position that Title III applies to all public-facing websites used by companies that otherwise qualify as places of public accommodation. To complicate the matter, courts around the country are split on when to apply the ADA to websites. In October 2019, the U.S. Supreme Court declined to review a Ninth Circuit decision holding that a global restaurant chain’s website was required to be accessible to individuals with disabilities.

The Supreme Court’s decision not to review may prove to be a tipping point in what has been a steady increase in litigation alleging violations of Title III of the ADA over the past few years. Such lawsuits target companies whose websites or mobile applications serve as places of public accommodation but do not comply with the Web Content Accessibility Guidelines (“WCAG”) 2.0 or higher, or similar frameworks. In general, these lawsuits allege that such websites and apps are not accessible to individuals who are blind or visually impaired and who rely on screen-reading tools or assistive technologies.

The number of Title III lawsuits has steadily risen over the last decade. Website accessibility lawsuits in particular have sharply increased over the past few years, jumping from 262 cases in 2016, to 814 cases in 2017, and up to approximately 2,300 cases in each of 2018 and 2019. In view of the Supreme Court decision in late 2019 to decline to review the Ninth Circuit decision requiring a company’s website to meet certain accessibility standards, there is anticipation that website accessibility lawsuits will see a jump in 2020.

Pursuing Title III violations can be attractive to plaintiffs’ attorneys. Title III violations are relatively easy to spot, and although plaintiffs are limited to equitable relief—i.e., curing the violations—they are entitled to recover attorneys’ fees for a successful claim. As such, website accessibility suits often result in a quick settlement in addition to a requirement that the alleged violator bring its website into compliance with an accepted accessibility framework.

In 2018, the DOJ outlined its position on the matter as follows: (1) the ADA applies to websites that qualify as places of public accommodation; (2) the absence of specific regulation does not serve as a basis for noncompliance; and (3) places of public accommodation have flexibility in how they choose to comply with the ADA’s general requirements (at least until formal regulation is adopted). Various courts have cited WCAG 2.0 as the standard for ADA compliance.

The CCPA will potentially add an additional layer of regulatory risk to these cases. The most recent set of draft enforcement regulations for the CCPA published by the California Attorney General’s Office indicates that required CCPA disclosures posted online must comply with a generally recognized industry accessibility standard, such as WCAG version 2.1. The Attorney General can levy civil penalties of up to $2,500 per unintentional violation of the CCPA, and $7,500 for intentional violations.

In our current climate, with many people working from home and a rise in online shopping and traffic on company websites, it is more important than ever to make sure your website is compliant with the ADA if your site qualifies as a place of public accommodation. Doing so will also make your website accessible to more visitors.

Steps places of public accommodation should take now:

  • Assess your website’s accessibility. Automated online accessibility checkers can give companies a preliminary view of their website and needed improvements, but they should not substitute for a formal evaluation.
  • Create a transition plan with steps and deadlines by which you intend to become compliant. While transition plans do not eliminate the risk of lawsuits, some company-defendants have used them to help limit damages.
  • Begin working toward WCAG 2.0 Success Criteria (or a similar standard). This includes vetting vendors for website or app development or integrations to ensure compatibility, as well as listing compliance with WCAG 2.0 or similar as a requirement when outsourcing projects.
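As one small example of the kind of check an automated accessibility scan performs, the following sketch flags `<img>` tags with no `alt` attribute at all—text alternatives for images are required by WCAG 2.0 Success Criterion 1.1.1. A real audit covers far more (keyboard navigation, contrast, form labels, and so on), and note that an empty `alt=""` is intentionally not flagged here, since it is valid for purely decorative images:

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Collects <img> tags lacking any alt attribute (a common WCAG 2.0
    SC 1.1.1 failure). Empty alt="" is allowed for decorative images."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        attr_map = dict(attrs)
        if tag == "img" and "alt" not in attr_map:
            self.missing.append(attr_map.get("src", "<no src>"))

checker = MissingAltChecker()
checker.feed('<img src="logo.png" alt="Company logo"><img src="chart.png">')
print("Images missing alt text:", checker.missing)
```

Automated checks like this are a useful screen, but, as noted above, they are no substitute for a formal evaluation against the full WCAG success criteria.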

If you have questions about creating a website accessibility compliance plan or if you have received complaints about your website, contact an attorney who is admitted to practice in your jurisdiction, so that you can discuss next steps.

CONCLUSION

With the ongoing evolution of laws related to privacy, security, and online accessibility, as well as threats from cybercrime, businesses must continually assess and evolve their data governance, risk, and compliance programs. The transition to a remote workforce, whether temporary or long term, presents an opportunity for businesses to review such programs through a new lens, and to engage with employees about information privacy, security, and online accessibility through the shared experience of sheltering in place. By taking the time to understand the obligations and risks associated with its digital assets, a business will be better prepared to anticipate and respond to future developments in the online legal and risk landscape.

Business representatives should consult their attorneys or advisors to discuss any questions about their legal and compliance obligations and risks related to their online and digital assets.

[1] Ponemon Institute & IBM Security, 2019 Cost of a Data Breach Report (reporting that the average data breach in the United States involves 32,434 records).
