
Year in Review: 2021’s Top Privacy and Cybersecurity Trends

Despite the ongoing COVID-19 pandemic, 2021 proved to be another incredibly busy year for consumer privacy and cybersecurity. In this blog post, we revisit some of the most important domestic and international privacy and cybersecurity trends of the past year. 

 

New State Consumer Privacy Laws 

On the heels of the California Consumer Privacy Act (CCPA) and the California Privacy Rights Act (CPRA), Virginia and Colorado became the next two states to enact comprehensive consumer privacy laws. Signed into law by Governor Ralph Northam back in March, the Virginia Consumer Data Protection Act (VCDPA) becomes effective on January 1, 2023, and applies to companies that conduct business in Virginia or produce products or services targeted to Virginia residents and that meet certain thresholds. Months later in July, Governor Jared Polis signed the Colorado Privacy Act (CPA) into law. Set to go into effect on July 1, 2023, the CPA applies to controllers that conduct business in Colorado or produce or deliver commercial products or services that are intentionally targeted to residents of Colorado and meet certain thresholds. Both the VCDPA and the CPA carve out several exemptions for entities that are already covered by the privacy and security requirements of other federal laws. Unlike the CCPA and the VCDPA, however, the CPA does not provide an exemption for non-profit organizations. Furthermore, neither the VCDPA nor the CPA offers a private right of action. 

Other notable state privacy developments include New York’s new rules on employee electronic monitoring as well as Nevada’s SB260 amendment, which expanded the right to opt out of sales and created new requirements for “data brokers.” 

As we head into 2022, we anticipate that the patchwork of state consumer privacy laws will continue to grow. Beckage recommends that businesses take proactive steps to first evaluate what laws and regulations apply to their business and then develop a comprehensive roadmap and plan to mature their data privacy and security posture both internally and externally.   

 

Continued Focus on Cybersecurity 

Threat actors in 2021 continued to launch increasingly sophisticated ransomware and cyberattacks against businesses of all sizes and in all industries. In the wake of highly disruptive attacks such as SolarWinds and the Colonial Pipeline ransomware attack, both the federal government and state governments increased their focus on cybersecurity standards. For example, the New York State Department of Financial Services (NYDFS) issued guidance to cyber insurers in the form of the Cyber Insurance Risk Framework. The Cybersecurity and Infrastructure Security Agency (CISA) also regularly issued advisories informing businesses of vulnerabilities. In an effort to secure critical infrastructure, President Biden signed an Executive Order on “Improving the Nation’s Cybersecurity” in May. The new Civil Cyber-Fraud Initiative announced by the Department of Justice back in October further underscores the growing importance of developing and maintaining resilient cybersecurity protocols.  

The federal government’s response to this year’s exponential increase in ransomware attacks has led several high-profile threat actors – such as DarkSide, REvil, and BlackMatter – to take their dark web platforms offline.  At the same time, however, new variants of ransomware are constantly emerging, and there is significant evidence that experienced cybercriminals are rebranding to evade law enforcement rather than shutting down their operations.   

In this complex threat landscape, companies across industries are wisely seeking to secure or renew cyber liability coverage in an increasingly competitive market.  Insurers are asking meaningful questions about applicants’ security programs and expecting strong safeguards to be in place.  For organizations of all sizes, the past year has shown that cybersecurity incidents are now a question of when rather than if.  

Beckage’s Incident Response Team urges businesses to develop plans and procedures to mitigate cyber and legal risk. Beckage recommends businesses continue to dedicate internal resources to refining compliance programs and testing incident response plans through tabletop training exercises. 

 

Health Privacy and Compliance Challenges 

Our lives have become increasingly digitized, and 2021 was no different – especially with the COVID-19 pandemic. The proliferation of apps and technologies handling personal health data led the FTC to confirm back in September that the requirements contained in the agency’s Health Breach Notification Rule extend to health apps and connected device companies. And as the world continued to operate under the shadow of the COVID-19 pandemic, businesses faced – and will continue to face – uncertainty regarding new federal vaccination and testing policies. Beckage’s Data Security and Privacy Compliance and Health Law Teams recommend businesses take stock of their employee data collection practices in their efforts to prevent the spread of COVID-19. 

 

Biometrics Class Actions, BIPA Claims Accrual, and Statute of Limitations 

In 2021, litigation under Illinois’ Biometric Information Privacy Act (BIPA) remained at the forefront of the data privacy landscape. As we noted back in January, March, and April, BIPA’s private right of action has contributed in part to an increase in the number of class actions. In September, the First District of the Illinois Appellate Court found that the statute of limitations period could range from one year to as much as five years depending on the nature of the alleged violation. But as the year closed out, Illinois courts continued to wrestle with the issues of BIPA claims accrual and statute of limitations. As this blog post goes to press, the U.S. Court of Appeals for the Seventh Circuit had just issued its decision in Cothron v. White Castle, certifying the issue of BIPA claims accrual to the Illinois Supreme Court.  

 

Website Accessibility Litigation and What Counts as a Place of Public Accommodation 

The Beckage Accessibility Team continues to see a drastic increase in litigation filed under Title III of the Americans with Disabilities Act (ADA), along with rapidly evolving case law surrounding website accessibility claims. 2021 is set to be a record-breaking year, with approximately 4,000 new lawsuits filed this year alone, most of them against small and medium-sized businesses. The issue of whether websites qualify as places of public accommodation under the ADA continued to take shape in 2021. For example, in May the Eleventh Circuit Court of Appeals held in Gil v. Winn-Dixie Stores that a website is not a “place of public accommodation” under Title III of the ADA, creating a clear conflict with Ninth Circuit authority holding that a website is a place of public accommodation if there is a nexus to a brick-and-mortar location. In September, the United States District Court for the Eastern District of New York issued a decision in Winegard v. Newsday LLC, which also concluded that a website is not a “place of public accommodation” under Title III of the ADA. Despite this unsettled landscape, we anticipate more litigation to come around the specific statutory definition of what constitutes a “public accommodation.” 

Nevertheless, there is no end in sight for companies facing lawsuits under the ADA. Accordingly, Beckage recommends that businesses with any online presence or mobile application take proactive steps and prioritize accessibility internally. Minimizing legal risk through a digital accessibility compliance buildout that includes both a full-scale audit of digital assets and internal and external policy development is recommended for all businesses looking ahead into 2022.  

 

Telephone Consumer Protection Act (TCPA) 

TCPA class actions remain numerous. Beckage’s TCPA team has charted the complex legal landscape surrounding text message marketing and telemarketing throughout the course of 2021. In April, we covered the decision by the Supreme Court of the United States in Facebook v. Duguid et al., which narrowed the TCPA’s definition of an automatic telephone dialing system to equipment that uses a random or sequential number generator. In November, we covered Florida’s new telemarketer requirements. As we head into 2022, TCPA compliance will continue to be an important area of focus for businesses. Businesses that leverage text message marketing as part of their consumer outreach should evaluate compliance initiatives and stay up to date on this fast-moving area of the law. 

 

More Global Privacy and Cybersecurity Developments 

Privacy and cybersecurity continued to be areas of significant focus on an international scale. For example, China’s new Data Security Law (DSL) and Personal Information Protection Law (PIPL) became effective on September 1 and November 1, respectively. Along with the Cybersecurity Law (CSL) of 2017, these two new laws add a set of cross-border requirements for international companies seeking to do business in China. Furthermore, following the Schrems II decision, which invalidated the EU-US Privacy Shield, the EU Commission released new standard contractual clauses (SCCs) intended to provide more flexibility and options for cross-border data exchange. The new SCCs apply to all new contracts entered into as of September 27, and businesses have until December 27, 2022 to transition contracts using the older SCCs to the new SCCs. Additionally, Québec’s Bill 64, which received royal assent a few months ago, introduces a series of new requirements coming into effect over the next few years for businesses both within and outside the province. 

On the global data privacy class action front, the UK Supreme Court’s recent decision in Lloyd v. Google suggests that opt-out class action cases for data privacy claims will be very difficult to bring. 

 

Conclusion and Key Takeaways 

In the midst of the ongoing COVID-19 pandemic and a rise in sophisticated cyberattacks, 2021 saw many privacy and cybersecurity trends and developments. There were new laws and regulations on both a domestic and an international scale. Case law in relevant areas developed rapidly, with some issues still unresolved as we embark on 2022. Things do not seem to be slowing down in the realm of privacy and cybersecurity. Beckage’s team of attorneys and technologists works with businesses of all sizes and industries to develop comprehensive, scalable data security and privacy infrastructures to navigate this fast-moving area. 

*Attorney Advertising. Prior results do not guarantee similar outcomes. 



Accountability and the Use of Artificial Intelligence

As artificial intelligence (“AI”) and automated decision-making systems make their way into every corner of society – from businesses and schools to government agencies – concerns about using the technology responsibly and accountability are on the rise. 

The United States has always been at the forefront of technological innovation, and our government policies have helped us remain there.  To that end, on February 11, 2019, President Trump issued an Executive Order on Maintaining American Leadership in Artificial Intelligence (No. 13,859).  See Exec. Order No. 13,859, 3 C.F.R. 3967.  As part of this Executive Order, the “American AI Initiative” was launched with five guiding principles:

  1. Driving technological breakthroughs; 
  2. Driving the development of appropriate technical standards; 
  3. Training workers with the skills to develop and apply AI technologies; 
  4. Protecting American values, including civil liberties and privacy, and fostering public trust and confidence in AI technologies; and
  5.  Protecting U.S. technological advantages in AI, while promoting an international environment that supports innovation. Id. at § 1. 

Finally, the Executive Order tasked the National Institute of Standards and Technology (“NIST”) of the U.S. Department of Commerce with creating a plan for the development of technical standards to support reliable, robust, and trustworthy AI systems.  Id. at § 6(d). To that end, NIST released its Plan for Federal Engagement in Developing Technical Standards in August 2019.  See Nat’l Inst. of Standards & Tech., U.S. Leadership in AI: A Plan for Federal Engagement in Developing Technical Standards and Related Tools (2019). 

While excitement over the use of AI was brewing in the executive branch, the legislative branch was focused on its accountability: on April 10, 2019, the Algorithmic Accountability Act (“AAA”) was introduced in Congress.  See Algorithmic Accountability Act of 2019, S. 1108, H.R. 2231, 116th Cong. (2019).  The AAA would have covered businesses that: 

  1. Made more than $50,000,000 per year;
  2. Held data on more than 1,000,000 customers; or
  3. Acted as a data broker to buy and sell personal information.  Id. at § 2(5). 

The AAA would have required businesses to conduct “impact assessments” on their “high-risk” automated decision systems in order to evaluate the impacts of the system’s design process and training data on “accuracy, fairness, bias, discrimination, privacy, and security”.  Id. at §§ 2(2) and 3(b).  These impact assessments would have had to be performed “in consultation with external third parties, including independent auditors and independent technology experts”.  Id. at § 3(b)(1)(C).  Following an impact assessment, the AAA would have required businesses to reasonably address the results in a timely manner.  Id. at § 3(b)(1)(D).  

It was not just the federal government that was concerned about the use of AI in business: on May 20, 2019, the New Jersey Algorithmic Accountability Act (“NJ AAA”) was introduced in the New Jersey General Assembly.  The NJ AAA was very similar to the AAA in that it would have required businesses in the state to conduct impact assessments on “high-risk” automated decisions. See New Jersey Algorithmic Accountability Act, A.B. 5430, 218th Leg., 2019 Reg. Sess. (N.J. 2019).  These “automated decision system impact assessments” would have required an evaluation of the system’s development, “including the design and training data of the automated decision system, for impacts on accuracy, fairness, bias, discrimination, privacy, and security,” as well as a cost-benefit analysis of the AI in light of its purpose.  Id. at § 2.  The NJ AAA would also have required businesses to work with independent third parties, record any bias or threat to the security of consumers’ personally identifiable information discovered through the impact assessments, and provide any other information required by the New Jersey Director of the Division of Consumer Affairs in the New Jersey Department of Law and Public Safety.  Id. 

While the aforementioned legislation appears to have stalled, we nevertheless anticipate that both federal and state legislators will once again take up the task of both encouraging and regulating the use of AI in business as the COVID-19 pandemic subsides.  Our team at Beckage includes attorneys who are focused on technology, data security, and privacy and who have the experience to advise your business on best practices for the adoption of AI and automated decision-making systems. 

*Attorney Advertising. Prior results do not guarantee future outcomes. 



Biometric Litigation Continues To Rise As Businesses Work To Minimize Risk

In 2008, Illinois enacted the Illinois Biometric Information Privacy Act (“BIPA”) with the purpose of recognizing a person’s privacy right to their “biometric information” and “biometric identifiers”.  BIPA was enacted in response to the growing use of biometrics by businesses.   

In part because of its private right of action, by which plaintiffs may bring suit against businesses directly, BIPA litigation remains at the forefront of the data privacy litigation landscape as businesses continue to collect the biometric identifiers of their employees.  Recent BIPA class action settlements with major tech companies like Facebook and TikTok have been in the hundreds of millions of dollars, but the majority of BIPA litigation is brought against small and medium-sized enterprises that collect biometric information for employee timekeeping or for access controls to physical spaces.   

To date, defendants have found courts to be generally unwilling to dismiss BIPA litigation at early motion practice.  Two recent cases, Thornley v. Clearview AI and Barton v. Swan Surfaces, demonstrate that there are some potential limits to BIPA litigation. 

Thornley v. Clearview AI 

In Thornley, Melissa Thornley accused Clearview AI of scraping publicly available photos from her social media accounts for facial recognition purposes and selling her biometric information to third parties without her consent.  Thornley v. Clearview AI, Inc., 984 F.3d 1241, 1242-1243 (7th Cir. 2021).  Thornley initially filed a complaint in Illinois state court, alleging, as a class representative, that Clearview violated § 15(c) of BIPA, which provides, in relevant part, that “[n]o private entity in possession of a biometric identifier or biometric information may sell, lease, trade, or otherwise profit from a person’s or a customer’s biometric identifier or biometric information.”  Id. at 1246.  Clearview removed the case to federal court on the basis that the allegation of a statutory violation gave rise to the concrete and particularized injury-in-fact necessary for Article III standing.  Id. at 1243.  Under the Constitution, a plaintiff must have Article III standing to sue in federal court, which requires that the plaintiff prove: (1) an injury in fact; (2) causation of the injury by the defendant; and (3) that the injury is likely to be redressed by the requested relief.  See Spokeo, Inc. v. Robins, 136 S. Ct. 1540, 1547 (2016).  In Spokeo, the Supreme Court of the United States held that a statutory violation could be sufficient to constitute an injury in fact; however, it did not provide any analysis as to which types of statutory violations necessarily implicate concrete and particularized injuries in fact.  Id.   

The district court held that Clearview’s alleged violation of § 15(c) of BIPA was “only a bare statutory violation, not the kind of concrete and particularized harm that would support standing,” and that the case therefore had to be remanded to state court.  Thornley, 984 F.3d at 1242.  Clearview then appealed to the Seventh Circuit, which agreed with the district court and remanded the case back to the Illinois state court for much the same reason: lack of standing.  Id.  Clearview has now petitioned the Supreme Court of the United States to take its case.  See Porter Wells, Clearview AI Will Take BIPA Standing Challenge to Supreme Court. 

Barton v. Swan Surfaces, LLC 

In Barton, a unionized employee of Swan Surfaces, LLC (“Swan”) was required to clock in and out of her employer’s manufacturing plant using her fingerprints as part of company protocol.  Barton v. Swan Surfaces, LLC, No. 20-cv-499-SPM, 2021 WL 793983, at *1 (S.D. Ill. Mar. 2, 2021).  On May 29, 2020, Barton filed a complaint in the United States District Court for the Southern District of Illinois alleging that she represented a class of individuals who, “while residing in the State of Illinois, had their fingerprints collected, captured, received, otherwise obtained and/or stored by Swan”.  Id. at *2.  Barton asserted that Swan violated BIPA by: (1) failing to institute, maintain, and adhere to a publicly available retention schedule in violation of 740 ILCS 14/15(a); and (2) failing to obtain informed written consent and release before collecting biometric information.  Id.  On July 31, 2020, Swan filed a Motion to Dismiss, asserting in relevant part that Barton’s BIPA claims were preempted by § 301 of the Labor Management Relations Act (“LMRA”).  Id.  

On March 2, 2021, the court held that because Barton was a unionized employee, her Collective Bargaining Agreement (“CBA”), which contained a management rights clause and grievance procedure, controlled, and as such Barton’s BIPA claims were preempted by § 301 of the LMRA.  In reaching its conclusion, the court relied heavily on the Seventh Circuit’s holding in Miller v. Southwest Airlines, Inc., 926 F.3d 898 (7th Cir. 2019), a case arising under the Railway Labor Act (“RLA”). Id. at *6. In Miller, the Seventh Circuit held that an adjustment board had to resolve the employees’ dispute over the airline’s fingerprint collection practices because their unions may have bargained over the practice on their behalf.  Miller, 926 F.3d 898.  The court in Barton noted that the United States Supreme Court “has held that the RLA preemption standard is virtually identical to the pre-emption standard the Court employs in cases involving § 301 of the LMRA” and that the same outcome should therefore apply.  Barton, 2021 WL 793983 at *4. 

Key Takeaway 

While these cases demonstrate some potential limits on BIPA litigation, the volume of biometric information used by companies will undoubtedly continue to grow, as will the push for biometric policies that govern the use of these technologies and promote safeguards for consumers.  

With many states looking to implement biometric privacy laws similar to BIPA, it is important to have legal tech counsel to address compliance with these emerging laws. Beckage attorneys, who are also technologists and former tech business owners, have years of collective experience with new technologies such as artificial intelligence, biometric data, and facial recognition technology. We have a team of highly skilled lawyers who stay up to date on all developments in BIPA case law and who can help your company mount its best defense given the current legal landscape. Our team can assist your company in assessing and mitigating risks associated with emerging technologies. 

*Attorney Advertising: Prior results do not guarantee a similar outcome. 



Canada’s New Privacy Bill Aims to Strengthen Privacy Rights for Citizens

On November 17, 2020, the Canadian Minister of Innovation, Science, and Industry introduced a new federal privacy bill that would reshape Canada’s privacy framework, with a main goal of strengthening interoperability with both the European Union and the United States. Bill C-11 proposes the Digital Charter Implementation Act, 2020, which includes the Consumer Privacy Protection Act. This legislation would significantly increase protection of Canadian personal information by enhancing Canadians’ control over their data and demanding more transparency from companies as to their handling of personal information. The Digital Charter Implementation Act includes:

  1. Increased control and transparency over how companies handle Canadians’ personally identifiable information,
  2. The ability for Canadians to move their information from one organization to another in a secure manner,
  3. The right for Canadians to have their information destroyed,
  4. The ability of the Privacy Commissioner to force an organization to comply and to order businesses to stop collecting or using personal information, and
  5. The strongest fines among G7 privacy laws.

Penalties and Provisions

There are significant fines for noncompliant businesses – up to 5% of revenue or a sum of Can$25 million, whichever is higher. The bill would enact the Consumer Privacy Protection Act (CPPA) to protect an individual’s personal information while regulating organizations’ collection, use, and disclosure of personal information. The CPPA would also strengthen consent requirements for handling personal information, create transparency requirements with respect to algorithms and artificial intelligence (AI), provide for mobility of personal data, address retention and disposal of personal information, and codify legitimate interests where consent is not required. The CPPA would update the Personal Information Protection and Electronic Documents Act, which governs how private-sector organizations collect, use, and disclose personal information in the course of commercial activities.

Part of Bill C-11 also introduces the Personal Information and Privacy Protection Tribunal Act (PIPPTA). The PIPPTA would create an accelerated and more direct path to enforcement of orders from the Office of the Privacy Commissioner, supporting its expanded role and providing strong enforcement. The bill also includes a private right of action, allowing individuals to sue where the Commissioner issues a finding of a privacy violation and that finding is upheld by the Tribunal. However, all claims must be brought within two years of the violation.

Impact

Canada’s proposed federal privacy bill follows the lead of the European Union’s General Data Protection Regulation and the United States’ California Consumer Privacy Act. Canada’s privacy bill was created to impose obligations on any business that collects Canadian personal data. Businesses and companies that fail to comply will be subject to the penalties outlined above. If Bill C-11 is passed, US businesses that collect and/or process the personal data of Canadians will have to enact procedures that comply with the Consumer Privacy Protection Act and other requirements in the bill. As with any new piece of data legislation, it is crucial that potentially impacted companies perform a thorough review of their public-facing privacy practices and update their internal procedures to address any new compliance requirements.

At Beckage, we have a team of Global Data Privacy Attorneys that continue to monitor the constantly evolving data privacy and cybersecurity legislation landscape. The Beckage team is made up of technologists and Certified Information Privacy Professionals (CIPP/US & CIPP/E) who can help develop and review new and existing privacy policies compliant with Bill C-11 and other international legislation to help protect your business.

*Attorney Advertising. Prior results do not guarantee similar outcomes.



Mobile App Developers Take Notice Of New Apple Privacy Requirements

Companies that have, or are in the process of developing, mobile applications distributed through Apple’s App Store should be aware of recent privacy updates and should take steps to prepare their businesses for these new privacy requirements in 2021. 

Apple’s Announcement

Beginning on December 8, 2020, Apple will impose specific requirements for the disclosure of privacy practices for all applications on their App Store product pages.  The App Store product page will now feature a new privacy information section designed to help users understand an app’s privacy practices before they download the app on any Apple platform, including data collection practices, the types of data collected, the data linked to the user, user tracking, and privacy links.  More details about Apple’s announcement can be found on the privacy details page, and additional guidance on how to provide app privacy information can be found in Apple’s App Store Connect.

In addition to providing information about your app’s data collection practices on your product page, apps on iOS 14, iPadOS 14, and tvOS 14 will be required to receive user permission (opt-in consent) to track users across apps or websites owned by other companies or to access the device’s advertising identifier. This change allows users to choose whether they permit an app to track them or access their device’s advertising identifier.

Tracking refers to the act of linking user or device data collected from your app with user or device data collected from other companies’ apps, websites, or offline properties for targeted advertising or advertising measurement purposes.  Tracking also refers to sharing user or device data with data brokers.  To provide developers time to make necessary changes, apps will be required to obtain permission to track users starting early next year.  Additional guidance can be found at the Apple developer’s blog page.
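For developers assessing what this opt-in flow looks like in practice, the short sketch below shows one common pattern built on Apple’s AppTrackingTransparency framework, which backs the permission prompt described above. It assumes an iOS 14 or later target and that the app’s Info.plist already contains an NSUserTrackingUsageDescription string explaining why tracking is requested; the function name is illustrative, and this is a minimal sketch rather than a complete compliance solution.

```swift
import AppTrackingTransparency
import AdSupport

// Requests the user's permission before any cross-app tracking or use of the
// device's advertising identifier (IDFA). Call this once the app is active,
// for example from applicationDidBecomeActive or a SwiftUI .onAppear handler.
func requestTrackingPermission() {
    // The AppTrackingTransparency prompt only exists on iOS 14 and later.
    guard #available(iOS 14, *) else { return }

    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // The user opted in: the IDFA may be used for ad measurement.
            let idfa = ASIdentifierManager.shared().advertisingIdentifier
            print("Tracking authorized, IDFA: \(idfa)")
        case .denied, .restricted, .notDetermined:
            // The user declined or the prompt could not be shown: the IDFA is
            // zeroed out and no cross-app tracking should occur.
            print("Tracking not authorized: \(status)")
        @unknown default:
            print("Unhandled tracking authorization status")
        }
    }
}
```

Whatever the implementation details, the legal point remains the same: the prompt language and the surrounding disclosures should accurately reflect what the app actually does with user data, which is where counsel review comes in.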

What To Do Now

Businesses should take steps to make sure their current practices are legally compliant and address Apple’s new guidelines.

Now is an ideal time to work with your tech legal counsel to review your privacy policy, the App Store guidelines, and applicable laws to confirm that the statements made throughout your policy are true and accurate representations of your data collection and sharing practices. Apps will need to create standardized privacy disclosures for the App Store to meet format and content requirements, and these responses should be carefully reviewed so as not to conflict with any existing privacy statements.  Your internal business practices and collection protocols may change from time to time, which is why Beckage recommends an annual review of your privacy policy and related practices.  

Additionally, businesses should consult with their tech legal counsel to review and update consent language and disclosures for pop-ups and any related consent forms.  There may be specific regulatory or statutory requirements for obtaining consent through a mobile application that need to be evaluated.  For example, although there are not currently opt-in requirements under the CCPA, there are specific consent requirements under the GDPR that may need to be met should the GDPR apply to your application.

Beckage lawyers have worked with numerous mobile app developers on privacy matters.   The Beckage team of lawyers is made up of technologists and certified privacy professionals who can help develop and review new and existing privacy policies to ensure compliance with Apple’s new privacy requirements. To reach a Beckage attorney, call 716.898.2102.

*Attorney Advertising. Prior results do not guarantee future outcomes.

