
BIPA Illinois Biometric Law Sets the Stage for Biometric Litigation

COVID-19 is accelerating companies’ adoption of biometric technologies.  With a global shift towards remote working, biometric technologies, which measure physiological, behavioral, and psychological characteristics, can promote, or at least monitor, productivity by recording employee performance.  Facial recognition systems have also been vital to contactless engagement, especially in the airline and retail sectors, and such systems are likely to remain in use after the pandemic subsides.  This burgeoning biometric industry is garnering interest from lawmakers.  Given the firm’s technology-driven focus, Beckage has been tracking biometric laws and will continue to monitor legal and business developments surrounding biometric technologies.

Biometric Data and the Law

Unlike other personal data, such as passwords, social security numbers, and payment card information, biometric identifiers cannot easily be changed once breached.  Because they are immutable by nature, regulations classify them as a sensitive class of personal data.  Notable laws that govern biometric data include the E.U. General Data Protection Regulation (GDPR) and U.S. state laws, including California’s comprehensive privacy law.  Three states, Illinois, Texas, and Washington, have passed biometric-specific laws.  New York State recently introduced the Biometric Privacy Act, a bill that is nearly identical to Illinois’ BIPA, and other states, such as Arkansas and California, have amended their breach notification laws to classify biometric data as personal identifying information.

The first step to knowing whether biometric regulations apply to your business is understanding the definition of biometric data.  The GDPR defines biometric data as “personal data resulting from specific technical processing relating to the physical, physiological or behavioral characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data.” Art. 4(14).  Similarly, U.S. biometric laws protect biometric data characterized in terms of personal identifiers, including retina scans, iris scans, fingerprints, voiceprints, hand scans, and face geometry.  For example, the Illinois Biometric Information Privacy Act (BIPA) defines biometric information as “any information, regardless of how it is captured, converted, stored, or shared, based on an individual’s biometric identifier used to identify an individual.” Sec. 10.

U.S. Biometric Litigation Trends

Recent rulings in biometric litigation indicate that BIPA currently drives the legal landscape on biometric data protection in the U.S.  BIPA litigation is on the rise following the Illinois Supreme Court’s 2019 decision in Rosenbach v. Six Flags.  The plaintiff in Rosenbach was the mother of a minor whose fingerprint was captured to verify his identity for entry to an amusement park owned by the defendant.  The Court rejected the defendant’s argument that the plaintiff had not suffered any actual or threatened harm and held that a plaintiff can sue based on a mere technical violation of the law.  This decision means that a person does not have to suffer actual harm to pursue a biometric suit under BIPA.  Further, federal courts have agreed that failure to implement privacy policies outlining procedures for the collection, retention, and destruction of biometric identifiers is sufficient to demonstrate a violation of the law.  For example, in May 2020, the Seventh Circuit in Bryant v. Compass found the Rosenbach ruling instructive in holding that the plaintiff could pursue a lawsuit against a vending machine operator whose workplace vending machines used biometric authentication in lieu of credit card payments.

The types of companies involved in BIPA litigation are diverse.  Any company that collects, stores, or uses biometric information related to Illinois residents is subject to BIPA.  To that end, no industry seems immune: plaintiffs have sued big tech companies using facial recognition technologies and smaller companies, such as nursing homes, using fingerprinting systems for timekeeping.  The Compass ruling illustrates that third-party vendors who provide biometric authentication systems in the workplace are within the reach of BIPA.

The diversity in cases signals the legislative impact of the law and spotlights the role of privacy policies and procedures.  BIPA is the only biometric law in the U.S. that allows individuals to sue a company for damages, in amounts ranging from $1,000 to $5,000 per violation.  Thus, the stakes can be high for companies without proper biometric data governance.

What should companies do?

To comply with BIPA and other evolving biometric laws, companies should work with experienced lawyers who understand biometric technologies and regulations to address the following controls and practices:

  • Properly inform individuals or responsible parties about the purpose of collecting their biometric data.
  • Properly inform individuals or responsible parties about the company’s biometric collection, retention, storage, and dissemination policies and procedures.
  • Obtain written consent from individuals or their responsible party before collecting biometric data.
  • Make the company’s written biometric policy establishing retention schedule and destruction guidelines publicly available.

A robust biometric compliance program should reflect current laws and be flexible and scalable enough to adapt to the changes that new biometric legal rules will inevitably bring to a company’s privacy compliance program.  Beckage’s lawyers, who are also technologists, are equipped with the skills and experience to build a robust biometric compliance program.  We stand ready to answer any of your questions.

Subscribe to our newsletter.

*Attorney Advertising.  Prior results do not guarantee future outcomes.


Looking Back on 2020’s Top Privacy and Cybersecurity Trends

As 2020 comes to a close, Beckage looks back on the ways this difficult and unprecedented year impacted the data privacy and cybersecurity landscape both domestically and across the globe.

Enhanced Privacy Challenges and Concerns Due to Covid-19

In response to the COVID-19 pandemic, businesses around the globe made a major pivot to online or virtual operations early this year. An intentional focus on data protection, grounded in a solid understanding of the regulatory landscape, is a legal requirement that demands the integration of data protection up front in any network design or business practice. The increased exposure of company assets made it necessary to implement a variety of technical safeguards. Companies still had to meet the compliance milestones of the NY SHIELD Act and the California Consumer Privacy Act (CCPA) while dealing with new privacy challenges caused by a distributed workforce and a global health pandemic. Beckage reminds organizations of the importance of revisiting their readiness through business continuity planning, incident response, and more expansive administrative, technical, and physical safeguards when shifting to a work-from-home model, and recommends continued assessment of your company’s privacy pitfalls in this ever-shifting legal landscape.

Increased Ransomware and Cyberattacks

With rapid changes in organizational operations caused by the COVID-19 pandemic, attackers became more sophisticated in their strategies and unleashed several unrelenting, simultaneous attacks on service providers and the organizations they serve in 2020. Victims of recent cyberattacks, such as the SolarWinds campaign uncovered in December, include government agencies, healthcare providers, consulting agencies, and technology, telecom, and oil and gas companies. In many of these campaigns, attackers were able to gain access and move freely throughout an organization’s servers, installing additional software, creating new accounts, and accessing sensitive data and valuable resources while remaining largely undetected. In response to the uptick in data incidents this year, the Beckage Incident Response Team recommends organizations implement several preventative steps to safeguard their organizations and help minimize legal risk.

Patient Access Rights and Interoperability

Developments in 2020 concerning patients’ right to access health information, including new interoperability and record access requirements, are intended to help patients obtain access to health records and payment data so they can make informed decisions about their healthcare. The CMS Proposed Rule and the OCR Proposed Rule represent a complete overhaul of well-established standards and introduce new and highly technical requirements for healthcare compliance. The experienced Health Law Team at Beckage can help distill these lengthy and complicated rules so organizations can understand their practical implications on daily operations.

Increased International Focus on Consumer Privacy

On the heels of the EU’s General Data Protection Regulation (GDPR), many countries followed suit by establishing legal frameworks governing how organizations collect, use, and store their citizens’ personal data. One example is Brazil’s Lei Geral de Proteção de Dados (LGPD), which went into effect in August of 2020. This general data protection law, which closely mimics the GDPR, places strict requirements on organizations that process Brazilian citizens’ personal data.

At the same time, Europe continued to elevate its enforcement of the GDPR, with major decisions from various member state Data Protection Authorities, the European Court of Justice (ECJ), and the European Data Protection Board (EDPB). The most impactful for businesses across the globe was the ECJ’s decision in Schrems II, which invalidated the EU-US Privacy Shield and called into question the long-term viability of the Standard Contractual Clauses (SCCs) for transferring data from the EU to the US. In 2021, companies should closely monitor the evolving guidance on international data transfers and be prepared to mitigate the risks of global data transfers.

Beckage’s Global Data Privacy Team expects continued adoption of data protection regulations across many regions, and an emphasis on creating global security and privacy compliance programs in the year ahead.

Uptick in ADA Litigation

This past year, the Beckage Accessibility Team has witnessed a drastic increase in litigation under Title III of the Americans with Disabilities Act. On average, about eight new lawsuits are filed each day by disabled individuals alleging unequal access to goods and services provided on a company’s digital platforms. While the Department of Justice (DOJ) has consistently held that the ADA applies to websites and mobile apps, it has failed to clarify the precise requirements for a business to be deemed compliant. This has prompted a wave of litigation by plaintiffs who claim a website or mobile app’s incompatibility with assistive technology, like screen-reading software, has denied them full access to and equal enjoyment of the goods, services, and accommodations of the website, therefore violating the ADA. Most of these lawsuits are settled quickly out of court to avoid litigating in such uncertain legal terrain.

Beckage handles the defense of website accessibility lawsuits and assists companies in navigating pre- and post-suit settlement agreements in this unique area of the law.  Beckage also works with clients under privilege to conduct internal and remedial audits of client websites and mobile applications, evaluate platform compatibility, and oversee implementation of recommended remedial or accessibility-enhancement measures.

California Consumer Privacy Act (CCPA)

Enforcement of the comprehensive California Consumer Privacy Act (CCPA) began on July 1, 2020 and has brought a range of plaintiff lawsuits under its private right of action provision, which expands California breach laws. For a data breach to be actionable, the information accessed must qualify as personal information, as narrowly defined by California’s data breach notification law. In November 2020, the California Privacy Rights Act (CPRA) ballot initiative was passed, creating additional privacy rights and obligations pertaining to sensitive personal information that will go into effect in 2023. The CPRA also expands the data breach liability created by the CCPA, adds a private right of action for unauthorized access to credentials that would permit access to an account where the business failed to maintain reasonable security, and imposes data protection obligations directly on service providers, contractors, and third parties. Beckage urges businesses that operate in California or serve California residents to continue to follow CCPA developments and carefully monitor related litigation in the coming months.

Emerging Technologies

The recent expansion of litigation under the Illinois Biometric Information Privacy Act (BIPA) has resulted in numerous class action suits against organizations alleged to have collected plaintiffs’ biometric data. With the expanding use of biometric equipment, these claims often allege defendants obtained plaintiffs’ biometric data without complying with BIPA’s notification and consent requirements. Upcoming class suits may address whether BIPA has extraterritorial effect when claims are brought against out-of-state vendors.

Similarly, computer-manipulated media, known as deepfakes, heighten the danger of manipulated perceptions. The advancement of deepfakes is giving rise to claims for defamation, trade libel, false light, violation of the right of publicity, and intentional infliction of emotional distress. Sophisticated tech lawyers can assist in determining rights and technological solutions to mitigate harm. As former tech business owners, Beckage lawyers want to drive innovation with these new and emerging technologies while understanding the standards and laws that may impact such development. Beckage recommends that companies proactively mitigate the risks associated with collecting biometric information and with deepfakes to prevent legal repercussions.

Key Takeaways

2020 proved to be an unpredictable year in more ways than one. The COVID-19 pandemic forced companies to rapidly adapt to new privacy and data security challenges caused by a distributed workforce, emerging technologies, and an increased focus on ecommerce as in-person shopping and events were curtailed. As we move towards 2021 with no definitive end to the pandemic in sight, it is crucial for companies to prioritize data privacy and cybersecurity initiatives by consulting qualified legal tech experts who can help navigate the uncertainty next year will bring. Beckage attorneys can assist in creating, implementing, and evaluating robust data security and privacy infrastructures that will help put your business in a position to tackle the challenges 2021 has in store.

*Attorney Advertising. Prior results do not guarantee similar outcomes.

Subscribe to our newsletter.


Artificial Intelligence Best Practices: The UK ICO AI and Data Protection Guidance

Artificial intelligence (AI) is among the fastest growing emerging digital technologies. It helps businesses streamline operational processes and enhance the value of goods and services delivered to end-users and customers. Because AI is a data-intensive technology, policymakers are seeking ways to mitigate risks related to AI systems that process personal data, and technology lawyers are assisting with compliance efforts.

Recently, the UK Information Commissioner’s Office (ICO) published its Guidance on AI and Data Protection. The guidance follows the ICO’s 2018-2021 technology strategy publication, which identified AI as one of its strategic priorities.

The AI guidance contains a framework to guide organizations using AI systems and aims to:

  • Provide auditing tools and procedures the ICO will use to assess the compliance of organizations using AI; and  
  • Guide organizations on AI and data protection practices.

AI and Data Protection Guidance Purpose and Scope

The guidance solidifies the ICO’s commitment to the development of AI and supplements other resources for organizations, such as the big data, AI, and machine learning report and the guidance on explaining decisions made with AI, which the ICO produced in collaboration with the Alan Turing Institute in May 2020.

In the AI framework, the ICO adopts an academic definition of AI, which in the data protection context, refers to ‘the theory and development of computer systems able to perform tasks normally requiring human intelligence’. While the guidance focuses on machine-learning based AI systems, it may nonetheless apply to non-machine learning systems that process personal data.

The guidance seeks to answer three questions. First, do people understand how their data is being used? Second, is data being used fairly, lawfully and transparently? Third, how is data being kept secure?

To answer these questions, the ICO takes a risk-based approach to address different data protection principles including transparency, accountability and fairness. The framework outlines measures that organizations should consider when designing artificial intelligence regulatory compliance. The applicable laws driving this compliance are UK Data Protection Act 2018 (DPA 2018) and the General Data Protection Regulation (GDPR).

The ICO details key actions companies should take to ensure their data practices relating to AI systems comply with the GDPR and UK data protection laws. The framework is divided into four parts, focusing on (1) the AI-specific implications of the accountability principle; (2) the lawfulness, fairness, and transparency of processing personal data in AI systems; (3) security and data minimization in AI systems; and (4) compliance with individual rights, including rights relating to solely automated decisions.

AI Best Practices

This section summarizes selected AI best practices outlined in the guidance organized around the four data protection areas. When working towards AI legal compliance, organizations should work with experienced lawyers who understand AI technologies to address the following controls and practices:

Part One: Accountability Principle

  • Build a diverse, well-resourced team to support AI governance and risk management strategy
  • Determine with legal the company’s compliance obligations while balancing individuals’ rights and freedoms
  • Conduct Data Protection Impact Assessment (DPIA) or other impact assessments where appropriate
  • Understand the organization’s role: controller/processor when using AI systems

Part Two: Lawfulness, Fairness, and Transparency of Processing Personal Data

  • Assess statistical accuracy and effectiveness of AI systems in processing personal data
  • Ensure all people and processes involved understand the statistical accuracy requirements and measures
  • Evaluate tradeoffs and expectations
  • Adopt common terminology that staff can use to communicate about the statistical models
  • Address risks of bias and discrimination and work with legal to build into policies

Part Three: Principles of Security and Data Minimization in AI Systems

  • Assess whether trained machine-learning models contain personally identifiable information
  • Assess the potential use of trained machine-learning models
  • Monitor queries from API users
  • Consider ‘white box’ attacks
  • Identify and process the minimum amount of data required to achieve the organization’s purpose

Part Four: Compliance with Individual Rights, Including Rights Relating to Solely Automated Decisions

  • Implement reasonable measures to respond to individuals’ data rights requests
  • Maintain appropriate human oversight for automated decision-making

The ICO anticipates developing a toolkit to complement the AI guidance. In the meantime, the guidance’s key takeaways are that organizations should understand the applicable data protection laws and assemble the right team to address their requirements.

Building privacy and security early into the development of AI can provide efficiencies in the long term and address the growing focus of regulatory authorities on ensuring that these technologies incorporate data protection principles.  By working towards robust AI compliance, organizations can also gain a competitive advantage.  Beckage’s lawyers, many of whom are also technologists and have been trained by MIT regarding business use of AI, have been quoted in national media on AI topics.  We stand ready to answer any of your questions.

*Attorney advertising. Prior results do not guarantee future outcomes.

Subscribe to our newsletter.


Mobile App Developers Take Notice Of New Apple Privacy Requirements

Companies that have, or are in the process of developing, mobile applications distributed through the Apple App Store should be aware of recent privacy updates and should take steps to prepare their businesses for these new privacy requirements in 2021.

Apple’s Announcement

Beginning on December 8, 2020, Apple will impose specific requirements for the disclosure of privacy practices for all applications on their product pages in the App Store.  This change will help users understand an app’s privacy practices before they download the app on any Apple platform.  The App Store product page will now feature a new privacy information section covering practices such as data collection, the types of data collected, the data linked to the user, user tracking, and privacy links.  More details about Apple’s announcement can be found on the privacy details page, and additional guidance on how to provide app privacy information can be found in Apple’s App Store Connect.

In addition to providing information about some of your app’s data collection practices on your product page, on iOS 14, iPadOS 14, and tvOS 14, apps will be required to receive user permission (opt-in consent) to track users across apps or websites owned by other companies or to access the device’s advertising identifier. This change allows users to choose whether they permit an app to track them or access their device’s advertising identifier.

Tracking refers to the act of linking user or device data collected from your app with user or device data collected from other companies’ apps, websites, or offline properties for targeted advertising or advertising measurement purposes.  Tracking also refers to sharing user or device data with data brokers.  To provide developers time to make necessary changes, apps will be required to obtain permission to track users starting early next year.  Additional guidance can be found at the Apple developer’s blog page.

What To Do Now

Businesses should take steps to make sure their current practices are legally compliant and address Apple’s new guidelines.

Now is an ideal time to work with your tech legal counsel to review your privacy policy, the App Store guidelines, and applicable laws to confirm that the statements made throughout your policy are true and accurate representations of your data collection and sharing practices. Apps will need to create standardized privacy disclosures for the App Store to meet format and content requirements, and these responses should be carefully reviewed so as not to conflict with any existing privacy statements.  Your internal business practices and collection protocols may change from time to time, which is why Beckage recommends an annual review of your privacy policy and related practices.

Additionally, businesses should consult with their tech legal counsel to review and update consent language and disclosures for pop-ups and any related consent forms.  There may be specific regulatory or statutory requirements for obtaining consent through a mobile application that need to be evaluated.  For example, although there are currently no opt-in requirements under the CCPA, there are specific requirements for consent under the GDPR that may need to be met should the GDPR apply to your application.

Beckage lawyers have worked with numerous mobile app developers on privacy matters.   The Beckage team of lawyers is made up of technologists and certified privacy professionals who can help develop and review new and existing privacy policies to ensure compliance with Apple’s new privacy requirements. To reach a Beckage attorney, call 716.898.2102.

*Attorney Advertising. Prior results do not guarantee future outcomes.

Subscribe to our Newsletter.


Online Accessibility Act Seeks to Clarify Accessibility Guidelines for Private Businesses’ Digital Presence

The Beckage Accessibility Team is closely following bipartisan legislation introduced into the U.S. House of Representatives on October 2, 2020. The Online Accessibility Act, sponsored by Congressmen Lou Correa (D-CA) and Ted Budd (R-NC), would add language to the existing Americans with Disabilities Act (ADA) and provide much-needed clarity on the legal requirements for consumer-facing websites and mobile applications to be considered accessible to individuals with disabilities, particularly blind and visually-impaired persons.

If passed, this legislation would have clear benefits for both disabled individuals and online businesses that operate consumer websites, defined as “any website that is purposefully made accessible to the public for commercial purposes.” The Online Accessibility Act would limit the number of predatory lawsuits filed against business owners while helping them improve accessibility for their disabled customers.

Beckage continues to monitor the state and federal dockets daily, and new lawsuits continue to be filed at record speed.  On average, we see about eight new lawsuits a day. These website accessibility lawsuits are filed by plaintiffs alleging unequal access to services on companies’ digital platforms due to incompatibility with assistive technology, like screen-reading software. While the Department of Justice (DOJ) has consistently held that the ADA applies to websites and mobile apps, it has fallen short of clarifying the precise requirements, leaving businesses confused as to whether their digital platforms are compliant. As a result, a very high number of these cases are settled out of court to avoid gambling with high litigation costs in such uncertain legal terrain.

“This bill solves the problem by providing guidance to businesses on how to bring their websites into compliance. If our bill is passed, job-creators will be able to avoid costly lawsuits and be given a roadmap for how to help their disabled customers access online content,” said Rep Budd in a statement about the Act.

“We are optimistic that this bill will provide some much-needed clarity in the ADA legal landscape,” says Beckage Accessibility Team Leader, Kara Hilburger. “It is so important to have universal standards for accessibility to level the playing field and help businesses best serve their customers while avoiding lawsuits.”

This legislation comes at a crucial time, given the rapid increase in online shopping due to the COVID-19 pandemic as consumers avoid brick-and-mortar stores in favor of e-commerce options. However, the future of the Online Accessibility Act is still uncertain given its introduction during a particularly polarized election season and an unpredictable political landscape.

“Beckage continues to advise clients to be proactive when it comes to website accessibility,” Hilburger confirmed.  “There are many low-cost, high impact steps companies can take immediately, such as publishing an Accessibility Statement, that can place them in a legally defensible position while they work to implement accessibility by-design into their new online products and offerings.”   

Beckage remains hopeful that the Online Accessibility Act will gain traction and provide much-needed relief for the business community.  Beckage works with businesses from all sectors and industries as they navigate the uncertain legal landscape surrounding website accessibility.  Collaborating with in-house technologists, outside developers, members of the disability community, and internal assistive technologies, Beckage attorneys work under privilege to conduct internal and remedial audits of client websites and mobile applications, evaluate platform compatibility, and oversee implementation of recommended remedial or accessibility-enhancement measures.  Our team helps companies develop and implement sustainable accessibility programs that contemplate compliance with the WCAG guidelines while monitoring the development of website accessibility standards and best practices that can protect your business.

*Attorney Advertising. Prior results do not guarantee future outcomes.

Subscribe to our Newsletter.
