AI Hiring Algorithms Present Big Questions About Accountability and Liability

As artificial intelligence (AI) becomes an increasingly prevalent human resources tool, the algorithms powering those hiring and staffing decisions have come under increased scrutiny for their potential to perpetuate bias and discrimination.

Are There Any Federal Laws or Regulations Governing the Use of AI in Hiring?

Under Title VII of the Civil Rights Act of 1964, the United States Equal Employment Opportunity Commission ("EEOC") is responsible for enforcing federal laws that make it illegal to discriminate against job applicants or employees because of their membership in a protected class.  For decades, attorneys have relied on the Employment Tests and Selection Procedures guidance jointly issued by the Civil Service Commission, Department of Justice, Department of Labor, and EEOC.  See generally 28 CFR § 50.14; see also Fact Sheet on Employment Tests and Selection Procedures, EEOC.  Nevertheless, the current form of the Employment Tests and Selection Procedures fails to provide any guidance on the use of AI tools in the hiring process.

That isn't to say federal regulators and legislators aren't keen on regulating this area.  On December 8, 2020, ten United States Senators sent a joint letter to the EEOC regarding the EEOC's authority to investigate bias in AI-driven hiring technologies.  In relevant part, the letter poses three questions:

  1. Can the EEOC request access to "hiring assessment tools, algorithms, and applicant data from employers or hiring assessment vendors and conduct tests to determine whether the assessment tools may produce disparate impacts"?
  2. If the EEOC were to conduct such a study, could it publish its findings in a public report?
  3. What additional authority and resources would the EEOC need to proactively study and investigate these AI hiring assessment technologies?  Id.

As of this writing, the EEOC has yet to respond to the letter.  Nevertheless, given the questions above, the current political climate, and the absence of existing guidance from the EEOC, we anticipate future guidance, regulation, and potential enforcement actions in this area.

How Are States Handling AI Hiring Bias? 

Illinois was the first state to legislate on the use of AI in hiring.  On August 9, 2019, Illinois enacted the Artificial Intelligence Video Interview Act ("AIVIA"), imposing strict limitations on employers who use AI to analyze candidate video interviews.  See 820 ILCS 42 et seq.  Under AIVIA, employers must:

  1. Notify applicants that AI will be utilized during their video interviews;
  2.  Obtain consent to use AI in each candidate’s evaluation;  
  3. Explain to the candidates how the AI works and what characteristics the AI will track with regard to their fitness for the position; 
  4. Limit sharing of the video interview to those who have the requisite expertise to evaluate the candidate; and
  5. Comply with a candidate's request to destroy his or her video within 30 days.  Id.

Illinois was quickly followed by Maryland, which, on May 11, 2020, enacted legislation prohibiting an employer from using certain facial recognition services during a candidate's interview for employment unless the candidate expressly consents.  See Md. Labor and Employment Code Ann. § 3-717.  The Maryland law specifically requires the candidate to consent to the use of certain facial recognition service technologies during an interview by signing a waiver that contains:

  1. The candidate’s name;
  2. The date of the interview;
  3. That the candidate consents to the use of facial recognition during the interview; and
  4. That the candidate has read the waiver.  Id.

As with AIVIA, the Maryland law is too new to provide much insight into how it will be interpreted or enforced.

There are a number of other jurisdictions with bills in different stages of progress.  On February 20, 2020, a bill was introduced in the California legislature that would limit the liability of an employer or a purveyor of AI-assisted employment decision making software under certain circumstances.  See 2019 Bill Text CA S.B. 1241.  The California bill "would create a presumption that an employer's decision relating to hiring or promotion based on a test or other selection procedure is not discriminatory, if the test or procedure meets specified criteria, including, among other things, that it is job related and meets a business necessity" and "that the test or procedure utilizes pretested assessment technology that, upon use, resulted in an increase in the hiring or promotion of a protected class compared to prior workforce composition."  Id.  The bill would also require the employer to keep records of the testing or procedure and submit them for review to the California Department of Fair Employment and Housing, upon request, in order to qualify for the presumption and limit its liability.  Id.

Not to be outdone, a bill was introduced in the New York City Council on February 27, 2020 with the purpose of regulating the sale of automated employment decision making tools.  See Int. No. 1894.  The New York City Council bill broadly defines an automated employment decision making tool as "any system whose function is governed by statistical theory, or systems whose parameters are defined by such systems, including inferential methodologies, linear regression, neural networks, decision trees, random forests, and other learning algorithms, which automatically filters candidates or prospective candidates for hire or for any term, condition or privilege of employment in a way that establishes a preferred candidate or candidates."  Id.  The bill seeks to prohibit the sale of automated employment decision making tools if they were not the subject of a bias audit in the year prior to sale, were not sold with a yearly bias audit service at no additional cost, and were not accompanied by a notice that the tool is subject to the provisions of the bill.  Id.  The bill would also require any person who uses automated employment assessment tools for hiring and other employment purposes to disclose to candidates, within 30 days, when such tools were used to assess their candidacy for employment and the job qualifications or characteristics for which the tool was used to screen.  Id.  Finally, the bill is not without bite, as violators are subject to "a civil penalty of not more than $500 for that person's first violation and each additional violation occurring on the same day as the first violation, and not less than $500 nor more than $1,500 for each subsequent violation."  Id.

What Can My Business Do Now to Prepare for Potential Liability Related to the Use of AI in Hiring?

As the current political and legal landscape continues to shift, one of the best things your business can do is stay on top of current and pending statutes.  Your business could also audit both internal and external use of AI in hiring to validate and confirm the absence of bias in the system; however, testing external systems may require your vendors to open their proprietary technology and information to their customers, something most are hesitant to do.  Finally, your business should consider conducting a thorough review of any and all indemnification provisions in its vendor agreements to see how risk might be allocated between the parties.
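By way of illustration only, one screening test an internal audit might use is the "four-fifths rule" drawn from the Uniform Guidelines on Employee Selection Procedures referenced above: compare each group's selection rate to that of the most-selected group and flag any ratio below 80%.  The short Python sketch below is a hypothetical example of that arithmetic; the group labels, counts, and function names are invented for illustration, and passing this check is not, by itself, evidence that a tool is free of bias or legally defensible.

```python
# Illustrative sketch of a "four-fifths rule" adverse-impact check.
# Group names, counts, and function names are hypothetical.

def selection_rates(outcomes):
    """outcomes: {group: (applicants, hires)} -> {group: selection rate}."""
    return {g: hires / applicants for g, (applicants, hires) in outcomes.items()}

def four_fifths_check(outcomes, threshold=0.8):
    """Flag groups whose selection rate falls below 80% of the highest group's rate."""
    rates = selection_rates(outcomes)
    benchmark = max(rates.values())  # rate of the most-selected group
    flags = {g: (rate / benchmark) >= threshold for g, rate in rates.items()}
    return rates, flags

if __name__ == "__main__":
    # Hypothetical applicant pool screened by an AI tool
    data = {"Group A": (200, 60), "Group B": (150, 30)}
    rates, flags = four_fifths_check(data)
    print(rates)  # {'Group A': 0.3, 'Group B': 0.2}
    print(flags)  # {'Group A': True, 'Group B': False} -> 0.2/0.3 ≈ 0.67 < 0.8
```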

Beckage is a law firm focused on technology, data security, and privacy. Beckage has an experienced team of attorneys and technologists who can advise your business on the best practices for limiting its liability related to the use of AI in hiring.

*Attorney Advertising. Prior results do not guarantee future outcomes.

Biometric Litigation Continues To Rise As Businesses Work To Minimize Risk

In 2008, Illinois enacted the Illinois Biometric Information Privacy Act ("BIPA") to recognize individuals' privacy rights in their "biometric information" and "biometric identifiers."  BIPA was enacted in response to the growing use of biometrics by businesses.

In part because of its private right of action, which allows plaintiffs to sue businesses directly, BIPA litigation remains at the forefront of the data privacy litigation landscape as businesses continue to collect the biometric identifiers of their employees.  Recent BIPA class action settlements with major tech companies like Facebook and TikTok have reached hundreds of millions of dollars, but the majority of BIPA litigation is brought against small and medium-sized enterprises that collect biometric information for employee timekeeping or for access controls to physical spaces.

To date, defendants have found courts to be generally unwilling to dismiss BIPA litigation at early motion practice.  Two recent cases, Thornley v. Clearview AI and Barton v. Swan Surfaces, demonstrate that there are some potential limits to BIPA litigation. 

Thornley v. Clearview AI

In Thornley, Melissa Thornley accused Clearview AI of scraping publicly available photos from her social media accounts for facial recognition purposes and selling her biometric information to third parties without her consent.  Thornley v. Clearview AI, Inc., 984 F.3d 1241, 1242-1243 (7th Cir. 2021).  Thornley initially filed a complaint in Illinois state court, alleging, as a class representative, that Clearview violated § 15(c) of BIPA, which requires in relevant part that "[n]o private entity in possession of a biometric identifier or biometric information may sell, lease, trade, or otherwise profit from a person's or a customer's biometric identifier or biometric information."  Id. at 1246.  Clearview removed the case to federal court on the basis that the allegation of a statutory violation gave rise to the concrete and particularized injury-in-fact necessary for Article III standing.  Id. at 1243.  Under the Constitution, a plaintiff must have Article III standing to sue in federal court, which requires that the plaintiff prove: (1) an injury in fact; (2) causation of the injury by the defendant; and (3) that the injury is likely to be redressed by the requested relief.  See Spokeo, Inc. v. Robins, 136 S. Ct. 1540, 1547 (2016).  In Spokeo, the Supreme Court of the United States held that a statutory violation could be sufficient to constitute an injury in fact; however, it did not provide any analysis as to which types of statutory violations necessarily implicate concrete and particularized injuries in fact.  Id.

The district court held that the alleged violation of § 15(c) of BIPA was "only a bare statutory violation, not the kind of concrete and particularized harm that would support standing," and that the case therefore had to be remanded to state court.  Thornley, 984 F.3d at 1242.  Clearview then appealed to the Seventh Circuit, which agreed with the district court that standing was lacking and affirmed the remand to Illinois state court.  Id.  Clearview has now petitioned the Supreme Court of the United States to take its case.  See Porter Wells, Clearview AI Will Take BIPA Standing Challenge to Supreme Court.

Barton v. Swan Surfaces, LLC 

In Barton, a unionized employee of Swan Surfaces, LLC ("Swan") was required to clock in and out of her employer's manufacturing plant using her fingerprints as part of company protocol.  Barton v. Swan Surfaces, LLC, No. 20-cv-499-SPM, 2021 WL 793983 at *1 (S.D. Ill. Mar. 2, 2021).  On May 29, 2020, Barton filed a complaint in the United States District Court for the Southern District of Illinois alleging that she represented a class of individuals who, "while residing in the State of Illinois, had their fingerprints collected, captured, received, otherwise obtained and/or stored by Swan."  Id. at *2.  Barton asserted Swan violated BIPA by: (1) failing to institute, maintain, and adhere to a publicly available retention schedule in violation of 740 ILCS 14/15(a); and (2) failing to obtain informed written consent and release before collecting biometric information.  Id.  On July 31, 2020, Swan filed a Motion to Dismiss, asserting in relevant part that Barton's BIPA claims were preempted by § 301 of the Labor Management Relations Act ("LMRA").  Id.

On March 2, 2021, the court held that, because Barton was a unionized employee, her Collective Bargaining Agreement ("CBA"), which contained a management rights clause and grievance procedure, controlled, and as such Barton's BIPA claims were preempted by § 301 of the LMRA.  In reaching its conclusion, the court relied heavily on the Seventh Circuit's holding in Miller v. Southwest Airlines, Inc., 926 F.3d 898 (7th Cir. 2019).  Id. at *6.  In Miller, the Seventh Circuit held that an adjustment board had to resolve the employees' dispute over the airline's fingerprint collection practices because their unions may have bargained over the practice on their behalf.  Miller, 926 F.3d 898.  The court in Barton noted that Miller arose under the Railway Labor Act ("RLA") and that the United States "Supreme Court has held that the RLA preemption standard is virtually identical to the pre-emption standard the Court employs in cases involving § 301 of the LMRA," and therefore the same outcome should apply.  Barton, 2021 WL 793983 at *4.

Key Takeaway 

While these cases demonstrate some potential avenues to limit or avoid BIPA liability, companies' use of biometric information continues to grow, as does the push for biometric policies that govern these technologies and safeguard consumers, so BIPA litigation will undoubtedly continue.

With many states looking to implement biometric privacy laws similar to BIPA, it is important to have legal tech counsel to address compliance with these emerging laws.  Beckage attorneys, who are also technologists and former tech business owners, have years of collective experience with new technologies, like artificial intelligence, biometric data, and facial recognition technology.  We have a team of highly skilled lawyers who stay up to date on developments in BIPA case law and who can help your company craft its best defense given the current legal landscape.  Our team can assist your company in assessing and mitigating risks associated with emerging technologies.

*Attorney Advertising: Prior results do not guarantee a similar outcome. 

BIPA Illinois Biometric Law Sets the Stage for Biometric Litigation

COVID-19 is accelerating company adoption of biometric technologies.  With a global shift towards remote working, biometric technologies, which measure physiological, behavioral, and psychological characteristics, can promote, or at least monitor, productivity by recording employee performance.  Facial recognition biometric systems have also been vital in contactless engagement, especially in the airline and retail sectors, and such systems will remain after the pandemic subsides.  This burgeoning biometric industry is garnering interest from lawmakers. Given the firm’s technology-driven focus, Beckage has been tracking biometric laws and will continue to monitor legal and business developments surrounding biometric technologies. 

Biometric Data and the Law

Unlike other personal data, such as passwords, social security numbers, and payment card information, biometric identifiers cannot easily be changed once breached.  Because they are immutable by nature, regulations classify them as a sensitive class of personal data.  Notable laws that govern biometric data include the E.U. General Data Protection Regulation (GDPR) and U.S. state laws, including California's comprehensive privacy law.  Three states, Illinois, Texas, and Washington, have passed biometric-specific laws.  New York State recently introduced the Biometric Privacy Act, a bill that is nearly identical to Illinois' BIPA, and other states, such as Arkansas and California, have amended their breach notification laws to treat biometric data as personal identifying information.

The first step to knowing whether biometric regulations apply to your business is understanding the definition of biometric data.  The GDPR defines biometric data as "personal data resulting from specific technical processing relating to the physical, physiological or behavioral characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data." Art. 4(14).  Similarly, U.S. biometric laws protect biometric data characterized in terms of personal identifiers, including retina scans, iris scans, fingerprints, voiceprints, hand scans, and face geometry.  For example, the Illinois Biometric Information Privacy Act (BIPA) defines biometric information as "any information, regardless of how it is captured, converted, stored, or shared, based on an individual's biometric identifier used to identify an individual." Sec. 10.

U.S. Biometric Litigation Trends

Recent rulings in biometric litigation indicate that BIPA currently drives the legal landscape on biometric data protection in the U.S.  BIPA litigation is on the rise following the Illinois Supreme Court's 2019 decision in Rosenbach v. Six Flags.  The plaintiff in Rosenbach was the mother of a minor whose fingerprint was captured to verify his identity for entry to an amusement park owned by the defendant.  The Court rejected the defendant's argument that the plaintiff had not suffered any actual or threatened harm and held that a plaintiff can sue based on a mere technical violation of the law.  This decision means that a person does not have to suffer actual harm to pursue a biometric suit under BIPA.  Further, federal courts have agreed that failure to implement privacy policies outlining procedures for the collection, retention, and destruction of biometric identifiers is sufficient to demonstrate a violation of the law.  For example, in May 2020, the Seventh Circuit in Bryant v. Compass found the Rosenbach ruling instructive, holding that the plaintiff could pursue a lawsuit against a vending machine operator whose machines, installed at her workplace, used biometric authentication in lieu of credit card payments.

The types of companies involved in BIPA litigation are diverse.  Any company that collects, stores, or uses biometric information related to Illinois residents is subject to BIPA.  To that end, no industry seems immune: plaintiffs have sued big tech companies using facial recognition technologies and smaller companies, such as nursing homes, using fingerprinting systems for timekeeping.  The Compass ruling illustrates that third-party vendors who provide biometric authentication systems in the workplace are within the reach of BIPA.

The diversity in cases signals the legislative impact of the law and spotlights the role of privacy policies and procedures.  BIPA is the only biometric law in the U.S. that allows individuals to sue a company for damages, in amounts ranging from $1,000 to $5,000 per violation.  Thus, the stakes can be high for companies without proper biometric data governance.
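To make the potential exposure concrete, the back-of-the-envelope sketch below multiplies BIPA's statutory damages range by a hypothetical class size.  The class size and the assumption of one violation per person are invented for illustration; actual exposure depends on how courts count "violations" and on whether the conduct is found negligent or intentional or reckless.

```python
# Back-of-the-envelope BIPA exposure estimate; all inputs are hypothetical.
NEGLIGENT_PER_VIOLATION = 1_000   # lower end of BIPA's statutory damages range
RECKLESS_PER_VIOLATION = 5_000    # upper end of BIPA's statutory damages range

def exposure(class_size: int, violations_per_person: int = 1):
    """Return (low, high) statutory-damages exposure for a hypothetical class."""
    low = class_size * violations_per_person * NEGLIGENT_PER_VIOLATION
    high = class_size * violations_per_person * RECKLESS_PER_VIOLATION
    return low, high

# e.g., a hypothetical 500-person class with one violation each
print(exposure(500))  # (500000, 2500000)
```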

What should companies do?

To comply with BIPA and other evolving biometric laws, companies should work with experienced lawyers who understand biometric technologies and regulations to address the following controls and practices:

  • Properly inform individuals or responsible parties about the purpose of collecting their biometric data.
  • Properly inform individuals or responsible parties about the company’s biometric collection, retention, storage, and dissemination policies and procedures.
  • Obtain written consent from individuals or their responsible party before collecting biometric data.
  • Make the company’s written biometric policy establishing retention schedule and destruction guidelines publicly available.

A robust biometric compliance program should reflect current laws and be flexible and scalable enough to adapt to the changes that new biometric legal requirements will inevitably bring.  Beckage's lawyers, who are also technologists, are equipped with the skills and experience to build a robust biometric compliance program.  We stand ready to answer any of your questions.

*Attorney Advertising.  Prior results do not guarantee future outcomes.

Data Breach Risks for Small & Medium Sized Businesses

Today, small and medium-sized businesses (SMBs) are sometimes at greater risk of cyber-attacks and security breaches than large enterprises and corporations. Seventy-one percent of cyber-attacks happen at businesses with fewer than one hundred employees, due in part to less secure networks, lack of time, budget constraints, and limited resources for proper security. Other factors, such as not having an IT network specialist, being unaware of cybersecurity risks, lack of employee training on cybersecurity practices and protocols, failure to update security programs, outsourcing security, and failure to secure endpoints, may also play a role in the increased rate of cyber-attacks on SMBs.

Common Cyber Attacks on SMBs:

  • Advanced Persistent Threats. These are prolonged, stealthy cyberattacks in which a hacker gains access to a computer or network and remains undetected over a long period of time with the intent to gather information.
  • Phishing. Criminals use phishing, via email or other communication methods, to induce users to perform a certain task. Once the target user completes the task, such as opening a link or providing personal information, the hacker can gain access to private systems or information.
  • Denial of Service Attacks (DoS, DDoS). Hackers deny service to legitimate users either through specially crafted data that causes an error within the system or through flooding, which overloads a system so that it no longer functions. In some cases, the hacker demands a fee to stop the attack and restore the system to working order.
  • Insider Attacks. An insider attack may occur when employees do not practice good cyber safety, resulting in stolen and/or compromised data.
  • Malware. Malware may be downloaded to a computer without the user knowing, causing serious data or security breaches.
  • Password Attacks. Hackers may use automated systems to input various passwords in an attempt to access a network. If successful in gaining network access, hackers can easily move laterally, gaining access to even more systems.
  • Ransomware. Ransomware is a specific type of malware that gathers and encrypts data in a network, preventing user access. Access is restored only if the hacker's demands are met.

To help ensure your business is protected, it is important to know and understand the different ways hackers can gain access to a network and pose a threat to the data security of the business.

Some Ways SMBs Can Help Avoid Becoming a Victim of Cyber-Attacks

  • Understand Legal Requirements

Often, SMBs are unaware of cybersecurity best practices, so they rely on vendors without first determining what their legal obligations are to have certain cybersecurity and data privacy practices in place. Some laws dictate what steps an organization is required to take. Thus, it is prudent for a company to develop a plan with legal counsel and then identify the ideal vendors to help execute that plan.

  • Use a Firewall

Firewalls prevent unauthorized access to or from a private network and keep outside users from reaching internal systems, such as intranets, that are connected to the internet. The Federal Communications Commission (FCC) recommends that all SMBs set up firewalls, both external and internal, to provide a barrier between your data and cybercriminals.

  • Document Cybersecurity Policies

It is critical as a business to document your cybersecurity protocols. As discussed above, there may even be legal obligations to do so. There are many sources available that provide information on how to document your cybersecurity. The Small Business Administration (SBA) Cybersecurity portal provides online training, checklists, and information specific to protecting small businesses. The FCC’s Cyberplanner 2.0 provides a starting point for security documents and the C3 Voluntary Program for Small Businesses contains a detailed toolkit for determining and documenting the cybersecurity practices and policies best suited for your business.

  • Plan for Mobile Devices

With technology advancing and companies allowing employees to bring their own devices to work, it is crucial for SMBs to have a documented written policy that focuses on security precautions and protocols surrounding smart devices, including fitness trackers and smart watches. Employees should be required to install automatic security updates and businesses should implement (and enforce) a company password policy to apply to all mobile devices accessing the network.

  • Educate Employees on Legal Obligations and Threats

One of the biggest threats to data security is a company's own employees, but they can also be its best defense. It is important to train employees on the company's cybersecurity best practices and security policies. Provide employees with regular updates on protocols and have each employee sign a document stating they have been informed of the business's procedures and understand they will be held accountable if they do not follow the security policies. Employees must also understand the company's legal obligations to maintain certain practices, including how to respond to inquiries the business may receive from customers about their data.

  • Enforce Safe Password Practices

Lost, stolen, or weak passwords account for over half of all data breaches. It is essential that SMB password policies are enforced and that all employee devices accessing the company network are password protected. Passwords should meet certain requirements such as using upper and lower-case letters, numbers, and symbols. All passwords should be changed every sixty to ninety days.
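As a rough sketch of what enforcing such a policy can look like in practice, the example below checks a password against the complexity rules described above and flags credentials older than ninety days. The twelve-character minimum, the function names, and the example passwords are illustrative assumptions, not a prescribed standard.

```python
import re
from datetime import date, timedelta
from typing import Optional

# Illustrative password-policy checks; the rule set and names are hypothetical.
MAX_PASSWORD_AGE = timedelta(days=90)

def meets_complexity(password: str) -> bool:
    """Require minimum length plus upper/lowercase letters, digits, and symbols."""
    return (
        len(password) >= 12
        and re.search(r"[a-z]", password) is not None
        and re.search(r"[A-Z]", password) is not None
        and re.search(r"\d", password) is not None
        and re.search(r"[^A-Za-z0-9]", password) is not None
    )

def needs_rotation(last_changed: date, today: Optional[date] = None) -> bool:
    """Flag passwords older than the maximum allowed age."""
    today = today or date.today()
    return today - last_changed > MAX_PASSWORD_AGE

if __name__ == "__main__":
    print(meets_complexity("Summer2024"))             # False: too short, no symbol
    print(meets_complexity("c0rrect-Horse-Battery"))  # True
    print(needs_rotation(date(2021, 1, 1), today=date(2021, 6, 1)))  # True
```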

  • Regularly Back Up Data

It is recommended to regularly back up word processing documents, electronic spreadsheets, databases, financial files, human resource files, and accounts receivable/payable files, as well as all data stored in the cloud. Make sure backups are stored in a separate location not connected to your network, and check regularly to help ensure that the backups are functioning correctly.
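One simple way to "check regularly" is to automate a freshness test. The hypothetical sketch below verifies that a backup file exists, is non-empty, and was updated within the last day; the file path and the one-day threshold are placeholders, not a recommendation for any particular backup product or schedule.

```python
import os
import time

# Hypothetical backup freshness check; the path and age limit are placeholders.
MAX_BACKUP_AGE_SECONDS = 24 * 60 * 60  # one day

def backup_is_healthy(path: str) -> bool:
    """Return True if the backup file exists, is non-empty, and is recent."""
    if not os.path.isfile(path):
        return False
    if os.path.getsize(path) == 0:
        return False
    age = time.time() - os.path.getmtime(path)
    return age <= MAX_BACKUP_AGE_SECONDS

if __name__ == "__main__":
    # Example placeholder path; point this at your actual backup location.
    print(backup_is_healthy("/mnt/offsite-backups/latest.bak"))
```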

  • Install Anti-Malware Software

It is vital to have anti-malware software installed on all devices and networks. Anti-malware software can help protect your business from phishing attacks that install malware on an employee's computer when a malicious link is clicked.

  • Use Multifactor Authentication

Regardless of precautions and training, your employees will likely make security mistakes that may put data at risk. Using multifactor authentication provides an extra layer of protection.

Both technology and cybercriminals are becoming more advanced every day, so cybersecurity should be a top priority for your SMB. The right technology experts can help identify and implement the necessary policies, procedures, and technology to protect your company's data and networks.

Beckage is a law firm focused on technology, data security, and privacy. Beckage has an experienced team of attorneys, who are also technologists, who can help educate your company on the best practices for data security that will help protect you from any future cyber-attacks and data security threats.

*Attorney Advertising. Prior results do not guarantee future outcomes.

BIPA Suits Against Third Parties: An Emerging Trend

Companies should take note of the recent expansion of biometric privacy laws, which could have a significant impact on their businesses, changing how they collect and process biometric data and how third-party vendors handle such data.

Background on BIPA

The Illinois Biometric Information Privacy Act (BIPA) was passed on October 3, 2008, and regulates how "private entities" collect, use, and share biometric information and biometric identifiers, collectively known as biometric data.  BIPA imposes certain security requirements, including:

1. Developing a publicly available written policy regarding the retention and destruction of biometric data in an entity’s possession.

2. Providing required disclosures and obtaining written releases prior to obtaining biometric data.

3. Prohibiting the sale of biometric data.

4. Prohibiting the disclosure of biometric data without obtaining prior consent.

Expansion of BIPA to Third Party Vendors

In a significant turn of events, courts in Illinois are applying BIPA to third-party vendors who do not have direct relationships with plaintiffs, but whose products are used by plaintiffs' employers, or in other settings, to collect plaintiffs' biometric data.

This is an alarming expansion of BIPA’s scope of which all third-party providers should be aware.  Under this caselaw, putting a biometric-collecting product into the stream of commerce does not immunize the manufacturer of that product from suit in Illinois.

Since the passage of BIPA, numerous class action suits have been filed against those alleged to have collected plaintiffs' biometric data, but claims brought against vendors that sell the biometric equipment are growing rapidly.  These claims allege not that plaintiffs had direct contact with the vendor defendants, but that the defendants obtained the plaintiffs' biometric data through timekeeping equipment without complying with BIPA's requirements.

Recently, the U.S. District Court for the Northern District of Illinois held that a biometric time clock vendor could be liable for violations of BIPA in the employment context, extending liability to parties that "collect" biometric information.

Another recent decision, Figueroa et al. v. Kronos, held that the plaintiffs sufficiently alleged that Kronos performed the collection function and was responsible, along with the employer, for obtaining the required employee consent.

These cases, among others, signify that third-party vendors are becoming defendants in BIPA consent cases and broaden the third-party contribution claims that employers may bring against vendors of biometric clocks for failure to obtain required consent.  These decisions also allow insured employers to seek contribution from clock vendors for any judgment assessed against an insured employer under Employment Practices Liability (EPL) coverage.

However, BIPA's Section 15(a), which requires publicly available policies for the retention and destruction of biometric data, makes it difficult for plaintiffs to bring such claims against third parties in federal court, because Section 15(a) raises an issue of standing.  A federal court could also exercise jurisdiction over a vendor in connection with a BIPA claim if the vendor maintained continuous and systematic contacts with Illinois.  If the vendor is located in the forum state, there is no jurisdictional dispute, but because many vendors sell their equipment nationally, the question of whether the court has specific personal jurisdiction over the vendor must be addressed.

For example, in Bray v. Lathem Time Co., the plaintiff alleged in the U.S. District Court for the Central District of Illinois that the defendant sold a facial-recognition timekeeping product to the plaintiff's employer and violated BIPA by failing to notify employees and obtain their consent.  The plaintiffs had no dealings with the defendant, which was located in Georgia but was sued in Illinois.  The court found no contacts between the defendant and the State of Illinois; the timekeeping equipment had been sold to an affiliate of the plaintiff's employer and then transferred to Illinois by the employer.  The court therefore concluded that it lacked jurisdiction over the defendant vendor.

Expansion of BIPA Outside Illinois?

Because many vendors are located outside Illinois, the question arises whether BIPA applies to conduct in other states.  While BIPA applies to violations occurring in Illinois, upcoming class suits may address whether BIPA has extraterritorial effect when claims are brought against out-of-state vendors.  The extraterritorial application of BIPA is fact-dependent, and courts have acknowledged that it may need to be evaluated on a case-by-case basis.  In the meantime, companies collecting, using, and storing biometric information face an increased risk of BIPA lawsuits.

Takeaways

All companies should assess whether they are collecting biometric data, directly or through third parties.  The next step is to evaluate the legal requirements governing the handling of such data.  Note that many state data breach laws include biometric data as protected personally identifiable information (PII).  Companies should take steps to comply with applicable laws, including developing policies and practices around handling biometric data.  Contracts with third-party vendors should also be reviewed to help protect the business if biometric data is mishandled.

About Beckage

At Beckage, we have a team of skilled attorneys who can assist your company in developing BIPA-compliant policies that will help mitigate the risks associated with collecting biometric information.  Our lawyers, who are also technologists, can help you better understand the legal implications of BIPA and the repercussions of noncompliance.

*Attorney Advertising.  Prior results do not guarantee future outcomes.
