
AI Hiring Algorithms Present Big Questions About Accountability and Liability

As artificial intelligence (AI) becomes an increasingly prevalent human resources tool, the algorithms powering those hiring and staffing decisions have come under increased scrutiny for their potential to perpetuate bias and discrimination.

Are There Any Federal Laws or Regulations Governing the Use of AI in Hiring?

Under Title VII of the Civil Rights Act of 1964, the United States Equal Employment Opportunity Commission (“EEOC”) is responsible for enforcing federal laws that make it illegal to discriminate against job applicants or employees because of their membership in a protected class.  For decades, attorneys have relied on the Employment Tests and Selection Procedures guidance jointly issued by the Civil Service Commission, Department of Justice, Department of Labor, and EEOC.  See generally 28 CFR § 50.14; see also Fact Sheet on Employment Tests and Selection Procedures, EEOC.  Nevertheless, the current form of the Employment Tests and Selection Procedures fails to provide any guidance on the use of AI tools in the hiring process.

That isn’t to say federal regulators and legislators aren’t keen on regulating this area.  On December 8, 2020, ten United States Senators sent a joint letter to the EEOC regarding the EEOC’s authority to investigate bias in AI-driven hiring technologies.  In relevant part, the letter poses three questions:

  1. Can the EEOC request access to “hiring assessment tools, algorithms, and applicant data from employers or hiring assessment vendors and conduct tests to determine whether the assessment tools may produce disparate impacts”?
  2. If the EEOC were to conduct such a study, could it publish its findings in a public report?
  3. What additional authority and resources would the EEOC need to proactively study and investigate these AI hiring assessment technologies?  Id.

As of this writing, the EEOC has yet to respond to the letter.  Nevertheless, given the questions above, the current political climate, and the lack of guidance from the EEOC, we anticipate future guidance, regulation, and potential enforcement actions in this area.

How Are States Handling AI Hiring Bias? 

Illinois was the first state to legislate on the use of AI in hiring.  On August 9, 2019, Illinois enacted the Artificial Intelligence Video Interview Act (“AIVIA”), imposing strict limitations on employers who use AI to analyze candidate video interviews.  See 820 ILCS 42 et seq.  Under AIVIA, employers must:

  1. Notify applicants that AI will be utilized during their video interviews;
  2. Obtain consent to use AI in each candidate’s evaluation;
  3. Explain to the candidates how the AI works and what characteristics the AI will track with regard to their fitness for the position;
  4. Limit sharing of the video interview to those who have the requisite expertise to evaluate the candidate; and
  5. Comply with a candidate’s request to destroy his or her video within 30 days.  Id.

Illinois was quickly followed by Maryland, which, on May 11, 2020, enacted legislation prohibiting an employer from using certain facial recognition services during a candidate’s interview for employment unless the candidate expressly consents.  See Md. Labor and Employment Code Ann. § 3-717.  The Maryland law specifically requires the candidate to consent to the use of certain facial recognition service technologies during an interview by signing a waiver which contains:

  1. The candidate’s name;
  2. The date of the interview;
  3. That the candidate consents to the use of facial recognition during the interview; and
  4. That the candidate has read the waiver.  Id.

As with AIVIA, because the Maryland law is so new, it offers little insight into how it will be interpreted or enforced.

There are a number of other jurisdictions with bills in different stages of progress.  On February 20, 2020, a bill was introduced into the California legislature that would limit the liability of an employer or a purveyor of AI-assisted employment decision-making software under certain circumstances.  See 2019 Bill Text CA S.B. 1241.  This California bill “would create a presumption that an employer’s decision relating to hiring or promotion based on a test or other selection procedure is not discriminatory, if the test or procedure meets specified criteria, including, among other things, that it is job related and meets a business necessity” and “that the test or procedure utilizes pretested assessment technology that, upon use, resulted in an increase in the hiring or promotion of a protected class compared to prior workforce composition.”  Id.  The bill would also require the employer to keep records of the testing or procedure and submit them for review to the California Department of Fair Employment and Housing, upon request, in order to qualify for the presumption and limit its liability.  Id.

Not to be outdone, a bill was introduced into the New York City Council on February 27, 2020 with the purpose of regulating the sale of automated employment decision-making tools.  See Int. No. 1894.  The New York City Council bill broadly defines an automated employment decision-making tool as “any system whose function is governed by statistical theory, or systems whose parameters are defined by such systems, including inferential methodologies, linear regression, neural networks, decision trees, random forests, and other learning algorithms, which automatically filters candidates or prospective candidates for hire or for any term, condition or privilege of employment in a way that establishes a preferred candidate or candidates.”  Id.  The bill seeks to prohibit the sale of automated employment decision-making tools if they were not the subject of a bias audit in the year prior to sale, were not sold with a yearly bias audit service at no additional cost, and were not accompanied by a notice that the tool is subject to the provisions of the New York City Council’s bill.  Id.  The bill would require any person who uses automated employment assessment tools for hiring and other employment purposes to disclose to candidates, within 30 days, when such tools were used to assess their candidacy for employment, and the job qualifications or characteristics for which the tool was used to screen.  Id.  Finally, the bill is not without bite, as violators are subject to “a civil penalty of not more than $500 for that person’s first violation and each additional violation occurring on the same day as the first violation, and not less than $500 nor more than $1,500 for each subsequent violation.”  Id.

What Can My Business Do Now to Prepare for Potential Liability Related to the Use of AI in Hiring?

As the current political and legal landscape continues to be in flux, one of the best things your business can do is stay on top of current and proposed statutes.  Your business could also audit both internal and external use of AI in hiring to validate and confirm the absence of bias in the system; however, testing external systems may require your vendors to open their proprietary technology and information to their customers, something most are hesitant to do.  Finally, your business should consider conducting a thorough review of any and all indemnification provisions in its vendor agreements to see how risk might be allocated between the parties.
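
For businesses that want to make such an internal audit concrete, the sketch below shows one common starting point: an adverse impact ratio check, often called the “four-fifths rule,” applied to hiring outcomes by group.  This is a simplified illustration only, assuming hypothetical group labels, example data, and the conventional 0.8 threshold; it is not legal advice, a regulator’s prescribed methodology, or a substitute for a statistically rigorous audit designed with counsel.

```python
# Illustrative sketch only: a simplified "four-fifths rule" adverse impact check.
# Group labels, example counts, and the 0.8 threshold are assumptions for
# demonstration; a real audit should be designed with counsel and statisticians.

from collections import Counter

def selection_rates(outcomes):
    """outcomes: list of (group, hired_bool) tuples -> {group: selection rate}."""
    applicants = Counter(group for group, _ in outcomes)
    hires = Counter(group for group, hired in outcomes if hired)
    return {g: hires[g] / applicants[g] for g in applicants}

def adverse_impact_flags(outcomes, threshold=0.8):
    """Flag groups whose selection rate falls below `threshold` (80%)
    of the highest group's selection rate."""
    rates = selection_rates(outcomes)
    top = max(rates.values())
    return {g: rate / top < threshold for g, rate in rates.items()}

if __name__ == "__main__":
    # Hypothetical outcome data from an AI screening tool.
    data = ([("group_a", True)] * 40 + [("group_a", False)] * 60
            + [("group_b", True)] * 20 + [("group_b", False)] * 80)
    print(selection_rates(data))       # {'group_a': 0.4, 'group_b': 0.2}
    print(adverse_impact_flags(data))  # {'group_a': False, 'group_b': True}
```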

Beckage is a law firm focused on technology, data security, and privacy. Beckage has an experienced team of attorneys and technologists who can advise your business on the best practices for limiting its liability related to the use of AI in hiring.

*Attorney Advertising. Prior results do not guarantee future outcomes.



Biometric Litigation Continues To Rise As Businesses Work To Minimize Risk

In 2008, Illinois enacted the Illinois Biometric Information Privacy Act (“BIPA”) with the purpose of recognizing a person’s privacy right to their “biometric information” and “biometric identifiers”.  BIPA was enacted in response to the growing use of biometrics by businesses.   

In part because of its private right of action, by which plaintiffs may bring suit against businesses directly, BIPA litigation remains at the forefront of the data privacy litigation landscape as businesses continue to collect the biometric identifiers of their employees.  Recent BIPA class action settlements with major tech companies like Facebook and TikTok have been in the hundreds of millions of dollars, but the majority of BIPA litigation is brought against small and medium-sized enterprises that collect biometric information for employee timekeeping or for access controls to physical spaces.

To date, defendants have found courts to be generally unwilling to dismiss BIPA litigation at early motion practice.  Two recent cases, Thornley v. Clearview AI and Barton v. Swan Surfaces, demonstrate that there are some potential limits to BIPA litigation. 

Thornley v. Clearview AI

In Thornley, Melissa Thornley accused Clearview AI of scraping publicly available photos from her social media accounts for facial recognition purposes and selling her biometric information to third parties without her consent.  Thornley v. Clearview AI, Inc., 984 F.3d 1241, 1242-1243 (7th Cir. 2021).  Thornley initially filed a complaint in Illinois state court, alleging, as a class representative, that Clearview violated § 15(c) of BIPA, which provides, in relevant part, that “[n]o private entity in possession of a biometric identifier or biometric information may sell, lease, trade, or otherwise profit from a person’s or a customer’s biometric identifier or biometric information.”  Id. at 1246.  Clearview removed the case to federal court on the basis that the allegation of a statutory violation gave rise to the concrete and particularized injury-in-fact necessary for Article III standing.  Id. at 1243.  Under the Constitution, a plaintiff must have Article III standing to sue in federal court, which requires that the plaintiff prove: (1) an injury in fact; (2) causation of the injury by the defendant; and (3) that the injury is likely to be redressed by the requested relief.  See Spokeo, Inc. v. Robins, 136 S. Ct. 1540, 1547 (2016).  In Spokeo, the Supreme Court of the United States held that a statutory violation could be sufficient to constitute an injury in fact; however, it did not provide any analysis as to which types of statutory violations necessarily implicate concrete and particularized injuries in fact.  Id.

The district court held that the alleged violation of § 15(c) of BIPA was “only a bare statutory violation, not the kind of concrete and particularized harm that would support standing,” and that the case therefore had to be remanded to state court.  Thornley, 984 F.3d at 1242.  Clearview then appealed to the Seventh Circuit, which agreed with the district court that the plaintiffs lacked Article III standing and affirmed the remand to Illinois state court.  Id.  Clearview has now petitioned the Supreme Court of the United States to take its case.  See Porter Wells, Clearview AI Will Take BIPA Standing Challenge to Supreme Court.

Barton v. Swan Surfaces, LLC 

In Barton, a unionized employee of Swan Surfaces, LLC (“Swan”) was required to clock in and out of her employer’s manufacturing plant using her fingerprints as part of company protocol.  Barton v. Swan Surfaces, LLC, No. 20-cv-499-SPM, 2021 WL 793983, at *1 (S.D. Ill. Mar. 2, 2021).  On May 29, 2020, Barton filed a complaint in the United States District Court for the Southern District of Illinois alleging that she represented a class of individuals who, “while residing in the State of Illinois, had their fingerprints collected, captured, received, otherwise obtained and/or stored by Swan”.  Id. at *2.  Barton asserted Swan violated BIPA by: (1) failing to institute, maintain, and adhere to a publicly available retention schedule in violation of 740 ILCS 14/15(a); and (2) failing to obtain informed written consent and release before collecting biometric information.  Id.  On July 31, 2020, Swan filed a Motion to Dismiss, asserting, in relevant part, that Barton’s BIPA claims were preempted by § 301 of the Labor Management Relations Act (“LMRA”).  Id.

On March 2, 2021, the court held that because Barton was a unionized employee, her Collective Bargaining Agreement (“CBA”), which contained a management rights clause and grievance procedure, controlled, and as such Barton’s BIPA claims were preempted by § 301 of the LMRA.  In coming to its conclusion, the court relied heavily on the Seventh Circuit’s holding in Miller v. Southwest Airlines, Inc., 926 F.3d 898 (7th Cir. 2019), a case arising under the Railway Labor Act (“RLA”).  Id. at *6.  In Miller, the Seventh Circuit held that an adjustment board had to resolve the employees’ dispute over the airline’s fingerprint collection practices because their unions may have bargained over the practice on their behalf.  Miller, 926 F.3d 898.  The court in Barton noted that the United States “Supreme Court has held that the RLA preemption standard is virtually identical to the pre-emption standard the Court employs in cases involving § 301 of the LMRA” and that the same outcome should therefore apply.  Barton, 2021 WL 793983, at *4.

Key Takeaway 

While these cases demonstrate the potential to circumvent or limit BIPA litigation, the volume of biometric information being collected by companies will only continue to grow, as will the push for biometric policies that govern the use of these technologies and promote safeguards for consumers.

With many states looking to implement biometric privacy laws similar to BIPA, it is important to have legal tech counsel to address compliance with these emerging laws.  Beckage attorneys, who are also technologists and former tech business owners, have years of collective experience with new technologies like artificial intelligence, biometric data, and facial recognition technology.  We have a team of highly skilled lawyers who stay up to date on all developments in BIPA case law and who can help your company mount its best defense given the current legal landscape.  Our team can assist your company in assessing and mitigating risks associated with emerging technologies.

*Attorney Advertising: Prior results do not guarantee a similar outcome. 



TCPA Considerations When Starting Your SMS Marketing Campaign

Consent is the cornerstone of compliance with the Telephone Consumer Protection Act (“TCPA”).  It is imperative that business and marketing teams have a strong understanding of this before incorporating text messaging or automated calls into their marketing campaigns.  Similarly, it is critical to understand when prior express written consent is required, whether any exceptions may apply to your text messaging practices, the importance of documenting consent, and other best practices for obtaining prior express written consent in an online environment.

Understanding the TCPA

The TCPA was enacted in 1991, amending the Communications Act of 1934, and sought to restrict unwanted telephonic solicitations from companies.  The TCPA grants the Federal Communications Commission (“FCC”) the authority to develop rules related to telemarketing, the use of automated telephone dialers, artificial or prerecorded voice messages, SMS text messages, and fax machines. 

Many businesses leverage text messaging or SMS marketing to reach current and potential customers.  While this can be a great marketing tactic, careful attention should be paid when using SMS text messages to communicate with customers, even where a preexisting business relationship exists, as there are steep penalties for initiating improper text messages or calls.  In fact, the statute provides for damages of $500 per improper text message, which can quickly add up when messages are sent en masse.  With these hefty fines, compliance with the TCPA should be taken into consideration before embarking on any SMS text messaging campaign.

Affirmative, Written/Digital Consent & Opt-Out

Under the TCPA, you must obtain prior express written (or digital) consent before sending promotional SMS text messages.  As such, always be sure your teams are obtaining, and documenting, this affirmative consent before beginning any SMS marketing campaign.

In Vandenberg & Sons Furniture, Inc. v. Alliance Funding Grp., a California corporation that provided equipment-leasing financing to small businesses sent a fax in 2012 to a Michigan corporation in the furniture business.  No. 1:15-CV-1255, 2021 WL 222171 (W.D. Mich. Jan. 22, 2021).  At the bottom of the two-page fax was an opt-out notice that provided the recipient with instructions on how to opt out of future fax advertisements.  Id.  Over the next four years, the equipment leasing business sent out hundreds of thousands of fax advertisements to the furniture business and others.  Id.  The Western District of Michigan recently held that because the equipment leasing business failed to show any evidence it had obtained affirmative written consent from the recipients of its faxes, a class potentially worth over $100 million could be formed.  Id.

Best SMS Practices to Follow for Text Marketing

As stated, obtaining (and documenting) proper consent is foundational.  One recommendation for obtaining affirmative consent is to present a just-in-time notice at the point of collection of a telephone number.  A small dialogue box should confirm that the individual is authorizing the collection of the phone number and consents to be contacted by text message.  Because TCPA claims carry a four-year statute of limitations, marketers should retain proof of consent for a minimum of four years.  This affirmative consent needs to be duly signed by the customer, whether in written or digital form or through a simple opt-in for a campaign.  Moreover, under the TCPA, customers must also be provided with an option to opt out of any such marketing campaign at any time.
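
To illustrate what documenting consent might look like in practice, below is a minimal, hypothetical sketch of a consent record that captures the disclosure shown to the customer, when consent was obtained, when it was revoked, and how long the record should be kept.  The field names, example values, and four-year retention window are assumptions for illustration, not a prescribed TCPA schema.

```python
# Hypothetical consent-record sketch: fields and retention period are
# illustrative assumptions, not a statutory schema or legal advice.

from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone
from typing import Optional

RETENTION = timedelta(days=4 * 365)  # keep proof of consent at least four years

@dataclass
class ConsentRecord:
    phone_number: str
    campaign_id: str
    disclosure_text: str                 # the just-in-time notice shown to the user
    obtained_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    revoked_at: Optional[datetime] = None  # set when the customer opts out (e.g., texts STOP)

    def is_active(self) -> bool:
        return self.revoked_at is None

    def must_retain_until(self) -> datetime:
        # Retain the record even after revocation, to prove consent existed.
        anchor = self.revoked_at or self.obtained_at
        return anchor + RETENTION

# Example: record consent captured by an opt-in dialogue, then process an opt-out.
record = ConsentRecord("+15555550100", "spring_sale",
                       "I agree to receive marketing texts from Example Co.")
record.revoked_at = datetime.now(timezone.utc)
print(record.is_active(), record.must_retain_until())
```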

To best align with TCPA guidelines, here are some additional best practices that your business should be following when undertaking text messaging as part of your marketing campaign:

  1. We recommend reminding customers how to opt out of your campaign at least once every month.  Include a short message to that effect at the end of your marketing texts.
  2. Review opt-out requests and process them as soon as possible (it is advisable to acknowledge them in real time).  This provides your customers with a sense of reassurance and keeps your records organized.
  3. Along with opt-out details, it is important to include contact details for your customer care services at least once every month.  If the details are concise, you can add them to every marketing SMS you send to your customers.
  4. Always keep track of an opt-out request once it has been received.  Ensure all procedures are carried out efficiently and the customer is successfully opted out of receiving your messages.  Also, inform the customer through a final SMS confirming that they will stop receiving similar messages from you in the future.  It is also advisable to provide details on opting back in to your SMS campaign, in case the customer wishes to do so in the future.

Like many areas of compliance, building an infrastructure within your organization to address the new and evolving legal landscape surrounding the use of text messages under the TCPA can help your business stay ahead of the curve and prevent costly litigation.  Being proactive and building robust and scalable policies into the foundation of your organization will help mitigate legal risk.  Our TCPA team has handled numerous class action litigations in this space and can help your business navigate this complex area of the law.

*Attorney Advertising: Prior results do not guarantee a similar outcome.



United States Department of Homeland Security (DHS) Announces New Grant Plan to Slow Epidemic Spread of Cyber Attacks

Businesses may breathe a small sigh of relief that some help may be coming to address the persistent threat of ransomware attacks.  The DHS announced that significant funds will be provided to a number of public and private sector recipients to help improve the nation’s protection against data security attacks and other crises.

The Feb. 25 Announcement

On February 25, 2021, DHS announced its funding notice for several different types of cyber preparedness grants worth nearly $1.87 billion.  After noting a rise in both the number and complexity of cyber threats faced by communities, including targeted ransomware attacks on our infrastructure, hospitals, and transportation systems, DHS identified five critical priority areas for its fiscal 2021 grant cycle: 1) cybersecurity; 2) soft targets and crowded places; 3) intelligence and information sharing; 4) domestic violent extremism; and 5) emerging threats.  These grant programs provide funding to state, local, and tribal/territorial governments, transportation authorities, nonprofit organizations, and the private sector to improve the nation’s readiness in preventing, protecting against, responding to, and recovering from terrorist attacks, major disasters, and other emergencies.

The DHS announced several non-competitive grants which are to be awarded to recipients based on several factors:

  • State Homeland Security Program – The State Homeland Security Program provides $415 million to support the implementation of risk-driven, capabilities-based state homeland security strategies to address capability targets;
  • Urban Area Security Initiative – The Urban Area Security Initiative provides $615 million to enhance regional preparedness and capabilities in 31 high-threat, high-density areas;
  • Emergency Management Performance Grant (“EMPG”) – EMPG provides more than $355 million to assist state, local, tribal, and territorial governments in enhancing and sustaining all-hazards emergency management capabilities; and
  • Intercity Passenger Rail (Amtrak) Program – The Amtrak Program provides $10 million to Amtrak to protect critical surface transportation infrastructure and the traveling public from acts of terrorism and increase the resilience of the Amtrak rail system.

Moreover, the DHS announced several competitive grants, including:

  • Operation Stonegarden – Operation Stonegarden provides $90 million to enhance cooperation and coordination among state, local, tribal, territorial, and federal law enforcement agencies to jointly enhance security along the United States land and water borders;
  • Tribal Homeland Security Grant Program – The Tribal Homeland Security Grant Program provides $15 million to eligible tribal nations to implement preparedness initiatives to help strengthen the nation against risk associated with potential terrorist attacks and other hazards;
  • The Nonprofit Security Grant Program – The Nonprofit Security Grant Program provides $180 million to support target hardening and other physical security enhancements for nonprofit organizations that are at high risk of a terrorist attack;
  • Port Security Grant Program – The Port Security Grant Program provides $100 million to help protect critical port infrastructure from terrorism, enhance maritime domain awareness, improve port-wide maritime security risk management, and maintain or re-establish maritime security mitigation protocols that support port recovery and resiliency capabilities;
  • Transit Security Grant Program – The Transit Security Grant Program provides $88 million to owners and operators of public transit systems to protect critical surface transportation and the traveling public from acts of terrorism and to increase the resilience of transit infrastructure; and
  • Intercity Bus Security Program – The Intercity Bus Security Program provides $2 million to owners and operators of intercity bus systems to protect surface transportation infrastructure and the traveling public from acts of terrorism and to increase the resilience of transit infrastructure.

Impact on Business

Private sector businesses can apply for these grants, especially if they are in the process of developing cyberwarfare and other data defense tools.  Grant information can be found here.

Beckage has responded to countless data breaches and is encouraged to see more dollars fostering collaboration between the public and private sectors to help defend and protect U.S. businesses and more.

If you have questions about the grant dollars or how to apply, please contact a Beckage attorney at 716.898.2102.

*Attorney Advertising. Prior results do not guarantee future outcomes.



What You Need to Know About Virginia’s New Consumer Data Protection Act

On March 2, 2021, Virginia enacted the Consumer Data Protection Act (the “CDPA”) with the goal of establishing a framework for controlling and processing the personal data of Virginia residents.  While the CDPA resembles the California Consumer Privacy Act (“CCPA”) in some regards and the European Union’s General Data Protection Regulation (“GDPR”) in others, the CDPA is likely the first in a line of new state laws governing the processing of consumers’ data.  As such, companies should use this time to familiarize themselves with the intricacies of the CDPA and begin adapting their consumer data handling practices.

Who Does the CDPA Apply to?

The CDPA applies to all companies that conduct business in Virginia or produce products or services that are targeted to residents of Virginia, and that:

  1. during a calendar year, control or process personal data of at least 100,000 consumers; or
  2. control or process personal data of at least 25,000 consumers and derive over 50 percent of gross revenue from the sale of personal data. 

Equally important is who is exempted from the CDPA.  To that end, the CDPA does not apply to i) any governmental body within Virginia; ii) financial institutions or data subject to Title V of the federal Gramm-Leach-Bliley Act (15 U.S.C. § 6801 et seq.); or iii) any covered entity or business associate governed by the privacy, security, and breach notification rules of HIPAA or HITECH.  Va. Code Ann. § 59.1-572(A).
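
As a rough, illustrative way to think about the applicability thresholds described above, the sketch below encodes the two volume tests (100,000 consumers, or 25,000 consumers plus more than 50 percent of gross revenue from the sale of personal data).  The function name and inputs are hypothetical, the entity-level exemptions just discussed are not modeled, and applicability in practice turns on legal analysis rather than a boolean check.

```python
# Simplified sketch of the CDPA volume thresholds discussed above.
# It ignores the entity-level exemptions (government bodies, GLBA financial
# institutions, HIPAA/HITECH covered entities) and is not legal advice.

def cdpa_thresholds_met(targets_virginia_residents: bool,
                        consumers_processed_per_year: int,
                        revenue_share_from_data_sales: float) -> bool:
    """Return True if either CDPA volume threshold appears to be met."""
    if not targets_virginia_residents:
        return False
    high_volume = consumers_processed_per_year >= 100_000
    sale_driven = (consumers_processed_per_year >= 25_000
                   and revenue_share_from_data_sales > 0.50)
    return high_volume or sale_driven

# Example: 30,000 Virginia consumers, 60% of gross revenue from selling personal data.
print(cdpa_thresholds_met(True, 30_000, 0.60))  # True
```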

What is “Sensitive Data” Under the CDPA?

Understanding what constitutes “sensitive data” under the CDPA first requires an understanding of what is “personal data” under the CDPA.  The CDPA defines personal data as “any information that is linked or reasonably associated to an identified or identifiable natural person.”  Va. Code Ann. § 59.1-571.  However, personal data under the CDPA does not include de-identified data or “publicly available information.”  Id.

The CDPA more heavily regulates a covered business’ processing and handling of sensitive data.  Under the CDPA, sensitive data is defined to include:

  1. personal data revealing racial or ethnic origin, religious beliefs, mental or physical health diagnosis, sexual orientation, or citizenship or immigration status;
  2. the processing of genetic or biometric data for the purpose of uniquely identifying a natural person;
  3. the personal data collected from a known child; or
  4. the precise geolocation of an individual.  Va. Code Ann. § 59.1-571. 

Moreover, the CDPA carves out certain categories of data that are not subject to its requirements, including, but not limited to:

  1. protected health information under HIPAA; information used only for public health activities under HIPAA; information derived from health care-related information that is de-identified in accordance with the de-identification requirements of HIPAA; patient identifying information for purposes of 42 U.S.C. § 290dd-2; and information created for purposes of the Health Care Quality Improvement Act of 1986 (42 U.S.C. § 11101 et seq.) or the Patient Safety and Quality Improvement Act (42 U.S.C. § 299b-21 et seq.);
  2. information collected, maintained, and regulated under the federal Fair Credit Reporting Act (15 U.S.C. § 1681 et seq.) and personal data collected, processed, sold, or disclosed in compliance with the federal Driver’s Privacy Protection Act of 1994 (18 U.S.C. § 2721 et seq.); and
  3. personal data regulated by the federal Family Educational Rights and Privacy Act (20 U.S.C. § 1232g et seq.).  Va. Code Ann. § 59.1-571(C).

What is My Business Required to Do if it is a Covered Business?

Under the CDPA, a covered business is required to:

  1. adopt data minimization practices;
  2. disclose their privacy practices through a “meaningful privacy notice”;
  3. implement data security measures;
  4. refrain from discriminating against consumers who exercise their rights under the CDPA; and
  5. obtain consent prior to processing sensitive data, as defined above.  Va. Code Ann. § 59.1-574.

Moreover, a covered business may be required to conduct risk assessments of its data protection practices.  These risk assessments must be undertaken where the covered business’s activities involve:

  1. the processing of personal data for purposes of targeted advertising;
  2. the sale of personal data;
  3. the processing of personal data for purposes of profiling, where such profiling presents a reasonably foreseeable risk;
  4. the processing of sensitive data; and
  5. any processing activities involving personal data that present a heightened risk of harm to consumers.  Va. Code Ann. § 59.1-576.

Does the CDPA Provide Any Rights to Virginians?

Under the CDPA, Virginians are provided certain individual rights including:

  1. the right to access their data;
  2. the right to amend their data;
  3. the right to delete their data;
  4. the right to transfer their data; and
  5. the right to opt out of certain uses of their personal data.  Va. Code Ann. § 59.1-573(A)(1-5). 

What Happens If My Business Violates the CDPA?

The CDPA does not contain a private right of action.  Va. Code Ann. § 59.1-579(C).  As such, enforcement is within the exclusive jurisdiction of the Virginia Attorney General.  Va. Code Ann. § 59.1-579(A).  Under the CDPA, the Virginia Attorney General is required to provide the covered business a letter outlining the provisions of the CDPA that have been, or are alleged to have been, violated.  Va. Code Ann. § 59.1-579(B).  The covered business then has 30 days to cure any alleged violations.  Id.  If the covered business cures the alleged violations of the CDPA “and provides the consumer an express written statement that the alleged violations have been cured and that no further violations shall occur,” then the Virginia Attorney General is not to seek statutory damages against the covered business.  Id.  Nevertheless, if the covered business fails to cure the alleged violations of the CDPA, it may be “subject to an injunction and liable for a civil penalty of not more than $7,500 for each violation.”  Va. Code Ann. § 59.1-580(B).

When Will the CDPA Become Effective?

The CDPA will become effective on January 1, 2023.  Va. Code Ann. § 59.1-581.  Moreover, in contrast to the new California Privacy Rights Act (“CPRA”), the CDPA does not contain a twelve-month lookback period, and thus compliance with the CDPA will only be required moving forward.

What Do I Do Next?

Now is the time to prioritize developing a robust, scalable data privacy program within your organization.  First and foremost, conducting an assessment to determine what laws and regulations, such as the CDPA, CCPA, or GDPR, apply to your organization is a great starting place. Your business may be required to make additional disclosures surrounding your data collection practices and how consumers can exercise certain rights to that data.

Beckage’s dedicated data privacy attorneys routinely provide guidance on various consumer data privacy regulatory regimes and are especially adept at helping your business adapt to the changing legal landscape.  We recommend reviewing all cookie consent banners and just-in-time notices to evaluate whether they provide the necessary opt-out consent for targeted advertising as required by the CDPA and other evolving laws.  Based on the above, if you believe that the CDPA may impact your business, reach out to Beckage for assistance.


*Attorney Advertising; prior results do not guarantee similar outcomes.
