Accountability and the Use of Artificial Intelligence

As artificial intelligence (“AI”) and automated decision-making systems make their way into every corner of society – from businesses and schools to government agencies – concerns about responsible use and accountability are on the rise. 

The United States has always been at the forefront of technological innovation, and our government policies have helped us remain there.  To that end, on February 11, 2019, President Trump issued an Executive Order on Maintaining American Leadership in Artificial Intelligence (No. 13,859).  See Exec. Order No. 13,859, 3 C.F.R. 3967.  As part of this Executive Order, the “American AI Initiative” was launched with five guiding principles:

  1. Driving technological breakthroughs; 
  2. Driving the development of appropriate technical standards; 
  3. Training workers with the skills to develop and apply AI technologies; 
  4. Protecting American values, including civil liberties and privacy, and fostering public trust and confidence in AI technologies; and
  5.  Protecting U.S. technological advantages in AI, while promoting an international environment that supports innovation. Id. at § 1. 

In addition, the Executive Order tasked the National Institute of Standards and Technology (“NIST”) of the U.S. Department of Commerce with creating a plan for the development of technical standards to support reliable, robust, and trustworthy AI systems.  Id. at § 6(d).  To that end, NIST released its Plan for Federal Engagement in Developing Technical Standards in August 2019.  See Nat’l Inst. of Standards & Tech., U.S. Leadership in AI: A Plan for Federal Engagement in Developing Technical Standards and Related Tools (2019). 

While excitement over the use of AI was brewing in the executive branch, the legislative branch was concerned with its accountability: on April 10, 2019, the Algorithmic Accountability Act (“AAA”) was introduced in Congress.  See Algorithmic Accountability Act of 2019, S. 1108, H.R. 2231, 116th Cong. (2019).  The AAA covered businesses that: 

  1. Made more than $50,000,000 per year;
  2. Held data for greater than 1,000,000 customers; or
  3. Acted as a data broker to buy and sell personal information.  Id. at § 2(5). 

The AAA would have required businesses to conduct “impact assessments” on their “high-risk” automated decision systems in order to evaluate the impacts of the system’s design process and training data on “accuracy, fairness, bias, discrimination, privacy, and security”.  Id. at §§ 2(2) and 3(b).  These impact assessments would have been required to be performed “in consultation with external third parties, including independent auditors and independent technology experts”.  Id. at § 3(b)(1)(C).  Following an impact assessment, the AAA would have required that businesses reasonably address the results of the assessment in a timely manner.  Id. at § 3(b)(1)(D).  

It wasn’t just the federal government that was concerned about the use of AI in business: on May 20, 2019, the New Jersey Algorithmic Accountability Act (“NJ AAA”) was introduced in the New Jersey General Assembly.  The NJ AAA was very similar to the AAA in that it would have required businesses in the state to conduct impact assessments on “high risk” automated decision systems.  See New Jersey Algorithmic Accountability Act, A.B. 5430, 218th Leg., 2019 Reg. Sess. (N.J. 2019).  These “Automated decision system impact assessments” would have required an evaluation of the system’s development, “including the design and training data of the automated decision system, for impacts on accuracy, fairness, bias, discrimination, privacy, and security”, as well as a cost-benefit analysis of the AI in light of its purpose.  Id. at § 2.  The NJ AAA would have also required that businesses work with independent third parties, record any bias or threat to the security of consumers’ personally identifiable information discovered through the impact assessments, and provide any other information required by the New Jersey Director of the Division of Consumer Affairs in the New Jersey Department of Law and Public Safety.  Id.

While the aforementioned legislation appears to have stalled, we nevertheless anticipate that both federal and state legislators will once again take up the task of both encouraging and regulating the use of AI in business as the COVID-19 pandemic subsides.  Our team at Beckage includes attorneys focused on technology, data security, and privacy who have the experience to advise your business on best practices for the adoption of AI and automated decision-making systems. 

*Attorney Advertising. Prior results do not guarantee future outcomes. 



Biometric Litigation Continues To Rise As Businesses Work To Minimize Risk

In 2008, Illinois enacted the Illinois Biometric Information Privacy Act (“BIPA”) with the purpose of recognizing a person’s privacy right to their “biometric information” and “biometric identifiers”.  BIPA was enacted in response to the growing use of biometrics by businesses.   

In part because of its private right of action, by which plaintiffs may bring suit against businesses directly, BIPA litigation remains at the forefront of the data privacy litigation landscape as businesses continue to collect the biometric identifiers of their employees.  Recent BIPA class action settlements with major tech companies like Facebook and TikTok have been in the hundreds of millions of dollars, but the majority of BIPA litigation is brought against small and medium-sized enterprises that collect biometric information for employee timekeeping or for access controls to physical spaces.   

To date, defendants have found courts generally unwilling to dismiss BIPA litigation in early motion practice.  Two recent cases, Thornley v. Clearview AI and Barton v. Swan Surfaces, demonstrate that there are some potential limits to BIPA litigation. 

Thornley  v. Clearview AI 

In Thornley, Melissa Thornley accused Clearview AI of scraping publicly available photos from her social media accounts for facial recognition purposes and selling her biometric information to third parties without her consent.  Thornley v. Clearview AI, Inc., 984 F.3d 1241, 1242-1243 (7th Cir. 2021).  Thornley initially filed a complaint in Illinois state court, alleging, as a class representative, that Clearview violated § 15(c) of BIPA, which requires, in relevant part, that “[n]o private entity in possession of a biometric identifier or biometric information may sell, lease, trade, or otherwise profit from a person’s or a customer’s biometric identifier or biometric information.”  Id. at 1246.  Clearview removed the case to federal court on the basis that the allegation of a statutory violation gave rise to the concrete and particularized injury-in-fact that is necessary for Article III standing.  Id. at 1243.  Under the Constitution, a plaintiff must have Article III standing to sue in federal court, which requires that the plaintiff prove: (1) an injury in fact; (2) causation of the injury by the defendant; and (3) that the injury is likely to be redressed by the requested relief.  See Spokeo, Inc. v. Robins, 136 S. Ct. 1540, 1547 (2016).  In Spokeo, the Supreme Court of the United States held that a statutory violation could be sufficient to constitute an injury in fact; however, it did not provide any analysis as to which types of statutory violations necessarily implicate concrete and particularized injuries in fact.  Id.   

The district court held that the alleged violation of § 15(c) of BIPA was “only a bare statutory violation, not the kind of concrete and particularized harm that would support standing”, and that the case must therefore be remanded to state court.  Thornley, 984 F.3d at 1242.  Clearview then appealed to the Seventh Circuit, which agreed with the district court that the plaintiffs lacked standing and affirmed the remand to Illinois state court.  Id.  Clearview has now petitioned the Supreme Court of the United States to take its case.  See Porter Wells, Clearview AI Will Take BIPA Standing Challenge to Supreme Court. 

Barton v. Swan Surfaces, LLC 

In Barton, a unionized employee of Swan Surfaces, LLC (“Swan”) was required to clock in and out of her employer’s manufacturing plant using her fingerprints as part of company protocol.  Barton v. Swan Surfaces, LLC, No. 20-cv-499-SPM, 2021 WL 793983, at *1 (S.D. Ill. Mar. 2, 2021).  On May 29, 2020, Barton filed a complaint in the United States District Court for the Southern District of Illinois alleging that she represented a class of individuals who, “while residing in the State of Illinois, had their fingerprints collected, captured, received, otherwise obtained and/or stored by Swan”.  Id. at *2.  Barton asserted that Swan violated BIPA by: (1) failing to institute, maintain, and adhere to a publicly available retention schedule in violation of 740 ILCS 14/15(a); and (2) failing to obtain informed written consent and release before collecting biometric information.  Id.  On July 31, 2020, Swan filed a Motion to Dismiss, asserting, in relevant part, that Barton’s BIPA claims were preempted by § 301 of the Labor Management Relations Act (“LMRA”).  Id.  

On March 2, 2021, the court held that because Barton was a unionized employee, her Collective Bargaining Agreement (“CBA”), which contained a management rights clause and grievance procedure, controlled, and as such Barton’s BIPA claims were preempted by § 301 of the LMRA.  In coming to its conclusion, the court relied heavily on the Seventh Circuit’s holding in Miller v. Southwest Airlines, Inc., 926 F.3d 898 (7th Cir. 2019).  Id. at *6.  In Miller, the Seventh Circuit held that an adjustment board had to resolve the employees’ dispute over the airline’s fingerprint collection practices because their unions may have bargained over the practice on their behalf.  Miller, 926 F.3d 898.  The court in Barton noted that the United States “Supreme Court has held that the RLA preemption standard is virtually identical to the pre-emption standard the Court employs in cases involving § 301 of the LMRA” and therefore the same outcome should apply.  Barton, 2021 WL 793983, at *4. 

Key Takeaway 

While these cases demonstrate the potential to circumvent or limit BIPA litigation, the volume of biometric information used by companies will only continue to grow, as will the push for biometric policies that govern the use of these technologies and promote safeguards for consumers.  

With many states looking to implement biometric privacy laws similar to BIPA, it is important to have legal tech counsel to address compliance with these emerging laws.  Beckage attorneys, who are also technologists and former tech business owners, have years of collective experience with new technologies such as artificial intelligence, biometric data, and facial recognition technology.  We have a team of highly skilled lawyers who stay up to date on all developments in BIPA case law and who can help your company mount its best defense given the current legal landscape.  Our team can assist your company in assessing and mitigating risks associated with emerging technologies. 




Canada’s New Privacy Bill Aims to Strengthen Privacy Rights for Citizens

On November 17, 2020, the Canadian Minister of Innovation, Science, and Industry introduced a new federal privacy bill that would reshape Canada’s privacy framework with a main goal of strengthening interoperability with both the European Union and the United States. Bill C-11 proposes the Digital Charter Implementation Act, 2020, which includes the Consumer Privacy Protection Act. This legislation would significantly increase protection of Canadian personal information by enhancing Canadian control over data and demanding more transparency from companies as to their handling of personal information. The Digital Charter Implementation Act includes:

  1. Increased control and transparency over Canadians’ personally identifiable information handled by companies;
  2. The ability for Canadians to move information from one organization to another in a secure manner;
  3. The right for Canadians to request the destruction of their information;
  4. The ability of the Privacy Commissioner to force an organization to comply and to order businesses and corporations to stop collecting or using personal information; and
  5. The strongest fines among G7 privacy laws.

Penalties and Provisions

There are significant fines for noncompliant businesses – up to 5% of revenue or Can$25 million, whichever is higher. The bill would also enact the Consumer Privacy Protection Act (CPPA) to protect an individual’s personal information while regulating organizations’ collection, use, and disclosure of personal information. The CPPA would also strengthen consent requirements for handling personal information, create transparency requirements with respect to algorithms and artificial intelligence (AI), provide for mobility of personal data, govern the retention and disposal of personal information, and codify legitimate interests where consent is not required. The CPPA would update the Personal Information Protection and Electronic Documents Act, which governs how private sector organizations collect, use, and disclose personal information in commercial business.

Part of Bill C-11 also introduces the Personal Information and Privacy Protection Tribunal Act (PIPPTA). The PIPPTA would create an accelerated and more direct path to enforcement of orders from the Office of the Privacy Commissioner, supporting its expanded role and providing strong enforcement. The bill also includes a private right of action, allowing individuals to sue where the Commissioner issues a finding of a privacy violation and the Tribunal upholds it. However, all cases must be brought within two years of the violation.

Impact

Canada’s proposed federal privacy bill follows the lead of the European Union’s General Data Protection Regulation and the United States’ California Consumer Privacy Act. Canada’s privacy bill was created to impose obligations on any business that collects Canadian personal data. Businesses and companies that fail to comply will be subject to the penalties outlined above. If Bill C-11 is passed, US businesses that collect and/or process the personal data of Canadians will have to enact procedures that comply with the Consumer Privacy Protection Act and other requirements in the bill. As with any new piece of data legislation, it is crucial that potentially impacted companies perform a thorough review of their forward-facing privacy practices as well as update their internal procedures to address any new compliance requirements.

At Beckage, we have a team of Global Data Privacy Attorneys that continue to monitor the constantly evolving data privacy and cybersecurity legislation landscape. The Beckage team is made up of technologists and Certified Information Privacy Professionals (CIPP/US & CIPP/E) who can help develop and review new and existing privacy policies compliant with Bill C-11 and other international legislation to help protect your business.




Mobile App Developers Take Notice Of New Apple Privacy Requirements

Companies that have, or are in the process of developing, mobile applications distributed through Apple’s App Store should be aware of recent privacy updates and should take steps to prepare their businesses for these new privacy requirements in 2021. 

Apple’s Announcement

Beginning on December 8, 2020, Apple will impose specific requirements for the disclosure of privacy practices for all applications on their App Store product pages.  Each product page will now feature a new privacy information section to help users understand an app’s privacy practices before they download the app on any Apple platform, covering data collection practices, the types of data collected, the data linked to the user, user tracking, and privacy links.  More details about Apple’s announcement can be found on the privacy details page, and additional guidance on how to provide app privacy information can be found in Apple’s App Store Connect.

In addition to providing information about some of your app’s data collection practices on your product page, on iOS 14, iPadOS 14, and tvOS 14, apps will be required to receive user permission (opt-in consent) to track users across apps or websites owned by other companies or to access the device’s advertising identifier. This change allows users to choose whether they permit an app to track them or access their device’s advertising identifier.

Tracking refers to the act of linking user or device data collected from your app with user or device data collected from other companies’ apps, websites, or offline properties for targeted advertising or advertising measurement purposes.  Tracking also refers to sharing user or device data with data brokers.  To provide developers time to make necessary changes, apps will be required to obtain permission to track users starting early next year.  Additional guidance can be found at the Apple developer’s blog page.

What To Do Now

Businesses should take steps to make sure their current practices are legally compliant and address Apple’s new guidelines.

Now is an ideal time to work with your tech legal counsel to review your privacy policy and the App Store guidelines as well as applicable laws to confirm that the statements made throughout your policy are true and accurate representations of your data collection and sharing practices. App developers will need to create standardized privacy disclosures for the App Store to meet format and content requirements, and these responses should be carefully reviewed so as not to conflict with any existing privacy statements.  Your internal business practices and collection protocols may change from time to time, which is why Beckage recommends an annual review of your privacy policy and related practices.  

Additionally, businesses should consult with their tech legal counsel to review and update consent language and disclosures for pop-ups and any related consent forms.  There may be specific regulatory or statutory requirements for obtaining consent through a mobile application that need to be evaluated.  For example, although there are currently no opt-in requirements under the CCPA, there are specific requirements for consent under the GDPR that may need to be met should the GDPR apply to your application.

Beckage lawyers have worked with numerous mobile app developers on privacy matters.   The Beckage team of lawyers is made up of technologists and certified privacy professionals who can help develop and review new and existing privacy policies to ensure compliance with Apple’s new privacy requirements. To reach a Beckage attorney, call 716.898.2102.




California Passes Proposition 24 on Consumer Privacy

Businesses that have worked hard to implement California Consumer Privacy Act (CCPA) compliance initiatives will have a whole new set of privacy standards to comply with in the very near future.  California’s Proposition 24, also known as the California Privacy Rights Act (CPRA), has passed, expanding the state’s consumer privacy regulations. 

The CCPA, which passed only two years ago and whose final regulations were released just earlier this year, will remain in effect until the CPRA becomes effective on January 1, 2023.  The CPRA expands the CCPA, adding new privacy rights aimed at strengthening consumer privacy. 

Among the changes introduced by the CPRA is the creation of a new, five-member agency with regulatory authority for enforcement of both the CCPA and CPRA.  The California Privacy Protection Agency will take over enforcement authority from the California Attorney General and dramatically change the way privacy rights are handled.  The Agency will be empowered to issue guidelines and impose fines on businesses that fail to comply. The Agency is slated to take over on July 1, 2021.

What is new in the CPRA? 

The CPRA modifies the CCPA in some meaningful ways by introducing new privacy rights and obligations pertaining to certain categories of personal information.  The updates will likely have a significant impact on companies that do business in California.  

New provisions of the CPRA include:

  • Sensitive Personal Information. The CPRA introduces a newly defined category of personal information that includes things like social security number, driver’s license number, passport number, sexual orientation, biometric data, health and financial information, and precise geolocation.
  • Additional Consumer Rights.  In addition to the rights conferred upon consumers under the CCPA, under the CPRA consumers will have additional rights, including the right to:
    • correct personal information;
    • know the length of data retention;
    • opt-out of geolocation utilization;
    • limit businesses from collecting more data than necessary;
    • restrict usage of sensitive personal information;
    • know what personal information is sold or shared and to whom;
    • prevent retaliation for exercising privacy rights.
  • Sharing of Data.  Of note, the CPRA allows consumers to opt out of the sharing of their personal information (rather than sale) for “cross-context behavioral advertising.”  This change is intended to close a perceived loophole in the CCPA that some businesses have relied on to avoid compliance.  This means businesses who do not sell data but share for digital advertising purposes may have to comply.
  • Expanded Breach Liability.  The CPRA adds a private right of action for unauthorized access or disclosure of an email address and password or security question that would permit access to an account if the business failed to maintain reasonable security.
  • Disclosure Obligations.  Businesses will be required to disclose the duration they will retain each category of personal information, the purpose for which they retain the personal information, and the volume collected.  Misrepresentations would constitute a statutory violation.
  • Increased Penalties for Children’s Personal Information.  The CPRA triples the maximum penalties for any violations concerning children’s personal information (under the age of 16).  The new penalties may go up to $7,500 per intentional violation.
  • Third Party Requirements.  Businesses that share personal information with third-party service providers are required under the CPRA to enter into contracts extending the CPRA privacy requirements to the third parties.
  • Covered Business.  The CPRA also slightly updates which businesses are covered and required to comply, increasing the threshold for buying, selling, or sharing personal information from 50,000 California consumers/households to 100,000.

Certain exemptions from the CCPA are retained in the CPRA, including exemptions for medical information or protected health information covered by HIPAA (Health Insurance Portability and Accountability Act) and HITECH (Health Information Technology for Economic and Clinical Health Act).  In addition, the CPRA extends the CCPA’s exemption for employee information and business-to-business data until January 1, 2023.

What impact will the CPRA have?

The CPRA becomes effective on January 1, 2023.  The CPRA will apply to personal information collected on or after January 1, 2022.  While many details still need to be clarified and defined through regulation, the impact of the CPRA will likely be significant as the concept of sharing is much broader in scope than selling.  The passage of another stringent privacy law in California may boost the likelihood of a comprehensive federal privacy law in the near term.

Beckage’s California Privacy Team continues to actively monitor the updates to the privacy landscape and the impacts the new data privacy law will have. The CPRA underscores the importance of operationalizing robust data security and privacy practices that can stand the test of time and adapt to the evolving consumer privacy landscape.  To learn more about the impact the CCPA and the CPRA may have on your business reach out to our team of attorneys.


