In the Face of Huge Settlements, BIPA May Soon Be Losing Its Bite

Illinois lawmakers are considering a bill that could dramatically rein in the state’s strict Biometric Information Privacy Act (“BIPA”).  On March 9, 2021, the Illinois House Judiciary Committee advanced House Bill 559 (the “Bill”), which would amend BIPA.  The Bill contains several key amendments that may impact your business.

First, the Bill changes BIPA’s “written release” requirement to instead simply require “written consent.”  Thus, under the Bill, businesses would no longer be required to obtain a written release, but could instead rely on electronic consent.

Second, whereas BIPA currently requires that a business in possession of biometric identifiers draft a written policy regarding its handling of biometric data and make that policy available to the general public, under the Bill, businesses would only be required to provide this written policy to affected data subjects.

Third, the Bill creates a one-year statute of limitations for BIPA claims.  Moreover, the Bill provides that, prior to initiating a claim, a data subject must provide a business with 30 days’ written notice identifying the alleged violations.  If the business cures these violations within the 30-day window and provides the data subject an express written statement indicating that the issues have been corrected and that no further violations shall occur, then no action for individual or class-wide statutory damages can be taken against the business.  If the business continues to violate BIPA in breach of the express written statement, then the data subject can initiate an action against the business to enforce the written statement and may pursue statutory damages.  Therefore, the Bill not only creates a statute of limitations but also provides a mechanism by which businesses can respond to alleged violations of BIPA before engaging in costly litigation.

Fourth, the Bill modifies BIPA’s damages provisions.  Currently, BIPA provides that a prevailing plaintiff is entitled to liquidated damages of $1,000 or actual damages, whichever is greater, when a business is found to have negligently violated BIPA.  The Bill would limit a prevailing plaintiff’s recovery to actual damages only.  Similarly, in its current form, BIPA provides that a prevailing plaintiff is entitled to liquidated damages of $5,000 or actual damages, whichever is greater, when a business is found to have willfully violated BIPA.  The Bill would limit a prevailing plaintiff’s recovery to actual damages plus liquidated damages up to the amount of actual damages.  Therefore, the Bill would limit a business’s exposure in BIPA claims to what a prevailing plaintiff can demonstrate as actual damages.

Finally, the Bill provides that BIPA would not apply to a business’s employees if those employees are covered by a collective bargaining agreement – an issue that has arisen in recent BIPA litigation.

BIPA litigation has increased dramatically and resulted in a number of recent high-profile settlements, including TikTok’s $92 million settlement and Facebook’s $650 million settlement.  This Bill has the potential to greatly curtail this spiral of litigation and high settlement figures.  Beckage will continue to monitor any developments regarding the Bill and will update its guidance accordingly.  Our team of experienced attorneys, who are also devoted technologists, is especially equipped with the skills and experience necessary not only to develop a comprehensive and scalable biometric privacy compliance program but also to handle any resulting litigation.

Subscribe to our newsletter.

*Attorney Advertising.  Prior results do not guarantee future outcomes.

With 5G, will your thermometer need malware protection?

5G is perhaps the biggest critical infrastructure build the world has seen in twenty-five years.  It will allow for the connection of millions of Internet of Things (“IoT”) devices.  However, with these added benefits come related vulnerabilities and cybersecurity risks.

What specific cybersecurity risks are associated with the 5G network?

First, the 5G network itself can pose many security risks.  The 5G infrastructure is built from many components, each of which may be corrupted through an insecure supply chain.  Significantly more software is being used, allowing for more entry points and more potential vulnerabilities.  Similarly, more hardware devices are required (cell towers, beamforming devices, small cells, etc.), and each of these hardware devices must be adequately secured.  Small, local cells may be more physically accessible and therefore subject to physical attack.  Further, 5G will be built, in part, on legacy 4G LTE components, which themselves can have vulnerabilities.

Second, with specific focus on IoT devices, cybersecurity protections will need to become much more granular and more capable of being deployed on less intelligent “Things.”  Historically, one could think of a Thing as a device that can be connected to a network but lacks sufficient processing power to handle more advanced computations.  Things are “dumb.”  By connecting a processor, we can make such dumb Things “smart.”  These new smart IoT devices are attractive attack vectors for malicious actors and further complicate overall cybersecurity programs.  Detecting a cyber attack on a light bulb will require additional cybersecurity solutions.

Finally, with 5G facilitating the implementation of more IoT devices, more sensitive data may be stored, requiring protection of the edge computers servicing those devices.  If we consider the ubiquity of thermometer scanning now, and how those and similar IoT devices could easily become part of 5G, then we begin to understand the seemingly exponential growth in threat vectors on our networks.  We may have sensitive data at stake (Am I sick?  What time do I show up for work?), and we may have the concern that a malicious actor will look to infect a network through a Thing.  Will thermometers need malware protection?  More devices arguably mean more places for a hacker to attempt an attack, and thus a greater risk of distributed denial-of-service (DDoS) attacks.  There have been reports of Things being used collectively to deny service on the LTE network.  With 5G, the concept of an army of coffee makers attacking by simultaneously issuing requests to a single address will become a greater possibility, and manufacturers could be liable to other parties if their insecure Things are used to deny someone else’s service.

Regardless of the attack vector, incident response practices are universal, and Beckage’s Incident Response Team can help prepare your team for IoT and other attacks.

What potential solutions are available to mitigate this risk?

Companies looking to incorporate 5G should partner with experienced tech counsel who can assist by reviewing contracts, conducting risk assessments, and evaluating and updating incident response plans and procedures to account for any additional risks associated with 5G.

In addition, there are already some attempts at governmental solutions.  In March 2020, President Trump issued a National Strategy to Secure 5G, which requires, in relevant part, that the United States identify cybersecurity risks in 5G.

The Cybersecurity & Infrastructure Security Agency (CISA) has also issued guidance relating to the security of 5G.  Similarly, we are seeing a push for international standards, and certain untrusted companies have had their products banned from use.  The federal government is using regulations to limit the adoption of equipment that may contain vulnerabilities.

So, what is the solution?  The same as always.  Innovation.  Businesses are encouraged to develop trusted solutions and innovation in this space.  Advanced cybersecurity monitoring and protection by design will continue to be needed.

The Beckage Team of lawyers, who are also technologists, is well-versed in new and emerging technologies and works with clients to facilitate innovation through the use of IP protections.  We also assist companies in the implementation of new technologies, like 5G, taking into consideration the cybersecurity, data privacy, and regulatory obstacles associated with their use.  From patent acquisition to policy drafting and review, Beckage attorneys are here to help your company capitalize on innovation.


Accountability and the Use of Artificial Intelligence

As artificial intelligence (“AI”) and automated decision-making systems make their way into every corner of society – from businesses and schools to government agencies – concerns about using the technology responsibly and accountability are on the rise. 

The United States has always been at the forefront of technological innovation, and our government policies have helped us remain there.  To that end, on February 11, 2019, President Trump issued an Executive Order on Maintaining American Leadership in Artificial Intelligence (No. 13,859).  See Exec. Order No. 13,859, 3 C.F.R. 3967.  As part of this Executive Order, the “American AI Initiative” was launched with five guiding principles:

  1. Driving technological breakthroughs; 
  2. Driving the development of appropriate technical standards; 
  3. Training workers with the skills to develop and apply AI technologies; 
  4. Protecting American values, including civil liberties and privacy, and fostering public trust and confidence in AI technologies; and
  5. Protecting U.S. technological advantages in AI, while promoting an international environment that supports innovation.  Id. at § 1.

Finally, the Executive Order tasked the National Institute of Standards and Technology (“NIST”) of the U.S. Department of Commerce with creating a plan for the development of technical standards to support reliable, robust, and trustworthy AI systems.  Id. at § 6(d).  To that end, NIST released its Plan for Federal Engagement in Developing Technical Standards in August 2019.  See Nat’l Inst. of Standards & Tech., U.S. Leadership in AI: A Plan for Federal Engagement in Developing Technical Standards and Related Tools (2019).

While excitement over the use of AI was brewing in the executive branch, the legislative branch was concerned with its accountability: on April 10, 2019, the Algorithmic Accountability Act (“AAA”) was introduced in Congress.  See Algorithmic Accountability Act of 2019, S. 1108, H.R. 2231, 116th Cong. (2019).  The AAA covered businesses that:

  1. Made more than $50,000,000 per year;
  2. Held data for greater than 1,000,000 customers; or
  3. Acted as a data broker to buy and sell personal information.  Id. at § 2(5). 

The AAA would have required businesses to conduct “impact assessments” on their “high-risk” automated decision systems in order to evaluate the impacts of the system’s design process and training data on “accuracy, fairness, bias, discrimination, privacy, and security.”  Id. at §§ 2(2) and 3(b).  These impact assessments would have been required to be performed “in consultation with external third parties, including independent auditors and independent technology experts.”  Id. at § 3(b)(1)(C).  Following an impact assessment, the AAA would have required that businesses reasonably address the results of the impact assessment in a timely manner.  Id. at § 3(b)(1)(D).

It wasn’t just the federal government that was concerned about the use of AI in business: on May 20, 2019, the New Jersey Algorithmic Accountability Act (“NJ AAA”) was introduced in the New Jersey General Assembly.  The NJ AAA was very similar to the AAA in that it would have required businesses in the state to conduct impact assessments on “high-risk” automated decisions.  See New Jersey Algorithmic Accountability Act, A.B. 5430, 218th Leg., 2019 Reg. Sess. (N.J. 2019).  These “automated decision system impact assessments” would have required an evaluation of the system’s development, “including the design and training data of the automated decision system, for impacts on accuracy, fairness, bias, discrimination, privacy, and security,” as well as a cost-benefit analysis of the AI in light of its purpose.  Id. at § 2.  The NJ AAA would have also required businesses to work with independent third parties, record any bias or threat to the security of consumers’ personally identifiable information discovered through the impact assessments, and provide any other information required by the New Jersey Director of the Division of Consumer Affairs in the New Jersey Department of Law and Public Safety.  Id.

While the aforementioned legislation appears to have stalled, we nevertheless anticipate that both federal and state legislators will once again take up the task of both encouraging and regulating the use of AI in business as the COVID-19 pandemic subsides.  Our team at Beckage includes attorneys who are focused on technology, data security, and privacy and who have the experience to advise your business on best practices for the adoption of AI and automated decision-making systems.


Regulating Online Content – The Balance Between Free Speech and Free-For-All

This year kicked off with an explosive culmination to the ongoing tensions between free speech and social media, with Twitter bans, lawsuits and enduring questions about who gets to regulate content on the internet—or if it should be regulated at all. America is distinctly uncomfortable with the government stepping in to regulate speech. But public pressure has forced Big Tech to fill the void, spurring claims of unfair treatment and violations of First Amendment rights.

At the heart of the matter: unlike other countries that have laws against hate speech and fake news, America seems to have left it up to private companies to decide what content is acceptable, with little legal obligation to explain their choices.  Compounding the problem is what some argue is the enormous power that a few big tech companies wield over our online infrastructure and channels of communication, leaving some to wonder if service providers like Facebook should really be treated more like a utility, with government regulations to match.

Are there restrictions or laws regulating online content?

In Reno v. American Civil Liberties Union, the U.S. Supreme Court declared speech on the Internet equally worthy of the First Amendment’s historical protections.  That means pornography, violent films, and explicit racism are all fair game on social media in the eyes of the law.  The government deems only very narrow categories of speech criminal, such as “true threats,” or language that is explicitly intended to make an individual or group fear for their life or safety.  It is interesting to note, however, that arguing a politician should be shot wouldn’t necessarily meet the criteria for incitement or a true threat.

As of late, America has held tightly to an interpretation of the First Amendment that protects the free marketplace of ideas, even when it comes at a cost.  Landmark cases like Brandenburg v. Ohio, which protected the speech of a Ku Klux Klan leader, have solidified our particularly high bar for punishing inflammatory speech.

But America has also supported the rights of private companies to decide what kind of speech is appropriate in their venues and by extension, virtual squares. Unlike most of the world, where ISPs are subject to state mandates, content regulation in the United States mostly occurs at the private or voluntary level. Social media companies are allowed to decide what their user policies are and are expected to self-regulate, creating internal speech policies that, in theory, protect against unfair censorship.

Beyond the social media companies themselves, the regulators and legal recourse that do exist present their own set of problems.  ICANN, the non-profit that controls contracts with internet registries (.com, .org, .info, etc.) and registrars (companies that sell domain names), has immense power over who gets to claim a domain name, and ICANN decisions are not subject to speech claims based on the First Amendment.  The Digital Millennium Copyright Act (DMCA), designed to offer anti-piracy protections, is often used as a tool of intimidation or as a means for companies to keep tight control over how consumers use their copyrighted works, stifling free speech in the process.  Apple, for example, tried to use the DMCA in 2009 to shut down members of the online forum BluWiki who were discussing how to sync music playlists between iPods and iPhones without having to use iTunes.  John Deere refuses to unlock its proprietary tractor software to let farm owners repair their own vehicles, leaving tractor owners in fear of DMCA lawsuits if they try to crack the software protections themselves.

The Growing Pressure to Regulate Content

In the absence of legal pressure, public opinion seems to be the real driver of online content regulation. It was a tipping point of public outrage that finally pushed big tech to ban the president and Parler. Apple pulled Tumblr from the App Store in 2018 because it was failing to screen out child sex abuse material, but only after multiple public complaints. After decades of proudly promoting free speech, regardless of the consequences, external pressures are now forcing companies like Facebook to police their domains, using legions of reviewers to flag harmful content.

While the world grapples with how to manage online speech, it’s clear that businesses will continue to face a variety of legal, social, and moral pressures regarding the content they provide or facilitate—and they must be prepared to monitor and account for what goes on in their virtual public spaces. Companies that allow for the posting of content – words, photos, videos – have a slew of laws to consider in allowing this practice, including free speech rights and controls. Companies should work with sophisticated and experienced tech legal counsel, like Beckage, to address these issues.


Virginia, Oklahoma, and Florida Join Growing List of States With Proposed Privacy Legislation

Since California’s Consumer Privacy Act (CCPA) was passed in 2018, Beckage has seen a slew of other states follow suit in proposing and enacting their own comprehensive data privacy bills. Most recently, lawmakers in Virginia, Oklahoma, and Florida have joined the growing list of states with proposed privacy bills. So far this year, New York, Washington, and Minnesota have also introduced legislation governing the ways companies collect, store, use, and share consumer data and we expect to see other laws emerging in the coming months with still no federal data privacy bill in sight.  

Working with experienced privacy counsel can help businesses build out data privacy programs that stand the test of time and contemplate emerging legislation.

Below is an overview of the Virginia and Oklahoma proposed bills, their requirements, and their potential impact on the data privacy landscape. 

Virginia Consumer Data Protection Act (SB 1392) 

The Virginia proposal is quickly moving through the Virginia state legislature and is likely to be the next comprehensive state data privacy law on the books. This bill passed the Virginia House of Delegates on January 29th by a wide margin and was unanimously approved in the Senate on February 3rd. Assuming Governor Northam signs it into law, the Virginia Consumer Data Protection Act is set to go into effect on January 1, 2023. 

Who Does It Apply To? 

Companies that conduct business in Virginia or “produce products or services that are targeted to” Virginians would have to comply with the Virginia Consumer Data Protection Act if they: 

  • Control or process the personal data of at least 100,000 Virginians; or 
  • Control or process the personal data of at least 25,000 Virginians and derive over 50% of their gross revenue from the sale of that data. 

The legislation does provide exemptions for financial institutions governed by the Gramm-Leach-Bliley Act, entities subject to HIPAA or HITECH, non-profits, and educational institutions.

What Is Included? 

Included in this Bill are several requirements not covered under the CCPA or any other U.S. privacy law. One such obligation requires entities that control personal data to conduct protection assessments of any activities that use personal data for specific purposes, such as targeted advertising. These data protection assessments may be requested and evaluated by the attorney general to ensure compliance. 

This Act would afford Virginia consumers several rights regarding their personal data, including the right to opt out of the sale of their information or its use for targeted advertising or profiling.  It would also allow consumers, upon request, to delete their data, move their data, correct inaccuracies in their data, and confirm whether their data is being processed.

Notably missing is a private right of action through which consumers could seek damages for alleged violations. Instead, enforcement of the Act would be left exclusively to the attorney general, who may seek up to $7,500 per violation. 

Oklahoma Computer Data Privacy Act (HB 1602) 

Introduced on January 19, 2021 by Representatives Josh West (R) and Collin Walke (D), this Bill has bipartisan support in the Oklahoma House of Representatives. Its intended purpose is to give Oklahomans more online privacy by taking aim at tech companies. If passed, the Oklahoma Computer Data Privacy Act would go into effect on November 1, 2021. 

Who Does It Apply To? 

If passed, this act would apply to companies that operate in the state of Oklahoma and collect Oklahomans’ personal information or have information collected on their behalf, determine the purpose for and means of processing that information, and satisfy one of the following thresholds:

  • Have annual gross revenue exceeding $10 million; 
  • Buy, sell, receive, or share for commercial purposes the personal information of 50,000 or more consumers, households, or devices annually; or 
  • Derive 25% or more of their annual revenue from the sale of personal data. 

What Is Included? 

Companies subject to this legislation would be required to disclose what personal information they hold on a consumer and allow for the deletion of that information upon the consumer’s request. This proposal also mandates consumers opt-in to providing their personal data, which differentiates it from most other state privacy laws, like the CCPA. The Oklahoma Computer Data Privacy Act also differs from the CCPA in its inclusion of a broad private right of action through which Oklahoma residents could seek damages up to $7,500 for violations. 

Florida House Bill 969 (HB 969) 

Introduced on February 15th by Representative Fiona McFarland (R), House Bill 969 would place several requirements on businesses that deal with Florida residents’ private information. If passed, it would go into effect on January 1, 2022. 

Who Does It Apply To? 

For-profit companies that do business in Florida and collect personal information about consumers, have personal information collected on their behalf, or determine the purpose and means of processing personal information will have to comply with this Bill’s requirements if they satisfy one of the following thresholds: 

  • Have annual gross revenue exceeding $25 million; 
  • Buy, sell, receive, or share for commercial purposes the personal information of 50,000 or more consumers, households, or devices annually; or 
  • Derive 50% or more of their annual revenue from the sale of personal data. 

What Is Included? 

HB 969 would require that applicable businesses notify consumers about their data collection and selling practices before or at the point of data collection. Under this Bill, consumers would also have the right to request their data be disclosed, corrected, or edited and the right to opt-out of having their personal information disclosed or sold to a third party. 

Applicable businesses would be required to implement reasonable security protocols to protect consumers’ personal data.  Also included is a private right of action through which a consumer “whose nonencrypted and nonredacted personal information or e-mail addresses are subject to unauthorized access” may seek damages for violations of the Bill.  The Department of Legal Affairs would be authorized to bring other enforcement actions, with penalties of up to $2,500 per unintentional violation and $7,500 per intentional violation. 

Potential Impact 

Currently, the data privacy landscape in the United States is a patchwork of enacted and proposed laws, each with its own requirements and consumer rights, creating a confusing web for companies operating in more than one jurisdiction.  While advocates of these state privacy laws argue for the protection of consumers’ data in an increasingly digitally driven world, opponents argue that the potential risk of operating within states that have enacted comprehensive privacy laws may deter businesses from expanding their operations there. 

A federal privacy law that could rectify the many differences between individual state laws would simplify this landscape, making it easier for companies to protect their consumers’ data and operate efficiently while complying with regulations.  

Beckage is closely monitoring these and other emerging privacy laws.  In the meantime, companies that collect personal data should start thinking about privacy compliance by conducting a baseline privacy assessment and beginning to develop relevant policies and procedures.  Beckage attorneys, who are also technologists and certified privacy professionals, are happy to help counsel your business on compliance with the CCPA, GDPR, and other pending and enacted privacy legislation.  We work with clients of all sizes to build out data privacy programs and address compliance matters. 
