Wengui v. Clark Hill PLC: Another Decision Addresses the Application of Attorney Client Privilege in Incident Response

Last week, the District of Columbia federal court added to the growing body of case law on the privilege afforded to forensic reports generated in response to cyber incidents.  The ruling found that such a forensic report (or other compliance-related investigation summary) is not privileged if it “would have been created in the ordinary course of business irrespective of litigation.”  See Wengui v. Clark Hill PLC, 2021 WL 106417, at *1 (D.D.C. Jan. 12, 2021).

In this matter, the Plaintiff sought the work-product-protected and arguably privileged report that security-consulting firm Duff & Phelps created for the Defendant’s counsel.  Although the Defendant argued that the report was created in anticipation of litigation and provided defense counsel with information about how the cyberattack unfolded, the Court found that the report was neither attorney-client privileged nor attorney work product, as it was created in the “ordinary course” of the response any business that suffered a cyberattack would follow.  As the Court ruled, it was a “necessary business function regardless of litigation or regulatory inquiries.”  Id. at *2.

How the Defendant Argued for Attorney Work-Product Privilege & Attorney-Client Privilege

The Defendant’s argument for maintaining work-product protection, i.e., that the forensic report was created to aid counsel’s understanding of the attack in anticipation of litigation, rested on its use of parallel investigations.  The Defendant explained that two investigations unfolded in response to the breach: (1) a business-continuity-oriented response, for which a cybersecurity vendor was retained to “investigate and remediate” the cyberattack; and (2) a litigation-oriented response, in which litigation counsel retained a firm “for the sole purpose” of “gathering information necessary to render timely legal advice.”  Id. at *3.  The Court did not find this persuasive.  Additionally, the Defendant argued that the work the consultant provided to the Defendant’s counsel constituted privileged communication because it translated the incident into a digestible report for the attorney.  Id. at *5.

The Court’s Analysis

While the Defendant argued that the parallel-investigation path is well-worn and that generating a protected report for litigation is separate from a business-continuity report, the Court’s careful review of the record is a reminder of how key factual details and steps can impact an argument over privilege.  For instance, the Court noted that the Defendant claimed its understanding of the root cause and progression of the attack was “based solely on the advice of outside counsel and consultants retained by outside counsel.”  Id.  Furthering that analysis, the Court noted that no evidence suggests the second, litigation-oriented investigation “produced any findings, let alone a comprehensive report like the one produced” about the root cause of the breach.  Id.  The distribution of the root-cause business-continuity report also worked against the Defendant in the Court’s analysis, as it suggested the report was the one document with the “recorded facts” of the incident.  Id. at *4.  Additionally, the Court found that the record suggested the Defendant relied on the work of the business-continuity investigation “instead of, rather than separate from or in addition to” the litigation-oriented investigation.  Id.  The Court built on existing case law, including Capital One, on the basis that the report was used for non-litigation purposes and the Defendant did not meet its burden of demonstrating that a substantially similar report would not have been produced in the absence of litigation.  Id. at *5.

In considering the attorney-client privilege argument, the Court declined to extend the privilege to all manner of services or to attach it to any third-party report made at an attorney’s request.  The Court instead reviewed the factual record and concluded that Defendant’s counsel used the security firm for its “expertise in cybersecurity, and not in obtaining legal advice,” based on an in camera review of the report and the Court’s observation that it “provides not only a summary of the firm’s findings, but also pages of specific recommendations on how [Defendant] should tighten its cybersecurity.”  Id.

What Now?

This ruling shows how steps taken in the immediate response to a cyberattack can echo significantly into later litigation.  The greatest takeaway may be the Court’s acknowledgement that “[a]lthough [Defendant] papered the arrangement through its attorneys, that approach ‘appears to [have been] designed to help shield material from disclosure’ and is not sufficient in itself to provide work-product protection.”  Id. at *4.  The Court’s ruling suggests that the use of parallel investigations is not itself at issue, but that parallel investigations must be genuine and must produce reports oriented to their stated purposes.  Counsel thus should consider such steps when assigning responsibilities in response to a cyberattack.  Additionally, the substance and distribution of the generated report(s) can signal to a court the presence or absence of legal assistance versus security and business-continuity advice.  Under this ruling, a report heavy on recommendations and distributed widely can defeat both attorney-client privilege and work-product protection, and IR counsel should take note when engaging third-party incident response firms.

In any incident, it is important to work with sophisticated and experienced tech counsel.  The attorneys at Beckage have years of experience responding to large-scale data breaches and can help provide the guidance needed at every stage of a data incident.

*Attorney Advertising.  Prior results do not guarantee future outcomes.

BIPA Illinois Biometric Law Sets the Stage for Biometric Litigation

COVID-19 is accelerating company adoption of biometric technologies.  With a global shift toward remote working, biometric technologies, which measure physiological, behavioral, and psychological characteristics, can promote, or at least monitor, productivity by recording employee performance.  Facial recognition biometric systems have also been vital to contactless engagement, especially in the airline and retail sectors, and such systems are likely to remain after the pandemic subsides.  This burgeoning biometric industry is garnering interest from lawmakers.  Given the firm’s technology-driven focus, Beckage has been tracking biometric laws and will continue to monitor legal and business developments surrounding biometric technologies.

Biometric Data and the Law

Unlike other personal data, such as passwords, social security numbers, and payment card information, biometric identifiers cannot easily be changed once breached.  Because they are immutable by nature, regulations classify them as a sensitive class of personal data.  Notable laws that govern biometric data include the E.U. General Data Protection Regulation (GDPR) and U.S. state laws, including California’s comprehensive privacy law.  Three states, Illinois, Texas, and Washington, have passed biometric-specific laws.  New York State recently introduced the Biometric Privacy Act, a bill that is nearly identical to Illinois’ BIPA, and other states, such as Arkansas and California, have amended their breach notification laws to include biometric data as personal identifying information.

The first step to knowing whether biometric regulations apply to your business is understanding the definition of biometric data.  The GDPR defines biometric data as “personal data resulting from specific technical processing relating to the physical, physiological or behavioral characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data.”  Art. 4(14).  Similarly, U.S. biometric laws protect biometric data characterized in terms of personal identifiers, including retina scans, iris scans, fingerprints, voiceprints, hand scans, and face geometry.  For example, the Illinois Biometric Information Privacy Act (BIPA) defines biometric information as “any information, regardless of how it is captured, converted, stored, or shared, based on an individual’s biometric identifier used to identify an individual.”  Sec. 10.
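
One way to picture BIPA’s structure is as an enumerated list of “biometric identifiers” paired with express statutory exclusions.  The minimal Python sketch below is an orientation aid only, not a classification tool; the statutory text and case law control, and the example data types are simplified labels rather than statutory terms.

```python
# Simplified illustration of BIPA Sec. 10: enumerated "biometric
# identifiers" alongside express exclusions. For orientation only.

BIOMETRIC_IDENTIFIERS = {
    "retina scan", "iris scan", "fingerprint",
    "voiceprint", "hand geometry scan", "face geometry scan",
}

EXPRESS_EXCLUSIONS = {
    "writing sample", "written signature", "photograph", "demographic data",
}

def is_bipa_identifier(data_type: str) -> bool:
    if data_type in EXPRESS_EXCLUSIONS:
        # Photographs themselves are excluded, though plaintiffs have
        # argued that face geometry scanned from a photograph is covered.
        return False
    return data_type in BIOMETRIC_IDENTIFIERS

print(is_bipa_identifier("fingerprint"))  # True
print(is_bipa_identifier("photograph"))   # False (expressly excluded)
```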

U.S. Biometric Litigation Trends

Recent rulings in biometric litigation indicate that BIPA currently drives the legal landscape of biometric data protection in the U.S.  BIPA litigation is on the rise following the Illinois Supreme Court’s 2019 decision in Rosenbach v. Six Flags.  The plaintiff in Rosenbach was the mother of a minor whose fingerprint was captured to verify his identity for entry to an amusement park owned by the defendant.  The Court rejected the defendant’s argument that the plaintiff had not suffered any actual or threatened harm and held that a plaintiff can sue based on a mere technical violation of the law.  This decision means that a person does not have to suffer actual harm to pursue a biometric suit under BIPA.  Further, federal courts have agreed that failure to implement privacy policies outlining procedures for the collection, retention, and destruction of biometric identifiers is sufficient to demonstrate a violation of the law.  For example, in May 2020, the Seventh Circuit in Bryant v. Compass found the Rosenbach ruling instructive, holding that the plaintiff could pursue a lawsuit against a vending machine operator whose machines, installed at her workplace, used biometric authentication in lieu of credit card payments.

The types of companies involved in BIPA litigation are diverse.  Any company that collects, stores, or uses biometric information relating to Illinois residents is subject to BIPA.  To that end, no industry seems immune: plaintiffs have sued big tech companies using facial recognition technologies as well as smaller companies, such as nursing homes, using fingerprinting systems for timekeeping.  The Bryant ruling illustrates that third-party vendors who provide biometric authentication systems in the workplace are within the reach of BIPA.

The diversity in cases signals the legislative impact of the law and spotlights the role of privacy policies and procedures.  BIPA is the only biometric law in the U.S. that allows individuals to sue a company for damages, in amounts ranging from $1,000 per negligent violation to $5,000 per intentional or reckless violation.  Thus, the stakes can be high for companies without proper biometric data governance.
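
To see how quickly per-violation damages can compound, consider the rough, purely hypothetical calculation below.  The headcount and scan frequency are invented for illustration, and whether each individual scan counts as a separate violation is itself a contested question in BIPA litigation.

```python
# Hypothetical BIPA exposure estimate. Statutory damages under Sec. 20:
# $1,000 per negligent violation, $5,000 per intentional or reckless one.
# All counts below are invented for illustration only.

employees = 200        # Illinois employees using a fingerprint time clock
scans_per_day = 2      # clock-in and clock-out
work_days = 250        # roughly one working year

scans = employees * scans_per_day * work_days   # 100,000 scans
negligent_exposure = scans * 1_000              # $100,000,000
reckless_exposure = scans * 5_000               # $500,000,000

print(f"Scans potentially treated as violations: {scans:,}")
print(f"Exposure if negligent: ${negligent_exposure:,}")
print(f"Exposure if reckless:  ${reckless_exposure:,}")
```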

What should companies do?

To comply with BIPA and other evolving biometric laws, companies should work with experienced lawyers who understand biometric technologies and regulations to address the following controls and practices (a record-keeping sketch follows this list):

  • Properly inform individuals or responsible parties about the purpose of collecting their biometric data.
  • Properly inform individuals or responsible parties about the company’s biometric collection, retention, storage, and dissemination policies and procedures.
  • Obtain written consent from individuals or their responsible party before collecting biometric data.
  • Make the company’s written biometric policy, establishing a retention schedule and destruction guidelines, publicly available.
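
As a concrete, purely illustrative picture of what tracking these obligations might look like, here is a minimal Python sketch.  The field and function names are hypothetical assumptions, not statutory terms; only the three-year destruction backstop is drawn from BIPA Sec. 15(a).

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

# Hypothetical record-keeping model for BIPA-style obligations.
# Names are illustrative, not statutory; confirm specifics with counsel.

@dataclass
class BiometricConsentRecord:
    subject_id: str                    # internal ID; never store the biometric here
    purpose: str                       # disclosed purpose (e.g., "timekeeping")
    policies_disclosed: bool           # collection/retention/storage policies provided
    written_consent_at: Optional[datetime] = None   # signed written release
    last_interaction_at: Optional[datetime] = None  # drives the destruction schedule

    def may_collect(self) -> bool:
        # No disclosure or no written release means no collection.
        return self.policies_disclosed and self.written_consent_at is not None

    def destruction_due(self) -> Optional[datetime]:
        # BIPA Sec. 15(a): destroy when the initial purpose is satisfied or
        # within 3 years of the individual's last interaction, whichever is
        # first. This sketch models only the three-year backstop.
        if self.last_interaction_at is None:
            return None
        return self.last_interaction_at + timedelta(days=3 * 365)
```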

A robust biometric compliance program should reflect current laws and be flexible and scalable enough to adapt to the changes that new biometric legal rules will inevitably bring to companies’ privacy compliance programs.  Beckage’s lawyers, who are also technologists, are equipped with the skills and experience to build a robust biometric compliance program.  We stand ready to answer any of your questions.

*Attorney Advertising.  Prior results do not guarantee future outcomes.

Parler v. Amazon Web Services – The Ongoing Conversation Surrounding Social Media, Big Tech, and Freedom of Speech

As the fallout from last week’s attack on the Capitol continues to be front page news, big questions surround big tech’s role as the arbiter of acceptable online speech.

After Facebook suspended President Trump’s account indefinitely and Twitter shut him down permanently, YouTube announced Wednesday that it would freeze the president’s account for a week, citing concerns over the ongoing potential for violence.

Apple, Google, and Amazon have also pulled the plug on Parler, a social network that has become increasingly popular with conservatives in recent months, with a reputation for allowing content that would not be tolerated on other channels, including numerous calls for violence.  Parler has responded by filing a lawsuit against Amazon, claiming that Amazon Web Services (AWS) violated antitrust laws and breached its contract by not providing 30 days’ notice of cancellation.

In the 18-page complaint, filed in the U.S. District Court for the Western District of Washington, Parler argues that the decision to suspend its account “is apparently motivated by political animus” and designed to “reduce competition in the microblogging services market to the benefit of Twitter,” which recently signed a long-term deal with AWS and stands as one of Parler’s main competitors.  The suit includes claims for breach of contract, tortious interference, and violation of antitrust law, alleging that Amazon failed to take similar action against Twitter’s account even though it included similar rhetoric.  Parler is seeking a temporary restraining order to prevent Amazon from removing the social platform from its servers and to prevent what it says will be irreparable harm to its business.

Can Amazon really do that? What about the First Amendment?

The suit also comes as tensions over alleged First Amendment violations remain high.  It’s well established that the First Amendment limits the government’s ability to restrict people’s speech, not private businesses’ ability to do so.  Stated differently, the First Amendment applies only to public actors and spaces, not private ones, such as a social media platform.  But not so fast: in 1980, the Supreme Court in PruneYard Shopping Center v. Robins held that a shopping mall owner could not exclude a group of high school students who were engaged in political advocacy in quasi-public spaces of a private shopping mall.  The Court accepted the argument that it was within California’s power to guarantee this expansive free speech right because it did not unreasonably intrude on the rights of private property owners.  Likewise, in 2017, the Supreme Court in Packingham v. North Carolina held that the First Amendment prohibited the government from banning sex offenders from social media websites, implicitly finding social media to be a public space.  Whether Twitter and other social media spaces where people congregate, along with their associated cloud servers, are “public” and deserving of First Amendment protections is therefore not a clear-cut question.

For its part, Amazon claims it was well within its rights to drop Parler after the platform failed to promptly identify and remove content encouraging or inciting violence against others, a direct violation of Amazon’s terms of service.  According to court documents, Amazon says it reported more than a hundred examples of such violative content to Parler in just the past several weeks.  In its official response to Parler’s restraining order request, AWS states that this “case is not about suppressing speech or stifling viewpoints.  It is not about a conspiracy to restrain trade.  Instead, this case is about Parler’s demonstrated unwillingness and inability to remove from the servers of Amazon Web Services (‘AWS’) content that threatens public safety.”

Most experts see Amazon’s decision to remove Parler as legitimate, and the microblogger will have a steep climb arguing against what appear to be clear violations of terms.  It’s also not without precedent: Cloudflare, a company that provides tools to help websites protect against cyberattacks and load content more quickly, made a similar decision after facing pressure to drop The Daily Stormer, a neo-Nazi hate site, from its service after the deadly riots in Charlottesville in 2017.  It later dropped 8chan, a controversial forum linked to several deadly attacks, including those in El Paso, Texas and Christchurch, New Zealand.

What does this mean for businesses, consumers, and the future of social media?

While this case was born out of a national crisis, there is little incentive, and less legal standing, for businesses to start an online political witch hunt.  As Amazon stated in its response to Parler, “AWS has no incentive to stop doing business with paying customers that comply with its agreements.”

But while Amazon and others are arguably on solid legal ground in their choice to drop Parler or block the president, these decisions bring up much larger questions about how we ended up with a few huge companies holding immense power over the trajectory of public discourse.

In many ways, the Constitution and our legal frameworks have not caught up to the pace, scope, and influence of online and social media. There’s not a lot of legal guidance on how tech companies or third-party vendors should treat illegal or inflammatory content posted on their networks or produced with their tools. Lawmakers are also grappling with how much responsibility should fall on social behemoths, like Facebook, that produce and house immense amounts of online content, but are not treated like traditional publishers under the law.

This is certainly both a landmark moment and a moment of reckoning for digital media consumers and providers. It’s too soon to tell how this will push transformation in the tech world and the digital town square of social media, but we’ll be following the conversation closely.

*Attorney Advertising.  Prior results do not guarantee future outcomes.

FTC & EverAlbum Inc. Settlement Clarifies Privacy Standards for Facial Recognition Technology

One of Beckage’s 2021 privacy predictions is the continued rise of biometric lawsuits and legislation, even outside Illinois’ BIPA.  Case in point: a recent consent agreement between the Federal Trade Commission and EverAlbum, a California company, concerning its use of photo-tagging and facial recognition technologies.

The Claims Against EverAlbum Inc.

In its complaint, the FTC alleges that EverAlbum, Inc. violated Section 5 of the Federal Trade Commission Act by making several misrepresentations concerning its App’s use of facial recognition technology (FRT).  Specifically, the FTC alleged that:

  • EverAlbum’s facial recognition feature was on by default.  In February 2017, EverAlbum launched a new feature in the Ever App, called “Friends,” that used facial recognition technology to group users’ photos by the faces of the people who appear in them and allowed users to “tag” people by name.  EverAlbum allegedly enabled facial recognition by default for all mobile app users when it launched the “Friends” feature.
  • EverAlbum falsely claimed that users must affirmatively activate FRT.  Between July 2018 and April 2019, EverAlbum allegedly represented that it would not apply facial recognition technology to users’ content unless users affirmatively chose to activate the feature.  Although, beginning in May 2018, the company allowed some Ever App users (those located in Illinois, Texas, Washington, and the European Union) to choose whether to turn on the face recognition feature, the feature was automatically active for all other users until April 2019 and could not be turned off.
  • EverAlbum used users’ images to create a larger dataset to develop its FRT, and sold FRT services to enterprise clients. Between September 2017 and August 2019, EverAlbum combined millions of facial images that it extracted from users’ photos with facial images that EverAlbum obtained from publicly available datasets to create datasets for use in the development of its facial recognition technology. The complaint alleges that EverAlbum used the facial recognition technology resulting from one of those datasets to provide the Ever App’s “Friends” feature and also to develop the facial recognition services sold to its enterprise customers without disclosing this to users.
  • EverAlbum failed to delete photos from deactivated accounts.  EverAlbum is also alleged to have promised users that the company would delete the photos and videos of users who deactivated their accounts.  The FTC alleges, however, that until at least October 2019, EverAlbum failed to delete the photos or videos of any users who had deactivated their accounts and instead retained them indefinitely.

FTC v. EverAlbum Inc. Settlement Agreement

In the consent agreement, the FTC requires EverAlbum to:

  • Delete Certain User Information: Specifically, within 30-90 days of the agreement, EverAlbum must delete:
    1. The photos and videos of Ever App users who deactivated their accounts;
    2. All face embeddings (data reflecting facial features that can be used for facial recognition) that the company derived from the photos of users who did not give express consent to that use; and
    3. Any facial recognition models or algorithms developed with EverAlbum users’ photos or videos.  (A short sketch after this list illustrates why embeddings and models require deletion separate from the photos themselves.)
  • Make Clear and Conspicuous Disclosures: EverAlbum must clearly and conspicuously disclose to the user from whom the respondent has collected the biometric information, separate and apart from any Privacy Policy, Terms of Use page, or other similar document, all purposes for which respondent will use, and to the extent applicable, share, the biometric information.
  • Obtain Affirmative Express Consent from Users: EverAlbum must obtain affirmative express consent from users whose biometric information is collected.
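
The deletion requirements are broader than they may first appear: a face embedding is an artifact separate from the photo it was computed from, and a trained model is separate again.  The minimal Python sketch below, with hypothetical names and structures throughout, illustrates why deleting the photos alone would not satisfy the order.

```python
from dataclasses import dataclass, field

# Illustrative only: names and structures are hypothetical, not drawn
# from the FTC order or EverAlbum's systems.

@dataclass
class UserAccount:
    user_id: str
    photos: list = field(default_factory=list)      # raw images
    embeddings: list = field(default_factory=list)  # vectors derived from the photos

def delete_on_deactivation(account: UserAccount, trained_models: list) -> None:
    # 1. Delete the user's photos and videos.
    account.photos.clear()

    # 2. Delete derived face embeddings: even with the source photos
    #    gone, embeddings can still be used to recognize the user.
    account.embeddings.clear()

    # 3. "Algorithmic disgorgement": models trained on the user's data
    #    must be deleted (or rebuilt without that data), not just the
    #    inputs they were trained on.
    trained_models[:] = [
        m for m in trained_models
        if account.user_id not in m.get("trained_on", [])
    ]
```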

Potential Application of EverAlbum Settlement

The FTC v. EverAlbum Inc. settlement sets a de facto standard for businesses that collect biometric information from consumers in the United States.  Companies that use biometric data or facial recognition technology should observe the following takeaways from this settlement:

First, the settlement makes clear that facial recognition technology used on photographs is a regulated biometric practice. This is somewhat unclear under the Illinois BIPA statute, where defendants have argued that photographs are exempt from the law.

Next, as a de facto standard, the FTC is requiring that businesses make clear and conspicuous disclosures regarding their biometric practices.  The agreement defines clear and conspicuous as “difficult to miss (i.e., easily noticeable)” and easily understandable by ordinary consumers, including in all the following ways:

  • In any communication that is solely visual or solely audible, the disclosure must be made through the same means through which the communication is presented. In any communication made through both visual and audible means, such as a television advertisement, the disclosure must be presented simultaneously in both the visual and audible portions of the communication, even if the representation requiring the disclosure (“triggering representation”) is made through only one means.
  • A visual disclosure, by its size, contrast, location, the length of time it appears, and other characteristics, must stand out from any accompanying text or other visual elements so that it is easily noticed, read, and understood.
  • An audible disclosure, including by telephone or streaming video, must be delivered in a volume, speed, and cadence sufficient for ordinary consumers to easily hear and understand it.
  • In any communication using an interactive electronic medium, such as the Internet or software, the disclosure must be unavoidable.
  • The disclosure must not be contradicted or mitigated by, or inconsistent with, anything else in the communication.

Third, as a de facto standard, the FTC is requiring that businesses that collect biometric information (such as photographs used for FRT) obtain affirmative express consent from users before doing so.  Although undefined in the agreement, in other contexts affirmative express consent may be accomplished through a written release or digital signature (BIPA), or through an affirmative opt-in pop-up presented for the specific purpose of making the biometric disclosure and obtaining consent.  A consent-gating sketch follows.
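
In engineering terms, affirmative express consent inverts the default the FTC complaint challenged: the feature ships off for everyone and turns on only after an explicit, recorded opt-in.  Here is a minimal Python sketch of that pattern; the function names and storage are hypothetical, not any vendor’s actual API.

```python
from datetime import datetime, timezone

# Hypothetical consent gate: facial recognition is OFF by default and is
# enabled per user only after an affirmative, timestamped opt-in.

consent_log = {}  # user_id -> opt-in timestamp

def record_opt_in(user_id: str) -> None:
    # Call this only from an explicit, unavoidable opt-in control
    # (e.g., a dedicated pop-up), never from a pre-checked default.
    consent_log[user_id] = datetime.now(timezone.utc)

def face_recognition_enabled(user_id: str) -> bool:
    # Default-deny: no recorded consent means no facial recognition.
    return user_id in consent_log

def withdraw_consent(user_id: str) -> None:
    # Consent must be revocable; downstream artifacts may also need
    # deletion (see the deletion requirements discussed above).
    consent_log.pop(user_id, None)
```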

Recommended Next Steps

Beckage recommends that all companies collecting biometric information, including through facial recognition technology, take several proactive steps in the wake of the EverAlbum settlement.

  1. Evaluate your existing privacy policy disclosures to confirm you are in compliance with the EverAlbum requirements and to make requisite clear and conspicuous disclosures regarding the collection of biometric information and use of facial recognition technology/photo-tagging.
  2. Evaluate the use of pop-ups and opt-ins or written releases to obtain affirmative express consent for FRT practices in the United States (note, in IL, a written release is required).
  3. Evaluate default settings and photo and biometric information deletion practices to ensure compliance with the EverAlbum settlement requirements.

Emerging technologies present opportunities for companies to better engage their customers, but they also create new data privacy concerns.  With some states looking to implement biometric privacy laws mimicking Illinois’ Biometric Information Privacy Act (BIPA), including New York’s Biometric Privacy Act (AB 27), companies collecting and using biometric technology, like FRT, should consult legal tech counsel to evaluate compliance with these emerging laws.  Beckage attorneys, who are also technologists and former tech business owners, have years of collective experience with new technologies like artificial intelligence, biometric data, and facial recognition technology.  Our team can help your company implement emerging technologies and mitigate the associated risks.

*Attorney Advertising.  Prior results do not guarantee future outcomes.

OCR Continues its Focus on Patient Access Rights

The Beckage Health Law team continues to monitor OCR developments that relate to patient access rights.  In 2020, it became clear that patients’ right of access to records is a significant priority of the Office for Civil Rights (OCR) under the Department of Health and Human Services (HHS).  Just last month, OCR reported on a settlement, audit results, and proposed rules, all focused on patient access to records.

For example, on December 22nd, OCR announced the settlement of its 13th investigation focused on health records access.  The investigation followed a patient complaint to the OCR after the patient was unable to obtain records from his primary care provider on two separate occasions in 2019.  Emphasizing the importance of workforce training and documentation, the OCR issued a $36,000 fine and required the provider to update its Designated Record Set Policy as part of the Corrective Action Plan. 

In December, we also saw the release of an audit report on health industry compliance for audits conducted during 2016-2017.  The December 17, 2020 report reveals findings from audits of randomly selected covered entities and business associates.  Of note, most organizations failed to include appropriate content, in plain language, in their Notices of Privacy Practices, often omitting content related to individual rights.  Moreover, the report notes that many entities did not have appropriate policies, procedures, and documentation to demonstrate compliance with the rules governing how to respond to requests for records.

Finally, as described more fully in Beckage’s recent blog post about the HHS proposed rules, OCR proposed amending the HIPAA Privacy Rule, including amendments to expand patients’ rights to access their records, increase transparency about these rights, and shorten providers’ time to respond to records requests.

These three developments reaffirm OCR’s strong commitment to enforce the patient access rules, which we expect will continue in 2021. 

Beckage health law attorneys work with hospitals, health care providers, and business associates to develop compliance programs tailored to mitigate risk.  Our team has significant experience in OCR enforcement matters and investigations.  We recommend that clients prioritize a review of their Notices of Privacy Practices as well as their patient access policies to help mitigate risk.  Reach out to our Beckage Health Law team for assistance analyzing these and other regulatory and legislative matters.

*Attorney Advertising.  Prior results do not guarantee future outcomes.
