
FTC & EverAlbum Inc. Settlement Clarifies Privacy Standards for Facial Recognition Technology

One of Beckage’s 2021 privacy predictions is the continued rise of biometric lawsuits and legislation, even outside Illinois’ BIPA. A case in point is a recent consent decree between the Federal Trade Commission and EverAlbum, Inc., a California company, concerning its use of photo-tagging and facial recognition technologies.

The Claims Against EverAlbum Inc.

In its complaint, the FTC alleges that EverAlbum, Inc. violated Section 5 of the Federal Trade Commission Act by making several misrepresentations concerning its app’s use of facial recognition technology (FRT). Specifically, the FTC alleged that:

  • EverAlbum’s facial recognition feature was on by default. In February 2017, EverAlbum launched a new feature in the Ever App, called ‘Friends,’ that used facial recognition technology to group users’ photos by the faces of the people who appear in them and allowed users to “tag” people by name. EverAlbum allegedly enabled facial recognition by default for all mobile app users when it launched the ‘Friends’ feature.
  • EverAlbum falsely claimed that users must affirmatively activate FRT. Between July 2018 and April 2019, EverAlbum allegedly represented that it would not apply facial recognition technology to users’ content unless users affirmatively chose to activate the feature. Although, beginning in May 2018, the company allowed some Ever App users—those located in Illinois, Texas, Washington, and the European Union—to choose whether to turn on the feature, facial recognition was automatically active for all other users until April 2019 and could not be turned off.
  • EverAlbum used users’ images to create a larger dataset to develop its FRT, and sold FRT services to enterprise clients. Between September 2017 and August 2019, EverAlbum combined millions of facial images that it extracted from users’ photos with facial images that EverAlbum obtained from publicly available datasets to create datasets for use in the development of its facial recognition technology. The complaint alleges that EverAlbum used the facial recognition technology resulting from one of those datasets to provide the Ever App’s “Friends” feature and also to develop the facial recognition services sold to its enterprise customers without disclosing this to users.
  • EverAlbum failed to delete photos from deactivated accounts. EverAlbum is also alleged to have promised users that the company would delete the photos and videos of users who deactivated their accounts. The FTC alleges, however, that until at least October 2019, EverAlbum failed to delete the photos or videos of any users who had deactivated their accounts and instead retained them indefinitely.

FTC v. EverAlbum Inc. Settlement Agreement

In the consent agreement, the FTC requires EverAlbum to:

  • Delete Certain User Information: Specifically, within 30-90 days of the agreement, EverAlbum must delete:
    1. The photos and videos of Ever App users who deactivated their accounts
    2. All face embeddings (data reflecting facial features that can be used for facial recognition purposes) that the company derived from the photos of users who did not give express consent to such use; for context, see the illustrative sketch after this list
    3. Any facial recognition models or algorithms developed with EverAlbum users’ photos or videos
  • Make Clear and Conspicuous Disclosures: EverAlbum must clearly and conspicuously disclose to users from whom it collects biometric information, separate and apart from any privacy policy, terms of use page, or other similar document, all purposes for which it will use, and to the extent applicable share, the biometric information.
  • Obtain Affirmative Express Consent from Users: EverAlbum must obtain affirmative express consent from users whose biometric information is collected.
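
For readers unfamiliar with the term, a face embedding is a numeric vector derived from an image that summarizes facial geometry. The minimal Python sketch below, using the open-source face_recognition library, illustrates the kind of derived data the order requires deleted. The file name is hypothetical and the example is illustrative only, not drawn from the settlement itself.

    # Illustrative only: what a "face embedding" looks like in practice,
    # using the open-source face_recognition library
    # (pip install face_recognition). "user_photo.jpg" is hypothetical.
    import face_recognition

    image = face_recognition.load_image_file("user_photo.jpg")

    # Each embedding is a 128-dimensional vector summarizing facial
    # features; this derived data, not just the photo itself, is what the
    # consent order requires companies to delete absent express consent.
    embeddings = face_recognition.face_encodings(image)

    for vector in embeddings:
        print(len(vector))  # 128 floats per detected face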

Potential Application of EverAlbum Settlement

The FTC v. EverAlbum Inc. settlement sets a de facto standard for businesses that collect biometric information from consumers in the United States. Companies that use biometric data or facial recognition technology should observe the following takeaways from this settlement:

First, the settlement makes clear that facial recognition technology applied to photographs is a regulated biometric practice. This point is less clear under the Illinois BIPA statute, where defendants have argued that photographs are exempt from the law.

Next, as a de facto standard, the FTC is requiring that businesses make clear and conspicuous disclosures regarding their biometric practices. The agreement defines “clear and conspicuous” as difficult to miss and easily understandable by ordinary consumers, including in all the following ways:

  • In any communication that is solely visual or solely audible, the disclosure must be made through the same means through which the communication is presented. In any communication made through both visual and audible means, such as a television advertisement, the disclosure must be presented simultaneously in both the visual and audible portions of the communication, even if the representation requiring the disclosure (“triggering representation”) is made through only one means.
  • A visual disclosure, by its size, contrast, location, the length of time it appears, and other characteristics, must stand out from any accompanying text or other visual elements so that it is easily noticed, read, and understood.
  • An audible disclosure, including by telephone or streaming video, must be delivered in a volume, speed, and cadence sufficient for ordinary consumers to easily hear and understand it.
  • In any communication using an interactive electronic medium, such as the Internet or software, the disclosure must be unavoidable.
  • The disclosure must not be contradicted or mitigated by, or inconsistent with, anything else in the communication.

Third, as a de facto standard, the FTC is requiring that businesses collecting biometric information (such as photographs used for FRT) obtain affirmative express consent from users before doing so. Although the term is undefined in the agreement, in other contexts affirmative express consent may be accomplished through a written release or digital signature (as under BIPA) or through an affirmative opt-in pop-up presented for the specific purpose of making the biometric disclosure and obtaining consent, as sketched below.
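
As a rough illustration of what such an opt-in flow might look like in application logic, the Python sketch below gates facial recognition on recorded, purpose-specific consent. The structure and names are hypothetical assumptions for illustration, not requirements drawn from the order.

    # Hypothetical sketch of an affirmative express consent gate for facial
    # recognition; all names are illustrative, not from the FTC order.
    from dataclasses import dataclass

    @dataclass
    class ConsentRecord:
        user_id: str
        purpose: str            # e.g., "facial_recognition_grouping"
        granted: bool           # True only after an explicit opt-in action
        disclosure_shown: bool  # standalone disclosure, separate from any
                                # privacy policy or terms of use

    def may_run_facial_recognition(record: ConsentRecord) -> bool:
        # Consent must be affirmative (opted in, never a pre-checked
        # default), purpose-specific, and preceded by the disclosure.
        return (record.granted
                and record.disclosure_shown
                and record.purpose == "facial_recognition_grouping")

    # Default is OFF: no biometric processing without an explicit opt-in.
    record = ConsentRecord("user-123", "facial_recognition_grouping",
                           granted=False, disclosure_shown=True)
    assert not may_run_facial_recognition(record)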

Recommended Next Steps

Beckage recommends all companies that collect biometric information, including facial recognition technology, take several proactive steps in the wake of the EverAlbum settlement.

  1. Evaluate your existing privacy policy disclosures to confirm you are in compliance with the EverAlbum requirements and to make requisite clear and conspicuous disclosures regarding the collection of biometric information and use of facial recognition technology/photo-tagging.
  2. Evaluate the use of pop-ups and opt-ins or written releases to obtain affirmative express consent for FRT practices in the United States (note that in Illinois a written release is required).
  3. Evaluate default settings and photo and biometric information deletion practices to ensure compliance with the EverAlbum settlement requirements.

Emerging technologies present opportunities for companies to better engage their customers, but they also create new data privacy concerns. With some states looking to implement biometric privacy laws mimicking Illinois’ Biometric Information Privacy Act (BIPA), including the proposed New York Biometric Privacy Act (AB 27), companies collecting and using biometric technology, like FRT, should consult legal tech counsel to evaluate compliance with these emerging laws. Beckage attorneys, who are also technologists and former tech business owners, have years of collective experience with new technologies like artificial intelligence, biometric data, and facial recognition technology. Our team can help your company implement emerging technologies and mitigate the associated risks.


*Attorney Advertising.  Prior results do not guarantee future outcomes.


Looking Back on 2020’s Top Privacy and Cybersecurity Trends

As 2020 comes to a close, Beckage looks back on the ways this difficult and unprecedented year impacted the data privacy and cybersecurity landscape both domestically and across the globe.

Enhanced Privacy Challenges and Concerns Due to Covid-19

In response to the COVID-19 pandemic, businesses around the globe pivoted to online or virtual operations early this year. Data protection law demands that safeguards be integrated up front into any network design or business practice, which requires an intentional focus on data protection and a solid understanding of the regulatory landscape. The increased exposure of company assets made it necessary to implement a variety of technical safeguards. Companies still had to meet the compliance milestones of the NY SHIELD Act and the California Consumer Privacy Act (CCPA) while dealing with new privacy challenges caused by a distributed workforce and a global health pandemic. Beckage reminds organizations of the importance of revisiting their readiness through business continuity planning, incident response, and more expansive administrative, technical, and physical safeguards when shifting to a work-from-home model, and recommends continued assessment of your company’s privacy pitfalls in this ever-shifting legal landscape.

Increased Ransomware and Cyberattacks

With rapid changes in organizational operations caused by the COVID-19 pandemic, attackers became more sophisticated in their strategies and unleashed several unrelenting, simultaneous attacks on service providers and the organizations they serve in 2020. Victims of recent cyber attacks, such as the SolarWinds campaign disclosed in December, include government agencies, healthcare providers, consulting agencies, and technology, telecom, and oil and gas companies. In many of these campaigns, attackers were able to gain access and move freely throughout an organization’s network, installing additional software, creating new accounts, and accessing sensitive data and valuable resources while remaining largely undetected. In response to the uptick in data incidents this year, the Beckage Incident Response Team recommends organizations implement several preventative steps to safeguard their organization and help minimize legal risk.

Patient Access Rights and Interoperability

Developments in 2020 concerning patients’ right to access health information, including proposed interoperability and record-access requirements, are intended to help patients obtain access to health records and payment data so they can make informed decisions about their healthcare. The CMS Proposed Rule and the OCR Proposed Rule represent a complete overhaul of well-established standards and introduce new, highly technical requirements for healthcare compliance. The experienced Health Law Team at Beckage can help distill these lengthy and complicated rules so organizations can understand their practical implications for daily operations.

Increased International Focus on Consumer Privacy

On the heels of the EU’s General Data Protection Regulation (GDPR), many countries followed suit by establishing legal frameworks governing how organizations collect, use, and store their citizens’ personal data. One example is Brazil’s Lei Geral de Proteção de Dados (LGPD), which went into effect in August 2020. This general data protection law, which closely mimics the GDPR, places strict requirements on organizations that process Brazilian citizens’ personal data.

At the same time, Europe continued to elevate its enforcement of the GDPR, with major decisions from various member state Data Protection Authorities, the European Court of Justice (ECJ), and the European Data Protection Board (EDPB). The most impactful for businesses across the globe was the ECJ’s decision in Schrems II, which invalidated the EU-US Privacy Shield and called into question the long-term viability of the Standard Contractual Clauses (SCCs) to transfer data from the EU to the US. In 2021, companies should closely monitor the evolving guidance on international data transfers and be prepared to mitigate the risks of global data transfers.

Beckage’s Global Data Privacy Team expects continued adoption of data protection regulations across many regions, and an emphasis on creating global security and privacy compliance programs in the year ahead.

Uptick in ADA Litigation

This past year, the Beckage Accessibility Team has witnessed a drastic increase in litigation under Title III of the Americans with Disabilities Act. On average, about eight new lawsuits are filed each day by disabled individuals alleging unequal access to goods and services provided on a company’s digital platforms. While the Department of Justice (DOJ) has consistently held that the ADA applies to websites and mobile apps, it has failed to clarify the precise requirements for a business to be deemed compliant. This has prompted a wave of litigation by plaintiffs who claim a website or mobile app’s incompatibility with assistive technology, like screen-reading software, has denied them full access to and equal enjoyment of the goods, services, and accommodations of the website, in violation of the ADA. Most of these lawsuits are settled quickly out of court to avoid litigating in such uncertain legal terrain.

Beckage handles the defense of website accessibility lawsuits and assists companies in navigating pre- and post-suit settlement agreements in this unique area of the law.  Beckage also works with clients under privilege to conduct internal and remedial audits of client websites and mobile applications, evaluate platform compatibility, and oversee implementation of recommended remedial or accessibility-enhancement measures.

California Consumer Privacy Act (CCPA)

Enforcement of California’s comprehensive California Consumer Privacy Act (CCPA) began on July 1, 2020 and has brought a range of plaintiff lawsuits under the statute’s private right of action, which expands California breach laws. For a data breach to be actionable, the information accessed must qualify as personal information as narrowly defined by California’s data breach notification law. In November 2020, voters passed the California Privacy Rights Act (CPRA) ballot initiative, creating additional privacy rights and obligations pertaining to sensitive personal information that will go into effect in 2023. The CPRA also expands the data breach liability created by the CCPA, adds a private right of action for unauthorized access to credentials that would permit access to an account where the business failed to maintain reasonable security, and imposes data protection obligations directly on service providers, contractors, and third parties. Beckage urges businesses that operate in California or serve California residents to continue to follow CCPA developments and carefully monitor related litigation in the coming months.

Emerging Technologies

The recent expansion of the Illinois Biometric Information Privacy Act (BIPA) has resulted in numerous class action suits against organizations alleged to have collected plaintiffs’ biometric data. With the expanding use of biometric equipment, these claims often allege defendants obtained plaintiffs’ biometric data without complying with BIPA’s notification and consent requirements. Upcoming class suits may address whether BIPA has extraterritorial effect when claims are brought against out-of-state vendors.

Similarly, computer-manipulated media, known as deep fakes, heighten the dangers of influenced perceptions. Advances in deep fakes are giving rise to claims involving defamation, trade libel, false light, violation of the right of publicity, and intentional infliction of emotional distress. Sophisticated tech lawyers can assist in determining rights and technological solutions to mitigate harm. As former tech business owners, Beckage lawyers want to drive innovation with these new and emerging technologies while understanding the standards and laws that may impact such development. Beckage recommends that companies proactively mitigate the risks associated with collecting biometric information and with deep fakes to prevent legal repercussions and defamation.

Key Takeaways

2020 proved to be an unpredictable year in more ways than one. The COVID-19 pandemic forced companies to rapidly adapt to new privacy and data security challenges caused by a distributed workforce, emerging technologies, and an increased focus on ecommerce as in-person shopping and events were curtailed. As we move toward 2021 with no definitive end to the pandemic in sight, it is crucial for companies to prioritize data privacy and cybersecurity initiatives by consulting qualified legal tech experts who can help navigate the uncertainty the next year will bring. Beckage attorneys can assist in creating, implementing, and evaluating robust data security and privacy infrastructures that will help position your business to tackle the challenges 2021 has in store.

*Attorney Advertising. Prior results do not guarantee similar outcomes.



Artificial Intelligence Best Practices: The UK ICO AI and Data Protection Guidance

Artificial intelligence (AI) is among the fastest-growing emerging digital technologies. It helps businesses streamline operational processes and enhance the value of goods and services delivered to end users and customers. Because AI is a data-intensive technology, policymakers are seeking ways to mitigate the risks of AI systems that process personal data, and technology lawyers are assisting with compliance efforts.

Recently, the UK Information Commissioner’s Office (ICO) published its Guidance on AI and Data Protection. The guidance follows the ICO’s 2018-2021 technology strategy, which identified AI as one of its strategic priorities.

The AI guidance contains a framework to guide organizations using AI systems and aims to:

  • Provide auditing tools and procedures the ICO will use to assess the compliance of organizations using AI; and  
  • Guide organizations on AI and data protection practices.

AI and Data Protection Guidance Purpose and Scope

The guidance solidifies the ICO’s commitment to the development of AI and supplements other resources for organizations, such as the big data, AI, and machine learning report and the guidance on explaining decisions made with AI, which the ICO produced in collaboration with the Alan Turing Institute in May 2020.

In the AI framework, the ICO adopts an academic definition of AI, which, in the data protection context, refers to ‘the theory and development of computer systems able to perform tasks normally requiring human intelligence’. While the guidance focuses on machine-learning-based AI systems, it may nonetheless apply to non-machine-learning systems that process personal data.

The guidance seeks to answer three questions. First, do people understand how their data is being used? Second, is data being used fairly, lawfully and transparently? Third, how is data being kept secure?

To answer these questions, the ICO takes a risk-based approach to addressing different data protection principles, including transparency, accountability, and fairness. The framework outlines measures organizations should consider when designing AI regulatory compliance programs. The applicable laws driving this compliance are the UK Data Protection Act 2018 (DPA 2018) and the General Data Protection Regulation (GDPR).

The ICO details key actions companies should take to ensure their data practices relating to AI systems comply with the GDPR and UK data protection laws. The framework is divided into four parts, focusing on (1) the AI-specific implications of the accountability principle; (2) the lawfulness, fairness, and transparency of processing personal data in AI systems; (3) security and data minimization in AI systems; and (4) compliance with individual rights, including rights relating to solely automated decisions.

AI Best Practices

This section summarizes selected AI best practices outlined in the guidance, organized around the four data protection areas. When pursuing AI legal compliance, organizations should work with experienced lawyers who understand AI technologies to address the following controls and practices:

Part One: Accountability Principle

  • Build a diverse, well-resourced team to support AI governance and risk management strategy
  • Work with legal counsel to determine the company’s compliance obligations while balancing individuals’ rights and freedoms
  • Conduct a Data Protection Impact Assessment (DPIA) or other impact assessment where appropriate
  • Understand the organization’s role (controller or processor) when using AI systems

Part Two: Lawfulness, Fairness, and Transparency of Processing Personal Data

  • Assess statistical accuracy and effectiveness of AI systems in processing personal data
  • Ensure all people and processes involved understand the statistical accuracy requirements and measures
  • Evaluate tradeoffs and expectations
  • Adopt common terminology that staff can use to communicate about the statistical models
  • Address risks of bias and discrimination and work with legal to build into policies

Part Three: Principles of Security and Data Minimization in AI Systems

  • Assess whether trained machine-learning models contain personally identifiable information
  • Assess the potential uses of trained machine-learning models
  • Monitor queries from API users
  • Consider ‘white box’ attacks, in which an attacker has access to a model’s internals
  • Identify and process the minimum amount of data required to achieve the organization’s purpose (a minimal illustration follows this list)
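
To make the data minimization point concrete, the short sketch below shows one generic approach: keeping only the fields a model actually needs and dropping direct identifiers before training. The column names are hypothetical; this illustrates the principle and is not a recipe from the ICO guidance.

    # Generic data minimization sketch; column names are hypothetical.
    import pandas as pd

    raw = pd.DataFrame({
        "name": ["A. Smith"], "email": ["a@example.com"],
        "age": [34], "tenure_months": [18], "churned": [0],
    })

    # Keep only the minimum fields needed for the modeling task; direct
    # identifiers never enter the training set.
    FEATURES = ["age", "tenure_months"]
    TARGET = "churned"
    training_data = raw[FEATURES + [TARGET]]

    print(training_data.columns.tolist())  # ['age', 'tenure_months', 'churned']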

Part Four: Compliance with Individual Rights, Including Rights Relating to Solely Automated Decisions

  • Implement reasonable measures to respond to individuals’ data rights requests
  • Maintain appropriate human oversight for automated decision-making (see the sketch below)
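
As a concrete, hedged illustration of human oversight, the sketch below routes anything other than a high-confidence automated outcome to a human reviewer rather than letting the model decide alone. The threshold and labels are hypothetical assumptions.

    # Hypothetical human-in-the-loop gate for automated decision-making;
    # the threshold and labels are illustrative assumptions.
    REVIEW_THRESHOLD = 0.9

    def decide(score: float) -> str:
        # Only high-confidence outcomes are automated; everything else is
        # escalated to a person, preserving meaningful human oversight.
        return "auto-approve" if score >= REVIEW_THRESHOLD else "human-review"

    print(decide(0.95))  # auto-approve
    print(decide(0.50))  # human-review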

The ICO anticipates developing a toolkit to complement the AI guidance. In the meantime, the salient takeaways from the guidance are that organizations should understand the applicable data protection laws and assemble the right team to address these requirements.

Building privacy and security into AI development early can provide long-term efficiencies as regulatory authorities increasingly focus on ensuring these technologies incorporate data protection principles.  Organizations that work toward robust AI compliance can also gain a competitive advantage.  Beckage’s lawyers, many of whom are also technologists and have been trained by MIT on business uses of AI, have been quoted in national media about AI topics.  We stand ready to answer any of your questions.

*Attorney advertising. Prior results do not guarantee future outcomes.



BIPA Suits Against Third Parties: An Emerging Trend

Companies should take note of the recent expansion of biometric privacy laws, which could have a significant impact on their businesses, changing how they collect and process biometric data and how third-party vendors handle such data.

Background on BIPA

The Illinois Biometric Information Privacy Act (BIPA) was passed on October 3, 2008, and regulates how “private entities” collect, use, and share biometric information and biometric identifiers, collectively known as biometric data.  BIPA imposes certain security requirements, including:

1. Developing a publicly available written policy regarding the retention and destruction of biometric data in an entity’s possession.

2. Providing required disclosures and obtaining written releases prior to obtaining biometric data.

3. Prohibiting the sale of biometric data.

4. Prohibiting the disclosure of biometric data without obtaining prior consent.

Expansion of BIPA to Third Party Vendors

In a significant turn of events, courts in Illinois are applying BIPA to third-party vendors that do not have direct relationships with plaintiffs but whose products are used by plaintiffs’ employers or in other settings to collect plaintiffs’ biometric data.

This is an alarming expansion of BIPA’s scope of which all third-party providers should be aware.  Under this caselaw, putting a biometric-collecting product into the stream of commerce does not immunize the manufacturer of that product from suit in Illinois.

Since the passage of BIPA, numerous class action suits have been filed against those alleged to have collected plaintiffs’ biometric data, but claims brought against vendors that sell biometric equipment are growing rapidly.  These claims allege not that plaintiffs had direct contact with the vendor defendants, but that the defendants obtained the plaintiffs’ biometric data through timekeeping equipment without complying with BIPA’s requirements.

Recently, the U.S. District Court for the Northern District of Illinois held that a biometric time clock vendor could be liable for violations of BIPA in the employment context, extending liability to parties that “collect” biometric information.

In another recent decision, Figueroa et al. v. Kronos, the court held that the plaintiffs sufficiently alleged that Kronos performed a collection function and was responsible, along with the employer, for obtaining the required employee consent.

These cases, among others, signal that third-party vendors are becoming defendants in BIPA consent cases and broaden the third-party contribution claims employers can bring against vendors of biometric clocks for failure to obtain required consent.  These decisions also allow insured employers to seek contribution from clock vendors for any judgment assessed against the employer under Employment Practices Liability (EPL) coverage.

However, BIPA’s Section 15(a), which requires publicly available policies for the retention and destruction of biometric data, makes it difficult for plaintiffs to pursue claims against third parties in federal court because Section 15(a) claims raise an issue of standing.  Jurisdiction presents a separate hurdle: a court could exercise general personal jurisdiction over a vendor in connection with a BIPA claim if the vendor maintained continuous and systematic contacts with Illinois.  If the vendor is located in the forum state, there is no jurisdictional dispute, but because many vendors sell their equipment nationally, courts must often address whether they have specific personal jurisdiction over the vendor.

For example, in Bray v. Lathem Time Co., the plaintiff alleged that the defendant sold a facial-recognition timekeeping product to the plaintiff’s employer and violated BIPA by failing to notify employees and obtain their consent.  The plaintiffs had no dealings with the defendant, which was located in Georgia but was sued in Illinois.  The U.S. District Court for the Central District of Illinois found no contacts between the defendant and the state of Illinois, concluding that the timekeeping equipment was sold to an affiliate of the plaintiff’s employer and then transferred to Illinois by the employer.  The court therefore held that it lacked jurisdiction over the defendant vendor.

Expansion of BIPA Outside Illinois?

Because many vendors are located outside Illinois, a question arises as to whether BIPA applies to conduct in other states.  While BIPA applies to violations in Illinois, upcoming class suits may address whether BIPA has extraterritorial effect when claims are brought against out-of-state vendors.  The extraterritorial application of BIPA is fact-dependent, and courts have acknowledged that because extraterritoriality may need to be evaluated on an individual basis, decertification of class claims may be appropriate.  Companies collecting, using, and storing biometric information face an increased risk of BIPA lawsuits.

Takeaways

All companies should assess whether they collect biometric data, directly or through third parties.  Next, they should evaluate the legal requirements for handling such data.  Note that many state data breach laws include biometric data as protected personally identifiable information (PII).  Companies should take steps to comply with applicable laws, including developing policies and practices around handling biometric data.  Contracts with third-party vendors should also be reviewed to help protect the business if biometric data is mishandled.

About Beckage

At Beckage, we have a team of skilled attorneys who can assist your company in developing BIPA-compliant policies that help mitigate the risks associated with collecting biometric information.  Our lawyers are also technologists who can help you better understand the legal implications surrounding BIPA and the repercussions that may follow.


*Attorney Advertising.  Prior results do not guarantee future outcomes.*


The Risks Associated with Disinformation and Deep Fakes

Disinformation is the deliberate spreading of false information about individuals or businesses to influence public perceptions about people and entities.  Computer-manipulated media, known as deep fakes, heighten the dangers of influenced perceptions.  Deep fakes can be photos, videos, audio, or text manipulated by artificial intelligence (AI) to portray known persons acting or speaking in an embarrassing or incriminating way.  As deep fakes become more believable and easier to produce, disinformation is spreading at alarming rates.  Some risks that arise with disinformation include:

·       Damage to Reputation

Disinformation campaigns target companies of all sizes with rumors, exaggerations, and lies that harm the business’s reputation, often for economic strategy and gain. Remedying reputational damage may require large sums of money, time, and other resources to prove the media was forged.

·       Blackmail and Harassment

Photos, audio, and text manipulated by AI can be used to embarrass or extort business leaders, politicians, or public figures through the media.

·       Social Engineering and Fraud

Deep fakes can be used to impersonate corporate executives and facilitate fraudulent wire transfers.  These tactics are a new variation of Business Email Compromise (BEC), traditionally understood as an impersonator’s access to an employee’s or business associate’s email account with the intent to trick companies, employees, or partners into sending money to the infiltrator.

·       Credential Theft and Cybersecurity Attacks

Hackers can also use sophisticated impersonation and social engineering to gain informational technology credentials through unknowing employees.  After gaining access, the hacker can steal company data and personally identifiable information or infect the company’s system with malware or ransomware.

·       Fraudulent Insurance Claims

Insurance companies rely on digital graphics to settle claims, but photographs are becoming less reliable as evidence because they are easy to manipulate with AI.  Insurance companies will need to modify policies, training, practices, and compliance programs to mitigate risk and avoid fraud.

·       Market Manipulation

Another way scammers seek to profit from disinformation is through fake news reports and social media schemes that use phony text and graphics to impact financial markets.  Traders who rely on algorithms driven by social posts and headlines to make market decisions may fall prey to these schemes.  As access to realistic but manipulated video and audio increases, such disinformation will become substantially more believable and difficult to correct.

·     Falsified Court Evidence

Deep fakes also pose a threat to the authenticity of media evidence presented to the court.  If falsified video and audio files are entered as evidence, they have the potential to trick jurors and impact case outcomes.  Moving forward, courts will need to be trained to scrutinize potentially manipulated media.

·     Cybersecurity Insurance

Cybersecurity insurance helps cover businesses from financial ruin but has not historically covered damages due to disinformation.  Private brands, businesses, and corporations should consider supplementing their current insurance policies to address disinformation to help protect themselves from risk.

Legal Options

There are legal avenues that can be pursued in responding to disinformation.  Deep fakes that falsely depict individuals in a demeaning or embarrassing way may be subject to claims for defamation, trade libel, false light, violation of the right of publicity, or intentional infliction of emotional distress, particularly where the deep fake contains the image, voice, or likeness of a public figure.

Preventative Steps

Apart from understanding the risks associated with disinformation, companies can work to protect themselves from disinformation and deep fakes by:

1. Engaging in social listening to understand how a company’s brand is viewed by the public.

2. Assessing the risks associated with the business’ employed practices.

3. Registering the business trademark to have the protection of federal laws.

4. Having an effective incident response plan in the event of disinformation, deep fakes, or data breach to mitigate costs and prevent further loss or damage.

5. Communicating with the social media platforms on which disinformation is being spread.

6. Speaking directly to the public, the media, and their customers via social media or other means.

7. Bringing a lawsuit if the business is being defamed or the market is being manipulated.

What To Do When Facing Disinformation

If a business is facing disinformation, sophisticated tech lawyers can assist in determining its rights and identifying technological solutions to mitigate harm.  Businesses are not defenseless in the face of disinformation and deep fakes, but they should expand their protective measures to mitigate the associated risks.

About Beckage

Beckage is a team of skillful technology attorneys who can help you protect your company from cyber attacks and defamation caused by disinformation and deep fakes. Our team of certified privacy professionals and lawyers can help you navigate the legal landscape of the expanding field of disinformation.

*Attorney Advertising.  Prior results do not guarantee similar outcomes.*

