
State privacy laws can help combat the unauthorized cloning of voices

Unauthorized voice cloning using generative artificial intelligence is a growing problem, but state privacy laws can help combat it.

Illinois’ Biometric Information Privacy Act (BIPA) is a good vehicle for exploring this potential. It provides a private right of action that has generated a significant body of relevant case law. Although BIPA’s case law does not explicitly address voice cloning, it illustrates two principles that would be relevant to any attempt to prevent voice cloning through biometric protections.

First, the BIPA case law shows that using AI to generate a voice with the same characteristics as a person’s voice is actionable only if those characteristics are specific enough to identify that person. This is demonstrated by Carpenter v. McDonald’s Corp., in which the court held that features extracted from voice recordings using artificial intelligence did not constitute a “voiceprint” because the speaker could not be “unambiguously identified” from those features alone.

Second, the BIPA case law shows that the way an AI system processes information determines whether it uses a protected biometric identifier. Whether a particular voice cloning system uses data unique enough to qualify as a protected voiceprint will likely be left to a jury to determine.

This is illustrated by In re Facebook Biometric Information Privacy Litigation, where the main issue was whether Facebook collected and stored facial scans – a type of biometric identifier protected under BIPA. Both sides had access to the AI system’s source code and to a research paper titled “DeepFace” that described how it worked.

However, the court could not resolve the question of whether the AI system captured or stored a “scan of facial geometry.” It ruled that “a jury must determine the actual issues surrounding facial scanning and recognition technology.”

Given these principles, a plaintiff could likely bring a viable BIPA claim based on unauthorized voice cloning. The features used by the AI system in Carpenter were not sufficient to constitute a voiceprint, but in that case the AI was used only to take orders.

In contrast, voice cloning AI is designed to emulate a person’s voice. The features such a system would use to clone voices are likely to be more detailed than the high-level features in Carpenter. Determining whether those features are unique enough to constitute a voiceprint requires first identifying what the features are and then assessing whether, in combination, they suffice to identify the cloned speaker.

This requires a deeper dive into the details of the underlying AI model – an investigation that, as in In re Facebook, will likely give an alleged victim of voice cloning enough to prevent their claim from being dismissed before trial. Therefore, for the right plaintiff (an Illinois resident) and the right AI (a system that can uniquely mimic the plaintiff’s voice), BIPA appears to offer a remedy against unauthorized voice cloning.

Other states

At least one other state privacy law appears well-suited to protecting residents from unauthorized voice cloning. Washington State’s My Health My Data Act explicitly defines consumer health data to include biometric data and prohibits collecting that data without the individual’s consent, unless collection is necessary to provide a requested product or service.

This is analogous to BIPA’s prohibition on collecting biometric identifiers without first obtaining the individual’s permission, and it means My Health My Data could address unauthorized voice cloning under the principles discussed above in the context of BIPA. Even laws that do not explicitly mention biometrics may apply, as voiceprints that can identify a specific individual appear to fall within the common definitions of personal information used in several states.

However, some state privacy laws contain provisions that complicate their application to voice cloning. Vermont’s privacy law excludes any data generated from an audio recording from the definition of biometric data, “unless such data is generated to identify a specific individual.”

Vermont’s law is so new that it is unclear how this exception will apply, or whether it can be overcome, in the context of voice cloning. At a minimum, it conflicts with BIPA’s general rule that a voiceprint is protected if it is specific enough to identify a particular person, even if it is never actually used for that purpose.

While state privacy laws are potential tools to combat unauthorized voice cloning, there are important differences across states.

Privacy laws can also complement other state measures, such as Tennessee’s ELVIS Act, which allows musicians and others to file lawsuits over unauthorized use of their voices.

The crux of the matter is that a person’s voice cannot be the subject of a federal copyright. Other state laws protecting voices are less comprehensive, applying only in certain contexts (such as advertising) or only to certain groups of people.

Given the limitations of existing voice-specific protections, additional legal tools may be needed to address the risks of voice cloning technology. Given the presumed uniqueness of a person’s voice and the widespread use of voice as a biometric authentication mechanism, privacy laws could complement voice-specific protections such as the ELVIS Act.

This article does not necessarily reflect the opinion of Bloomberg Industry Group, Inc., publisher of Bloomberg Law and Bloomberg Tax, or its owners.

Author Information

William Morriss is a partner at Frost Brown Todd and focuses on law related to software and other technologies.
