
AI voice cloning programs can easily make a copy of a person’s voice, but companies lack safeguards to prevent harm, study finds

Are you worried about AI and how it may affect your life? Concern is warranted.

Many AI voice cloning products let consumers create an artificial copy of a person’s voice using only a short audio clip of the person speaking, but most of the six products assessed lacked safeguards to stop misuse, Consumer Reports found in a study.

The six companies reviewed are Descript, ElevenLabs, Lovo, PlayHT, Resemble AI, and Speechify.

AI voice cloning products have many legitimate uses, including speeding up audio editing, enhancing movie dubbing, and automating narration, Consumer Reports said in the study. Without safeguards, however, these products also present an opportunity for scammers. Examples include a call that appears to come from a consumer’s grandchild in need of money, or celebrities and political figures seeming to endorse dubious products and bogus investment schemes.

“AI voice cloning tools have the potential to supercharge impersonation scams,” Grace Gedye, policy analyst at Consumer Reports, said in a statement. “Our assessment shows that there are basic steps companies can take to make it harder to clone someone’s voice without their knowledge – but some companies aren’t taking them.”

Consumer Reports is calling on companies to raise their standards, and it’s also calling on state attorneys general and the federal government to enforce existing consumer protection laws – and consider whether new rules are needed.

Key findings from the study

Consumer Reports researchers were able to easily create a voice clone based on publicly available audio in four of the six products tested: 

  • The products didn’t employ any technical mechanisms to ensure researchers had the speaker’s consent to generate a clone or to limit the cloning to the user’s own voice. The companies – ElevenLabs, Speechify, PlayHT, and Lovo – required only that researchers check a box confirming they had the legal right to clone the voice, or make a similar self-declaration (a sketch of what a stronger check might look like follows this list).
  • Descript and Resemble AI took steps to make it more difficult for customers to misuse their products by creating a non-consensual voice clone. 
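
For illustration, here is a minimal sketch of the kind of technical consent check a company could require instead of a self-declaration checkbox: the service generates a fresh random phrase, asks the speaker to record it, and fuzzy-matches a transcript of that recording against the phrase. A freshly generated phrase matters because a scammer can’t satisfy it with a pre-existing public recording of the target’s voice. Everything below is a hypothetical example, assuming an upstream speech-to-text step supplies the transcript; none of the names come from the products reviewed.

```python
# Hypothetical consent-challenge check (not any reviewed product's actual API).
# Assumes a separate speech-to-text step has already transcribed the user's
# uploaded consent recording.
import difflib
import random

WORDS = ["orange", "river", "seven", "window", "guitar",
         "planet", "copper", "velvet", "harbor", "meadow"]

def make_challenge(n_words: int = 5) -> str:
    """Generate a fresh random phrase the speaker must read aloud."""
    return " ".join(random.sample(WORDS, n_words))

def transcript_matches(challenge: str, transcript: str,
                       threshold: float = 0.85) -> bool:
    """Fuzzy-compare the consent recording's transcript to the challenge.

    A high similarity ratio suggests the uploaded audio really contains the
    just-generated phrase, so it can't be a pre-existing public clip.
    """
    ratio = difflib.SequenceMatcher(
        None, challenge.lower(), transcript.lower().strip()
    ).ratio()
    return ratio >= threshold

if __name__ == "__main__":
    phrase = make_challenge()
    print("Please record yourself saying:", phrase)
    # In a real service the transcript would come from a speech-to-text
    # model run on the user's uploaded recording.
    assert transcript_matches(phrase, phrase)        # exact read passes
    assert not transcript_matches(phrase, "hello")   # unrelated audio fails
```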

Four of the six companies – Speechify, Lovo, PlayHT, and Descript – required only a customer’s name and/or email address to make an account.

Recommendations for AI voice cloning companies

Companies should:

  • Have mechanisms in place to confirm the consent of the speaker whose voice is being cloned.
  • Collect customers’ credit card information, along with their names and emails, so that fraudulent audio can be traced back to specific users.
  • Watermark AI-generated audio for future detection.
  • Provide a tool that detects whether audio was generated by their own products.
  • Detect and prevent the unauthorized creation of clones based on the voices of public figures, including celebrities and politicians.
  • Build guardrails into their cloning tools that automatically flag and block the creation of audio containing phrases commonly used in scams and fraud, as well as other content likely to cause harm, such as sexual content (a minimal sketch of such a filter follows this list).
  • Consider supervising AI voice cloning, rather than offering do-it-yourself voice products.
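
As a rough illustration of the guardrail idea above, here is a minimal sketch of a pre-synthesis text filter, assuming the service can inspect the script before converting it to speech. The phrase list and function names are illustrative assumptions, not any company’s actual blocklist, and a production system would need far more than keyword matching.

```python
# Hypothetical pre-synthesis guardrail: flag synthesis requests whose text
# matches phrases commonly used in imposter and investment scams.
import re

SCAM_PATTERNS = [
    r"\bwire (me|the) money\b",
    r"\bgift cards?\b",
    r"\bbail money\b",
    r"\b(guaranteed|risk[- ]free) returns?\b",
    r"\bsocial security number\b",
]

COMPILED = [re.compile(p, re.IGNORECASE) for p in SCAM_PATTERNS]

def flag_script(text: str) -> list[str]:
    """Return the patterns the request matches; an empty list means allow."""
    return [p.pattern for p in COMPILED if p.search(text)]

if __name__ == "__main__":
    request = "Grandma, I'm in jail. Please send bail money in gift cards."
    hits = flag_script(request)
    if hits:
        print("Held for review; matched:", hits)
```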

Consumer Reports has successfully argued that companies have a legal obligation under the Federal Trade Commission Act to prevent their products from being used to cause harm, but more robust enforcement is needed, as well as new rules, Gedye said.

However, to ensure that the United States is poised to counter AI-powered scams, Congress should grant the Federal Trade Commission additional resources and expand its legal powers, she said.

Gedye also said state attorneys general should examine whether AI voice cloning tools that make it easy to impersonate someone without their knowledge run afoul of state consumer protection laws.

In addition, Consumer Reports encourages the introduction of federal and state legislation to codify consumer protections as they relate to AI.

Final thoughts

The report and its recommendations show that much remains to be done to protect consumers from AI voice cloning abuse. However, Congress has struggled to pass a comprehensive AI framework comparable to the laws passed in the European Union, according to TechCrunch. And the Trump administration has yet to endorse major congressional AI legislation.

Meanwhile, in the absence of congressional action, states are taking up a growing number of AI bills. It’s difficult, however, for states to address AI issues without a broad federal approach.

Comments


Carol Cassara

That is so scary. I mean, really scary!

Rita

Yes, just think of the many ways this can be used. However, the warning to be even more vigilant about imposter scams is good.

I just read a long article in The New York Times today about money laundering and the complexities of how it works. It said the scammer has the victim send the money to a special account from which the laundering can occur; otherwise, the money can be traced and the criminal caught.
