The US Federal Communications Commission (FCC) has banned robocalls that use voices generated by artificial intelligence (AI), a move aimed at protecting consumers from voice-cloning scams.
The unanimous ruling on Thursday recognizes calls made with AI-generated voices as “artificial” under the Telephone Consumer Protection Act, according to the FCC.
“Bad actors are using AI-generated voices in unsolicited robocalls to extort vulnerable family members, imitate celebrities, and misinform voters. We’re putting the fraudsters behind these robocalls on notice,” FCC Chairwoman Jessica Rosenworcel said in a statement.
“State Attorneys General will now have new tools to crack down on these scams and ensure the public is protected from fraud and misinformation,” she added.
This comes as New Hampshire authorities continue their probe into AI-generated robocalls that used the voice of US President Joe Biden to discourage people from voting in the state’s first-in-the-nation primary last month.
“This technology now has the potential to confuse consumers with misinformation by imitating the voices of celebrities, political candidates, and close family members,” the agency said.
Effective immediately, the regulation empowers the FCC to fine companies that use AI voices in their calls or block the service providers that carry them. It also opens the door for call recipients to file lawsuits and gives state attorneys general a new mechanism to crack down on violators, according to the FCC.