Most AI Voice Cloning Tools Aren't Safe From Scammers
Consumer Reports assessed leading voice cloning tools and found that four of the products lacked adequate safeguards to prevent cloning a voice without the speaker's consent. The technology has many legitimate applications, but it can also be exploited for elaborate scams and fraud. To address these risks, Consumer Reports recommends additional protections, such as requiring users to read a unique consent script, watermarking AI-generated audio, and blocking audio that contains common scam phrases.
- The current lack of regulation in the voice cloning industry may embolden malicious actors to use this technology for nefarious purposes.
- How can policymakers balance the benefits of advanced technologies like voice cloning with the need to protect consumers from potential harm?
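To make the last recommendation concrete, a cloning service could screen the text a user submits for synthesis before generating any audio. The sketch below is illustrative only; the phrase list, function name, and threshold-free matching are hypothetical examples, not Consumer Reports' specification or any vendor's actual API.

```python
# Illustrative sketch: a naive pre-synthesis screen for scam language.
# The blocklist below is a hypothetical example, not an official list.
SCAM_PHRASES = [
    "wire the money",
    "gift card",
    "bail money",
    "don't tell anyone",
    "send it right now",
]


def flag_scam_phrases(text: str) -> list[str]:
    """Return any blocklisted phrases found in the text (case-insensitive substring match)."""
    lowered = text.lower()
    return [phrase for phrase in SCAM_PHRASES if phrase in lowered]


if __name__ == "__main__":
    sample = "Grandma, I need bail money tonight. Please don't tell anyone."
    hits = flag_scam_phrases(sample)
    if hits:
        print(f"Blocked: request matches scam phrases {hits}")
    else:
        print("No scam phrases detected; request can proceed.")
```

A real deployment would need far more than substring matching (paraphrase detection, human review, rate limits), but even a simple check like this illustrates the kind of guardrail the report says most tested products were missing.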