
4chan users are embracing AI voice cloning tool to generate celebrity hate speech

An AI startup that lets anyone clone a target’s voice in seconds is being rapidly embraced by internet trolls. 4chan users are flocking to ElevenLabs, a free speech-synthesis platform, using the company’s technology to clone celebrity voices and generate audio ranging from memes and erotica to hate speech and misinformation.

Such AI voice deepfakes have improved rapidly in recent years, but ElevenLabs’ software, which appears to have opened to general access over the weekend, offers a potent combination of speed, quality, and availability – as well as a complete lack of safeguards.

Misuse of ElevenLabs’ software was first reported by Motherboard, which found posters on 4chan sharing AI-generated speech clips that sound like famous people, including Emma Watson and Joe Rogan. As Motherboard’s Joseph Cox reports:

In one example, a generated voice that sounds like actor Emma Watson reads a portion of Mein Kampf. In another, a voice very similar to Ben Shapiro makes racist remarks about Alexandria Ocasio-Cortez. In a third, someone saying ‘trans rights are human rights’ is strangled.

In The Verge’s own testing, we were able to use the ElevenLabs platform to clone targets’ voices in seconds and generate audio clips containing everything from threats of violence to expressions of racism and transphobia. In one test, we created a voice clone of President Joe Biden and were able to generate audio that sounded like the president announcing an invasion of Russia, and another admitting that the “pizzagate” conspiracy theory is real, illustrating how the technology could be used to spread misinformation. You can listen to a short SFW sample of our Biden voice deepfake below:

ElevenLabs markets its software as a way to quickly generate audio dubs for media such as film, TV, and YouTube. It’s one of many startups in this space, but the company claims its voices are of high enough quality to require little editing, enabling applications such as real-time foreign-language dubbing and instant audiobook generation, as in the example below:

Posts on 4chan seen by The Verge include guides on how to use ElevenLabs’ technology, how to find the sample audio needed to train a model, and how to get around the company’s limits on audio-generation “credits.” As is typical of 4chan, the content created by its users varies widely in tone and intent, ranging from memes and copypasta to virulent hate speech and erotic fiction. Voice clones of video game and anime characters, as well as of YouTubers and VTubers, are particularly popular, in part because it’s easy to find audio samples of these voices to train the software.

In a Twitter thread posted Monday, ElevenLabs acknowledged this abuse, noting that it had seen “an increasing number of instances of voice cloning abuse” and would explore ways to remedy the problem. The company claims it can “trace all generated audio back to the user,” and says it will investigate safeguards such as verifying users’ identities and manually reviewing each voice-cloning request. At the time of publication, however, the company’s software remains freely accessible, with no restrictions on the content that can be generated. The Verge has reached out to the company for comment and will update this story if we hear back.

To predict how AI voice clones might be used and abused in the future, we can look at the recent history of video deepfakes. That technology first spread online as a way to generate nonconsensual pornography, and while many experts feared it would be used for misinformation, those fears have (so far) largely failed to materialize. Instead, the vast majority of video deepfakes shared online are pornographic, and the software is used to harass and intimidate not only celebrities but also private individuals. At the same time, deepfakes are slowly being embraced by commercial entities and used in film and TV alongside traditional VFX techniques.


