
Last week, Genna Bain, whose late husband was the game critic and YouTube personality John “Totalbiscuit” Bain, broke the news to fans that she is considering deleting all of his videos from the internet to prevent “AI” machine-learning technologies from storing and manipulating his voice to promote particular political and social views.
“Today was fun. Faced with making the decision to delete all of my late husband’s lifetime of content from the internet. Apparently people think it’s okay to use his library to train voice AI to promote their own social commentary and political views,” Bain wrote on Twitter.
Read more: AI Comic Art Gets Copyright Revoked After Office Learns How AI Works
A sympathetic and divided community
John Bain, who died of cancer in 2018 at the age of 33, left behind a vast YouTube library of reviews, podcast interviews, and other videos that could serve as valuable source material for machine-learning systems that feed on such data. While members of r/Cynicalbrit, the official subreddit for Totalbiscuit fans, were sympathetic to the situation his widow now faces, many felt that any attempt to stop AI voice-cloning technology from committing digital necromancy by removing Totalbiscuit’s videos from the internet would be in vain. Chief among their collective reasoning is the old internet adage that once something is put online, for better or worse, it’s there forever. In other words, even if Bain removes her late husband’s videos from his channel, they’ve likely already been archived and saved elsewhere by fans and, sadly, by AI libraries.
“Technology is getting pretty scary. This is identity theft on steroids; they steal not only your name, but also your face and voice,” Reddit user iMogwai wrote.
“That’s why we can’t have nice things,” fan community member bers90 wrote. “These AI people are out of control. Deleting everything won’t stop them, as they’ve probably already backed it all up. It will just remove TB from the internet, and ‘regular’ people with good intentions won’t be able to be inspired by him, hear his wisdom, and generally relax with his content.”
“I don’t think deleting the entire library will change this shitty situation much,” wrote another member who goes by the name Existing_End6867. “TB’s voice is widely available in so many other places that the only effect cleaning up the YouTube channel would have… would be to erase his legacy, to the detriment of all of us who go there to remember him.”
Read more: Your favorite voice actors report AI sites that copy voices without consent
Although the US Copyright Office recently ruled that AI-generated images have no copyright protection, that hasn’t stopped AI voice websites from copying and selling the voices of famous voice actors without consent. Last month, well-known anime and video game voice actors, including Cowboy Bebop’s Steve Blum and the Mass Effect series’ Jennifer Hale, posted public service announcements on Twitter warning fans not to purchase from any AI website that copies and sells their voices, and to support real actors instead.
As AI voice technology becomes more and more sophisticated, the ways people try to use it become more varied and, at times, more problematic. For example, a harassment campaign last month involved AI-generated sound clips posted to Twitter in which actors’ voices were made to use racist and homophobic slurs and to reveal private information, including the actors’ home addresses.
“Hey folks, I know AI technology is exciting, but if you see my voice, or any of the characters I play, offered on any of those sites, please know that I have not given my permission and never will,” Blum tweeted. “This is very unethical. We all appreciate your support. Thank you.”
Kotaku has reached out to Bain for comment.