Back to December
ElevenLabs' Biden deepfake embarrassment; Contact Center AI Association launches
Voice AI companies: you are OVERPAYING for cloud.
Switch to SaladCloud and access 10K+ GPUs starting from $0.02/hour.
Special Offer: The first 10 qualifying This Week In Voice VIP readers get $1000 in free credits.
That’s almost 1170 days of audio transcribed.
Fill out this demo form, enter ‘PROJECT VOICE’ in the ‘How did you hear about us?’ field, and attend the exploration call to redeem the offer.
Late in 2010, Taylor Swift’s illustrious career enjoyed a new first: she wrote a song to apologize to a former boyfriend, rather than simply air previous grievances.
The result was Back to December, a regretful and introspective song frequently cited as one of Swift’s most poignant and a distinct inflection point in her young career.
As Swift put it: “This is about a person who was incredible to me, just perfect to me in a relationship, and I was really careless with him.”
Actor Taylor Lautner, an ex-boyfriend of Swift’s, later confirmed that the song was inspired by their relationship during a 2016 interview.
The song holds an interesting and telling place of honor in the firmament of Taylor Swift’s catalog: she went more than a decade without performing it live at shows.
ElevenLabs, a synthetic voice company, has had an interesting last week or so.
First, they announced their $80M fundraise, anointing them as voice AI’s latest unicorn.
An eye-opening quote from that announcement:
“We raised the new money to cement ElevenLabs’ position as the global leader in voice AI research and product deployment,” CEO Mati Staniszewski told TechCrunch in an email interview.
I’m not sure being handed investor money is what cements a company’s leadership position, but ElevenLabs certainly has become a global leader in voice AI controversy.
The day after the fundraising announcement, news broke that ElevenLabs’ technology had been used to create a deepfake of Joe Biden’s voice, which was deployed to interfere with the New Hampshire primary.
ElevenLabs then suspended the user who made the Biden deepfake.
This sequence of events is absurd, even for a company with the lengthy history of bad media coverage that ElevenLabs possesses.
ElevenLabs could have stopped this user from ever creating this fake robocall through a number of different safeguards. In not doing so, they ignored the lessons of election interference from both the 2016 and 2020 US elections, among other incidents abroad.
You might ask yourself: why don’t we hear more stories like this in US politics?
AI voices are everywhere. And the incentive to abuse them is everywhere as well.
So why has ElevenLabs been a lightning rod on the issue of voice AI misuse?
To answer that question, I’d take you back to 2022, when we had the joy of experiencing an AI-created podcast between Steve Jobs and Joe Rogan, without necessary permissions from anyone involved.
This publicity stunt was disgusting.
But what happened to the company itself that did it?
Glad you asked: absolutely nothing.
The media coverage they got from that fake podcast exceeds, by an order of magnitude, the coverage of everything else they’ve ever done combined. They benefited greatly.
So when we look at ElevenLabs, and this latest breach of trust, why would we expect contrition? Or change?
This Biden deepfake incident, weirdly, is far from a problem. It’s actually a massive endorsement of the company’s technology, which clearly works quite well and has now been validated on a grandiose scale.
The only way to bring this idiocy to a halt is harsh financial penalties and/or jail time for executives of repeat-offender companies, akin to the rules holding a bartender liable for serving alcohol to minors.
It’s not a potential offender’s fault for testing what they can get away with; it’s yours, corporately, for enabling it by failing to anticipate abuse.
Maybe this is the year we’ll see it happen. The industry needs it.
I am pleased to share that I have joined the new Contact Center AI Association as Executive Vice President, Conversational AI. I’ll support the organization alongside my existing responsibilities growing both Project Voice and Project Voice Capital Partners.
The association aims to bring together CX leaders in cities across the United States, giving them valuable time and structure to network and share best practices. Initial chapters and meetings are planned for Atlanta, New York City, and San Francisco, with more to follow.
Conversational AI companies will be able to get involved in different ways to support the organization. If that’s of interest, let’s connect and I can share more detail. Additionally, the CCAIA will play a role in Project Voice 2024, our annual conference taking place in late April, where I’ll invite founder John Walter to speak in greater detail on the organization and its plans.