Senator Jacqui Lambie listens intently as a robocall is played from a computer in front of her.
“G’day Tasmanians, it’s senator Jacqui Lambie here.
“As the federal election gets closer, let’s get serious. I want to make a pledge to the people of Tasmania, that if re-elected, I will fight tooth and nail to move the national capital from Canberra to Tassie, where it bloody well belongs.”
What Senator Lambie just heard was a clone of her voice, produced by ABC NEWS Verify with her permission, to demonstrate the danger of the technology as the election nears.
The senator erupts in raucous laughter.
“You don’t need to give me a fake one for that! I could have just said it myself!” she jokes.
Realising the gravity of the technology, Senator Lambie’s mood turns sombre.
“It’s scary. It’s really scary that they’re this close to sounding like Jacqui Lambie,” she says.
Although the senator had expected the clone to be even more convincing, she conceded it wasn’t far off.
“It’s only going to be a matter of months — I’d say months — before they have it exact.”
So, would the voters in her state know the difference?
Robo-Lambie goes to Burnie
Burnie, on Tasmania’s north-west coast, is Jacqui Lambie territory.
Outside the senator’s electoral office, we encountered a woman on her way out of the gym.
“I believe her, I’ve always liked Jacqui Lambie and seen her throughout the years and been a local and all that. But I stand by her,” she said, after hearing the voice clone’s pitch.
Residents and tourists alike reacted similarly, taking Robo-Lambie's pitch at face value.
“Really?” responded the young gym goer when the ruse was revealed.
“AI-generated really fools us all,” she said.
Others weren’t so easily convinced.
“It’s a spoof of some sort. She’s not actually going to move all that infrastructure and put the cost on the taxpayer,” one tradie replied.
It was the information in the recording, he claimed, that alerted him to the fake.
“The facts will be how you navigate the AI. The context, not the words, not the voices.”
But others told us they sensed something wasn’t quite right in the voice itself.
Of the 18 people who listened to the recording, 12 did not realise they were hearing an AI voice clone of the senator.
Voice clones a unique threat
There’s been a lot of reporting on the dangers of video deepfakes — machine-generated video designed to fool audiences.
The ABC even made one of Greens senator David Shoebridge earlier this year, to demonstrate how far the technology had come.
But audio deepfakes can be a bigger risk as there are fewer clues for the listener.
And fake audio can strike not only over the internet, but over the phone, where there is less context for listeners to rely on.
Deepfaked audio has already been used in attempts to influence voters in other countries, but its impact is hard to measure.
In Slovakia in 2023, a well-timed audio deepfake of parliamentary candidate Michal Šimečka discussing electoral fraud with a prominent journalist was released days before the vote.
Mr Šimečka’s rival, Robert Fico, went on to win the election, but it isn’t clear if the deepfake made an impact.
And in the United States last year, an apparent voice clone of then-president Joe Biden tried to convince voters not to vote in the New Hampshire primary.
Mr Biden went on to win that primary, as well as clinch the nomination for president, before dropping out of the race.
Toby Murray, a professor of computing and information systems at the University of Melbourne, told ABC NEWS Verify the ability of the technology to produce a convincing deepfake has accelerated in recent years.
“Five years ago the ability to create a convincing voice clone was certainly out of reach for anyone who was not a machine learning scientist.
“Yet now there are a number of companies that offer really easy to use online tools and the ability for anyone to … create a pretty convincing voice clone.”
ABC NEWS Verify produced the audio using a market-leading platform, which is easily accessed on the internet — the voice clone cost about $100 to make.
No technical assistance from an expert was required, and just 90 seconds of audio of the senator, taken from an interview, was enough for the platform to learn her voice.
The generator used by ABC NEWS Verify to create the voice clone has a tool which can detect whether audio has been created using the platform.
When we uploaded the audio file we created directly from the generator, the tool gave it a 98 per cent probability of being created by the generator.
As an experiment, we then played the audio file aloud on a computer, recording the sound with an iPhone.
When that iPhone audio file was loaded into the tool, the tool gave it only a 2 per cent chance of being created by the platform.
The distortion created by the external recording was enough to fool the software, demonstrating the imperfection of technical solutions to detecting fake audio.
Professor Murray said that not only is the technology now easy to use, it can also be difficult to detect.
“We should absolutely expect the detection technology will get better, but we should also expect that it’s never going to be perfect,” he said.
He noted that there would always be an arms race between the detection software and the AI audio generators themselves.
Calls for AI regulation
Last year, a Senate select committee inquiry into the use of AI recommended the government “introduce new, whole-of-economy, dedicated legislation to regulate high-risk uses of AI”.
But, as with truth in political advertising laws, the government has not yet acted.
Senator Lambie says that's not good enough, and that older Australians are particularly vulnerable to scams using the technology.
“Something needs to be done today. And instead [the government] is worrying about their seats… instead of worrying about the welfare of the people in this country,” she said.
A spokesperson for the Department of Industry, Science and Resources told ABC NEWS Verify the government is “considering proposed mandatory guardrails for AI in high-risk settings”.
“Consultation on these proposed regulations closed late last year, and the government is now considering the feedback received,” they said.
“A final approach will be announced in due course.”
How to avoid being fooled
There are steps you can take to prevent being fooled by fake audio, Professor Murray said, that don’t rely on imperfect technology.
“The basic advice [is] trying to remain sceptical, trying to be aware of when you’re being manipulated and where outrage is being used against you,” he said.
“Just in the same way when you see an ad on Facebook or a post on Facebook that’s making very outrageous claims, your degree of outrage should be some warning to you that what you’re seeing is perhaps not 100 per cent true.
“And the same is true when you’re hearing someone’s voice … If the claims being made are outrageous, that’s probably a pretty good tip-off that perhaps what you’re hearing is not legitimate.”
A spokesman for the Australian Electoral Commission (AEC) told ABC NEWS Verify robocalls should be authorised, just like other electoral advertisements.
The absence of an authorisation could be a sign that a robocall is fake, and it should be reported to the AEC.
If a robocall is authorised but you still suspect the audio has been faked, you can make a complaint on the AEC’s website.
“The use of deepfake technology for the 2025 federal election is not illegal and, in many cases, is not likely to be used in an unethical manner,” the spokesman said.
“The AEC is also not aware of any evidence that the use of AI in electoral communication has been the determining factor in election results for the more than 60 national elections held around the world in 2024.”
The ABC is on the hunt for any misinformation or disinformation circulating in the lead-up to the federal election. Send us a tip by filling out the form below, or if you require more secure communication, select an option from our confidential tips page.