We tested an upcoming iPhone feature that lets you clone your voice – can people tell it’s not real?

If I get through a meeting without saying a word, I consider it a great success.

Unfortunately, there are times when I can no longer stay under the radar. The microphone must be unmuted, the camera might have to go on, and all the attention is on me.

At least, that was the case until this week.

Thanks to a new iPhone feature that lets anyone clone their voice with no technical chops and little time required, meeting anxiety temporarily became a thing of the past.

Announced back in May and now available as part of the public beta for iOS 17 – the next major software update for Apple's smartphone, due out in September – the "personal voice" tool lets a clone of my voice read aloud any text without me having to say a word.

How does it work?

The feature lives in the accessibility section of the iPhone’s settings app, under the speech heading.

To make your own on-demand digital voice, your handset tasks you with reading aloud 150 pretty random phrases, which takes about 15 or 20 minutes depending on your patience.

Image: Personal voice is an accessibility setting designed for people who are losing the ability to speak

“A German-born author won the prize for writing”, “during the Middle Ages in Europe, people bathed less often”, and “Ancient Greeks laid the foundation of Western culture” were some of the sentences I was given. I got some weird questions afterwards from people who could hear me in the next room.

The phone needs plenty of time to process the voice as it’s all done on the device itself, rather than uploaded to powerful computers somewhere at Apple HQ.

It needs to be locked and kept on charge, so it's probably best to leave it working overnight.

With the voice ready for action, you enable the “live speech” function in settings and pick your personal voice. Triple tapping the phone’s side button will open a text box, and anything you enter will be spoken aloud.


Image: It requires the user to speak aloud 150 random phrases

Is it convincing?

Without wanting to expose certain relatives’ lack of tech know-how, it very much depends.

Digital me checked with my sister about the status of Taylor Swift tickets in a WhatsApp voice message and she seemed none the wiser. My mum replied to a cinema invitation with no qualms at all, until I asked whether anything about the message had sounded off.

Tech-savvy friends and loved ones were more immediately suspicious.

“Who are you and what have you done with Tom?” asked one.

“It kind of sounded like you, but as if someone made a robot version,” said another. They had me bang to rights.

As for meetings (undoubtedly my most ambitious attempt to replace myself), the longer the voice went on, the more colleagues realised I was up to some mischief.

But by and large, for something that takes just 15 minutes of work and a good sleep to set up, it’s impressive.

As with the rise of generative AI such as ChatGPT and increasingly realistic deepfake videos, it's not just the power of the technology that has caught people's attention, but its accessibility.

The digital news anchor that can read this article via the play button at the top of the page required a dedicated text-to-speech publishing company and a lengthy, professional recording session, and is constantly tweaked to ensure she doesn't trip up over certain words and phrases.

What I did is going to be available on everyone’s iPhone soon, with no such effort or expertise required.


Video: How Sky News created an AI reporter

Isn’t this just asking for fraud trouble?

Apple says it's an accessibility feature, designed for people who struggle to speak or are losing the ability to do so.

The company says the randomised nature of the personal voice process, together with the fact that it's all done on-device, keeps users' information private and secure.

The voice cannot be shared, can be deleted, and all 150 recorded phrases can be downloaded and backed up.

Computer security company McAfee has warned voice cloning technology in general is fuelling a rise in scams, but indicated Apple’s protections should be sufficient and are unlikely to contribute to the problem.

McAfee researcher Oliver Devane told Sky News: “If you were to use an online service and there was a data breach, your voice clips could potentially be stolen.

“It only being on the device and you being able to delete the files removes that risk.

“There are already services people can use if they want to use this technology for malicious purposes.”


McAfee recently surveyed 1,009 adults in the UK – and found almost a quarter had either experienced or knew someone who’d experienced some kind of AI voice scam.

It also found 65% of adults are not confident they could tell a cloned voice from the real thing.

Earlier this year, voice technology company ElevenLabs provided a cautionary tale by releasing a public voice cloning suite that allowed users to upload any audio to generate an artificial voice.

It led to fake clips of Emma Watson reading Mein Kampf and Joe Biden announcing US troops would enter Ukraine.

Video: 'AI will threaten our democracy'

How can I identify a fake voice?

Regardless of how it was made, there are things you can do to protect yourself against a voice scam.

Question the source – you could ask the person something only they would know to verify them.

What sets them apart – is their accent or pace off? Have they lost their stutter? Listen out for key vocal traits.

Call them back – if the voice sounds right, but the number doesn’t, call them back on their known number.

Identity theft protection services – these notify you if your data is compromised and ends up on the dark web.

A verbal codeword – agree a word or phrase with friends and family that you or they will say to verify an emergency phone call, such as one made from an unfamiliar device.
