
Image credit: Unsplash
Researchers from CSIRO’s Data61 have developed a new technique to protect consumers from voice spoofing attacks.
Fraudsters can record a person’s voice commands to voice assistants like Amazon Alexa or Google Assistant and replay them to impersonate that individual. They can also stitch samples together to mimic a person’s voice in order to spoof, or trick, third parties.
Detecting when hackers are attempting to spoof a system
The new solution, called Void (Voice liveness detection), can be embedded in a smartphone or in voice-assistant software. It works by identifying differences in spectral power between a live human voice and a voice replayed through a loudspeaker, allowing it to detect when hackers are attempting to spoof a system.
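To illustrate the general idea of a spectral-power check, here is a minimal, hypothetical sketch (not Void's actual algorithm or features): it splits a signal's power spectrum into frequency bands and normalises the per-band power, producing a feature vector that describes the *shape* of the power distribution. Audio replayed through a loudspeaker tends to distribute power across frequencies differently than a live voice, and a classifier trained on features like these could exploit that. The function name, band count, and toy signals below are all illustrative assumptions.

```python
import numpy as np

def spectral_power_features(signal, sample_rate, n_bands=10):
    """Summarise how signal power is distributed across frequency bands.

    Hypothetical sketch of a spectral-power feature, not Void's real
    feature set: loudspeaker replay typically reshapes the spectral
    power distribution relative to a live human voice.
    """
    # Power spectrum of the (real-valued) signal.
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)

    # Split the spectrum into equal-width bands and sum power per band.
    band_edges = np.linspace(0, freqs[-1], n_bands + 1)
    band_power = np.array([
        spectrum[(freqs >= lo) & (freqs < hi)].sum()
        for lo, hi in zip(band_edges[:-1], band_edges[1:])
    ])

    # Normalise so the features describe the shape, not the loudness.
    return band_power / band_power.sum()

# Toy demo: two synthetic "voices" with different spectral balance.
rate = 16000
t = np.arange(rate) / rate
live = np.sin(2 * np.pi * 200 * t) + 0.5 * np.sin(2 * np.pi * 3000 * t)
replayed = np.sin(2 * np.pi * 200 * t) + 0.1 * np.sin(2 * np.pi * 3000 * t)

f_live = spectral_power_features(live, rate)
f_replay = spectral_power_features(replayed, rate)

# The lowest band carries a larger share of total power in the
# "replayed" signal -- the kind of difference a detector could use.
print(f_live[0] < f_replay[0])  # → True
```

In practice such features would be computed on short frames of recorded speech and fed to a trained classifier; this sketch only shows the band-power computation itself.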