Amazon Alexa to Mimic the Voice of Anyone

Matthew.Rosenquist
Jun 24, 2022

Ethical and privacy issues abound. If a malicious actor combines an impersonated voice with a deepfake video of a target, they can make others believe almost anything or harass the victim in unthinkable ways. We already have a deepfake problem; this compounds it greatly.

Even in Amazon’s stated use-case of keeping the memories of deceased loved ones alive, the feature could be more harmful than beneficial. Is synthesizing new, fake interactions really healthy, or does it simply prolong the agony by keeping people from moving on emotionally? Does it cheapen the actual memories of those who have passed?

If I die, someone could create a fake digital persona of me and have it say things I never said or believed. I would have no opportunity to counter such activity. History could be rewritten. Such power can be misused in countless ways, to the deep detriment of many.

It is time we apply Ethical AI frameworks to better understand both the benefits and the risks.


Written by Matthew.Rosenquist

CISO and cybersecurity strategist specializing in the evolution of threats, opportunities, and risks in pursuit of optimal security.
