Deepfake technology, which manipulates people's faces or voices to create fake photos, videos, or audio, has grown rapidly in popularity. As the technology evolves, criminals are beginning to use it with far more serious consequences.

In one case, the CEO of a UK-based energy company received a call from his boss, the CEO of the parent company in Germany. The German boss instructed him to make an emergency transfer of $243,000 to a Hungarian bank account immediately. The voice sounded legitimate, right down to the German accent, so the British CEO complied and transferred the money. In fact, the caller was a scammer using deepfake audio technology to convincingly imitate the German boss's voice. The scammer called twice more asking for further transfers, at which point the British CEO noticed that the calls were being made from an Austrian phone number. Having grown suspicious, he refused to transfer any more money, but the original $243,000 had already been sent and funneled through a series of untraceable accounts.