Tim Draper warns of crypto scams using his AI-synthesized voice

Advancements in AI have made it possible to create deepfake videos and voices, which scammers script in attempts to illegally obtain others’ crypto.

American venture capitalist Tim Draper issued a warning on social media that scammers are attempting to con crypto users with an artificial intelligence (AI) voice generator.

In an Oct. 19 post on X (formerly Twitter), Draper warned his roughly 254,000 followers to be mindful of “thieves” using AI to imitate his voice. According to the venture capitalist, “AI is getting smarter,” as evidenced by followers reportedly receiving messages in his cloned voice asking them to send cryptocurrency.

Related: Here’s how to quickly spot a deepfake crypto scam — cybersecurity execs

Recent advancements in AI have made it easier for the average person to generate a convincing clone of a favorite celebrity’s voice or a video of a politician saying whatever the creator wants. Following the collapse of FTX in November 2022, scammers created a deepfake video of former CEO Sam Bankman-Fried offering compensation to affected users. A similar situation occurred with a deepfake of Tesla CEO Elon Musk in May 2022.

Draper, who once predicted that the price of Bitcoin (BTC) would hit $250,000 by 2023, was an early investor in the cryptocurrency. Despite losing roughly 40,000 BTC when Mt. Gox collapsed in 2014, he has continued to advocate for digital assets and the broader crypto space.

Magazine: US gov’t messed up my $250K Bitcoin price prediction: Tim Draper, Hall of Flame
