Notably, the topic of deepfake technology, and artificial intelligence in general, was addressed the day before by Galaxy Digital CEO and billionaire Mike Novogratz. In his view, the authorities should fear AI first and foremost rather than create problems for the digital asset industry.

Novogratz bases his view primarily on the so-called identity crisis caused by AI-generated fake videos. The technology makes it possible to create fake videos, photos and audio that could harm people well beyond cryptocurrency exchanges.

Galaxy Digital CEO Mike Novogratz

For example, a convincing fake video of a public figure, such as a president or a famous entrepreneur, could move the stock market or even crash it. And since Elon Musk is now selling Twitter's "blue badges" of verified identity left and right, scammers have it even easier.

After all, users of the social network have historically placed more trust in verified accounts. Now, alas, there are plenty of "verified" accounts with just a few followers.

The problems of cryptocurrency exchanges

Deepfakes are created with artificial intelligence tools that use machine learning to produce convincing audio, images or video resembling a real person. The technology itself does not break any laws, but attackers are already actively exploiting it, and experts say the situation will only get worse.

In an interview with Cointelegraph, Jimmy Su of Binance said the exchange is recording a rise in fraudsters trying to bypass its KYC process using deepfakes. Here is a relevant quote from the platform's representative.

A hacker can find an ordinary photo of a victim somewhere on the internet. Based on it, using deepfake technology, they can create a video of the person's face to pass identity verification on the exchange.

For a demonstration, take a look at the following video. Here, a deepfake successfully passes identity verification, essentially fooling the KYC service provider. As the video's authors noted, they tested ten identity verification platforms, and every one of them turned out to be vulnerable.

As a reminder, the KYC procedure is mandatory on the vast majority of cryptocurrency exchanges, and Binance was among the first to make identity verification compulsory, back in the summer of 2021. The requirement stems from regulators fighting money laundering in the crypto industry.

On the face of it, KYC on Binance is nearly impossible for an outsider to pass, but deepfakes could change that. Here is Su's comment on the matter.

Some checks require the user, for example, to blink their left eye or to look left, right, up or down. Artificial intelligence is already advanced enough to actually perform these commands.

Again, the clip above confirms this.
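The challenge–response flow Su describes can be sketched as a toy protocol. This is not any real KYC provider's API; the challenge names and functions below are invented for illustration, and real systems analyse live video frames rather than plain labels:

```python
import secrets

# Hypothetical liveness actions a KYC service might request.
CHALLENGES = ["blink_left_eye", "look_left", "look_right", "look_up", "look_down"]

def issue_challenges(n=3):
    """Pick n random liveness actions the user must perform, in order."""
    return [secrets.choice(CHALLENGES) for _ in range(n)]

def verify_responses(issued, observed):
    """Pass only if every requested action was observed, in the same order."""
    return issued == observed

challenges = issue_challenges()
print(verify_responses(challenges, list(challenges)))  # True: all actions matched
print(verify_responses(challenges, []))                # False: no actions observed
```

Su's point is that a sufficiently advanced deepfake can generate the requested eye and head movements on demand, satisfying exactly this kind of check even though the randomness is meant to defeat pre-recorded video.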

To the human eye, deepfakes are still detectable: they give themselves away through odd facial movements and other "artefacts". With that in mind, employees of identity-verification companies can still screen out intruders. But the technology keeps evolving, so over time it will become increasingly difficult to distinguish fake video from reality. Here is Jimmy's quote to that effect.

When watching these videos, there are certain things we can still spot with the human eye. Artificial intelligence will get better over time and correct those inaccuracies. So the human eye is not something we can always rely on.

Deepfakes can be used for more than just entertainment

In August 2022, Binance communications director Patrick Hillmann said that a team of hackers had used his past interviews to create a deepfake. Hillmann's "fake version" was then used to hold Zoom calls with various crypto startup teams. The attackers promised to get their coins listed on Binance, for a sum of money, of course.

This is yet another use of advanced technology to scam people. It is fair to assume that the ways of stealing money with deepfakes are limited only by the perpetrator's imagination.

Twitter app interface


We think deepfakes could pose a real problem for cryptocurrency exchanges, which could face fines if a fake identity is proven, especially if the fraudster is located in a sanctioned region. So it is quite possible that a more thorough KYC procedure awaits cryptocurrency enthusiasts in the future. Nor should we forget the prospect of more sophisticated online scams exploiting celebrity likenesses.

What do you think about what's going on? Share your opinion in our crypto chat of former millionaires, whose members are eagerly waiting out for a new bull run in the coin market.