It’s no secret that artificial intelligence (AI) has been at the forefront of emerging technologies over the past two years, with many businesses, including Elliptic, taking advantage of AI to enhance their capabilities. However, as with any new innovation, there is always a risk that technologies will be misused for nefarious purposes by criminals exploiting their rising popularity, new opportunities and lack of regulation.
While there is no suggestion that AI-enhanced crypto-crime has yet become a major threat, Elliptic recognizes that proactively identifying and mitigating potential emerging crime trends is critical to promoting long-term sustainable innovation.
That’s why – in addition to implementing AI-enabled solutions to further strengthen our blockchain analytics tools – we’re also releasing our new Horizon Scan report, AI-enabled crime in the cryptoasset ecosystem. In it, we identify five basic typologies of how crypto criminals can use AI to enhance their crimes, based on the indicators observed to date.
However, that is not the end of our efforts. Elliptic seeks to bring together industry partners and thought leaders to jointly help the industry develop best practices and timely strategies to ensure that AI-enabled crime does not become a major threat. You can help us by taking our short Delphi survey, which you can find here.
By participating, you’ll get exclusive early access to the resulting findings, helping you and your industry stay ahead of the curve.
In the meantime, here’s a summary of the insights you can expect from this report:
Using deepfakes – images or voices generated by artificial intelligence – to make scams more convincing
Anyone involved in the crypto space is likely to come across crypto investment scams, many of which now use deepfakes of celebrities and authority figures to promote themselves. The faces of Elon Musk, former Prime Minister of Singapore Lee Hsien Loong, and both the current and former Presidents of Taiwan, Lai Ching-te and Tsai Ing-wen, have been used in such frauds.
Promotional deepfakes are often posted on sites like TikTok and x.com. Other scams involve using AI to fake aspects of a crypto ‘business’ to make it appear more authentic. In 2022, Binance’s former CCO, Patrick Hillmann, was the target of deepfake scammers who used his persona in an attempt to scam potential victims from the crypto industry.
Deepfake footage of Singapore Prime Minister Lee Hsien Loong (left) and Taiwan’s 7th President Tsai Ing-wen promoting cryptocurrency investments.
Fortunately, there are a number of red flag indicators that can help prevent you from falling victim to deepfake scams. To verify video authenticity, you can check that lip movements and voices are in sync, that shadows appear where you expect them, and that facial activity such as blinking looks natural.
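To illustrate the last of those checks, the short sketch below estimates blink frequency from per-frame eye landmarks using the standard eye aspect ratio heuristic. It assumes the landmarks have already been extracted by a separate face-tracking library; the function names, threshold and frame format are illustrative assumptions, not a production detection method.

```python
# Illustrative heuristic only: deepfake footage often under-represents natural
# blinking, so an unusually low blink rate can be one red flag among several.
# Landmark extraction (e.g. with a face-landmark library) is assumed to have
# happened upstream; this sketch works on plain (x, y) coordinates.
from math import dist

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmarks ordered around one eye (p1..p6)."""
    p1, p2, p3, p4, p5, p6 = eye
    vertical = dist(p2, p6) + dist(p3, p5)
    horizontal = 2 * dist(p1, p4)
    return vertical / horizontal

def blink_rate(frames_of_eye_landmarks, fps=30, closed_threshold=0.21):
    """Count open-to-closed transitions and return blinks per minute."""
    blinks, eye_closed = 0, False
    for eye in frames_of_eye_landmarks:
        ear = eye_aspect_ratio(eye)
        if ear < closed_threshold and not eye_closed:
            blinks += 1
            eye_closed = True
        elif ear >= closed_threshold:
            eye_closed = False
    minutes = len(frames_of_eye_landmarks) / fps / 60
    return blinks / minutes if minutes else 0.0

# A typical adult blinks roughly 15-20 times per minute; a rate far below that
# is one (weak) indicator that footage may be synthetic.
```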
Creating AI-themed scam tokens or pump-and-dump schemes
On many blockchains, it takes little effort to create a token. Many fraudsters have taken advantage of this to build hype and drive up the price of their tokens, and then sell their reserves for a significant profit. This drives the price back down and leaves their victims out of pocket with an ultimately worthless investment. This is known as a “rug pull”. Coordinated groups that initiate sudden purchases and sales of tokens also exist to make money from market manipulation or pump-and-dump schemes.
Another way scammers can build hype is by claiming their token is associated with a major new event or company. AI is the theme of the latest spate of such token scams. For example, there are hundreds of tokens listed on several blockchains that have some variation of the term “GPT” in their name. Some may be the product of legitimate ventures; however, Elliptic has identified a number of exit scams among them.
Elliptic Investigator shows a number of high-risk, unrelated tokens – including one referencing ChatGPT – created by the same wallet address, with proceeds from their trading being laundered through a coin swap service.
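As a rough illustration of how such tokens can be screened, the sketch below flags tokens whose names trade on AI buzzwords and groups them by deploying address. The field names, buzzword list and sample data are hypothetical and are not Elliptic’s detection logic.

```python
# Illustrative sketch: flag newly issued tokens that ride AI hype and cluster
# them by deployer address. Field names and sample data are hypothetical.
from collections import defaultdict

AI_BUZZWORDS = ("gpt", "chatgpt", "openai", "grok")

def flag_ai_hype_tokens(tokens):
    """tokens: iterable of dicts with 'name' and 'deployer' keys."""
    flagged = [t for t in tokens if any(b in t["name"].lower() for b in AI_BUZZWORDS)]
    by_deployer = defaultdict(list)
    for t in flagged:
        by_deployer[t["deployer"]].append(t["name"])
    # A single address deploying many hype-named tokens is a classic
    # serial rug-pull pattern worth manual review.
    return {addr: names for addr, names in by_deployer.items() if len(names) > 1}

tokens = [
    {"name": "GPT4Coin", "deployer": "0xabc"},
    {"name": "ChatGPT Inu", "deployer": "0xabc"},
    {"name": "HarmlessToken", "deployer": "0xdef"},
]
print(flag_ai_hype_tokens(tokens))  # {'0xabc': ['GPT4Coin', 'ChatGPT Inu']}
```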
Using large language models to facilitate cyber attacks
Tools like ChatGPT can generate new code or review existing code with varying degrees of accuracy. This has led to an intense debate about whether AI tools can be used for code review and bug checking, and whether black hat hackers can use the same capabilities to identify and design exploits. Although Microsoft and OpenAI have reported cases of Russian and North Korean threat actors engaging in such attempts, white hat hackers have suggested the technology as a whole is not there yet.
ChatGPT and other mainstream tools have, however, become better at identifying and rejecting malicious queries, which has led to cybercriminals taking to dark web forums to ask for GPT services without ‘morals’. As numerous outlets have already reported, that demand has since been answered by paid tools such as HackedGPT and WormGPT.
An ad for WormGPT (left) and a Telegram post advertising one of its features (right).
These “unethical GPTs” openly advertise capabilities such as carding, identity theft, malware, vulnerability scanning, hacking, coding malicious smart contracts, cyberstalking and harassment, and the distribution of sensitive private material, alongside other “unethical requests” for making money, whether legal or illegal.
Such tools, however, have received mixed reviews from users – and blockchain analytics platforms have the advantage of being able to track payments made by subscribers to their administrators. We explore these key capabilities – and how they can be leveraged by both law enforcement investigators and compliance professionals – in more detail in the report.
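As a simplified illustration of that tracking capability, the sketch below aggregates payments sent to an address attributed to a tool’s administrator in order to gauge the size of its paying user base. The transaction format and the attributed address are hypothetical placeholders, not real attribution data.

```python
# Illustrative sketch: aggregate on-chain payments flowing to an address
# attributed to an "unethical GPT" operator, to size its subscriber base.
# The transaction records and the attributed address are hypothetical.
from collections import Counter

ADMIN_ADDRESS = "bc1q-operator-address"  # placeholder attribution

def subscriber_summary(transactions):
    """transactions: iterable of dicts with 'sender', 'recipient', 'amount' keys."""
    payers = Counter()
    total = 0
    for tx in transactions:
        if tx["recipient"] == ADMIN_ADDRESS:
            payers[tx["sender"]] += 1
            total += tx["amount"]
    return {
        "unique_payers": len(payers),
        "payments": sum(payers.values()),
        "total_received": total,
    }

txs = [
    {"sender": "addr1", "recipient": ADMIN_ADDRESS, "amount": 200_000},  # amounts in satoshis
    {"sender": "addr2", "recipient": ADMIN_ADDRESS, "amount": 200_000},
    {"sender": "addr1", "recipient": "somewhere-else", "amount": 1_000_000},
]
print(subscriber_summary(txs))  # {'unique_payers': 2, 'payments': 2, 'total_received': 400000}
```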
Deploying crypto scams or disinformation on a large scale
Some crypto fraudsters may launch a single fraud operation and withdraw once enough money has been stolen or the scheme has been widely exposed. Many threat actor groups, however, engage in cyclical fraud operations: investment, airdrop or giveaway sites are created, widely promoted on social media and messaging apps, and then “retired” once enough victims have drawn attention to their fraudulent nature. The process is then repeated with a new website, fresh marketing and so on.
Cycling through scam sites is often a resource-intensive process, and some illegal groups are looking to make it more efficient by using artificial intelligence. One scam-as-a-service provider claims to use artificial intelligence to automatically design scam site interfaces tailored for SEO considerations.
Catalog of scam website interfaces, allegedly generated using AI by a scam-as-a-service group.
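As one illustrative approach to spotting this recycling, the sketch below fingerprints the structural skeleton of a scam page so that the same template resurfacing under new domains can be grouped together. The hashing method and sample pages are simplified assumptions rather than how any particular platform works.

```python
# Illustrative sketch: fingerprint the structural "skeleton" of scam pages so
# that near-identical templates resurfacing under new domains can be grouped.
# The tag-sequence hash is a deliberately simple stand-in for the richer
# similarity techniques a real investigation platform would use.
import hashlib
import re

def template_fingerprint(html: str) -> str:
    # Keep only the sequence of opening tag names, discarding text and
    # attributes, so cosmetic rebranding does not change the fingerprint.
    tags = re.findall(r"<\s*([a-zA-Z][a-zA-Z0-9]*)", html)
    return hashlib.sha256(" ".join(t.lower() for t in tags).encode()).hexdigest()

def group_by_template(pages):
    """pages: dict of {domain: html}. Returns fingerprint -> list of domains."""
    groups = {}
    for domain, html in pages.items():
        groups.setdefault(template_fingerprint(html), []).append(domain)
    return {fp: doms for fp, doms in groups.items() if len(doms) > 1}

pages = {
    "give-away-btc.example": "<html><body><h1>Claim</h1><form><input></form></body></html>",
    "official-airdrop.example": "<html><body><h1>Win</h1><form><input></form></body></html>",
}
print(group_by_template(pages))  # both domains share one template fingerprint
```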
Scaling up identity theft and document forgery
Facilitating identity theft and issuing false documents is one of the dark web’s most successful criminal enterprises. Cybercrime forums often have designated advertising spots for cybercriminals who boast of their Photoshop skills, offering images of fake passports, ID cards or utility bills within minutes. Now, some of these document generation services are exploring the use of AI to augment their offerings.
One document generation service – which uses the likeness of Keanu Reeves’ John Wick character to advertise its product – has both claimed and denied the use of AI to doctor images. Elliptic identified a crypto address used to make payments to this service, which received enough payments to generate just under 5,000 fake documents within a month.
The document generation service allegedly using AI (left) and an example of a fake document featuring the John Wick image (right).
Stay ahead of the curve
As with almost all major emerging technologies, it bears repeating that their benefits far outweigh their potential for criminal exploitation. However, measured responses from affected stakeholders are important to ensure that victimization is minimized and that technologies such as AI can continue to innovate sustainably.
At Elliptic, we are committed to ensuring that our core crypto intelligence captures AI-enhanced crypto crime so that innovators, financial services, crypto companies and law enforcement can effectively detect, track and mitigate these threats.
Contact us for a demo of our blockchain analytics tools to further explore how we can help protect your business in the changing face of crypto crime.
Remember also to participate in our Delphi survey, which will entitle you to exclusive early access to industry insights on best practices to prevent and mitigate these emerging crime trends.