The writer, known online as Etymology Nerd, is the author of ‘Algospeak: How Social Media Is Transforming the Future of Language’
This year, a dark new underside has emerged on Instagram Reels: offensive memes manufactured to push cryptocurrency scams, and no one seems interested in removing them.
Since January, a grotesque cast of characters has spread across the app, enabled by the widespread availability of AI tools and by Meta's loosened hate-speech moderation.
There’s “George Droyd,” an android reincarnation of George Floyd, created in April to promote the $FLOYDAI cryptocurrency. The “Kirkinator” was made in September, following the death of political commentator Charlie Kirk, to promote the $KIRKINATOR coin. There is also a recurring cast of minor stock characters, including “Epstron” and “Diddytron”: parodies of Jeffrey Epstein and the rapper Sean Combs, also known as Diddy.
These accounts, which occupy the same narrative universe, have garnered millions of views, often by playing into racist and antisemitic tropes. The short videos include frequent use of bigoted language and recurring plot lines about race purification.
The shocking content is meant to generate engagement. The end goal is to draw attention to new “memecoins” — cryptocurrencies that ostensibly increase in value when a meme spreads. While early memecoins like $DOGE capitalised on existing memes, the George Droyd derivative and its ilk were created by crypto speculators.
The wheeze begins with pump.fun, a platform that lets users register and trade digital coins with ease. Once a developer sets up a coin, they share it in trusted Telegram groups or X communities, where investors think of ways to manufacture attention, or so-called “mindshare”, for the meme connected to it. Next, they use AI to generate provocative videos, hoping the meme goes viral and the coin catches on with normies: unsuspecting investors less steeped in memecoin culture. Once the value rises, the original cabal “rug pulls”, selling out at a profit and abandoning the coin.
Realistically, only a few thousand people will ever buy these coins. Yet because creating a cryptocurrency and posting AI slop are both so cheap, coin creators can manufacture cultural phenomena at almost no cost.
Meanwhile, the memes take on a life of their own. Once other creators see that they have viral potential, they replicate them for their own profit or internet clout. Both the “Kirkinator” and “George Droyd” characters have been used by other influencers who are unaffiliated with the original coin creators.
Yet with each iteration, the crypto hustlers continue to benefit. After one tweet about the Kirkinator reached 8mn views in October, for example, the $KIRKINATOR coin rose fivefold in value before falling back a few days later. For the investors who sold at the top, the profit came courtesy of millions of X users watching a video in which “George Droyd” is killed after stealing the Epstein files from the Kirkinator.
Unfortunately, the more shocking the videos, the more likely they are to go viral. Violent, offensive imagery generates more comments and longer watch times, both of which the algorithm rewards. Coin creators have learnt to exploit this pattern to enrich themselves. Even people on Instagram or X who know nothing about the coins may still be subjected to deeply disturbing clickbait.
We are caught in a maelstrom between deregulated cryptocurrency websites, accessible AI tools and social media platforms that allow offensive memes to be published.
As an internet etymologist, I find it alarming that online trends are being manufactured for the sole purpose of manipulating us. We can no longer trust that memes are genuine; there is now the risk that someone at the other end is trying to profit.
Even when a meme isn’t created by crypto hustlers, it can be co-opted by them: every new reference is registered on pump.fun almost immediately, then artificially propped up for someone’s profit.
The result is that we are all left less tethered to reality. More memes will be invented or amplified, leaving online viewers to question what they can believe. And greater exposure to a deeply unpleasant kind of discourse risks making it seem more acceptable. The only solution is to fight to reclaim the internet from those who are attempting to poison it.