Last week, Ubisoft Entertainment SA’s chief executive officer held an impromptu meeting with staff to defend the company’s NFT initiative. Around the same time, the makers of the upcoming game Stalker 2 tweeted a lengthy defense of their NFTs in response to criticism and then reversed course 20 minutes later as criticism intensified.
A month ago, the people behind the chat app Discord had to backtrack on an NFT announcement for similar reasons.
I’ve been covering the video game industry for nearly 13 years. I’ve watched buzzy trends come and go, from FarmVille clones that analysts said would destroy consoles to virtual reality headsets that promised we’d never look at a television again.
But I’ve never seen a gaming trend inspire as much vitriol as non-fungible tokens — unique strings of data that can (sort of) be used to prove digital ownership. They’ve generated a lot of hype among video game executives for unclear reasons, and fans have grown furious about it all.
Yes, gamers are angrier about NFTs than they were about horse armor, the $2.50 microtransaction that became a symbol of corporate greed when it was announced in 2006, before video game companies really knew what monetizing downloadable content should look like.
Loot boxes, the randomized item mechanisms that resemble slot machines, have led to a lot of anger (and even some brief U.S. Senate inquiries) but nothing even close to what we’re seeing here from gamers.
On Dec. 7, Ubisoft revealed Quartz, a platform for NFTs that allows players of the shooter Ghost Recon to buy and sell certain equipment. The backlash was quick and loud. A YouTube video announcing the service had a ratio of 95% dislikes to 5% likes, while internal documentation reviewed by Bloomberg showed that in just a week, customer sentiment about the Ghost Recon brand flipped from mostly positive to severely negative.
So many Ubisoft employees complained about the initiative that CEO Yves Guillemot held a sudden Q&A with staff to address concerns, as Kotaku first reported.
Why do gamers hate NFTs so much? There are a lot of factors — the environmental impact associated with crypto mining, the frequent scams, the urge to never see a cartoon monkey again — but the biggest is that their mere presence in a video game is an erosion of trust.
Games, especially the big-budget ones with massive worlds, require a large commitment of both money and time from players. There’s a certain understanding that comes with that. If you’re going to sink dozens of hours of your life into the new Fallout or Final Fantasy, you need to trust that the game is going to play by a fair set of rules.
When games start asking for more money, it chips away at that trust by raising questions about the fundamental nature of a game’s design. For example, the wonderful 2018 game Assassin’s Creed Odyssey courted controversy when it introduced a set of boosters that you could buy with real money to level up your character more quickly.
This set off alarm bells in players’ heads. If leveling up felt slow, was that a genuine artistic decision or a design meant to make you feel compelled to pay to speed things up? In the case of Assassin’s Creed Odyssey, it was easy to ignore these boosters, but they created the appearance of impropriety and made some feel a little like chumps for playing — and for paying $60 for the privilege.
This isn’t by any means a new phenomenon. Arcade games were made hyper-difficult so we’d all put in more quarters. Free-to-play games like Clash of Clans and Candy Crush are designed around making it as tempting as possible to buy their digital gems.
But NFTs test players’ trust in a new way. Every game that uses NFTs, whether it was built from the ground up to be “play-to-earn” or simply grafted on blockchain later like Ubisoft Quartz, is designed around an economy in which players can buy and sell digital items to one another.
As a result, every player is incentivized not to have a good time but to make as much money as possible. Economy comes first; enjoyment and artistic value are often secondary.
Blizzard’s Diablo III, the action role-playing game released in 2012, provides the perfect historical example of how poorly this can go. The third Diablo introduced an auction house in which players could trade gear for real cash (and Blizzard would get a cut of every transaction).
You could ignore it, but it was always there, haunting the experience. When you defeated a challenging boss and it failed to drop that one unique sword you really wanted, you’d have to wonder: Is the game rigged to make you want to spend more? Should you just go buy it instead? And when you actually got a cool piece of gear, you’d feel like you were missing out if you didn’t put it up on the auction house to try to earn some cash.
When a video game is coupled with an economy that can impact your real wallet, everything feels different. Your brain shifts from enjoyment to work. Every decision you make is influenced by whether it will win or lose you money.
You’ll second-guess every aspect of the game. You won’t even be able to trust word of mouth — your buddy sending video game recommendations to the group text comes off differently when you know that joining makes their Axies go up in value.
As for Diablo III’s auction house? Blizzard spent nearly two years trying to figure out a solution to make it more palatable, then decided that it couldn’t and removed it entirely. Turns out, it sort of broke the game, and everyone hated it.
Read full story on Bloomberg