Gaming Trends Nobody Asked For

Even though it's been around for decades now, video gaming is still a rapidly evolving industry, with new ideas, styles, and methods being developed all the time. That's part of what makes gaming so great: at any moment, a game might change everything. Maybe a guy in Sweden will come up with a game where you mine blocks; maybe somebody else will turn battle royale games into a sensation.

Naturally, if a good idea emerges, other people will copy it. Sometimes this leads to a thriving new genre or business model...or a lot of rip-offs that don't add anything to the original. But what happens if a bad idea emerges? You might think a healthy, thriving industry would recognize the bad idea for what it is, and discard it. But we don't live in a world that nice. Sometimes, a bad idea will be copied, too. And all of a sudden, the whole industry is swamped with that terrible decision.

From "cutting-edge" hardware that went nowhere, to business decisions that screwed gamers over, to one very peculiar weapon choice, here are some gaming trends that nobody asked for.

Motion controls

In 2001, Microsoft debuted the Xbox, formally becoming a player in the console space. This set off an arms race for powerful hardware, as Sony and Microsoft developed their next-generation machines, which became the PlayStation 3 and the Xbox 360. But Sony and Microsoft were both behemoth companies, with near-limitless capital on hand from their various other ventures outside of gaming.

This left Nintendo in a bind. Without any business outside of gaming, Nintendo simply didn't have the resources of its competitors. Nintendo couldn't make a console with the raw power of the PS3 or the 360. So how could Nintendo innovate to stay competitive? The answer was motion controls. Eschewing the standard controller, Nintendo's new console had the simplest input method imaginable: you wave a stick. Called the Wii, the console was an inexpensive package–much cheaper than its rivals–that promised family- and party-friendly games that literally anybody could play, whether or not they knew what a controller was.

The result was a sales sensation: for a while, Nintendo had trouble keeping up with demand. Sony noticed, and when it released the PlayStation 3, players found that Sony had ditched traditional rumble from its controllers in favor of a motion control sensor (rumble was reintroduced later). Later, Sony would release its own motion control wands, in a direct jab at Nintendo.

But while this was all very inviting for non-gamers, it didn't do all that much for actual gamers. Few hardcore titles ever released for the Wii, and when they did, they never looked as nice as they did on the more powerful competition. Meanwhile, the casual crowd didn't follow Nintendo to its next console, the Wii U, which was a sales disappointment despite full backwards compatibility with the Wii. Sony continued supporting its motion control wands on the PlayStation 4, though few non-VR games use them. Once an exciting new direction for video games, motion controls are all but defunct nowadays. Perhaps the new VR trend will bring them back, but for traditional games, it's controllers or bust.

Or keyboard and mouse–pipe down, PC gamers.

Camera controls

As Nintendo developed motion controls, Microsoft was already developing a brand new input that would be more flexible, more powerful, and more eye-catching than anything else on the market: camera controls. The idea was that an integrated camera-microphone system could actually see and hear users, allowing for a range of commands that were simply impossible before. Everything from basic voice prompts to full-range motion capture would be possible.

The result was an add-on for the Xbox 360 called the Kinect. And at first, the Kinect seemed to really catch on: it sold millions of units within months of launch, fast enough to rank among the fastest-selling consumer electronics devices ever. In particular, Kinect opened up the Xbox console to exercise software, allowing the system to help people get fit. Meanwhile, tech futurists saw similarities between the Kinect's input and the gesture-driven UI displayed in the science-fiction film Minority Report. The Kinect, in other words, was helping usher in a whole new future of human-computer interactions.

Except...it didn't. As with motion controls before it, hardcore gamers weren't impressed by the Kinect as a way to actually play games. Most Kinect games were uninspiring, to put it mildly. Worse, since the camera's detection wasn't very precise, most games had to compensate by putting the player on rails for most of the experience–leading to some pretty hilarious results. The Kinect earned a toxic reputation amongst the community, even as Sony–having already taken on Nintendo's motion controls–moved to compete with Microsoft as well by introducing an updated camera controller of its own.

Microsoft got so carried away by the Kinect's potential that they made it a mandatory component of their next console, the Xbox One. To their horror, this was received negatively by the gamer community, and almost became a reason not to buy the new console. Worse, the Kinect's inclusion made the Xbox One more expensive than the PlayStation 4, which drove even more people towards Sony's offering. In the end, Microsoft was forced to make the Kinect an optional add-on instead. In October of 2017, Microsoft killed off the Kinect for good.

Bow and arrow

If you're an elite soldier in advanced tactical gear, or a powerful archmage capable of commanding the very elements, or an archaeologist with a penchant for fighting mercenary armies, what do you need to bring with you? You need a weapon so advanced, so powerful, so capable, that no force in the universe could possibly stand against its awesome might.

But instead, you get a bow and arrow. Like, caveman-level weaponry.

For some reason, right around the end of the seventh generation of consoles, seemingly every game started featuring a bow and arrow. This may have started in fantasy games, where the bow and arrow at least makes sense from a setting perspective. And younger players who mostly played shooters, but then picked up a game like Skyrim, might start with the bow and arrow since it was the closest thing to a gun. But before long, everyone was using the thing.

Crysis 3, which stars a power-armored soldier in a post-apocalyptic New York, put a bow and arrow right on its cover. Lara Croft, in her rebooted Tomb Raider games, turns the bow and arrow into her primary weapon. Even out in the depths of space, games like Warframe showcase hyper-advanced versions of the oldest weapon in the book (well, not counting a pointy stick, of course).

Why so many developers caught on to this idea, and seemingly all at once, remains a mystery. But from Ark: Survival Evolved to PlayerUnknown's Battlegrounds, the bow and arrow remains one of the most widely used weapons in video games. Just don't let Hanzo mains know.

Loot boxes

The industry figured out a while ago that there were a lot of monetization options beyond "purchase the product for a set fee." Once it became clear that gamers were willing to fork out extra cash for extra items, features, or cosmetics, more and more games started letting them do so. The version of this currently sweeping the industry goes by different names depending on the game, but everybody just calls them "loot boxes."

Loot boxes contain a number of goodies, anything from new skins to new weapons and new abilities. Different games handle this differently–some only feature cosmetic items, while others can provide game-changing enhancements–but the key is that the player doesn't know for sure what's in the box. Every time one is acquired, there's a chance the player might get something really great... or might just get garbage.
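
Mechanically, opening a loot box boils down to a weighted random roll. As a rough illustration only, here is a minimal Python sketch of such a draw; the rarity tiers and drop weights below are invented for the example, not taken from any actual game.

import random

# Hypothetical rarity tiers and relative drop weights -- purely
# illustrative numbers, not the odds of any real game.
LOOT_TABLE = {
    "common skin": 60,
    "rare weapon": 25,
    "epic ability": 12,
    "legendary hero": 3,
}

def open_loot_box(table=LOOT_TABLE):
    """Draw one item from the table, weighted by its drop chance."""
    items = list(table)
    weights = list(table.values())
    return random.choices(items, weights=weights, k=1)[0]

# Open five boxes and see what luck delivers.
for _ in range(5):
    print(open_loot_box())

Run it a few times and the "legendary" result shows up only rarely, which is exactly the hook that keeps players opening boxes.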

There's an excitement to getting something unexpected: many games, even without monetization, depend on this. Diablo, for example, is all about collecting piles of loot while hoping that defeating a particularly large demon will grant a particularly great item. But by adding money into the mix, many game-makers are being accused of stiffing their players to squeeze out a few extra dollars.

Star Wars: Battlefront II is one recent example, in which the full-price game also asks players to drop a lot of money on "Crates"...which are just loot boxes. Trouble is, a lot of powerful upgrades can be found in these crates, so players who have gotten more crates, whether by money or by sheer playtime, have a distinct advantage over players who haven't.

The system is reminiscent of booster packs for collectible card games like Magic: The Gathering, which is a successful business model. But booster packs, even with unknown contents, at least give buyers a rough idea of what they're getting for their money. Many loot crates in video games right now, by contrast, don't tell the player anything. Buy one, and hope for the best. Nobody asked for that.

Mandatory server connections

If a game is online, obviously the player must connect to a server somewhere. Older games would often allow–or even force–players to host online matches on their own machines. More recently, however, most games provide a networking backend, so that online functionality is run and controlled by the developer. Some games, though, have started forcing a connection to these studio-operated servers at all times, even if the player is only using otherwise offline components of the game, such as a single-player campaign.

Diablo III and the most recent SimCity were two examples that forced connectivity at all times. Both Blizzard and Maxis claimed that this was required in order to provide a seamless experience, arguing that the whole game was meant to be interconnected. But gamers cried foul, suspecting that the real reason was that Blizzard and EA wanted to use constant connectivity as a DRM system to fight piracy. EA eventually relented and removed the always-on requirement, but Blizzard never did–though the console version of Diablo III does not have the requirement.

Over time, more and more games used similar systems. Fan backlash, however, never relented, and at this point mandatory server connections are no longer prevalent in the industry–except in Blizzard Entertainment games. Still, as more games go online, and as more game purchases are tied to online store accounts, the possibility remains that constant connectivity could return.

Annualization

Activision had a hit World War II shooter series on its hands with Call of Duty and its sequel, Call of Duty 2. But while Infinity Ward took the time to craft a follow-up–a little game called Call of Duty 4: Modern Warfare–Activision decided they wanted another Call of Duty game ready to go in the meantime. And so they tasked developer Treyarch with making Call of Duty 3, and in so doing, created the notion of annualization: releasing a new entry in the same franchise every year. Activision would follow the same playbook with its Guitar Hero series of games.

This strategy paid greater dividends than even Activision expected when Modern Warfare became a megahit that redefined the entire shooter genre. Fans were ravenous for more Call of Duty, and Activision already had games coming down the pipe, one for every year. The sales dominance of Call of Duty over the next few years inspired other studios to adopt the annualization model, such as Ubisoft with the Assassin's Creed franchise.

But this method started grinding franchises into the ground. Guitar Hero was cancelled not long after its annualization started, because people didn't want to buy the same game over and over. Meanwhile, other studios simply didn't have the resources necessary to crank out a sequel every single year without fail. Even Ubisoft, after the bug-ridden disaster that was Assassin's Creed Unity, decided to ditch annualization in favor of producing quality, and relatively bug-free, products.

In time, annualization was replaced by games-as-a-service: the idea that a single game is constantly supported with new content over time. This way, instead of constantly struggling to produce a whole new game, the developer can just concentrate on the foundation of a single game and put all their effort into making it even better. Games like Overwatch and League of Legends have proven how this method can lead to great, and bestselling, products that continue to make money long after the initial release. These days, only Call of Duty still annualizes.

Except for sports games, obviously. Those things will be annual forever.

Pay-to-win

Free-to-play sounds like the most consumer-friendly trend in the history of capitalism: you get awesome games, for free! And indeed, many of the most-played games in the world right now are free-to-play, including the juggernaut MOBA League of Legends. But of course, there's no such thing as a free lunch. Even free games need to make their developers money, and this is done through in-game purchases. These purchases might take the form of those pesky loot boxes, or direct sales of known content, such as skins, heroes, or items.

When done well, it's possible to have a game that's rewarding, fair, and pay-as-you-please. But unsurprisingly, the vast majority of free-to-play titles aren't fair: they're scams in disguise. While anybody can play these games for free, players need to drop a whole wad of cash on in-game resources and upgrades in order to actually stay competitive. This has earned these games the derogatory moniker "pay-to-win."

While this method may con a few gamers into giving up a few dollars, the real targets here are a tiny number of players who will spend anything to win in these games. These so-called 'whales' can generate millions in revenue for pay-to-win developers. Long-term, this probably won't keep a healthy and active playerbase around, but in the short-term, it can make a few people a whole lot of money.

Ever-easier difficulty

Games used to be hard. Back in the glory days of 2D platformers, design concepts like 'balance' and 'playtesting' didn't really exist yet. In this era, when "game designer" was not a distinct profession, a game's programmers and level designers would make all the enemies, weapons, and areas themselves. Sometimes, this would lead to great games. Often, it led to bash-your-head-against-the-wall frustration. Battletoads, anyone?

As the industry matured, dedicated designers emerged. Their job wasn't to program or to build levels: their job was to design the actual gameplay, ensuring it was fun, balanced, and fair to everyone. But one consequence was that games got easier. Getting to the end of a game was no longer a mark of skill; it was just a matter of time. As the gaming market got bigger, publishers wanted more people to buy their products, and that meant making games even easier so that anybody could play them.

The result was a lot of heavily scripted campaigns that featured a giant arrow named "Follow." If the player followed that arrow, well, they would get to the end. Some enemies might be in the way, but on the increasingly easy Normal difficulty, enemies were basically paper dolls. This meant that gamers with actual skill were bored. Even harder difficulty levels didn't fix the problem, since most Hard modes just turned enemies into bullet sponges. The games just weren't challenging the way they used to be.

Fortunately for hardcore players, there's been a recent resurgence of difficult games, spearheaded by From Software's Souls series. Meanwhile, on the multiplayer end of the spectrum, games like League of Legends require an enormous amount of technical skill and in-depth knowledge to play at a high level. While the big AAA titles from major publishers will continue to reach out to all players, hardcore gamers can at least rest easy in the knowledge that they still have a future, too.

Online passes

Big publishers despise used game sales. If a player sells a game to a friend, or to a middleman such as GameStop, the publisher doesn't see any of that money. In an attempt to curtail the practice, publishers began including "online passes" with their games. An online pass was a single-use code that unlocked core functions, such as multiplayer, for that user's account. If the player sold the game, the already-redeemed pass wouldn't work for anyone else, and the new owner would be locked out of those functions unless they paid an extra fee, after they'd already bought the used game.

Needless to say, this didn't go over well with players. It was a clear cash grab by giant corporations, one that added no value whatsoever to the consumer–and actually removed value for second-hand buyers. The backlash was so fierce that publishers eventually phased the practice out; today, it's almost non-existent.

But it's the publishers who have had the last laugh. Most games are bought digitally nowadays, and there is no way to resell a digital purchase to somebody else. It's not that the publishers gave up: it's that the industry caught up to their desires. Sorry, gamers. The corporations won this round.
