This post was spurred on by a recent thread on the Unity forums about how everyone’s perspective on games seems to change after learning how to create them, since you now know the tricks they’re using and the magic is gone. Unless a game is truly exceptional, immersion is really difficult to achieve once you know what the tricks are. And, as some people in the thread said, some high-profile games use quite lazy techniques.
I’m inclined to agree. In a sense, I’m subconsciously more critical of the games I play nowadays, since I can see each individual part of the game for what it is, and can think of simple ways the developers could have improved the things they got wrong, things I used to just shrug off. I even deliberately play games I otherwise wouldn’t go anywhere near, including games that are infamously bad, in order to learn what not to do. Of course, not being an idiot, I tend to wait for these games to be 75% off on Steam, or to show up in a bundle that includes stuff I’m actually interested in.
There are also games that I buy with good intentions, to actually enjoy, that end up being a disappointment for one reason or another. Those games have been good learning experiences, and I pick up their sequels to see first-hand how the developers decided to respond to criticism. Sometimes the developers completely miss the nature of the problems, sometimes they only partially fix them, sometimes they make them worse, and sometimes they leave them in as a conscious design choice. Very rarely, I find, are problems actually fixed.
But no matter which path they take, I learn something about game design from each game I play with this mindset, so even disappointing titles are somewhat enjoyable. Sadly, though, not everyone shares this mindset, and many people find themselves unable to enjoy games after learning how the game creation process works. I see where they’re coming from, since many types of games simply aren’t as fun without immersion, and many see this sort of knowledge as an immersion-breaker. But I don’t lose my immersion from knowing how it’s done… at least, not to the extent that most other people seem to.
I am a gamer before I am an aspiring game developer. I enjoy games as I play them, and try to enjoy a game despite its flaws. That said, I do not ignore those flaws; I simply hope that, if the developer gives the series another shot, they fix them. In a sense, this creates an overlap between my gamer and game designer personas. If I enjoyed a game despite its flaws and pick up the sequel, I am hopeful that the issues are fixed. If they are, my gamer side is happy and my game designer side is satisfied. If they aren’t, my gamer side may be disappointed, but my game designer side is absolutely giddy. I follow a similar thought pattern when good games get sequels.
But I still can’t tell if this is a good thing or a bad thing. One has to ask whether examining bad games for their flaws will truly teach you about good game design. I’m not entirely sure myself, and I don’t have any games to my name yet, so I can’t examine my own title to see whether I learned enough. Time will tell whether I wasted money picking up bad games at massive discounts. Specifically, it will come down to how the first release of Turtles all the Way holds up, after adjusting for the fact that I coded it alone with relatively minimal experience.
P.S. I have deliberately avoided naming any particular games in this post (aside from my own, which isn’t out yet), since quality is subjective, and if my opinion of a game’s quality contradicted a reader’s, they might miss the point. Besides, I could always praise or criticize specific games in a separate post.