Have you ever tried playing the final mission of GTA 4 on a slow computer? I have! I almost managed to finish it. There is a bug in the final mission: you are riding a boat and must catch up with a helicopter. At the time, I wasn’t able to catch up with it. No matter how many times I tried, the helicopter stayed out of reach. GTA 4 was a console-to-PC port, and the performance wasn’t quite there yet, even on fairly good hardware that wasn’t too old compared to the release date.
I got tired of trying to beat the game and didn’t touch it for a couple of years, until recently. Since my last attempt at finishing GTA 4, GFWL has been discontinued, I’ve upgraded my computer to high-end hardware, and I did kind of miss the game. Stockholm syndrome is real.
After fiddling with the ridiculously stupid and user-unfriendly Games for Windows – Live enough to get it working, I was able to launch GTA 4 and play it. So I loaded my last save and began the final mission.
On my first attempt, I was able to catch the helicopter! Wow, it wasn’t that difficult, was it?
But alas, it was too good to be true.
To enter the helicopter, you must repeatedly press the [space] key. Tap it really fast. I wasn’t able to, fell out of the sky – and died. Immersion ruined – again.
After searching online about what was wrong, it turns out my computer was now too fast for the game. A couple of years ago it was too slow. That is hardly robust at all.
To be able to beat the mission, I had to do the following:
The items in bold should be totally useless, but such workarounds are common in games ported from consoles to PCs. The only goal is to get the job done, not to write robust code. This is a prime example of sloppy work, and Rockstar will not be remembered as a studio that makes bug-free games. “Sadly”, they make great games – when they are playable.
When playing games, nothing ruins the immersion more than game-breaking bugs that make the game unbeatable.
For applications, it means the customer can’t save or print a document when the deadline is one minute away – or other seemingly minor problems.
The customer will curse you, demand a refund, and never do business with you again unless they are forced to. Once you have proven you can’t take care of your customers, their perception of you as a company will not change, (almost) no matter what you do. When creating software, or any product, it must be robust enough to handle the future.
By robust, I don’t mean free of the small bugs here and there. I mean free of the more subtle bugs that might appear on different hardware, or on faster hardware in a couple of years. These can be bugs that happen only on a particular set of hardware or with a certain sequence of user inputs.
In my experience, those bugs are the most costly and difficult to 1) find and verify, 2) reproduce reliably, and 3) eliminate. As a professional programmer, I’ve had my share of bugs. Anyone who claims they have never had bugs is either a pathological liar or not a programmer – or both.
Don’t make your customers hate you.
The key is to write future-proof code and algorithms. You’ll never know exactly what will change, but there are certain dead giveaways you can bet on. At the time of writing, you will not get faster CPU cores. Instead, you’ll be getting more cores, faster and more memory (RAM), faster and more storage (SSDs), faster internet connections, and displays heading for 4k resolutions (UHD and friends).
Among programmers there is a saying, “The Free Lunch Is Over”, coined by Herb Sutter. For years, programmers and programs didn’t have to do anything to perform better on new hardware. The CPUs simply got faster and faster.
Not so much anymore. The trend has been clear for years: CPUs are getting more cores, and some systems have multiple CPU sockets, each with multiple cores. The complexity will only increase, and squeezing the most juice from a system means building specialized binaries for particular CPUs. That is not fully possible with games, which must target a broad audience, but it is possible to build and distribute 32-bit and 64-bit targets with varying degrees of optimizations turned on.
Memory is no longer “just memory”. As memory grows, it becomes more beneficial to assign memory banks to specific cores. One such technology is NUMA (Non-Uniform Memory Access): in simple terms, hardware designed to attach specific memory banks to specific CPUs. Accessing memory on the local NUMA node has low latency, while remote memory has higher latency.
SSDs are getting faster and faster and bigger and bigger. This is our current “free lunch”.
There is a multitude of displays, and 1 pixel is no longer just a pixel. To display text at a sensible size onscreen, you must scale it. A 20-pixel font is fine on a 640×480 display, but on a 4k display the text will be barely readable.
Internet connections are getting faster and better, but customers live in different parts of the world. You will have to make sure your application handles poor connections in a robust way: variable latency, different throughput, random disconnections, and so on. All network operations can fail, and your code must expect it.
Part of being a software developer is not only writing code, but also understanding the system in a broader sense while making sense of the part being worked on right now.
When working on a small and isolated problem, you make certain assumptions about how this particular piece of code will behave and how it will be used. Most developers don’t have the luxury of predicting how that piece will actually be used. By using robust algorithms and software constructs like smart pointers, you are already a long way toward making sure your customers are happy.