Your PC could be faster, but software companies don’t care

Simple logic suggests that as computers get more powerful, applications should run faster on average. Yet many tasks seem not to have improved at all: they feel no quicker than they did on our previous PC. This phenomenon is colloquially known as Wirth's Law. Why does it happen?

In computing there is the well-known Moore's Law, which describes the growing complexity of chips (their transistor count), not their speed. Amdahl's Law has also been with us for a long time, and it became especially relevant in the mid-2000s, when the industry had to make the jump to multi-core chips. All of these laws are about hardware. But hardware's job is to execute software, and in terms of performance, bad programming practices can squander the gains the hardware delivers.

What is Wirth’s Law and how does it affect the performance of my PC?

As hardware resources have become almost limitless, the discipline of writing good code has been lost. What matters is not just that a program works, but that it works using as few hardware resources as possible, whether memory or processor time. The problem is that as hardware performance increases, issues that were previously solved with good programming discipline are increasingly ignored, and this affects not only PCs and mobile phones.
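The difference that this discipline makes is easy to demonstrate. As a hypothetical sketch (the function names and data here are illustrative, not from the article), compare two Python ways of checking many values against a collection: scanning a list does linear work per lookup, while a set does roughly constant work, so the lean version stays fast as the data grows even though both produce the same result:

```python
# Two ways to count how many query values appear in a dataset.
# Functionally identical; resource usage is very different.

def count_hits_wasteful(data, queries):
    # Scans the whole list for every query: O(len(data)) work per lookup.
    return sum(1 for q in queries if q in data)

def count_hits_lean(data, queries):
    # Builds a set once; each lookup is then O(1) on average.
    lookup = set(data)
    return sum(1 for q in queries if q in lookup)

data = list(range(10_000))
queries = list(range(0, 20_000, 2))

# Both give the same answer; only the amount of work done differs.
assert count_hits_wasteful(data, queries) == count_hits_lean(data, queries)
```

On fast hardware both versions feel instant at this size, which is exactly how the wasteful one survives: the cost only becomes visible when the data grows or the machine is older.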

Thus, we find ourselves with applications that should consume only a fraction of the resources they actually use, but instead devour them the way the famous blue monster from Sesame Street devours cookies. The name of the law we owe to Niklaus Wirth, who in February 1995 wrote an article entitled "A Plea for Lean Software".

Let's say that with each new iteration, bad practices mean that as processing capacity increases, programs are made heavier and less efficient. Take Microsoft Word, for example: 99% of people still use it the way they did 20 years ago. Yet the application has grown enormously in size, and a PC from back then could not run the new version well.

Is it unavoidable?

Not really: you just need to run older versions of the programs to gain performance. As silly as it may sound, sometimes digging up old versions through abandonware sites to perform certain tasks works much better than using the newer releases. And Wirth's Law does not only show up in everyday applications. We have seen how certain re-releases of games, remastered versions of classics from yesteryear, end up performing poorly even on new hardware.

So if certain things do not seem to go any faster, the fault does not lie with our PCs suddenly slowing down or shortchanging us on performance. The fault lies with the software, which has stopped being optimized and has therefore become less efficient. Many companies have done away with quality departments, the people who check that code is well written and optimized. A good application not only works; it works using the fewest resources possible.

And why is this happening? Because deadlines and release dates are imposed by the sales departments. After all, software can always be updated and patched later. The problem is that many applications we believe work well are actually performing worse than they should.