Why Optimization is Poised for a Breakout
March 20, 2023, by Matthew Goodwin
Optimization has been a key part of every software system since the first days of coding. Even so, the emphasis placed on it has diminished over time. Thanks to performance bottlenecks and mounting technical debt, optimizing code is returning to the limelight, where it's becoming harder to overlook. Some industries have already taken this idea to heart while others struggle, and with the field balanced at the precipice of serious hardware limits, things are going to have to change.
Optimization Done Right
Optimization in the contemporary software space is often best performed on smaller individual pieces of software, thanks to the relative simplicity they present. Some of the best illustrations come from games in the online casino industry. Take Sweet Bonanza at Paddy Power as an example. Built around a sweet candy theme, this game is lighthearted, but the code behind it rests on an extremely strong foundation. Designed in HTML5, this and other slot games are built to cater not just to modern mobile phones, but also to those dating back multiple generations. In taking this approach, the designers maximize both access and performance, but in the wider software space, this kind of development can be frustratingly rare.
Release Now, Fix Never
There’s a constantly repeating pattern in coding where a functional solution is regarded as good enough, and then abandoned. You don’t have to look far to find an illustration of this pattern, as the search functionality in Windows has long been a prime example. Failed attempts to find files that definitely exist can take minutes, and Microsoft’s effort has long been a sticking point for users. Yet a Voidtools program named Everything can easily find files that Windows’ built-in search can’t, and in a fraction of the time.
“Everything file search” (CC BY 2.0) by passtheballtotucker
While sometimes this kind of oversight isn’t at all intentional, there are other times when it’s accepted under the logic that faster tech in the future will make up for a program’s shortcomings today. The thinking goes along the lines of “So what if a program runs poorly now? The computers five years from now will have no trouble!” Of course, this doesn’t help people today, or users on older systems.
Other issues tie into a famous programming quote from Tony Hoare, which states that “premature optimization is the root of all evil”. As noted in this article in Ubiquity, the idea is that because you don’t know how the completed code will look, optimizing before you’re done will introduce future complications. While this is true, it’s also true that the final optimization phase is then often overlooked entirely, thanks to cost and time constraints.
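To make the “optimize late, but actually do it” idea concrete, here is a minimal sketch (not from the article, and the function names are invented for illustration) of the workflow Hoare’s quote implies: profile first to find the real hotspot, then apply a targeted fix and verify it gives the same answer.

```python
# Measure before optimizing: profile a naive duplicate-finder, then
# compare it against a targeted set-based rewrite.
import cProfile
import io
import pstats

def find_duplicates_naive(items):
    """O(n^2): re-scans the rest of the list for every element."""
    dupes = []
    for i, a in enumerate(items):
        if a in items[i + 1:] and a not in dupes:
            dupes.append(a)
    return dupes

def find_duplicates_fast(items):
    """O(n): tracks previously seen values in sets."""
    seen, dupes = set(), set()
    for a in items:
        if a in seen:
            dupes.add(a)
        seen.add(a)
    return sorted(dupes)

data = list(range(500)) * 2  # every value appears exactly twice

# Profile the naive version so the rewrite is justified by data,
# not guesswork -- the stats will show where the time actually goes.
profiler = cProfile.Profile()
profiler.enable()
naive_result = find_duplicates_naive(data)
profiler.disable()

stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(3)
print(stream.getvalue())

# The optimized version must remain functionally identical.
assert sorted(naive_result) == find_duplicates_fast(data)
```

The point isn’t the specific functions, which are just stand-ins; it’s the order of operations. Skipping the profiling step is premature optimization, while skipping the rewrite step is the “release now, fix never” trap described above.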
Now’s the Time
We’ve reached a point now in hardware where finding gains in performance is becoming problematic. GPUs like Nvidia’s RTX 4090 are so large that they can’t fit in regular cases, and CPUs like Intel’s Core i9-13900K run so hot that they’re bound by inevitable thermal throttling at peak performance. Though it’s still possible to pass problems onto the future, that solution is proving less viable by the year.
“scythe katana 3” (CC BY 2.0) by nsr1986
Addressing the optimization issue is not something that can be done easily. This kind of work requires specialization, and it means developers need to give time and funds to programmers despite the increasing cost and complexity of high-tech programming projects. Given the current trajectory, however, avoiding optimization is increasingly a dead end. Whether it likes it or not, the market is about to have its hand forced, and for users, that could be just what we need.