New Post: The Danger of Focusing on Specs

Many years ago I was a sysadmin for a medium-sized tech company, and a fringe benefit of that role was getting first choice from the stacks of “obsolete” computers that were about to be thrown away. They say that one man’s trash is another man’s treasure, and that is even truer when the first man ran Windows but you run Linux. It has long been known in the Linux community that a Windows computer that was “too slow to use” and about to be thrown away could be transformed into a brand new computer simply by installing Linux on it. While my Windows-using colleagues were replacing computers every two or three years as they grew slower and slower with age, my Linux-using friends and I were often using the same hardware (even second-hand hardware) for at least twice as long. Even when I replaced hardware with something new, I found that the old hardware still performed, for the most part, as well as it did when I started using it. The hardware specs didn’t matter nearly as much as the software that ran on it.

Even today, many people still fall into the trap of relying solely on specs to gauge whether hardware is “fast” or “slow,” forgetting the giant role software plays in performance. Both hardware and software companies incentivize this mentality, as it means more frequent sales for hardware vendors, and customers who are more likely to blame their “old” hardware than bloated software for poor performance. In this article I will discuss some of the consequences that come when you assess hardware only by its specs.

Read the rest of the post here:


Good read. This issue of slow software has come up a lot lately for me, as I await my Librem 5 and try to make my PinePhone more pleasant to use.

I encourage Purism to focus even more on providing efficient software that meets people’s needs, for example, taking inspiration from sxmo, which is the most usable interface for me on the PinePhone, because it is so much faster to use than the other options. When I am using sxmo with foot terminal, w3m browser, and wvkbd keyboard, I find that my “slow” PinePhone actually outperforms my family members’ Android devices for some web browsing tasks. Comparing that to using Firefox or Waydroid, I will just say that the PinePhone does not . . . outperform . . . to say the least. Although I have heard that the Librem 5 is a significant step up in performance from the PinePhone, I believe there will be a parallel with the Librem 5 as well.

To be clear: I would not want Phosh to become sxmo. For more than 95% of non-techie people, sxmo would be horrible to use. The vast majority of people do not want to learn terminal commands to do basic tasks on their devices. Phosh is necessary for the wide adoption of mobile Linux among non-techie people. Nevertheless, I think that sxmo can be a good inspiration for thinking about how Phosh and the Librem 5 can become more efficient and faster.


I do think the mobile use case is causing Linux app developers to revisit their own apps’ performance (and design). It is easy to miss a performance problem if you are testing an application on an over-specced laptop with many fast cores and tons of RAM, on a gigabit network. This is the same sort of wave that happened with websites and web applications over a decade ago when people started visiting them from mobile browsers and having poor experiences over a 2G or 3G network with bloated, busy websites that didn’t fit on the mobile screen.

I think that when we have more modestly-specced mobile platforms that run (and this is the key) the exact same software as the desktop, platforms that we intend to support and improve going forward, there is more incentive (as there is on console gaming platforms) to improve and optimize software performance, instead of just throwing more hardware at the problem.


And the critical issue with company laptops is the mandated Windows platform.

There used to be a proverb in the 1980s about IT managers: “No one ever got fired for buying IBM.” (That’s when the central platform wars were ongoing, HP vs. IBM vs. VAX vs. Burroughs, and a half dozen others also no longer in business.)

Just swap IBM for Windows today for that proverb.

Even though the swap trades hardware for software, which is a bit like apples vs. oranges, I still think the comparison is apt.


As someone working on development for an x186-based platform, this post really resonates with me. It is crazy how fast technology improved in the computing industry, yet hardware that we all consider ancient is still very capable. I think software engineers should be required to develop on a system of yesteryear. Being constrained in memory, processing power, and display helps you write better software.

That said, Kyle, I just wanted to publicly say that I think your hobbies are rad!


Looxury. When I were a lad we used to dream of having an x186. :wink:

I think another potential benefit of developing on a system of yesteryears is that you often end up developing closer to the real system and gain an understanding of how things really work under the hood. But we digress …


Yeah, the CPU in the HP 200 LX is just a legend. A fully capable XT-class machine in the palm of your hand, all running on two AA batteries.

Just the other day I was able to get Wolfenstein 3D running on it. Wolfenstein 3D CGA, for those interested, courtesy of the same guy who brought us the Microweb browser. It runs well on my LX, which is a double-speed, 32 MB version (upgraded, of course). I have already beaten the first level. Not that easy using the keyboard, though. Aiming is a pain.


I think, at the very least, if companies insist on giving developers high-powered, cutting-edge hardware to develop on, whoever is testing that software for release should be testing on much less modern gear.
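You don’t even need vintage hardware to approximate that on Linux: a tester can cap a process’s resources on a modern box. A minimal sketch, assuming a systemd user session and an `eth0` interface; the limits and the `./myapp` binary are placeholder examples, not a recommendation for specific values:

```shell
# Run the app under test with roughly "old laptop" resources:
# 512 MB of RAM and a quarter of one CPU core.
systemd-run --user --scope -p MemoryMax=512M -p CPUQuota=25% ./myapp

# Simulate a slow mobile-ish network (300 ms latency, ~1 Mbit/s)
# on the interface the app uses (interface name is an assumption):
sudo tc qdisc add dev eth0 root netem delay 300ms rate 1mbit

# Remove the network shaping when done:
sudo tc qdisc del dev eth0 root
```

This catches the same class of problems the post describes: software that feels fine on an over-specced workstation but falls apart under modest memory, CPU, or bandwidth.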

I don’t fault the developers as much as the industry. For a long time, the promise of newer, faster hardware has incentivized optimizing the speed of software development rather than the software itself. Software developers are among the most expensive parts of the process, so entire industries, methodologies, and frameworks sprang up focused on making software require fewer development resources instead of fewer hardware resources. As long as software is “fast enough” on the latest hardware, that’s acceptable for many organizations.

As a sysadmin I had plenty of conversations about software optimization vs. hardware upgrades, and most of the time the business argued that it was cheaper to upgrade the hardware than to spend more developer time making the software use fewer resources. So until we got to the point where we could no longer scale hardware vertically or horizontally, that was the accepted way to make software faster.

Optimizing software for speed takes time and expertise. When you layer framework upon framework and optimize for speed of development, you aren’t going to end up with fast software.


There are three different aspects here:

  • what you develop on
  • what you develop for
  • what hardware specs get tested

(So, for example for the first two, I would rather cross-compile on my desktop for my Librem 5.)
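A minimal sketch of that split (building on the fast desktop, running on the modest phone), assuming a Debian/PureOS desktop with the `gcc-aarch64-linux-gnu` cross toolchain installed; the hostname and username are placeholders:

```shell
# Build on the desktop *for* the Librem 5 (aarch64):
aarch64-linux-gnu-gcc -O2 -o hello.aarch64 hello.c

# Confirm the binary targets the phone, not the desktop,
# before wondering why it won't run locally:
file hello.aarch64
# expect something like: "ELF 64-bit LSB executable, ARM aarch64, ..."

# Copy it over and test it on the real, modestly-specced hardware:
scp hello.aarch64 purism@librem5.local:~/
```

That way the expensive build cycles happen on the powerful machine, while the performance testing still happens on the hardware that users will actually have.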

It’s a fine line between “layered, frameworked bloatware” and “premature optimisation”.

I’m curious. Is this a work project or a personal project? Because I would have thought that, for general personal use, a Librem 5 with Bluetooth folding keyboard (or something like it) would be a similar form factor but with some significant improvements.

HP110 Plus