Librem 5 Latency

Can we please get some interface latency measurements ahead of time? I don’t think any customers expect hardware to compete with Google’s or Samsung’s or Apple’s specification-wise, but latency of HCI is super important - and is a great point of competition. The reason HCI latency on smartphones is a great place to compete is because it has been somewhat ignored in the last generations of Android and iOS. With control of the hardware & software you have a lot of potential for optimizing this - and it doesn’t require having the best CPU & most memory!

Here are some benchmarks, for example:
Useful thread on the topic:

Again, this could be a huge selling point! Can we discuss?

Those numbers seem a bit dated. I’m surprised Apple is that bad. OTOH, keep in mind that in comparison to Android, Linux apps are closer to the metal. Fewer layers of abstraction.
Hardware-wise it’s too late anyway. Alea iacta est: the die is cast.
Software-wise, this might be an area where improvements can be made, but it’s also likely that improvements in other areas are more beneficial. First measure, then improve.


I was not able to find more recent numbers, but I think we can probably collect them.

You said, “keep in mind that in comparison to Android, Linux apps are closer to the metal.”
Exactly! So that’s good news for… Any idea who in the team could “first measure, then improve?” I would be happy to, but Librem 5 isn’t released yet… I definitely think this would be a huge selling point; Apple and Android-based phones have become complacent in many ways.

I see two main issues with latency:

  1. The actual HW/OS/app latency (shown in the benchmarks above)
  2. The amount of loading that is being done “from the cloud” these days. Caching is not done very intelligently, and apps load everything from the cloud every time you open them. Librem could set a good precedent here.

We already have the dev kits available. We’re happy to accept patches to any part of the phone that improves the latency - all our code is out there.


Is it possible to get some measurements posted in here before I shell out for the dev kit? Also, this post shows “last call” for dev kits expiring ~1yr ago:

Sorry, I meant to say that there are people who have the dev kits already, and you can team up with them. Unfortunately, we don’t sell them any longer.

Latency measurements will happen when we notice terrible latencies - so far so good, so there is nothing to show.


Just my personal guess: touch latency will probably not be the thing that needs the most attention. It should not be hard to be at least average.
And as you said, having native apps that don’t load Web stuff is at least as important for snappiness.
I assume a lot of work, even after release, will go into optimizing battery life. There may be many little things that can be improved that add up. Maybe somebody will notice “hey, the touch sensor could be turned off when the screen is off” and suddenly the battery lasts a few hours longer.
Not saying latency is unimportant, but if it’s okay-ish, you’ll win more people with battery life.

Other note: measuring that delay is AFAIK not an easy thing to do. I saw a Microsoft Research video on that. They used a high-speed camera to film an on-screen object following a finger moving around, and measured the delay from the footage. So this needs some preparation.
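To turn such footage into a number, all you need is the camera’s frame rate and a count of frames between the physical touch and the first visible response. A minimal sketch of that arithmetic (the function name and the example figures are illustrative, not from any real measurement):

```python
def latency_from_frames(frames_between, camera_fps):
    """Convert a frame count from high-speed footage into milliseconds.

    frames_between: frames counted between finger contact and the first
    visible on-screen response.
    camera_fps: capture rate of the camera, e.g. 240 fps slow motion.
    """
    return frames_between / camera_fps * 1000.0

# e.g. 29 frames at 240 fps works out to roughly 121 ms end to end
print(round(latency_from_frames(29, 240)))
```

The catch is entirely in counting the frames reliably, not in the math, which is why the preparation matters.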


This claims the tools to measure interface latency can be built for $50.

I would rather know ahead of time what the interface latency is… I’ll be shooting in the dark in conversations about the Librem 5 until I have some real numbers. If this is so cheap to measure and could be a significant selling point, then let’s try to find someone with a dev kit and the willingness to measure. Using their measurement apparatus, they should also measure some Android phone with known latencies, to help verify the apparatus/method.

In this context, would the latency be observable in desktop mode, or just when using multitouch on the phone’s display itself? What about multitouch on external screens?


@reC, this is a really complex topic, which is why I consider it premature to think about optimizing it without any indication that there’s actually a problem.

In a simplified (!) overview (in general valid for all modern architectures / operating systems), the delay consists of three parts. Let’s call them System, Application, and Graphics.

System represents the delay from a signal being physically emitted (touch, click, keypress) to it being processed by the kernel and delivered to the windowing system (X11, Wayland).
In general, this delay should be comparable for different input devices (touch, mouse, keyboard), at least if they are connected the same way (USB, not Bluetooth). This layer probably accounts for most of the delay, but a lot of it comes from the hardware itself (including connectivity, e.g. USB).
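One slice of this System layer can actually be observed from userspace: the kernel timestamps each evdev event, so comparing that timestamp to the moment an application reads the event shows the delivery gap. A rough sketch, assuming a 64-bit Linux system, a touch device at a hypothetical `/dev/input/event0`, and root privileges (it measures only kernel-to-userspace delivery, not the hardware or display parts):

```python
import struct
import time

# struct input_event on 64-bit Linux: struct timeval (two longs),
# then type and code (unsigned shorts) and value (int).
EVENT_FORMAT = "llHHi"
EVENT_SIZE = struct.calcsize(EVENT_FORMAT)

def event_delay_ms(sec, usec, read_time):
    """Gap between the kernel timestamping an event and userspace
    reading it, in milliseconds (both use CLOCK_REALTIME by default)."""
    return (read_time - (sec + usec / 1e6)) * 1000.0

def sample_input_delays(device="/dev/input/event0", count=10):
    """Read raw evdev events and report kernel-to-userspace delays."""
    delays = []
    with open(device, "rb", buffering=0) as f:
        while len(delays) < count:
            data = f.read(EVENT_SIZE)
            sec, usec, ev_type, code, value = struct.unpack(EVENT_FORMAT, data)
            delays.append(event_delay_ms(sec, usec, time.time()))
    return delays
```

This obviously says nothing about the Application or Graphics layers, but it is a cheap first data point that needs no camera.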

Application: the windowing system delivers the input to the application, which updates its state accordingly and redraws the window content. This should be very quick, <1ms.
(Unless we’re talking about a JavaScript-heavy, bloated web page…)

Graphics: finally the delay until the changed content is visible on the screen. In general, this should also be very fast (<1ms), but it depends a lot on the actual implementation (DRI, DMA, compositing…).

Intrinsic delays
I briefly mentioned inherent delays that happen in the hardware (the controller chip of the input device, plus the USB bus, etc.).
But all of the above layers have intrinsic delays that I did not mention, and that cannot easily be avoided. On a multi-core system with a reasonable CPU load, every context switch takes (simplified) 0…10ms. That is: kernel(input) -> windowing system -> application -> compositing manager -> kernel(video). So, worst-case this could add up to 40ms, but it should usually be significantly less. This can be tweaked in the kernel, but if smaller time slices had only benefits, they would obviously be the default.
Additionally, there is the screen refresh rate. At 60Hz, this adds a delay of 0…16ms.
So, 8ms on average, right?
Yes, but easily >30ms if multi-buffered.
That means that going from multi-buffered to unbuffered can save you 30ms of input lag. But possibly somebody enabled buffering for a reason in the first place.
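The back-of-the-envelope figures above can be collected into a tiny model. This is purely illustrative arithmetic following the simplifications in this post (the function names, the four-hop chain, and the defaults are assumptions, not measurements):

```python
def display_delay_ms(queued_buffers, refresh_hz=60):
    """Average display-side delay: half a refresh interval spent
    waiting for the next vsync, plus one full interval for every
    frame already queued ahead of ours."""
    frame_ms = 1000.0 / refresh_hz
    return frame_ms / 2 + queued_buffers * frame_ms

def worst_case_pipeline_ms(context_switches=4, slice_ms=10.0,
                           queued_buffers=2, refresh_hz=60):
    """Pessimistic total: every hop in the chain
    kernel(input) -> windowing system -> application ->
    compositing manager -> kernel(video) eats a full scheduler
    slice, on top of the display delay."""
    return (context_switches * slice_ms
            + display_delay_ms(queued_buffers, refresh_hz))

print(round(display_delay_ms(0), 1))  # unbuffered at 60 Hz: 8.3 ms
print(round(display_delay_ms(2), 1))  # two queued buffers: 41.7 ms
```

Even this toy model makes the point: buffering depth and scheduling dominate long before raw CPU speed enters the picture.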

Here’s Bryan Lunduke - Keyboard lag sucks, talking about how and why systems built more than thirty years ago have less (!!!) input lag, despite CPU clock speeds a thousand times slower than what we have nowadays.

Another interesting read: Why Modern Computers Struggle to Match the Input Latency of an Apple IIe.
It contains a link to the MS video I posted above, but more interestingly a link to this comparison table of different devices throughout the decades.
(It also adds a lot more detail; for example, it explains that the display itself can take something like 10ms to turn a white pixel black, etc.)
It seems to indicate that well designed Linux systems should have an input delay of 50…100ms.
That’s why I’m rather optimistic that the Librem 5 will not be worse than average Android devices.

My favourite developer quote is “All problems in computer science can be solved by another level of indirection”.
Input lag must be the exception to the rule. That’s why some people like to add:
“…except for the problem of having too many levels of indirection”.


Wow! And that’s only the short version, huh…

So far no one is saying to over-optimize or even to optimize… Just to make measurements. I don’t see any reason to put off making measurements. I would simply like to know :slight_smile:

As Dorota said, you could try to team up with somebody who has a dev board and is willing to spend time on this. Or you wait for the phone (which possibly behaves slightly differently anyway), grab an iPhone (that’s what that dude up there used for high-speed filming) and measure it.
That would certainly earn you some geek credits.

Extra points for measuring the differences between setups:
PureOS Gnome
PureOS KDE Plasma
PostmarketOS (yes, it boots; see the 7th paragraph)


In my phone history, each phone has been slower to navigate things like settings than the last, which is really quite sad. I hope Gnome and KDE will eventually be pretty snappy on the Librem 5.

I would hope neither Gnome nor KDE will be used. Thanks… And yes, this is an issue.

GNOME is the default, modified with Purism’s phosh shell. Purism partnered with KDE, so that KDE devs can ensure Plasma Mobile works too. Purism also partnered with UBPorts to hopefully have Ubuntu Touch viable as well.

If you don’t like these options, you can install and use any Linux DE you want, but it almost certainly won’t work well on the phone display, at which point you are welcome to modify them yourself, since it is all open source :slight_smile:


Thanks, I didn’t know that. I find it strange that the default choices would be the two worst-performing Linux desktop environments.

They are the most popular ones and thus most suited for the “convergence” aspect of being able to plug your phone into a monitor and use it as a traditional desktop computer. If most users are coming from GNOME or KDE, they will want to see that when trying out the convergence features.


You see the main customer base of the Librem 5 as people who are “coming from GNOME or KDE”? That’s a tiny number of people in the world, so if that is true, then the phone is being optimized not to be successful.

The number of daily Linux users is already a tiny number of people in the world. I don’t expect the initial phone launch to pull in huge swaths of non-Linux people. Only once it’s grown and successful will, I think, a large number of non-Linux people join.

Within Linux, I think Ubuntu, Fedora, and Debian all ship GNOME by default. I don’t know the numbers, but I would guess those three distros make up the majority of Linux users (happy to see some numbers which prove me wrong, though). Maybe Linux Mint has more users than I know, so maybe Cinnamon is up there as a DE. Maybe a large number of people are still using older Ubuntu versions with Unity. But I do not think that MATE, XFCE, LXQt, Enlightenment, or Budgie have the userbase to warrant Purism development support.

But I am serious about seeing numbers if they exist. I would be interested to know the breakdown of DE usage.
