What proprietary phone(s) most closely compares w/ Librem 5?

I’m curious to know which proprietary phone is closest to the Librem 5 in hardware and in functionality (they may differ). E.g., I can’t find which proprietary processor most closely compares with the Librem 5’s i.MX8. I’d like to know the answer to both questions.


Hey, there’s the PinePhone, which doesn’t focus on openness as much as the Librem 5; it’s very similar but ships with more proprietary blobs.
If you’re asking on a more widespread scale I won’t be able to help much, since I don’t know the ARM world well (I have fewer problems on x86/x64 systems), but I think I saw somewhere on the web (maybe it was here, or maybe someone in the comments on Phoronix, idk, so take it very lightly since I can’t source anything) that in raw power the Librem 5 could be compared to a 2015 flagship phone.
But I don’t really know if you can compare Android to pure GNU/Linux, since the latter is supposed to be lighter than the former.

Thanks for the response. But I’m interested only in the hardware: how the Librem 5’s computing power will compare with, say, a Samsung Galaxy or LG or Nexus, etc. That’s why I was asking, for example, how the i.MX8 compares with a Snapdragon or Exynos processor.


The Librem 5 will have the i.MX 8M Quad 28nm SoC, which contains four Cortex-A53 cores. Here are the only benchmarks that I can find for this SoC:

You can compare the CPU in the i.MX 8M Quad to the Snapdragon 425 which was released in 2016, but the i.MX 8M Quad isn’t as power efficient, and it doesn’t have a dedicated digital signal processor or image signal processor like the Snapdragon 425:

You can compare the Librem 5 to a Moto E5 Plus, which also had 3 GB of RAM.

However, the specs of the SoC don’t tell the entire story. The Librem 5 will be running Linux/Wayland/GTK+/phosh, which should be more efficient and require fewer processing cycles and less RAM than Android, which runs everything in a Java virtual machine, so its interface speed will probably be decent and it should be capable of driving an external 1080p monitor with no problems.

As a computer programmer, I know that modern SoCs have plenty of processing power, and benchmarks aren’t that important at the end of the day. You generally don’t see the difference unless you are comparing two phones side by side. Where you will probably see a difference is in the speed of the camera. For everything else, you won’t have the kind of software where SoC performance matters enough to worry about. You are only likely to notice the performance of the SoC if you plan on doing some intense desktop work on an external monitor, or if somebody decides to make an app that runs Doom.

At this point, we don’t know the camera specs, so it is really premature to speculate about the Librem 5’s camera and video performance, but I wouldn’t plan on it being too good. The other area where the Librem 5 probably won’t be that good is battery life, so you will probably have to charge the phone every night.

However, the Librem 5 will be able to do things that other phones can’t. No other phone is able to run Linux and Coreboot and work on 100% free software, which means that your software will be supported forever, whereas Android phones only get 2-3 years of software updates. No other phone will allow you to run a Linux desktop. No other phone has an SoC with 10 years of support from the manufacturer, plus an M.2 slot so you can replace the cellular modem, plus a replaceable battery, so the Librem 5 could potentially last 10 years. If you think of it that way, the SoC in the Librem 5 is pretty good.


Wow! Thanks, amosbatto, that’s the kind of response I was hoping for. Notwithstanding the wonderful FOSS software and other security/privacy features the 5 will have, your hardware performance info is still considerably disappointing, but I believe I understand why. I suppose that as sales/adoption increase over the next few years, the resulting income will go toward meeting continuing demand and, thus, upgrading processing power. By contrast with the L5’s $650-700 price, I just bought a new (two-year-old technology) Moto X4 for $160, including tax, which has much better specs than even the Moto E5 (I haven’t checked all the details).

Any idea what the L5’s display’s specs will be?

No, the E5 has only 16 GB of storage, whereas the X4 and L5 have 32. Now that I check, it looks like the X4 is considerably more powerful than both the E5 and what the L5 will be.

The Librem 5 devkit had a 1440x720 18:9 5.7 inch LCD display. The Librem 5 might have a better screen, but I wouldn’t count on it.

The Librem 5 will have 3 GB RAM and 32 GB storage. The cheaper version of the Moto X4 has the same (3GB RAM and 32GB storage), which is probably what you got if you only paid $160, but the Snapdragon 630 in the Moto X4 is a much better SoC.

The i.MX 8M Quad is the best SoC that Purism can get for a mobile device that runs on 100% free software, and Purism had to pressure NXP not to require a binary blob to initialize the SoC. See:

By the way, the Vivante GC7000 Lite GPU in the i.MX 8M Quad is not that bad.


The Moto X4’s display is sharper than the devkit’s, but the devkit’s still doesn’t look too bad. Yes, the X4’s memory/storage is 3/32 w/ the Snapdragon 630, as you guessed.

Yes, I’d read something to that effect. I’m sure that as the 5 gains traction w/ early adopters, followed by ideologues and privacy/security people, the increased sales volume will enable Purism’s hardware improvements.

Can you explain this and give me some reference?
Thanks for your time and help. I really appreciate it.

While not a phone, a Raspberry Pi 3B+ makes a decent comparison to the system as a whole.

The phone’s hardware is either identical to (CPU) or a fair bit stronger than the Pi’s (3x the RAM, which will also be faster; 3x the 3D acceleration), but the real similarity is the OS. Both run what is essentially Debian, so overall system responsiveness will be directly comparable between the two (but again, noticeably better on the phone because of the increased RAM).


Very interesting. Your Wikipedia reference says, I believe, that the RPi3’s processor speed is 1.4 GHz; isn’t the i.MX8’s 1.7 GHz? Only a bit faster (the X4’s is 2.2 GHz). But I don’t understand how you can say that the i.MX8 is equivalent to a Snapdragon 801. I’ve read at least twice now that the nearest Snapdragon comparison is the 425, whereas the 801 is used in the Moto X (my first, now old, phone, which had/has 2 GB RAM) and in the Samsung Galaxy S5 and LG G3. Can you expand on this?

Are you saying that the 801 (2.5GHz) has the slightly faster cpu speed?

But which phone out there will the 5’s performance compare with, be equivalent or similar to, taking into account the 5’s more efficient OS?

Again, thank you very much for this info. I’ll appreciate all the understanding I can get.

The i.MX8 is 1.6 GHz (and with a stronger CPU architecture, Cortex-A72), but the i.MX8M is 1.5 GHz (Cortex-A53). We’re getting the latter. The Pi 3B+ is also 4x Cortex-A53, running at 1.4 GHz.

As for the CPU strength comparison, my numbers came from https://en.wikipedia.org/wiki/Snapdragon_810#Snapdragon_800,801_and_805(2013/14) (4x 2.3 GHz Krait 400), https://en.wikipedia.org/wiki/Comparison_of_ARMv7-A_cores (which gives a value of 3.39 DMIPS/MHz for the Qualcomm chip) and https://en.wikipedia.org/wiki/Comparison_of_ARMv8-A_cores (which gives 2.24 DMIPS/MHz for the Cortex chip).

And you’re right, it seems that I got those two CPU efficiency numbers backwards (meaning that the 801 should be much faster). The 425 does indeed look to be a very direct match.
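The correction can be sanity-checked by plugging those Wikipedia figures into the Dhrystone arithmetic (a rough sketch only: aggregate DMIPS ignores memory bandwidth, GPU, process node, and thermals):

```python
# Rough aggregate DMIPS estimate from the per-core DMIPS/MHz figures
# quoted above (crude numbers; real-world performance will differ).

def dmips(cores: int, mhz: int, dmips_per_mhz: float) -> float:
    """Aggregate Dhrystone MIPS for a cluster of identical cores."""
    return cores * mhz * dmips_per_mhz

# Snapdragon 801: 4x Krait 400 @ 2.3 GHz, ~3.39 DMIPS/MHz
sd801 = dmips(4, 2300, 3.39)    # ~31,200 DMIPS

# i.MX 8M Quad: 4x Cortex-A53 @ 1.5 GHz, ~2.24 DMIPS/MHz
imx8m = dmips(4, 1500, 2.24)    # ~13,400 DMIPS

print(f"801 is ~{sd801 / imx8m:.1f}x the i.MX 8M Quad on this metric")
```

By the same arithmetic, a Snapdragon 425 (4x Cortex-A53 at up to 1.4 GHz) lands around 12,500 DMIPS, which is why it reads as the direct match.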

Comparing overall phone behaviour is going to be much more complicated. Correcting for OS overhead is something I don’t know how to even begin on. I don’t know exactly how bloated Android is, and I’ve never even touched an iThing, so I can’t comment on its responsiveness. All I have to offer is a second-hand anecdote that Nokia’s Windows Phones were extremely responsive compared to contemporary Android devices, even when the Winphone had less RAM and a slower CPU.


…or Quake2: https://social.librem.one/web/statuses/102264820591247811


The i.MX 8M Quad’s Antutu 6 3D GPU scores are better than the Mali-450MP’s:

The spec sheet on the i.MX 8M Quad says that it can run a 4K external monitor (4096 x 2160 at 60 frames per second):


Well, thank you all, especially TungstenFilament. I guess this is as far as we can go with this; I don’t have anything further to ask. I have some idea now of what to expect from the 5. I’m thinking that the phone won’t compare with today’s high-end proprietary phones in overall usefulness because of low-powered hardware and not-fully-developed software. So real convergence is still quite a ways away. But with binoculars I can see it coming.

As I understand it, the capacity of the hardware video decoding circuits does not give any indication of the CPU’s other capabilities.
Especially since the free drivers used may not be able to exploit this capacity.

I wouldn’t bet on 4K @ 60 Hz, but it’s still going to be pretty nice, ALL things considered.

The H.265 and VP9 decoders are separate circuits, so they don’t tell you much about the GPU, but Antutu’s GPU and UX benchmarks for the i.MX 8M Quad aren’t that bad, and outputting 4K@60 takes a decent GPU with a fill rate like its 1.6 gigapixel/s.
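For scale, a quick back-of-the-envelope on what 4K@60 actually demands versus that fill rate (a sketch only: fill rate covers all rendering writes, so overlapping windows and compositing effects multiply the real cost):

```python
# Pixel throughput needed for 4K@60 scanout vs. the quoted GPU fill rate.
width, height, fps = 4096, 2160, 60
pixels_per_second = width * height * fps   # 530,841,600 = ~0.53 Gpixel/s

fill_rate = 1.6e9                          # quoted fill rate, pixels/s
headroom = fill_rate / pixels_per_second   # ~3x before any overdraw

print(f"{pixels_per_second / 1e9:.2f} Gpixel/s needed, {headroom:.1f}x headroom")
```

On paper that leaves roughly a 3x margin, though overdraw and thermal limits all eat into it.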

Yes, the phrase “the spec sheet says” has a special meaning in the Linux/BSD world. We are used to not getting what the spec sheet says. :slight_smile:
It looks like the Etnaviv driver supports 4K (at least I found commits for it), but this chat from November 2017 says that the Etnaviv devs weren’t able to get it to work, though that was a long time ago:

17:24 cphealy_: wumpus: you mentioned 4K@60Hz was not working with the i.MX8M. Did you try display port instead of HDMI?
17:25 cphealy_: I ask as with HDMI, you need your display to support HDMI 2.0 for this to work.
17:28 wumpus: cphealy_: the display supports it, I tried with a chromecast ultra (which has HDMI out 4k)
17:28 cphealy_: Same cable?
17:29 cphealy_: The i.MX8M looks to support HDMI 2.0 (HW wise.)
17:29 cphealy_: Good datapoint though.
17:29 wumpus: the chromecast comes with a cable attached, so no I couldn’t try that
17:29 cphealy_: Do you know if the HDMI cable supports the higher frequency of 4k@60Hz?
17:29 wumpus: but I did try with two different cables, also the one that came with the monitor
17:29 cphealy_: OK, probably not that then.
17:30 wumpus: wouldn’t it completely block the signal in that case though? it seems to work sort-of, just synchronization fails sometimes
17:30 wumpus: does the board have displayport output? I haven’t noticed
17:30 cphealy_: I’m not sure if it has DP output.
17:31 cphealy_: If it does, that would be a good thing to try.
17:31 wumpus: I have displayport output attached to my desktop at the moment (its gfx card doesn’t support HDMI 2.0)
17:31 dv_: are there any figures about how many MPixel/s the mx8m can handle?
17:32 wumpus: but I’ll check
17:34 cphealy_: From what I’ve read, 4Kx2K @ 60Hz is supported by the HDMI transmitter.
17:37 wumpus: no displayports on the board
17:37 cphealy_: OK
17:37 wumpus: yes, it definitely should support it
17:37 wumpus: that’s why I thought it’s a kernel problem
17:39 cphealy_: I’ll see what I can learn.
17:44 wumpus: ok
17:54 wumpus: also it’s a very recent monitor model (from this year), so I’d be really surprised if it’s a monitor issue, though ofc there’s the possibility it’s an interaction problem between the specific HDMI transceiver and monitor.
17:56 cphealy_: Yea, that stuff can be touchy. I’ve been battling some display port transmitter issues recently and it’s a pain getting it to work reliably.
18:17 wumpus: for some reason it seems a recurring problem with ARM boards for me
18:19 wumpus: I just remember I had another problem with this board: back when I started testing, another monitor would not work with it. Even though it supports 1920x1080x60 fine (“timing out of range” or such). I had to swap it with another monitor I had, which worked.
18:19 wumpus: so another indication the timing might not be exactly up to spec


Also, even if 4K@60 is doable from a GNOME/Wayland/driver perspective, it still leaves the issue of heat dissipation if it’s going to be pushed to the max.

I’m not at all confident in such a prognosis.


Put your phone in LN2 and you’re good to go :upside_down_face:
I’m curious about how the Librem 5 will handle 4K@60.


@reC don’t overthink this. We’re talking about showing a 4K desktop, not rendering an x264 video at that resolution or a 3D game.

I had a 0.040 GHz PC doing 800x600@60, and a 0.133 GHz PC with 2 MB of video RAM doing 1600x1200 (256 colors) that didn’t even have a fan on the VGA card.
Then a 0.300 GHz PC with 16 MB of video RAM, so it could do the same resolution in true color.
There were also adapters without a cooler that could have two such screens attached.

Twenty years later, why would you have a heat problem merely emitting twice the amount of pixels from a chip that is likely ten times as energy-efficient as what we had back then?

How do you know that the PinePhone uses proprietary blobs?