NXP will manufacture its i.MX 8M processors till Jan. 2033

I just checked NXP’s Product Longevity website. It used to say that the i.MX 8M Quad would be part of NXP’s Product Longevity program till Jan. 2028, but it now says that all the i.MX 8M processors will be part of the program till Jan. 2033, meaning that NXP guarantees that it will keep selling the chips till then.

NXP describes its Product Longevity program in these terms:

The Product Longevity program ensures a stable supply of products for your embedded designs.

Participating products are available for a minimum of 10 years from product launch (15 years from product launch for many products developed for the automotive, telecom and medical segments), and are supported by standard end-of-life notification policies.

When NXP talks about “end-of-life notification policies,” it is talking about the notification it sends out to alert its customers that it will no longer provide software updates. In other words, NXP is guaranteeing that it will keep providing firmware and driver updates until at least 2033. The questions I have are: (1) How long after 2033 will NXP continue to provide proprietary firmware updates? and (2) Will NXP continue to contribute to the mainline Linux driver?

It is easy to verify that NXP does provide firmware and driver updates for its chips while they are part of the Product Longevity program. For example, NXP released the i.MX 6Quad in Nov. 2012, and it got new BSP (board support package, i.e. firmware), driver and middleware releases on 2021-03-29, 2020-12-28, 2020-11-10, 2020-11-09, 2020-11-05, etc.

I often lament the fact that the i.MX 8M Quad doesn’t compete with recent Snapdragon, Exynos and MediaTek Helio chips in terms of performance, but those chips are generally only supported for 3 - 3.5 years by their manufacturers. We may not get the fastest processor with the Librem 5, but we are getting a chip that will probably be supported for longer than any other mobile processor on the market.


Moore’s law is nearing its own end of life, so a 2033 end of life might not be as significant then as it may seem to some people now. And the announcement is a guarantee to continue production until that date, not an announcement that NXP intends to stop production on that date. Put another way, this is good news: the manufacturer expects strong demand for this device to continue through 2033. Ten years ago this probably wouldn’t have been possible. We’re nearing the end of exponential growth in the shrinkage of IC die sizes.

There is still room for integrated circuits to shrink. But we’re approaching the end of that shrinkage process, and growth is slowing. For example, the laws of physics dictate minimum trace widths. If a trace on the die were compared to a narrow tube, then it can be said that even in single file, the flow of electrons requires a minimum size of tube to fit through. We’re very close to being able to make these tubes (a metaphor, not literal tubes, as traces are solid and flat) that small now. Beyond that, we can’t go any smaller. Although this over-simplification is not precise (some engineers might even cringe), the point is that we can see an approaching end to the existing gravy train of doubling the number of transistors, while halving their size, every two years.

For this reason, we can expect more features and customizations of existing product types in coming years, although we won’t see much smaller dice than we have in today’s most highly integrated chips. The usable lifetime can increase. Improvements to existing products will become more common. Instead of going smaller, smaller, smaller, NXP might find a better niche in catering to existing customers with existing products in new ways.

True, but Intel x86 CPUs are somewhat ahead of where ARM CPUs are (at least the i.MX one that the Librem 5 uses): for example, Intel 10th-gen CPUs use a 10 nm process, whereas the current i.MX 8 CPU uses a 28 nm process. That means that if a physics wall is hit, x86 is going to hit it sooner than ARM.

It looks like Samsung and TSMC are willing to make the investment necessary to keep Moore’s Law going at least down to 3 nm. From there, the laws of physics are going to make it much harder to keep shrinking node sizes, but 3 nm chips are going to have incredible processing power in comparison to 28 nm planar chips, which is why I find it shocking that NXP is willing to guarantee production until 2033.

My guess is that NXP figures that it will cost very little to keep producing at these older nodes, because the foundries have these old fabs just sitting around and the investment has already been sunk, so they might as well keep running them. Companies are willing to pay a lot for older chips in order to avoid the cost of redesigning their devices. Before the current chip shortages started, the i.MX 6Quad cost $54, whereas the i.MX 8M Quad cost $35.50, so NXP can charge a premium for its older chips.

I do wonder how many people will still want to use the Librem 5 in 2033.

PS: Right now if you search for the i.MX 8M Quad, it has gotten a lot more expensive:

  • Mouser: 23 in stock, $58.91 for 1, $57.24 in lot of 10, 27 weeks lead time
  • Arrow: None in stock, $41.72 in lot of 90, 26 weeks lead time
  • Digi-Key: 26 in stock, $78.70 for 1, 26 weeks lead time

Purism wasn’t kidding when it said that it would have trouble getting the i.MX 8M Quad.


Why wonder about how many people use it? What is so significant about the value of knowing such information?


The more people who are still using the Librem 5 in 2033, the more likely it is that there will be a community to keep updating the software. Maybe Purism will still exist in 2033 and still be providing software updates, but the tech industry is not known for long-term stability, so we are safer with an active community of users to maintain the software, similar to maemo.org still supporting the N900, which was released in Nov. 2009.

In my opinion, planned obsolescence and Surveillance Capitalism are the two biggest problems with the modern tech industry. It was the fact that the Librem 5 is the first phone to ever promise lifetime software updates that convinced me to preorder. Having looked into the design of the phone, I am convinced that Purism truly is making a phone that is designed to last 10+ years both on the hardware and the software side. (The big question at this point is whether Purism will sell replacement parts or release the Gerber files so other companies can make replacement parts.)

The first step to changing the tech industry is to have working examples that show that another business model is possible that isn’t based on planned obsolescence. My long term hope is that the Librem 5 will help jump start a new market for phones that are designed to last for a decade and are based on privacy.

The tradeoff with selecting a chip with long-term support is that you don’t get good performance. With the RK3588, however, I think that it would be possible to make a phone that has good enough performance that people wouldn’t mind using it for a decade.

If the Librem 5 had a 7nm processor, I would agree. The issue is that the 28nm planar i.MX 8M Quad is roughly 8 years behind Apple’s A14 (5nm EUV FinFET, 11.8B transistors) in terms of its tech, and the Quad is going to feel positively ancient in comparison to 3nm GAA-FET processors in a couple years.

I have been reading articles predicting the end of Moore’s Law for a decade, but I don’t see much evidence for it so far:


Here is the compound annual growth rate (CAGR) in the number of transistors per mm2 over 5 year periods:

Period CAGR
1971-75 40.34%
1975-80 48.28%
1980-85 11.34%
1985-90 24.45%
1990-95 44.65%
1995-2000 57.75%
2000-05 22.92%
2005-10 28.89%
2010-15 33.73%
2015-20 45.13%
1971-2020 31.67%

Between 1971 and 2020, the number of transistors per mm2 increased from 188 to 134,453,782, which is a CAGR of 31.67%, yet the CAGR between 2015 and 2020 was higher than the historical average, at 45.13%. Intel, GlobalFoundries and UMC haven’t been able to maintain Moore’s Law, but TSMC and Samsung have.
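The CAGR figures above are easy to reproduce from the density endpoints. Here is a minimal sketch in Python, using the 1971 and 2020 transistors-per-mm2 values quoted above (the doubling-time conversion is just a logarithm):

```python
import math

def cagr(start: float, end: float, years: float) -> float:
    """Compound annual growth rate as a fraction: (end/start)^(1/years) - 1."""
    return (end / start) ** (1 / years) - 1

# Transistor density (transistors per mm^2) endpoints quoted above
density_1971 = 188
density_2020 = 134_453_782

overall = cagr(density_1971, density_2020, 2020 - 1971)
print(f"1971-2020 CAGR: {overall:.2%}")  # ≈ 31.67%

# At the 2015-20 rate of 45.13%/year, density doubles roughly every
# two years, which is the classic Moore's Law cadence:
doubling = math.log(2) / math.log(1 + 0.4513)
print(f"Doubling time at 45.13%/yr: {doubling:.1f} years")  # ≈ 1.9 years
```

So even the recent 2015-20 rate still corresponds to a roughly two-year doubling time, which is why I say the data doesn’t show Moore’s Law slowing yet.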

The real issue is that a 3 nm fab costs $20 billion to build, and only 3 companies in the world can afford to spend that kind of money. TSMC has enough volume that it can probably keep the cost per transistor dropping like it did with previous process nodes, but I’m not sure that Intel and Samsung can, and they will probably lose billions trying to compete at 3 nm and 2.5 nm. I do think that Moore’s Law is going to have hiccups when we get smaller than 3 nm with tech like nanowire gate-all-around FETs, high-numerical-aperture EUV, and channel materials like germanium or III-V compounds.


But this is part of the issue: the i.MX 8M Quad goes out of support in 12 years, and the implication here is that Purism needs something new out in that time as well (otherwise they will hit a wall with processor shortages and suddenly have no product).

Then the question is how much time Purism can dedicate to supporting the Librem 5 (Mk1) when a Mk2 on a different chipset is out. Sure, a lot of things may not be chip-specific, but that’s really the bet being taken here: that there will be no critical hardware bugs that change something, or that Purism will have the resources…

The chips are designed for automotive use; die size is not important when you’re making devices measured in grams to be put into cars. Halve the size and weight? So what! It makes no appreciable difference to the auto manufacturers…
They are principally sold for automotive use, running infotainment systems that have an essentially never-changing software footprint,
or engine management systems (literally never-changing software!). Stupid as it sounds, Windows 3.1 works as well today on original hardware as it ever did, but if I tried to install a modern OS onto my old 486, it’s not going to play nice! The issue is what we expect to do with devices. (For example, HTML5 massively increased the compute resources needed to view webpages!)
(For that matter, if I turn on my old Nokia, I’m sure it’s just as good at doing what it did 20 years ago… it’s just that what it did seems very limited today!)

The chip might be supported for 12 years.
The real question is if it will still be fit for purpose in 10 years!


Whether or not the chip will be fit for purpose in ten years is a valid question. But I think that many of today’s chips will still be fit for purpose in ten years.

At one time, eight-bit chips were the most advanced technology available in the market. Surprisingly to some people, eight-bit chips still lead the market in quantity sold. Those eight-bit chips are in everything. You don’t need a sixty-four bit processor to manage the operation of your refrigerator or your laundry washer and dryer or the lights in your children’s shoes, or in your yard’s sprinkler watering system. And if you quadruple the speed at which the light to your refrigerator comes on when you open the door, no one will even notice.

In the same way that those eight-bit chips are still being sold in the billions, I think that today’s sixty-four-bit chips will remain strong sellers for a long time to come. Manufacturers will eventually add more peripherals and halve the size, but the growth curve will still level out as that happens.

Unless there is a new revolutionary killer app that requires huge processing power to run, my guess is that the growth in mobile phone functionality won’t change greatly over the next ten years, any more than the eight-bit chips have changed significantly. Now, the eight-bit chips have indeed changed. They run on less power. They have many more peripherals, and even core-independent peripherals. But at some point, you have to move up to a sixteen- or thirty-two-bit processor when you reach a certain level of complexity. Right now, however, we don’t have any 128-bit demand calling to us, at least not on a consumer level. So we can expect more bells and whistles on our phones. But the required underlying processing power is almost at the end of its need for exponential growth.

When you add TCP/IP or Bluetooth to your eight-bit application, usually the programmers will tell you that they need sixteen or thirty-two bits for that. Does anyone know a case where the app engineers are saying, “If we add that feature to this application, we’re going to need 128 bits”? If not, everything ahead is just some shrinkage and making the existing technology more efficient, but not more efficient and smaller in a revolutionary way, like going from an eight-bit chip to a sixty-four-bit chip, or even from thirty-two bits to sixty-four bits.

NXP guarantees that it will manufacture i.MX 8M processors till at least Jan. 2033. It may decide to manufacture the chips for longer, and even when it stops manufacturing, it will probably keep providing software updates for the chips for a few more years. I think that 15 more years of at least security updates is likely considering the industries that NXP targets. NXP charges more for its chips than similar chips from Rockchip, Amlogic and Allwinner due to its long-term support, so it has a reputation to maintain.

The Librem 5 was purposely designed so that it will have low maintenance costs in the future, so that it won’t cost Purism much to keep providing software updates. See:

Aside from updates to the proprietary firmware, it is likely that the community can maintain support for the Librem 5 even if Purism goes out of business or decides to drop support.


But the Librem 5 isn’t a fridge (and arguably, you don’t even need 8-bit processors in your fridge or washing machine; they are just most cost-effective). The L5 is a computer (with a radio module).

So the question is whether the functional requirements will increase in future (I’d agree it’s impossible to say).
The functional requirements of a fridge/radio/engine management system do not increase,
but the functional requirements of a device principally being sold as a convergence device probably will.

I guess it is one of those ymmv, depends how you want to use it things.

Those Purism security updates will be mighty useful when the device is confined to a drawer because it cannot perform well enough to be used with modern software…
Those NXP security updates will be mighty useful when they are Purism backlog items for integration into the OS of a device that the company has no commercial interest in supporting…

Don’t misunderstand me, support for longer than the useful life IS great…
but if you’re still on your L5 in 12 years, you’ll probably be the only one, and it won’t make sense for Purism to take time from supporting the Mk2 or Mk3 to port to your multiple-generations-old device…

Here’s the awesome thing about the L5: once we reach full mainline kernel, you won’t need to rely on Purism to integrate the updates. Anyone with access to the kernel mailing list will be able to do that.


It seems like Purism has fifteen years to wrap up the remaining work on the Librem 5 and come up with their next-generation phone. That gives them all the time in the world compared to the span between the start of their Librem 5 campaign and now. That should be an easier target for Purism to hit than what they’re actually doing right now. In addition, competition should drive improvements in phone SoCs going forward, the same way it has driven improvements in the eight-bit MCU market over the past twenty years. It’s reasonable to believe that we should see a purpose-made SoC that is fully FSF-endorsed and RYF-certified as a result.

If I am happy with my Librem 5 (assuming I actually receive it), I will place a pre-order for the next-generation Librem phone, and won’t mind waiting for it. I might even buy Librem 5 phones for my family members while waiting.


I do see a business case for Purism to continue providing software updates. Let’s assume that Purism stays in business and is still selling phones 12 years from now. First of all, there will be a market for parts and there might also be long-term support contracts for businesses that need the special security/privacy and long-term support. Also, by continuing to support the Librem 5 v1, Purism will be able to make long-term support a major selling point for its latest phone models. Basically, it can say: “buy the Librem 5 v5, because you can see that the Librem 5 v1 is still supported.”

It won’t require much work for Purism to keep providing newer Linux kernels once all the hardware in the Librem 5 has open source drivers in mainline Linux. The latest kernels provided by Debian should “just work.” Likewise, it should require very little work for Purism to use the latest versions of most of the software, because Purism has worked to get its changes upstreamed. In most cases, the latest versions of the GNOME applications and libraries should just work on the Librem 5 without modification to the code.

The big question is whether Purism will still be paying developers to work on Phosh. If so, the cost for Phosh to still support the Librem 5 v1 will be marginal. In the same way that Matt DeVillier still supports the Librem 13v1, because it is part of his standard build, supporting the Librem 5 v1 will be a marginal cost that is part of supporting the Librem 5 v2, v3, v4, etc.

However, if Purism decides that it is no longer in its business interest to keep developing Phosh, then it becomes much more likely that Purism will stop providing software updates for the Librem 5 v1. In that case, we fall back on community development efforts. The code that Purism currently maintains alone is not much. The GNOME community will take over development of libhandy, Calls and Chatty (or develop successors), because those programs are becoming part of the standard GNOME desktop. The remaining code (phoc, phosh, feedbackd, haegtesse, wys, squeekboard, librem5-base and gtherm) isn’t that much and can be maintained by volunteers if there are enough users.

At this point, Phosh is packaged in all the major parent distros (Debian>Ubuntu>Mint, Arch>Manjaro, Fedora and openSUSE) except Gentoo, and it is rapidly becoming the way for the GNOME ecosystem to go mobile, so I think it highly likely that volunteers will maintain the code even if Purism drops its development. Only if Purism shuts out outside developers (which it hasn’t done so far with Mobian and postmarketOS developers), do I foresee a rival GTK-based mobile environment arising, so I think it likely that everyone in the GTK/GNOME world will continue to contribute to Phosh, and it will become the de facto mobile version of GNOME.

In other words, I think that the Librem 5 will still be getting software updates and still be able to run the standard GNOME applications in 12 years’ time. Having installed modern Linux distros on laptops that are 12 years old, I was surprised how well they run the latest versions of Firefox, LibreOffice, etc., so I think it likely that the Librem 5 will be able to run most of the GTK/GNOME applications 12 years from now.

Obviously, the Librem 5 won’t be adequate as a convergent desktop PC 12 years from now, but a lot of people just want a simple phone. For those people, it is more a question of whether there will be WiFi/BT 7 and 6G modems on M.2 cards that they can use, and whether they can live with a phone that is so heavy, bulky and underpowered. Right now, 5G modems require too much energy to be cooled by the heat spreader on an M.2 card, but I expect that will eventually be solved, and most people won’t need mmWave 5G and its new antennas. I think it is likely that we will have suitable M.2 cards for the Librem 5 12 years from now, so we won’t have a situation like the N900, where people are forced to abandon it because the 2G networks are being shut down.


The N900 supports 3G.
We need to support Purism to keep it safe and alive.

While that’s true, here’s why the internet is a bad thing :slight_smile:: Most computers do not work in isolation. This causes several problems.

  • new protocols and new data formats - if they become ubiquitous and older protocols and data formats fall out of favour then your old Windows 3.1 computer is in practice not usable - unless Microsoft (or potentially someone else) provides the support for a long obsolete platform
  • security problems - if longstanding flaws are found in the implementation of some feature that interacts with the internet then your old Windows 3.1 computer is in practice not safe to use - unless Microsoft (or potentially someone else) provides the bugfix for a long obsolete platform

A variant of the first item is that the provider of a particular online service intentionally changes the protocol with the goal of dropping support of long obsolete platforms.

(Obviously there is a freedom in the open source world to keep a long obsolete platform alive that does not exist in the closed source world, provided that someone chooses to do it.)

So the internet exacerbates built-in obsolescence.

I think you will find that over the coming decade, cars are more subject to these problems than they have been in the past. You may think of the EMS as “never changing” but these days it is connected via the ‘network’ during service in order to download performance data etc. Some cars will be online permanently for a range of ‘helpful’ (not) functions.

That’s exactly my point. Windows 3.1 works as well as it ever did, but today it’s practically useless (barring some esoteric edge cases).

BUT… even if there were somehow security patches, and a modern version of Office,
the thing would still be useless…

To compare this directly to the L5, we also have to point out that the 486 in question only had 8x 1 MB SIMMs!

It’s not going to get very far opening a Word document, let alone trying to play back a 4K video.
(Does that help explain what I meant when I said “functional requirements will increase”?)
My requirements for functionality have increased beyond what the hardware can support, so it really wouldn’t matter if the OS had a load of updates. It wouldn’t matter if people were fixing hardware implementation issues for the 80486; the chip just isn’t good enough today for my “new” uses.

What I was saying is that the basic functions of an engine (measuring throttle position, comparing to the wish map, detecting engine torque, determining injector pulse durations, etc.) will not get very much more complex during the lifetime of a vehicle. (The functional requirements of the ECU do not increase!)

It used to be that updates to EMS systems tended to be very small. Most modern cars have separate engine control, body control and entertainment systems, so fixing a rollover weakness in the locks, for example, didn’t affect the ECU; that’s a separate system.

But yes, on modern cars, especially cars with “internet features,” there are significantly more updates that will be applied (including updates for functional changes).


If we want to compare over a period of 15 years from official launch, we should replace Windows 3.1 (launched in 1992, extended support ended in 2001) with Windows Vista (launched in 2006, extended support ended in 2017).
If we want to consider processors, we should replace the Intel 80486 (launched in 1989, discontinued in 2007) with the Intel Core 2 (launched in 2006, discontinued in 2012).
It is still possible to use Vista on an Intel Core 2, although not recommended, and to use GNU/Linux on an Intel Core 2; back then the available release was Debian 3.1 Sarge, with kernel 2.4.27 or 2.6.8.
So for me, using the L5 in 2033 with software updates will be possible.

Just one question…
Are you still using a computer from 2006? (If not, why not?)

I’m still using a desktop from 2007, although only as a file server; my plan is to replace it as soon as possible. I work on a laptop from 2013 whose display is gone. I’m waiting for the L14…