Discrete 5G Modem Coming to the Market

From what I have read, higher-frequency radiation has less ability to penetrate the human body, so it should be less dangerous. However, you have to balance that against the fact that mmWave 5G is going to require base stations every 250 meters, so a lot more people are going to be living next to a transmitter and are going to receive a lot more RF pollution.

Sadly, normal people are not organized, and the only lobbies in the room were the cellular providers and hardware companies that will benefit from 5G. There was no organized lobby demanding a decade of testing in small areas before we turn a billion people in urban centers into guinea pigs for massive amounts of mmWave RF.

The reality is that we humans will use whatever wireless bandwidth is available, but there are diminishing marginal returns. People talk about being able to stream 4K and 8K video with 5G, but the practical utility of video doesn’t scale as resolution increases. 1080p doesn’t give 2.25 times as much utility as 720p, and 4K doesn’t give 9 times as much utility as 720p, but that is how much more data they require.

Resolution   Vertical   Horizontal   Pixels/frame   Times pixels (vs 720p)
HD                720         1280        921,600        1
Full HD          1080         1920      2,073,600        2.25
WQHD             1440         2560      3,686,400        4
4K               2160         3840      8,294,400        9
8K               4320         7680     33,177,600       36
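For anyone who wants to check the table, the ratios fall straight out of the pixel counts; a quick Python sketch (the names are just labels):

```python
# Sanity check of the table above: pixels per frame and ratio vs 720p.
resolutions = {
    "HD":      (1280, 720),
    "Full HD": (1920, 1080),
    "WQHD":    (2560, 1440),
    "4K":      (3840, 2160),
    "8K":      (7680, 4320),
}

base = 1280 * 720  # 720p pixels per frame

for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px:,} pixels/frame, {px / base:g}x 720p")
```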

I can’t even see the difference between 8-bit 4:2:0 format (which stores 1.5 bytes per pixel) and 10-bit 4:2:2 format (which stores 2.5 bytes per pixel).
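Those bytes-per-pixel figures follow directly from how chroma subsampling works; a sketch that counts raw sample bits (in practice 10-bit samples are often padded to 16-bit words in memory, which raises the stored size):

```python
# Average bytes per pixel for planar YUV formats: one luma sample per pixel
# plus two chroma planes, each at some fraction of the luma resolution.
def bytes_per_pixel(bit_depth: int, chroma_fraction: float) -> float:
    samples_per_pixel = 1 + 2 * chroma_fraction
    return samples_per_pixel * bit_depth / 8

# 4:2:0 keeps quarter-resolution chroma; 4:2:2 keeps half-resolution chroma.
print(bytes_per_pixel(8, 0.25))  # 8-bit 4:2:0  -> 1.5
print(bytes_per_pixel(10, 0.5))  # 10-bit 4:2:2 -> 2.5
```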

I often see 5G being promoted because it will allow all sorts of marvelous new things in the future, like allowing autonomous cars to communicate, but I’m pretty sure that all those things could happen just the same with LTE-A. Just like with processors, we find ways to use all the extra processing power, bandwidth and memory available, but I honestly can’t say that I’m more productive with a 10th gen Core i7 processor than with a first gen Core i7 from 2008, because the software of that era appeared to be just as fast, since it wasted fewer processing cycles and consumed less memory.


I find them below trees sometimes too


Not memory. I bought a laptop with two SO-DIMM slots, one stuffed with 16 GB, and was planning to extend to 32 GB once I hit the limit. I’ve never hit that. I don’t deny that some usage patterns will eat 128 GB (I have that at work and I eat it all, mostly due to virtualisation), but on my home [Linux] laptop I run LXC and Docker containers and games and browsers and compilers… and never hit 16 GB (14 GB was the max I hit).

With the understanding that these are uncompressed numbers and that the true data rate required depends on the compression algorithm, its parameters and its efficacy with the particular data, plus container and protocol overheads.
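To put a number on how far the uncompressed figures are from what actually gets transmitted, here is the raw data rate of 4K at 60 fps using the 1.5 bytes/pixel figure from above; the compression factor mentioned in the comment is a rough rule of thumb, not a measured value:

```python
# Raw (uncompressed) data rate of 4K video at 60 fps, 8-bit 4:2:0.
pixels_per_frame = 3840 * 2160      # 8,294,400
bytes_per_pixel = 1.5               # 8-bit 4:2:0
fps = 60

rate_bits_per_s = pixels_per_frame * bytes_per_pixel * fps * 8
print(f"{rate_bits_per_s / 1e9:.2f} Gbit/s uncompressed")  # ~5.97 Gbit/s
# A modern codec (e.g. HEVC) typically brings this down by roughly two
# orders of magnitude, depending heavily on encoder, settings and content.
```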

The fact is that there have been no long-term studies of the effects of RF on human beings, and there is no accountability placed on any of the providers to prove that their technology is safe. I’m all for technology, but my health is more important than any of this stuff. It needs to be studied, for sure.

Diminishing returns, yes. It’s become specmanship. The average person isn’t really going to be able to tell the difference between a 4K video signal and an 8K one. I’m not sure I would either.

particularly not on a screen that is below FHD.

Don’t worry! They will figure out how to add some assisted reality on top of that video, or some new whiz-bang surveillance-capitalism feature that is sure to suck up all the extra 5G bandwidth. We will be totally inundated by marketing to convince us that we simply can’t live without 6G, whenever they need another wave of planned obsolescence to generate more sales in the future.


…until they get brain cancer or go senile.

Cell phones didn’t hit the mainstream market until around 1995, and in low quantities to start, in the form of the brick phone. It was a little bigger than a brick, was shaped like a brick, and was the first phone that you held up to your head to use.

In about 1988, I was an early adopter of cell phones. I had a unit about the size of a CB radio mounted in my trunk, with a handset by the driver’s seat. The eighteen-inch antenna was mounted on the vehicle’s trunk. I think I paid close to fifty cents per minute in addition to a high monthly fee. The smaller feature phones didn’t hit the market until after 1990 or later; that is when the wider adoption of cell phones started taking off.

So the majority of all cell phone users ever in human history have less than thirty years of routine cell phone use (against your head). We still don’t have a sizable fifty-year sample of cell phone users yet. And those older phones weren’t in the GHz frequency range; any sample of phone users in the GHz range and up probably has less than twenty years of cell phone use at those frequencies. So it’ll probably be another thirty years or more from now before high-GHz-range 5G has become mainstream and we have a large sample to see what twenty years of use of a 5G radio transmitting directly against your head does to you. By then, there could be a resulting brain cancer epidemic. We’re still not completely out of the woods yet on the long-term health effects of 4G phones on the human body.

5G as a whole is MOSTLY nice IF you are going to be mobile and outdoors NEAR 5G access-points which ALSO means you are in range of “mmWave” and thus if anybody from the ‘top’ COULD ‘snipe’ you (precisely and effectively) … aren’t you glad you’re in ‘lockdown’ ? :upside_down_face:
Merry Christmas btw !

I meant it more in the sense that a 5G network operating on GSM frequencies is nice (and that will happen, they’re retiring 2G networks to free up the spectrum for newer technology), as it offers more bandwidth and lower latency for the same communication range.

I’ll be just as close to this particular access point as I would be to the 2G point it replaced - several kilometres away and far out of range of any mmWave signals.

I doubt it’s for actual mmWave 5G, but rather the “Wideband” 5G that’s all the buzz today. Otherwise you are right, and it need not be concrete walls; regular walls, or even an extra 50 ft, will cause it to fail over to lower frequencies. That is, if the carrier lets it. I am having issues with AT&T trying to force me to use 5G.

For the present moment I can select to only use 2G with my carrier on my BlackBerry Q10, and to NOT automatically switch to other providers/cell networks when the signal is bad … at least that’s what the GUI of the black box says it’s doing :upside_down_face:

@nicole.faerber or anyone, are there any promising discrete M.2 5G chips on manufacturers’ roadmaps since this discussion three years ago that might be usable for Purism’s purposes?

Ideally, of course, it would match the L5 antenna tuning and could be swapped in; however, I’m sure there are other things that would need to be done, including validation.

I think given network shutdowns of 2G/3G in various countries, the ability to update the modem on top of the battery replacement would be a big seller for those interested in committing to the Purism ecosystem.

If the 5G technology is used at higher frequencies, then the wavelength should be smaller. Smaller wavelengths typically equate to smaller antennas. A smaller antenna can be encompassed inside of a bigger antenna (potentially, a 5G antenna inside of a larger 4G antenna) by making the antenna resonant, for the most part, at the desired frequencies. That is what amateur radio multi-band antennas do. Rather than having an antenna hundreds of feet long to match the wavelength they are using, they electrically tune the radio circuit to match the antenna length. The goal is to get an impedance match between the radio transmitter circuitry and the antenna impedance at the given frequency. So we have to wonder why 5G antennas need to be bigger antennas. There must be a built-in inefficiency (losses in gain) in 5G systems that isn’t being overcome through other means. Bigger antennas and higher altitudes can solve most RF communication inefficiency problems. But what is the root cause in the case of 5G?
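The wavelength argument is easy to put numbers on; a small sketch using representative band frequencies (the specific frequencies below are illustrative examples, not any particular carrier’s allocation):

```python
# Free-space wavelength and quarter-wave element length at a few frequencies.
C = 299_792_458  # speed of light, m/s

def wavelength_m(freq_hz: float) -> float:
    return C / freq_hz

for label, f in [("700 MHz (low band)", 700e6),
                 ("3.5 GHz (mid band)", 3.5e9),
                 ("28 GHz (mmWave)",    28e9)]:
    lam = wavelength_m(f)
    print(f"{label}: wavelength {lam * 100:.2f} cm, "
          f"quarter-wave {lam / 4 * 100:.2f} cm")
```

At 28 GHz the wavelength is only about a centimeter, which is why mmWave antennas are small enough to be built as arrays of many elements rather than as single large radiators.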

If I remember correctly, given:

L = inductance in henrys
C = capacitance in farads
f = frequency in hertz
Reactance is in ohms

Higher frequency = smaller wavelength and thus smaller antennas.

Inductive reactance: X_L = 2πfL

Capacitive reactance: X_C = 1 / (2πfC)

So you just add reactance (capacitive or inductive, i.e. capacitors or inductors, as needed to increase or decrease the impedance) between the radio circuit and the antenna, to get maximum power output/reception efficiency. The big coils inside old AM radios are just one obvious example. So what really hinders 5G antenna requirements now? It’s got to be losses in gain, from what I know. If so, the answer is obvious: build more efficient radios. This should be an electrical design issue, not a situation where society learns to deal with bigger antennas.
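A minimal sketch of those two reactance formulas in Python; the 100 nH, 10 pF and 100 MHz values are made-up illustrative numbers:

```python
import math

# Reactance of an inductor and a capacitor at frequency f, per the
# formulas above: X_L = 2*pi*f*L and X_C = 1 / (2*pi*f*C).
def inductive_reactance(f_hz: float, l_henry: float) -> float:
    return 2 * math.pi * f_hz * l_henry

def capacitive_reactance(f_hz: float, c_farad: float) -> float:
    return 1 / (2 * math.pi * f_hz * c_farad)

f = 100e6  # 100 MHz
print(f"XL of 100 nH: {inductive_reactance(f, 100e-9):.1f} ohms")   # ~62.8
print(f"XC of 10 pF:  {capacitive_reactance(f, 10e-12):.1f} ohms")  # ~159.2
```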

Imagine if the older AM radios had come with several hundred feet of wire and instructions to climb the highest tree you could find and attach the wire to the top. My guess is that silicon fabs haven’t yet caught up with an ability to implement the formulas given above on silicon wafers. I doubt they do well at fabricating inductors on silicon wafers. I could be wrong on that, though.
