Librem 5 camera software

That thing with “AI” was not just a joke. I know it is nothing a little company can handle on its own, but AI can decide on details we don’t recognize as humans. It would be nice to see an open source solution some day that we could use for our app. It should also be cheap in terms of computing power (not for the training, but for the end product).

But yeah, I remember that issue.

1 Like

The presets were just some standardized illuminants that were added for manual use back when Millipixels had no auto white balance at all. Auto white balance has since been added, but in a way that simply reused the existing white balance code, inheriting the presets from the manual slider. It doesn’t make sense for it to work that way, but someone will have to write the code to fix it first :wink:

My very capable Nokia camera phone from a decade ago (pre-MS) has a handy list of selectable presets that offer a quick way to jump from “overcast” to “fluorescent lighting”. It also has the free scale. This is a UI problem that they (IMHO wisely) chose to solve by giving the user three levels of UI/automation to choose from: full auto (select only the flash mode), preset values (“semi-auto”) and full manual (or “more manual”, as even then they did not open it up fully). And their sliders were more intuitive than what is currently in use (better size/position, and in later versions there was a nice preview: the sliders were pieces of the image, so the changes could be seen in those areas and compared to a piece of the original).

The point being, arguing as if there should be only one way to take pics for everyone is silly and a bad premise.

It’s good to have choices, so I think that having the option of presets may be useful.

Now, about the Wikipedia article: I wonder how precise it is. If I illuminate a red apple (the example from the article) with candlelight, I will see red, because candlelight has a rich spectrum that includes red. So red will be reflected back to my eyes. But if I light the same apple with a light that contains no red, say a low-quality fluorescent lamp or a bare LED, there is no way my eyes or brain can do anything to correct the issue and still perceive the apple as red. It will not be red, period.

So for the article to talk about “correcting” sounds meaningless. If a person has never seen an apple in their life and you light it with a bare LED (which emits only blue), the person will not see it as red. No way.

So what is this all about?

Do camera sensors see light frequencies/wavelengths, or only differences? If they see differences, then I understand that they need a fixed point in color space to decide what the colors are.

But if they see frequencies or wavelengths, then all this makes no sense. 650 nm will be red, 450 nm blue. In-between values trigger a mixed response in our cone cells, resulting in intermediate colors.

Just disable automatic white balance in Millipixels, set a single preset and don’t change it, and see for yourself what the difference is between outdoor light, indoor light, candle light, flashlight etc.

3 Likes

OK, I will try that, but in the meantime I have understood what this is all about. A good article (with some problems) is this one:
https://web.archive.org/web/20080203165737/http://www.photoxels.com/tutorial_white-balance.html

White balance looks to me like a trick the photographer plays to be clever. In the article above we see a picture of a fan next to a camera (the very first picture). Under incandescent (tungsten) bulbs, auto white balance (AWB) will try to shift the color of the fan from yellowish to white, assuming you want the light to look as if it came from a fluorescent bulb.

But my eyes will see the fan as yellowish, not white. There is no way I will see it as white unless I change the bulb to a fluorescent one. So the first question is why people want to alter the lighting conditions and use conditions that do not exist in the scene. You take a photo under candlelight and you get a picture as if you took it in the emergency room of a hospital. Why would you want to do this? I would imagine that people want to take the photo as they see it, not as they would see it under other conditions.

The article above refers to this as a problem of the color temperature of the ambient light. This is wrong. Temperature is not the reason here. The reason is the richness of the colors emitted by the light source, measured by the CRI (color rendering index). All modern LED bulbs report their CRI on the package. The incandescent bulb has a CRI of 100 (giving true colors), the same as the Sun, also at CRI 100. The incandescent bulb is at 2700 Kelvin and the Sun at 6500 Kelvin.

A candle or an old pressure lantern works at about 2400 Kelvin, and both have an almost perfect CRI.

LEDs are bad at this. Their CRI is typically in the 80 to 85 range. A true chromatic match is achieved at a CRI above 95, but those bulbs are very expensive. I remember Philips had such an LED lamp at about 98, costing about $50. Who would buy that…? Cheap LEDs are below 70, and those are suitable only for the stairwell of a multi-story building.

In Europe, CRI is measured on 14 standard colors, including red, which is difficult for LEDs. And they measure it against incandescent light, which is taken as the ideal. In the US, to make it easier for manufacturers (and worse for people), they measure it against 8 colors, excluding the red shades. They refer to the European standard as CRI(e) (e for extended) or CRI(Re) (Re for Red extended).
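For anyone curious how those headline numbers come about: the general CRI is just the plain average of the per-sample special indices R_i, so leaving the saturated-red sample out of the average can hide a poor red score. A minimal sketch with invented (hypothetical) LED scores, not measurements of any real bulb:

```python
# Hedged sketch: the general CRI is the plain average of per-sample
# special indices R_i. The scores below are made-up illustration values.
def average_cri(special_indices):
    return sum(special_indices) / len(special_indices)

# Hypothetical LED: decent on the 8 pastel samples, poor on saturated red.
r1_to_r8 = [85, 88, 90, 84, 86, 87, 89, 83]
r9_red = 20

print(average_cri(r1_to_r8))             # average over 8 samples: 86.5
print(average_cri(r1_to_r8 + [r9_red]))  # including red drags the average down
```

This is why two bulbs with the same printed CRI can still render reds very differently.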

So, to get back to photographs, it seems to me that AWB is an unnatural choice.
Like in the old days of film, when photographers would attach colored filters to their cameras to take photos under altered lighting conditions.

Personally I find this a bad choice or do not understand it. I will definitely disable AWB if I am given the choice.

image

Which apple would you grab?

2 Likes

(I know you didn’t ask about iPhone but) The standard iPhone camera app allows the user to adjust “tone” and “warmth” and one other thing that appears to be related.

The problem for me is that it is far from obvious what you are actually adjusting (and of course if there are too many possible adjustments then you are searching in a multi-dimensional space and may never get the right result).

But I guess if you are standing in front of the object being photographed, you can see whether the adjustment that you are making on the phone is making the image on the screen look more or less like your perception of the actual object.

Nope! The fan will look white to you, and the photo will look unnaturally yellowish. That’s the point. Right now I’m using a laptop with a heavily red-shifted screen, and after a while its white looks like white to me, not like orange - at least as long as I don’t toggle redshift back and see the difference :wink:

CRI is another matter, and of course chromatic adaptation falls apart under extremes (Wikipedia mentions the Purkinje effect, for example), but the human brain is really surprisingly plastic in its interpretation of colors - see https://en.wikipedia.org/wiki/Color_constancy for more examples.

BTW, this is an example photo from the L5 camera without white balancing:

You always have to choose the lighting to adapt the photo to. AWB simply tries to choose it for you, with better or worse results.
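For the curious, one of the simplest ways an AWB algorithm can “choose for you” is the classic gray-world assumption: the scene’s average color is assumed to be neutral gray, and each channel is scaled until the channel means match. A minimal sketch of that idea (not the actual Millipixels code):

```python
import numpy as np

def gray_world_awb(image):
    """Gray-world AWB: assume the scene averages to neutral gray and
    scale each channel so that all channel means become equal."""
    means = image.reshape(-1, 3).mean(axis=0)  # mean R, G, B
    gains = means.mean() / means               # per-channel correction gains
    return np.clip(image * gains, 0.0, 1.0), gains

# Synthetic "scene" under warm light: red boosted, blue attenuated.
rng = np.random.default_rng(0)
scene = rng.uniform(0.2, 0.8, size=(64, 64, 3))
warm = np.clip(scene * np.array([1.4, 1.0, 0.7]), 0.0, 1.0)

balanced, gains = gray_world_awb(warm)
print(gains)  # red gain below 1 and blue gain above 1, undoing the warm cast
```

It fails in exactly the situation described next: when the scene genuinely isn’t gray on average (like a picture dominated by a blue tint), the assumption pushes the colors the wrong way.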

And here’s another photo that used an automatically matched preset in Millipixels. It didn’t end up as lucky as yours, since no preset matched the actual lighting, and the calculation easily gets thrown off by the unusual blue tint of almost the entire picture:

Later I corrected it manually to look more like the real thing:

4 Likes

(It will taste better :wink::wink:)

Yeah, right! :rofl: I should have written: which one would a kid choose? :wink:

1 Like

Let me see where the catch is. I do not doubt you; I am trying to see where my mistake is.

The incandescent lamp emits a full color spectrum, doesn’t it? Is this deniable? I assume it is not, otherwise it would not be rated CRI 100.

The fan is white out in the sun, and under incandescent light it reflects back the whole color palette it receives from the lamp. So I see it as white. Is there any mistake so far?

If yes what is it? If not we continue with the camera.

The camera sensor, like my eyes, receives the whole spectrum reflected back by the white fan. Where does the yellow color come from?

There must be a simple mistake I am making in the above. I think you are saying that my brain gets into the game. But if I receive back the whole spectrum, and all my cone cells are triggered, why does the brain need to get involved?

1 Like

Let’s leave out the LEDs and just consider full spectrum light sources, like the sun, incandescent lamps, or candles. These emit light because they are hot.

In this context, full spectrum just means that every visible wavelength is present. It does NOT mean that the intensity at every wavelength is the same.

The light from a hot object has a “hill shaped” spectrum with an intensity peak at a wavelength which depends on the temperature of the object. (This is the reason you sometimes see the term colour temperature.) The sun is hotter than the lamp filament and has a peak more towards blue than that of the lamp.

So, both have a full spectrum, but the sun provides more blue light than the lamp.
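The peak of that “hill” can be computed with Wien’s displacement law, lambda_peak = b / T, where b is Wien’s displacement constant (about 2.898 × 10⁻³ m·K). A quick sketch for the two temperatures mentioned above:

```python
# Wien's displacement law: a blackbody spectrum peaks at
# lambda_peak = b / T, with Wien's constant b in meter-kelvin.
WIEN_B = 2.897771955e-3  # m*K

def peak_wavelength_nm(temp_kelvin):
    return WIEN_B / temp_kelvin * 1e9  # convert meters to nanometers

for name, temp in [("incandescent lamp", 2700), ("sunlight", 6500)]:
    print(f"{name} at {temp} K peaks near {peak_wavelength_nm(temp):.0f} nm")
```

At 2700 K the peak (about 1073 nm) is actually in the near infrared, with the visible tail leaning red; at 6500 K it sits around 446 nm, in the blue, which is exactly the “more blue light than the lamp” effect described above.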

Looking at your kitchen, the spectrum that reaches your camera is the light source spectrum filtered by the cupboard surface. If sunlit, then more blue light will be reflected, and if lit by a lamp, then more red light will be reflected.

But more blue light will also be reflected if the surface is blue. So how does the camera know if it sees blue because of the light source or the reflecting surface? It doesn’t, unless you tell it what spectrum the light source has. This is what the white balance setting does. Then the camera can compensate for it when interpreting the levels measured by the sensor.
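The compensation itself can be as simple as per-channel gains (a von Kries-style correction): once the camera knows the illuminant, it divides each channel by the sensor’s response to that illuminant, so a white surface comes out neutral. A toy sketch with made-up sensor values (not any real camera pipeline):

```python
import numpy as np

def white_balance(raw_rgb, illuminant_rgb):
    """von Kries-style correction: divide each channel by the sensor's
    response to the illuminant, so a white surface maps to (1, 1, 1)."""
    gains = 1.0 / np.asarray(illuminant_rgb, dtype=float)
    return np.clip(np.asarray(raw_rgb, dtype=float) * gains, 0.0, 1.0)

# Hypothetical raw reading of a *white* fan under warm tungsten light:
# more red than blue reaches the sensor, so the raw pixel looks yellowish.
tungsten_rgb = [0.9, 0.7, 0.4]  # assumed sensor response to the illuminant
fan_pixel = [0.9, 0.7, 0.4]     # a white surface reflects the illuminant as-is

print(white_balance(fan_pixel, tungsten_rgb))  # -> [1. 1. 1.], rendered white
```

With the wrong illuminant setting (say, a daylight preset under tungsten) the same math leaves a color cast, which is why the choice of preset matters.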

This is different from how you see things, because the brain has its own auto white balance. The algorithm is not open source [citation needed], but I’d guess it involves memories of what colour things usually are. The eye also has a much wider field of view than the camera, so there is more information available to the brain - especially if your camera sees a surface with a uniform colour.

In addition, try the auto WB of your brain :slight_smile: Sunglasses often have a yellow-brown tint. If yours do, wear them outside in sunshine for at least half an hour, then take them off. For a while, you will likely find that your surroundings will appear to have harsh, bluish colours.

[Edit to make more clear]

6 Likes

Please don’t do this when you need to convey a specific colour to someone. There are so many influencing factors and so much voodoo, even besides white balance, that you can’t rely on the outcome. Even if the photo looks perfect on your phone, you don’t know what will be displayed at the other end.

People in the graphics industry use carefully colour-calibrated equipment (like cameras, displays and printers) to make sure what they see is what they get. For sensitive work they even control the ambient light around their workstations, because ambient light influences how you see colours on the monitor.

To get the right paint, use a colour sample - one of those small pieces of cardboard available at paint shops or even one of the doors from your kitchen.

With a little practice, you could also estimate a hue pretty accurately in NCS if that system is known where you are and you don’t want to send a physical sample. The NCS code is immune to the factors affecting colour perception in a photo :smiley:

[Edit to insert mandatory xkcd]

4 Likes

OK, this makes sense. It is a serious difference.

You are saying that the sensor detects color both from the reflecting surface and from the ambient light, right? So the color distortion comes from the ambient light, not from the reflection. So the brain enters the game in order to “isolate” the reflected color and “delete” the color contributed by the ambient light.

If this is what you mean, then I got it. This makes sense. Sorry I did not understand this earlier. Thank you.

Yes, I think you got it :slight_smile:

Silly analogy: You have some salt in a bucket and you add water from a spring. How salty would it taste? Depends on the amount of salt in the bucket (reflection) AND the amount of salt already present in the spring water (light source). Once mixed, you can’t tell from where the taste comes or how much salt was in the bucket to start with.

All good, no need to be sorry! Nobody was born with that knowledge. (And I’m probably skimming over a lot of complexity, because my understanding is also limited.)

2 Likes

Be careful with “the Sun emits a full spectrum”. The real sunlight spectrum has absorption lines, so not everything arrives at the Earth; it is partly the fingerprint of the sunlight itself and partly absorption by the Earth’s atmosphere.

Sunlight also gets weaker as it passes through the atmosphere, some parts more than others. So some wavelengths reach the ground more than others (often just small differences, except at the violet end of the visible range).

2 Likes

I want CRI 95, why, it’s the highest number!

The best CRI, that is CRI 100, belongs to incandescent bulbs, candles, and pressure lanterns (I am sure @tracy has a pressure lantern hidden somewhere in his house :grinning:). The moment you move to LEDs, the CRI drops (so far, at least; I do not know if 100 will ever be achieved with LEDs).

Personally, I do not like LEDs for reading. I prefer incandescent bulbs, even though they consume more energy; I find them more relaxing. However, it is almost impossible to get such bulbs in Europe, as they are banned, with one exception: I buy oven lamps for reading! Fortunately, no way has been found so far to substitute LEDs for oven lighting, because of the oven’s high temperature.

They have produced “economy” incandescent lamps, but these contain iodine, and I find they produce a harsher light for reading.

3 Likes

Try one of these. Their LED bulbs produce a very natural, dimmable light that’s great for reading. I’ve got one clipped to the top of my laptop.

(Amazon link here.)

According to the manufacturer’s site, the high-end desktop models are CRI 85. Since this is for the American market, the measurement does not include red, so it is even worse than the apple above at 85. Of course, this is fine for a book, which is typically black letters on white or cream paper. The less expensive clip light from the links above (compared to the desktop version) is just “high CRI” on the manufacturer’s page, without a specific number, so it is probably even lower than 85.

The temperature of these LEDs is either 2700 K or 3000 K (it is unclear), which is good. (If it is 2700 K, even better.)

Dear @amarok, I do not trust the industry the way I trust a social purpose company such as Purism. They clearly used people’s feelings about the climate to raise the cost of bulbs at least six-fold (incandescent compared to LEDs), and they rarely last longer. Moreover, what substances do they use? When they forced us to abandon the old bulbs, their products were the fluorescent ones, and they did everything they could to hide the fact that these contained heavy (toxic) metals. Why should we trust them?

If the substances used to coat the LEDs wear out (in case the LEDs do last that long), wouldn’t those LEDs emit light in the UV range? They will.

As for the excuse that they use less electricity, I have to say that humanity has the know-how to get all the energy we need from Brother Sun. Stop pumping oil. Anything else is an excuse to make money off the climate problem.

1 Like