To my surprise I realized (I don't know since when) that the Megapixels app went away and was replaced by Camera (Dev Preview). I took some photos without really understanding how to manage the app, and I'm surprised again by the quality of the pictures…
In short, are there any instructions available (even a simple ASCII file would be fine), or where is the git page for this project? Thanks
Could some kind soul shed a bit of light on how to manage the GUI of the app, especially which values to use for gain, exposure, focus and balance, or point me to a document where they're explained?
One problem I noticed: when the app is started it consumes a lot of CPU cycles, because it constantly pulls raw frames from the sensor while waiting for the click; the last raw frame is then used for further post-processing. Why is this *.dng file of ~13M also stored in ~/Pictures when the resulting *.jpg file is only ~1M?
I imagine that the raw data from the image sensor is first stored in a DNG file and then it is converted into a lossy JPG image, but I haven’t looked at the code to see how it works. People who want to edit the image or want to store it in a lossless format like TIFF or PNG will appreciate having the original image data.
DNG is a free/open source format for storing the uncompressed information received from the image sensor. It is similar to the raw formats used by most camera makers, but each maker has its own proprietary raw format, whereas DNG is vendor-neutral, openly documented and easy to work with. For more info, see: https://www.adobe.com/creativecloud/file-types/image/raw/dng-file.html
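If you want to poke at one of those DNG files, ordinary free tools can read them. For example, something along these lines should dump the embedded metadata or develop the raw data into a regular TIFF (the file name is just an example, and this assumes exiftool and dcraw are installed):

    # Show the metadata stored in the DNG (camera, exposure, etc.)
    exiftool ~/Pictures/IMG00001.dng

    # Develop the raw sensor data into a regular TIFF image
    dcraw -T ~/Pictures/IMG00001.dng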
The new focus in the cam is a big step forward. Thanks to all who are involved.
What I'm missing is:
a small indicator, a square or crosshair, showing where the focus is
a shutter signal; I'm always unsure when the picture is shot (after one presses the shutter, a second or so later the picture gets clearer, then it takes ~10 secs with a circling indicator until the small preview “button” shows up … when exactly is the picture taken?)
plus: a way to shoot via a shell command from the command line (while connected via SSH)
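The closest workaround I can think of for that last point is grabbing a frame straight from the V4L2 device with GStreamer over SSH, but that bypasses the app and all of its postprocessing, and the device node and pipeline below are only guesses (check v4l2-ctl --list-devices first):

    # Capture a single frame from the sensor's V4L2 node and encode it as JPEG.
    # /dev/video0 is an assumption -- the phone exposes several video nodes.
    gst-launch-1.0 -e v4l2src device=/dev/video0 num-buffers=1 \
        ! videoconvert ! jpegenc ! filesink location=/tmp/shot.jpg

A proper trigger built into the app would obviously be nicer.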
Here is a screenshot of a camera app on another phone, where I tapped a certain point of the future image (the thin yellow square) to indicate that this part should be sharp (and not the middle or the background).
Is the picture taken immediately when I tap the shutter, or when it goes whiter for a moment about one second later? If at the moment of tapping the shutter, why does it go whiter for a short moment after the tap?
Okay, that will be needed for autofocus, but it won't land any time soon.
Somewhere between those moments. The whiter version is your actual photo. It doesn't have exactly the same brightness because the sensor modes at different resolutions are still far from perfect.
You don’t have to imagine. /usr/share/megapixels/postprocess.sh
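The gist of scripts like that is usually: develop the DNG with dcraw, then let ImageMagick write the JPEG. Stripped down it looks something like this (an illustration, not the shipped script; the real one has more options and handles the burst frames):

    #!/bin/sh
    # Illustrative sketch of a DNG -> JPEG postprocessing step.
    DNG="$1"                 # path to the captured raw file
    BASE="${DNG%.dng}"

    dcraw -T "$DNG"                               # develop the raw data into $BASE.tiff
    convert "$BASE.tiff" -quality 90 "$BASE.jpg"  # encode the JPEG
    rm "$BASE.tiff"                               # keep only the DNG and the JPEG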
It isn't only about lossless vs. lossy, but also about postprocessing the raw data from the sensor. You can generally do more CPU-intensive postprocessing on your computer than on your digital camera (or even your smartphone), so some photographers only transfer the raw file from the camera and do the rest on their computer. (As you say, raw files can be in proprietary formats that require the manufacturer's proprietary software to work with, which is exactly what we don't want.)
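For example, if darktable is installed on the desktop, its command-line exporter can develop the phone's DNG yourself instead of relying on the phone's JPEG (file names are just examples):

    # Develop the raw file on the desktop; darktable applies its default processing.
    darktable-cli IMG00001.dng IMG00001_desktop.jpg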
The postprocessing script is actually /usr/share/millipixels/postprocess.sh. I made a small modification to log the start and stop of the script:
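In essence it's just two date calls around the existing work, something like this (simplified; the actual script does the whole DNG to JPEG conversion in between):

    #!/bin/sh
    # Simplified illustration of the timing change.
    echo "postprocess start: $(date '+%H:%M:%S')" >> /tmp/postprocess.log

    # ... the original postprocessing steps (dcraw, convert, ...) ...

    echo "postprocess stop:  $(date '+%H:%M:%S')" >> /tmp/postprocess.log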
The result: it takes around 11 secs to run the postprocessing, which also means you have to wait 11 secs before you can take another picture. This isn't practical and should be redesigned.
You can shave some milliseconds off that processing time by replacing the shell scripts in Millipixels with compiled C, C++ or Rust code, and I'm sure there are other tricks that can be done in software, but the i.MX 8M Quad lacks an image signal processor and a hardware video/image encoder, so the processing of the photos has to be handled by software running on the four Cortex-A53 cores, which aren't that powerful. Even with good optimization of the software, I doubt that the L5's camera will ever be very fast. Both the i.MX 8M Plus in Fir and the RK3566 in the PinePhone 2 will have an ISP and a hardware video/image encoder, which should improve the speed of their cameras if there is proper driver support for that hardware.