I wonder what’s going on in the background there when it processes the picture. A 5-minute exposure creates star trails even on a wide-angle lens. Must be multiple exposures and focus stacking.
Unsure, but it also outputs a time-lapse at the same time where you can see the stars moving - I’d guess this is a by-product of the process (it’s also cool that you can see some slight cloud movement, which I didn’t know was there before).
Yeah, this is generally how low-light and HDR camera modes work: they take multiple photos and then use a technique called “stacking” to average them - random noise cancels out while the signal (useful light) stays put, so the signal-to-noise ratio improves.
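The averaging idea is easy to sketch. This is a toy illustration, not any phone's actual pipeline - the frame count, noise level, and 1-D "scene" are all made up:

```python
import numpy as np

rng = np.random.default_rng(0)

# A faint "star" buried in sensor read noise (toy 1-D scene).
signal = np.zeros(100)
signal[50] = 1.0  # true star brightness

# Capture N short, noisy exposures of the same scene.
n_frames = 25
frames = [signal + rng.normal(0, 0.5, size=signal.shape) for _ in range(n_frames)]

# Stacking: average the frames. The signal adds coherently while the
# random noise averages out, so SNR improves by roughly sqrt(N).
stacked = np.mean(frames, axis=0)

single_noise = np.std(frames[0][:40])   # noise in a star-free region, one frame
stacked_noise = np.std(stacked[:40])    # same region after stacking
print(single_noise / stacked_noise)     # roughly 5, i.e. sqrt of the frame count
```

The same averaging also underpins HDR merging, just with the frames weighted by exposure instead of averaged uniformly.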
You can also just take 20-30 second exposures if your phone has a “pro” camera mode, then do the rest of the stacking with other tools. The astro mode simply does everything for you, but you get less control over the processing.
It most probably fakes dots based on arc intensity and width. So if two stars happen to lie along the arc, they are both smeared together at the location of the first.
It’s taking a video and aligning/stacking the frames, like you said - not taking an actual long exposure on the sensor. Most photos on modern phone cameras in low light are done this way. There’s a cool paper by Google on their algorithm.
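The align-then-stack step is what prevents star trails: because the sky drifts between frames, each frame has to be shifted back onto a reference before averaging. A rough 1-D sketch of the idea (not Google’s actual method - the shift estimate here is a simple FFT cross-correlation on made-up data):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy scene: one bright star; each frame is shifted (sky drift) plus noise.
signal = np.zeros(200)
signal[100] = 5.0

frames = [np.roll(signal, s) + rng.normal(0, 0.2, size=signal.shape)
          for s in (0, 3, -2, 5, -4)]

reference = frames[0]

def align(frame, ref):
    # Estimate the relative shift via circular cross-correlation (FFT),
    # then undo it so the star lands in the same place in every frame.
    corr = np.fft.ifft(np.fft.fft(ref) * np.conj(np.fft.fft(frame))).real
    shift = int(np.argmax(corr))
    if shift > len(frame) // 2:
        shift -= len(frame)  # interpret large shifts as negative
    return np.roll(frame, shift)

stacked = np.mean([align(f, reference) for f in frames], axis=0)
print(np.argmax(stacked))  # star is back at index 100, sharp instead of smeared
```

Averaging the frames *without* the `align` step would smear the star across all five shifted positions - that smear is exactly the trail a naive long exposure produces.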