The recent discovery (by XDA Developers) of an astrophoto time-lapse mode coming to the camera app on Google’s Pixel line of phones piqued my interest. Not that I think it will replace all of our “real” cameras, but I do have a deep appreciation for the engineering wizardry required to push right up to the physical limits of a tiny sensor and lens. And as an astronomy enthusiast, any development that might open an appreciation of the night skies to a wider population gets me very interested.
Even before the dawn of the smartphone (way back in 2007), the built-in camera was one of a phone’s most hyped features. Initially, these were little more than point-and-shoot daylight cameras, but intense competition among Apple, Google, Samsung, and other manufacturers eventually pushed decent performance into low-light scenes by pairing powerful processors and image-processing hardware with sophisticated software.
A Brief History of Digital Astrophotography Cameras
Back in the days of film, astrophotographers were a largely ignored minority of the photo community, resorting to hacks to improve the long-exposure characteristics of film and endlessly searching for emulsions with the deep red response required for astrophotos. When digital cameras came to the forefront, the pattern repeated: the cameras were tailored to “ordinary” daylight or studio photographers, and astrophotographers again resorted to hacks to make them suitable for astrophotography.
The pattern was disrupted in 2005 with the introduction of the “a” version of the Canon 20D. The EOS 20Da was designed specifically with astrophotography in mind: better deep red response, lower noise, live-view focusing, and built-in support for longer exposures. I have heard that we have a top Canon executive with an interest in astronomy to thank for it. Canon followed up with the APS-C 60Da in 2012, and Nikon answered with the full-frame D810A in 2015. Most recently, Canon continued the astrophotography line with the full-frame EOS Ra mirrorless camera.
It seems there are astrophotographers at work at Google, too. In 2018, “Night Sight” mode was introduced on Pixel 3 and newer phones. Night Sight automatically switches to “Astrophotography Mode” when the phone’s motion sensors detect a stationary setup (i.e., a tripod). In both modes, short exposures are taken, analyzed, and combined to reduce noise and adjust color balance. Depending on the exact Pixel model, astrophotography mode uses up to fifteen 16-second sub-exposures to produce a cleaner final image. Keeping the sub-exposures to 16 seconds lets the system not only stack images to clean up noise, but also compensate for the Earth’s rotation, which would otherwise turn stars into streaks.
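The align-and-stack idea behind those sub-exposures can be sketched in a few lines. This is a toy illustration, not Google’s actual pipeline (which, by their descriptions, aligns tiles at sub-pixel accuracy and does far more analysis): here each frame’s drift against a reference is estimated by FFT phase correlation, assuming pure whole-pixel translation, then the frames are shifted back into register and averaged to beat down the noise.

```python
import numpy as np

def phase_shift(ref, frame):
    """Estimate the integer (dy, dx) shift that realigns `frame` with `ref`,
    using FFT phase correlation."""
    F = np.fft.fft2(ref) * np.conj(np.fft.fft2(frame))
    corr = np.fft.ifft2(F / (np.abs(F) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap the circular peak position into a signed shift
    h, w = ref.shape
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)

def align_and_stack(frames):
    """Shift each sub-exposure onto the first frame, then average."""
    ref = frames[0]
    acc = ref.astype(np.float64).copy()
    for frame in frames[1:]:
        dy, dx = phase_shift(ref, frame)
        acc += np.roll(frame, (dy, dx), axis=(0, 1))  # realign, then accumulate
    return acc / len(frames)
```

Averaging N aligned frames cuts the random noise by roughly a factor of √N, which is why fifteen 16-second frames can look far cleaner than any single one of them.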
Excellent examples of what can be done have been shared on a Google Photos page: Pixel 3 and Pixel 4 Astrophotography Examples. In particular, a shot showing the nearby galaxies M31 and M33 impressed me; M33, especially, is a dim target! Based on these examples, the Pixel 4 looks like a good choice: it has the image-processing power and will be upgraded with the latest camera software. But perhaps we should wait to see what the anticipated Pixel 6 brings us this year.
Good results clearly depend on having the higher-end phones in the Pixel family, where the processing can take advantage of better hardware. Unfortunately for me, my phone is a Pixel 3a XL (nearly three generations old), which lacks the GPU accelerator needed for really deep shots, so my results are not so impressive. In the comparison examples below, the Pixel 3a XL image in the middle was a four-second shot (the maximum my phone allows in astrophoto mode) at ISO 1,157, with the lens info recorded at f/1.8. Both of the flanking shots were 30 seconds (the maximum allowed internally) at ISO 1,600 with f/2.8 lenses on dedicated cameras. While these quick shots (taken over the shoulder of my main scope) are not exact 1:1 comparisons, they give you an idea of what is possible with a low-end Pixel phone versus “normal” cameras.
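Out of curiosity, we can put rough numbers on that exposure gap. Treating collected light as proportional to time × ISO / f-number² is only a back-of-envelope model (it ignores sensor area and quantum efficiency, both of which further favor the dedicated cameras), but it gives a feel for the per-frame difference:

```python
# Back-of-envelope relative exposure: time * ISO / f_number**2.
# Ignores sensor size and quantum efficiency, which also matter greatly.

def relative_exposure(seconds: float, iso: float, f_number: float) -> float:
    return seconds * iso / f_number ** 2

pixel = relative_exposure(4, 1157, 1.8)    # Pixel 3a XL astrophoto sub-exposure
camera = relative_exposure(30, 1600, 2.8)  # the flanking dedicated-camera shots
print(f"dedicated cameras gather roughly {camera / pixel:.1f}x more light per frame")
```

By this crude measure, each 30-second dedicated-camera frame collects a bit over four times the light of a single four-second Pixel sub-exposure, before stacking is even considered.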
Aside from the depth of exposure limitation of my phone, another problem is the sky color. In the Pixel shot, the sky is predominantly green. At my location, I’m unavoidably shooting through light-polluted skies, so that is part of the problem, but the Pixel color balance may also have been thrown off by the dim red lighting I use in my observatory.
For long-duration shooting (e.g. time-lapse mode), I would be concerned about the phone heating up under the heavy computational load, as well as battery drain. I’m guessing cold nights and a big external battery would be important to consider.
The Other Color Balance Problem
The color balance in the comparison shots brings up another issue that has long plagued astrophotographers: cutoff of the red end of the spectrum. An important color in astrophotography is the glow of ionized hydrogen (the hydrogen-alpha emission line at 656 nm), deep in the red end of the visible spectrum.
In conventional photography, the standard (beginning with Kodak) has been to cut off the deep red end of the spectrum. In part, this has to do with the difficulty of designing optics that correctly focus light across the entire visible spectrum. But this “tradition” also (amusingly) traces back to Kodak’s reference card for color balance, known as a “Shirley.” NPR has the story behind the fair-skinned employee chosen to be the standard for color balancing, forever changing the color balance we live with!
As can be seen in the comparison astrophotos above, the red-end cutoff is not exactly matched even among modern digital cameras. In my opinion, the Fujifilm X-T1 does a better job on the red end (astronomically speaking) than the Nikon D850. Both are standard cameras set for daylight balance.
With Google’s image-processing wizardry, I don’t see why the sensors couldn’t be made effective for astrophotography, with the color balance tweaked back for daylight use through appropriate image processing (Google, are you listening?). The sensors themselves are inherently sensitive well into the infrared; it’s just a matter of changing the red cutoff filter placed on top of the sensor. Third parties can replace these filters in most DSLR and mirrorless cameras, but this isn’t practical on tiny cell-phone sensors, and besides, those sensors are tightly tied to the image processing that supports them.
Back to Reality
Realistically speaking, cell phones aren’t going to replace cameras with large sensors and lenses. But I can still appreciate how far they have come, producing amazingly good photos like the one below, taken with a Pixel 3 (hand-held, by my wife) during the January 2019 lunar eclipse!
Google, I’m looking forward to the next astrophotography features you’ve got up your sleeve.