- Google’s Night Sight mode has greatly improved since it was first introduced on the Pixel 3, producing faster, sharper, cleaner, and brighter results on newer Pixel models.
- Night Sight is available on all Pixel phones, including older models, and can be used on both front and back cameras, allowing for great selfies.
- Night Sight utilizes AI and machine learning to reduce blur caused by motion, optimize exposure settings, and create more realistic and balanced colors in low-light photos. Google can continue to refine Night Sight through updates, improving camera performance over time.
When Google announced the Pixel 3, it showcased a new photo mode called Night Sight. Since then we’ve seen the technology improve massively, thanks in part to improved camera hardware, but also to all the work Google puts into its photography algorithms. Because of that, you’ll find Night Sight has become faster and produces sharper, cleaner and brighter results on the Pixel 7, Pixel Fold and Pixel 7a than what was possible on the Pixel 3 when it was first introduced.
Here’s everything you need to know about Night Sight and how it works.
What phones get Night Sight?
The good news is that this isn’t a technology limited to the latest Pixel handsets. Google has been pretty generous, offering Night Sight on all Pixel phones: it updated the older models when the Pixel 3 launched and has continued to offer Night Sight on new releases.
The best part of this news is that Night Sight will work on both the front and back cameras, so you should be able to get some great selfies as well.
Google consistently says that the Pixel Camera is only designed to work on Pixel phones. It’s not officially available on other devices, although some people have ported the Pixel Camera to other Android handsets to access Google’s features.
How does Night Sight work?
Night Sight relies on AI and machine learning to give you results that you normally wouldn’t be able to achieve without a tripod and a DSLR camera.
The technology itself is very similar to that first seen on Huawei phones: it captures multiple shots and combines them, with AI working to keep the colour in balance and shake to a minimum.
Here’s what’s happening:
1. Night Sight detects motion
Night Sight detects motion both in terms of hand shake and movement in the scene. Motion is a problem for long exposures because it causes blur. By detecting the motion before the photo is taken, Night Sight can optimise the capture process to reduce blur and give you a sharp photo.
If the phone is steady, it can use a longer exposure to let more light in without worrying about blur. If there’s more shake or movement, it will use more, shorter exposures and then merge the results to minimise the effects of any motion. Merging photos is part of the magic here, as it allows shots taken at different settings to each contribute information to the final image.
All this is automatic – all you have to do is press the button and you get your photo, but you do need to keep the phone steady while this is happening for the best results.
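The trade-off described above can be illustrated with a short sketch. This is a hypothetical heuristic, not Google's actual algorithm: `plan_exposures` picks more, shorter frames as a motion estimate rises, and `merge_frames` averages the aligned frames to cut noise (real pipelines also align and weight frames, which is omitted here).

```python
import numpy as np

def plan_exposures(motion_score, total_budget_s=3.0):
    """Pick a frame count and per-frame exposure from a motion estimate.

    motion_score: 0.0 (perfectly steady) .. 1.0 (lots of shake/movement).
    Hypothetical heuristic for illustration only.
    """
    # More motion -> more, shorter frames; less motion -> fewer, longer frames.
    n_frames = int(round(2 + motion_score * 13))   # 2..15 frames
    per_frame_s = total_budget_s / n_frames
    return n_frames, per_frame_s

def merge_frames(frames):
    """Average aligned frames to reduce noise (alignment step omitted)."""
    stack = np.stack([f.astype(np.float32) for f in frames])
    return np.clip(stack.mean(axis=0), 0, 255).astype(np.uint8)

steady = plan_exposures(0.0)   # few long exposures
shaky = plan_exposures(1.0)    # many short exposures
```

With a steady phone the sketch spends the light budget on two long 1.5 s frames; with heavy shake it splits the same budget into fifteen 0.2 s frames and relies on merging to recover the detail.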
2. Night Sight uses AI to rebalance colour
Night or low-light photos often don’t look anything like the actual scene – they frequently turn yellow or red, taking on a warm cast that doesn’t look natural.
This is where machine learning comes in: it has learnt what things should look like, so it can aim to give you a photo that looks realistic – not just like a poor low-light smartphone snap. Google says the aim is a photo that looks like what you see with your eyes.
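For a sense of what colour rebalancing means in practice, here is a classical "gray-world" white balance: scale each channel so its average matches the overall average. This is a simple stand-in for illustration, not the learned model Night Sight actually uses.

```python
import numpy as np

def gray_world_balance(img):
    """Gray-world white balance: scale each RGB channel so its mean
    matches the image's overall mean, pulling a colour cast back
    toward neutral. Classical technique, not Google's method."""
    img = img.astype(np.float32)
    channel_means = img.reshape(-1, 3).mean(axis=0)  # mean per channel
    gray = channel_means.mean()                      # overall mean
    gains = gray / channel_means                     # per-channel scale
    return np.clip(img * gains, 0, 255).astype(np.uint8)

# A uniform warm-cast image (red channel too strong) gets neutralised.
warm = np.zeros((4, 4, 3), dtype=np.uint8)
warm[...] = (180, 120, 60)
balanced = gray_world_balance(warm)
```

On the warm example, the red channel is scaled down and the blue channel up until all three averages meet in the middle – a crude version of the cast removal described above.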
Once both these processes have happened, you should have a night photo that actually looks good, rather than a blurry mess.
The great thing about this process that relies on AI is that Google can refine it, through updates, so the camera performance gets better over time, rather than relying on hardware changes.
How to use Night Sight
When you open the camera on your Pixel in lower light conditions, you’ll notice the shutter button changes from a solid circle to a solid circle with a crescent moon icon inside. This means it has automatically switched to Night Sight mode.
Alternatively, you can find Night Sight among the shooting modes in your Pixel camera by swiping from left to right at the bottom of the camera screen. Once you’re in Night Sight mode, just tap the shutter button and it begins the capture, which takes a few seconds.
If it’s quite dark, you might need to tap to focus and you’ll normally need to find something that gives some contrast so the camera can snap into focus.
Night Sight sample photos
Google has suggested you share your own night photos with #teampixel #nightsight on social media and has shared some of its own.
We’ve embedded some photos and selfies above from the Pixel Fold and Pixel 7 Pro so you can see for yourself what the end result is like.
On the Pixel 4, Google introduced an extension of Night Sight called astrophotography mode. This takes 4-minute exposures of the night sky, designed to capture the stars. It only works if the phone is completely steady, so it needs to be supported or on a tripod – then you can engage the mode and photograph the stars. It works really well, but you need to be somewhere with little light pollution to get a good result.