A quick tip for taking better phone pictures
Every time we upgrade our contract and get a shiny new smartphone, the technology has leapt ahead. Essentially, the software is trying to make things easier for us by doing a lot of the hard work, i.e. the decision-making, for us. But that's not necessarily a good thing. Here's an example of what I mean.
We all use our phones to take photos. Even a professional like me. I don't tend to take a camera with me when I'm out and about, but my phone is always in my pocket. Our phones are with us all the time, dead easy to use and, most of the time, do a decent job of getting the technical side of photography right. That makes it easy to assume the phone is clever and making the right decisions for us. But it isn't clever. It's just a piece of software programmed to carry out actions based on the thousands, possibly millions, of permutations it runs through. It's been designed to work in most situations, but sometimes it gets things wrong. Or it does an okay job when a thinking, living, breathing human being could do better.
Let's look at the face recognition system, for example. Hold your phone up to shoot a group of people and little boxes appear over the faces, don't they? That's the camera recognising faces, as it's been programmed to do, and assuming they're the most important part of the image. They probably are, so the camera works out the exposure of the photo based on the light falling on those faces. And nine times out of ten that photo will be fine. But sometimes the subject isn't so obvious, like in the photo below. Now, look at that yellow box. It's similar to the face recognition one, but actually I put it there by touching the screen. By doing that I told the camera, "this is the most important bit of this photo." So the camera will focus on that part, and work out the exposure of the photo from the light on that part of the leaves.
Try it for yourself. Point your camera at a landscape, and try touching different parts of the screen. Notice how the photo gets a bit darker or lighter depending on where you touch? Here's a great tip: if you're trying to get a shot of a great sky, but it looks a bit dark or washed out onscreen, try touching where the best colours in the sky are. You'll see the screen change to resemble what you can see in front of you, because you've told the camera to do its calculations on that bit, because that's the most important part.
Look around this photo. What's the most important part? It's the two pointed roofs of the building, isn't it? But there's a light area below, the reflection in the swimming pool, and pale sky above. There's also a dark patch to the right in the shadow of the trees. In photographic terms, this photo is all about the beautiful light and those colours. But by trying to make everything look good, the camera on my phone washed out the colours, so I helped it by touching around the screen until it resembled the actual scene in front of me.
Dead easy, but people forget the touchscreen exists when they turn on the camera. Give it a go. I promise you'll be impressed by how much your photos improve.