sparse shelves at a publix

Figure 1: A frozen food section at a local Publix

I took this last Tuesday at a local Publix. I photographed it because I'd noticed that shelves all around the store had holes where there was no stock, while other shelves had limited stock spread across the front and pulled forward. I don't know why a major grocery chain like Publix would be having stock issues right now. The shelves looked like this in 2020, during the height of the first phase of the pandemic, because of panic buying. This time it doesn't feel like panic buying has set in again. It just feels like the system that supplies our groceries is breaking down a bit. It stirs a certain apprehension.

Commentary on the Photo

This photo was taken with my iPhone 11 Pro’s wide-angle lens. It came “straight out of the camera” with no post-processing whatsoever. The barrel distortion, especially in the outer thirds, is horrible. I haven’t seen anything this bad since I first started with interchangeable-lens film photography back in the mid-1970s. I know distortion like this can be corrected in-camera because my Olympus Micro Four Thirds cameras do it (as do the few Panasonic bodies I also own). Fortunately the regular and telephoto lenses produce much better photographs. But the wide angle? I use it only when it’s absolutely necessary and I have no other choice because it’s all I have at the time. I’ve read too many times that the best camera is the one you have with you. Not if the camera you have with you is an iPhone 11 Pro using its wide-angle lens.
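
For what it’s worth, this kind of correction is straightforward to sketch in software. Below is a minimal, hypothetical example using OpenCV’s standard undistortion routines; the camera matrix, distortion coefficients, and filename are placeholder values I made up, not the iPhone’s actual lens profile.

```python
# A rough sketch of software lens-distortion correction with OpenCV.
# The intrinsics and distortion coefficients below are placeholders,
# not the iPhone 11 Pro's real calibration data.
import cv2
import numpy as np

img = cv2.imread("frozen_food_aisle.jpg")  # hypothetical filename
h, w = img.shape[:2]

# Placeholder camera matrix: focal length and principal point in pixels.
K = np.array([[1500.0,    0.0, w / 2.0],
              [   0.0, 1500.0, h / 2.0],
              [   0.0,    0.0,     1.0]])

# Placeholder distortion coefficients (k1, k2, p1, p2, k3).
# A negative k1 is the classic signature of barrel distortion.
dist = np.array([-0.25, 0.05, 0.0, 0.0, 0.0])

# Compute a corrected camera matrix and remap the image so that
# straight lines in the scene render as straight lines in the photo.
new_K, _ = cv2.getOptimalNewCameraMatrix(K, dist, (w, h), 0)
corrected = cv2.undistort(img, K, dist, None, new_K)

cv2.imwrite("frozen_food_aisle_corrected.jpg", corrected)
```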

a cat photo illustrates the limits of computational photography

This cute photo of Danï was taken this evening with my iPhone 11. I’d just come back from a wild night at Chuy’s and we were all sitting around in the TV room, with the TV off, reading. When I looked up I glanced toward Danï, who was dozing and looking adorable. I didn’t have a “regular” camera with me, and I didn’t want to disturb her or the moment, so I used my iPhone to capture a few images. I post-processed it in Snapseed, then uploaded it to my Flickr account (clicking on the image will take you there).

Looks nice, doesn’t it? Except for the computational-photography flaws where the camera software attempts to produce the equivalent of bokeh: there’s unexpected and incorrect blurring at the tips of Danï’s ears. Note the fuzziness on her left ear tip (your right) and the complete lack of detail on her right ear tip (your left). What’s wrong, you ask? Let me show you with another photo.

This photo was taken last Christmas when I was first trying out my new (used) M.Zuiko 1.8/75mm prime lens. Note the tufts on the tips of both ears. Then look back up at the first photo. The iPhone’s camera software can’t handle that kind of fine detail when you’re shooting in portrait mode. I’ve seen this issue going all the way back to at least my iPhone 7 Plus. It’s not a camera hardware problem; it’s a problem with Apple’s camera software. And before you say it, no, trading up to an iPhone 13 won’t solve it. I’m running iOS 15.3.1, the latest and greatest on all currently supported iPhones.
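
My working theory, sketched below as a toy example rather than anything resembling Apple’s actual pipeline, is that portrait mode composites a sharp subject over a blurred background using an estimated depth mask, and thin structures like ear tufts get lost when that mask is estimated at limited resolution. Every name and value in the snippet is made up purely for illustration.

```python
# Toy illustration of how synthetic "portrait mode" blur can erase fine
# edge detail. This is NOT Apple's pipeline; it's a simplified sketch
# of depth-mask-driven background blur using NumPy and SciPy.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)
image = rng.random((200, 200))  # stand-in for the photograph

# "True" subject mask: a disc (the cat's head) plus a thin spike (an ear tuft).
yy, xx = np.mgrid[:200, :200]
subject = ((yy - 120) ** 2 + (xx - 100) ** 2) < 60 ** 2
subject[40:60, 99:101] = True  # a 2-pixel-wide tuft above the disc

# The depth/segmentation estimate has limited spatial resolution; model
# that by smoothing and re-thresholding the mask. The thin tuft mostly
# drops out of the estimated subject region.
estimated = gaussian_filter(subject.astype(float), sigma=4) > 0.5

# Composite: keep the image sharp where the mask says "subject",
# blur everywhere else. The tuft gets blurred along with the background.
blurred = gaussian_filter(image, sigma=6)
result = np.where(estimated, image, blurred)

# Count how much of the tuft survived in the estimated subject mask.
tuft = np.zeros_like(subject)
tuft[40:58, 99:101] = True
print("tuft pixels kept sharp:", int((estimated & tuft).sum()), "of", int(tuft.sum()))
```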

That doesn’t make the iPhone camera tragically flawed; far from it. I like using it, and in good light (which we have quite a bit of here in Florida), when I’m not using portrait mode, I get excellent results, as you’ll note in the last post. But you’ll forgive me if, in the future, I reach for my Olympus or Panasonic cameras with a suitable lens for portrait work. Or, as the Brits say, horses for courses.