iphone and ipad technology failures in the age of coronavirus

There are countless and mounting problems with our so-called “perfect” technology colliding with imperfect real life. Here are two examples from my own use of Apple technology during the coronavirus pandemic here in the US.

I own a number of Apple devices: several Macs, an iPhone, and a pair of iPad Pros. The biggest issues I have are shared by the iPhone and the iPads, and they have to do with the biometric locking features, TouchID and FaceID.

With TouchID enabled on the iPads, both devices exhibit increasing failures to unlock when the front button is pressed as time passes after initial configuration. No matter how many times I erase and reprogram TouchID, unlocking remains erratic. I believe the problem is that TouchID expects to decode “perfect” fingerprints, and mine are now far from perfect.

Following the coronavirus handwashing recommendations, I wash my hands extensively, both in duration and in number of times per day, especially if I (seldom, as it turns out) have to leave home. To clean up I use Softsoap, manufactured by Colgate-Palmolive. It does a great job, but it also dries out the skin, especially on my fingertips, leaving a lot of cracking and tiny flakes of skin. That detritus makes fingerprint matching with TouchID very difficult, requiring multiple touches/button presses to open the device. Either it eventually succeeds, or I just punch in the password (which for me is at least eight characters, because I’m that paranoid). In the end I decided that, because I don’t go out with either device, I would disable both the password and TouchID. Now, when I want to open an iPad I just push the button and I’m in; with a cover attached, opening the cover gets me in immediately. Securing the device with TouchID while at home is too much of a burden.

Which leads us to the latest and greatest iDevice biometric lock, FaceID. In practice, on my iPhone 11, it works better than TouchID does on any Apple device that supports TouchID. When the phone can see my face from the position where I’d comfortably look at the screen, it rarely fails. When it does fail, it’s because I’ve either pushed my glasses up onto my forehead or taken them off completely. The second time I swipe up from the bottom edge of the screen, it succeeds. So at most two attempts to unlock, which is far better than TouchID. The only time I have to enter my password is when iOS explicitly asks due to its built-in security timeout (after so many days you have to enter the password to get into the iPhone).
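
As an aside for developers: apps see both TouchID and FaceID through the same LocalAuthentication framework, which handles exactly the retry-then-passcode flow described above. Here’s a minimal sketch of that flow; the function name and reason string are my own inventions, and note that the lock screen itself uses lower-level Secure Enclave matching that apps can’t reach.

```swift
import Foundation
import LocalAuthentication

// Minimal sketch: authenticate the device owner with biometrics
// (TouchID or FaceID), letting the system fall back to the passcode.
// The OS owns the prompt and the retry loop; the app only learns
// whether authentication ultimately succeeded.
func unlock(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    // .deviceOwnerAuthentication permits biometry with automatic
    // passcode fallback; .deviceOwnerAuthenticationWithBiometrics
    // would insist on biometry alone.
    guard context.canEvaluatePolicy(.deviceOwnerAuthentication, error: &error) else {
        completion(false) // no biometry enrolled and no passcode set
        return
    }

    context.evaluatePolicy(.deviceOwnerAuthentication,
                           localizedReason: "Unlock your data") { success, _ in
        DispatchQueue.main.async { completion(success) }
    }
}
```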

FaceID worked great until coronavirus arrived and wearing a mask came with it. Now it fails repeatedly whenever I wear my mask. The only saving grace is that FaceID fails fast, and now that I’m used to it failing with the mask on, I’m ready to enter the password immediately. Failing once and then entering the password doesn’t take much longer than a failed first attempt followed by the normally successful second one. Failure due to a face mask underscores the limitations of the existing system.

I say existing because rumor has it that an improved FaceID that can work with a mask on is being engineered and will ship with a coming point release of iOS. In the meantime I can live with the way it behaves because, well, I really have no choice. I won’t remove the device lock, because I always travel away from home with my iPhone and I want it locked down as much as possible.

I look at the failures of both of these security systems as two examples among many. They illuminate the deep limitations of the machine learning that backstops these features on the devices, and they point to a broader pattern of poor quality and frequent failure in our current technology in general. We spend too much on tech that doesn’t work nearly as well as advertised, and we’re putting it into critical parts of our system of systems, leading to overall fragility in our daily lives. I’ve begun to look at our world as a house of cards. The trouble is, you can’t go anywhere else in the world without running into the same issues. If we ever do have a machine apocalypse, it’ll be because the machines themselves malfunctioned, not because they rose up in revolt.

Personal Postscript

Back in October 2019 I spent a premium sum on this particular iPhone 11 Pro Max just for the privilege of trying it out in real life. After seven months of continuous use I now consider that purchase a mistake, especially given how little I’ve come to use a number of its vaunted features. I won’t spend that amount of money again on an iPhone, or any smartphone for that matter, especially an Android. My next iPhone will be “down market”: a much cheaper iPhone 12. I’d considered the latest iPhone SE, but it has TouchID, which I don’t care for all that much anymore. I’m looking at a “low end” iPhone 12 with a smaller screen but with as much battery as possible inside the device. Now all I have to do is bide my time until October of this year.

using the iphone 11 pro max – the camera


In my first post on the iPhone 11 Pro, I wrote in very general terms about my initial experiences with the new hardware. My overall impression of the 11 was very good, with the exceptions noted.

I’m now going to speak a bit about the 11’s new camera array. There’s not going to be much here, certainly nothing extensive, and most certainly nothing in the pixel-peeping, spec-versus-spec manner. I have neither the time nor the patience.

The first photo (“first light”) I took was of the Apple Store interior at Florida Mall. This is the classic use case for cameras of this type: interior photos taken with ambient light, this one using the ultra wide angle lens. To be honest, I’m not too impressed with the ultra wide in this setting. First there’s the noticeable barrel distortion, especially in the outer one-third of the image; look at the ceiling steps at the top for an example. Another aberration the UW lens produces is coma, or in this case negative coma. You can see it clearly in the upper left corner, where two ceiling spots look like tiny comets with their tails pointing inward. I’m no fan of the UW lens and will avoid using it unless I absolutely have no other choice.
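
For those curious what barrel distortion looks like mathematically, it’s usually described with the generic textbook radial distortion model (the Brown–Conrady radial terms); this is the standard model, not anything Apple publishes about its own lens correction:

```latex
% Radial distortion: a point at radius r_u in the ideal (undistorted)
% image lands at radius r_d in the captured image.
\[
  r_d = r_u \left( 1 + k_1 r_u^2 + k_2 r_u^4 \right)
\]
% Barrel distortion corresponds to k_1 < 0: magnification falls off
% toward the corners, so straight lines near the edges bow outward.
```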

Now let’s look at the regular focal length lens on the iPhone 11. For this test my subject was my marmalade cat Bo, who obliged by resting atop his crimson pillow. He hung around long enough for me to take two photos of him, one with my Olympus OM-D E-M5 and the other with the iPhone 11. Both photos are straight out of camera, with absolutely no post processing except cropping to 16:9.

Bo taken with Olympus OM-D E-M5 and Panasonic Leica 1.4/25mm
Bo taken with the iPhone 11.

Note that the interchangeable lens camera is micro four thirds, and the lens and body are both circa-2012 (I believe the lens was released first in 2011). We’re thus comparing a seven-year-old camera with a just-released smartphone camera.

With the Olympus system, I took the photo with the PanLeica 25mm wide open at f/1.4; on the iPhone 11 I used Portrait mode. You’ll note that both have nearly identical bokeh, except that the iPhone 11 achieved its bokeh with computational photography while the Olympus used plain old optical physics. The only place in the iPhone 11 photo where computational bokeh still has problems is the hairs in Bo’s left ear. If you enlarge the iPhone 11 photo a bit, you’ll see a clear line outlining those hairs projecting from Bo’s ear, as if someone laid down a mask and forgot to remove it. It’s not that noticeable except to someone like me, and even then I had to enlarge the photo to really see it. I sincerely doubt the target user would notice it, and if they did, they probably wouldn’t care.
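
The closeness of the two renderings is less surprising once you run the standard equivalence numbers for micro four thirds, which uses a 2× crop factor relative to full frame; a quick back-of-the-envelope:

```latex
% Full-frame equivalence for micro four thirds (crop factor 2):
\[
  f_{eq} = 2 \times 25\,\mathrm{mm} = 50\,\mathrm{mm},
  \qquad
  N_{eq} = 2 \times 1.4 \approx f/2.8
\]
% So the PanLeica wide open renders depth of field roughly like a
% 50mm at f/2.8 on full frame; the iPhone synthesizes a similar
% amount of blur computationally instead of optically.
```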

Also note the overall quality of the images. Again, they are indistinguishable except under the most careful observation. I find it interesting that the iPhone’s cameras use 12MP sensors, while the E-M5 has a 16MP Sony micro four thirds sensor that, at roughly 13mm by 17mm, is considerably larger than any of the iPhone’s sensors. Those so-called low resolution sensors are more than adequate at capturing quality images. I’ve even pulled out and recently used my E-P2, which has a 12MP micro four thirds sensor.
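
Pixel pitch makes the size difference concrete. A rough calculation, assuming the E-M5’s 4608 × 3456 output over its roughly 17.3mm-wide sensor, and the widely reported (not Apple-confirmed) 1.4 µm pixels of the iPhone 11’s wide camera:

```latex
% E-M5: 17.3 mm sensor width spread over 4608 horizontal pixels.
\[
  p_{E\text{-}M5} \approx \frac{17.3\,\mathrm{mm}}{4608} \approx 3.75\,\mu\mathrm{m}
\]
% iPhone 11 wide camera: reportedly about 1.4 um pixels, so each
% E-M5 photosite gathers roughly (3.75 / 1.4)^2, or about 7 times,
% the light-collecting area of an iPhone photosite.
```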

I’m quite impressed with how the overall iPhone 11 system operates. I’m not going to go out and throw my Olympus cameras in the trash; rather, the iPhone 11 has reached a point of parity such that I can integrate it with my other cameras without concern. With careful, thoughtful use you won’t be able to tell the two apart just by looking.

Finally, I present this photo of my miniature hibiscus growing in my back yard. I used the iPhone 11’s telephoto lens as an impromptu close-up lens and adjusted the exposure on the iPhone’s screen down about 1 1/2 stops to my taste. The resulting photo was cropped square (1:1) and then posted here. Once again, it’s a lovely image (to my eyes) and matches the quality of the images I took with my even older E-P2 and recently posted here.
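
For anyone wondering what dialing down 1 1/2 stops works out to, stops are just powers of two:

```latex
% Each stop halves the light, so -1.5 stops means:
\[
  2^{-1.5} \approx 0.35
\]
% i.e. the sensor records roughly a third of the metered exposure,
% which is what darkens the image to taste here.
```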

This will probably be the last post devoted to testing the iPhone 11 camera. For me, the iPhone 11’s camera (with the notable exception of the ultra-wide lens) is a superb instrument, on par with all my other cameras.