Is your iPhone sharing photos with Apple by default?
Apple occasionally makes choices that tarnish its strong privacy-forward reputation, like when it was secretly collecting users’ Siri interactions. Yesterday, a blog post from developer Jeff Johnson highlighted one such choice: an “Enhanced Visual Search” toggle for the Apple Photos app that is seemingly on by default, giving your device permission to share data from your photos with Apple.
Sure enough, when I checked my iPhone 15 Pro this morning, the toggle was switched to on. You can find it for yourself by going to Settings > Photos (or System Settings > Photos on a Mac). Enhanced Visual Search lets you look up landmarks you’ve taken pictures of or search for those images using the names of those landmarks.
To see what it enables in the Photos app, swipe up on a picture you’ve taken of a building and select “Look Up Landmark,” and a card will appear that ideally identifies it. Here are a couple of examples from my phone:
On its face, it’s a convenient expansion of the Visual Look Up feature Apple introduced to Photos in iOS 15, which lets you identify plants or, say, find out what the symbols on a laundry tag mean. But Visual Look Up doesn’t need special permission to share data with Apple, and this does.
A description under the toggle says you’re giving Apple permission to “privately match places in your photos with a global index maintained by Apple.” As for how, there are details in an Apple machine-learning research blog about Enhanced Visual Search that Johnson links to:
The process starts with an on-device ML model that analyzes a given photo to determine if there is a “region of interest” (ROI) that may contain a landmark. If the model detects an ROI in the “landmark” domain, a vector embedding is calculated for that region of the image.
According to the blog, that vector embedding is then encrypted and sent to Apple to compare with its database. The company offers a very technical explanation of vector embeddings in a research paper, but IBM put it more simply, writing that embeddings transform “a data point, such as a word, sentence or image, into an n-dimensional array of numbers representing that data point’s characteristics.”
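To make that a bit more concrete, here’s a rough sketch of what matching an embedding against an index looks like. This is purely illustrative, not Apple’s code: the landmark names and tiny hand-written vectors are made up, real embeddings have hundreds of dimensions and come from an ML model, and Apple’s described flow also encrypts the embedding before it leaves the device, which this sketch skips entirely.

```swift
import Foundation

// Cosine similarity between two embeddings (closer to 1.0 = more similar).
func cosineSimilarity(_ a: [Double], _ b: [Double]) -> Double {
    precondition(a.count == b.count, "Embeddings must have the same dimension")
    let dot = zip(a, b).map { $0.0 * $0.1 }.reduce(0, +)
    let magA = sqrt(a.map { $0 * $0 }.reduce(0, +))
    let magB = sqrt(b.map { $0 * $0 }.reduce(0, +))
    return dot / (magA * magB)
}

// Hypothetical embedding computed from the photo's "region of interest."
let queryEmbedding: [Double] = [0.12, 0.87, 0.33, 0.45]

// Hypothetical stand-in for a "global index" of landmark embeddings.
let landmarkIndex: [(name: String, embedding: [Double])] = [
    (name: "Eiffel Tower",       embedding: [0.10, 0.90, 0.30, 0.40]),
    (name: "Golden Gate Bridge", embedding: [0.80, 0.10, 0.60, 0.20]),
    (name: "Colosseum",          embedding: [0.20, 0.30, 0.90, 0.70]),
]

// Pick whichever stored embedding is most similar to the query.
if let best = landmarkIndex.max(by: {
    cosineSimilarity(queryEmbedding, $0.embedding) < cosineSimilarity(queryEmbedding, $1.embedding)
}) {
    print("Best match: \(best.name)")
}
```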
Like Johnson, I don’t fully understand Apple’s research blogs, and Apple didn’t immediately respond to our request for comment about Johnson’s concerns. It seems as though the company went to great lengths to keep the data private, in part by condensing image data into a format that’s legible to an ML model.
Even so, it seems like it would have been better to make the toggle opt-in, like those for sharing analytics data or recordings of Siri interactions, rather than something users have to discover.