“Siri, show me pictures of Jeff Carlson” is a phrase that should work on my iPhone when I’m looking for a new profile picture or checking whether I wore the same shirt at last year’s family gathering. Instead, the Apple Intelligence assistant brings up a web search of other Jeff Carlsons, such as a star of the movie Slap Shot, the leader of a rock band and the late science fiction author. So much for smart AI photo management.
The conundrum is that the Apple Photos app on the phone already knows who I am: I’ve identified photos of myself, and if I do a text search within the app for my name, I get the results I’m looking for. In fact, Apple Photos makes extensive use of AI technology to find images and surface memories — it’s just not yet tied into the larger Apple Intelligence framework.
When you do want to locate photos that you know must be in your library somewhere, you don’t need to haphazardly scroll through the thumbnails to find them. Here are some strategies for making smarter searches that will save you time and trouble.
It’s all about the photo metadata
When you search for a document on your computer, you typically type a word or phrase that you know appears in it; for example, I can quickly locate my tax information by searching for “2024 taxes” or “1099” in the MacOS Finder or Windows File Explorer.
However, an image is composed of colored pixels; there are no words inside the file to search for.
The key to effective photo searches is having metadata — information that lives inside the image file and describes what the image depicts. It used to be your job to tag each photo with keywords and descriptions to make it easier to find. In reality, few people go to the trouble of adding this information. Plus, in this mobile age, it’s difficult to add: the Photos app on the iPhone and iPad includes only a Caption field for writing descriptive text.
But in this case, Photos is actually on your side, because it’s doing a lot of that descriptive work for you in the background. The app uses machine learning to analyze each image and build its own internal database of what it discovers. So even if you’ve never thought about tagging your images, Photos is building metadata for you.
This photo could include metadata such as harbor, bridge, ferry, skyline, water and city to describe its contents.
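If you’re curious what embedded metadata looks like, you can inspect it yourself. Here’s a minimal sketch that uses Apple’s ImageIO framework to read the EXIF, GPS and IPTC blocks out of an image file; the file path is a hypothetical example. (Note that Photos’ machine-learned scene tags live in the app’s internal database, not in the file, so only the traditional metadata shows up this way.)

```swift
import Foundation
import ImageIO

// A minimal sketch: read the metadata embedded in an image file using
// Apple's ImageIO framework. The file path is a hypothetical example.
let url = URL(fileURLWithPath: "/Users/jeff/Pictures/harbor.jpg")

if let source = CGImageSourceCreateWithURL(url as CFURL, nil),
   let properties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any] {

    // EXIF block: capture date, exposure and other camera settings.
    if let exif = properties[kCGImagePropertyExifDictionary] as? [CFString: Any] {
        print("Captured:", exif[kCGImagePropertyExifDateTimeOriginal] ?? "unknown")
    }

    // GPS block: the location data behind place-based searches.
    if let gps = properties[kCGImagePropertyGPSDictionary] as? [CFString: Any] {
        print("Latitude:", gps[kCGImagePropertyGPSLatitude] ?? "none")
    }

    // IPTC block: where keywords and captions live.
    if let iptc = properties[kCGImagePropertyIPTCDictionary] as? [CFString: Any] {
        print("Keywords:", iptc[kCGImagePropertyIPTCKeywords] ?? "none")
    }
}
```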
Strategy 1: Search for items, scenes or places in the photos
There’s no way to tell what information Photos comes up with for any given image, so the rule of thumb is to assume it’s there — or is at least close enough to narrow your search. For example, tap the Search button and type “flowers” in the Search field.
As you type, the app displays an assortment of results: photos you’ve favorited that contain flowers, images with any type of flower, images with the word “flowers” in them and so on. Tap one of the items in the pop-up menu that appears or tap the Search button to reveal everything it found.
Searching for “flowers” (left) brings up all images that Photos thinks include flowers.
Similarly, the Photos app can make guesses about scenes and concepts, such as “spring” or “skiing,” and serve up images that match those ideas.
And if your images were captured by an iPhone or tagged with location data, the search will pick up names of cities, neighborhoods or some landmarks.
Apple says Photos detects the following metadata when it scans your library:
- Date (month or year)
- Place (city or state)
- Business names (museums, for example)
- Category (beach or sunset, for example)
- Events (sports games or concerts, for example)
- A person identified in People & Pets
- Text (an email address or phone number, for example)
- Caption
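Photos doesn’t expose its semantic index to outside apps, but some of the same basic metadata (dates, favorites, media types) can be queried through Apple’s PhotoKit framework. Here’s a minimal sketch, assuming the app has already been granted photo library read access:

```swift
import Photos

// A minimal sketch, assuming photo library read access has already been
// granted. PhotoKit exposes basic metadata such as dates and favorites,
// though not Photos' machine-learned scene tags.
let options = PHFetchOptions()
let oneYearAgo = Calendar.current.date(byAdding: .year, value: -1, to: Date())!
options.predicate = NSPredicate(
    format: "favorite == YES AND creationDate > %@",
    oneYearAgo as NSDate
)
options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]

// Fetch favorited images from the past year, newest first.
let favorites = PHAsset.fetchAssets(with: .image, options: options)
favorites.enumerateObjects { asset, _, _ in
    print(asset.localIdentifier, asset.creationDate ?? "no date")
}
```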
Strategy 2: Search for identified people
The Photos app also uses machine learning to identify when people are in a photo. If you’ve identified them in the People & Pets category, you can type someone’s name into the Search field to find all the photos in which they appear.
Individuals you’ve identified in the People & Pets category (left) can be found by name using the Search field (right).
In fact, when you add multiple people to the search query, Photos narrows the results to photos in which all of the people are present.
When you specify multiple names of people in the Search field, the results show only photos in which all of them appear.
Strategy 3: Search for text that appears in photos
At one time, you’d pay a lot of money for software that could recognize text in a document and turn it into editable text. Now, the Photos app automatically scans all your photos looking for text that appears on signs, menus and other real-world sources.
Any text you type in the Search field that matches words appearing in your photos will come up in the results.
A search for “coffee” includes results in which that word appears in photos.
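Apple doesn’t provide direct access to Photos’ text index, but its Vision framework offers the same kind of on-device text recognition, which gives a sense of what’s happening behind the scenes. Here’s a minimal sketch for the Mac; the file path is a hypothetical example:

```swift
import AppKit
import Vision

// A minimal sketch of on-device text recognition with Apple's Vision
// framework, the same general technique Photos uses to index text in
// images. The file path is a hypothetical example.
let url = URL(fileURLWithPath: "/Users/jeff/Pictures/cafe-sign.jpg")

guard let image = NSImage(contentsOf: url),
      let cgImage = image.cgImage(forProposedRect: nil, context: nil, hints: nil) else {
    fatalError("Could not load image")
}

let request = VNRecognizeTextRequest { request, _ in
    guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
    for observation in observations {
        // Each observation carries ranked candidate readings; take the best one.
        if let best = observation.topCandidates(1).first {
            print(best.string, "(confidence: \(best.confidence))")
        }
    }
}
request.recognitionLevel = .accurate // slower but more precise than .fast

let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
try? handler.perform([request])
```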
Strategy 4: Combine elements for more targeted searches
You can see where this is all going. When you type multiple criteria into the Search field, you can narrow the number of results that appear and find your photos more quickly. For example, searching for “Jeff L. Carlson in Italy in 2022” — and tapping each of those items in the pop-up menu that appears, which highlights them in blue to confirm that Photos recognizes the terms — brings up images of me during a vacation that year.
Combining search terms narrows the number of results.
Strategy 5: Add keywords in Photos for Mac for more specificity
I know I said people don’t tag their photos with keywords, but that doesn’t mean it’s not an effective method of organization. The Photos app on MacOS does include a Keywords field, though it’s not obvious: Select a photo and choose Window > Info to view the floating Info panel, then type in the Keywords field. Separate terms with commas.
Keywords sync across your devices, but it can take some time due to the Photos sync process. In one test, an edit in which I converted a photo to black and white synced promptly, but it took many minutes before the keyword I added (“Grabthar”) came up in a search.
You can add keywords in the MacOS version of Photos and they will be recognized when searching in the iOS and iPadOS versions.
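Keywords added this way live in the Photos library database, and when you export photos with Photos’ keyword metadata option enabled, they travel inside the files as IPTC keywords. To illustrate where that data ends up, here’s a minimal sketch that uses ImageIO to embed IPTC keywords in a copy of an image; both paths are hypothetical:

```swift
import Foundation
import ImageIO
import UniformTypeIdentifiers

// A minimal sketch: copy an image file and embed IPTC keywords in the
// copy using Apple's ImageIO framework. Both paths are hypothetical.
let sourceURL = URL(fileURLWithPath: "/Users/jeff/Pictures/party.jpg")
let destinationURL = URL(fileURLWithPath: "/Users/jeff/Pictures/party-tagged.jpg")

guard let source = CGImageSourceCreateWithURL(sourceURL as CFURL, nil),
      let destination = CGImageDestinationCreateWithURL(
          destinationURL as CFURL, UTType.jpeg.identifier as CFString, 1, nil) else {
    fatalError("Could not open files")
}

// IPTC keywords to merge into the image's existing properties.
let metadata: [CFString: Any] = [
    kCGImagePropertyIPTCDictionary: [
        kCGImagePropertyIPTCKeywords: ["Grabthar", "family", "vacation"]
    ]
]

// Copy the image data and apply the new metadata in one step.
CGImageDestinationAddImageFromSource(destination, source, 0, metadata as CFDictionary)
if CGImageDestinationFinalize(destination) {
    print("Wrote tagged copy to \(destinationURL.path)")
}
```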
What does “Some Results May Not Appear” mean?
At the bottom of the search screen you may see an unobtrusive message: “Some Results May Not Appear.” This is Apple’s way of tempering expectations in case the photo you’re looking for doesn’t come up in a search. More specifically, it’s there because of how Photos indexes your library.
I mentioned earlier that Photos scans your images to find recognizable elements, perform people recognition and collect other metadata. According to Apple, that scanning only happens when your device is locked, charging and connected to Wi-Fi. There’s no way to force it to update its database, which can be frustrating when you’ve recently imported a bunch of new photos but the people in them aren’t yet recognized.
With AI, there’s always something to look forward to, I guess.
For more on working with Apple Photos in iOS 18, don’t miss how to use the new Clean Up tool to remove unwanted elements from your pictures. And if you’re flummoxed by the new interface, learn how to pare the app down by hiding many of the collections and categories fighting for space.