Inspired by an episode of ATP and by this blog post by Alex Chan, I decided to write a simple app to find out which lenses I use the most on my iPhone.
The post linked above teaches you how to use exiftool, and if your photo library lives on your Mac this is a great option.
But what if your library only lives on your iPhone? Or the pictures on your Mac are not the same ones you have on the phone?
I do have a photo library on my Mac, but it holds my DSLR pictures; most of the pics I take with my iPhone live in my iCloud library, and I only save the most important ones to the Mac.
I created a project and you can find it on GitHub, so if you’re curious about your camera usage you can run it on your iPhone and find out how many pictures you take with each lens.
EXIF metadata
In order to find out the lens used to take a particular picture, we need to read the EXIF metadata associated with it.
I’ll just leave you the link about EXIF on Wikipedia. TL;DR: EXIF metadata is a dictionary associated with a picture, containing various pieces of information such as which lens was used, the GPS location, shutter speed, aperture, ISO and more.
For the sake of our project, we just need one value, called LensModel.
Once we collect the LensModel values of all our pictures we can show them in a UITableView and find out which is our favourite lens.
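To keep the tally, all we really need is a dictionary keyed by lens name. The project has its own stats type, but a minimal sketch of the idea (the names here are mine, not the ones in the repo) could look like this:

// Minimal tally of lens usage; the stats type in the repo may differ.
struct LensUsageStats {
    private(set) var counts: [String: Int] = [:]

    // Increment the counter for a given LensModel value.
    mutating func update(with lensModel: String) {
        counts[lensModel, default: 0] += 1
    }

    // Lenses sorted by how often they appear, ready to feed the table view.
    var sortedByUsage: [(lens: String, count: Int)] {
        counts.map { (lens: $0.key, count: $0.value) }
            .sorted { $0.count > $1.count }
    }
}

Each row of the table view can then show a lens name next to its count.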
Access the library
I created a class ImageLibraryHelper to access the library and extract EXIF metadata.
When you need to access the photo library, remember to set NSPhotoLibraryUsageDescription in your Info.plist. If you don’t, the app will crash when you attempt to access the library.
The text you set there is shown to the user when you try to access the library for the first time. In this app, I ask for permission as soon as the class is created, so by the time the user taps the scan button the permission prompt has already been presented. Of course, if permission is denied, we can’t scan the library.
init() {
    let status = PHPhotoLibrary.authorizationStatus()
    if status != .authorized {
        PHPhotoLibrary.requestAuthorization { (status) in
            if status != .authorized {
                print("authorisation not granted!")
            }
        }
    }
}
PHPhotoLibrary.authorizationStatus() returns the current status. If our app was granted access to the library before, we don’t need to explicitly request authorisation again; otherwise we can ask for it and be notified once the user has reacted to the prompt.
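If you want to be extra safe, you can check the status again right before scanning. A minimal sketch, with a hypothetical scanIfAuthorized method that isn’t in the repo:

// Hypothetical guard (not in the repo): only start scanning when access is granted.
func scanIfAuthorized() {
    guard PHPhotoLibrary.authorizationStatus() == .authorized else {
        print("photo library access not granted")
        return
    }
    // Safe to fetch assets and read their EXIF data here,
    // for example by calling the getPhotos function described below.
}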
Get images from the library
Assuming we have permission, we can now scan the photo library to get images and read EXIF data from each of them.
If you want to view the code from this chapter, look for the getPhotos function in ImageLibraryHelper.
This is how it works: first it queries for assets via PHAsset, then for each asset it requests the image data via PHImageManager and tries to read the EXIF data from it.
let fetchOptions = PHFetchOptions()
fetchOptions.fetchLimit = limit
fetchOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
let results: PHFetchResult = PHAsset.fetchAssets(with: .image, options: fetchOptions)
Let’s start with this piece of code. We create a PHFetchOptions object, a class responsible for configuring how assets are fetched. We can specify a limit, the maximum number of assets we want back from the call. It isn’t mandatory, but if you have thousands of images you may want to set a limit to avoid spending a huge amount of time processing all of them. The second option we set is about sorting: in our example we want the most recent images, so we sort them by creationDate descending. This way, if we set a limit of 100, we get the last 100 pictures from the library.
Finally, we get the assets by calling PHAsset.fetchAssets(). The first parameter is the media type: we pass .image, but .video is also available if you want to fetch videos instead of pictures.
let manager = PHImageManager.default()
let requestOptions = PHImageRequestOptions()
requestOptions.isSynchronous = true
requestOptions.deliveryMode = .fastFormat
requestOptions.isNetworkAccessAllowed = allowNetworkAccess
After getting the assets, it is time to get the images associated with them.
We need to instantiate a PHImageManager, and there is a dedicated class, PHImageRequestOptions, for configuring the request. In my example, I have a variable to allow network access. If it is set to false, only the images found on the device are returned; setting it to true allows the manager to fetch each image from iCloud. As you can imagine, that might consume a lot of data, so in my example I allow the toggle to be enabled only on WiFi. If you want to find out how to do that, read my previous article.
The isSynchronous option can be set to false to avoid blocking the calling thread. I set it to true, as my app just waits for the manager to finish fetching all the images, so I’m fine with being blocked.
The delivery mode option allows us to specify whether we want the best possible quality or whether we are willing to sacrifice image quality for faster delivery. As the EXIF metadata is there even at lower quality, I set fastFormat.
// i is the index of the current asset in the PHFetchResult, see the loop sketch below
let asset = results.object(at: i)
manager.requestImageDataAndOrientation(for: asset, options: requestOptions) { (data, dataUTI, orientation, info) in
    if let data = data,
        let cImage = CIImage(data: data) {
        if let exif = cImage.properties["{Exif}"] as? [String:Any] {
            if let value = exif[self.exifKey] as? String {
                self.exifStats.updateWithValue(value)
            }
        }
    }
}
For each asset returned by the fetch, we ask the PHImageManager to get the corresponding image, as sketched below. We don’t get a UIImage back, but optional data, the data UTI, the orientation, and an info dictionary.
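The snippet above uses an index i, which comes from walking the PHFetchResult we got earlier. This is roughly what the surrounding loop looks like, assuming the results, manager and requestOptions values from the previous snippets (the getPhotos function in the repo may structure it differently):

// Walk the fetched assets by index; results.count respects the fetchLimit set earlier.
for i in 0..<results.count {
    let asset = results.object(at: i)
    manager.requestImageDataAndOrientation(for: asset, options: requestOptions) { (data, dataUTI, orientation, info) in
        // read the EXIF dictionary from data here, as shown above
    }
}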
In order to get EXIF data from the image, we first have to create a CIImage from the data returned by the callback.
What is a CIImage? It is not an actual image like a UIImage; you can think of it as a representation of an image that can be processed by Core Image to produce an actual image. The official documentation says you can think of it as an image “recipe”, and I like this analogy.
All we need from this CIImage is the EXIF data dictionary found in properties. There, we can look for our exifKey, LensModel.
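If you prefer, you can wrap this last step in a small helper so it is easier to test in isolation. A sketch with my own naming, not the exact code from the repo:

import Foundation
import CoreImage

// Extract the lens model string from raw image data, if the EXIF block contains one.
func lensModel(from data: Data) -> String? {
    guard let ciImage = CIImage(data: data),
          let exif = ciImage.properties["{Exif}"] as? [String: Any] else {
        return nil
    }
    return exif["LensModel"] as? String
}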
There is another way people try to gather EXIF metadata from an image: save a UIImage as a JPEG and then look for the EXIF data there.
You can find some examples on StackOverflow, but I tried that approach with PHImageManager and wasn’t able to read the EXIF data that way.
Maybe if you take the picture yourself or choose a single photo via the image picker you can get EXIF data alongside the UIImage, but the approach I showed you is the only one that worked for me with PHImageManager.
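For reference, those attempts usually look something like the sketch below: re-encode the UIImage as JPEG and read its properties with ImageIO. The re-encoded data generally doesn’t carry the original EXIF (UIImage doesn’t keep it), which would explain the behaviour described above. This is my sketch of the approach, not code from the repo.

import UIKit
import ImageIO

// Try to read EXIF after converting a UIImage back to JPEG data.
// Note: UIImage does not carry the original file's EXIF, so the
// resulting dictionary typically lacks keys like LensModel.
func exifFromReencodedJPEG(_ image: UIImage) -> [String: Any]? {
    guard let jpegData = image.jpegData(compressionQuality: 1.0),
          let source = CGImageSourceCreateWithData(jpegData as CFData, nil),
          let properties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [String: Any] else {
        return nil
    }
    return properties["{Exif}"] as? [String: Any]
}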
Happy coding 🙂