The current release of my Nodality app cheats a little when handling images: after the user browses for an image using UIImagePickerController, I create a single UIImage instance of the selected image, which is used in the node previews, used as the final image, and saved as binary data in Core Data.
This technique needs to be improved for several reasons:
- I want to improve Nodality's performance by having a low resolution proxy in the node widgets and a high resolution final image.
- I don't want to bloat my Core Data storage with duplicates of images that exist in the user's albums.
The solution is to use PHImageManager and this blog post discusses the scratch project I've created to explore it. PHImageManagerTwitterDemo is an application that presents the user with two collection views: the upper one displays thumbnails of all of their photographs and tapping on a thumbnail allows them to tweet that image. The lower one displays all the images that have been tweeted. Importantly, the tweeted images in the lower view are persistent, i.e. the user can close the application and that list remains populated.
Let's look at the upper view first. This is an instance of TweetablePhotosBrowser. At the beginning of the class, I create two constants: a reference to PHImageManager's default manager and an instance of PHFetchResult containing the user's images:
let manager = PHImageManager.defaultManager()
let assets = PHAsset.fetchAssetsWithMediaType(.Image, options: nil)
Unsurprisingly, the UICollectionViewDataSource function for returning the number of items returns the count of the assets:
func collectionView(collectionView: UICollectionView, numberOfItemsInSection section: Int) -> Int
{
return assets.count
}
...and the function for returning the item renderer passes the current PHAsset into my ImageItemRenderer class:
func collectionView(collectionView: UICollectionView, cellForItemAtIndexPath indexPath: NSIndexPath) -> UICollectionViewCell
{
let cell = collectionView.dequeueReusableCellWithReuseIdentifier("Cell", forIndexPath: indexPath) as ImageItemRenderer
let asset = assets[indexPath.row] as PHAsset
cell.asset = asset
return cell
}
Once the ImageItemRenderer has its asset property set, it needs to create a thumbnail sized UIImage instance of that asset using requestImageForAsset(). Because this is an asynchronous operation, the request is made inside a didSet observer of the asset variable:
let manager = PHImageManager.defaultManager()
let deliveryOptions = PHImageRequestOptionsDeliveryMode.Opportunistic
let requestOptions = PHImageRequestOptions()
let thumbnailSize = CGSize(width: 100, height: 100)
var asset: PHAsset?
{
didSet
{
if let _asset = asset
{
requestOptions.deliveryMode = deliveryOptions
manager.requestImageForAsset(_asset, targetSize: thumbnailSize, contentMode: PHImageContentMode.AspectFill, options: requestOptions, resultHandler: requestResultHandler)
}
}
}
...and the result handler simply populates a UIImageView with the returned image:
func requestResultHandler (image: UIImage!, properties: [NSObject: AnyObject]!) -> Void
{
imageView.image = image
}
Back in TweetablePhotosBrowser, when the user selects an image, I want to tweet that image at full size. Inside the UICollectionViewDelegate function didSelectItemAtIndexPath, I also invoke requestImageForAsset(), but with high quality delivery and a target size of the asset's actual pixel size:
func collectionView(collectionView: UICollectionView, didSelectItemAtIndexPath indexPath: NSIndexPath)
{
selectedAsset = assets[indexPath.row] as? PHAsset
let targetSize = CGSize(width: selectedAsset!.pixelWidth, height: selectedAsset!.pixelHeight)
let deliveryOptions = PHImageRequestOptionsDeliveryMode.HighQualityFormat
let requestOptions = PHImageRequestOptions()
requestOptions.deliveryMode = deliveryOptions
manager.requestImageForAsset(selectedAsset!, targetSize: targetSize, contentMode: PHImageContentMode.AspectFill, options: requestOptions, resultHandler: requestResultHandler)
}
Once we have the high quality, full sized UIImage, sending it to Twitter is beautifully simple. Twitter support is part of the Social framework and only requires a handful of lines of code:
func requestResultHandler (image: UIImage!, properties: [NSObject: AnyObject]!) -> Void
{
if SLComposeViewController.isAvailableForServiceType(SLServiceTypeTwitter)
{
let twitterController = SLComposeViewController(forServiceType: SLServiceTypeTwitter)
twitterController.setInitialText("Here's a photo from my album!")
twitterController.addURL(NSURL(string: "http://flexmonkey.blogspot.co.uk"))
twitterController.addImage(image)
twitterController.completionHandler = twitterControllerCompletionHandler
if let viewController = window?.rootViewController as? ViewController
{
viewController.presentViewController(twitterController, animated: true, completion: nil)
}
}
}
If you refer to the actual code, I've added a little dialog if Twitter is unavailable.
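That dialog could be sketched along these lines, using a UIAlertController presented from the root view controller. The function name, title and message here are my assumptions; the wording in the actual project may differ:

```swift
// Hypothetical sketch of the 'Twitter unavailable' dialog; names and
// wording are assumptions, the actual project code may differ.
func showTwitterUnavailableDialog()
{
    let alertController = UIAlertController(
        title: "Twitter Unavailable",
        message: "Please sign in to Twitter in the Settings app to tweet photos.",
        preferredStyle: UIAlertControllerStyle.Alert)

    alertController.addAction(UIAlertAction(title: "OK",
        style: UIAlertActionStyle.Default,
        handler: nil))

    if let viewController = window?.rootViewController as? ViewController
    {
        viewController.presentViewController(alertController, animated: true, completion: nil)
    }
}
```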
Once the user has either tweeted or cancelled the Twitter dialog, I want to save a reference to the asset to Core Data. This is where I use the asset's localIdentifier, which is a UUID style string that can easily be saved:
func twitterControllerCompletionHandler(result: SLComposeViewControllerResult) -> Void
{
if let _coreDataDelegate = coreDataDelegate
{
let userAction = result == SLComposeViewControllerResult.Done ? "Tweeted" : "Cancelled"
_coreDataDelegate.saveTweetablePhoto(localIdentifier: selectedAsset!.localIdentifier, userAction: userAction)
sendActionsForControlEvents(UIControlEvents.ValueChanged)
}
}
My CoreDataDelegate contains a few functions for centralising saving and loading.
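As a rough sketch of what that save function might look like, assuming a TweetedPhotos entity with localIdentifier and userAction string attributes and a managedObjectContext property on the delegate (the entity name comes from the post, but the implementation details below are my assumptions, not the project's actual code):

```swift
// Hypothetical sketch of CoreDataDelegate's save function. Assumes a
// TweetedPhotos entity with localIdentifier and userAction string
// attributes; the real project code may differ.
func saveTweetablePhoto(#localIdentifier: String, userAction: String)
{
    if let managedObjectContext = managedObjectContext
    {
        // Insert a new TweetedPhotos managed object...
        let tweetedPhoto = NSEntityDescription.insertNewObjectForEntityForName("TweetedPhotos",
            inManagedObjectContext: managedObjectContext) as TweetedPhotos

        // ...populate it with the asset's identifier and the user's action...
        tweetedPhoto.localIdentifier = localIdentifier
        tweetedPhoto.userAction = userAction

        // ...and persist it.
        var error: NSError?
        managedObjectContext.save(&error)
    }
}
```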
The lower view is an instance of TweetedPhotosBrowser. It is populated by a Core Data call that returns an array of TweetedPhotos instances, so its cellForItemAtIndexPath has to find the correct PHAsset for a given localIdentifier. To do that, it simply loops over the assets returned by fetchAssetsWithMediaType() and finds the one with a matching identifier.
TweetedPhotosBrowser uses the same item renderer as TweetablePhotosBrowser, so once that match is found, it can pass the asset to that renderer:
func collectionView(collectionView: UICollectionView, cellForItemAtIndexPath indexPath: NSIndexPath) -> UICollectionViewCell
{
let cell = collectionView.dequeueReusableCellWithReuseIdentifier("Cell", forIndexPath: indexPath) as ImageItemRenderer
let tweetedPhoto = images[indexPath.row] as TweetedPhotos
let assets = PHAsset.fetchAssetsWithMediaType(.Image, options: nil)
for i in 0 ..< assets.count
{
if assets[i].localIdentifier == tweetedPhoto.localIdentifier
{
cell.asset = assets[i] as? PHAsset
break
}
}
return cell
}
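As an aside, the linear search could be avoided: Photos also offers a fetch directly by identifier, PHAsset.fetchAssetsWithLocalIdentifiers(). This is just a sketch of the alternative, I haven't swapped it into the project:

```swift
// Alternative sketch: fetch the asset directly by its localIdentifier
// rather than looping over every asset in the library.
let fetchResult = PHAsset.fetchAssetsWithLocalIdentifiers([tweetedPhoto.localIdentifier], options: nil)

if fetchResult.count > 0
{
    cell.asset = fetchResult[0] as? PHAsset
}
```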
And there we have it: a simple scratch app that demonstrates using PHImageManager for browsing a user's assets, tweeting those assets and using a string identifier for retrieving those assets between application sessions.
The source code for this project is available at my GitHub repository here.