
Simulating Bokeh with Metal Performance Shaders


Bokeh is an optical effect in which out-of-focus regions of an image take on the shape of a camera's iris - often a polygon such as a hexagon or octagon. The effect can be simulated with both vImage and the Metal Performance Shaders (MPS) framework using the morphology operator, dilate. In this post, I'll look at a simulation of bokeh using MPS.

Creating a Hexagonal Probe

Dilate operators use a probe (also known as a kernel or structuring element) to define the shape that bright areas of an image expand into. In my recent talk, Image Processing for iOS, I demonstrated an example using vImage to create a starburst effect. In this project, I'll create a hexagon-shaped probe using the technique I recently used to create a lens flare effect.

MPSImageDilate accepts a probe in the form of an array of floats which is treated as a two-dimensional grid. Much like a convolution kernel, the width and height of the grid need to be odd numbers. So, the declaration of my MPS dilate operator is:


    lazy var dilate: MPSImageDilate =
    {
        // The probe is a 45 x 45 grid describing a hexagon: elements inside
        // the hexagon are 0, elements outside are 1, so bright pixels dilate
        // into a hexagonal shape.
        var probe = [Float]()
        
        let size = 45
        let v = Float(size / 4)
        let h = v * sqrt(3.0)
        let mid = Float(size) / 2
        
        for i in 0 ..< size
        {
            for j in 0 ..< size
            {
                let x = abs(Float(i) - mid)
                let y = abs(Float(j) - mid)
                
                let element = Float((x > h || y > v * 2.0) ?
                    1.0 :
                    ((2.0 * v * h - v * x - h * y) >= 0.0) ? 0.0 : 1.0)
                
                probe.append(element)
            }
        }
        
        let dilate = MPSImageDilate(
            device: self.device!,
            kernelWidth: size,
            kernelHeight: size,
            values: probe)
   
        return dilate
    }()
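
A quick sanity check - not part of the original project, just an illustrative sketch of my own - is to dump the probe to the console: the block of dots traces out the hexagon that bright pixels will expand into.


    // Debugging aid (my own addition, not in the demo project): prints the
    // probe as a size x size grid, with "." for hexagon elements (value 0)
    // and "#" for excluded elements (value 1).
    func printProbe(probe: [Float], size: Int)
    {
        for i in 0 ..< size
        {
            let row = (0 ..< size)
                .map { probe[i * size + $0] == 0 ? "." : "#" }
                .joinWithSeparator("")
            
            print(row)
        }
    }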

Executing the Dilate

Metal Performance Shaders work on Metal textures, so to apply the dilate to a UIImage, I use MetalKit's texture loader to convert the image to a texture. The syntax is pretty simple:


    lazy var imageTexture: MTLTexture =
    {
        let textureLoader = MTKTextureLoader(device: self.device!)
        let imageTexture: MTLTexture
        
        let sourceImage = UIImage(named: "DSC00773.jpg")!
        
        do
        {
            imageTexture = try textureLoader.newTextureWithCGImage(
                sourceImage.CGImage!,
                options: nil)
        }
        catch
        {
            fatalError("unable to create texture from image")
        }
        
        return imageTexture
    }()

Because Metal's coordinate system is upside-down compared to Core Graphics's, the texture needs to be flipped. I use another MPS shader, MPSImageLanczosScale, with a y scale of -1:


    lazy var rotate: MPSImageLanczosScale =
    {
        let scale = MPSImageLanczosScale(device: self.device!)
        
        var tx = MPSScaleTransform(
            scaleX: 1,
            scaleY: -1,
            translateX: 0,
            translateY: Double(-self.imageTexture.height))
        
        withUnsafePointer(&tx)
        {
            scale.scaleTransform = $0
        }
        
        return scale
    }()

The result of the dilation benefits from a slight Gaussian blur, which is also an MPS shader:


    lazy var blur: MPSImageGaussianBlur =
    {
        return MPSImageGaussianBlur(device: self.device!, sigma: 5)
    }()

Although MPS supports in-place filtering, I use intermediate textures between the scale, dilate and blur steps. A small helper, newTexture(width:height:), simplifies the process of creating them:


    func newTexture(width width: Int, height: Int) -> MTLTexture
    {
        // Creates an empty RGBA8 texture to use as the destination for an
        // intermediate filter pass.
        let textureDescriptor = MTLTextureDescriptor.texture2DDescriptorWithPixelFormat(
            MTLPixelFormat.RGBA8Unorm,
            width: width,
            height: height,
            mipmapped: false)
        
        let texture = device!.newTextureWithDescriptor(textureDescriptor)

        return texture
    } 

...which is used to create the destination textures for the scale and dilate shaders:


    let rotatedTexture = newTexture(width: imageTexture.width, height: imageTexture.height)

    let dilatedTexture = newTexture(width: imageTexture.width, height: imageTexture.height)

To begin using MPS shaders, a command queue and a buffer need to be created:


    let commandQueue = device!.newCommandQueue()
    let commandBuffer = commandQueue.commandBuffer()

...and now I'm ready to execute the dilate and pass its result to the blur. The blur targets the MTKView's current drawable:


    rotate.encodeToCommandBuffer(
        commandBuffer,
        sourceTexture: imageTexture,
        destinationTexture: rotatedTexture)

    dilate.encodeToCommandBuffer(
        commandBuffer,
        sourceTexture: rotatedTexture,
        destinationTexture: dilatedTexture)
    
    blur.encodeToCommandBuffer(
        commandBuffer,
        sourceTexture: dilatedTexture,
        destinationTexture: currentDrawable.texture)

Finally, the command buffer is told to present the MTKView's drawable and is committed for execution:


    commandBuffer.presentDrawable(imageView.currentDrawable!)

    commandBuffer.commit()
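
If, instead of presenting the drawable, you wanted to read the dilated texture back on the CPU (to build a UIImage, say), you could optionally block until the GPU has finished. This isn't part of the demo, just an aside:


    // Optional, and not used in the demo: wait for the command buffer to
    // finish executing before reading back any of its destination textures.
    commandBuffer.waitUntilCompleted()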

There's a demo of this code available here.

Bokeh as a Core Image Filter

The demo is great, but to use my bokeh filter in more general contexts, I've wrapped it up as a Core Image filter which can be used like any other filter. You can find this implementation in Filterpedia.
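
As a rough sketch of how such a wrapper might be used from the Core Image side - the class and image names here are hypothetical and not necessarily those used in Filterpedia:


    // Hypothetical usage: assumes the MPS bokeh code is wrapped in a CIFilter
    // subclass named MetalBokehFilter (an illustrative name, not Filterpedia's).
    let bokehFilter = MetalBokehFilter()

    bokehFilter.setValue(
        CIImage(image: UIImage(named: "DSC00773.jpg")!)!,
        forKey: kCIInputImageKey)

    let bokehImage = bokehFilter.outputImage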





If you'd like to learn more about wrapping up Metal code in Core Image filter wrappers, may I suggest my book Core Image for Swift. Although I don't discuss MPS filters explicitly, I do discuss using Metal compute shaders for image processing. 

Core Image for Swift is available from Apple's iBooks Store and, as a PDF, from Gumroad. IMHO, the iBooks version is better, especially as it contains video assets which the PDF version doesn't.



