Here's a funky Swift and Metal experiment that uses the Metal-texture-from-video technique I described in my recent blog post, Generating & Filtering Metal Textures From Live Video.
Here, I'm only using the luma component from the video feed as an input to the reaction diffusion compute shader I used in my ReDiLab app. The luma value at each pixel tweaks the shader's parameters by amounts I arrived at through a little tinkering.
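The luma texture itself comes from the camera's YCbCr pixel buffers via a CVMetalTextureCache, as covered in that earlier post. The function below is just a rough sketch of that step rather than the project's actual code: it assumes bi-planar YCbCr pixel buffers and wraps plane 0, the luma plane, as a single channel R8Unorm texture for the compute shader to read:

import CoreVideo
import Metal

func lumaTextureFromPixelBuffer(pixelBuffer: CVPixelBuffer,
    textureCache: CVMetalTextureCache) -> MTLTexture?
{
    // Plane 0 of a bi-planar YCbCr pixel buffer holds the luma (Y) channel
    let width = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0)
    let height = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0)

    var cvTexture: CVMetalTexture?

    CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
        textureCache,
        pixelBuffer,
        nil,
        MTLPixelFormat.R8Unorm,
        width,
        height,
        0, // plane index zero is the luma plane
        &cvTexture)

    guard let cvTexture = cvTexture else
    {
        return nil
    }

    return CVMetalTextureGetTexture(cvTexture)
}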
The application has only been tested on my iPad Air 2 under iOS 9 beta 3. Because it's just a quick-and-dirty experiment, there's not much in the way of defensive coding, but if you're yearning for a psychedelic experience, give it a go! The video above was generated by pointing my iPad at a screen playing one of my favourite films, Napoleon Dynamite.
As a point of interest, the initial noise texture is created using Model I/O's MDLNoiseTexture. To convert it to an MTLTexture, I use the noise texture's texel data to replace a region in my Metal texture, textureA:
let noise = MDLNoiseTexture(scalarNoiseWithSmoothness: 0.75,
    name: nil,
    textureDimensions: [Int32(Width), Int32(Height)],
    channelCount: 4,
    channelEncoding: MDLTextureChannelEncoding.UInt8,
    grayscale: false)

// texelDataWithBottomLeftOrigin() returns the noise texture's pixels as an optional NSData
let noiseData = noise.texelDataWithBottomLeftOrigin()

let region = MTLRegionMake2D(0, 0, Int(Width), Int(Height))

if let noiseData = noiseData
{
    // Copy the noise texels into textureA
    textureA.replaceRegion(region,
        mipmapLevel: 0,
        withBytes: noiseData.bytes,
        bytesPerRow: Int(bytesPerRow))
}
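For that copy to work, textureA needs to be a four channel, 8 bit texture with the same dimensions as the noise texture. Its set-up isn't shown above, but it's roughly along these lines (the descriptor settings and the bytesPerRow calculation are assumptions to make the example complete, not necessarily the project's exact values):

let device = MTLCreateSystemDefaultDevice()!

// RGBA8Unorm matches the noise texture's four 8 bit channels
let textureDescriptor = MTLTextureDescriptor.texture2DDescriptorWithPixelFormat(
    MTLPixelFormat.RGBA8Unorm,
    width: Int(Width),
    height: Int(Height),
    mipmapped: false)

// The reaction diffusion compute shader both reads and writes this texture
textureDescriptor.usage = [.ShaderRead, .ShaderWrite]

let textureA = device.newTextureWithDescriptor(textureDescriptor)

// Four bytes per texel: one byte per channel
let bytesPerRow = Int(Width) * 4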
There are some other nice Model I/O procedural textures, including MDLSkyCubeTexture, which simulates a sunlit sky as a cube map, and MDLColorSwatchTexture, which generates a gradient based on color temperature.
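I haven't used them here, but as a taste of the API, the sky cube can be created with a single initialiser call along these lines (the parameter values below are just placeholders):

let sky = MDLSkyCubeTexture(name: nil,
    channelEncoding: MDLTextureChannelEncoding.UInt8,
    textureDimensions: [Int32(160), Int32(160)],
    turbidity: 0.2,
    sunElevation: 0.7,
    upperAtmosphereScattering: 0.2,
    groundAlbedo: 0.1)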
As always, the source code for this project is available at my GitHub repository here. Enjoy!