Monday, August 28, 2017

TECH | Histogram equalization using Apple Metal Performance Shaders and more...

A lot of developers (or would-be developers) are interested in finding a foothold in the world of Apple Metal, but are stymied from the start by the lack of adequate documentation and simple examples. Although Apple claims that MetalKit was built just for those developers, it has failed to provide sample code comprehensible to the newbie.
A dark room, as it appears in the Camera app for iPhone
The same room, after applying the histogram equalization Metal Performance Shader
This post should remedy that by showing beginners how to use MetalKit to display video frames from the iPhone camera, both with Metal Performance Shaders (pre-packaged code) and with hand-coded Metal shaders. The first app uses the histogram equalization Metal Performance Shader; the second app uses only a Metal pass-through shader.


A steep learning curve for every image-processing framework (vImage, OpenGL, and so on) is what has extended the development time for Chroma, an app I'm developing that enables users to see the unseeable, including seeing in the dark. Metal has proven no different; unlike vImage and OpenGL, however, Metal is worth the effort: it performs histogram equalization in real time, leaving the phone free for other tasks that were impossible while vImage or OpenGL handled such an intensive operation.

Both apps do the same thing from the user's perspective: they start the video camera when the app launches and display the Metal-processed output on the screen. Each contains the de minimis code required to process video frames.

The basic structure of both apps is illustrated by the Quartz Composition document below; it identifies their three major components and the image-processing workflow. The three components are the video camera and frame provider, the Metal shader that processes each frame, and the MTKView that displays the result. To map the composition onto the Metal apps, read the Video Input patch as AVFoundation, the Core Image Filter patch as the Metal shader (shaders.metal), and the Image patch as the MTKView:

A Quartz Composition, showing the three major components of both Metal apps

All of the above-described functionality is provided in a single view controller. The code contained therein is the absolute minimum required to achieve the intended functionality, and can be divided into these primary parts (a note on camera permissions follows the list):
  1. Setting up the camera using AVFoundation, and implementing the delegate methods for handling the video output
  2. Setting up the MetalKit view
  3. Converting the sample buffers (video-frame data) output to Metal textures
  4. Processing the output using Metal Performance Shaders or Metal shader
  5. Rendering the texture to the MetalKit view
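Before any of this works on a device, the app needs camera permission: on iOS 10 and later, Info.plist must contain an NSCameraUsageDescription entry (without it, the app is terminated on first camera access), and it is prudent to request access before starting the session. A minimal sketch (the completion-handler body is a placeholder, not part of the sample apps):

    [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo
                             completionHandler:^(BOOL granted) {
        // The handler runs on an arbitrary queue; return to the main queue before touching UI
        dispatch_async(dispatch_get_main_queue(), ^{
            if (granted) {
                // safe to call setupCamera and start the session here
            }
        });
    }];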
Metal Sample App #1: Processing video using a Metal Performance Shader
The following is the code for the initial (and only) view controller:

ViewController.h:

@import UIKit;
@import AVFoundation;
@import CoreMedia;
@import CoreVideo;
#import <MetalKit/MetalKit.h>
#import <Metal/Metal.h>
#import <MetalPerformanceShaders/MetalPerformanceShaders.h>

@interface ViewController : UIViewController <MTKViewDelegate, AVCaptureVideoDataOutputSampleBufferDelegate>

@property (strong, nonatomic) AVCaptureSession *avSession;

@end

ViewController.m:

#import "ViewController.h"

@interface ViewController () {
    MTKView *_metalView;
    id<MTLDevice> _device;
    id<MTLCommandQueue> _commandQueue;
    MTKTextureLoader *_textureLoader;
    id<MTLTexture> _texture;
    CVMetalTextureCacheRef _textureCache;
}

@property (strong, nonatomic) AVCaptureDevice *videoDevice;
@property (nonatomic) dispatch_queue_t sessionQueue;

@end

@implementation ViewController

//
// Setting up the Metal view
//
- (void)viewDidLoad {
    NSLog(@"%s", __PRETTY_FUNCTION__);
    [super viewDidLoad];
    
    _device = MTLCreateSystemDefaultDevice();
    _commandQueue = [_device newCommandQueue]; // created once here, rather than once per frame
    _metalView = [[MTKView alloc] initWithFrame:self.view.bounds];
    [_metalView setContentMode:UIViewContentModeScaleAspectFit];
    _metalView.device = _device;
    _metalView.delegate = self;
    _metalView.clearColor = MTLClearColorMake(1, 1, 1, 1);
    _metalView.colorPixelFormat = MTLPixelFormatBGRA8Unorm;
    _metalView.framebufferOnly = NO;
    _metalView.autoResizeDrawable = NO;
    CVMetalTextureCacheCreate(NULL, NULL, _device, NULL, &_textureCache);
    [self.view addSubview:_metalView];
    
    
    if ([self setupCamera]) {
        [_avSession startRunning];
    }
}

//
// Setting up the video camera using AVFoundation
//
- (BOOL)setupCamera {
    NSLog(@"%s", __PRETTY_FUNCTION__);
    @try {
        NSError *error;
        
        _avSession = [[AVCaptureSession alloc] init];
        [_avSession beginConfiguration];
        [_avSession setSessionPreset:AVCaptureSessionPreset640x480];
        
        // get the default video capture device
        self.videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        if (self.videoDevice == nil) return FALSE;
        
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:self.videoDevice error:&error];
        [_avSession addInput:input];
        
        dispatch_queue_t sampleBufferQueue = dispatch_queue_create("CameraMulticaster", DISPATCH_QUEUE_SERIAL);
        
        AVCaptureVideoDataOutput *dataOutput = [[AVCaptureVideoDataOutput alloc] init];
        [dataOutput setAlwaysDiscardsLateVideoFrames:YES];
        [dataOutput setVideoSettings:@{(id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA)}];
        [dataOutput setSampleBufferDelegate:self queue:sampleBufferQueue];
        
        [_avSession addOutput:dataOutput];
        [_avSession commitConfiguration];
        
    } @catch (NSException *exception) {
        NSLog(@"%s - %@", __PRETTY_FUNCTION__, exception.description);
        return FALSE;
    }
    
    return TRUE; // configuration succeeded
}


//
// The video camera output delegate method, in which each captured frame is converted to a Metal texture
//
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    {
        size_t width = CVPixelBufferGetWidth(pixelBuffer);
        size_t height = CVPixelBufferGetHeight(pixelBuffer);
        
        CVMetalTextureRef texture = NULL;
        CVReturn status = CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, _textureCache, pixelBuffer, NULL, MTLPixelFormatBGRA8Unorm, width, height, 0, &texture);
        if(status == kCVReturnSuccess)
        {
            _metalView.drawableSize = CGSizeMake(width, height);
            _texture = CVMetalTextureGetTexture(texture);
            CFRelease(texture);
        }
    }
}

// NOTE | A better way to display the Metal texture would
// be to remove the view controller as a delegate of the MTKView, so
// that the drawInMTKView: method must be called
// explicitly, and to call that method each time a new texture is
// created from a sample buffer (see the note and sketch following the listing).

//
// Processing the texture, i.e., equalizing its histogram (shown in red), and displaying it with the Metal view (MTKView) (shown in green)
//
- (void)drawInMTKView:(MTKView *)view {
    if (_texture) {
        // creating command encoder
        id<MTLCommandBuffer> commandBuffer = [_commandQueue commandBuffer];
        id<MTLTexture> drawingTexture = _metalView.currentDrawable.texture;
        
        // Histogram equalization
        MPSImageHistogram *calculation;
        MPSImageHistogramEqualization *equalization;
        
        // Information to compute the histogram for the channels of an image.
        MPSImageHistogramInfo histogramInfo;
        histogramInfo.numberOfHistogramEntries = 256;
        histogramInfo.histogramForAlpha = FALSE;
        histogramInfo.minPixelValue = (vector_float4){0,0,0,0};
        histogramInfo.maxPixelValue = (vector_float4){1,1,1,1};
        
        /* Performing histogram equalization requires two filters:
         - An MPSImageHistogram filter which calculates the image's current histogram
         - An MPSImageHistogramEqualization filter which calculates and applies the equalization.
         */
        calculation = [[MPSImageHistogram alloc] initWithDevice:_device histogramInfo:&histogramInfo];
        
        equalization = [[MPSImageHistogramEqualization alloc] initWithDevice:_device histogramInfo:&histogramInfo];
        
        /* The length of the histogram buffer is calculated as follows:
         Number of Histogram Entries * Size of 32-bit unsigned integer * Number of Image Channels
         However, it is recommended that you use the histogramSizeForSourceFormat: method to calculate the buffer length.
         The buffer storage mode is private because its contents are only written by the GPU and never accessed by the CPU.
         */
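        // For example, with 256 histogram entries, 4 bytes per 32-bit entry, and 3 channels
        // (alpha is excluded above), the manual calculation would be 256 * 4 * 3 = 3,072 bytes;
        // histogramSizeForSourceFormat: below performs the equivalent computation for you.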
        NSUInteger bufferLength = [calculation histogramSizeForSourceFormat:_texture.pixelFormat];
        MTLResourceOptions options = MTLResourceStorageModePrivate;
        id<MTLBuffer> histogramInfoBuffer = [_device newBufferWithLength:bufferLength options:options];
        
        // Performing equalization with MPS is a three stage operation:
        
        // 1: The image's histogram is calculated and passed to an MPSImageHistogramInfo object.
        [calculation encodeToCommandBuffer:commandBuffer sourceTexture:_texture histogram:histogramInfoBuffer histogramOffset:0];
        
        // 2: The equalization filter's encodeTransform method creates an image transform which is used to equalize the distribution of the histogram of the source image.
        [equalization encodeTransformToCommandBuffer:commandBuffer sourceTexture:_texture histogram:histogramInfoBuffer histogramOffset:0];
        
        // 3: The equalization filter's encode method applies the equalization transform to the source texture and writes the output to the destination texture.
        [equalization encodeToCommandBuffer:commandBuffer sourceTexture:_texture destinationTexture:drawingTexture];
        
        // committing the drawing
        [commandBuffer presentDrawable:_metalView.currentDrawable];
        [commandBuffer commit];
        
        _texture = nil;
    }
    
}

- (void)mtkView:(MTKView *)view drawableSizeWillChange:(CGSize)size {
    
}

@end
NOTE | You can substitute any other Metal Performance Shader for the histogram equalization shader (shown in red); however, the surrounding code (shown in green) must remain intact. See the sketch below.
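For example, a minimal sketch of such a substitution, replacing the two histogram filters with a single Gaussian blur (MPSImageGaussianBlur is a stock MPS filter; the sigma value here is arbitrary), while the command buffer, drawable, and commit code stay exactly as listed above:

    // In drawInMTKView:, in place of the calculation/equalization code
    MPSImageGaussianBlur *blur = [[MPSImageGaussianBlur alloc] initWithDevice:_device sigma:4.0f];
    [blur encodeToCommandBuffer:commandBuffer
                  sourceTexture:_texture
             destinationTexture:drawingTexture];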
Compare the amount of code required to convert a video frame to a Metal texture in the captureOutput:didOutputSampleBuffer:fromConnection: method to the code traditionally used to convert frames to an image format that can be displayed in a view:

    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer,0);
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, baseAddress, bytesPerRow * height, NULL);
    CGImageRef cgImage = CGImageCreate(width, height, 8, 4*8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst, provider, NULL, NO, kCGRenderingIntentDefault);
    CGColorSpaceRelease(colorSpace);
    CGDataProviderRelease(provider);

    CVPixelBufferUnlockBaseAddress(imageBuffer,0);

    UIImage *image = [[UIImage alloc] initWithCGImage:cgImage scale:1 orientation:UIImageOrientationUp];
    CGImageRelease(cgImage);

    if (image) {
        [self.view.layer setContents:(__bridge id)image.cgImage];
    }

NOTE | A better way to render textures in the above setup would be to remove the view controller as a delegate of the MTKView, replace the drawInMTKView: override with another method (say, render), and call that method each time a new texture is created from a sample buffer.
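A minimal sketch of that approach, assuming the render name suggested above (pausing the view is my addition; it stops MTKView's internal display timer so the view draws only when a new frame arrives):

    // In viewDidLoad:
    _metalView.paused = YES;                 // stop the built-in redraw timer
    _metalView.enableSetNeedsDisplay = NO;   // ignore setNeedsDisplay-driven redraws

    // In captureOutput:didOutputSampleBuffer:fromConnection:, after _texture is set:
    dispatch_async(dispatch_get_main_queue(), ^{
        [self render];
    });

    // render has the same body as drawInMTKView: above, minus the delegate plumbing
    - (void)render {
        if (!_texture) return;
        id<MTLCommandBuffer> commandBuffer = [_commandQueue commandBuffer];
        // ...encode the histogram filters exactly as in drawInMTKView: ...
        [commandBuffer presentDrawable:_metalView.currentDrawable];
        [commandBuffer commit];
        _texture = nil;
    }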

Metal Sample App #2: Processing video using a Metal shader
This is the code for the view controller in the second sample app; it is followed by the source for the Metal shader:

ViewController.h
@import UIKit;
@import AVFoundation;
@import CoreMedia;
@import CoreVideo;
#import <MetalKit/MetalKit.h>
#import <Metal/Metal.h>

@interface ViewController : UIViewController <MTKViewDelegate, AVCaptureVideoDataOutputSampleBufferDelegate>

@end

ViewController.m
#import "ViewController.h"

#define degreesToRadians( degrees ) ( ( degrees ) / 180.0 * M_PI )

@interface ViewController () {
    AVCaptureSession *_avSession;
    AVCaptureDevice *_videoDevice;
}

@end

@implementation ViewController {
    MTKView *_metalView;
    // Non-transient objects
    id<MTLDevice> _device;
    id<MTLCommandQueue> _commandQueue;
    id<MTLLibrary> _library;
    id<MTLBuffer> _vertexBuffer;
    id<MTLComputePipelineState> _computePipelineState;
    CVMetalTextureCacheRef _textureCache;
    MTKTextureLoader *_textureLoader;
    id<MTLTexture> _texture;
}

- (void)viewDidLoad {
    [super viewDidLoad];
    
    // creating MTLDevice and MTKView
    _device = MTLCreateSystemDefaultDevice();
    _metalView = [[MTKView alloc] initWithFrame:self.view.frame device:_device];
    _metalView.delegate = self;
    _metalView.clearColor = MTLClearColorMake(1, 1, 1, 1);
    _metalView.framebufferOnly = NO;
    _metalView.autoResizeDrawable = NO;
    _metalView.colorPixelFormat = MTLPixelFormatBGRA8Unorm;
    _metalView.drawableSize = CGSizeMake(self.view.frame.size.width, self.view.frame.size.height);
    [_metalView setContentMode:UIViewContentModeScaleAspectFit];
    [_metalView setTransform:CGAffineTransformRotate(_metalView.transform, degreesToRadians(90.0))];
    [_metalView setContentScaleFactor:[[UIScreen mainScreen] scale]];
    [_metalView.layer setFrame:[[UIScreen mainScreen] bounds]];
    [_metalView.layer setContentsScale:[[UIScreen mainScreen] scale]];
    
    // creating command queue and shader functions
    _commandQueue = [_device newCommandQueue];
    _library = [_device newDefaultLibrary];
    id<MTLFunction> computeShader = [_library newFunctionWithName:@"compute_shader"];
    
    // creating compute pipeline state
    NSError *err = nil;
    _computePipelineState = [_device newComputePipelineStateWithFunction:computeShader error:&err];
    NSAssert(!err, @"%@", err); // a literal format string avoids misinterpreting % characters in the error text
    
    CVMetalTextureCacheCreate(NULL, NULL, _device, NULL, &_textureCache);
    
    [self.view addSubview:_metalView];
    
    if ([self setupCamera]) {
        [_avSession startRunning];
    }
}

- (BOOL)setupCamera {
    NSLog(@"%s", __PRETTY_FUNCTION__);
    @try {
        NSError * error;
        
        _avSession = [[AVCaptureSession alloc] init];
        [_avSession beginConfiguration];
        [_avSession setSessionPreset:AVCaptureSessionPreset3840x2160];
        
        // get the default video capture device
        _videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        if (_videoDevice == nil) return FALSE;
        
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:_videoDevice error:&error];
        [_avSession addInput:input];
        
        dispatch_queue_t sampleBufferQueue = dispatch_queue_create("CameraMulticaster", DISPATCH_QUEUE_SERIAL);
        
        AVCaptureVideoDataOutput * dataOutput = [[AVCaptureVideoDataOutput alloc] init];
        [dataOutput setAlwaysDiscardsLateVideoFrames:YES];
        [dataOutput setVideoSettings:@{(id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA)}];
        [dataOutput setSampleBufferDelegate:self queue:sampleBufferQueue];
        
        [_avSession addOutput:dataOutput];
        [_avSession commitConfiguration];
        
    } @catch (NSException *exception) {
        NSLog(@"%s - %@", __PRETTY_FUNCTION__, exception.description);
        return FALSE;
    }
    
    // a return inside @finally would override the @catch return value, so the success return lives here
    return TRUE;
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    NSLog(@"%s %lu", __PRETTY_FUNCTION__, CMSampleBufferGetTotalSampleSize(sampleBuffer));
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    {
        size_t width = CVPixelBufferGetWidth(pixelBuffer);
        size_t height = CVPixelBufferGetHeight(pixelBuffer);
        
        CVMetalTextureRef texture = NULL;
        CVReturn status = CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, _textureCache, pixelBuffer, NULL, MTLPixelFormatBGRA8Unorm, width, height, 0, &texture);
        if(status == kCVReturnSuccess)
        {
            _metalView.drawableSize = CGSizeMake(width, height);
            _texture = CVMetalTextureGetTexture(texture);
            if (_texture)
                [_metalView draw];
            CFRelease(texture);
        }
    }
}

- (void)drawInMTKView:(MTKView *)view {
    if (!_texture) return; // no frame captured yet
    
    // creating command encoder
    id<MTLCommandBuffer> commandBuffer = [_commandQueue commandBuffer];
    id<MTLTexture> drawingTexture = view.currentDrawable.texture;
    
    // set textures on the command encoder
    id<MTLComputeCommandEncoder> encoder = [commandBuffer computeCommandEncoder];
    [encoder setComputePipelineState:_computePipelineState];
    [encoder setTexture:_texture atIndex:0];
    [encoder setTexture:drawingTexture atIndex:1];
    
    // dispatch thread groups, rounding up so edge pixels are still covered
    // when the drawable size is not an exact multiple of the threadgroup size
    MTLSize threadGroupCount = MTLSizeMake(16, 16, 1);
    MTLSize threadGroups = MTLSizeMake((drawingTexture.width + threadGroupCount.width - 1) / threadGroupCount.width,
                                       (drawingTexture.height + threadGroupCount.height - 1) / threadGroupCount.height,
                                       1);
    [encoder dispatchThreadgroups:threadGroups threadsPerThreadgroup:threadGroupCount];
    [encoder endEncoding];
    [encoder endEncoding];
    
    // committing the drawing
    [commandBuffer presentDrawable:view.currentDrawable];
    [commandBuffer commit];
}

- (void)mtkView:(MTKView *)view drawableSizeWillChange:(CGSize)size {
    
}

@end

shaders.metal


#include <metal_stdlib>
using namespace metal;

kernel void compute_shader(texture2d<float, access::read> input [[texture(0)]],
                           texture2d<float, access::write> output [[texture(1)]],
                           uint2 gid [[thread_position_in_grid]])
{
    // guard against the out-of-bounds threads introduced by rounding the grid up
    if (gid.x >= output.get_width() || gid.y >= output.get_height()) return;
    float4 color = input.read(gid);
    output.write(color, gid); // pass-through: each pixel is written unchanged
}
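
To hand-code your own effect, change what happens between the read and the write. A minimal variation (my own illustration, not part of the sample project) that converts each pixel to grayscale using the Rec. 709 luma weights:

    kernel void grayscale_shader(texture2d<float, access::read> input [[texture(0)]],
                                 texture2d<float, access::write> output [[texture(1)]],
                                 uint2 gid [[thread_position_in_grid]])
    {
        if (gid.x >= output.get_width() || gid.y >= output.get_height()) return;
        float4 color = input.read(gid);
        // weight each channel by its perceived brightness
        float luma = dot(color.rgb, float3(0.2126, 0.7152, 0.0722));
        output.write(float4(luma, luma, luma, color.a), gid);
    }

Load it with [_library newFunctionWithName:@"grayscale_shader"] in place of "compute_shader".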

Sample Apps Source Code
You can download the source for both sample apps here (built with Xcode 9; runs on iOS 11):