Custom Camera on iOS – AVCaptureSession, AVCaptureVideoPreviewLayer tutorial

Things are going great here at Flatiron School. We are in week 4 of the program, and I must say, this is one of the best experiences I've had. When I think about how much trouble I had understanding the concept of classes and methods just 4 weeks ago, I am amazed by how much code I can now understand from various sources around the internet. It's almost like reading English at this point.

Now that we have some of the core fundamentals down, we have been tasked with creating a custom app as our side project. I had a couple of ideas I wanted to try, but I wanted one that was challenging and that I knew would keep me on my toes. I decided to do something with a custom camera. At first I thought UIImagePickerController was the way to go, but I soon found out that its functionality is way too limited. Essentially, UIImagePickerController is the most basic camera view that Apple provides in its framework. I wanted to control more than just the ability to take a picture; I want to be able to tap into the phone's camera settings and manipulate different features. This is when I came across AVCaptureSession, a class Apple provides to give developers access to the data coming from the phone's camera for audio, video, and photo capture.
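
Just to show the contrast, here is roughly what the stock picker route looks like. This isn't code from my project, just a minimal sketch (it assumes the view controller adopts UIImagePickerControllerDelegate and UINavigationControllerDelegate), but it makes the point: you get a camera, and not much else.

- (void)showStockCamera
{
    //Bail out if there is no camera available (e.g. the simulator)
    if (![UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
        return;
    }

    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypeCamera;
    picker.delegate = self;
    [self presentViewController:picker animated:YES completion:nil];
}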

A quick note on how custom cameras work. Essentially, what you are looking at when you open the camera is in fact a video. The camera opens up a video layer that takes what the camera sees as its input. When you snap a picture, it captures a frame of that stream at that moment, pauses the video, saves the image, and then resumes the video preview playback.
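
I'm not covering the actual "snap" in this post, but for reference, that frame grab is usually done by adding an AVCaptureStillImageOutput to the session and asking it for a still image when the shutter is tapped. The sketch below is my own rough version of that step, not part of the code later in this post; it just assumes a session set up like the one below.

//Add a still image output to the session so frames can be grabbed on demand
AVCaptureStillImageOutput *stillOutput = [[AVCaptureStillImageOutput alloc] init];
stillOutput.outputSettings = @{ AVVideoCodecKey : AVVideoCodecJPEG };
[session addOutput:stillOutput];

//Later, when the shutter button is tapped:
AVCaptureConnection *connection = [stillOutput connectionWithMediaType:AVMediaTypeVideo];
[stillOutput captureStillImageAsynchronouslyFromConnection:connection
                                         completionHandler:^(CMSampleBufferRef buffer, NSError *error) {
    if (buffer) {
        NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:buffer];
        UIImage *photo = [UIImage imageWithData:jpegData];
        //Save or display the photo here
    }
}];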

Below I describe a simple way to get the preview up and running. I found a lot of resources with elaborate code on how to build out a camera with buttons and features, but all I wanted to do to get started was get a working prototype. Once I had that, I could worry about the hard… harder part.

First, create a single view application in Xcode.

Add a view controller to the storyboard and create a UIViewController subclass for it to talk to. Make sure to link them up by setting the class field on the view controller in the storyboard.

Then, replace viewDidLoad with the following code, and you'll have your working camera. You can see it basically follows Apple's guidelines, with a couple of pointers from other resources that helped me get this up and running.

#import <AVFoundation/AVFoundation.h>   // at the top of your view controller's .m file

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view.

    //Capture Session
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetPhoto;

    //Add device (the default video device is the back camera)
    AVCaptureDevice *device =
    [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    //Input
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];

    if (!input)
    {
        NSLog(@"No Input: %@", error);
        return;
    }

    [session addInput:input];

    //Output
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    [session addOutput:output];
    output.videoSettings =
    @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };

    //Preview Layer (fills the view controller's view with the camera feed)
    AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    previewLayer.frame = self.view.bounds;
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.view.layer addSublayer:previewLayer];

    //Start capture session
    [session startRunning];

}

Considering all of this code is being written in the view controller, I decided to refactor by creating a new class named “CaptureSession”, where I moved the above code into separate methods. I have a method for starting a new session, adding an input, and adding a preview layer. This makes it super simple to come along later and change what kinds of inputs I want to add without having to touch a bunch of different places.
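
I won't walk through the whole class here, but based on the methods the view controller calls below, its shape is roughly this. The property and method names (captureSession, previewLayer, addInput, startVideoPreviewLayer) match what the final viewDidLoad uses; the bodies are just a sketch of one way to fill them in.

//CaptureSession.h
#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>

@interface CaptureSession : NSObject

@property (nonatomic, strong) AVCaptureSession *captureSession;
@property (nonatomic, strong) AVCaptureVideoPreviewLayer *previewLayer;

- (void)addInput;
- (void)startVideoPreviewLayer;

@end

//CaptureSession.m
#import "CaptureSession.h"

@implementation CaptureSession

- (instancetype)init
{
    self = [super init];
    if (self) {
        //Create the session up front so the other methods have something to configure
        _captureSession = [[AVCaptureSession alloc] init];
        _captureSession.sessionPreset = AVCaptureSessionPresetPhoto;
    }
    return self;
}

- (void)addInput
{
    //Grab the default camera and wrap it in an input for the session
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (input) {
        [self.captureSession addInput:input];
    } else {
        NSLog(@"No Input: %@", error);
    }
}

- (void)startVideoPreviewLayer
{
    //Create the preview layer; the view controller sizes it and adds it to its view hierarchy
    self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.captureSession];
    self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
}

@end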

Here is what the final code looked like in my viewController.

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view.

    //Start session
    CaptureSession *startSession = [[CaptureSession alloc] init];
    [startSession addInput];
    [startSession startVideoPreviewLayer];

    //Create bounds for video
    CGRect layerRect = [self.view.layer bounds];
    [startSession.previewLayer setBounds:layerRect];
    [startSession.previewLayer setPosition:CGPointMake(CGRectGetMidX(layerRect), CGRectGetMidY(layerRect))];

    //Add preview layer to the view
    [self.view.layer addSublayer:startSession.previewLayer];

    //Start running the current session with the above settings
    [startSession.captureSession startRunning];
}
