📸 Swift Camera — Part 3

Use custom camera view to detect QR code


As usual, the project code for all parts is available on GitHub.

Before we dive in, you should check the code from Part 1, where we discussed creating a custom camera view. The steps below use the same view controller as the other parts and add QR code detection to it.

AVFoundation provides a set of APIs to detect barcodes from our custom camera. For the list of supported barcode types, check AVMetadataObject.ObjectType. In this post we will learn how to detect QR (Quick Response) codes.

Step 1

Create an instance of AVCaptureMetadataOutput in our viewDidLoad method inside the do…catch block and add it to our session as an output, like below.

// Initialize an AVCaptureMetadataOutput object and add it to the
// session as an output
let captureMetadataOutput = AVCaptureMetadataOutput()
captureSession.addOutput(captureMetadataOutput)

Note that our captureSession can have more than one output. In our example we have both AVCapturePhotoOutput and AVCaptureMetadataOutput, so we can take a photo and read metadata simultaneously from a single camera view.
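Wiring two outputs into the same session is just a second addOutput call. A minimal sketch, assuming the `captureSession` created in Part 1:

```swift
import AVFoundation

// Sketch: two outputs on one session (`captureSession` is assumed
// to be the session configured in Part 1).
let capturePhotoOutput = AVCapturePhotoOutput()
let captureMetadataOutput = AVCaptureMetadataOutput()

// Check canAddOutput(_:) before adding; adding an incompatible or
// duplicate output raises a runtime exception otherwise.
if captureSession.canAddOutput(capturePhotoOutput) {
    captureSession.addOutput(capturePhotoOutput)
}
if captureSession.canAddOutput(captureMetadataOutput) {
    captureSession.addOutput(captureMetadataOutput)
}
```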

Step 2

Set the metadata objects delegate and metadataObjectTypes like below.

// Set the delegate and use the main dispatch queue to execute the
// callback
captureMetadataOutput.setMetadataObjectsDelegate(self, queue: DispatchQueue.main)
captureMetadataOutput.metadataObjectTypes = [.qr]

Here, the metadata objects delegate requires a dispatch queue. To keep things simple, we are going with the main queue. But for a production application it's better to create a serial queue for our delegate.
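A dedicated serial queue for the callbacks might look like this (the queue label is just an illustrative name):

```swift
// A dedicated serial queue keeps metadata callbacks off the main thread.
// DispatchQueue(label:) is serial by default unless .concurrent is specified.
let metadataQueue = DispatchQueue(label: "com.example.camera.metadata")
captureMetadataOutput.setMetadataObjectsDelegate(self, queue: metadataQueue)
// Remember: any UI updates made from the delegate must dispatch back
// to the main queue.
```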

Tell captureMetadataOutput which object types we need to detect. It accepts an array of metadata types; if you want to detect several supported types, just list them in the array.
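A few variants, as a sketch. Note that availableMetadataObjectTypes only reports types after the output has been added to a session:

```swift
// Detect only QR codes:
captureMetadataOutput.metadataObjectTypes = [.qr]

// Or detect several specific types:
captureMetadataOutput.metadataObjectTypes = [.qr, .ean13, .code128]

// Or everything this output supports (set this after addOutput(_:),
// otherwise availableMetadataObjectTypes is empty):
captureMetadataOutput.metadataObjectTypes =
    captureMetadataOutput.availableMetadataObjectTypes
```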

Step 3

The AVCaptureMetadataOutputObjectsDelegate method gives us the metadata objects that were detected from our capture session.

Create an extension for ViewController and conform to AVCaptureMetadataOutputObjectsDelegate like below.

extension ViewController: AVCaptureMetadataOutputObjectsDelegate {
    func metadataOutput(_ output: AVCaptureMetadataOutput,
                        didOutput metadataObjects: [AVMetadataObject],
                        from connection: AVCaptureConnection) {
        // get detected metadata

The metadataOutput(_:didOutput:from:) delegate method gives us the detected metadata objects from our connection.

guard let metadataObj = metadataObjects.first as? AVMetadataMachineReadableCodeObject else { return }
if metadataObj.type == .qr {
    if metadataObj.stringValue != nil {

Here, we first cast our metadata object to AVMetadataMachineReadableCodeObject, then check whether its type is QR code or not. The stringValue of the metadata object is our decoded QR code message.

The metadata object also provides the frame of the detected QR code, so we can show the detected frame in our camera preview as well. To get the frame in preview coordinates, use the transformedMetadataObject(for:) method of the AVCaptureVideoPreviewLayer class.

let barCodeObject = videoPreviewLayer?.transformedMetadataObject(for: metadataObj)

barCodeObject.bounds will give us the frame of the detected code.
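One way to show that frame on screen, assuming a `qrCodeFrameView` UIView (an illustrative name, not from the original code) added as a bordered, transparent subview over the camera preview:

```swift
// Sketch: highlight the detected code on the preview.
// `qrCodeFrameView` is assumed to be a transparent UIView with a
// visible border, added above the preview layer.
if let barCodeObject = videoPreviewLayer?.transformedMetadataObject(for: metadataObj) {
    qrCodeFrameView?.frame = barCodeObject.bounds
}
```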

  • If you want to detect a face in the camera view, you can follow the same approach as for QR codes and add the face type to metadataObjectTypes. Check out my GitHub repo branch for more info.
  • iOS 11 has new Vision APIs to detect QR codes as well. The documentation is not mature enough yet; I am still trying to figure out how to read QR codes with the Vision API.
  • The example project on GitHub has more features, such as displaying the detected QR code in a UILabel and drawing a square on the camera preview using the detected metadata bounds.
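For the face-detection branch mentioned above, the same pipeline works with the .face type; a minimal sketch of the changed pieces:

```swift
// Sketch: reuse the metadata pipeline for faces instead of QR codes.
captureMetadataOutput.metadataObjectTypes = [.face]

// In the delegate, faces arrive as AVMetadataFaceObject instead of
// AVMetadataMachineReadableCodeObject:
func metadataOutput(_ output: AVCaptureMetadataOutput,
                    didOutput metadataObjects: [AVMetadataObject],
                    from connection: AVCaptureConnection) {
    for case let face as AVMetadataFaceObject in metadataObjects {
        // face.bounds is in metadata coordinates; convert with
        // transformedMetadataObject(for:) before drawing on the preview.
        print("Detected face with ID: \(face.faceID)")
    }
}
```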

Though the Swift Camera series ends here, I will surely write more short posts in the future. So follow me here or on Twitter to get updates. I would like to thank the CompileSwift editor and everyone who recommended and followed me on Medium.