iOS — AVCaptureSession Record Video With Audio
Hi All,
This tutorial shows how to capture video with audio in real time using a capture session, in the Swift language. Before starting, you should be aware of the AVFoundation framework.
AVFoundation is used to create, edit, and play audiovisual media, and to build effective media functionality. It contains a large number of classes, but our functionality does not need all of them; we use AVCaptureSession, AVCaptureVideoPreviewLayer, AVCaptureMovieFileOutput, AVCaptureDevice, and AVCaptureDeviceInput. If you want more, just study these classes.
Step 1 : Import the AVFoundation framework in your view controller class. Also add the NSCameraUsageDescription and NSMicrophoneUsageDescription keys to your Info.plist, since iOS requires them before granting camera and microphone access.
import AVFoundation
Step 2 : Declare these properties:
@IBOutlet weak var videoView: UIView!
var captureSession = AVCaptureSession()
var previewLayer = AVCaptureVideoPreviewLayer()
var movieOutput = AVCaptureMovieFileOutput()
var videoCaptureDevice : AVCaptureDevice?
Step 3 : In viewDidLoad we need to set up the capture device, media type, input, and output.
AVCaptureDevice gets the devices for a media type, for example video or audio:
let devices = AVCaptureDevice.devices(withMediaType: AVMediaTypeVideo) as? [AVCaptureDevice]
This gets the devices for AVMediaTypeVideo. A device may be the front camera or the back camera; we focus on the back camera:
if device.position == AVCaptureDevicePosition.back {
    // do stuff
}
AVCaptureInput : A capture input that provides media from a capture device to a capture session.
Before adding an input, remove all existing inputs from the capture session; otherwise the app will crash because the session is unable to add multiple inputs of the same type.
If you only want video, add just the video input; for video with sound, add the audio input as well as the video input.
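The cleanup described above can be sketched like this (in Swift 3-era AVFoundation, `captureSession.inputs` is bridged as `[Any]`, so each element is cast before removal):

```swift
// Remove every input already attached to the session; adding a second
// input of the same media type would make the session crash the app.
for input in captureSession.inputs {
    if let captureInput = input as? AVCaptureInput {
        captureSession.removeInput(captureInput)
    }
}
```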
do {
    try self.captureSession.addInput(AVCaptureDeviceInput(device: videoCaptureDevice))
} catch {
    print(error)
}
This is the final code to add the input and output and run the session:
func avCaptureVideoSetUp() {
    if let devices = AVCaptureDevice.devices(withMediaType: AVMediaTypeVideo) as? [AVCaptureDevice] {
        for device in devices {
            if device.hasMediaType(AVMediaTypeVideo) {
                if device.position == AVCaptureDevicePosition.back {
                    videoCaptureDevice = device
                }
            }
        }
        if videoCaptureDevice != nil {
            do {
                // Add video input
                try self.captureSession.addInput(AVCaptureDeviceInput(device: videoCaptureDevice))
                // Get audio device
                let audioInput = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeAudio)
                // Add audio input
                try self.captureSession.addInput(AVCaptureDeviceInput(device: audioInput))
                // Attach a preview layer showing the camera feed
                self.previewLayer = AVCaptureVideoPreviewLayer(session: self.captureSession)
                previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
                previewLayer.connection.videoOrientation = AVCaptureVideoOrientation.portrait
                self.videoView.layer.addSublayer(self.previewLayer)
                // Add file output and start the session
                self.captureSession.addOutput(self.movieOutput)
                captureSession.startRunning()
            } catch {
                print(error)
            }
        }
    }
}
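As step 3 says, the setup belongs in viewDidLoad; a minimal sketch, assuming avCaptureVideoSetUp is defined on the same view controller:

```swift
override func viewDidLoad() {
    super.viewDidLoad()
    // Configure inputs, outputs, and the preview layer, then start the session.
    avCaptureVideoSetUp()
}
```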
Set the frame for the video preview layer when using Auto Layout:
override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    let bounds: CGRect = videoView.layer.bounds
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
    previewLayer.bounds = bounds
    previewLayer.position = CGPoint(x: bounds.midX, y: bounds.midY)
}
Start/stop video recording:
@IBAction func recordVideoAction(_ sender: UIButton) {
    if movieOutput.isRecording {
        movieOutput.stopRecording()
    } else {
        let paths = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)
        let fileUrl = paths[0].appendingPathComponent("output.mov")
        // Remove any previous recording at the same path
        try? FileManager.default.removeItem(at: fileUrl)
        movieOutput.startRecording(toOutputFileURL: fileUrl, recordingDelegate: self)
    }
}
Finally, save the video to your gallery:
// MARK: - AVCaptureFileOutputRecordingDelegate
extension ViewController: AVCaptureFileOutputRecordingDelegate {
    func capture(_ captureOutput: AVCaptureFileOutput!, didFinishRecordingToOutputFileAt outputFileURL: URL!, fromConnections connections: [Any]!, error: Error!) {
        // Save the recorded video to the camera roll
        if error == nil {
            UISaveVideoAtPathToSavedPhotosAlbum(outputFileURL.path, nil, nil, nil)
        }
    }
}
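Passing nil for the last three parameters means you never learn whether the save succeeded. UISaveVideoAtPathToSavedPhotosAlbum also accepts a completion target and selector; a sketch, assuming the callback method lives on the same ViewController:

```swift
// Pass self and a selector so UIKit calls back when the save finishes.
UISaveVideoAtPathToSavedPhotosAlbum(outputFileURL.path, self,
    #selector(video(_:didFinishSavingWithError:contextInfo:)), nil)

// The completion callback must have exactly this signature.
@objc func video(_ videoPath: String, didFinishSavingWithError error: Error?, contextInfo: UnsafeRawPointer) {
    if let error = error {
        print("Could not save video: \(error)")
    } else {
        print("Video saved to the photo library")
    }
}
```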