Leverage Metal and Core Image to implement fast and efficient filters for your app’s camera


We’ve all seen custom cameras in one form or another in iOS. Normally, you’d implement your own so you can wrap a personal UI around it rather than settling for Apple’s baked-in camera option. But how do we take a custom camera one step further? Filters! Many apps add filters to their cameras for extended functionality.

In this tutorial, we’ll go over how to add filters to your camera. It’s meant to expose you to a fast and efficient way to filter your camera’s live video feed and the pictures it takes.
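As a taste of the approach, here’s a minimal sketch of applying a Core Image filter to a single camera frame. The function name `filtered(_:)` and the sepia filter are just illustrative choices of mine, not necessarily what the tutorial’s pipeline ends up using:

```swift
import CoreImage
import CoreVideo

// Reuse a single CIContext; creating one per frame is expensive.
let context = CIContext()

// Illustrative sketch: apply a sepia-tone filter to one camera frame
// (a CVPixelBuffer, e.g. delivered by AVCaptureVideoDataOutput).
func filtered(_ pixelBuffer: CVPixelBuffer) -> CGImage? {
    let input = CIImage(cvPixelBuffer: pixelBuffer)
    guard let filter = CIFilter(name: "CISepiaTone") else { return nil }
    filter.setValue(input, forKey: kCIInputImageKey)
    filter.setValue(0.8, forKey: kCIInputIntensityKey)
    guard let output = filter.outputImage else { return nil }
    return context.createCGImage(output, from: output.extent)
}
```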

This tutorial assumes you have a…


We’ve all seen custom cameras in one form or another in iOS, but how can we make one ourselves?

This tutorial covers the basics while also touching on more advanced implementations and options. As you’ll soon see, there are plenty of options when it comes to audio/visual hardware interactions on iOS devices! As always, I aim to develop an intuition behind what we’re doing rather than just provide code to copy-paste — starting with the sketch below.
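At the heart of any custom camera is an AVCaptureSession wiring inputs to outputs. Here’s a rough sketch of that skeleton, assuming camera permission has already been granted (error handling omitted):

```swift
import AVFoundation

// Rough sketch of the capture pipeline: one session, a camera input,
// and a photo output.
let session = AVCaptureSession()
session.sessionPreset = .photo

if let camera = AVCaptureDevice.default(for: .video),
   let input = try? AVCaptureDeviceInput(device: camera),
   session.canAddInput(input) {
    session.addInput(input)
}

let photoOutput = AVCapturePhotoOutput()
if session.canAddOutput(photoOutput) {
    session.addOutput(photoOutput)
}

// A preview layer renders the live feed inside your own UI.
let previewLayer = AVCaptureVideoPreviewLayer(session: session)

session.startRunning()
```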

Already know how to make a camera app in iOS? Looking for more of a challenge? …


Signal processing

We’ve all seen audio visualizations in one form or another, but how do we implement one in a Cocoa application? If you have little to no understanding of signal processing or audio in digital form, this tutorial is perfect for you. It will help you navigate and understand Apple’s Accelerate framework and how to render graphics to the screen.
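As a small preview of the kind of work Accelerate does for us, here’s a sketch of computing an RMS level — a common loudness measure — from a buffer of samples. The function names are mine, not the tutorial’s:

```swift
import Accelerate
import Foundation

// Sketch: root-mean-square of a sample buffer via vDSP — a typical
// starting point for level metering.
func rmsLevel(of samples: [Float]) -> Float {
    var rms: Float = 0
    vDSP_rmsqv(samples, 1, &rms, vDSP_Length(samples.count))
    return rms
}

// Convert to decibels for display; `max` guards against log10(0).
func decibels(fromRMS rms: Float) -> Float {
    return 20 * log10(max(rms, .leastNonzeroMagnitude))
}
```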


Visualization of audio

Welcome back to part two. If you haven’t done part one, go do part one 😉

To recap what we did in the last part:

We defined the project requirements, input, and output. The requirement was to use data markers from an audio signal to render visuals to the screen quickly and efficiently. The input was defined as a scalar value per audio sample buffer representing the average loudness (level metering), plus a set of frequency energies (frequency metering). …
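For the frequency-metering half of that recap, one common recipe — and a plausible sketch of it, not necessarily part one’s exact code — is vDSP’s real FFT, which gives us an energy value per frequency bin:

```swift
import Accelerate
import Foundation

// Plausible sketch: squared magnitudes per frequency bin via vDSP's
// real FFT. Assumes `samples.count` is a power of two.
func frequencyEnergies(of samples: [Float]) -> [Float] {
    let log2n = vDSP_Length(log2(Float(samples.count)))
    guard let setup = vDSP_create_fftsetup(log2n, FFTRadix(kFFTRadix2)) else { return [] }
    defer { vDSP_destroy_fftsetup(setup) }

    let halfCount = samples.count / 2
    var real = [Float](repeating: 0, count: halfCount)
    var imag = [Float](repeating: 0, count: halfCount)
    var energies = [Float](repeating: 0, count: halfCount)

    real.withUnsafeMutableBufferPointer { realPtr in
        imag.withUnsafeMutableBufferPointer { imagPtr in
            var split = DSPSplitComplex(realp: realPtr.baseAddress!,
                                        imagp: imagPtr.baseAddress!)
            // Pack the real samples into split-complex form, transform,
            // then read back the energy of each bin.
            samples.withUnsafeBytes {
                vDSP_ctoz($0.bindMemory(to: DSPComplex.self).baseAddress!,
                          2, &split, 1, vDSP_Length(halfCount))
            }
            vDSP_fft_zrip(setup, &split, 1, log2n, FFTDirection(FFT_FORWARD))
            vDSP_zvmags(&split, 1, &energies, 1, vDSP_Length(halfCount))
        }
    }
    return energies
}
```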


First off, yes, I was an iOS developer for the Royal Canadian Air Force (you know, the military one with the planes). It was my second co-op term, and I’m now finishing my second year of undergrad in Computer Engineering at the University of Waterloo.

From L to R: Vikki, Me, Justin Trudeau, Eddie, Kieran

The Royal Canadian Air Force Aerospace Warfare Centre Innovation Lab: Flight Deck

That’s quite a mouthful; essentially, we’re known as the “Flight Deck” inside of Communitech, Canada’s largest innovation centre:


Introduction to Apple’s Metal

Metal shaders? Render pipeline? Vertex shaders? Fragment shaders? If you’re anything like me, these words and phrases sound meaningless or confusing. This tutorial is meant to give you an easy footing in how it all works so you can build from there.
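To make those terms a little more concrete before we dive in, here’s a hedged sketch of the smallest useful shader pair — a pass-through vertex shader and a solid-color fragment shader — compiled from a source string at runtime (the tutorial itself may use a .metal file instead):

```swift
import Metal

let shaderSource = """
#include <metal_stdlib>
using namespace metal;

// The vertex shader runs once per vertex and outputs its clip-space position.
vertex float4 basic_vertex(const device packed_float3 *vertices [[buffer(0)]],
                           uint vid [[vertex_id]]) {
    return float4(vertices[vid], 1.0);
}

// The fragment shader runs once per pixel and outputs its color (here, red).
fragment half4 basic_fragment() {
    return half4(1.0, 0.0, 0.0, 1.0);
}
"""

let device = MTLCreateSystemDefaultDevice()!
let library = try! device.makeLibrary(source: shaderSource, options: nil)
```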

Setup

We’re going to start off with a macOS application so that we can use our Mac’s GPU directly. If you want to do this in an iOS application, you’ll have to run it on a physical device, since the iOS simulator does not support Metal.
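Concretely, the setup amounts to grabbing the default GPU and attaching an MTKView. Something like this sketch — the class and property names are placeholders, not the tutorial’s exact code:

```swift
import Cocoa
import MetalKit

class ViewController: NSViewController {
    var metalView: MTKView!

    override func viewDidLoad() {
        super.viewDidLoad()

        // MTLCreateSystemDefaultDevice() returns the default GPU,
        // or nil on hardware without Metal support.
        guard let device = MTLCreateSystemDefaultDevice() else {
            fatalError("Metal is not supported on this machine")
        }

        // An MTKView is the canvas Metal renders into.
        metalView = MTKView(frame: view.bounds, device: device)
        metalView.clearColor = MTLClearColor(red: 0, green: 0, blue: 0, alpha: 1)
        view.addSubview(metalView)
    }
}
```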


I’ve recently had a condensed 10-day experience with the playground development suite while working on BeatMatch, my WWDC 2019 Student Scholarship application (check it out here). I’ll be using my project for the examples in this article. If you don’t know what a playground looks like on an iPad, feel free to watch my YouTube video here.

For those who aren’t familiar with the student scholarship: you need to submit a creative playground (it can be anything) alongside some written responses. There are a few thousand applicants every year, and this year Apple says in their terms and conditions…

Alex Barbulescu

Creating experiences in iOS | alexs.ca
