How I created Apple’s Voice Memos clone
Several months ago, I really wanted to build a voice recorder with an audio visualizer like Apple’s Voice Memos app. As a beginner, I googled a lot, but I couldn’t find anything that worked or did what I really wanted. So I decided to go for it and try to build it myself.
Follow my learning path and see how to create a voice recording app that has audio visualization as well as a record button, just like Apple’s Voice Memos app.
The basic setup.
Of course, I’m going to start by creating a new project. Name it whatever you like; I’m going to call mine VoiceMemosClone. Make sure the app runs and you get that satisfying white screen. Everything looks great so far, but before we go any further, let’s break the original Voice Memos app into smaller pieces.
Personally, I find it very difficult, and honestly quite intimidating, to start building or coding something without preparing first. So I like breaking everything into smaller pieces and solving each small task on its own.
So if you look at Apple’s Voice Memos app, you’ll find several key features:
1. The draggable bottom view.
2. The animation of the record/stop button.
3. The audio visualizer while recording.
4. Actual recording.
Now, let’s build them together!
Prepare the basic User Interface.
In Main.storyboard, embed the ViewController in a NavigationController, check Prefers Large Titles, and set the title to “Voice Memos”.
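If you prefer doing the same setup in code rather than in Interface Builder, it might look roughly like this (a sketch; the post itself uses the storyboard checkboxes):

```swift
import UIKit

class ViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // Equivalent of checking "Prefers Large Titles" in the storyboard.
        navigationController?.navigationBar.prefersLargeTitles = true
        title = "Voice Memos"
    }
}
```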
1. The Card View.
First, add two container views to the ViewController: one for a RecordingsViewController and the other for a RecorderViewController. Create the two new UIViewController subclasses via File -> New -> File, choosing the Cocoa Touch Class template.
Change the background color of the RecorderViewController to ViewFlipsideBackgroundColor, change the Size property to Freeform, and in the Size Inspector set the height to 150. Also add a UIView, make it the same size as the ViewController, and give it a black color with 45% alpha. In the RecordingsViewController, add a tableView plus a UIView just like the one in the RecorderViewController, and check the Hidden property on both UIViews. Finally, connect the outlets of the UIViews you just created.
I like following Sean Allen’s approach to building UI, called “Skeletal Storyboards”. If you don’t know what I’m talking about, check out his video here. With that done, let’s jump into the RecorderViewController.
We will be doing almost everything in the RecorderViewController. But first, if you take a look at what we are trying to build and analyze it, you will find the following.
So, let’s start with the handle view. In RecorderViewController.swift, add a new property (call it handleView) and a private method (call it setupHandleView). Also, set up the layout constraints, and don’t forget to call setupHandleView() in viewDidLoad().
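The original gist isn’t embedded here, but the handle view is just the small rounded grab bar at the top of the card. A minimal sketch of what it might look like (the exact sizes and RGB values are my assumptions; the color uses the UIColor(r:g:b:) helper that the next step tells you to create):

```swift
import UIKit

class RecorderViewController: UIViewController {
    // The small grab bar at the top of the card view.
    let handleView = UIView()

    override func viewDidLoad() {
        super.viewDidLoad()
        setupHandleView()
    }

    private func setupHandleView() {
        // UIColor(r:g:b:) comes from the Extensions.swift helper below.
        handleView.backgroundColor = UIColor(r: 208, g: 207, b: 205)
        handleView.layer.cornerRadius = 2.5
        handleView.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(handleView)
        NSLayoutConstraint.activate([
            handleView.topAnchor.constraint(equalTo: view.topAnchor, constant: 5),
            handleView.centerXAnchor.constraint(equalTo: view.centerXAnchor),
            handleView.widthAnchor.constraint(equalToConstant: 37.5),
            handleView.heightAnchor.constraint(equalToConstant: 5)
        ])
    }
}
```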
You will get an error saying “Argument labels ‘(r:, g:, b:)’ do not match any available overloads”. That’s because I’ve made an extension file for UIColor. So, create a new Swift file, call it Extensions, and add the following code to it. Some of this code we will be using later. Everything should work fine now.
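The extension file itself isn’t embedded here; a minimal version of the helper that the error message refers to might look like this (dividing by 255 is the usual convention for 0–255 color components):

```swift
import UIKit

extension UIColor {
    // Convenience initializer so colors can be written with 0–255 values,
    // e.g. UIColor(r: 208, g: 207, b: 205).
    convenience init(r: CGFloat, g: CGFloat, b: CGFloat) {
        self.init(red: r / 255, green: g / 255, blue: b / 255, alpha: 1)
    }
}
```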
2. The Record Button.
I wanted to create a record button just like the app’s, and after some searching I found a great article here, in which Mark Alldritt goes through how he created the button animation. I really recommend reading it. Anyway, just copy the RecordButton.swift and RecordButtonKit.swift files, along with the StartRecording and StopRecording sound files.
To make it work, you will have to install the PRTween pod. So, create a Podfile in your project directory and install it. The pod is written in Objective-C, so we will have to create an Objective-C bridging header.
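A minimal Podfile for this step would look something like the following (the target name assumes you called the project VoiceMemosClone, and the platform version is my choice):

```ruby
platform :ios, '12.0'

target 'VoiceMemosClone' do
  pod 'PRTween'
end
```

After saving it, run `pod install` in the project directory and open the generated .xcworkspace instead of the .xcodeproj.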
To create the bridging header: from File -> New -> File, choose Header File. What you name it really matters, so pay attention here: it must be “nameOfYourProject-Bridging-Header.h”. I will name mine “VoiceMemosClone-Bridging-Header.h”. Add it to your project folder. Finally, select your app target, go to Build Settings, make sure All is selected at the top, find Objective-C Bridging Header, and give it the path of your header file.
Open the header file you just created, delete everything in it, and type just the following line of code.
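The line itself isn’t embedded here; based on the PRTween pod, it is presumably the import of its main header:

```objc
#import "PRTween.h"
```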
Now, in your RecorderViewController.swift file, add the button and its constraints, and add your handleRecording function. As always, don’t forget to call the setup in your viewDidLoad method.
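The gist isn’t embedded here, but the wiring is the usual target-action pattern. A sketch, assuming the RecordButton class copied from Mark Alldritt’s article (the sizes and constraint constants are my assumptions):

```swift
import UIKit

class RecorderViewController: UIViewController {
    let recordButton = RecordButton()  // from the copied RecordButton.swift
    var isRecording = false

    override func viewDidLoad() {
        super.viewDidLoad()
        setupRecordButton()
    }

    private func setupRecordButton() {
        recordButton.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(recordButton)
        NSLayoutConstraint.activate([
            recordButton.centerXAnchor.constraint(equalTo: view.centerXAnchor),
            recordButton.bottomAnchor.constraint(equalTo: view.bottomAnchor, constant: -20),
            recordButton.widthAnchor.constraint(equalToConstant: 65),
            recordButton.heightAnchor.constraint(equalToConstant: 65)
        ])
        recordButton.addTarget(self, action: #selector(handleRecording(_:)),
                               for: .touchUpInside)
    }

    @objc private func handleRecording(_ sender: RecordButton) {
        isRecording.toggle()
        // Start or stop the recorder here (covered later in the post).
    }
}
```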
Now we need to show a couple of things when you tap the recordButton: the audioView and the timeLabel. So, add a label and a UIView (for now) to the RecorderViewController. When we tap the button, we need some animation on the card view; here is what I did.
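That animation code isn’t embedded here. The general idea is to animate the card’s height while showing or hiding the label and audio view; a sketch, where `cardHeightConstraint`, `timeLabel`, and `audioView` are assumed to be outlets/properties of RecorderViewController, and the two heights are my guesses:

```swift
// Inside RecorderViewController.
func expandCard(_ expand: Bool) {
    // Grow the card while recording, shrink it back afterwards.
    cardHeightConstraint.constant = expand ? 300 : 150
    UIView.animate(withDuration: 0.3,
                   delay: 0,
                   usingSpringWithDamping: 0.8,
                   initialSpringVelocity: 0.5,
                   options: .curveEaseOut,
                   animations: {
        self.timeLabel.isHidden = !expand
        self.audioView.isHidden = !expand
        self.view.superview?.layoutIfNeeded()
    })
}
```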
Create a UIView subclass and add the following to it to create the audio visualizer.
Now, in RecorderViewController.swift change the audioView to:
var audioView = AudioVisualizerView()
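The visualizer gist isn’t embedded here either. The core idea is a view that keeps a rolling buffer of audio power levels and redraws vertical bars in draw(_:); a rough sketch (the buffer size, bar width, color, and the add(level:) method name are all my choices):

```swift
import UIKit

class AudioVisualizerView: UIView {
    // Rolling buffer of normalized power levels (0...1), newest last.
    private var levels: [CGFloat] = []

    func add(level: CGFloat) {
        levels.append(max(0, min(1, level)))
        if levels.count > 40 { levels.removeFirst() }  // keep the last 40 samples
        setNeedsDisplay()
    }

    override func draw(_ rect: CGRect) {
        guard let ctx = UIGraphicsGetCurrentContext() else { return }
        ctx.setFillColor(UIColor.red.cgColor)
        let barWidth: CGFloat = 3
        let spacing: CGFloat = 2
        for (i, level) in levels.enumerated() {
            let height = max(1, level * rect.height)
            let bar = CGRect(x: CGFloat(i) * (barWidth + spacing),
                             y: (rect.height - height) / 2,  // center each bar vertically
                             width: barWidth,
                             height: height)
            ctx.fill(bar)
        }
    }
}
```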
Now you just need to set up the recorder and actually record!
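The recording gist isn’t embedded, but the usual approach is an AVAudioSession plus an AVAudioRecorder with metering enabled, polling the meter on a timer and feeding it to the visualizer. A sketch, assuming RecorderViewController declares `var recorder: AVAudioRecorder?` and `var meterTimer: Timer?`, and that the visualizer exposes some method to append a level (here called add(level:)); the file name and timer interval are my choices:

```swift
import AVFoundation

extension RecorderViewController: AVAudioRecorderDelegate {
    func startRecording() {
        let session = AVAudioSession.sharedInstance()
        try? session.setCategory(.playAndRecord, mode: .default)
        try? session.setActive(true)

        let url = FileManager.default
            .urls(for: .documentDirectory, in: .userDomainMask)[0]
            .appendingPathComponent("memo.m4a")
        let settings: [String: Any] = [
            AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
            AVSampleRateKey: 44100,
            AVNumberOfChannelsKey: 1,
            AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
        ]
        recorder = try? AVAudioRecorder(url: url, settings: settings)
        recorder?.delegate = self
        recorder?.isMeteringEnabled = true  // required for averagePower(forChannel:)
        recorder?.record()

        // Poll the meter and feed the visualizer.
        meterTimer = Timer.scheduledTimer(withTimeInterval: 0.05, repeats: true) { [weak self] _ in
            guard let self = self, let recorder = self.recorder else { return }
            recorder.updateMeters()
            // averagePower is in dB (-160...0); map it roughly to 0...1.
            let level = pow(10, recorder.averagePower(forChannel: 0) / 20)
            self.audioView.add(level: CGFloat(level))
        }
    }

    func stopRecording() {
        meterTimer?.invalidate()
        recorder?.stop()
    }
}
```

Remember to add an NSMicrophoneUsageDescription entry to Info.plist, or the app will crash the first time it tries to record.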
The final result.
I’m really happy with the final result… however, there are a few enhancements you could still make to this project, like:
1. Adding delete functionality.
2. Adding Auto Layout, because right now it only works on the iPhone X or XS.
3. Making the cardView (the recorder) draggable.
Ultimately I’m really happy with what I was able to achieve. Not only is it functional, but I had a lot of fun doing it!
I’d love to hear about any experience you may have with a similar project. I’d also like to see what you might do differently to achieve a better result.
I’m now looking forward to the next blog, what should I try to build next?
The full project is on GitHub.
Feel free to use it!
I could never have gotten these results without these people’s great work, so please check them out.