Stream Video Calling: How To Build FaceTime Clone With SwiftUI

Amos Gyamfi
11 min read · Jul 12, 2023


This article guides you through building a FaceTime clone with SwiftUI and Stream's iOS Video SDK so you can chat face-to-face with friends and family.

SwiftUI FaceTime clone

Like many others, I enjoy using FaceTime to chat with family and friends. The app makes it easy to have real-time one-to-one or group audio and video conversations on any of Apple’s devices.

As a user, it’s often easy to forget the sheer complexity that lives under the surface of a seemingly common application like FaceTime.

To explore some of the underlying complexities of building a calling application, let's attempt to recreate Apple's magical FaceTime experience using SwiftUI and Stream's Video API.

To get started, you will need a free account on Stream, a new Xcode project, and a fresh cup of your favorite coffee ☕️!

Customize for 1–1, group calls, and meeting apps

Resources

Explore the following resources to learn more about Stream Video SDKs and how to use them to build great video apps for different use cases.

Overview of Stream Video and Key Features

iOS Video SDK: UIKit, SwiftUI, WebRTC

Stream's iOS Video SDK consists of three separate SDKs:

  • UIKit SDK: A UIKit wrapper for SwiftUI components
  • Low-level client: A WebRTC implementation responsible for establishing calls
  • SwiftUI SDK: SwiftUI components for building call flows such as live streaming, drop-in audio, and voice/video calling.

The separation into multiple layers allows developers to choose the flavor of the SDK that works best for their use case. For example, if you are building a simple calling or meetings application with little need for customization, the high-level UIKit SDK makes the integration process as simple as copying a few lines of code and rebuilding your application.

For use cases that require lots of bespoke design and behavior, the low-level client provides APIs that give developers more granular control over the look and feel of their experience.
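As a rough sketch of what working at the lower level looks like, the snippet below joins a call without any of the prebuilt SwiftUI components. The call type and call ID here are placeholder values, and the exact API surface may vary between SDK versions:

```swift
import StreamVideo

// Sketch: joining a call with the low-level client directly, without
// any of the prebuilt SwiftUI components. "default" and "my-call-id"
// are placeholder values for your own call type and call ID.
func joinCallDirectly(client: StreamVideo) async throws {
    let call = client.call(callType: "default", callId: "my-call-id")

    // Creates the call on Stream's backend if it does not exist, then joins it.
    try await call.join(create: true)

    // From here you can observe call.state yourself and render a fully custom UI.
}
```

In this tutorial, however, we will stay with the SwiftUI SDK and its ready-made components.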

Project Setup

iOS calling app supporting picture-in-picture and group calls

Let’s begin by cloning the sample project from GitHub. We will use this project as a starting point for our integration. Since FaceTime is a relatively complex application, we’ve already started on some of the basics of the UI and the app.

Next, if you have not done so already, sign up and register for a free account to obtain a Stream API key. We will use the API key later in the tutorial to help initialize the SDK and make our first API call. For detailed instructions on how to use the Dashboard, consider checking out our companion guide.

By the end of this blog post, users will be able to create and join a call from our app, similar to the video below. With the boring stuff out of the way, let's dive into some code 🛠️!

The image below demonstrates when there is one person on the call.

One person in an active call

The image below demonstrates when there are two people on the call. You can invite more participants to join the call. Also, the grid icon (four dots) allows you to change the layout of the active call screen to a grid, full screen, or spotlight when multiple participants are on the call.

Local and remote videos for call participants

Create a New SwiftUI Project

Launch Xcode and create a new SwiftUI application. Name the project whatever you like; this demo uses Face2Face as the project name.

Step 1: Fetch the SwiftUI Video SDK

Once you have a blank SwiftUI project, you can install the video SDK. Select File -> Add Packages (Note: Add Package Dependencies if using Xcode 15+) from Xcode's menu bar. Copy and paste the URL https://github.com/GetStream/stream-video-swift.git into the search bar and click Add Package. Follow the remaining prompts to complete the installation.

Understanding the Package Dependencies

Package dependencies

The video SDK pulls in the packages shown in the image above as dependencies.

Step 2: Set Privacies — Camera and Microphone Usage Descriptions

Since you are building an iOS calling app that accesses the user's protected resources, such as the microphone and camera, you are required to provide camera and microphone usage descriptions. Read Setting Background Modes and Device Capability Privacies in iOS Apps to learn more.

To configure access to the user's camera and microphone:

  1. Select the name of your app in the Xcode Project Navigator and click the Info tab.
  2. Click the project’s name under Targets. The name of this app is Face2Face. However, yours may be different.
  3. Under the Key category, click the + button on the right side of any of the Key items and scroll to the privacy section.
  4. Click Privacy — Camera Usage Description and add a string that explains why your app needs camera access. This demo uses Face2FaceApp would like to access your camera as the string.
  5. Repeat step 4 above to add privacy for microphone access, as shown in the image below.
Set camera and microphone usage in Xcode
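Equivalently, if you prefer to edit the raw Info.plist source, the two usage-description entries look like this (the description strings below are just examples; use wording that fits your app):

```xml
<key>NSCameraUsageDescription</key>
<string>Face2FaceApp would like to access your camera</string>
<key>NSMicrophoneUsageDescription</key>
<string>Face2FaceApp would like to access your microphone</string>
```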

Step 3: Configure the Video SDK

  1. Create a User Object
    To access the video SDK and the SwiftUI components, you need a user to connect to the SDK’s back end. The user can be authenticated, anonymous, or guest. The user’s credentials must be used to initialize the Stream Video client. In this demo, we provide you with the credentials of the user.
let user = User(
    id: userId,
    name: "Martin", // name and imageURL are used in the UI
    imageURL: .init(string: "https://getstream.io/static/2796a305dd07651fcceb4721a94f4505/a3911/martin-mitrevski.webp")
)

2. Initialize the Stream Video Client
You should initialize the video client with the user, API key, and token. You can find the API key from your Stream dashboard account. You can sign up for a new account if you are new to Stream.

// Initialize Stream Video client
self.client = StreamVideo(
    apiKey: apiKey,
    user: user,
    token: .init(stringLiteral: token)
)
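A note on the token: a hardcoded value works for a demo, but tokens expire. In production, you would fetch tokens from your own backend. As a sketch (fetchTokenFromBackend is a hypothetical helper you would implement against your own server), the client can also be initialized with a token provider that the SDK invokes whenever the token needs refreshing:

```swift
// Sketch only: initializing StreamVideo with a token provider so that
// expired tokens are refreshed automatically. `fetchTokenFromBackend`
// is a hypothetical helper backed by your own server.
let client = StreamVideo(
    apiKey: apiKey,
    user: user,
    token: .init(stringLiteral: token),
    tokenProvider: { completion in
        fetchTokenFromBackend(userId: user.id) { newToken in
            completion(.success(.init(stringLiteral: newToken)))
        }
    }
)
```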

3. Putting It All Together in Face2FaceApp.swift
When integrating Stream Video with a production iOS app, always configure the Stream Video object early in your application's lifecycle. For UIKit-based applications, set it up as soon as possible in AppDelegate.swift's application:didFinishLaunchingWithOptions method. SwiftUI apps do not implement AppDelegate.swift by default, so you should instead initialize the Stream Video object within the declaration of your App conformer. This makes Stream Video available immediately when the app launches.

Since SwiftUI does not implement AppDelegate, we will put all the SDK's initialization and configuration in the app's conformer file, Face2FaceApp.swift (in your project, that is YourAppNameApp.swift).

import SwiftUI
import StreamVideo
import StreamVideoSwiftUI

@main
struct VideoCallApp: App {
    @ObservedObject var viewModel: CallViewModel

    private var client: StreamVideo
    private let apiKey: String = "mmhfdzb5evj2" // The API key can be found in the Credentials section
    private let userId: String = "Jacen_Solo" // The User Id can be found in the Credentials section
    private let token: String = "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJ1c2VyX2lkIjoiSmFjZW5fU29sbyIsImlzcyI6InByb250byIsInN1YiI6InVzZXIvSmFjZW5fU29sbyIsImlhdCI6MTY5MTEzMTkzMSwiZXhwIjoxNjkxNzM2NzM2fQ.g9OYWrBFfNPYhuVIEvmyyZFNxl7qApsAD2WixxmZhCg" // The Token can be found in the Credentials section
    let callId: String = "imirKfxKjuXC" // The CallId can be found in the Credentials section

    init() {
        let user = User(
            id: userId,
            name: "Martin", // name and imageURL are used in the UI
            imageURL: .init(string: "https://getstream.io/static/2796a305dd07651fcceb4721a94f4505/a3911/martin-mitrevski.webp")
        )

        // Initialize Stream Video client
        self.client = StreamVideo(
            apiKey: apiKey,
            user: user,
            token: .init(stringLiteral: token)
        )

        self.viewModel = .init()
    }

    var body: some Scene {
        WindowGroup {
            NavigationStack {
                ZStack {
                    VStack {
                        if viewModel.call != nil {
                            CallContainer(viewFactory: DefaultViewFactory.shared, viewModel: viewModel)
                        } else {
                            NewFace2Face(viewModel: viewModel)
                        }
                    }
                }
            }
        }
    }
}

Step 4: Add UI to Initiate a Call

  1. Create and Join a New Call
    After creating a user, you can create and join a call with the sample code below. If you want the call to start immediately after running the app, you can add the implementation below in your app’s conformer file Face2FaceApp.swift described in the previous section. In this demo, we want the call to start with a tap gesture. So, let’s add the “create and join a call” functionality in the next step.
Task {
    guard viewModel.call == nil else { return }
    viewModel.joinCall(callType: .default, callId: callId)
}

2. Creating the Start Call UI
For the sake of simplicity, you will need only the following home screen to establish the calling functionality for this demo.

The FaceTime clone UI

The home screen consists of a button to initiate the call and a blurred background that displays the live iOS device camera feed. The live camera feed shown in the home screen's background is not part of the video SDK; however, you can find the code for it in the Xcode project under the folder LiveCameraView.

Add a new file in the Project navigator to contain the composition of the home screen. In the demo Xcode project, you will find NewFace2Face.swift, but you can name yours as you want. Replace the content of the file you created with the code below.

import SwiftUI
import StreamVideoSwiftUI

struct NewFace2Face: View {
    @ObservedObject var viewModel: CallViewModel
    private let callId: String = "imirKfxKjuXC"

    var body: some View {
        NavigationStack {
            ZStack {
                HostedViewController()
                    .ignoresSafeArea()
                    .blur(radius: 8)
                    .blendMode(.plusLighter)

                VStack {
                    HStack {
                        NavigationLink {

                        } label: {
                            VStack {
                                Image(systemName: "link")
                                Text("Create Link")
                                    .lineLimit(1)
                            }
                            .padding(EdgeInsets(top: 7, leading: 42, bottom: 7, trailing: 42))
                        }
                        .buttonStyle(.plain)
                        .background(.ultraThinMaterial)
                        .cornerRadius(12)

                        NavigationLink {
                            //
                        } label: {
                            VStack {
                                Image(systemName: "video.fill")
                                Text("New Face2Face")
                                    .lineLimit(1)
                            }
                            .padding(.horizontal)
                            .accessibilityAddTraits(.isButton)
                            .onTapGesture {
                                Task {
                                    guard viewModel.call == nil else { return }
                                    viewModel.joinCall(callType: .default, callId: callId)
                                }
                            }
                        }
                        .buttonStyle(.borderedProminent)
                    }
                    .padding(.bottom, 44)

                    List {
                        Section {

                        } header: {
                            Text("Today")
                        }
                        NavigationLink {

                        } label: {
                            HStack {
                                Image(systemName: "h.circle.fill")
                                    .font(.largeTitle)

                                VStack(alignment: .leading) {
                                    Text("Harrison")
                                    HStack {
                                        Image(systemName: "video.fill")
                                        Text("Face2Face Video")
                                    }
                                }

                                Spacer()

                                Text("12:03")
                            }
                        }
                    }
                    .scrollContentBackground(.hidden)
                }
                .padding()
                .navigationTitle("Face2Face")
                .toolbar {
                    ToolbarItem(placement: .navigationBarLeading) {
                        Button {

                        } label: {
                            Text("Edit")
                        }
                    }
                }
            }
        }
    }
}

3. Add a Button to Make an Outgoing Call
In the code above, we attach the join-call action, viewModel.joinCall(callType: .default, callId: callId), to the button with the camera icon to initiate the call.

Outgoing call flow

An outgoing call begins with a call intent from the user, created with the call recipient's information. The video SDK's CallViewModel then creates a start-call event using the joinCall method and publishes it to the system. The system in turn delivers the call event to the call object, which displays information about the participants and the call.

.onTapGesture {
    Task {
        guard viewModel.call == nil else { return }
        viewModel.joinCall(callType: .default, callId: callId)
    }
}

Step 5: Run the App on Two iPhones

To establish or test the call on two physical iOS devices, you should enable Developer Mode on each device.

  1. Launch the Settings app on your iOS device (this demo uses two iPhones).
  2. Tap Privacy & Security.
  3. Scroll to the bottom to find Security. Then select Developer Mode and toggle the switch on.
Turn on developer mode on iPhone.

Now, connect your device and select its name from the available device options in Xcode, then run the app on it. Repeat the same on the other device. Enter the same call ID on both devices to establish the call; for example, use abcd as the call ID on both devices and initiate the call. To avoid typing the call ID each time, you can quickly fill the recipient text field by tapping a recipient under the Called Recently section.
Bravo! You now have a feature-rich and fully functional iOS/SwiftUI audio/video calling app that supports group calls, fullscreen mode, and picture-in-picture (PiP).

Run the Xcode app on two iPhones

Test the App Using an iPhone and Our Web App

As the saying goes, seeing is believing. Let's take the app you built in this tutorial, run it on an iPhone, and let other call participants join from the web. The video SDK provides a seamless testing experience by allowing you to preview your apps on iOS devices and the web. Follow the steps below to add multiple participants to a call with the app you just built.

  1. First, copy the user credentials from the video call tutorial in our documentation. Run the app in Xcode on your iPhone.
  2. To join other call participants from the web, click the Join Call button below the user credentials in Step 1 above.

Using the web app, you can join as many call participants as you want. This is feasible because the call ID is the same when you run the app on an iPhone and the web.

How to Start a New Group Call
During an active call, you can add more participants. To start a group call in this demo, tap the back button < Face2Face in the leading toolbar and pick the same call recipient you initially called. Alternatively, you can add an implementation in your app so that tapping the person.2 icon at the top right of the screen invites several people to join the action.
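If you want that person.2 button to actually invite people, the SDK exposes member management on the call object. A minimal sketch, assuming an addMembers API on Call (the user IDs below are placeholders, and the exact signature may differ between SDK versions):

```swift
import StreamVideo

// Sketch: inviting additional participants to the active call.
// "harrison" and "alice" are placeholder user IDs for your app.
func invite(to call: Call) async {
    do {
        try await call.addMembers(ids: ["harrison", "alice"])
    } catch {
        print("Failed to add members: \(error)")
    }
}
```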

Group calling

Where Do I Go From Here?

Thanks for reading this tutorial on building a FaceTime clone with the Stream Video SDK with maximum flexibility and minimal effort. The SwiftUI Video SDK lets you do more than the use case you explored in this tutorial; check out the following resources to learn more about what you can do with the video SDK.

Originally published at https://getstream.io.


Amos Gyamfi

iOS Developer Advocate @getstream.io | visionOS Content Creator