Using C++ to Implement a Video Chat Feature in Unreal Engine

Joel Thomas
Agora.io
25 min read · Aug 21, 2020


Hello devs! Today I’m going to walk you through the steps needed to implement the Agora Real-Time-Engagement service into Unreal Engine using C++!

For this example, I’ll be using Unreal Engine 4.25 and the current Agora SDK.

Getting Started

Create a Project

Now, let’s build a project from scratch!

  1. Open the Unreal Editor and click New project.
  2. On the New Project panel, choose C++ as the project type, enter the project name, choose the project location, and click Create Project.

Uncomment the PrivateDependencyModuleNames line in the [your_project]/Source/[project_name]/[project_name].Build.cs file. Unreal Engine comments this line out by default, and leaving it commented out causes a compile error.

// Uncomment if you are using Slate UI
PrivateDependencyModuleNames.AddRange(new string[] { "UMG", "Slate", "SlateCore" });

Installation

Follow these steps to integrate the Agora plugin into your project.

  1. Copy the plugin to [your_project]/Plugins.
  2. Add the plugin dependency to the Private Dependencies section of the [your_project]/Source/[project_name]/[project_name].Build.cs file: PrivateDependencyModuleNames.AddRange(new string[] { "AgoraPlugin" });
  3. Restart Unreal Engine (if it is running).
  4. Go to Edit > Plugins. Find the Project > Other category and make sure the plugin is enabled.
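
After these steps, the dependency section of [project_name].Build.cs should look roughly like this (a sketch; the exact module lists Unreal Engine generates for your project may differ):

```csharp
// [project_name].Build.cs -- dependency section sketch; module lists may vary
PublicDependencyModuleNames.AddRange(new string[] { "Core", "CoreUObject", "Engine", "InputCore" });

// Slate UI modules (uncommented earlier) plus the Agora plugin
PrivateDependencyModuleNames.AddRange(new string[] { "UMG", "Slate", "SlateCore", "AgoraPlugin" });
```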

Create a Level

Next, we will create a level to build our game environment. There are several different ways to create a new level. Here, you use the File menu method, which lists level-selection options.

In the Unreal Editor, select File > New Level:

The New Level dialog box opens:

Click Empty Level to select it, and save the level to a folder named Levels.

Create Core Classes

Now it’s time to create your first C++ classes, which will handle communication with the Agora SDK:

  • VideoFrameObserver
  • VideoCall

Create VideoFrameObserver

VideoFrameObserver implements the agora::media::IVideoFrameObserver interface. The methods in the VideoFrameObserver class manage video frame callbacks, and the observer should be registered with agora::media::IMediaEngine

To create your VideoFrameObserver, you will:

  1. Create the VideoFrameObserver class interface
  2. Override the onCaptureVideoFrame and onRenderVideoFrame methods
  3. Add the setOnCaptureVideoFrameCallback and setOnRenderVideoFrameCallback methods

In the Unreal Editor, select File > Add New C++ Class.

Select None as a parent class and click Next:

Name the class VideoFrameObserver and click Create Class.

Create the VideoFrameObserver class interface.

Open the VideoFrameObserver.h file and add the interface:

//VideoFrameObserver.h
#include "CoreMinimal.h"
#include <functional>
#include "AgoraMediaEngine.h"

class AGORAVIDEOCALL_API VideoFrameObserver : public agora::media::IVideoFrameObserver
{
public:
    virtual ~VideoFrameObserver() = default;

    bool onCaptureVideoFrame(VideoFrame& videoFrame) override;
    bool onRenderVideoFrame(unsigned int uid, VideoFrame& videoFrame) override;

    void setOnCaptureVideoFrameCallback(
        std::function<void(std::uint8_t*, std::uint32_t, std::uint32_t, std::uint32_t)> callback);
    void setOnRenderVideoFrameCallback(
        std::function<void(std::uint8_t*, std::uint32_t, std::uint32_t, std::uint32_t)> callback);

    virtual VIDEO_FRAME_TYPE getVideoFormatPreference() override { return FRAME_TYPE_RGBA; }

private:
    std::function<void(std::uint8_t*, std::uint32_t, std::uint32_t, std::uint32_t)> OnCaptureVideoFrame;
    std::function<void(std::uint8_t*, std::uint32_t, std::uint32_t, std::uint32_t)> OnRenderVideoFrame;
};

Note: AGORAVIDEOCALL_API is a project-dependent define. Use the define that Unreal Engine generated for your own project instead.

Override the onCaptureVideoFrame and onRenderVideoFrame Methods
The onCaptureVideoFrame function retrieves the camera-captured image in RGBA format (as requested by getVideoFormatPreference) and triggers the OnCaptureVideoFrame callback. The onRenderVideoFrame function does the same for the received image of the specified remote user, triggering the OnRenderVideoFrame callback.

//VideoFrameObserver.cpp
bool VideoFrameObserver::onCaptureVideoFrame(VideoFrame& Frame)
{
    const auto BufferSize = Frame.yStride * Frame.height;
    if (OnCaptureVideoFrame)
    {
        OnCaptureVideoFrame(static_cast<uint8_t*>(Frame.yBuffer), Frame.width, Frame.height, BufferSize);
    }
    return true;
}

bool VideoFrameObserver::onRenderVideoFrame(unsigned int uid, VideoFrame& Frame)
{
    const auto BufferSize = Frame.yStride * Frame.height;
    if (OnRenderVideoFrame)
    {
        OnRenderVideoFrame(static_cast<uint8_t*>(Frame.yBuffer), Frame.width, Frame.height, BufferSize);
    }
    return true;
}
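
Note that these callbacks pass yStride * height as the buffer size. For RGBA frames, yStride is the length of one row in bytes, which can exceed width * 4 if the SDK pads rows. As a standalone illustration (PackRgbaFrame is a hypothetical helper, not part of the sample project), this is how a possibly padded frame would be copied into a tightly packed buffer:

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Copy an RGBA frame whose rows occupy YStride bytes each into a tightly
// packed buffer of Width * 4 bytes per row. If YStride == Width * 4, the
// frame is already packed and this reduces to a plain copy.
std::vector<std::uint8_t> PackRgbaFrame(const std::uint8_t* Src,
                                        int Width, int Height, int YStride)
{
    std::vector<std::uint8_t> Dst(static_cast<std::size_t>(Width) * Height * 4);
    for (int Row = 0; Row < Height; ++Row)
    {
        std::memcpy(Dst.data() + static_cast<std::size_t>(Row) * Width * 4,
                    Src + static_cast<std::size_t>(Row) * YStride,
                    static_cast<std::size_t>(Width) * 4);
    }
    return Dst;
}
```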

Add the setOnCaptureVideoFrameCallback and setOnRenderVideoFrameCallback methods.

These setters store the callbacks that deliver the camera-captured image and the received image of the remote user.

//VideoFrameObserver.cpp
void VideoFrameObserver::setOnCaptureVideoFrameCallback(
    std::function<void(std::uint8_t*, std::uint32_t, std::uint32_t, std::uint32_t)> Callback)
{
    OnCaptureVideoFrame = Callback;
}

void VideoFrameObserver::setOnRenderVideoFrameCallback(
    std::function<void(std::uint8_t*, std::uint32_t, std::uint32_t, std::uint32_t)> Callback)
{
    OnRenderVideoFrame = Callback;
}

Create the VideoCall C++ Class

The VideoCall class manages communication with the Agora SDK.

Create the Class Interface

Return to the Unreal Editor, create a C++ class as you did in the previous step, and name it VideoCall.

Go to the VideoCall.h file and add:

//VideoCall.h
#pragma once

#include "CoreMinimal.h"
#include <functional>
#include <vector>
#include "AgoraRtcEngine.h"
#include "AgoraMediaEngine.h"

class VideoFrameObserver;

class AGORAVIDEOCALL_API VideoCall
{
public:
    VideoCall();
    ~VideoCall();

    FString GetVersion() const;

    void RegisterOnLocalFrameCallback(
        std::function<void(std::uint8_t*, std::uint32_t, std::uint32_t, std::uint32_t)> OnLocalFrameCallback);
    void RegisterOnRemoteFrameCallback(
        std::function<void(std::uint8_t*, std::uint32_t, std::uint32_t, std::uint32_t)> OnRemoteFrameCallback);

    void StartCall(
        const FString& ChannelName,
        const FString& EncryptionKey,
        const FString& EncryptionType);
    void StopCall();

    bool MuteLocalAudio(bool bMuted = true);
    bool IsLocalAudioMuted();
    bool MuteLocalVideo(bool bMuted = true);
    bool IsLocalVideoMuted();
    bool EnableVideo(bool bEnable = true);

private:
    void InitAgora();

private:
    TSharedPtr<agora::rtc::ue4::AgoraRtcEngine> RtcEnginePtr;
    TSharedPtr<agora::media::ue4::AgoraMediaEngine> MediaEnginePtr;
    TUniquePtr<VideoFrameObserver> VideoFrameObserverPtr;

    //callback: data, w, h, size
    std::function<void(std::uint8_t*, std::uint32_t, std::uint32_t, std::uint32_t)> OnLocalFrameCallback;
    std::function<void(std::uint8_t*, std::uint32_t, std::uint32_t, std::uint32_t)> OnRemoteFrameCallback;

    bool bLocalAudioMuted = false;
    bool bLocalVideoMuted = false;
};

Create Initializing Methods

Go to the VideoCall.cpp file and add the required includes.

//VideoCall.cpp
#include "AgoraVideoDeviceManager.h"
#include "AgoraAudioDeviceManager.h"
#include "MediaShaders.h"
#include "VideoFrameObserver.h"

Next, you add the methods to VideoCall.cpp which will create and initialize the Agora engine:

//VideoCall.cpp

VideoCall::VideoCall()
{
InitAgora();
}

VideoCall::~VideoCall()
{
StopCall();
}

void VideoCall::InitAgora()
{
    RtcEnginePtr = TSharedPtr<agora::rtc::ue4::AgoraRtcEngine>(
        agora::rtc::ue4::AgoraRtcEngine::createAgoraRtcEngine());

    static agora::rtc::RtcEngineContext ctx;
    ctx.appId = "aab8b8f5a8cd4469a63042fcfafe7063"; // replace with your own Agora App ID
    ctx.eventHandler = new agora::rtc::IRtcEngineEventHandler();

    int ret = RtcEnginePtr->initialize(ctx);
    if (ret < 0)
    {
        UE_LOG(LogTemp, Warning, TEXT("RtcEngine initialize ret: %d"), ret);
    }
    MediaEnginePtr = TSharedPtr<agora::media::ue4::AgoraMediaEngine>(
        agora::media::ue4::AgoraMediaEngine::Create(RtcEnginePtr.Get()));
}

FString VideoCall::GetVersion() const
{
    if (!RtcEnginePtr)
    {
        return "";
    }
    int build = 0;
    const char* version = RtcEnginePtr->getVersion(&build);
    return FString(ANSI_TO_TCHAR(version));
}

Create Callbacks

Set the callback function to return local and remote frames:

//VideoCall.cpp

void VideoCall::RegisterOnLocalFrameCallback(
std::function<void(std::uint8_t*, std::uint32_t, std::uint32_t, std::uint32_t)> OnFrameCallback)
{
OnLocalFrameCallback = std::move(OnFrameCallback);
}

void VideoCall::RegisterOnRemoteFrameCallback(
std::function<void(std::uint8_t*, std::uint32_t, std::uint32_t, std::uint32_t)> OnFrameCallback)
{
OnRemoteFrameCallback = std::move(OnFrameCallback);
}

Create Call Methods

The methods in this section manage joining and leaving a channel.

Add the StartCall Function
Create the VideoFrameObserver object and register the following callbacks according to your scenarios:

  • OnLocalFrameCallback: Occurs each time the SDK receives a video frame captured by the local camera.

  • OnRemoteFrameCallback: Occurs each time the SDK receives a video frame sent by the remote user.

In the StartCall function, register the VideoFrameObserver object in the MediaEngine object with the registerVideoFrameObserver method. If EncryptionType and EncryptionKey are not empty, set the encryption mode and secret on the RtcEngine. Then set the channel profile according to your needs and call joinChannel.

//VideoCall.cpp

void VideoCall::StartCall(
    const FString& ChannelName,
    const FString& EncryptionKey,
    const FString& EncryptionType)
{
    if (!RtcEnginePtr)
    {
        return;
    }
    if (MediaEnginePtr)
    {
        if (!VideoFrameObserverPtr)
        {
            VideoFrameObserverPtr = MakeUnique<VideoFrameObserver>();

            std::function<void(std::uint8_t*, std::uint32_t, std::uint32_t, std::uint32_t)> OnCaptureVideoFrameCallback
                = [this](std::uint8_t* buffer, std::uint32_t width, std::uint32_t height, std::uint32_t size)
            {
                if (OnLocalFrameCallback)
                {
                    OnLocalFrameCallback(buffer, width, height, size);
                }
                else
                {
                    UE_LOG(LogTemp, Warning, TEXT("VideoCall OnLocalFrameCallback isn't set"));
                }
            };
            VideoFrameObserverPtr->setOnCaptureVideoFrameCallback(std::move(OnCaptureVideoFrameCallback));

            std::function<void(std::uint8_t*, std::uint32_t, std::uint32_t, std::uint32_t)> OnRenderVideoFrameCallback
                = [this](std::uint8_t* buffer, std::uint32_t width, std::uint32_t height, std::uint32_t size)
            {
                if (OnRemoteFrameCallback)
                {
                    OnRemoteFrameCallback(buffer, width, height, size);
                }
                else
                {
                    UE_LOG(LogTemp, Warning, TEXT("VideoCall OnRemoteFrameCallback isn't set"));
                }
            };
            VideoFrameObserverPtr->setOnRenderVideoFrameCallback(std::move(OnRenderVideoFrameCallback));
        }

        MediaEnginePtr->registerVideoFrameObserver(VideoFrameObserverPtr.Get());
    }

    int nRet = RtcEnginePtr->enableVideo();
    if (nRet < 0)
    {
        UE_LOG(LogTemp, Warning, TEXT("enableVideo : %d"), nRet)
    }

    if (!EncryptionType.IsEmpty() && !EncryptionKey.IsEmpty())
    {
        if (EncryptionType == "aes-256")
        {
            RtcEnginePtr->setEncryptionMode("aes-256-xts");
        }
        else
        {
            RtcEnginePtr->setEncryptionMode("aes-128-xts");
        }

        nRet = RtcEnginePtr->setEncryptionSecret(TCHAR_TO_ANSI(*EncryptionKey));
        if (nRet < 0)
        {
            UE_LOG(LogTemp, Warning, TEXT("setEncryptionSecret : %d"), nRet)
        }
    }

    nRet = RtcEnginePtr->setChannelProfile(agora::rtc::CHANNEL_PROFILE_COMMUNICATION);
    if (nRet < 0)
    {
        UE_LOG(LogTemp, Warning, TEXT("setChannelProfile : %d"), nRet)
    }

    std::uint32_t nUID = 0; // 0 lets the SDK assign a user ID
    nRet = RtcEnginePtr->joinChannel(NULL, TCHAR_TO_ANSI(*ChannelName), NULL, nUID); // e.g. ChannelName = "demoChannel1"
    if (nRet < 0)
    {
        UE_LOG(LogTemp, Warning, TEXT("joinChannel ret: %d"), nRet);
    }
}

Note: This project is meant for reference purposes and development environments; it is not intended for production environments. Token authentication is recommended for all RTE apps running in production. For more information about token-based authentication within the Agora platform, please refer to this guide: https://bit.ly/3sNiFRs
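
For reference, a token-based join replaces the NULL first argument of the joinChannel call shown earlier. This is only a sketch, not part of the sample project, and it assumes you fetch the token from your own token server:

```cpp
// Sketch: pass a dynamically generated token instead of NULL.
// "Token" is a hypothetical variable holding a value fetched from your
// token server; do not hard-code tokens in production builds.
FString Token = TEXT("<token fetched from your server>");
nRet = RtcEnginePtr->joinChannel(TCHAR_TO_ANSI(*Token), TCHAR_TO_ANSI(*ChannelName), NULL, nUID);
```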

Add the StopCall Function
Call the leaveChannel method to leave the current call according to your scenario: for example, when the call ends, when you need to close the app, or when your app runs in the background. Call registerVideoFrameObserver with a nullptr argument to cancel the registration of the VideoFrameObserver.

//VideoCall.cpp

void VideoCall::StopCall()
{
if (!RtcEnginePtr)
{
return;
}
auto ConnectionState = RtcEnginePtr->getConnectionState();
if (agora::rtc::CONNECTION_STATE_DISCONNECTED != ConnectionState)
{
int nRet = RtcEnginePtr->leaveChannel();
if (nRet < 0)
{
UE_LOG(LogTemp, Warning, TEXT("leaveChannel ret: %d"), nRet);
}
if (MediaEnginePtr)
{
MediaEnginePtr->registerVideoFrameObserver(nullptr);
}
}
}

Video Methods

Add the EnableVideo() Method
The EnableVideo() method enables the video for the sample application. Initialize nRet with a value of 0. If bEnable is true, enable the video using RtcEnginePtr->enableVideo(). Otherwise, disable the video using RtcEnginePtr->disableVideo().

//VideoCall.cpp

bool VideoCall::EnableVideo(bool bEnable)
{
if (!RtcEnginePtr)
{
return false;
}
int nRet = 0;
if (bEnable)
nRet = RtcEnginePtr->enableVideo();
else
nRet = RtcEnginePtr->disableVideo();
return nRet == 0;
}

Add the MuteLocalVideo() Method
The MuteLocalVideo() method turns local video on or off. Ensure that RtcEnginePtr is not nullptr before completing the remaining method actions. If muting or unmuting the local video succeeds, set bLocalVideoMuted to bMuted.

//VideoCall.cpp

bool VideoCall::MuteLocalVideo(bool bMuted)
{
if (!RtcEnginePtr)
{
return false;
}
int ret = RtcEnginePtr->muteLocalVideoStream(bMuted);
if (ret == 0)
bLocalVideoMuted = bMuted;

return ret == 0;
}

Add the IsLocalVideoMuted() Method
The IsLocalVideoMuted() method indicates whether the local video is on or off for the sample application, returning bLocalVideoMuted.

//VideoCall.cpp

bool VideoCall::IsLocalVideoMuted()
{
return bLocalVideoMuted;
}

Create Audio Methods

Add the MuteLocalAudio() Method
The MuteLocalAudio() method mutes or unmutes the local audio. Ensure that RtcEnginePtr is not nullptr before completing the remaining method actions. If the mute or unmute local audio is successful, set bLocalAudioMuted to bMuted.

//VideoCall.cpp

bool VideoCall::MuteLocalAudio(bool bMuted)
{
if (!RtcEnginePtr)
{
return false;
}
int ret = RtcEnginePtr->muteLocalAudioStream(bMuted);
if (ret == 0)
bLocalAudioMuted = bMuted;

return ret == 0;
}

Add the IsLocalAudioMuted() Method
The IsLocalAudioMuted() method indicates whether local audio is muted or unmuted for the sample application, returning bLocalAudioMuted.

//VideoCall.cpp

bool VideoCall::IsLocalAudioMuted()
{
return bLocalAudioMuted;
}

Create the GUI

Now you will create the graphical user interface (GUI) for the one-to-one call in your project with these classes:

  • VideoCallPlayerController
  • EnterChannelWidget
  • VideoViewWidget
  • VideoCallViewWidget
  • VideoCallWidget
  • BP_EnterChannelWidget blueprint asset
  • BP_VideoViewWidget asset
  • BP_VideoCallViewWidget asset
  • BP_VideoCallWidget asset
  • BP_VideoCallPlayerController blueprint asset
  • BP_AgoraVideoCallGameModeBase asset

Create the VideoCallPlayerController

To add our widget blueprints to the Viewport, you create a custom player controller class.

In the Content browser, click the Add New button and select New C++ Class. In the Add C++ Class window, click the Show All Classes button, and type PlayerController. Click the Next button and name the class VideoCallPlayerController. Click the Create Class button.

//VideoCallPlayerController.h
#include "CoreMinimal.h"
#include "GameFramework/PlayerController.h"
#include "VideoCallPlayerController.generated.h"

UCLASS()
class AGORAVIDEOCALL_API AVideoCallPlayerController : public APlayerController
{
    GENERATED_BODY()

public:
};

This class is a base class for the BP_VideoCallPlayerController blueprint asset, which will be created at the end.

Add Required Includes
At the top of the VideoCallPlayerController.h file, include the required header files:

//VideoCallPlayerController.h
#include "CoreMinimal.h"
#include "GameFramework/PlayerController.h"
#include "Templates/UniquePtr.h"
#include "VideoCall.h"
#include "VideoCallPlayerController.generated.h"
...
//VideoCallPlayerController.cpp
#include "Blueprint/UserWidget.h"
#include "EnterChannelWidget.h"
#include "VideoCallWidget.h"

Class Declaration
Add forward declarations of the following classes:

//VideoCallPlayerController.h
class UEnterChannelWidget;
class UVideoCallWidget;

You will create these two widget classes, UEnterChannelWidget and UVideoCallWidget, in later steps.

Add Member Variables
Now add the member references to the UMG asset in the Unreal Editor:

//VideoCallPlayerController.h
...
UCLASS()
class AGORAVIDEOCALL_API AVideoCallPlayerController : public APlayerController
{
    GENERATED_BODY()

public:
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Widgets")
    TSubclassOf<class UUserWidget> wEnterChannelWidget;

    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Widgets")
    TSubclassOf<class UUserWidget> wVideoCallWidget;

    ...
};

Add variables to hold the created widgets, along with a pointer to the VideoCall instance:

//VideoCallPlayerController.h
...
UCLASS()
class AGORAVIDEOCALL_API AVideoCallPlayerController : public APlayerController
{
    GENERATED_BODY()

public:
    ...
    UEnterChannelWidget* EnterChannelWidget = nullptr;
    UVideoCallWidget* VideoCallWidget = nullptr;
    TUniquePtr<VideoCall> VideoCallPtr;

    ...
};

Override BeginPlay and EndPlay

//VideoCallPlayerController.h
...
UCLASS()
class AGORAVIDEOCALL_API AVideoCallPlayerController : public APlayerController
{
    GENERATED_BODY()

public:
    ...

    void BeginPlay() override;
    void EndPlay(const EEndPlayReason::Type EndPlayReason) override;

    ...
};
//VideoCallPlayerController.cpp
void AVideoCallPlayerController::BeginPlay()
{
    Super::BeginPlay();

    //initialize widgets
    if (wEnterChannelWidget) // Check if the Asset is assigned in the blueprint.
    {
        // Create the widget and store it.
        if (!EnterChannelWidget)
        {
            EnterChannelWidget = CreateWidget<UEnterChannelWidget>(this, wEnterChannelWidget);
            EnterChannelWidget->SetVideoCallPlayerController(this);
        }
        // Now you can use the widget directly, since you have a reference to it.
        // Extra check to make sure the pointer holds the widget.
        if (EnterChannelWidget)
        {
            // Add it to the viewport.
            EnterChannelWidget->AddToViewport();
        }
        // Show the cursor.
        bShowMouseCursor = true;
    }
    if (wVideoCallWidget)
    {
        if (!VideoCallWidget)
        {
            VideoCallWidget = CreateWidget<UVideoCallWidget>(this, wVideoCallWidget);
            VideoCallWidget->SetVideoCallPlayerController(this);
        }
        if (VideoCallWidget)
        {
            VideoCallWidget->AddToViewport();
        }
        VideoCallWidget->SetVisibility(ESlateVisibility::Collapsed);
    }

    // Create the video call and switch on the EnterChannelWidget.
    VideoCallPtr = MakeUnique<VideoCall>();
    FString Version = VideoCallPtr->GetVersion();
    Version = "Agora version: " + Version;
    EnterChannelWidget->UpdateVersionText(Version);
    SwitchOnEnterChannelWidget(std::move(VideoCallPtr));
}

void AVideoCallPlayerController::EndPlay(const EEndPlayReason::Type EndPlayReason)
{
    Super::EndPlay(EndPlayReason);
}

You may notice that the calls to EnterChannelWidget and VideoCallWidget methods are marked as errors. This is because those classes are not implemented yet; you implement them in the next steps.

Add StartCall and EndCall Methods

//VideoCallPlayerController.h
...
UCLASS()
class AGORAVIDEOCALL_API AVideoCallPlayerController : public APlayerController
{
    GENERATED_BODY()

public:
    ...

    void StartCall(
        TUniquePtr<VideoCall> PassedVideoCallPtr,
        const FString& ChannelName,
        const FString& EncryptionKey,
        const FString& EncryptionType);
    void EndCall(TUniquePtr<VideoCall> PassedVideoCallPtr);

    ...
};

//VideoCallPlayerController.cpp
void AVideoCallPlayerController::StartCall(
    TUniquePtr<VideoCall> PassedVideoCallPtr,
    const FString& ChannelName,
    const FString& EncryptionKey,
    const FString& EncryptionType)
{
    SwitchOnVideoCallWidget(std::move(PassedVideoCallPtr));
    VideoCallWidget->OnStartCall(
        ChannelName,
        EncryptionKey,
        EncryptionType);
}

void AVideoCallPlayerController::EndCall(TUniquePtr<VideoCall> PassedVideoCallPtr)
{
    SwitchOnEnterChannelWidget(std::move(PassedVideoCallPtr));
}

Add Switch On Another Widget Methods
The active widget is defined by managing each widget's visibility and passing the VideoCall pointer to it.

//VideoCallPlayerController.h
...
UCLASS()
class AGORAVIDEOCALL_API AVideoCallPlayerController : public APlayerController
{
    GENERATED_BODY()

public:
    ...

    void SwitchOnEnterChannelWidget(TUniquePtr<VideoCall> PassedVideoCallPtr);
    void SwitchOnVideoCallWidget(TUniquePtr<VideoCall> PassedVideoCallPtr);

    ...
};

//VideoCallPlayerController.cpp
void AVideoCallPlayerController::SwitchOnEnterChannelWidget(TUniquePtr<VideoCall> PassedVideoCallPtr)
{
    if (!EnterChannelWidget)
    {
        return;
    }
    EnterChannelWidget->SetVideoCall(std::move(PassedVideoCallPtr));
    EnterChannelWidget->SetVisibility(ESlateVisibility::Visible);
}

void AVideoCallPlayerController::SwitchOnVideoCallWidget(TUniquePtr<VideoCall> PassedVideoCallPtr)
{
    if (!VideoCallWidget)
    {
        return;
    }
    VideoCallWidget->SetVideoCall(std::move(PassedVideoCallPtr));
    VideoCallWidget->SetVisibility(ESlateVisibility::Visible);
}

Create EnterChannelWidget C++ Class

The EnterChannelWidget class manages UI element interactions (from the corresponding blueprint asset) with the application.

Create a class of UserWidget type. In the Content browser, click the Add New button and select New C++ Class. Then click the Show All Classes button and type "UserWidget". Click the Next button and name the class EnterChannelWidget.

When you create the channel widget, you get something like this:

//EnterChannelWidget.h
#include "CoreMinimal.h"
#include "Blueprint/UserWidget.h"
#include "EnterChannelWidget.generated.h"

UCLASS()
class AGORAVIDEOCALL_API UEnterChannelWidget : public UUserWidget
{
    GENERATED_BODY()
};

Add Required Includes
At the top of the EnterChannelWidget.h file, include the required header files and forward declarations:

//EnterChannelWidget.h
#include "CoreMinimal.h"
#include "Blueprint/UserWidget.h"
#include "Components/TextBlock.h"
#include "Components/RichTextBlock.h"
#include "Components/EditableTextBox.h"
#include "Components/ComboBoxString.h"
#include "Components/Button.h"
#include "Components/Image.h"
#include "VideoCall.h"
#include "EnterChannelWidget.generated.h"

class AVideoCallPlayerController;

//EnterChannelWidget.cpp
#include "Blueprint/WidgetTree.h"
#include "VideoCallPlayerController.h"

Add Member Variables
Now add the following member variables:

//EnterChannelWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UEnterChannelWidget : public UUserWidget
{
    GENERATED_BODY()

public:
    UPROPERTY(VisibleAnywhere, BlueprintReadOnly, meta = (BindWidget))
    UTextBlock* HeaderTextBlock = nullptr;

    UPROPERTY(VisibleAnywhere, BlueprintReadOnly, meta = (BindWidget))
    UTextBlock* DescriptionTextBlock = nullptr;

    UPROPERTY(VisibleAnywhere, BlueprintReadOnly, meta = (BindWidget))
    UEditableTextBox* ChannelNameTextBox = nullptr;

    UPROPERTY(VisibleAnywhere, BlueprintReadOnly, meta = (BindWidget))
    UEditableTextBox* EncriptionKeyTextBox = nullptr;

    UPROPERTY(VisibleAnywhere, BlueprintReadOnly, meta = (BindWidget))
    UTextBlock* EncriptionTypeTextBlock = nullptr;

    UPROPERTY(VisibleAnywhere, BlueprintReadOnly, meta = (BindWidget))
    UComboBoxString* EncriptionTypeComboBox = nullptr;

    UPROPERTY(VisibleAnywhere, BlueprintReadOnly, meta = (BindWidget))
    UButton* JoinButton = nullptr;

    UPROPERTY(VisibleAnywhere, BlueprintReadOnly, meta = (BindWidget))
    UButton* TestButton = nullptr;

    UPROPERTY(VisibleAnywhere, BlueprintReadOnly, meta = (BindWidget))
    UButton* VideoSettingsButton = nullptr;

    UPROPERTY(VisibleAnywhere, BlueprintReadOnly, meta = (BindWidget))
    UTextBlock* ContactsTextBlock = nullptr;

    UPROPERTY(VisibleAnywhere, BlueprintReadOnly, meta = (BindWidget))
    UTextBlock* BuildInfoTextBlock = nullptr;

    ...
};

These variables are needed to control the corresponding UI elements in the blueprint asset. The most important part here is the BindWidget meta property: by marking a pointer to a widget as BindWidget, you can create an identically named widget in a Blueprint subclass of your C++ class and access it from C++ at run time.

Add the following members:

//EnterChannelWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UEnterChannelWidget : public UUserWidget
{
    GENERATED_BODY()
    ...

public:
    AVideoCallPlayerController* PlayerController = nullptr;
    TUniquePtr<VideoCall> VideoCallPtr;

    ...
};

Add Constructor and Construct/Destruct Methods

//EnterChannelWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UEnterChannelWidget : public UUserWidget
{
    GENERATED_BODY()

public:
    ...

    UEnterChannelWidget(const FObjectInitializer& objectInitializer);
    void NativeConstruct() override;

    ...
};

//EnterChannelWidget.cpp
UEnterChannelWidget::UEnterChannelWidget(const FObjectInitializer& objectInitializer)
    : Super(objectInitializer)
{
}

void UEnterChannelWidget::NativeConstruct()
{
    Super::NativeConstruct();

    if (HeaderTextBlock)
        HeaderTextBlock->SetText(FText::FromString("Enter a conference room name"));
    if (DescriptionTextBlock)
        DescriptionTextBlock->SetText(FText::FromString("If you are the first person to specify this name, \
the room will be created and you will\nbe placed in it. \
If it has already been created you will join the conference in progress"));
    if (ChannelNameTextBox)
        ChannelNameTextBox->SetHintText(FText::FromString("Channel Name"));
    if (EncriptionKeyTextBox)
        EncriptionKeyTextBox->SetHintText(FText::FromString("Encryption Key"));
    if (EncriptionTypeTextBlock)
        EncriptionTypeTextBlock->SetText(FText::FromString("Enc Type:"));
    if (EncriptionTypeComboBox)
    {
        EncriptionTypeComboBox->AddOption("aes-128");
        EncriptionTypeComboBox->AddOption("aes-256");
        EncriptionTypeComboBox->SetSelectedIndex(0);
    }
    if (JoinButton)
    {
        UTextBlock* JoinTextBlock = WidgetTree->ConstructWidget<UTextBlock>(UTextBlock::StaticClass());
        JoinTextBlock->SetText(FText::FromString("Join"));
        JoinButton->AddChild(JoinTextBlock);
        JoinButton->OnClicked.AddDynamic(this, &UEnterChannelWidget::OnJoin);
    }
    if (ContactsTextBlock)
        ContactsTextBlock->SetText(FText::FromString("agora.io Contact support: 400 632 6626"));
    if (BuildInfoTextBlock)
        BuildInfoTextBlock->SetText(FText::FromString(" "));
}

Add Setter Methods
Initialize the PlayerController and VideoCallPtr variables:

//EnterChannelWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UEnterChannelWidget : public UUserWidget
{
    GENERATED_BODY()

public:
    ...

    void SetVideoCallPlayerController(AVideoCallPlayerController* VideoCallPlayerController);
    void SetVideoCall(TUniquePtr<VideoCall> PassedVideoCallPtr);

    ...
};

//EnterChannelWidget.cpp
void UEnterChannelWidget::SetVideoCallPlayerController(AVideoCallPlayerController* VideoCallPlayerController)
{
    PlayerController = VideoCallPlayerController;
}

void UEnterChannelWidget::SetVideoCall(TUniquePtr<VideoCall> PassedVideoCallPtr)
{
    VideoCallPtr = std::move(PassedVideoCallPtr);
}

Add BlueprintCallable Methods
To react to the Join button's OnClicked event:

//EnterChannelWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UEnterChannelWidget : public UUserWidget
{
    GENERATED_BODY()

public:
    ...

    UFUNCTION(BlueprintCallable)
    void OnJoin();

    ...
};

//EnterChannelWidget.cpp
void UEnterChannelWidget::OnJoin()
{
    if (!PlayerController || !VideoCallPtr)
    {
        return;
    }
    FString ChannelName = ChannelNameTextBox->GetText().ToString();
    FString EncryptionKey = EncriptionKeyTextBox->GetText().ToString();
    FString EncryptionType = EncriptionTypeComboBox->GetSelectedOption();

    SetVisibility(ESlateVisibility::Collapsed);
    PlayerController->StartCall(
        std::move(VideoCallPtr),
        ChannelName,
        EncryptionKey,
        EncryptionType);
}

Add Update Methods

//EnterChannelWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UEnterChannelWidget : public UUserWidget
{
    GENERATED_BODY()

public:
    ...

    void UpdateVersionText(FString newValue);

    ...
};

//EnterChannelWidget.cpp
void UEnterChannelWidget::UpdateVersionText(FString newValue)
{
    if (BuildInfoTextBlock)
        BuildInfoTextBlock->SetText(FText::FromString(newValue));
}

Create the VideoViewWidget C++ Class

VideoViewWidget is a class that stores the dynamic texture and updates it using the RGBA buffer received from the VideoCall OnLocalFrameCallback/OnRemoteFrameCallback functions.

Create the Class and Add Required Includes
Create the widget C++ class as you did before and add the required includes:

//VideoViewWidget.h
#include "CoreMinimal.h"
#include "Blueprint/UserWidget.h"
#include "Components/Image.h"
#include "VideoViewWidget.generated.h"

//VideoViewWidget.cpp
#include "EngineUtils.h"
#include "Engine/Texture2D.h"
#include <algorithm>

Add Member Variables
  • Buffer — Stores the RGBA pixel data; Width, Height, and BufferSize describe the video frame.
  • RenderTargetImage — The image widget that allows you to display a Slate Brush, texture, or material in the UI.
  • RenderTargetTexture — The dynamic texture, which you will update using the Buffer variable.
  • UpdateTextureRegion — An FUpdateTextureRegion2D, which specifies an update region for a texture.
  • Brush — A brush that contains information about how to draw a Slate element. You will use it to draw RenderTargetTexture on RenderTargetImage.

//VideoViewWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UVideoViewWidget : public UUserWidget
{
    GENERATED_BODY()

public:
    UPROPERTY(BlueprintReadOnly, meta = (BindWidget))
    UImage* RenderTargetImage = nullptr;

    UPROPERTY(EditDefaultsOnly)
    UTexture2D* RenderTargetTexture = nullptr;

    UTexture2D* CameraoffTexture = nullptr;

    uint8* Buffer = nullptr;
    uint32_t Width = 0;
    uint32_t Height = 0;
    uint32 BufferSize = 0;
    FUpdateTextureRegion2D* UpdateTextureRegion = nullptr;
    FSlateBrush Brush;
    FCriticalSection Mutex;

    ...
};

Override the NativeConstruct() Method
In NativeConstruct, you initialize the image with a default color. To initialize RenderTargetTexture, create the dynamic texture (UTexture2D) with a CreateTransient call, then allocate Buffer with BufferSize calculated as Width * Height * 4 (the RGBA format stores each pixel in 4 bytes).

To update the texture, call UpdateTextureRegions. One of the input parameters to this function is the pixel data buffer; whenever you modify the pixel data, you need to call this function again to make the change visible in the texture.
Finally, initialize the Brush variable with RenderTargetTexture, and set the Brush on the RenderTargetImage widget.

//VideoViewWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UVideoViewWidget : public UUserWidget
{
    GENERATED_BODY()

public:
    ...
    void NativeConstruct() override;
    ...
};

//VideoViewWidget.cpp
void UVideoViewWidget::NativeConstruct()
{
    Super::NativeConstruct();

    Width = 640;
    Height = 360;
    RenderTargetTexture = UTexture2D::CreateTransient(Width, Height, PF_R8G8B8A8);
    RenderTargetTexture->UpdateResource();

    BufferSize = Width * Height * 4;
    Buffer = new uint8[BufferSize];
    // Fill the buffer with an opaque gray default color.
    for (uint32 i = 0; i < Width * Height; ++i)
    {
        Buffer[i * 4 + 0] = 0x32;
        Buffer[i * 4 + 1] = 0x32;
        Buffer[i * 4 + 2] = 0x32;
        Buffer[i * 4 + 3] = 0xFF;
    }
    UpdateTextureRegion = new FUpdateTextureRegion2D(0, 0, 0, 0, Width, Height);
    RenderTargetTexture->UpdateTextureRegions(0, 1, UpdateTextureRegion, Width * 4, (uint32)4, Buffer);

    Brush.SetResourceObject(RenderTargetTexture);
    RenderTargetImage->SetBrush(Brush);
}

Override the NativeDestruct() Method

//VideoViewWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UVideoViewWidget : public UUserWidget
{
    GENERATED_BODY()

public:
    ...
    void NativeDestruct() override;
    ...
};

//VideoViewWidget.cpp
void UVideoViewWidget::NativeDestruct()
{
    Super::NativeDestruct();

    delete[] Buffer;
    delete UpdateTextureRegion;
}

Override the NativeTick() Method
If the UpdateTextureRegion Width or Height no longer matches the member Width and Height values, you need to re-create RenderTargetTexture at the new size, repeating the initialization from NativeConstruct. Otherwise, just call UpdateTextureRegions with Buffer.

//VideoViewWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UVideoViewWidget : public UUserWidget
{
    GENERATED_BODY()

public:
    ...

    void NativeTick(const FGeometry& MyGeometry, float DeltaTime) override;

    ...
};

//VideoViewWidget.cpp
void UVideoViewWidget::NativeTick(const FGeometry& MyGeometry, float DeltaTime)
{
    Super::NativeTick(MyGeometry, DeltaTime);

    FScopeLock lock(&Mutex);
    if (UpdateTextureRegion->Width != Width ||
        UpdateTextureRegion->Height != Height)
    {
        auto NewUpdateTextureRegion = new FUpdateTextureRegion2D(0, 0, 0, 0, Width, Height);
        auto NewRenderTargetTexture = UTexture2D::CreateTransient(Width, Height, PF_R8G8B8A8);
        NewRenderTargetTexture->UpdateResource();
        NewRenderTargetTexture->UpdateTextureRegions(0, 1, NewUpdateTextureRegion, Width * 4, (uint32)4, Buffer);

        Brush.SetResourceObject(NewRenderTargetTexture);
        RenderTargetImage->SetBrush(Brush);

        // UObjects such as UTexture2D are garbage collected automatically when
        // no hard references to them remain, so the old texture does not need
        // to be deleted manually.
        FUpdateTextureRegion2D* TmpUpdateTextureRegion = UpdateTextureRegion;
        RenderTargetTexture = NewRenderTargetTexture;
        UpdateTextureRegion = NewUpdateTextureRegion;
        delete TmpUpdateTextureRegion;
        return;
    }
    RenderTargetTexture->UpdateTextureRegions(0, 1, UpdateTextureRegion, Width * 4, (uint32)4, Buffer);
}

Add the UpdateBuffer() Method
The new frame data arrives on an Agora SDK thread, but due to a UE4 threading limitation the texture must be updated on the game thread. So here you only copy the data into the Buffer variable and update the texture later in NativeTick; you don’t call UpdateTextureRegions here.

//VideoViewWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UVideoViewWidget : public UUserWidget
{
    GENERATED_BODY()
public:
    ...
    void UpdateBuffer(uint8* RGBBuffer, uint32_t Width, uint32_t Height, uint32_t Size);
    void ResetBuffer();
    ...
};

//VideoViewWidget.cpp
void UVideoViewWidget::UpdateBuffer(
    uint8* RGBBuffer,
    uint32_t NewWidth,
    uint32_t NewHeight,
    uint32_t NewSize)
{
    FScopeLock lock(&Mutex);

    if (!RGBBuffer)
    {
        return;
    }

    if (BufferSize == NewSize)
    {
        std::copy(RGBBuffer, RGBBuffer + NewSize, Buffer);
    }
    else
    {
        delete[] Buffer;
        BufferSize = NewSize;
        Width = NewWidth;
        Height = NewHeight;
        Buffer = new uint8[BufferSize];
        std::copy(RGBBuffer, RGBBuffer + NewSize, Buffer);
    }
}

void UVideoViewWidget::ResetBuffer()
{
    for (uint32 i = 0; i < Width * Height; ++i)
    {
        Buffer[i * 4 + 0] = 0x32;
        Buffer[i * 4 + 1] = 0x32;
        Buffer[i * 4 + 2] = 0x32;
        Buffer[i * 4 + 3] = 0xFF;
    }
}

Create the VideoCallViewWidget C++ Class

The VideoCallViewWidget class displays the local and remote user video. You need two VideoViewWidget widgets: one to display video from the local camera and another to display video received from the remote user (this example assumes only one remote user).

Create Class and Add Required Includes
Create the widget C++ class as you did before and add the required includes:

//VideoCallViewWidget.h
#include "CoreMinimal.h"
#include "Blueprint/UserWidget.h"
#include "Components/SizeBox.h"
#include "VideoViewWidget.h"
#include "VideoCallViewWidget.generated.h"

//VideoCallViewWidget.cpp
#include "Components/CanvasPanelSlot.h"

Add Member Variables

//VideoCallViewWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UVideoCallViewWidget : public UUserWidget
{
    GENERATED_BODY()
public:
    UPROPERTY(BlueprintReadOnly, meta = (BindWidget))
    UVideoViewWidget* MainVideoViewWidget = nullptr;

    UPROPERTY(BlueprintReadOnly, meta = (BindWidget))
    USizeBox* MainVideoSizeBox = nullptr;

    UPROPERTY(BlueprintReadOnly, meta = (BindWidget))
    UVideoViewWidget* AdditionalVideoViewWidget = nullptr;

    UPROPERTY(BlueprintReadOnly, meta = (BindWidget))
    USizeBox* AdditionalVideoSizeBox = nullptr;

public:
    int32 MainVideoWidth = 0;
    int32 MainVideoHeight = 0;
    ...
};

Override the NativeTick() Method
In NativeTick you update the widgets’ geometry:

//VideoCallViewWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UVideoCallViewWidget : public UUserWidget
{
    GENERATED_BODY()
public:
    ...
    void NativeTick(const FGeometry& MyGeometry, float DeltaTime) override;
    ...
};

//VideoCallViewWidget.cpp
void UVideoCallViewWidget::NativeTick(const FGeometry& MyGeometry, float DeltaTime)
{
    Super::NativeTick(MyGeometry, DeltaTime);

    auto ScreenSize = MyGeometry.GetLocalSize();

    if (MainVideoHeight != 0)
    {
        float AspectRatio = MainVideoWidth / (float)MainVideoHeight;

        auto MainVideoGeometry = MainVideoViewWidget->GetCachedGeometry();
        auto MainVideoScreenSize = MainVideoGeometry.GetLocalSize();
        if (MainVideoScreenSize.X == 0)
        {
            return;
        }

        auto NewMainVideoHeight = MainVideoScreenSize.Y;
        auto NewMainVideoWidth = AspectRatio * NewMainVideoHeight;

        MainVideoSizeBox->SetMinDesiredWidth(NewMainVideoWidth);
        MainVideoSizeBox->SetMinDesiredHeight(NewMainVideoHeight);

        UCanvasPanelSlot* CanvasSlot = Cast<UCanvasPanelSlot>(MainVideoSizeBox->Slot);
        CanvasSlot->SetAutoSize(true);

        FVector2D NewPosition;
        NewPosition.X = -NewMainVideoWidth / 2;
        NewPosition.Y = -NewMainVideoHeight / 2;
        CanvasSlot->SetPosition(NewPosition);
    }
}

Add the UpdateMainVideoBuffer and UpdateAdditionalVideoBuffer Methods

//VideoCallViewWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UVideoCallViewWidget : public UUserWidget
{
    GENERATED_BODY()
public:
    ...
    void UpdateMainVideoBuffer(uint8* RGBBuffer, uint32_t Width, uint32_t Height, uint32_t Size);
    void UpdateAdditionalVideoBuffer(uint8* RGBBuffer, uint32_t Width, uint32_t Height, uint32_t Size);
    void ResetBuffers();
    ...
};

//VideoCallViewWidget.cpp
void UVideoCallViewWidget::UpdateMainVideoBuffer(
    uint8* RGBBuffer,
    uint32_t Width,
    uint32_t Height,
    uint32_t Size)
{
    if (!MainVideoViewWidget)
    {
        return;
    }

    MainVideoWidth = Width;
    MainVideoHeight = Height;
    MainVideoViewWidget->UpdateBuffer(RGBBuffer, Width, Height, Size);
}

void UVideoCallViewWidget::UpdateAdditionalVideoBuffer(
    uint8* RGBBuffer,
    uint32_t Width,
    uint32_t Height,
    uint32_t Size)
{
    if (!AdditionalVideoViewWidget)
    {
        return;
    }

    AdditionalVideoViewWidget->UpdateBuffer(RGBBuffer, Width, Height, Size);
}

void UVideoCallViewWidget::ResetBuffers()
{
    if (!MainVideoViewWidget || !AdditionalVideoViewWidget)
    {
        return;
    }

    MainVideoViewWidget->ResetBuffer();
    AdditionalVideoViewWidget->ResetBuffer();
}

Create the VideoCallWidget C++ Class

The VideoCallWidget class serves as the audio/video call widget for the sample application. It contains the following controls, bound with UI elements in the blueprint asset:

  • The local and remote video view (represented by VideoCallViewWidget)
  • The end-call button (EndCallButton variable)
  • The mute-local-audio button (MuteLocalAudioButton variable)
  • The video-mode button (VideoModeButton variable)

Create the Class and Required Includes
Create the widget C++ class as you did before, and add required includes and forward declarations:

//VideoCallWidget.h
#include "CoreMinimal.h"
#include "Blueprint/UserWidget.h"
#include "Templates/UniquePtr.h"
#include "Components/Image.h"
#include "Components/Button.h"
#include "Engine/Texture2D.h"
#include "VideoCall.h"
#include "VideoCallViewWidget.h"
#include "VideoCallWidget.generated.h"

class AVideoCallPlayerController;
class UVideoViewWidget;

//VideoCallWidget.cpp
#include "Kismet/GameplayStatics.h"
#include "UObject/ConstructorHelpers.h"
#include "Components/CanvasPanelSlot.h"
#include "VideoViewWidget.h"
#include "VideoCallPlayerController.h"

Add Member Variables

//VideoCallWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UVideoCallWidget : public UUserWidget
{
    GENERATED_BODY()
public:
    AVideoCallPlayerController* PlayerController = nullptr;

public:
    UPROPERTY(BlueprintReadOnly, meta = (BindWidget))
    UVideoCallViewWidget* VideoCallViewWidget = nullptr;

    //Buttons
    UPROPERTY(BlueprintReadOnly, meta = (BindWidget))
    UButton* EndCallButton = nullptr;

    UPROPERTY(BlueprintReadOnly, meta = (BindWidget))
    UButton* MuteLocalAudioButton = nullptr;

    UPROPERTY(BlueprintReadOnly, meta = (BindWidget))
    UButton* VideoModeButton = nullptr;

    //Button textures
    int32 ButtonSizeX = 96;
    int32 ButtonSizeY = 96;
    UTexture2D* EndCallButtonTexture = nullptr;
    UTexture2D* AudioButtonMuteTexture = nullptr;
    UTexture2D* AudioButtonUnmuteTexture = nullptr;
    UTexture2D* VideomodeButtonCameraoffTexture = nullptr;
    UTexture2D* VideomodeButtonCameraonTexture = nullptr;

    TUniquePtr<VideoCall> VideoCallPtr;
    ...
};

Initialize VideoCallWidget
Find the asset image for each button and assign it to the corresponding texture. Then initialize each button with textures:

//VideoCallWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UVideoCallWidget : public UUserWidget
{
    GENERATED_BODY()
public:
    ...
    UVideoCallWidget(const FObjectInitializer& ObjectInitializer);

    void NativeConstruct() override;
    void NativeDestruct() override;

private:
    void InitButtons();
    ...
};

//VideoCallWidget.cpp
void UVideoCallWidget::NativeConstruct()
{
    Super::NativeConstruct();
    InitButtons();
}

void UVideoCallWidget::NativeDestruct()
{
    Super::NativeDestruct();

    if (VideoCallPtr)
    {
        VideoCallPtr->StopCall();
    }
}

UVideoCallWidget::UVideoCallWidget(const FObjectInitializer& ObjectInitializer)
    : Super(ObjectInitializer)
{
    static ConstructorHelpers::FObjectFinder<UTexture2D>
        EndCallButtonTextureFinder(TEXT("Texture'/Game/ButtonTextures/hangup.hangup'"));
    if (EndCallButtonTextureFinder.Succeeded())
    {
        EndCallButtonTexture = EndCallButtonTextureFinder.Object;
    }

    static ConstructorHelpers::FObjectFinder<UTexture2D>
        AudioButtonMuteTextureFinder(TEXT("Texture'/Game/ButtonTextures/mute.mute'"));
    if (AudioButtonMuteTextureFinder.Succeeded())
    {
        AudioButtonMuteTexture = AudioButtonMuteTextureFinder.Object;
    }

    static ConstructorHelpers::FObjectFinder<UTexture2D>
        AudioButtonUnmuteTextureFinder(TEXT("Texture'/Game/ButtonTextures/unmute.unmute'"));
    if (AudioButtonUnmuteTextureFinder.Succeeded())
    {
        AudioButtonUnmuteTexture = AudioButtonUnmuteTextureFinder.Object;
    }

    static ConstructorHelpers::FObjectFinder<UTexture2D>
        VideomodeButtonCameraonTextureFinder(TEXT("Texture'/Game/ButtonTextures/cameraon.cameraon'"));
    if (VideomodeButtonCameraonTextureFinder.Succeeded())
    {
        VideomodeButtonCameraonTexture = VideomodeButtonCameraonTextureFinder.Object;
    }

    static ConstructorHelpers::FObjectFinder<UTexture2D>
        VideomodeButtonCameraoffTextureFinder(TEXT("Texture'/Game/ButtonTextures/cameraoff.cameraoff'"));
    if (VideomodeButtonCameraoffTextureFinder.Succeeded())
    {
        VideomodeButtonCameraoffTexture = VideomodeButtonCameraoffTextureFinder.Object;
    }
}

void UVideoCallWidget::InitButtons()
{
    if (EndCallButtonTexture)
    {
        EndCallButton->WidgetStyle.Normal.SetResourceObject(EndCallButtonTexture);
        EndCallButton->WidgetStyle.Normal.ImageSize = FVector2D(ButtonSizeX, ButtonSizeY);
        EndCallButton->WidgetStyle.Normal.DrawAs = ESlateBrushDrawType::Type::Image;
        EndCallButton->WidgetStyle.Hovered.SetResourceObject(EndCallButtonTexture);
        EndCallButton->WidgetStyle.Hovered.ImageSize = FVector2D(ButtonSizeX, ButtonSizeY);
        EndCallButton->WidgetStyle.Hovered.DrawAs = ESlateBrushDrawType::Type::Image;
        EndCallButton->WidgetStyle.Pressed.SetResourceObject(EndCallButtonTexture);
        EndCallButton->WidgetStyle.Pressed.ImageSize = FVector2D(ButtonSizeX, ButtonSizeY);
        EndCallButton->WidgetStyle.Pressed.DrawAs = ESlateBrushDrawType::Type::Image;
    }
    EndCallButton->OnClicked.AddDynamic(this, &UVideoCallWidget::OnEndCall);

    SetAudioButtonToMute();
    MuteLocalAudioButton->OnClicked.AddDynamic(this, &UVideoCallWidget::OnMuteLocalAudio);

    SetVideoModeButtonToCameraOff();
    VideoModeButton->OnClicked.AddDynamic(this, &UVideoCallWidget::OnChangeVideoMode);
}

Add Button Textures
Find the Content/ButtonTextures directory in the demo application. (You don’t have to open the project; simply find this folder in the file system.) All button textures are stored there. In your project’s Content directory, create a folder called ButtonTextures, and drag and drop all the button images there to make them available in your project.

Add Setters

//VideoCallWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UVideoCallWidget : public UUserWidget
{
    GENERATED_BODY()
    ...
public:
    void SetVideoCallPlayerController(AVideoCallPlayerController* VideoCallPlayerController);
    void SetVideoCall(TUniquePtr<VideoCall> PassedVideoCallPtr);
    ...
};

//VideoCallWidget.cpp
void UVideoCallWidget::SetVideoCallPlayerController(AVideoCallPlayerController* VideoCallPlayerController)
{
    PlayerController = VideoCallPlayerController;
}

void UVideoCallWidget::SetVideoCall(TUniquePtr<VideoCall> PassedVideoCallPtr)
{
    VideoCallPtr = std::move(PassedVideoCallPtr);
}

Add Methods to Update Buttons View

//VideoCallWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UVideoCallWidget : public UUserWidget
{
    GENERATED_BODY()
    ...
private:
    void SetVideoModeButtonToCameraOff();
    void SetVideoModeButtonToCameraOn();
    void SetAudioButtonToMute();
    void SetAudioButtonToUnMute();
    ...
};

//VideoCallWidget.cpp
void UVideoCallWidget::SetVideoModeButtonToCameraOff()
{
    if (VideomodeButtonCameraoffTexture)
    {
        VideoModeButton->WidgetStyle.Normal.SetResourceObject(VideomodeButtonCameraoffTexture);
        VideoModeButton->WidgetStyle.Normal.ImageSize = FVector2D(ButtonSizeX, ButtonSizeY);
        VideoModeButton->WidgetStyle.Normal.DrawAs = ESlateBrushDrawType::Type::Image;
        VideoModeButton->WidgetStyle.Hovered.SetResourceObject(VideomodeButtonCameraoffTexture);
        VideoModeButton->WidgetStyle.Hovered.ImageSize = FVector2D(ButtonSizeX, ButtonSizeY);
        VideoModeButton->WidgetStyle.Hovered.DrawAs = ESlateBrushDrawType::Type::Image;
        VideoModeButton->WidgetStyle.Pressed.SetResourceObject(VideomodeButtonCameraoffTexture);
        VideoModeButton->WidgetStyle.Pressed.ImageSize = FVector2D(ButtonSizeX, ButtonSizeY);
        VideoModeButton->WidgetStyle.Pressed.DrawAs = ESlateBrushDrawType::Type::Image;
    }
}

void UVideoCallWidget::SetVideoModeButtonToCameraOn()
{
    if (VideomodeButtonCameraonTexture)
    {
        VideoModeButton->WidgetStyle.Normal.SetResourceObject(VideomodeButtonCameraonTexture);
        VideoModeButton->WidgetStyle.Normal.ImageSize = FVector2D(ButtonSizeX, ButtonSizeY);
        VideoModeButton->WidgetStyle.Normal.DrawAs = ESlateBrushDrawType::Type::Image;
        VideoModeButton->WidgetStyle.Hovered.SetResourceObject(VideomodeButtonCameraonTexture);
        VideoModeButton->WidgetStyle.Hovered.ImageSize = FVector2D(ButtonSizeX, ButtonSizeY);
        VideoModeButton->WidgetStyle.Hovered.DrawAs = ESlateBrushDrawType::Type::Image;
        VideoModeButton->WidgetStyle.Pressed.SetResourceObject(VideomodeButtonCameraonTexture);
        VideoModeButton->WidgetStyle.Pressed.ImageSize = FVector2D(ButtonSizeX, ButtonSizeY);
        VideoModeButton->WidgetStyle.Pressed.DrawAs = ESlateBrushDrawType::Type::Image;
    }
}

void UVideoCallWidget::SetAudioButtonToMute()
{
    if (AudioButtonMuteTexture)
    {
        MuteLocalAudioButton->WidgetStyle.Normal.SetResourceObject(AudioButtonMuteTexture);
        MuteLocalAudioButton->WidgetStyle.Normal.ImageSize = FVector2D(ButtonSizeX, ButtonSizeY);
        MuteLocalAudioButton->WidgetStyle.Normal.DrawAs = ESlateBrushDrawType::Type::Image;
        MuteLocalAudioButton->WidgetStyle.Hovered.SetResourceObject(AudioButtonMuteTexture);
        MuteLocalAudioButton->WidgetStyle.Hovered.ImageSize = FVector2D(ButtonSizeX, ButtonSizeY);
        MuteLocalAudioButton->WidgetStyle.Hovered.DrawAs = ESlateBrushDrawType::Type::Image;
        MuteLocalAudioButton->WidgetStyle.Pressed.SetResourceObject(AudioButtonMuteTexture);
        MuteLocalAudioButton->WidgetStyle.Pressed.ImageSize = FVector2D(ButtonSizeX, ButtonSizeY);
        MuteLocalAudioButton->WidgetStyle.Pressed.DrawAs = ESlateBrushDrawType::Type::Image;
    }
}

void UVideoCallWidget::SetAudioButtonToUnMute()
{
    if (AudioButtonUnmuteTexture)
    {
        MuteLocalAudioButton->WidgetStyle.Normal.SetResourceObject(AudioButtonUnmuteTexture);
        MuteLocalAudioButton->WidgetStyle.Normal.ImageSize = FVector2D(ButtonSizeX, ButtonSizeY);
        MuteLocalAudioButton->WidgetStyle.Normal.DrawAs = ESlateBrushDrawType::Type::Image;
        MuteLocalAudioButton->WidgetStyle.Hovered.SetResourceObject(AudioButtonUnmuteTexture);
        MuteLocalAudioButton->WidgetStyle.Hovered.ImageSize = FVector2D(ButtonSizeX, ButtonSizeY);
        MuteLocalAudioButton->WidgetStyle.Hovered.DrawAs = ESlateBrushDrawType::Type::Image;
        MuteLocalAudioButton->WidgetStyle.Pressed.SetResourceObject(AudioButtonUnmuteTexture);
        MuteLocalAudioButton->WidgetStyle.Pressed.ImageSize = FVector2D(ButtonSizeX, ButtonSizeY);
        MuteLocalAudioButton->WidgetStyle.Pressed.DrawAs = ESlateBrushDrawType::Type::Image;
    }
}

Add the OnStartCall Method

//VideoCallWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UVideoCallWidget : public UUserWidget
{
    GENERATED_BODY()
public:
    ...
    void OnStartCall(const FString& ChannelName, const FString& EncryptionKey, const FString& EncryptionType);
    ...
};

//VideoCallWidget.cpp
void UVideoCallWidget::OnStartCall(
    const FString& ChannelName,
    const FString& EncryptionKey,
    const FString& EncryptionType)
{
    if (!VideoCallPtr)
    {
        return;
    }

    auto OnLocalFrameCallback = [this](
        std::uint8_t* Buffer,
        std::uint32_t Width,
        std::uint32_t Height,
        std::uint32_t Size)
    {
        VideoCallViewWidget->UpdateAdditionalVideoBuffer(Buffer, Width, Height, Size);
    };
    VideoCallPtr->RegisterOnLocalFrameCallback(OnLocalFrameCallback);

    auto OnRemoteFrameCallback = [this](
        std::uint8_t* Buffer,
        std::uint32_t Width,
        std::uint32_t Height,
        std::uint32_t Size)
    {
        VideoCallViewWidget->UpdateMainVideoBuffer(Buffer, Width, Height, Size);
    };
    VideoCallPtr->RegisterOnRemoteFrameCallback(OnRemoteFrameCallback);

    VideoCallPtr->StartCall(ChannelName, EncryptionKey, EncryptionType);
}

Add the OnEndCall Method

//VideoCallWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UVideoCallWidget : public UUserWidget
{
    GENERATED_BODY()
public:
    ...
    UFUNCTION(BlueprintCallable)
    void OnEndCall();
    ...
};

//VideoCallWidget.cpp
void UVideoCallWidget::OnEndCall()
{
    if (VideoCallPtr)
    {
        VideoCallPtr->StopCall();
    }

    if (VideoCallViewWidget)
    {
        VideoCallViewWidget->ResetBuffers();
    }

    if (PlayerController)
    {
        SetVisibility(ESlateVisibility::Collapsed);
        PlayerController->EndCall(std::move(VideoCallPtr));
    }
}

Add the OnMuteLocalAudio Method

//VideoCallWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UVideoCallWidget : public UUserWidget
{
    GENERATED_BODY()
public:
    ...
    UFUNCTION(BlueprintCallable)
    void OnMuteLocalAudio();
    ...
};

//VideoCallWidget.cpp
void UVideoCallWidget::OnMuteLocalAudio()
{
    if (!VideoCallPtr)
    {
        return;
    }

    if (VideoCallPtr->IsLocalAudioMuted())
    {
        VideoCallPtr->MuteLocalAudio(false);
        SetAudioButtonToMute();
    }
    else
    {
        VideoCallPtr->MuteLocalAudio(true);
        SetAudioButtonToUnMute();
    }
}

Add the OnChangeVideoMode Method

//VideoCallWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UVideoCallWidget : public UUserWidget
{
    GENERATED_BODY()
public:
    ...
    UFUNCTION(BlueprintCallable)
    void OnChangeVideoMode();
    ...
};

//VideoCallWidget.cpp
void UVideoCallWidget::OnChangeVideoMode()
{
    if (!VideoCallPtr)
    {
        return;
    }

    if (!VideoCallPtr->IsLocalVideoMuted())
    {
        VideoCallPtr->MuteLocalVideo(true);
        SetVideoModeButtonToCameraOn();
    }
    else
    {
        VideoCallPtr->EnableVideo(true);
        VideoCallPtr->MuteLocalVideo(false);
        SetVideoModeButtonToCameraOff();
    }
}

Create Blueprint Classes

Make sure the C++ code compiles properly. Without a successfully compiled project you cannot move on to the next steps. If you’ve compiled the C++ code successfully and still don’t see required classes in the Unreal Editor, reopen the project.

Create the BP_EnterChannelWidget Blueprint Asset

Create a Blueprint of UEnterChannelWidget. Right-click Content and select Widget Blueprint from the User Interface menu.

Change the parent of the class of this new User Widget.
When you open the blueprint, the UMG Editor Interface appears and by default opens to the Designer tab.

Click the Graph button (in the top-right corner) and select Class Settings. On the Details panel, click the Parent Class drop-down list and select the C++ class you created earlier: UEnterChannelWidget.

Return to the Designer tab. The Palette window contains several different types of widgets that you can use to construct your UI elements. Find the Text, Editable Text, Button, and ComboBox (String) elements and drag them into the workspace as in the screenshot. Then check the definition of UEnterChannelWidget in the EnterChannelWidget.h file to see the names of the member variables and their corresponding types (UTextBlock, UEditableTextBox, UButton, and UComboBoxString).

In the BP_EnterChannelWidget editor, give the UI elements you dragged in names identical to those member variables. You can do this by clicking an element and changing its name in the Details panel. Then try to compile the blueprint; you will see an error if you forgot to add something, or if there is a widget name/type mismatch with your UserWidget class.

Save it to the preferred folder. For example: /Content/Widgets/BP_EnterChannelWidget.uasset

Create the BP_VideoViewWidget Asset

Create the BP_VideoViewWidget asset, set the parent class to UVideoViewWidget, and name the Image element RenderTargetImage.

It’s important to set the image anchor here:

Create the BP_VideoCallViewWidget Asset

Create the BP_VideoCallViewWidget asset, set the parent class to UVideoCallViewWidget, and add UI elements MainVideoViewWidget and AdditionalVideoViewWidget of BP_VideoViewWidget type. Also add MainVideoSizeBox and AdditionalVideoSizeBox UI elements of SizeBox type.

Create the BP_VideoCallWidget Asset

Create the BP_VideoCallWidget asset and set the parent class to UVideoCallWidget. Find the BP_VideoCallViewWidget element in the Palette and add it with the name VideoCallViewWidget, then add the EndCallButton, MuteLocalAudioButton, and VideoModeButton buttons.

Create the BP_VideoCallPlayerController Blueprint Asset

Now it’s time to create the BP_VideoCallPlayerController blueprint asset, based on the AVideoCallPlayerController class that was described earlier.

Create a Blueprint of AVideoCallPlayerController

Right-click Content, click the Add New button, and select Blueprint Class. In the Pick Parent Class window, go to the All Classes section and find the VideoCallPlayerController class.

Now assign the previously created widgets to the PlayerController as shown:

Save it to the preferred folder (for example,
/Content/Widgets/BP_VideoCallPlayerController.uasset).

Create the BP_AgoraVideoCallGameModeBase Asset

Next, create the game mode blueprint.
Click the Add New button, select Blueprint Class, and choose Game Mode Base as the parent class.

Modify GameMode

Now you need to set your custom GameMode class and Player Controller. Go to the world settings and set the specified variables:

Specify Project Settings

Go to Edit > Project Settings and open the Maps & Modes tab. Specify the Default parameters:

Run the Game

Windows

Select File > Package Project > Windows > Windows (64-bit), select a folder where you want to place the package, and wait for the result.

Mac

Select File > Package Project > Mac and specify a Builds folder you want to build to. Before running the game, you have to add permissions.

Mac Build Setup

Add Permissions in the info.plist File for Device Access

Note: To access the .plist file, right click the <YourBuildName>.app file and select Show Package Contents. The info.plist file is inside Contents.

Add these permissions to the file:

  • Privacy — Camera Usage Description
  • Privacy — Microphone Usage Description

Add the AgoraRtcKit.framework Folder to Your Build

  1. Go back to your project directory and open the Plugins folder.
  2. From Plugins/AgoraPlugin/Source/ThirdParty/Agora/Mac/Release, copy the AgoraRtcKit.framework folder.
  3. Paste AgoraRtcKit.framework into your newly built project folder: <Packaged_project_dir>/MacNoEditor/[project_name]/Contents/MacOS/

iOS Packaging

To package the project for iOS, you need to generate a signing certificate and a provisioning profile. Follow the instructions in UE4 documentation: iOS Provisioning

Tip: I recommend going to Project Settings > Build and selecting the Automatic Signing checkbox.

Then you need to add the certificate and the provisioning profile to your project:

Select Edit > Project Settings > Platforms: iOS, and then select the certificate and the provisioning profile you created before.

If you don’t see one of them in the table, click Import Certificate or Import Provision, choose the correct file in the Finder, and click Open.

Then enter a Bundle Identifier: it must be the Bundle ID you used during certificate creation.

iOS Permissions

For testing on iOS, I recommend clicking the Launch button in the top bar of the Unreal Editor, with your iOS device selected in the launch settings.

Add the following permissions in the info.plist file for device access:

  • Privacy — Camera Usage Description
  • Privacy — Microphone Usage Description

To add the permissions in the info.plist file, select Edit > Project Settings > Platforms: iOS and add the following lines to Additional Plist Data:

<key>NSCameraUsageDescription</key>
<string>AgoraVideoCall</string>
<key>NSMicrophoneUsageDescription</key>
<string>AgoraVideoCall</string>

Now you are ready to package your project for iOS or launch it on an iOS device.

If you would like to reach out to our DE team for support, find us here on Slack!
