10 Best Face Filter SDKs: Ultimate Comparison Guide. Part 2

Dave Gordon
10 min read · Jan 23, 2019


Also See: Face Filter SDKs Comparison Guide. Part 1

In my previous post, I reviewed 10 computer vision companies (Banuba, BinaryFace, DeepAR, Youmask by Luxand, Mood-me, Image Metrics, Visage Technologies, SeersLab, SenseTime and XZIMG) that provide augmented reality SDKs with face filters.

I also explained why choosing an AR SDK can be a challenge for software developers and what deserves their attention. Now, let’s compare each face filter SDK based on its performance, quality of effects, and functionality.



Face filter SDKs: Performance compared

#1. Multi-face + landscape

Multi-face detection allows applying face filters to several people simultaneously. And when taking a group photo, we naturally hold our mobile devices in a landscape orientation. The combination of both increases user engagement in your AR app as people will have fun trying on filters together with friends and family.

To test this, I pointed each vendor’s demo app at a photo with two faces and rotated the device from portrait to landscape mode.
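
For context, here is roughly what multi-face detection with rotation handling looks like in code. This is only a sketch using Google’s ML Kit face detector as a generic illustration (it is not one of the SDKs reviewed here); `frameBitmap` and `rotationDegrees` are assumed to come from your camera pipeline.

```kotlin
import android.graphics.Bitmap
import android.util.Log
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.face.FaceDetection
import com.google.mlkit.vision.face.FaceDetectorOptions

// Detect every face in a camera frame. Passing the device rotation lets the
// detector interpret landscape frames correctly.
fun detectFaces(frameBitmap: Bitmap, rotationDegrees: Int) {
    val options = FaceDetectorOptions.Builder()
        .setPerformanceMode(FaceDetectorOptions.PERFORMANCE_MODE_FAST)
        .build()
    val detector = FaceDetection.getClient(options)
    val image = InputImage.fromBitmap(frameBitmap, rotationDegrees)

    detector.process(image)
        .addOnSuccessListener { faces ->
            // One entry per detected face; each bounding box can get its own filter overlay.
            faces.forEach { face -> Log.d("FaceFilter", "Face at ${face.boundingBox}") }
        }
}
```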

Summary

  • SenseTime, SeersLab, BinaryFace and Banuba face filter SDKs support both multi-face and landscape modes.
  • Image Metrics, Youmask and XZIMG support only one feature — either multi-face or landscape.
  • The rest supported neither.
Multi-face + landscape AR SDK support

#2. Jitter

In real-world usage, jitter is another thing to look at. Tracking performance may otherwise be good, but high jitter can spoil the entire experience.

There is no standard metric for jitter, so my estimates are based on how the face filters visually behave in the demo apps.
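
If you want something more repeatable than eyeballing it, one rough proxy is the average frame-to-frame displacement of a tracked landmark while the subject holds still: lower values mean a steadier filter. A minimal sketch, assuming you have already collected one landmark position per frame from whichever tracker you are testing:

```kotlin
import kotlin.math.hypot

// Rough jitter proxy: mean per-frame displacement (in pixels) of one tracked landmark
// recorded while the subject and device are held still. Lower means steadier tracking.
fun jitterScore(landmarkPerFrame: List<Pair<Float, Float>>): Double {
    if (landmarkPerFrame.size < 2) return 0.0
    val displacements = landmarkPerFrame.zipWithNext { a, b ->
        hypot((b.first - a.first).toDouble(), (b.second - a.second).toDouble())
    }
    return displacements.average()
}
```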

Banuba, Image Metrics and SeersLab filters were stable and stuck well without trembling. The others required a few seconds of steady device positioning and a still face. In addition, their filters would disappear for short moments and tremble when reappearing.

AR SDKs jitter comparison

Face filter SDKs: Effect quality compared

#1. Rendering

The renderer module is responsible for visualizing AR models and their behaviour. Put simply, rendering is what enables beautiful, high-quality face filters.

All face filter SDKs include a renderer module; however, their ability to produce quality AR visuals differs a lot. By “quality” I mean the realistic behaviour of face filters across aspects like geometry, viewpoint, texture and shading.

For example, if a user tilts or shakes their head, virtual hair moves accordingly. A capable renderer also lets virtual objects cast shadows, react dynamically to other objects, and do much of the other stuff that makes the experience immersive.

Rendering features. Must-have:

  • Hardware skinning to create models with live animation.
  • Rich physical-based materials for the realistic visualization of objects.
  • Post process effects to overlay an effect on the entire image, e.g. noir lighting.
  • IBL (Image-based lighting) to add real-world lighting to objects and images.
  • Animated billboards that keep textures facing the camera even when the user turns their head, the same feature seen in Snapchat filters.
  • HDR (High dynamic range imaging) to show a great range of colors and brightness levels.

Rendering features. Nice-to-have:

  • PhysX for rigid or soft body dynamics, e.g. cloth simulation including tearing and pressurized cloth, ragdolls and character controllers, vehicle dynamics, etc.
  • Sprite animation so AR objects look dynamic rather than frozen.
  • Video textures to stream a video effect on a face.
  • Multisample anti-aliasing for eliminating jagged edges and improving the overall image quality.
  • Face morphing to modify the shape of a face, e.g. slim down the cheeks, enlarge the nose, align two faces, etc.
  • Shadow support so objects can cast shadows onto the face.

The more features a renderer module supports, the more freedom you have in implementing augmented reality applications for different domains, e.g. virtual try-ons where AR items need to look as natural as real ones, or fancy face masks and effects that increase engagement.
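
To make the “post process effects” item above concrete, here is a minimal CPU-side sketch of a noir-style pass over a single Android frame. A real renderer would run this on the GPU as a shader; this is purely an illustration of what such an effect computes, not any vendor’s API.

```kotlin
import android.graphics.Bitmap

// Noir-style post-process: convert every pixel to its luminance so the whole
// frame is rendered in black and white. Real SDKs run this as a GPU shader.
fun applyNoir(frame: Bitmap): Bitmap {
    val out = frame.copy(Bitmap.Config.ARGB_8888, true)
    val pixels = IntArray(out.width * out.height)
    out.getPixels(pixels, 0, out.width, 0, 0, out.width, out.height)
    for (i in pixels.indices) {
        val c = pixels[i]
        val r = (c shr 16) and 0xFF
        val g = (c shr 8) and 0xFF
        val b = c and 0xFF
        val lum = (0.299 * r + 0.587 * g + 0.114 * b).toInt().coerceIn(0, 255)
        pixels[i] = (0xFF shl 24) or (lum shl 16) or (lum shl 8) or lum
    }
    out.setPixels(pixels, 0, out.width, 0, 0, out.width, out.height)
    return out
}
```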

Summary

  • Banuba’s renderer has all the must-haves for realistic effects and more, with a strong focus on physics and lighting. It doesn’t limit you in creating high-quality 3D objects, face masks or image effects for a variety of industries.
  • DeepAR’s renderer is also good. The website states it allows real-time drawing and face-painting masks, which is a nice touch.
  • SenseTime’s rendering is powerful too, featuring all the functionality needed to create beautiful AR experiences.
  • Mood-me, SeersLab and Youmask offer basic rendering functionality, allowing for 3D modelling and light improvements.
AR SDKs: Rendering features compared

#2. Beautification

Face filters are all about a magical experience, so it’s essential that users still look good with them on.

Factors such as camera distortion, poor lighting or skin imperfections spoil the magic. If users like the filter but don’t like how they look with it, your AR application becomes a waste of time and effort.

Beautification technology fixes camera distortions and visual defects, enhancing the overall image and improving the user’s appearance. Advanced functionality allows modifying separate parts of the face, e.g. changing the color of the eyes or hair.

Face beautification features:

  • Smooth skin
  • Correct skin tone
  • Emphasize eyes
  • Whiten teeth
  • Morph face: change the face shape, making it slimmer or wider, or adjusting eye size, nose shape and head proportions.
  • Improve face symmetry
  • Fix defects, e.g. pimples, under-eye circles
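
The “smooth skin” item above, for example, usually boils down to an edge-preserving blur blended back into the original frame (restricted to a detected skin mask in production). A minimal sketch with OpenCV’s Java bindings, assuming `src` is an 8-bit, 3-channel BGR frame; the skin-mask step is omitted:

```kotlin
import org.opencv.core.Core
import org.opencv.core.Mat
import org.opencv.imgproc.Imgproc

// Skin-smoothing sketch: bilateral filter (blurs skin texture but keeps edges sharp),
// then blend with the original so the result does not look plastic.
// `src` must be an 8-bit 3-channel (BGR) Mat; skin-mask handling is omitted.
fun smoothSkin(src: Mat, strength: Double = 0.6): Mat {
    val blurred = Mat()
    Imgproc.bilateralFilter(src, blurred, 9, 75.0, 75.0)
    val out = Mat()
    Core.addWeighted(src, 1.0 - strength, blurred, strength, 0.0, out)
    return out
}
```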

Summary

  • More than half of the AR SDKs, including Banuba, BinaryFace, Image Metrics, SeersLab, SenseTime and Visage Technologies, provide real-time beautification.
  • Banuba’s beautification technology works both for camera apps, fixing camera distortions, and for e-commerce services, modifying users’ faces with makeup or adjusting facial features separately, e.g. eyebrows or eyes. Snapchat uses similar technology; however, Banuba keeps the skin looking natural and makes beautification far less intrusive. The website previews some interesting features, e.g. hair recoloring, that may soon reach production.
  • The beautification capabilities of BinaryFace, Image Metrics and Visage seem basic at the moment; their websites suggest they can enlarge eyes and mouths.
AR SDKs: beautification support

#3. Background subtraction

Background subtraction, aka ‘green screen’ technology, allows removing or replacing the background. This makes filters far more dynamic and interesting; you can often see such filters in Snapchat.

It also allows creating a blurry background effect, which is popular among Instagrammers. A homogeneous background also reduces the network bandwidth consumed by video sessions, which may improve connection quality in regions with poor internet.

Besides, it contributes to privacy when users want to hide their surroundings during a video call.

Of all the face filter SDK vendors, only SenseTime and Banuba SDKs support background subtraction.
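
As a rough illustration of what background subtraction involves (again using Google’s ML Kit selfie segmenter rather than any of the reviewed SDKs), the tracker returns a per-pixel foreground-confidence mask, which you then use to composite the frame over a new background or a blurred copy of itself:

```kotlin
import android.graphics.Bitmap
import android.util.Log
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.segmentation.Segmentation
import com.google.mlkit.vision.segmentation.selfie.SelfieSegmenterOptions

// Request a per-pixel foreground-confidence mask for a camera frame.
fun segmentFrame(frameBitmap: Bitmap, rotationDegrees: Int) {
    val options = SelfieSegmenterOptions.Builder()
        .setDetectorMode(SelfieSegmenterOptions.STREAM_MODE) // optimized for video streams
        .build()
    val segmenter = Segmentation.getClient(options)

    segmenter.process(InputImage.fromBitmap(frameBitmap, rotationDegrees))
        .addOnSuccessListener { mask ->
            // mask.buffer holds one float per pixel (0..1 foreground confidence);
            // blend the frame with a virtual or blurred background using these values.
            Log.d("Segmentation", "Mask size: ${mask.width} x ${mask.height}")
        }
}
```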

#4. SFX/Audio effects

Adding sound effects to filters increases engagement. So far, I haven’t seen face filters that support sound effects except for Snapchat’s. However, this feature has lots of potential to enrich the overall experience.

Moreover, advanced SFX functionality allows changing the user’s voice: users can sound like a robot, a bear or a famous cartoon character.

Only the Banuba SDK supports audio effects and a voice changer in face filters, though SeersLab does provide background audio.

#5. Scripting

Scripting is what allows face filters to react to user actions. Users can interact with effects using their face or trigger effects with facial expressions, e.g. an open mouth or a smile. This functionality makes it possible to create truly engaging AR effects.

Speaking of face-triggered filters, Snapchat was the first to introduce this feature. Currently, it supports three triggers: Mouth Open, Brows Raised and Touch Events.
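
Under the hood, a trigger is just a per-frame check on the face-tracking output. As a hedged illustration (this uses ML Kit’s classification output, not any vendor’s scripting API), a Smile trigger can be a simple probability threshold; `startEffect` is a hypothetical callback into your filter:

```kotlin
import com.google.mlkit.vision.face.Face
import com.google.mlkit.vision.face.FaceDetectorOptions

// Build detector options with classification enabled so each Face carries
// smiling / eye-open probabilities.
val triggerOptions = FaceDetectorOptions.Builder()
    .setClassificationMode(FaceDetectorOptions.CLASSIFICATION_MODE_ALL)
    .build()

// Per-frame trigger check: fire the effect once the smile probability passes a threshold.
fun checkSmileTrigger(face: Face, startEffect: () -> Unit) {
    val smiling = (face.smilingProbability ?: 0f) > 0.8f
    if (smiling) startEffect()
}
```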

Summary

  • More than half of the face filter SDKs support interactive scripting: Banuba, BinaryFace, DeepAR, Mood-me, SeersLab and SenseTime.
  • From what I can visually estimate, DeepAR has only an Open Mouth trigger, as does Mood-me.
  • Banuba SDK supports 4 face triggers (Open Mouth, Smile, Brows Raised, and Brows Down). Playback modes vary as well (once, reverse, loop or fixed).
AR SDKs scripting support

#6. Gamified filters

Powerful scripting features allow creating small yet fully functional AR games, similar to those in Snapchat or Facebook, that run in video chats.

This is a good way to increase the time users spend in the app. With facial expressions, a user can interact with sound, 3D animation, video parameters and textures, and play mini-games inside a filter. You can also create touch events and make a filter react to user taps.

Of all the face filter SDK vendors, only the Banuba SDK supports gamified filters.

Face filter SDKs: Functionality compared

As a developer, you want to integrate the code easily and iterate quickly across multiple platforms and devices.

With this in mind, the features that make an SDK functional for a developer are as follows:

#1. Face Filter Studio

Creating a good-looking filter requires time, effort and some coding and design skills. If you have an in-house design team, a Studio component allows them to crank out face filters and ship them to your app quickly.

Those without deep coding skills can iterate on filters faster by testing them right in the camera and getting real-time feedback. This improves the overall filter quality and significantly speeds up the process.

Studio benefits

  • Iterate quickly on multiple cross-platform devices
  • See live feedback
  • No coding needed
  • Convenient for designers
  • Effort and time saving

Three AR SDKs, Banuba, DeepAR and Visage Technologies, offer a Studio component.

Studio tool support

#2. Unity3D support

If your audience is gamers or you have other Unity-based projects, Unity3D support comes in handy.

With it, you can easily create AR scenarios for any kind of Unity game.

Unity 3D support

#3. Device and platform support

Cross-platform support makes your filters accessible to a broader audience. It’s also important to take into account graphics API versions and device support. Most face filter SDKs run perfectly fine on high-end devices like the iPhone 7 or 8 yet fail to work on low-end ones, e.g. the iPhone 5s or the Nexus 6P.

Achieving quality face tracking (the technology at the core of face filters) on low-end devices is still a challenge for most SDKs.

AR SDKs system requirements

Full comparison table of face filter SDKs

Finally, here they are: the major providers of face filter SDKs, listed from A to Z.

Face Filters SDK comparison table

Summing up

  1. When choosing a face filter SDK, it’s important to evaluate not only its performance but also its ability to create varied and graphically rich AR effects.
  2. While the most common features like beautification, powerful scripting and rendering are in place in most SDKs, finding unique functionality like sound effects or real-time background subtraction that can give your app a competitive advantage is a challenge.
  3. This challenge doubles if you want all those features in one SDK, since of what’s available on the market today only Banuba offers them all.
  4. Banuba comes out as the winner for me, as it surpassed the other SDK providers in all 10 criteria. Its rendering, scripting and beautification capabilities are close to Snapchat’s. Besides, it offers unique features that no other face filter SDK supports, namely background subtraction, a voice changer, sound effects and gamification. If you want the same filters as Snapchat but in your own app, Banuba’s FaceAR SDK is the top choice.
  5. SenseTime is also strong in rendering and functionality, allowing for diverse face-based AR experiences.
  6. BinaryFace, Image Metrics, Mood-me and DeepAR are each good at only one thing: those strong in functionality fail to offer rich graphics capabilities, and vice versa.
  7. Face filter SDKs from XZIMG, Visage Technologies, and Youmask seem immature for now and offer just the basics.

Links to demo apps
