How we implemented 3D cards in Revolut

At Revolut we put customer experience at the heart of everything we do, with the aim of bringing pleasure through simple designs and careful execution. You can probably imagine our excitement, then, as we introduce an update to our card order flow. In the latest version of the Revolut app, you’ll be able to choose your card from an interactive 3D model.

This was an interesting challenge for us, since it was our first time using a physics-based 3D rendering engine to build a feature. We think it turned out pretty well!

Entering the card order section of the app, you’ll be able to choose from two materials — plastic and metal. From there you’ll be able to choose a colour, and whether you want Visa or Mastercard (depending on your country).

The new card order flow

Let’s take a look at how we reached this milestone, and explore some of the challenges along the way.

Rendering

Where to begin? First, we tried taking a GLSurfaceView, creating our own Renderer and drawing the card using OpenGL ES. But this approach has some drawbacks:

  • Not all mobile developers are familiar with OpenGL, which means more time spent on training and problems with maintainability
  • Android already supports OpenGL ES 3.1, but the API is still painfully low-level. The result? Lots of boilerplate, mathematics and hair pulling

So we thought we’d find a better solution. A bit of searching led us to a few options:

  • min3d — a lightweight 3D library/framework for Android, written in Java on top of OpenGL ES and targeting compatibility with Android 1.5/OpenGL ES 1.0 and higher. min3d has a nicer API, but it was built in 2010 and is no longer supported
  • libGDX — a full Java game development framework. It provides a lot of features, but it's too large for a FinTech application that only wants to rotate a 3D card
  • Filament — a real-time physically based rendering engine for Android, iOS, Windows, Linux, macOS and WASM/WebGL. It provides a set of tools and APIs to help developers easily create high-quality 2D and 3D rendering. You can render incredible images with it, and we highly recommend having a play with it.

Filament is currently used in the Sceneform library, which, together with ARCore, helps render realistic 3D scenes in AR and non-AR apps without having to learn OpenGL. Let’s give it a try!


Sceneform supports 3D assets in the following formats: OBJ, FBX and glTF.

low poly card.obj in the preview of the Sceneform Android Studio plugin

To include our card model in the project, we need to link our assets and convert them to an .sfb file using the Sceneform Android Studio plugin.

As part of the assets, we should create our own material. A material defines the visual appearance of a surface; it’s similar to a shader.

Our .mat file defines the following properties:

  • baseColor — defines the perceived colour of an object
  • roughness — controls the perceived smoothness of the surface.
  • metallic — defines whether the surface is metallic or non-metallic
  • reflectance — this property can be used to control the specular intensity. It only affects non-metallic surfaces.
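The production .mat file isn’t reproduced in this post, but as a rough sketch, a Filament material exposing these four parameters could look something like this (the name, parameter layout and texture sampling are illustrative, not our actual material):

```
material {
    name : card,
    shadingModel : lit,
    parameters : [
        { type : sampler2d, name : baseColor },
        { type : sampler2d, name : roughness },
        { type : sampler2d, name : metallic },
        { type : float,     name : reflectance }
    ],
    requires : [ uv0 ]
}

fragment {
    void material(inout MaterialInputs material) {
        prepareMaterial(material);
        // Sample each PBR attribute from its UV-mapped texture
        material.baseColor   = texture(materialParams_baseColor, getUV0());
        material.roughness   = texture(materialParams_roughness, getUV0()).r;
        material.metallic    = texture(materialParams_metallic, getUV0()).r;
        material.reflectance = materialParams.reflectance;
    }
}
```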

Having defined this, we created textures for the UV mapping of each attribute. You can see one of them below:

diffuse texture

To create each texture, we used Texture.builder(), passing the source of the asset and a usage type — one of the Texture.Usage constants: COLOR, NORMAL or DATA.

internal fun Context.loadTexture(
    sourceUri: Uri,
    usage: Texture.Usage
): Texture.Builder =
    Texture.builder()
        .setSource(this, sourceUri)
        .setUsage(usage)
        .setSampler(
            Texture.Sampler.builder()
                .setMagFilter(Texture.Sampler.MagFilter.LINEAR)
                .setMinFilter(Texture.Sampler.MinFilter.LINEAR_MIPMAP_LINEAR)
                .build()
        )

Next we can collect all of the necessary textures and apply them to our loaded card model:

val cardTextures = availableTextures.map { texture ->
    loadTexture(texture.path, texture.usage)
}

ModelRenderable.builder()
    .setSource(context, Uri.parse(MODEL_SFB_PATH))
    .build()
    .thenApply { model ->
        cardTextures.forEach { result -> model.material.setTexture(result.name, result.texture) }
    }

And that’s it! Now we are ready to build our own scene.

Define the layout.xml:

<com.google.ar.sceneform.SceneView
    android:id="@+id/sceneView"
    android:layout_width="match_parent"
    android:layout_height="match_parent" />

And add the card node to the existing scene:

private val card3dNode = Node().apply {
    localPosition = Vector3(CARD_POSITION_X_AXIS, CARD_POSITION_Y_AXIS, CARD_POSITION_Z_AXIS)
    localRotation = getRotationQuaternion(CARD_STARTING_Y_AXIS_ANGLE.toFloat())
    name = CARD_ID
}

fun addCardToScene(modelRenderable: ModelRenderable, currentCard: CardRender) {
    modelRenderable.material = currentCard.value
    with(card3dNode) {
        setParent(sceneView.scene)
        renderable = modelRenderable
        localScale = modelRenderable.computeScaleVector(targetSize = 1.5f)
        currentCard.renderCard()
    }
    with(sceneView.scene) {
        camera.localScale = Vector3(CAMERA_SCALE_WIDTH, CAMERA_SCALE_HEIGHT, CAMERA_FOCAL_LENGTH)
        camera.localPosition = Vector3(CAMERA_POSITION_X_AXIS, CAMERA_POSITION_Y_AXIS, CAMERA_POSITION_Z_AXIS)
        sunlight?.let {
            it.worldPosition = Vector3.back()
            it.light = cardSceneSunLight
        }
        addChild(card3dNode)
    }
}
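Note that computeScaleVector isn’t part of the Sceneform API; it’s a project-specific helper that isn’t shown in this post. As a plausible pure-Kotlin sketch of the idea — scale the model uniformly so its largest bounding-box extent matches the target size (Vec3 and the extent values are illustrative):

```kotlin
// Illustrative stand-in for a 3D vector; Sceneform's own Vector3 has the same fields.
data class Vec3(val x: Float, val y: Float, val z: Float)

// Uniform scale so the largest extent of the model's bounding box equals targetSize.
fun computeScaleVector(extents: Vec3, targetSize: Float): Vec3 {
    val maxExtent = maxOf(extents.x, extents.y, extents.z)
    val factor = targetSize / maxExtent
    return Vec3(factor, factor, factor)
}

fun main() {
    // A credit-card-like bounding box (arbitrary units); factor = 1.5 / 0.856 ≈ 1.75
    println(computeScaleVector(Vec3(0.856f, 0.54f, 0.03f), targetSize = 1.5f))
}
```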

Virtual Cards

virtual cards

For virtual cards, we wanted to achieve the appearance of transparency. For this, we needed to create a custom material:

virtual_card.mat

The most interesting part here is blending. The transparent mode means the material’s output is alpha-composited with the render target, using the Porter-Duff source-over rule. This blending mode assumes premultiplied alpha.
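The virtual card’s material file isn’t reproduced here either; the key difference from the opaque card is the blending declaration in the material block. A sketch (names and parameters illustrative, not the production file):

```
material {
    name : virtual_card,
    shadingModel : lit,
    // Alpha-composite the output with the render target (source-over)
    blending : transparent,
    parameters : [
        { type : sampler2d, name : baseColor }
    ],
    requires : [ uv0 ]
}
```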

As you may have noticed on the disposable cards, the card number (or PAN) has a number-changing animation. For this trick, we change the diffuse texture of the card every second. There are three of them.
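The cycling logic behind that trick can be sketched in a few lines. This is a hypothetical, self-contained version — the textures here are plain strings standing in for the three loaded diffuse Textures, and the ticking itself would be driven by a one-second timer in the app:

```kotlin
// Cycle through a fixed set of pre-rendered diffuse textures, one step per tick.
class PanTextureCycler(private val textures: List<String>) {
    private var index = 0

    // Called once per second by the animation timer; returns the texture to apply next.
    fun nextTexture(): String {
        val texture = textures[index]
        index = (index + 1) % textures.size
        return texture
    }
}

fun main() {
    val cycler = PanTextureCycler(listOf("pan_1", "pan_2", "pan_3"))
    repeat(4) { println(cycler.nextTexture()) } // pan_1, pan_2, pan_3, pan_1
}
```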


Nothing is ever perfect, so we faced some problems and restrictions:

  • Sceneform requires minSdkVersion ≥ 24, because models are loaded with CompletableFuture.
  • Dynamic texturing: material.setTexture doesn’t allow changing textures at runtime. The working solution is to create a fake object and copy its material to the real object
  • Until v1.8 of the SDK there was no way to set a white background colour. We got around this with an additional node and custom materials.

Animation

As you may have noticed, the card has a physics-based animation. Android lets you do this via the support library:

compile "com.android.support:support-dynamic-animation:28.0.0"

The FlingAnimation class lets you create a fling animation for an object. To build a fling animation, create an instance of the FlingAnimation class and provide an object and the object's property that you want to animate.

abstract class CardProperty(name: String) : FloatPropertyCompat<Node>(name)

private val rotationProperty: CardProperty = object : CardProperty("rotation") {
    override fun setValue(card: Node, value: Float) {
        card.localRotation = getRotationQuaternion(value)
    }

    override fun getValue(card: Node): Float = card.localRotation.y
}

private var animation: FlingAnimation = FlingAnimation(card3dNode, rotationProperty).apply {
    friction = FLING_ANIMATION_FRICTION
    minimumVisibleChange = DynamicAnimation.MIN_VISIBLE_CHANGE_ROTATION_DEGREES
}

In the fling gesture detector, we start the animation in onFling without any update listener. Just set the velocity and you’re away.

class FlingGestureDetector : GestureDetector.SimpleOnGestureListener() {

    override fun onScroll(e1: MotionEvent, e2: MotionEvent, distanceX: Float, distanceY: Float): Boolean {
        val deltaX = -(distanceX / screenDensity) / CARD_ROTATION_FRICTION
        card3dNode.localRotation = getRotationQuaternion(lastDeltaYAxisAngle + deltaX)
        return true
    }

    override fun onFling(e1: MotionEvent, e2: MotionEvent, velocityX: Float, velocityY: Float): Boolean {
        if (Math.abs(velocityX) > SWIPE_THRESHOLD_VELOCITY) {
            val deltaVelocity = (velocityX / screenDensity) / CARD_ROTATION_FRICTION
            startAnimation(deltaVelocity)
        }
        return true
    }
}

private fun startAnimation(velocity: Float) {
    if (!animation.isRunning) {
        animation.setStartVelocity(velocity)
        animation.setStartValue(lastDeltaYAxisAngle)
        animation.start()
    }
}

For card rotation we used the localRotation property, which takes a quaternion. Sceneform has a static method that uses the axis-angle representation to calculate a quaternion from an angle and the desired axis vector. In our case that’s Vector3(0.0f, 1.0f, 0.0f).

But this creates a new object on every frame of the animation, so we reimplemented the method to reuse an existing quaternion and vector:

private val quaternion = Quaternion()
private val rotateVector = Vector3.up()

private fun getRotationQuaternion(deltaYAxisAngle: Float): Quaternion {
    lastDeltaYAxisAngle = deltaYAxisAngle
    return quaternion.apply {
        val arc = toRadians(deltaYAxisAngle.toDouble())
        val axis = sin(arc / 2.0)
        x = (rotateVector.x * axis).toFloat()
        y = (rotateVector.y * axis).toFloat()
        z = (rotateVector.z * axis).toFloat()
        w = cos(arc / 2.0).toFloat()
        normalize()
    }
}
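To sanity-check the maths outside of Android, here’s a self-contained version of the same axis-angle conversion. Quat is a hypothetical stand-in for Sceneform’s Quaternion (the real class has the same x, y, z, w fields):

```kotlin
import kotlin.math.cos
import kotlin.math.sin
import kotlin.math.sqrt

// Minimal stand-in for Sceneform's Quaternion, just for demonstrating the maths.
data class Quat(var x: Float = 0f, var y: Float = 0f, var z: Float = 0f, var w: Float = 1f) {
    fun normalize() {
        val n = sqrt(x * x + y * y + z * z + w * w)
        x /= n; y /= n; z /= n; w /= n
    }
}

// Axis-angle to quaternion for a rotation about the Y axis (0, 1, 0),
// mirroring the hand-rolled version above.
fun yAxisRotation(angleDegrees: Float): Quat {
    val arc = Math.toRadians(angleDegrees.toDouble())
    val halfSin = sin(arc / 2.0)
    return Quat(
        x = 0f,                // axis.x * sin(θ/2), axis = (0, 1, 0)
        y = halfSin.toFloat(), // axis.y * sin(θ/2)
        z = 0f,                // axis.z * sin(θ/2)
        w = cos(arc / 2.0).toFloat()
    ).also { it.normalize() }
}

fun main() {
    val q = yAxisRotation(90f)
    // For a 90° rotation about Y, both y and w equal sin(45°) = cos(45°) ≈ 0.7071
    println(q)
}
```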

Conclusion

Sceneform is a pretty fresh library, but it already offers a broad set of features: optimised rendering, a powerful API and a small runtime. All of this helped us implement 3D quickly without having to learn OpenGL.


Kudos to all who took part in this challenge, especially:

Denis Kovalev, incredible with UI/UX
Dmitry Kovalev, inspiring. He created the 3D model and textures
George Robson, genius owner of the Premium team.
Ilia Kisliakovskii, one of our back-end heroes
Mikhail Koltsov and Igor Dudenkov, our beloved iOS guys


We’re hiring👨‍💻👩‍💻