I use Flutter for pet projects. It saves me time by building cross-platform UI (and, of course, with its awesome hot-reload feature). I recently finished one such project. The UI in the app is not interesting at all (tables, tables, and more tables). The tricky part was adding a 3D model to give users more context and detail. The model should be interactive: users can change its state, rotate it, and scale it. Flutter is a 2D framework with no 3D support. I hadn't (and still haven't) much experience with 3D, so I started searching for how to implement this feature.
My first idea was to use a game engine, e.g. Unity or Unreal Engine. But I didn't try one: Flutter has its own build system, and so do the game engines. As a result, the integration looked scary to me, especially since only one screen in my app required 3D graphics.
Then I considered OpenGL. What about writing C/C++ code and integrating it with both the iOS and Android apps? At first glance I saw no problems: C/C++ can be easily integrated on iOS, and there is some additional NDK magic on Android, but nothing serious. I chose this path.
I started with the very native components (and a naive approach): GLKView on iOS and GLSurfaceView on Android. Whenever needed, a UIViewController or Activity was presented from the Flutter side and the render was displayed there. I built a demo, and it worked really well. Then I ran into an obvious disadvantage: the overlay (buttons, labels, etc.) and gesture handlers had to be implemented twice. It wasn't a problem for the demo, but over time this part became more complex and brought some pain. Of course, for serious projects it isn't an issue; companies build whole mobile products twice, and it looks funny that building only one screen became a problem for me. But I am a very lazy developer (as a young adept of Flutter, I don't want to do the same things multiple times). I continued investigating.
And lucky me: the Texture widget was added to the Flutter framework just in time. The widget embeds backend textures in the Flutter view hierarchy, and according to the docs it can be used with OpenGL. Exactly the tool I was looking for: OpenGL plus "native" Flutter widgets and gestures. Thanks, Flutter team!
But. But. But. I found no details or examples. Flutter was (or even still is) a very young technology, so the lack of docs didn't scare (or surprise) me at all, and I gave the Texture widget a try. So how do you configure Texture to handle OpenGL?
Note: Texture works well on the Android emulator and on devices, but on iOS it works only on a real device, not in the iOS Simulator.
First of all, we need a Flutter plugin that interacts with native code. The plugin should have at least two essential methods:
- initialize, which returns the textureId from the platform-dependent part for the Texture widget;
- dispose, which cleans up resources.
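A minimal Dart sketch of such a plugin might look like this (the channel name, method names, and parameters are my own assumptions, not necessarily those of the demo project):

```dart
import 'package:flutter/services.dart';

// Hypothetical channel name; it must match the platform-side registration.
const MethodChannel _channel = MethodChannel('opengl_texture');

class OpenGLTexturePlugin {
  /// Asks the platform side to set up OpenGL and returns the textureId
  /// to pass into the Texture widget.
  static Future<int?> initialize(double width, double height) {
    return _channel.invokeMethod<int>('initialize', {
      'width': width,
      'height': height,
    });
  }

  /// Releases the native resources behind the given texture.
  static Future<void> dispose(int textureId) {
    return _channel.invokeMethod<void>('dispose', {'textureId': textureId});
  }
}
```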
Now we can integrate it into the Flutter app:
The widget should show a stub until the plugin returns the textureId.
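For example, a stateful widget can hold the textureId and show a placeholder while it is still null. A sketch, assuming a hypothetical `OpenGLTexturePlugin` Dart wrapper with the initialize/dispose methods described above:

```dart
import 'package:flutter/material.dart';

class OpenGLView extends StatefulWidget {
  const OpenGLView({super.key});

  @override
  State<OpenGLView> createState() => _OpenGLViewState();
}

class _OpenGLViewState extends State<OpenGLView> {
  int? _textureId;

  @override
  void initState() {
    super.initState();
    // Hypothetical plugin call returning the textureId from the platform side.
    OpenGLTexturePlugin.initialize(300, 300).then((id) {
      if (mounted) setState(() => _textureId = id);
    });
  }

  @override
  void dispose() {
    if (_textureId != null) OpenGLTexturePlugin.dispose(_textureId!);
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    // Show a stub until the platform side hands us the textureId.
    return _textureId == null
        ? const Center(child: CircularProgressIndicator())
        : Texture(textureId: _textureId!);
  }
}
```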
That's all that's required on the Dart side.
Let’s continue with Android.
The Android plugin has access to a TextureRegistry, which creates a TextureRegistry.SurfaceTextureEntry. This object provides a textureId (for the Texture widget) and a SurfaceTexture (to render into). The trickiest part was configuring the OpenGL stack to render into the given SurfaceTexture. After reading the code of GLSurfaceView, I found examples of how to do OpenGL on Android from scratch. So I passed the SurfaceTexture as the native window object to eglCreateWindowSurface, and it started to work:
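Roughly, the EGL setup looks like this. This is a sketch using the EGL14 API; the config attributes and names are my own choices, and it must run on the thread that will do the rendering:

```java
import android.graphics.SurfaceTexture;
import android.opengl.EGL14;
import android.opengl.EGLConfig;
import android.opengl.EGLContext;
import android.opengl.EGLDisplay;
import android.opengl.EGLSurface;

// Sketch: create an ES 2.0 context whose window surface is the
// SurfaceTexture obtained from Flutter's TextureRegistry.
void setupEGL(SurfaceTexture surfaceTexture, int width, int height) {
    EGLDisplay display = EGL14.eglGetDisplay(EGL14.EGL_DEFAULT_DISPLAY);
    int[] version = new int[2];
    EGL14.eglInitialize(display, version, 0, version, 1);

    int[] configAttribs = {
        EGL14.EGL_RED_SIZE, 8,
        EGL14.EGL_GREEN_SIZE, 8,
        EGL14.EGL_BLUE_SIZE, 8,
        EGL14.EGL_ALPHA_SIZE, 8,
        EGL14.EGL_DEPTH_SIZE, 16,
        EGL14.EGL_RENDERABLE_TYPE, EGL14.EGL_OPENGL_ES2_BIT,
        EGL14.EGL_NONE
    };
    EGLConfig[] configs = new EGLConfig[1];
    int[] numConfigs = new int[1];
    EGL14.eglChooseConfig(display, configAttribs, 0, configs, 0, 1,
            numConfigs, 0);

    int[] contextAttribs = {EGL14.EGL_CONTEXT_CLIENT_VERSION, 2, EGL14.EGL_NONE};
    EGLContext context = EGL14.eglCreateContext(
            display, configs[0], EGL14.EGL_NO_CONTEXT, contextAttribs, 0);

    // Without this, nothing but the clear color gets rendered (see below).
    surfaceTexture.setDefaultBufferSize(width, height);

    // The SurfaceTexture is accepted directly as the native window object.
    EGLSurface surface = EGL14.eglCreateWindowSurface(
            display, configs[0], surfaceTexture, new int[]{EGL14.EGL_NONE}, 0);
    EGL14.eglMakeCurrent(display, surface, surface, context);
}
```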
And a sample worker that performs the OpenGL drawing:
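The worker can be an ordinary thread that owns the EGL context and swaps buffers after each frame. A sketch (the EGL setup and the actual model drawing are omitted; `display` and `surface` stand for the objects produced by that setup):

```java
import android.opengl.EGL14;
import android.opengl.EGLDisplay;
import android.opengl.EGLSurface;
import android.opengl.GLES20;

// Sketch of a render thread. All EGL/GL calls stay on this thread,
// because an EGL context is current on a single thread at a time.
class RenderWorker extends Thread {
    private volatile boolean running = true;
    private EGLDisplay display;  // produced by the EGL setup (omitted)
    private EGLSurface surface;  // window surface backed by the SurfaceTexture

    @Override
    public void run() {
        // ... EGL display/context/surface setup and eglMakeCurrent ...
        while (running) {
            GLES20.glClearColor(0f, 0f, 0f, 1f);
            GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT
                    | GLES20.GL_DEPTH_BUFFER_BIT);
            // ... draw the model here ...
            // Swapping publishes the frame into the SurfaceTexture;
            // Flutter's Texture widget picks it up automatically.
            EGL14.eglSwapBuffers(display, surface);
        }
        // ... release the EGL resources ...
    }

    void requestStop() {
        running = false;
    }
}
```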
Then I connected everything in OpenGLTexturePlugin:
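The plugin side boils down to creating the SurfaceTextureEntry and handing its id back to Dart. A sketch of the method-call handler (`textures` is the TextureRegistry received at plugin registration; `startWorker` is a hypothetical helper that spins up the render thread):

```java
import android.graphics.SurfaceTexture;
import io.flutter.plugin.common.MethodCall;
import io.flutter.plugin.common.MethodChannel;
import io.flutter.view.TextureRegistry;

// Sketch of OpenGLTexturePlugin's method-call handling.
public void onMethodCall(MethodCall call, MethodChannel.Result result) {
    if (call.method.equals("initialize")) {
        TextureRegistry.SurfaceTextureEntry entry =
                textures.createSurfaceTexture();
        SurfaceTexture surfaceTexture = entry.surfaceTexture();
        Number width = call.argument("width");
        Number height = call.argument("height");
        // Without this call only the glClear color shows up (see below).
        surfaceTexture.setDefaultBufferSize(width.intValue(), height.intValue());
        startWorker(surfaceTexture);
        result.success(entry.id());
    } else if (call.method.equals("dispose")) {
        // Stop the worker, release EGL resources, then entry.release().
        result.success(null);
    } else {
        result.notImplemented();
    }
}
```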
I lost a whole day because of surfaceTexture.setDefaultBufferSize: nothing was rendered (only the glClear color) until the size was set.
The Android app is ready:
The iOS workflow is different from Android's. The plugin has to provide an implementation of FlutterTexture (to produce a CVPixelBufferRef pixel buffer with the rendered 3D model) and notify the FlutterTextureRegistry whenever a new frame is ready.
I found two options for getting the pixel buffer. The simplest is to use glReadPixels to fill the CVPixelBufferRef, but as I understand it, that copies the data. A more advanced option is to use a CVOpenGLESTextureCacheRef, which links a CVOpenGLESTextureRef to a CVPixelBufferRef and avoids the memory copy.
I created an example FlutterTexture object:
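Something along these lines (a sketch; it assumes the pixel buffer is filled elsewhere by the render code):

```objectivec
#import <Flutter/Flutter.h>
#import <CoreVideo/CoreVideo.h>

// Sketch of a FlutterTexture implementation that hands Flutter
// the most recently rendered pixel buffer.
@interface OpenGLTexture : NSObject <FlutterTexture>
@property(nonatomic) CVPixelBufferRef pixelBuffer;  // filled by the GL worker
@end

@implementation OpenGLTexture

- (CVPixelBufferRef _Nullable)copyPixelBuffer {
  CVPixelBufferRef buffer = self.pixelBuffer;
  if (buffer) {
    // Flutter releases the returned buffer, so hand it out retained.
    CVBufferRetain(buffer);
  }
  return buffer;
}

@end
```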
I added CVBufferRetain in copyPixelBuffer because Flutter invokes an additional release after copyPixelBuffer returns.
And a worker to do the OpenGL rendering:
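The interesting part is wiring the GL framebuffer to the CVPixelBufferRef through the texture cache, so rendered pixels land directly in the buffer without a copy. A sketch of that setup under my own naming, with error handling omitted:

```objectivec
#import <OpenGLES/EAGL.h>
#import <OpenGLES/ES2/gl.h>
#import <OpenGLES/ES2/glext.h>
#import <CoreVideo/CoreVideo.h>

// Sketch: create a CVPixelBufferRef backed by an IOSurface, wrap it in an
// OpenGL ES texture via the texture cache, and attach that texture to a
// framebuffer. Everything drawn into the framebuffer then ends up in the
// pixel buffer that copyPixelBuffer returns to Flutter.
static void setupRenderTarget(EAGLContext *context, size_t width,
                              size_t height, CVPixelBufferRef *pixelBufferOut) {
  CVOpenGLESTextureCacheRef textureCache;
  CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, context, NULL,
                               &textureCache);

  NSDictionary *attrs = @{
    (NSString *)kCVPixelBufferOpenGLESCompatibilityKey : @YES,
    (NSString *)kCVPixelBufferIOSurfacePropertiesKey : @{}
  };
  CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                      kCVPixelFormatType_32BGRA,
                      (__bridge CFDictionaryRef)attrs, pixelBufferOut);

  CVOpenGLESTextureRef texture;
  CVOpenGLESTextureCacheCreateTextureFromImage(
      kCFAllocatorDefault, textureCache, *pixelBufferOut, NULL, GL_TEXTURE_2D,
      GL_RGBA, (GLsizei)width, (GLsizei)height, GL_BGRA, GL_UNSIGNED_BYTE, 0,
      &texture);

  GLuint framebuffer;
  glGenFramebuffers(1, &framebuffer);
  glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);
  glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                         CVOpenGLESTextureGetTarget(texture),
                         CVOpenGLESTextureGetName(texture), 0);
  // ... render here, then notify Flutter that a new frame is ready ...
}
```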
And then I linked everything together in the plugin code:
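Registration with Flutter is then just a few calls. A sketch, where `_textures` is the FlutterTextureRegistry obtained from the plugin registrar and `glTexture` is a hypothetical object conforming to FlutterTexture:

```objectivec
#import <Flutter/Flutter.h>

// Register the FlutterTexture implementation; the returned id goes
// back to Dart as the result of "initialize".
int64_t textureId = [_textures registerTexture:glTexture];

// After rendering each frame into the pixel buffer:
[_textures textureFrameAvailable:textureId];

// And on "dispose":
[_textures unregisterTexture:textureId];
```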
The iOS app is ready too:
You can find the demo project on GitHub. On Android, the OpenGL part is done in Java to keep the example simple, without NDK noise. If you have any ideas or improvements, feel free to comment or create an issue/PR.
In the original project I shared the C/C++ OpenGL code between platforms and added only platform-dependent glue between the Flutter widget tree and OpenGL. That's really awesome.