Developing Flutter Plugin for Android and iOS

Safalshrestha · Published in codingmountain · 5 min read · Feb 1, 2024

A Flutter plugin is a package containing Dart code and a wrapper around native code. It provides a way to call platform-specific code from Flutter to access native features and functionality such as the camera, maps, payment gateways, and much more.

To learn how plugins work under the hood, check out Flutter Platform Channels and Flutter: Communicating with the Native Platform.

Developing Flutter Plugin

In this blog, we will learn to create a Flutter Plugin that will transcribe the audio spoken by the user.

To help us with some of the boilerplate while creating a plugin, we will use very_good_cli.

First, install very_good_cli by activating it globally with dart pub global activate very_good_cli, then run:

very_good create flutter_plugin audio_transcribe --desc "My new Flutter plugin" --platforms android,ios

While it’s entirely possible to create a plugin without this tool, very_good_cli offers convenience by generating boilerplate code.

After running the command above, four folders are created inside audio_transcribe. You have generated a federated plugin. Federated plugins are a way of splitting support for different platforms into separate packages. This modular approach enhances the manageability and organization of your plugin.
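Assuming the default very_good_cli template, the generated layout looks roughly like this (the package names are derived from the plugin name you passed to the command):

```
audio_transcribe/
├── audio_transcribe/                     # app-facing package that Flutter apps depend on
├── audio_transcribe_android/             # Android implementation (Kotlin)
├── audio_transcribe_ios/                 # iOS implementation (Swift)
└── audio_transcribe_platform_interface/  # shared platform interface
```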

Now open the audio_transcribe_platform_interface package in your editor and look at audio_transcribe_platform_interface.dart. You will see an abstract class extending PlatformInterface, the base class for platform interfaces of federated Flutter plugins.

Now let’s add our methods. To transcribe, we need methods to start listening, get the transcribed text, and stop listening.

abstract class AudioTranscribePlatform extends PlatformInterface {
  /// Constructs an AudioTranscribePlatform.
  AudioTranscribePlatform() : super(token: _token);

  static final Object _token = Object();

  static AudioTranscribePlatform _instance = MethodChannelAudioTranscribe();

  /// The default instance of [AudioTranscribePlatform] to use.
  ///
  /// Defaults to [MethodChannelAudioTranscribe].
  static AudioTranscribePlatform get instance => _instance;

  /// Platform-specific plugins should set this with their own platform-specific
  /// class that extends [AudioTranscribePlatform] when they register themselves.
  static set instance(AudioTranscribePlatform instance) {
    PlatformInterface.verify(instance, _token);
    _instance = instance;
  }

  /// Returns the current platform name.
  Future<String?> getPlatformName();

  /// Returns the text transcribed so far.
  Future<String?> getAudioText();

  /// Starts listening to the user's speech.
  Future<void> startListening();

  /// Stops listening.
  Future<void> stopListening();
}

Now open audio_transcribe_ios and audio_transcribe_android as well. You will see errors in the classes that extend AudioTranscribePlatform, since they don't yet implement the new methods. Add the following implementations to every class that extends AudioTranscribePlatform.

@override
Future<String?> getPlatformName() {
  return methodChannel.invokeMethod<String>('getPlatformName');
}

@override
Future<String?> getAudioText() {
  return methodChannel.invokeMethod<String>('getAudioText');
}

@override
Future<void> startListening() async {
  return methodChannel.invokeMethod<void>('startListening');
}

@override
Future<void> stopListening() async {
  return methodChannel.invokeMethod<void>('stopListening');
}

Here, we are simply calling the native API through the platform channel. To learn more, check out Flutter Platform Channels.
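For completeness, the app-facing package (audio_transcribe) exposes a plain class that delegates to the platform interface. A minimal sketch, assuming the naming produced by the very_good_cli template (the import path and class name follow its conventions):

```dart
// audio_transcribe/lib/audio_transcribe.dart (sketch)
import 'package:audio_transcribe_platform_interface/audio_transcribe_platform_interface.dart';

class AudioTranscribe {
  /// Returns the name of the underlying platform.
  Future<String?> getPlatformName() =>
      AudioTranscribePlatform.instance.getPlatformName();

  /// Starts the native speech recognizer.
  Future<void> startListening() =>
      AudioTranscribePlatform.instance.startListening();

  /// Stops listening.
  Future<void> stopListening() =>
      AudioTranscribePlatform.instance.stopListening();

  /// Returns the text transcribed so far.
  Future<String?> getAudioText() =>
      AudioTranscribePlatform.instance.getAudioText();
}
```

This keeps apps decoupled from any specific platform implementation: they call AudioTranscribe, and the registered platform package does the work.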

Now, let’s jump to the native side. First, we will implement functionality in Android followed by iOS.

Android

Open the android folder inside audio_transcribe_android in Android Studio.

Now open AudioTranscribePlugin.kt, and you will see unresolved references. The class implements FlutterPlugin, the interface that allows a plugin to interact with the host platform, and MethodCallHandler, a handler of incoming method calls.
Since Android Studio doesn't recognize these types, we cannot continue further development comfortably.

Sure, technically, we can write code without the magic of a code editor. But let’s be real: who wants to traverse the coding wilderness without the reassuring companionship of autocomplete, intelligent suggestions, and syntax highlighting?

To make the editor recognize these interfaces, go to the Project view in Android Studio and create a directory named tempLibs.

Now find the flutter.jar file in flutter/bin/cache/artifacts/engine/android-x64 and copy it to tempLibs. You can find the location of your Flutter installation by running where flutter (Windows) or which flutter (macOS/Linux) in the terminal.

Now add compileOnly files('tempLibs/flutter.jar') inside dependencies in the build.gradle file. You can also add other dependencies that you might require. Your dependencies block should look like this in build.gradle.

dependencies {
  implementation 'androidx.appcompat:appcompat:1.6.1' // Added for asking permission
  compileOnly files('tempLibs/flutter.jar')
}

After that, sync the project and you are ready to develop the plugin for Android.

Compose your code just as you would for any native Android project, and call it from onMethodCall, which you get by implementing MethodCallHandler; the Flutter side invokes it through the MethodChannel.
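The dispatch inside onMethodCall typically looks like this. A sketch, where startSpeechRecognizer, stopSpeechRecognizer, and lastTranscription are hypothetical helpers you would implement on top of android.speech.SpeechRecognizer:

```kotlin
// Inside AudioTranscribePlugin (sketch); requires:
// import io.flutter.plugin.common.MethodCall
// import io.flutter.plugin.common.MethodChannel

override fun onMethodCall(call: MethodCall, result: MethodChannel.Result) {
  when (call.method) {
    "getPlatformName" -> result.success("Android")
    "startListening" -> {
      startSpeechRecognizer() // hypothetical helper wrapping android.speech.SpeechRecognizer
      result.success(null)
    }
    "stopListening" -> {
      stopSpeechRecognizer() // hypothetical helper
      result.success(null)
    }
    "getAudioText" -> result.success(lastTranscription) // hypothetical field holding the latest result
    else -> result.notImplemented() // unknown method name from the Dart side
  }
}
```

The string keys must match the names passed to methodChannel.invokeMethod on the Dart side exactly.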

Here, I have edited AudioTranscribePlugin.kt to transcribe the audio.

Don’t change the name of the class; the plugin will stop working. If you really want to rename it, update pluginClass: in the pubspec.yaml file of audio_transcribe_android as well, since it is used to generate the plugin registry.

Some other interfaces that you might need are:

  1. PluginRegistry.ActivityResultListener: Delegate interface for handling activity results on behalf of the main Activity.
  2. ActivityAware: FlutterPlugin that is interested in Activity lifecycle events related to a FlutterEngine running within the given Activity.
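If your plugin needs an Activity (for example, to request the RECORD_AUDIO permission), implementing ActivityAware lets you capture a reference to it. A minimal sketch, with the FlutterPlugin and MethodCallHandler members elided:

```kotlin
// Sketch; requires:
// import android.app.Activity
// import io.flutter.embedding.engine.plugins.activity.ActivityAware
// import io.flutter.embedding.engine.plugins.activity.ActivityPluginBinding

class AudioTranscribePlugin : FlutterPlugin, MethodCallHandler, ActivityAware {
  private var activity: Activity? = null

  override fun onAttachedToActivity(binding: ActivityPluginBinding) {
    activity = binding.activity // keep a reference for permission requests
  }

  override fun onDetachedFromActivityForConfigChanges() {
    activity = null
  }

  override fun onReattachedToActivityForConfigChanges(binding: ActivityPluginBinding) {
    activity = binding.activity
  }

  override fun onDetachedFromActivity() {
    activity = null
  }

  // ... FlutterPlugin and MethodCallHandler overrides elided ...
}
```

Always clear the reference in the detach callbacks to avoid leaking the Activity.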

You can view the code for Android here.

iOS

Open audio_transcribe_ios in the code editor, where our iOS Plugin code resides. However, you can’t open it in Xcode since it’s not an Xcode project.

Now open the ios folder, found inside the example folder of audio_transcribe, in Xcode.

Now drag AudioTranscribePlugin.swift into Xcode.

If you see an error like Cannot find type 'FlutterPlugin' in scope, run pod install and pod update from the terminal in the ios folder.

Customize the code within AudioTranscribePlugin.swift as needed for your iOS plugin, and call it from the Flutter side using the method channel; incoming calls are handled in the handle function. I've made the necessary edits to AudioTranscribePlugin.swift to enable audio transcription.
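On the iOS side, the equivalent dispatch lives in handle(_:result:). A sketch, where startRecognizer, stopRecognizer, and lastTranscription are hypothetical helpers you would implement on top of SFSpeechRecognizer:

```swift
// Inside AudioTranscribePlugin (sketch)
public func handle(_ call: FlutterMethodCall, result: @escaping FlutterResult) {
  switch call.method {
  case "getPlatformName":
    result("iOS")
  case "startListening":
    startRecognizer() // hypothetical helper wrapping SFSpeechRecognizer
    result(nil)
  case "stopListening":
    stopRecognizer() // hypothetical helper
    result(nil)
  case "getAudioText":
    result(lastTranscription) // hypothetical property holding the latest result
  default:
    result(FlutterMethodNotImplemented) // unknown method name from the Dart side
  }
}
```

As on Android, the case strings must match the method names used by methodChannel.invokeMethod in Dart.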

You can view the code for iOS here.

I've reached the section in my write-up where you're now equipped to develop your own plugin. A foundational understanding of both Android and iOS is crucial for navigating the intricacies of the code ahead. If you're interested in examining the code firsthand, feel free to check it out on my GitHub.

With this, we have learned to create a Flutter plugin for Android and iOS. Let's see the final result of the audio_transcribe plugin.
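From an app that depends on the plugin, usage boils down to a few calls. A sketch, assuming the app-facing AudioTranscribe class forwards each method to AudioTranscribePlatform.instance (widget and permission plumbing omitted):

```dart
// Sketch of app-side usage.
final transcriber = AudioTranscribe();

Future<void> transcribeOnce() async {
  await transcriber.startListening();
  // Give the user a few seconds to speak.
  await Future<void>.delayed(const Duration(seconds: 5));
  await transcriber.stopListening();
  final text = await transcriber.getAudioText();
  print('Transcribed: $text');
}
```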
