How to Integrate Google Vision API With React Native and Expo

Yunice Xiao · Published in The Startup · Sep 10, 2020

To help you learn how to use an optical character recognition (OCR) API, we will build a simple bill/receipt scanning application. There are plenty of OCR technologies out there, such as Tesseract and Google Cloud Vision. However, when I was researching on my own, I found that some OCR tools only support web applications, or only support React Native without Expo, or the available resources are too outdated to be useful. So I decided to write this article to guide those who need to integrate OCR specifically into a React Native app built on Expo.

Here are the technologies we will use:

  • Firebase (for photo storage only, you can always have another database for other purposes)
  • Google Cloud Vision API (easy to get started with, and it supports both React Native and Expo)
  • Expo
  • React Native

There are three steps in this tutorial:

  1. Set up your Firebase project
  2. Set up your Google Cloud Vision API
  3. Build the app

You can find a video demo of the scanner at the end of this article.

Set up your Firebase project:

  1. Go to Firebase and sign in with your Google account
  2. Click “Go to console” at the top right corner:

3. Create a project and enter your project info:

4. After you create the project, you will be redirected to the project dashboard. Choose the app you want to create, in our case, “Web”:

5. Go to your “project settings”:

6. Find the following information in your project settings:

7. Put these keys in a secret.js file, because we don’t want to expose them. Also add secret.js to your .gitignore if you want to put your app on GitHub.

export const FIREBASE_API_KEY = 'XXX'
export const FIREBASE_DATABASE_URL = 'XXX'
export const FIREBASE_PROJECT_ID = 'XXX'
export const FIREBASE_MESSAGING_SENDER_ID = 'XXX'
export const FIREBASE_AUTH_DOMAIN = 'XXX'
export const FIREBASE_STORAGE_BUCKET = 'XXX'
// You will fill this one in during the "Set up your Google Vision API" section below
export const GOOGLE_CLOUD_VISION_API_KEY = 'XXX'
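The matching .gitignore entry is a single line:

# .gitignore
secret.js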

8. Create your storage database with the default rules (we will change them later)

9. Go to “Rules”, copy and paste the following, then publish. It takes a few minutes for the new rules to propagate. The following rules allow unauthenticated clients to upload photos to Firebase Storage:

service firebase.storage {
  match /b/{bucket}/o {
    match /{allPaths=**} {
      allow read, write: if request.auth == null;
    }
  }
}
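These rules are wide open on purpose to keep the demo simple. If you later add Firebase Authentication, a stricter variant that only lets signed-in users read and write would look like this (a sketch; adapt the condition to your own auth setup):

service firebase.storage {
  match /b/{bucket}/o {
    match /{allPaths=**} {
      allow read, write: if request.auth != null;
    }
  }
}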

Set up your Google Vision API:

  1. Go to the Google Cloud Platform console and search for the Cloud Vision API:

2. Create a project and then enable the Cloud Vision API:

3. Create credentials and get your API key. To enable the Vision API, you are required to add billing information to your Google Cloud Platform account, but as long as you don’t exceed the free tier limits, you won’t be charged. You can find the pricing details later in this article.
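To see where this key ends up: the app will POST images to the Vision API’s images:annotate endpoint, with the key passed as a query parameter. Here is a minimal sketch, assuming a detectText helper of my own naming and the Environment config object we will build in the next section:

// visionApi.js (sketch): send an image URL to Cloud Vision for text detection
import Environment from './config/environment';

async function detectText(imageUrl) {
  const response = await fetch(
    'https://vision.googleapis.com/v1/images:annotate?key=' +
      Environment['GOOGLE_CLOUD_VISION_API_KEY'],
    {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        requests: [
          {
            features: [{ type: 'TEXT_DETECTION' }],
            image: { source: { imageUri: imageUrl } },
          },
        ],
      }),
    }
  );
  return response.json(); // responses[0].fullTextAnnotation holds the OCR result
}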

Build the app:

Now that you’ve finished setting up, you can start building the app.

  1. Install firebase:
npm install --save firebase

2. Create a new folder called config, and under it create a new file environment.js that imports the keys from secret.js. If you are interested in a detailed explanation of the following code, you can find it here.

// environment.js
import Constants from 'expo-constants';
import {
  GOOGLE_CLOUD_VISION_API_KEY,
  FIREBASE_API_KEY,
  FIREBASE_AUTH_DOMAIN,
  FIREBASE_DATABASE_URL,
  FIREBASE_PROJECT_ID,
  FIREBASE_STORAGE_BUCKET,
  FIREBASE_MESSAGING_SENDER_ID
} from '../secret.js';

var environments = {
  staging: {
    FIREBASE_API_KEY,
    FIREBASE_AUTH_DOMAIN,
    FIREBASE_DATABASE_URL,
    FIREBASE_PROJECT_ID,
    FIREBASE_STORAGE_BUCKET,
    FIREBASE_MESSAGING_SENDER_ID,
    GOOGLE_CLOUD_VISION_API_KEY
  },
  production: {
    // Warning: This file still gets included in your native binary and is not a secure way to store secrets if you build for the app stores. Details: https://github.com/expo/expo/issues/83
  }
};

function getReleaseChannel() {
  let releaseChannel = Constants.manifest.releaseChannel;
  if (releaseChannel === undefined) {
    return 'staging';
  } else if (releaseChannel === 'staging') {
    return 'staging';
  } else {
    return 'staging';
  }
}

function getEnvironment(env) {
  console.log('Release Channel: ', getReleaseChannel());
  return environments[env];
}

var Environment = getEnvironment(getReleaseChannel());
export default Environment;
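For a quick sanity check, you can import this object anywhere and log a key. With no release channel set (local development), Environment resolves to the staging object:

// Anywhere in the app
import Environment from './config/environment';

console.log(Environment['FIREBASE_PROJECT_ID']); // should print your project id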

3. Create another new file firebase.js under the config folder:

// firebase.js
import * as firebase from 'firebase';
import Environment from './environment';

firebase.initializeApp({
  apiKey: Environment['FIREBASE_API_KEY'],
  authDomain: Environment['FIREBASE_AUTH_DOMAIN'],
  databaseURL: Environment['FIREBASE_DATABASE_URL'],
  projectId: Environment['FIREBASE_PROJECT_ID'],
  storageBucket: Environment['FIREBASE_STORAGE_BUCKET'],
  messagingSenderId: Environment['FIREBASE_MESSAGING_SENDER_ID']
});

export default firebase;

4. Create a new component Scanner.js and paste the code from here.

You can find an explanation of the above example here. One thing to note is that the uuid package used in this example might not be compatible with your Expo SDK. Therefore, I replaced uuid with the nanoid package:

Line 15:

//old
import uuid from 'uuid'
//new
import { nanoid } from 'nanoid/non-secure'

Line 275:

//old
.child(uuid.v4());
//new
.child(nanoid());

Don’t forget to install the nanoid package:

npm install --save nanoid
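With nanoid in place, the Firebase upload inside Scanner.js boils down to something like the following sketch (uploadImageAsync is my own name here; the linked component organizes this a bit differently):

// Sketch: upload a local image URI to Firebase Storage, return its download URL
import firebase from './config/firebase';
import { nanoid } from 'nanoid/non-secure';

async function uploadImageAsync(uri) {
  // Convert the local file URI into a Blob the Storage SDK accepts
  const response = await fetch(uri);
  const blob = await response.blob();

  // Store the photo under a random nanoid-generated filename
  const ref = firebase.storage().ref().child(nanoid());
  const snapshot = await ref.put(blob);

  return snapshot.ref.getDownloadURL();
}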

5. Each feature applied to an image is a billable unit. For example, if you apply Face Detection and Label Detection to the same image, you are billed for one unit of Label Detection and one unit of Face Detection. The first 1,000 units used each month are free (the pricing details can be found here). If you don’t want to make extra calls for features you don’t need, choose the features you want and comment out or delete the rest on lines 214 to 223 of the example App.js above.
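Since this is a receipt scanner, text is the only feature we actually need. A trimmed request body would look roughly like this (imageUrl stands in for the Firebase download URL):

// Sketch: only request the features you want to be billed for
const body = JSON.stringify({
  requests: [
    {
      features: [
        { type: 'TEXT_DETECTION' },
        // { type: 'LABEL_DETECTION', maxResults: 10 }, // each extra feature is a billable unit
        // { type: 'FACE_DETECTION', maxResults: 5 },
      ],
      image: { source: { imageUri: imageUrl } },
    },
  ],
});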

You can find a video demo of the scanner below; the component is slightly edited.

scanner demo

Credits/ Resources

Here are some additional resources that you may find helpful in your journey connecting these technologies.

Create a React Native Image Recognition App with Google Vision API: https://blog.jscrambler.com/create-a-react-native-image-recognition-app-with-google-vision-api/

Using Google Cloud Vision With Expo and React Native: https://medium.com/@mlapeter/using-google-cloud-vision-with-expo-and-react-native-7d18991da1dd

Demo: https://github.com/JscramblerBlog/google-vision-rn-demo
