Using Google Cloud Vision With Expo and React Native

Mike LaPeter
5 min read · Jan 2, 2019


I’ve been wanting to play around with Google Cloud Vision for a while, and finally got a chance to try it in a React Native app. I knew it was powerful, but it’s kinda like VR: you don’t really get it until you try it with your own hands. It took me a while to wade through all the configuration and setup, so here are my notes to hopefully save others some time. You can also try out the cloud vision demo on Expo or view the cloud vision React Native example on GitHub.

Here’s the simplest way I found to start playing with Google Cloud Vision inside a react native app:

Create a new React Native app using Expo:

If you’re not yet familiar with Expo, it’s an incredible set of tools that makes it easy to create and publish React Native apps. Follow the excellent Get Started With Expo instructions and select the “tabs” template option to create your React Native app.
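In concrete terms, the commands look something like this (expo-cli’s prompts change over time and the app name here is just a placeholder, so treat this as a sketch):

npm install -g expo-cli
expo init my-vision-app   # pick the "tabs" template when prompted
cd my-vision-app
expo start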

Configuration/Setup:

  • Set up a new Firebase project. For simplicity I temporarily made my Firebase data public; there’s more background about Firebase and Expo here (no need to follow all those steps).
  • Add the Firebase SDK to your Expo project: npm install --save firebase
  • Create a folder called config at the root of your application and create a file called environment.js in it. We’ll store all our API keys and env variables in environment.js and add it to .gitignore so it’s not included in our repo. Examples are pasted at the bottom of this post and also in the GitHub repo I created. Note: this is just personal preference; there are many other ways, and this only keeps your secrets out of source control, not out of the app binary.
  • Create a folder called utils and create a file in it called firebase.js. As above, an example is at the bottom of the post; this is just my personal preference for organizing the Firebase setup.
  • Sign up for Google Cloud Platform, get your API key, and add it to your environment.js file. This part actually took me a while: you need to add billing info and avoid getting lost in the huge number of APIs and authentication options on offer. You should hopefully end up at a url similar to https://console.cloud.google.com/apis/dashboard?project=your-project and be able to click “Credentials” to get your API key; a quick way to test the key is sketched right after this list.
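If you want to sanity-check the key before touching any app code, a one-off request from the command line works (YOUR_API_KEY and the image url are placeholders; substitute any publicly accessible image):

curl "https://vision.googleapis.com/v1/images:annotate?key=YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"requests":[{"features":[{"type":"LABEL_DETECTION","maxResults":3}],"image":{"source":{"imageUri":"https://example.com/some-image.jpg"}}}]}'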

Coding:

Now, we’re ready to actually code! The first step is getting our app to take a picture and upload it to our Firebase account. I found this incredibly helpful expo firebase storage upload example and started by deleting everything in the LinksScreen.js file and just pasting that bad boy in. Before going further, make sure you understand what that code does: you’re using the built-in Expo tools to access the device camera, then uploading the image to your Firebase project account.
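The heart of that example is roughly the following (a from-memory sketch, not a copy of the linked file: the uploadImageAsync name and the uuid package are my assumptions, and the exact storage API varies by Firebase SDK version):

import * as firebase from "firebase";
import uuid from "uuid"; // assumed here; any unique-filename scheme works

// Fetch the local image uri as a blob, then push it to Firebase Storage.
async function uploadImageAsync(uri) {
  const response = await fetch(uri);
  const blob = await response.blob();
  const ref = firebase.storage().ref().child(uuid.v4());
  const snapshot = await ref.put(blob);
  return snapshot.ref.getDownloadURL(); // older SDKs exposed snapshot.downloadURL
}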

Delete the import * as firebase from 'firebase'; on line 16 and instead import the two files we created:

import Environment from "../config/environment";
import firebase from "../utils/firebase";

Delete lines 20–31 (we already initialize Firebase in utils/firebase.js). At this point, you should be able to click the “Links” tab in your app, take a photo, and upload it to your Firebase account. If not, double-check that your API keys are correct and that you’re importing them correctly.

Now the fun part: let’s post that image to the Cloud Vision API and get back some slightly creepy Google magic. First add googleResponse: null to your initial state (around line 25); this is where we’ll store the data returned from the API.
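The initial state ends up looking something like this (image and uploading come from the upload example; only googleResponse is new):

state = {
  image: null,          // url of the uploaded photo, set by the upload example code
  uploading: false,     // drives the activity spinner
  googleResponse: null  // parsed response from the Cloud Vision API
};

Then add a button: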

<Button
  onPress={() => this.submitToGoogle()}
  title="Analyze!"
/>

And a submitToGoogle method that posts our image to the Cloud Vision API:

submitToGoogle = async () => {
  try {
    this.setState({ uploading: true });
    let { image } = this.state;
    // Ask for every feature type at once; trim this list down for real use.
    let body = JSON.stringify({
      requests: [
        {
          features: [
            { type: "LABEL_DETECTION", maxResults: 10 },
            { type: "LANDMARK_DETECTION", maxResults: 5 },
            { type: "FACE_DETECTION", maxResults: 5 },
            { type: "LOGO_DETECTION", maxResults: 5 },
            { type: "TEXT_DETECTION", maxResults: 5 },
            { type: "DOCUMENT_TEXT_DETECTION", maxResults: 5 },
            { type: "SAFE_SEARCH_DETECTION", maxResults: 5 },
            { type: "IMAGE_PROPERTIES", maxResults: 5 },
            { type: "CROP_HINTS", maxResults: 5 },
            { type: "WEB_DETECTION", maxResults: 5 }
          ],
          image: {
            source: {
              // Publicly accessible url of the image we uploaded to Firebase
              imageUri: image
            }
          }
        }
      ]
    });
    let response = await fetch(
      "https://vision.googleapis.com/v1/images:annotate?key=" +
        Environment["GOOGLE_CLOUD_VISION_API_KEY"],
      {
        headers: {
          Accept: "application/json",
          "Content-Type": "application/json"
        },
        method: "POST",
        body: body
      }
    );
    let responseJson = await response.json();
    console.log(responseJson);
    this.setState({
      googleResponse: responseJson,
      uploading: false
    });
  } catch (error) {
    console.log(error);
    // Don't leave the spinner stuck on if the request fails
    this.setState({ uploading: false });
  }
};
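For reference, the returned JSON looks roughly like this (heavily truncated; which fields appear depends on the features you request and on the image itself):

{
  "responses": [
    {
      "labelAnnotations": [
        { "mid": "/m/01yrx", "description": "Cat", "score": 0.98, "topicality": 0.98 }
      ],
      "safeSearchAnnotation": { ... },
      "webDetection": { ... }
    }
  ]
}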

You should now be seeing all the returned data in your logs, and you can also display it on the screen:

{this.state.googleResponse && (
  <Text
    onPress={this._copyToClipboard}
    onLongPress={this._share}
  >
    {JSON.stringify(this.state.googleResponse.responses)}
  </Text>
)}

While a huge blob of JSON is fun, if you want something more than a blank stare when showing it to a non-programmer, you can display a list of the returned labels:

{this.state.googleResponse && (
  <FlatList
    data={this.state.googleResponse.responses[0].labelAnnotations}
    extraData={this.state}
    keyExtractor={this._keyExtractor}
    renderItem={({ item }) => <Text>Item: {item.description}</Text>}
  />
)}
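That snippet assumes a _keyExtractor method exists on the component. A minimal version (my own sketch) leans on the mid field that label annotations carry, falling back to the list index:

_keyExtractor = (item, index) => item.mid || index.toString();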

That’s it! There’s no styling and it ain’t pretty, but you should now be able to take a picture and get back an insane amount of info from Google, such as the emotional state of your cat.

Check out the full example on GitHub for details: https://github.com/mlapeter/google-cloud-vision

DISCLAIMER: this is only meant as a quick and dirty example, and is neither secure nor meant for production. Keep your API keys secret, and make sure to disable public access to Firebase when you’re done; otherwise anyone can view and upload files to your account.
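As a concrete example, the stock Firebase Storage security rules that require an authenticated user look like this (adjust to however you handle auth; this isn’t part of the example repo):

service firebase.storage {
  match /b/{bucket}/o {
    match /{allPaths=**} {
      allow read, write: if request.auth != null;
    }
  }
}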

Credits/Resources

I found the following posts really helpful as I pulled everything together:

This post from @iammosespaulr helped get me started with the big picture and posting to the cloud vision api: https://blog.expo.io/how-i-built-my-first-react-native-app-using-expo-adc3f1bcd5e5

Excellent Expo firebase uploading example: https://github.com/expo/firebase-storage-upload-example/blob/master/App.js

This post from @wcandillon pointed me to that great example above: https://medium.com/@wcandillon/uploading-images-to-firebase-with-expo-a913c9f8e98d

This helps clarify a bit about storing env variables in expo apps: https://github.com/expo/expo/issues/83

Example files:

utils/firebase.js

import Environment from "../config/environment";
import * as firebase from "firebase";

firebase.initializeApp({
  apiKey: Environment["FIREBASE_API_KEY"],
  authDomain: Environment["FIREBASE_AUTH_DOMAIN"],
  databaseURL: Environment["FIREBASE_DATABASE_URL"],
  projectId: Environment["FIREBASE_PROJECT_ID"],
  storageBucket: Environment["FIREBASE_STORAGE_BUCKET"],
  messagingSenderId: Environment["FIREBASE_MESSAGING_SENDER_ID"]
});

export default firebase;

config/environment.js

Note: this is my setup for easily adding a future production environment. It’s overkill for now, but it makes it easier to add additional Expo release channels later.

import Expo from "expo"; // needed for Expo.Constants below

var environments = {
  staging: {
    FIREBASE_API_KEY: "blabla",
    FIREBASE_AUTH_DOMAIN: "blabla.firebaseapp.com",
    FIREBASE_DATABASE_URL: "https://blabla.firebaseio.com/",
    FIREBASE_PROJECT_ID: "blabla",
    FIREBASE_STORAGE_BUCKET: "blabla.appspot.com",
    FIREBASE_MESSAGING_SENDER_ID: "blabla",
    GOOGLE_CLOUD_VISION_API_KEY: "blabla"
  },
  production: {
    // Warning: this file still gets included in your native binary and is not a
    // secure way to store secrets if you build for the app stores. Details:
    // https://github.com/expo/expo/issues/83
  }
};

// Every channel maps to "staging" for now; add real cases here as you
// create more release channels.
function getReleaseChannel() {
  let releaseChannel = Expo.Constants.manifest.releaseChannel;
  if (releaseChannel === undefined) {
    return "staging";
  } else if (releaseChannel === "staging") {
    return "staging";
  } else {
    return "staging";
  }
}

function getEnvironment(env) {
  console.log("Release Channel: ", getReleaseChannel());
  return environments[env];
}

var Environment = getEnvironment(getReleaseChannel());

export default Environment;
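If you do add a production entry later, publishing to a specific release channel looks like this (channel names are up to you):

expo publish --release-channel production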
