IBM Watson Machine Learning: Tone Analyzer

Watson is a question-answering computer system capable of answering questions posed in natural language, developed in IBM’s DeepQA project by a research team led by principal investigator David Ferrucci. Watson was named after IBM’s first CEO, industrialist Thomas J. Watson. The computer system was specifically developed to answer questions on the quiz show Jeopardy! and, in 2011, it competed on Jeopardy! against former winners Brad Rutter and Ken Jennings, winning the first-place prize of $1 million. — Wikipedia

“The computer’s techniques for unraveling Jeopardy! clues sounded just like mine. That machine zeroes in on key words in a clue, then combs its memory (in Watson’s case, a 15-terabyte data bank of human knowledge) for clusters of associations with those words. It rigorously checks the top hits against all the contextual information it can muster: the category name; the kind of answer being sought; the time, place, and gender hinted at in the clue; and so on. And when it feels “sure” enough, it decides to buzz. This is all an instant, intuitive process for a human Jeopardy! player, but I felt convinced that under the hood my brain was doing more or less the same thing.” — Ken Jennings

In this series on IBM Watson Machine Learning, we will learn how to use IBM’s services in an Android app.

Today we will start with the first tutorial, about Tone Analyzer, a service that can identify the emotions in your text.

Prerequisites

1. Create Watson Service

Watson’s services can be accessed only through the IBM Bluemix cloud platform, so first you need to log in to the Bluemix console.

Choose Watson → Tone Analyzer and click Create.

Open the Service credentials tab.

Save the contents of the JSON document to a file on your computer. I suggest you name the file credentials.json and place it in your project’s res/raw directory, since the code in this tutorial reads it via R.raw.credentials.
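The saved file contains the service endpoint and your credentials. Its structure looks roughly like this (the values below are placeholders, not real credentials):

```json
{
  "url": "https://gateway.watsonplatform.net/tone-analyzer/api",
  "username": "your-service-username",
  "password": "your-service-password"
}
```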

2. Android Project

We will create new project with basic activity in Android Studio.

We will add the Watson Java SDK to our project. Open your module-level build.gradle file and add the dependency:

compile 'com.ibm.watson.developer_cloud:java-sdk:3.7.2'

Also add the libraries for Material Design and Commons IO:

compile 'commons-io:commons-io:2.5'
compile 'com.android.support:design:23.4.0'
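Taken together, the dependencies block of your module-level build.gradle would look something like this sketch (any other dependencies your project already declares stay alongside these):

```groovy
dependencies {
    compile 'com.ibm.watson.developer_cloud:java-sdk:3.7.2'
    compile 'commons-io:commons-io:2.5'
    compile 'com.android.support:design:23.4.0'
}
```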

Finally, add the INTERNET permission to your AndroidManifest.xml file:

<uses-permission android:name="android.permission.INTERNET"/>

Add the following code to your activity’s layout XML file:

<?xml version="1.0" encoding="utf-8"?>
<android.support.constraint.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:padding="@dimen/fab_margin"
    tools:context="com.thientvse.toneanalyzer.Main2Activity">

    <android.support.design.widget.TextInputLayout
        android:id="@+id/container"
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        app:layout_constraintTop_toTopOf="parent">

        <EditText
            android:id="@+id/user_input"
            android:layout_width="match_parent"
            android:layout_height="wrap_content"
            android:hint="Say something"
            android:lines="3"/>
    </android.support.design.widget.TextInputLayout>

    <Button
        android:id="@+id/analyze_button"
        android:text="Analyze Tone"
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:layout_marginTop="20dp"
        app:layout_constraintTop_toBottomOf="@+id/container" />

</android.support.constraint.ConstraintLayout>

In MainActivity.java, we can configure the Tone Analyzer:

final ToneAnalyzer toneAnalyzer = new ToneAnalyzer("2017-07-01");
try {
    // Convert the raw resource file into a JSON object
    JSONObject credentials = new JSONObject(IOUtils.toString(
            getResources().openRawResource(R.raw.credentials), "UTF-8"
    ));

    // Extract the two values
    String username = credentials.getString("username");
    String password = credentials.getString("password");

    toneAnalyzer.setUsernameAndPassword(username, password);
} catch (IOException e) {
    e.printStackTrace(); // failed to read credentials.json
} catch (JSONException e) {
    e.printStackTrace(); // credentials.json is not valid JSON
}

Using Tone Analyzer:

private void analyzeText() {

    // Configure the request: we want only the emotion tones,
    // and the input is plain text rather than HTML
    ToneOptions options = new ToneOptions.Builder()
            .addTone(Tone.EMOTION)
            .html(false)
            .build();

    textToAnalyze = userInput.getText().toString();

    // Run the request asynchronously so the UI thread is not blocked
    toneAnalyzer.getTone(textToAnalyze, options).enqueue(
            new ServiceCallback<ToneAnalysis>() {
                @Override
                public void onResponse(ToneAnalysis response) {
                    // The first tone category is the emotion tone,
                    // because we requested only Tone.EMOTION above
                    List<ToneScore> scores = response.getDocumentTone()
                            .getTones()
                            .get(0)
                            .getTones();

                    // Keep only the emotions Watson is reasonably confident about
                    String detectedTones = "";
                    for (ToneScore score : scores) {
                        if (score.getScore() > 0.5f) {
                            detectedTones += score.getName() + " ";
                        }
                    }

                    final String toastMessage =
                            "The following emotions were detected:\n\n"
                                    + detectedTones.toUpperCase();

                    // The callback runs on a worker thread,
                    // so switch back to the UI thread to show the toast
                    runOnUiThread(new Runnable() {
                        @Override
                        public void run() {
                            Toast.makeText(getBaseContext(),
                                    toastMessage, Toast.LENGTH_LONG).show();
                        }
                    });
                }

                @Override
                public void onFailure(Exception e) {
                    e.printStackTrace();
                }
            });
}
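The filtering step above keeps only tones whose score exceeds 0.5 and joins their names into one uppercase string. Here is a minimal, self-contained sketch of that logic you can run outside Android; the `Score` class is a hypothetical stand-in for the SDK’s `ToneScore`, not part of the Watson SDK:

```java
import java.util.Arrays;
import java.util.List;

public class ToneFilterDemo {

    // Hypothetical stand-in for the SDK's ToneScore, for illustration only
    static class Score {
        final String name;
        final double value;

        Score(String name, double value) {
            this.name = name;
            this.value = value;
        }
    }

    // Keep only tones whose score passes the threshold, mirroring the loop above
    static String detectedTones(List<Score> scores, double threshold) {
        StringBuilder sb = new StringBuilder();
        for (Score s : scores) {
            if (s.value > threshold) {
                sb.append(s.name).append(" ");
            }
        }
        return sb.toString().trim().toUpperCase();
    }

    public static void main(String[] args) {
        List<Score> scores = Arrays.asList(
                new Score("joy", 0.82),
                new Score("sadness", 0.12),
                new Score("anger", 0.61));
        System.out.println(detectedTones(scores, 0.5)); // prints "JOY ANGER"
    }
}
```

The 0.5 cutoff comes straight from the tutorial code; Watson returns a score between 0 and 1 for each tone, so you can tighten or loosen the threshold to taste.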

If you run the app, type in a few sentences, and start the analysis, you’ll see Watson identify the emotions present in your text.

You can download the source code from GitHub.

Thank you very much for your time. And, until next time, have a great day!


Originally published at Code for fun.