Using the Watson AIOps REST Interface quick and dirty: a tutorial

Julius Wahidin
IBM Cloud Pak for AIOps
9 min read · Jul 15, 2022

A few weeks ago, at the start of a new project, a coworker asked me how to insert alerts into the Watson AIOps Alerts database from a CSV file. The customer gave us an export of their production alerts database, and we needed to inject them as test alerts into the environment we were setting up.

One way is to use the OMNIbus probes, such as the standard input probe or the socket probe, which means installing the probes and converting the CSV into the format expected by the probes.

Another way is to use an OMNIbus client, such as nco_sql or nco_config, which means installing the client and converting the CSV into SQL insert statements.

“Is there any other way, without changing the CSV file and without installing additional software?” he asked. Well, there is the OMNIbus REST interface. This alternative reminded me of work I did for another customer a while ago, who needed to query and update the OMNIbus alerts from an external application. Similarly, they did not want to install additional software in the external application’s environment; I suggested the REST interface at that time.

I searched the internet for an article to give to my coworker. However, I could not find one that he could use. So here we are; I am writing one.

This blog focuses on quickly working with the REST interface. We will use Python because it has handy modules for working with CSV, JSON and HTTP requests. We also concentrate on getting the task done without worrying too much about error checking and security setup, a dirty way of doing things, hence the title: quick and dirty.

One thing to note is that the Alerts database of Netcool/OMNIbus, Netcool Operations Insight, and Watson AIOps is the same, so we will use the names interchangeably.

OMNIbus REST Interface.

The OMNIbus REST interface is disabled by default. To enable it, we edit the ObjectServer properties file. Uncomment the following lines in $OMNIHOME/etc/<objectserver_name>.props:

# To enable the interface
NRestOS.Enable: TRUE
# For HTTP only
NHttpd.EnableHTTP : TRUE
NHttpd.ListeningPort : 8080

For containerized OMNIbus, we can insert the same configuration into the agg-p-props-append data element of the OMNIbus ConfigMap. The following Netcool Operations Insight documentation provides more information on how to do it: https://www.ibm.com/docs/en/noi/1.6.5?topic=maps-primary-netcoolomnibus-objectserver-configmap. Then, restart the OMNIbus process or pods for the settings to take effect.

Note that in the above configuration we have used the HTTP protocol. As a recommended practice, use the HTTPS protocol in a production environment for security reasons.

The REST “get” method.

After enabling the REST interface, the first thing we want to do is make sure it is working. And, of course, the safest test is to query the interface.
We can use curl, but just as easily we can run a query through the Python interpreter. So, we use the REST get method to return the contents of the alerts table. Let us start simple and get the whole table; we just want to check that the interface works. Yes, I know we should never query the whole table in a production environment. It is bad practice. This blog is a dirty tutorial, after all.

Let us use the Python interactive mode and type in a few commands. We use the Python requests module to make the REST call easy.

[juliusw@noi165 ~]$ python3
Python 3.9.10 (main, Feb 9 2022, 00:00:00)
[GCC 11.2.1 20220127 (Red Hat 11.2.1-9)] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import requests
>>> r = requests.get('http://noi165:8080/objectserver/restapi/alerts/status',auth=('netcool', 'netcool'))
>>> print(r.text)
{
"rowset": {
"osname": "NCOMS",
"dbname": "alerts",
"tblname": "status",
"affectedRows": 15,
"coldesc": [{
"name": "Identifier",
"type": "string",
"size": 255
}, {
"name": "Serial",
"type": "integer",
"size": 4
- - - - - cut for brevity - - - - - -
}, {
"name": "RowSerial",
"type": "integer",
"size": 4
}],
"rows": [{
"Identifier": "OMNIbus ObjectServer : Connections available for NCOMS:",
"Node": "noi165.ibm.com",
"Manager": "OMNIbus Self Monitoring @NCOMS",
"Agent": "OMNIbus SelfMonitoring",
"AlertGroup": "ConnectionStatus",
"Severity": 2,
- - - - - cut for brevity - - - - - -
"RowSerial": 26
}]
}
}
>>> quit()

Yay, it works. We now know that the REST Interface is responding.

A few points are worth mentioning from the interaction.

  1. We specify the host and port, http://noi165:8080, and the endpoint, /objectserver/restapi/alerts/status. By the way, we can access the alerts.journal or the alerts.details table by changing status in the endpoint to journal or details (see the sketch after this list). We will see later that all interactions with alerts.status, whether query, insert, update or delete operations, use the same endpoint.
  2. We are using basic authentication by specifying the username “netcool” and password “netcool”.
  3. The returned JSON string maps naturally onto Python’s built-in data types, mainly lists and dictionaries.
  4. The output is in JSON format containing one rowset element. The rowset has one “coldesc” (Column Description) section and one “rows” section. The “rows” section itself is an array of JSON objects, each of which looks like a Python dictionary and represents one row in the alerts.status table. Please note the JSON structure, as we will recreate this layout when we insert alerts later.
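
As a quick illustration of the first point, here is a minimal sketch, assuming the same noi165 host, port 8080 and netcool/netcool credentials used above, that reads the alerts.journal table instead of alerts.status (swap journal for details to reach the details table):

# getJournal.py, a hypothetical helper, not one of the scripts in this blog
import requests

# Same base URL as the interactive example; only the table name changes
base = 'http://noi165:8080/objectserver/restapi/alerts'
r = requests.get(base + '/journal', auth=('netcool', 'netcool'))
print(r.status_code)
print(r.text)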

Bounded “get”.

When working with a query, we do not want to scroll too much to read the result. In the previous Python interpreter example, we did not put any boundary on the get request, so we received all the fields and all the rows of the alerts table. Now, we will select specific alerts and display only a selection of fields. As we want to reuse the code, we will type the commands into a file, our first script. Let us name it “getAlerts.py”; here is the content.

# getAlerts.py
import requests
import json

url = 'http://noi165:8080/objectserver/restapi/alerts/status'
params = {"filter": "AlertGroup='AGSim'"}
username = "netcool"
password = "netcool"

r = requests.get(url, auth=(username, password), params=params)
data = json.loads(r.text)
rowset = data['rowset']
rows = rowset['rows']
for i in range(len(rows)):
    print(rows[i]['Node'] + ' <> ' + rows[i]['Summary'])

For the filter, we specify a specific AlertGroup, AGSim in this case. You can replace it with any SQL where clause. We could modify the script to take the filter from the command line, but for this tutorial, we hard code it.
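
For example, a small, hypothetical variant of the script (not shown in the original) could take the where clause as a command-line argument and fall back to the AGSim filter when none is given:

# getAlertsFiltered.py, a hypothetical variant of getAlerts.py
# Usage: python3 getAlertsFiltered.py "Severity>=4"
import sys
import json
import requests

url = 'http://noi165:8080/objectserver/restapi/alerts/status'
# Use the first argument as the filter, or default to the AGSim alert group
flt = sys.argv[1] if len(sys.argv) > 1 else "AlertGroup='AGSim'"
r = requests.get(url, auth=('netcool', 'netcool'), params={"filter": flt})
data = json.loads(r.text)
for row in data['rowset']['rows']:
    print(row['Node'] + ' <> ' + row['Summary'])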

Executing getAlerts.py produces the following output, which is much more readable.

[jwahidin@noi165]$ python3 getAlerts.py
NodeOne <> Summary Simulated 1
NodeTwo <> Summary Simulated 2
NodeThree <> Summary Simulated 3

In the script, we convert the JSON string to the Python data types by just calling the function “loads”, after which we can programmatically traverse the structure and get what we want. Please observe the output of our previous unbounded query to see the JSON structure: rowset and rows.
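
As a side note, the requests module can decode the JSON for us through Response.json(), so the same script can be written without the json module at all. Here is a hypothetical shorter version:

# getAlertsJson.py, a hypothetical shorter variant of getAlerts.py
import requests

url = 'http://noi165:8080/objectserver/restapi/alerts/status'
params = {"filter": "AlertGroup='AGSim'"}
r = requests.get(url, auth=('netcool', 'netcool'), params=params)
# Response.json() parses the body, replacing the explicit json.loads call
for row in r.json()['rowset']['rows']:
    print(row['Node'] + ' <> ' + row['Summary'])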

Let’s post some alerts.

Now back to my friend’s case. He wanted to inject alerts from a CSV file. Using a CSV file as the alert source is a good example, as it is quite common to use one as the intermediary for inter-application communication. Creating a CSV file is also easy. I usually use the copy-and-paste luxury of a spreadsheet application such as MS Excel to create the alerts and then save the work as a CSV file. So if you want to experiment with that alert correlation algorithm you just invented, you can use a CSV file to provide the test alerts and use the Python script to inject them.

So here is an example of a CSV file, alerts.csv, containing three alerts. As is the norm for a CSV file, we specify the field names in the first line.

Identifier,Node,Manager,AlertGroup,AlertKey,Severity,Type,Summary
"Id1","NodeOne","MgrSim","AGSim","AKKey",1,1,"Summary Simulated 1"
"Id2","NodeTwo","MgrSim","AGSim","AKKey",2,1,"Summary Simulated 2"
"Id3","NodeThree","MgrSim","AGSim","AKKey",3,1,"Summary Simulated 3"

We have used the standard alert fields: Identifier, Node, Manager, AlertGroup, AlertKey, Severity, Type, and Summary; the script will add FirstOccurrence and LastOccurrence. These are some of the most commonly used alert fields. In addition, if you are familiar with OMNIbus, you might have noticed that we have included all the fields used by the OMNIbus generic clear trigger.

Now for the script. We will use the REST interface post method to insert alerts into the WAIOps Alerts database. The REST endpoint is still the same. Our script reads the CSV file, stores the information in Python lists and dictionaries, constructs the JSON string from those data types, and then sends the JSON string to the REST endpoint.

The JSON structure posted to the REST interface is the same as the structure we received when we ran getAlerts.py. You can read more about it here: https://www.ibm.com/docs/en/netcoolomnibus/8.1?topic=interface-http-request-response-examples.
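
To make that structure concrete, here is a sketch of the kind of payload the script below builds for the first CSV row (only a few columns shown; the real request carries a coldesc entry for every column in rows):

# payloadExample.py, illustrative only; postAlerts.py builds this automatically
import json
import time

payload = {
    "rowset": {
        "coldesc": [
            {"type": "string", "name": "Identifier"},
            {"type": "string", "name": "Node"},
            {"type": "integer", "name": "Severity"},
            {"type": "utc", "name": "FirstOccurrence"}
        ],
        "rows": [{
            "Identifier": "Id1",
            "Node": "NodeOne",
            "Severity": 1,
            "FirstOccurrence": int(time.time())
        }]
    }
}
print(json.dumps(payload, indent=2))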

Python makes writing the script easy; in this case, the csv, json and requests modules provide what we need.

# postAlerts.py
# Usage: python3 postAlerts.py alerts.csv
from csv import DictReader
import json
import sys
import time
import requests

inf = sys.argv[1]
logf = inf + ".log"
print("Input file: ", inf)
logfl = open(logf, "w")
url = 'http://noi165:8080/objectserver/restapi/alerts/status'
headers = {'content-type': 'application/json'}

with open(inf, 'r') as readobj:
    csv_dict = DictReader(readobj)
    cols_name = csv_dict.fieldnames
    #
    # Build the column description (coldesc) structure
    #
    cols = []
    for colname in cols_name:
        # Severity and Type are expected as integers rather than strings
        if (colname == "Severity") or (colname == "Type"):
            item = {"type": "integer"}
        else:
            item = {"type": "string"}
        item["name"] = colname
        cols.append(item)
    # Add FirstOccurrence and LastOccurrence, which are not in the CSV
    item = {"type": "utc"}
    item["name"] = "FirstOccurrence"
    cols.append(item)
    item = {"type": "utc"}
    item["name"] = "LastOccurrence"
    cols.append(item)
    #
    # Build one row structure per CSV line and POST it to the URL
    #
    for row in csv_dict:
        rows = []
        # Cast Severity and Type to integers
        row["Severity"] = int(row["Severity"])
        row["Type"] = int(row["Type"])
        # FirstOccurrence and LastOccurrence are epoch time without the subsecond
        row["FirstOccurrence"] = int(time.time())
        row["LastOccurrence"] = int(time.time())
        rows.append(row)
        rowset_item = {"coldesc": cols, "rows": rows}
        rowset = {"rowset": rowset_item}
        jrowset = json.dumps(rowset, indent=2)
        logfl.write(jrowset)
        # Send the request
        r = requests.post(url, headers=headers, auth=('netcool', 'netcool'), data=jrowset)
        print(r.text)

logfl.close()

When we process the column description (coldesc) section, we need to specify each column’s data type. Most fields are strings, so that is the default; we switch to other data types as required: integer for the Severity and Type fields and UTC for FirstOccurrence and LastOccurrence.

We did not specify any timestamp in the CSV file; instead, we assign the current timestamp when we generate the alert, which simulates real alerts. We use Python’s current time function to set FirstOccurrence and LastOccurrence.

Note that we have used the Python DictReader class to read the CSV into Python dictionaries. The script will still work if we add more columns to the CSV file; we only need to ensure that each type is cast correctly. If the added column is a string type, we do not modify the script at all. It just happens that most OMNIbus fields are strings; don’t you love it?
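
If your CSV carries more non-string columns, one hedged way to handle them (not part of the original script) is to keep a small type map and build both the coldesc and the casts from it:

# typedCsv.py, a hypothetical helper, not part of the original postAlerts.py
from csv import DictReader

# Columns to treat as OMNIbus integers; adjust the set to your alerts.status schema
INTEGER_COLS = {"Severity", "Type", "Class", "Grade"}

def load_rows(path):
    """Read a CSV and return (coldesc, rows) with integer columns cast."""
    with open(path) as f:
        reader = DictReader(f)
        coldesc = [{"type": "integer" if c in INTEGER_COLS else "string", "name": c}
                   for c in reader.fieldnames]
        rows = []
        for row in reader:
            for col in INTEGER_COLS.intersection(row):
                row[col] = int(row[col])
            rows.append(row)
    return coldesc, rows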

Let there be alerts; run postAlerts.py:

[juliusw@noi165]$ python3 postAlerts.py alerts.csv
Input file: alerts.csv
{
"entry": {
"affectedRows": 1,
"keyField": "31%3ANCOMS",
"uri": "http://noi165:8080/objectserver/restapi/alerts/status/kf/31%3ANCOMS"
}
}
{
"entry": {
"affectedRows": 1,
"keyField": "32%3ANCOMS",
"uri": "http://noi165:8080/objectserver/restapi/alerts/status/kf/32%3ANCOMS"
}
}
{
"entry": {
"affectedRows": 1,
"keyField": "33%3ANCOMS",
"uri": "http://noi165:8080/objectserver/restapi/alerts/status/kf/33%3ANCOMS"
}
}

That went well. Three alerts were created. To verify, we can use the earlier getAlerts.py; in fact, the previous output of getAlerts.py did just that. If you rerun postAlerts.py, the Tally of each of the three alerts will increase by 1, consistent with the deduplication behaviour of alerts coming from a probe.
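
If you want to watch the deduplication happen, a small hypothetical variant of getAlerts.py can print the Tally next to the Summary; rerun postAlerts.py and the Tally goes up:

# checkTally.py, a hypothetical variant of getAlerts.py to observe deduplication
import requests

url = 'http://noi165:8080/objectserver/restapi/alerts/status'
params = {"filter": "AlertGroup='AGSim'"}
r = requests.get(url, auth=('netcool', 'netcool'), params=params)
for row in r.json()['rowset']['rows']:
    print(row['Node'], row['Summary'], 'Tally =', row['Tally'])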

Observant readers might have noticed that the alerts are inserted one by one rather than in bulk. This is the behaviour of the OMNIbus REST interface. So if you want to send a high volume of alerts from another application, you might want to use another method, such as the message bus through the OMNIbus message bus probe. For my friend’s requirement, this post method is still fast enough.

Delete operation.

Harlan Coben, an American writer of mystery novels, wrote,

“More than once, I’ve wished my real life had a delete key.” *

So, what does this quote have to do with the REST API? Well, nothing. However, it mentions the vital word “delete”. You see, to keep our alerts table clean, we also need a delete script to remove the alerts we have inserted :D.

Here is the deleteAlerts.py script.

# deleteAlerts.py
import requests
url='http://noi165:8080/objectserver/restapi/alerts/status'
params={"filter":"AlertGroup='AGSim'"}
username="netcool"
password="netcool"
r = requests.delete(url, auth=(username, password), params=params)
print(r.text)

And the following output shows that three rows were successfully deleted.

[juliusw@noi165]$ python3 deleteAlerts.py
{
"entry": {
"affectedRows": 3,
"uri": "http://noi165:8080/objectserver/restapi/alerts/status?filter=AlertGroup%3D%27AGSim%27"
}
}

Update operation.

The OMNIbus REST interface allows you to perform an update through the PUT or PATCH operation. However, a quick and dirty way of achieving an update is to delete the alert and create it again; hence we are skipping a full update script. You should be able to extend the scripts presented in this blog to perform the update operation. I believe in you; you can create one :D.
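
That said, if you want a head start, here is a rough, untested sketch of what a PATCH-based update might look like. It assumes the interface accepts the same filter parameter we used for the delete and a rowset body in the same layout we posted; check the request and response examples linked earlier for the exact format your version expects.

# updateAlerts.py, a hypothetical sketch; verify the body layout against the docs
import json
import requests

url = 'http://noi165:8080/objectserver/restapi/alerts/status'
headers = {'content-type': 'application/json'}
params = {"filter": "AlertGroup='AGSim'"}

# Raise the Severity of every matching alert to 5 (assumed rowset layout)
body = {"rowset": {"coldesc": [{"type": "integer", "name": "Severity"}],
                   "rows": [{"Severity": 5}]}}

r = requests.patch(url, headers=headers, auth=('netcool', 'netcool'),
                   params=params, data=json.dumps(body))
print(r.text)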

There you go; those small scripts should help my friend, and hopefully you too. Of course, the information here becomes even more valuable when another application uses the Watson AIOps Alerts REST interface to interact with the Alerts database, just like the customer I mentioned at the beginning of the blog.

Summary

We have gone through a few Python scripts that use the Watson AIOps Alerts REST interface to read, create and delete alerts. These scripts exemplify a way to perform CRUD (Create, Read, Update, Delete) operations on the Watson AIOps Alerts database, opening up the possibility of using it from another application. I hope you can reuse the information presented here quickly but cleanly.

*https://www.brainyquote.com/quotes/harlan_coben_475878


Julius Wahidin is a member of the IBM Watson AIOps Elite team. The team’s goal is to help design and implement Watson AIOps. All stories and comments are my own.