Migrate your Python code for DO in WML v2 instances

A new version of Watson Machine Learning (WML) instances was deployed in early September, which includes some enhancements and changes. Please refer to my previous post for details on the REST API changes.

An easy-to-use option for Data Scientists to deploy and integrate Decision Optimization (DO) models in WML is to use the WML Python client.

This post introduces the new version of this client and describes how to use it with DO models and the new v2 WML instances (which are using the v4 APIs).


A new package called ibm-watson-machine-learning is now available which works with the new WML v2 instances.

So you might want to uninstall the previous one:

pip uninstall watson-machine-learning-client-V4

And install the new one:

pip install ibm-watson-machine-learning

New v2 WML instances use your general user apikey instead of WML-specific credentials, so you must adapt the creation of the client as follows:

from ibm_watson_machine_learning import APIClient

wml_credentials = {
    "apikey": apikey,
    "url": "https://us-south.ml.cloud.ibm.com"
}
client = APIClient(wml_credentials)

As assets and deployments are now grouped in deployment spaces, set the deployment space to be used with client.set.default_space.

The deployment space can be created from the User Interface or via Python code, as follows, using the Cloud Object Storage and WML instance CRNs:

cos_resource_crn = 'XXXXXXXXXXXX'
instance_crn = 'XXXXXXXXXXX'

metadata = {
    client.spaces.ConfigurationMetaNames.NAME: space_name,
    client.spaces.ConfigurationMetaNames.DESCRIPTION: space_name + ' description',
    client.spaces.ConfigurationMetaNames.STORAGE: {
        "type": "bmcos_object_storage",
        "resource_crn": cos_resource_crn
    },
    client.spaces.ConfigurationMetaNames.COMPUTE: {
        "name": "existing_instance_id",
        "crn": instance_crn
    }
}
space = client.spaces.store(meta_props=metadata)
space_id = client.spaces.get_id(space)
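
Once the space is created, it can be selected as the default so that subsequent repository and deployment calls target it. A one-line sketch, using the client and space_id obtained above:

```python
# Use the new deployment space for all subsequent client calls.
client.set.default_space(space_id)
```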

It is now possible to set detailed configurations for software and hardware, and the payloads to create models and deployments have to be modified accordingly.

Software specifications in model creation.

model_metadata = {
    client.repository.ModelMetaNames.NAME: model_name,
    client.repository.ModelMetaNames.DESCRIPTION: model_name,
    client.repository.ModelMetaNames.TYPE: "do-opl_12.10",
    client.repository.ModelMetaNames.SOFTWARE_SPEC_UID: client.software_specifications.get_uid_by_name("do_12.10")
}

model_details = client.repository.store_model(model='./model.tar.gz', meta_props=model_metadata)
model_uid = client.repository.get_model_uid(model_details)

Hardware specifications in deployment creation.

deployment_props = {
    client.deployments.ConfigurationMetaNames.NAME: deployment_name,
    client.deployments.ConfigurationMetaNames.DESCRIPTION: deployment_name,
    client.deployments.ConfigurationMetaNames.BATCH: {},
    client.deployments.ConfigurationMetaNames.HARDWARE_SPEC: {'name': 'S', 'nodes': 1}
}

deployment_details = client.deployments.create(model_uid, meta_props=deployment_props)

In addition to tabular data (fields and values), it is now possible to provide non-tabular data (such as an OPL .dat file or an .lp file) inline in the job creation, instead of referencing external storage assets (e.g. Cloud Object Storage assets).

This can be done by providing the binary encoded string content as follows:

client.deployments.DecisionOptimizationMetaNames.INPUT_DATA: [
    {
        "id": dat_file,
        "content": getfileasdata(dat_file)
    }
]

And here is an example function to base64-encode a file:

import base64

def getfileasdata(filename):
    with open(filename, 'r') as file:
        data = file.read()

    data = data.encode("UTF-8")
    data = base64.b64encode(data)
    data = data.decode("UTF-8")

    return data
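
To sanity-check the helper, the encoding can be verified to round-trip through base64. A self-contained sketch (the file name and .dat content are made up for illustration):

```python
import base64
import os
import tempfile

def getfileasdata(filename):
    # Read the file as text, then base64-encode its UTF-8 bytes
    # and return the result as a string.
    with open(filename, 'r') as file:
        data = file.read()
    data = data.encode("UTF-8")
    data = base64.b64encode(data)
    data = data.decode("UTF-8")
    return data

# Write a tiny OPL-style .dat file and check the encoding round-trips.
with tempfile.NamedTemporaryFile('w', suffix='.dat', delete=False) as f:
    f.write('n = 3;')
    path = f.name

encoded = getfileasdata(path)
decoded = base64.b64decode(encoded).decode("UTF-8")
assert decoded == 'n = 3;'
os.remove(path)
```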

You can refer to the following examples.

In the gallery, the notebook to deploy and run a model has been updated.

The following GitHub repositories, oplrunonwml and cplexrunonwml, now also include a v2 version.

Follow me on Twitter: https://twitter.com/AlainChabrier

Decision Optimization Senior Technical Staff Member at IBM Alain.chabrier@ibm.com @AlainChabrier https://www.linkedin.com/in/alain-chabrier-5430656/