Deploy and run OPL models

AlainChabrier
3 min read · Nov 18, 2019


When it comes to deployment, Decision Optimization for Watson Machine Learning offers many options, including the type of model you can deploy. This post details how to deploy and run OPL models.

Watson Machine Learning offers many combinations to deploy and run optimization models:

  1. the type of model: the formulation can be a Python model written with docplex, an OPL (Optimization Programming Language) model, a pure CPLEX model (e.g. in LP or MPS format), or a pure CP model.
  2. the APIs used to deploy: directly using the WML REST APIs, or using the WatsonMachineLearningClient Python API. In the coming Cloud Pak for Data 2.5, it is also possible to deploy using a dedicated user interface.
  3. the data connection: either including the data inline in the job creation request, or providing a reference to data in a storage area, such as Cloud Object Storage (COS).

All combinations are possible, which leads to a large number of possible examples.

In the Watson Studio gallery example notebook, the illustrated combination is a docplex Python model, deployed using the WatsonMachineLearningClient Python API with inlined data. The Decision Optimization for Watson Machine Learning documentation covers everything required to use the other combinations. My initial post also provides plenty of general technical considerations, such as how to set the environment size or the number of parallel executions.

The most commonly requested variation is to use OPL as the model formulation language, so I created a slightly different example notebook to highlight what needs to be changed. This notebook still uses the WatsonMachineLearningClient and still uses inlined input data.

Let’s go through the notebook and highlight the differences.

All the authentication, model creation and upload steps work exactly the same way. Of course, the uploaded zip archive should contain an OPL .mod model file instead of a Python .py file. In this example, the model is included inline in the notebook and copied to a file using the %%writefile magic, but any other mechanism to add the file to the zip would work too.

%%writefile model/model.mod
tuple TProduct {
key string name;
float demand;
float insideCost;
float outsideCost;
};

tuple TResource {
key string name;
float capacity;
};

tuple TConsumption {
key string productId;
key string resourceId;
float consumption;
}

{TProduct} Products = ...;
{TResource} Resources = ...;
{TConsumption} Consumptions = ...;

/// solution
tuple TPlannedProduction {
key string productId;
float insideProduction;
float outsideProduction;
}

/// variables.
dvar float+ Inside [Products];
dvar float+ Outside[Products];
dexpr float totalInsideCost = sum(p in Products) p.insideCost * Inside[p];
dexpr float totalOutsideCost = sum(p in Products) p.outsideCost * Outside[p];
minimize
totalInsideCost + totalOutsideCost;

subject to {
forall( r in Resources )
ctCapacity:
sum( k in Consumptions, p in Products
: k.resourceId == r.name && p.name == k.productId )
k.consumption* Inside[p] <= r.capacity;
forall(p in Products)
ctDemand:
Inside[p] + Outside[p] >= p.demand;
}
{TPlannedProduction} plan = {<p.name, Inside[p], Outside[p]> | p in Products};

execute {
writeln( plan );
}
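Packaging the .mod file into the archive can be done with Python's standard library. Here is a minimal sketch; the archive name and layout are illustrative assumptions, not the notebook's exact code:

```python
import os
import zipfile

# Write the OPL model file (normally produced by the %%writefile cell above).
os.makedirs("model", exist_ok=True)
with open("model/model.mod", "w") as f:
    f.write("// OPL model content goes here\n")

# Package the .mod file into a zip archive ready for upload to WML.
with zipfile.ZipFile("model.zip", "w") as z:
    z.write("model/model.mod", arcname="model.mod")

print(zipfile.ZipFile("model.zip").namelist())  # ['model.mod']
```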

The only important difference is the model type in the model creation, which needs to be do-opl_12.9.

model_metadata = {
client.repository.ModelMetaNames.NAME: "Production",
client.repository.ModelMetaNames.DESCRIPTION: "Model for Production",
client.repository.ModelMetaNames.TYPE: "do-opl_12.9",
client.repository.ModelMetaNames.RUNTIME_UID: "do_12.9"
}
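Since the input data is inlined, the job creation request carries the tables directly, and the tables are matched to the OPL tuple sets by name. Below is a hedged sketch of how such a payload can be built; the CSV-style table ids and the sample rows (taken from the classic production example) are assumptions for illustration:

```python
# Build inlined input tables matching the OPL tuple sets.
# Each table is a dict with an id, column names, and row values;
# the ids (e.g. "Products.csv") are assumed to map to the tuple sets by name.
def make_table(name, fields, values):
    return {"id": name, "fields": fields, "values": values}

input_data = [
    make_table("Products.csv",
               ["name", "demand", "insideCost", "outsideCost"],
               [["kluski", 100, 0.6, 0.8],
                ["capellini", 200, 0.8, 0.9]]),
    make_table("Resources.csv",
               ["name", "capacity"],
               [["flour", 20], ["eggs", 40]]),
    make_table("Consumptions.csv",
               ["productId", "resourceId", "consumption"],
               [["kluski", "flour", 0.5],
                ["kluski", "eggs", 0.2]]),
]

# This list would then be passed as the input data of the job creation payload.
print([t["id"] for t in input_data])
```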

This is, in fact, the only difference: the deployment, the job creation and monitoring, and the solution retrieval are exactly the same as with docplex.
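Job monitoring in particular is unchanged: the job is simply polled until it reaches a terminal state. A generic sketch, with the status accessor passed in as a callable so it works with whichever client call returns the job state (the function name and the state strings here are assumptions):

```python
import time

def wait_for_job(get_state, poll_seconds=5,
                 terminal=("completed", "failed", "canceled")):
    """Poll get_state() until the job reaches a terminal state, then return it."""
    while True:
        state = get_state()
        if state in terminal:
            return state
        time.sleep(poll_seconds)

# Example with a stubbed status sequence standing in for the client call:
states = iter(["queued", "running", "completed"])
print(wait_for_job(lambda: next(states), poll_seconds=0))  # completed
```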

alain.chabrier@ibm.com

@AlainChabrier

https://www.linkedin.com/in/alain-chabrier-5430656/


Former Decision Optimization Senior Technical Staff Member at IBM. Opinions are my own; I no longer work for any company.