Optimization (OR) Models & Implementation Challenges

Satish Kumar Amirisetti · Published in The Startup · Oct 21, 2020
Optimization & Its Meaning

Decision making is part of every business, and Optimization is a widely used concept for supporting it. When the Optimization concept is used in a business context, in any domain, its true meaning often gets lost: the business tends to see it as a magic wand that gives the best decision or solution to every problem. In most real-life business scenarios, Optimization does not lead to a provably optimal solution, and users should be informed up front that it cannot solve anything and everything to optimality.

It just means that Optimization can help in decision making, by improving the system that is modeled. If the modeled system is only an approximate replica of the physical system (which is usually the case), then the decision made is a solution for the model, not for the physical system.

We all know Art Geoffrion's quote: "The Primary Purpose of Mathematical Programming is Insight, Not Numbers."

I added the word "Primary" because numbers are also important and helpful in deriving the insight. Models are also helpful in proving or disproving our assumptions about a business scenario. And we need to understand that the decision any Optimization system throws out is not easy to read or explain, because of the complex nature of the models representing today's businesses and the resulting impact of each business variable. At the same time, we cannot simplify the model just to make the result easier to understand, because then it would no longer represent the business scenario to be solved. Trade-off is the term most often associated with Optimization, and that is exactly why we use optimization techniques.

Before starting any Optimization project, we need to understand the applicability and limitations of Optimization logic to the business scenario we are looking to model.

Knowing Optimization is important; knowing the business is equally important; knowing the applicability of Optimization to the business is most important.

Below are the typical viewpoints to keep in mind before taking up, or while implementing, an Optimization project. The decisions made based on these viewpoints will affect the success of the project, and this is true of most large-scale projects.

Model Vs Reality

Any mathematical model is a prototype, a representation of the physical system in the form of equations. In Optimization, a business scenario is modeled as a math model using an objective and a set of constraints. Take the example of production planning in a manufacturing plant: the objective is to minimize the total cost while fulfilling customer demand on time, and physical restrictions like resource capacity, manpower availability and material availability are modeled as constraints. This model is not an exact replica of the physical system, because the physical system has a lot of other aspects that affect execution. So the planner should keep in mind that the model output is for the scenario modeled, not for the physical system, and he/she should use intuition to adjust the inputs or outputs wherever applicable.
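
To make the structure concrete, here is a minimal sketch of such a model using the open-source PuLP library. I use PuLP only for illustration (commercial solvers expose the same ideas), and all the products, costs, capacities and demands below are made up.

```
# A minimal production-planning sketch using the open-source PuLP library.
# All data (products, costs, capacities, demand) is illustrative only.
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpStatus, value

products = ["P1", "P2"]
demand = {"P1": 100, "P2": 80}          # units the customer needs
unit_cost = {"P1": 4.0, "P2": 6.0}      # production cost per unit
machine_hours = {"P1": 0.5, "P2": 0.8}  # hours of shared capacity per unit
capacity = 120                          # total machine hours available

model = LpProblem("aggregate_production_plan", LpMinimize)
qty = {p: LpVariable(f"qty_{p}", lowBound=0) for p in products}

# Objective: minimize total production cost
model += lpSum(unit_cost[p] * qty[p] for p in products)

# Constraint: meet demand for every product
for p in products:
    model += qty[p] >= demand[p], f"demand_{p}"

# Constraint: shared resource capacity
model += lpSum(machine_hours[p] * qty[p] for p in products) <= capacity, "capacity"

model.solve()
print(LpStatus[model.status], {p: value(qty[p]) for p in products})
```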

Let us take another example. Suppose you are a planner creating a dispatch plan between locations A and B today, and you know that the standard lead time is 5 days. The Optimization model will calculate the plan using the lead-time data provided. But as a planner you have learned from the ground that, because of a bridge collapse, the trip will take 8 days instead of 5. You now have two options: change the input data or change the output. On the input side, you can update the lead time or remove the link between A and B so that the model searches for another route. On the output side, you can change the plan and arrange the dispatch from another location or by another mode. I have simplified the case; in reality a lot of other parameters come into the picture.
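
Handled on the input side, the fix is simply a data edit before the next model run. The snippet below is purely illustrative; real systems hold lane data in a planning database, not in Python literals.

```
# Illustrative only: how the two input-side options might look if the lane
# lead times were held in a simple dictionary.
lead_time_days = {("A", "B"): 5, ("C", "B"): 7}

# Option 1: override the standard lead time with the ground reality
lead_time_days[("A", "B")] = 8

# Option 2: drop the A-B lane entirely so the model looks for another route
# lead_time_days.pop(("A", "B"))

# ...then rebuild and re-run the optimization model with the edited inputs
```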

Standard Problem Vs Customer specific problem

Optimization techniques have been used in industry for many decades, and many industrial problems are handled very well thanks to efficient algorithms and growing computational power. Many business planning scenarios are standardized and comparatively easy to model and solve using commercial solvers. At the same time, it is true that every industry or organization has its own unique constraints that require new models to be developed. And since the model is now different, the solve is different too, and it takes considerable effort to identify the best settings and methods to arrive at the best solution in reasonable time.

For example, consider an aggregate production planning problem that generates the daily/weekly/monthly production plan by considering future demand, resource capacities, production lead times and so on. This is a standard problem for any manufacturing plant. But a specific customer may have an idiosyncratic constraint, such as: production line 2 may only be brought up after production line 1 is running at more than 50% capacity. This must be modeled separately, since it is not a common scenario in every plant. Because of this additional constraint, the Optimization model's behavior might change in other scenarios as well. The planner needs to understand this aspect as part of the implementation and create scenarios accordingly to understand the model output. Custom constraints should be added only if they are critical for the business; otherwise they should be handled outside the optimization model, by rules or heuristics before or after the model run. Without due diligence on the model output, it should not be applied to the business.
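
For illustration, one common way to model such a linking rule is with a binary variable and the line capacities as coefficients. This is a sketch with made-up capacity numbers, meant to be embedded into a full planning model rather than used standalone.

```
# Sketch of the "line 2 only after line 1 exceeds 50% capacity" rule,
# modeled with a binary linking variable. Capacity values are illustrative.
from pulp import LpProblem, LpMinimize, LpVariable, LpBinary

CAP1, CAP2 = 100, 80    # capacities of line 1 and line 2 (units per period)

model = LpProblem("line_linking_sketch", LpMinimize)
load1 = LpVariable("load1", lowBound=0, upBound=CAP1)
load2 = LpVariable("load2", lowBound=0, upBound=CAP2)
line2_on = LpVariable("line2_on", cat=LpBinary)

# Line 2 can only produce when the linking switch is on
model += load2 <= CAP2 * line2_on

# The switch can only be on once line 1 carries at least 50% of its capacity
# (">=" is the usual practical stand-in for "more than 50%")
model += load1 >= 0.5 * CAP1 * line2_on
```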

Black-box Vs Excel

Most companies use commercial solvers to solve their optimization models. Planners complain about the seemingly random and complex nature of these black-box solvers' output, and some time after the implementation closes they drift back to Excel. Given the model complexity and data volume, it is almost impossible to reason backwards from the solver output at full scale.

Instead of worrying about what is inside the black box, planners should analyze the result on small data, where the output is easy to trace. They should model all their scenarios using small, controlled data. If the result is logical for those business scenarios, you can infer that it will work on large data under the same conditions. We all know that no software is error free, so we should make sure it works in most cases through this kind of due diligence before applying the results to the business.
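
A simple pattern I find useful is to solve a tiny instance whose answer can be worked out by hand and compare it with the solver output. The toy model below, with made-up numbers and PuLP again only for illustration, shows the idea.

```
# Sanity-check pattern: solve a tiny instance whose answer is known by hand,
# then compare the solver's output against it. Data is illustrative.
from pulp import LpProblem, LpMinimize, LpVariable, value

model = LpProblem("tiny_check", LpMinimize)
x = LpVariable("x", lowBound=0)

model += 3 * x        # objective: cost of producing x
model += x >= 10      # demand we can verify by hand: cheapest plan is x = 10

model.solve()
assert abs(value(x) - 10) < 1e-6, "solver output differs from the hand calculation"
```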

That’s why we recommend a pilot phase in every Optimization project, so that the effect on the business is minimal.

Constraints Vs Complexity

There is a trade-off between the number of practical constraints that can be applied in a math model and the increase in the complexity of the model itself, which results in huge run times. The focus should be on critical constraints that are hard to apply manually; other, non-critical ones can be applied to the optimization output afterwards. There are many ways to reduce complexity, for example by applying the constraints in a phased manner, solving one set at a time. This might result in a sub-optimal solution, but sometimes that is good enough and it reduces the effort and time spent. Every business wants the best solution that handles all of its constraints, but in practice it needs to break the problem down and draw a line between the various decisions.
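
As a sketch of the phased approach, the toy model below first solves with only the critical constraint, then adds a non-critical business rule and re-solves. The numbers are illustrative, and a real implementation might warm-start or post-process rather than simply re-solve.

```
# Phased solving sketch: critical constraints first, then add the remaining
# rules and re-solve the same model object. All data is illustrative.
from pulp import LpProblem, LpMinimize, LpVariable, value

model = LpProblem("phased_example", LpMinimize)
x = LpVariable("x", lowBound=0)
y = LpVariable("y", lowBound=0)

model += 2 * x + 3 * y          # objective: total cost
model += x + y >= 10            # critical constraint: meet total demand

model.solve()                   # phase 1: critical constraints only
print("phase 1 plan:", value(x), value(y))

model += y >= 2                 # phase 2: add a non-critical business rule
model.solve()                   # re-solve; the objective may get slightly worse
print("phase 2 plan:", value(x), value(y))
```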

Let us take an example. In aggregate production planning over roughly 52 weeks, some customers want to add sequence-dependent changeover constraints in order to optimize production and setup time. Sequence-dependent changeovers need a different type of modeling, which increases both the number of variables and the run time. At an aggregate level, over a long horizon, an average setup time is the better way to model the scenario: it minimizes complexity and has minimal effect on the output. Detailed changeover constraints can then be applied at the scheduling level, over a shorter horizon.
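
The average-setup approach is usually just a data transformation before the capacity constraint is built. The snippet below sketches it with made-up numbers; modeling true sequence-dependent changeovers would instead require sequencing variables and binaries, which is exactly the complexity we are trying to avoid at the aggregate level.

```
# Aggregate-level approximation: fold an average setup allowance into the
# per-unit processing time instead of modeling sequence-dependent changeovers.
# All numbers are illustrative.
run_time_per_unit = {"P1": 0.50, "P2": 0.80}     # hours per unit
avg_setup_per_unit = {"P1": 0.05, "P2": 0.10}    # average changeover, spread per unit

effective_time = {p: run_time_per_unit[p] + avg_setup_per_unit[p]
                  for p in run_time_per_unit}

# effective_time then replaces the raw run time in the capacity constraint, e.g.
# lpSum(effective_time[p] * qty[p] for p in products) <= capacity
```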

Best Vs Apt

Every business is looking for the best solution to its scenario, and wants a single tool that solves all its problems. But a math model is never a perfect representation of the physical world; there are a lot of assumptions and approximations behind it. You might have heard the term “Digital Twin”, which is a replica of a physical system in digital form, used to simulate and plan for the future using different systems and models. We are still far from creating a true digital twin of a physical system. So the “best” solution of an inexact math model is not actually the best; it is the best under the given limits and assumptions. Instead of chasing it, we should focus on what is apt for the business, by verifying the results under multiple scenarios.

Gap Vs Worth

In Optimization terminology, the Gap is the difference between the best known bound and the current best (incumbent) solution, and it comes in multiple versions (relative/absolute). There is too much focus on the Gap of a model run, especially on hard MIP problems. For most practical problems a gap of zero is simply not reachable, and the gap might not reach the expected value for every problem; you should instead focus on verifying the result the model gives. A typical gap requirement an analyst will set is 0.01%, whereas the business requirement can often be met with a gap of 3–5%.

If the required Gap value is very small, the model run will take more time, and the improvement in solution quality might not be worth that run time. The planner should look at both the model output and the run time, and settle on the smallest practical gap that gives a good enough solution in reasonable time.
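
In practice this means giving the solver a stopping gap and a time limit instead of chasing optimality. As an example, with PuLP's bundled CBC solver the settings might look like the snippet below; the exact parameter names vary between solvers and PuLP versions, so treat this as a sketch.

```
# Stopping on a business-acceptable gap rather than chasing optimality.
# Shown with PuLP's bundled CBC solver; parameter names vary by solver/version.
from pulp import PULP_CBC_CMD

solver = PULP_CBC_CMD(
    gapRel=0.03,      # stop once the relative MIP gap is within 3%
    timeLimit=3600,   # and never run for more than one hour
    msg=True,         # keep the log so the bound/incumbent progress is visible
)
# model.solve(solver)   # `model` being the LpProblem built elsewhere
```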

Tree Size Vs Data Size

Business users need to focus on planning and executing those elements of the business where there are a lot of deviations. That means they should focus on the critical products and resources that matter most to the business; planning those critical items and handling them efficiently is the best way to improve efficiency. It is good practice to reduce the data size by focusing only on the critical products and resources instead of pushing everything into the optimization model. Remember that these are decision-making systems, not automation or transaction systems.

For example, if you are creating a production plan, it is better to restrict the data to critical or shared resources and to products with complex processes. Unnecessary data that does not affect the decision making, or that does not need planning at all, should be excluded so that the branch-and-bound tree size stays under control.
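
In code, this scoping is often nothing more than a filter on the input data before the model is built. The field names below ("criticality", "is_shared_resource") are invented for the sketch; the point is only that non-critical items never reach the model.

```
# Illustrative pre-filter: only critical products and shared resources go into
# the optimization model; everything else is planned by simple rules outside it.
# The field names are invented for this sketch.
products = [
    {"sku": "P1", "criticality": "high", "is_shared_resource": True},
    {"sku": "P2", "criticality": "low",  "is_shared_resource": False},
    {"sku": "P3", "criticality": "high", "is_shared_resource": False},
]

model_scope = [p for p in products
               if p["criticality"] == "high" or p["is_shared_resource"]]
rule_based_scope = [p for p in products if p not in model_scope]
```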

Run Time Vs Man-Hours

Planners spend, on average, anywhere from three to fifteen days every month on future planning. Most of them use Excel as the planning tool, with formulas for the basic calculations, while other users do the planning entirely manually. This exercise results in an error-prone and sub-optimal plan, and any change in the plan means repeating the same exercise.

Optimization helps improve the solution quality and reduces planning time; an Optimization run takes far less time than manual or Excel planning and gives a better result at the same time. Planners, however, expect the result in seconds, which might not be possible in every case; it depends on the trade-offs built into the model and on the data volume.

If production planning happens on a monthly basis, an overnight run should not be an issue for the planner. If a dispatch plan is generated every day, it should run faster than that, say in less than an hour. Any planner should focus first on the functionality rather than on the run time.

Any optimization solution must beat man-hours on run time and manual planning on solution quality; otherwise no one will go for it.

All the points expressed above are based on my experience in modeling and implementing Supply Planning solutions using Operations Research models. Your comments are most welcome.
