How to Handle a 100k-Row Decision Table in Drools, Part 2

Ryan Zhangcheng
Feb 21 · 4 min read

As described in my previous article, we are tackling a performance issue when evaluating a 100k-row decision table.

Solution 2: Precompile the Spreadsheet Rules


Following the vertical thinking of solution 1, I think we can improve the situation by addressing its problems directly:

  1. Don’t dynamically load the Excel data at runtime; precompile it at build time.
  2. Don’t keep the rule data as a raw Excel file; use a format the Drools tooling understands, so it can be managed properly.

When Drools uses a rule template + Excel to fire rules, what it actually does under the hood is:

  1. Use ExternalSpreadsheetCompiler to compile the rule template and the rule data (i.e. the Excel file) into drl (Drools Rule Language); see the sketch after this list.
  2. Parse and compile the generated drl into a KieBase.
  3. Fire the rules in a session against the inserted facts.
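As a minimal sketch of step 1, assuming the ExternalSpreadsheetCompiler.compile(InputStream, InputStream, startRow, startCol) overload from drools-decisiontable; the resource names, the template, and the start row/column are hypothetical, not the actual project layout:

```java
import java.io.InputStream;

import org.drools.decisiontable.ExternalSpreadsheetCompiler;

public class RuntimeRuleCompilation {

    public static void main(String[] args) throws Exception {
        // Merge the rule template with the Excel data and produce plain drl text.
        // "/rules/decision-data.xls", "/rules/rule-template.drt" and the
        // start row/column (2, 2) are assumptions for illustration only.
        try (InputStream xls = RuntimeRuleCompilation.class.getResourceAsStream("/rules/decision-data.xls");
             InputStream template = RuntimeRuleCompilation.class.getResourceAsStream("/rules/rule-template.drt")) {

            ExternalSpreadsheetCompiler compiler = new ExternalSpreadsheetCompiler();
            String drl = compiler.compile(xls, template, 2, 2);

            // The generated drl still has to be parsed and compiled into a
            // KieBase (step 2) before any rule can be fired (step 3).
            System.out.println(drl.length() + " characters of generated drl");
        }
    }
}
```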

So can we do the first step before runtime? Even better, can we do the first two steps of the transformation ahead of time?

Fortunately the answer is yes: Drools already provides a friendly Maven plugin (kie-maven-plugin) to precompile drl, or any Drools-aware rule format, into Java byte code. This feature is called the Drools Executable Model.

It kills two birds with one stone and makes what is already good even better. In order to apply the Drools Executable Model solution, we need to convert the raw Excel file into a Drools-aware spreadsheet decision table. That format can be managed by kie-workbench, so problem 2 is resolved as well. All we need to do is add the Drools syntax “header” rows to the decision table.

(Screenshot: the spreadsheet decision table after the Drools header rows are added)

As you can see:

B7: f1: ClientObject is the fact declaration;

B8: descr matches $param is the condition logic;

C8: f1.setPass($param) is the action logic.

That’s it.
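For reference, here is a minimal sketch of what the ClientObject fact might look like. Only the descr and pass properties are implied by the cells above; the field types and everything else are assumptions for illustration.

```java
// Hypothetical fact type referenced in cell B7 of the decision table.
public class ClientObject {

    private String descr;  // matched by the condition in B8 (descr matches $param)
    private String pass;   // written by the action in C8 (f1.setPass($param))

    public ClientObject(String descr) {
        this.descr = descr;
    }

    public String getDescr() {
        return descr;
    }

    public String getPass() {
        return pass;
    }

    public void setPass(String pass) {
        this.pass = pass;
    }
}
```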

Let’s have a quick review of the change, then test the performance improvement.

Solution 2 is stored in the precompile-rule-solution branch.

What kie-maven-plugin does for us is:

  1. Use drools-model-compiler to convert the spreadsheet decision table into 10,206 generated Java source files (notice that this is even better than generating a drl file first).
  2. Compile the generated Java sources into byte code and package them into the rules jar.

So when the myapp client code loads that jar, it calls the byte code directly, without having to analyse the Excel file, run the drl parser, and so on.
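A rough sketch of what the client side can look like with the precompiled kjar on the classpath; the session name and fact value are assumptions, not taken from the actual project:

```java
import org.kie.api.KieServices;
import org.kie.api.runtime.KieContainer;
import org.kie.api.runtime.KieSession;

public class MyAppClient {

    public static void main(String[] args) {
        // Load the rules straight from the precompiled kjar on the classpath;
        // no Excel parsing or drl compilation happens at this point.
        KieServices ks = KieServices.Factory.get();
        KieContainer container = ks.newKieClasspathContainer();

        // "rulesSession" is a hypothetical session name; it has to match
        // the kmodule.xml of the rules project.
        KieSession session = container.newKieSession("rulesSession");
        try {
            ClientObject fact = new ClientObject("some description"); // hypothetical value
            session.insert(fact);
            session.fireAllRules();
            System.out.println("pass = " + fact.getPass());
        } finally {
            session.dispose();
        }
    }
}
```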

What’s important is that the business logic is still wrapped in its own project, and does not leak into the generic application code and its lifecycle.

Let’s see the performance of solution 2:

Putting the performance data into a table for comparison:

Pros

We have gained two obvious advantages by applying the Drools Executable Model:

  1. Runtime performance is clearly improved.
  2. The rules are packaged and governed as their own artifact.

The second point is sometimes not obvious to users when they first adopt a rules-oriented application framework, but it is quite important from a rules governance perspective: version-controlling your rule data, and deploying, testing, and releasing your business rules. With the help of kie-workbench, all of those features are already provided out of the box.

Cons

Solution 2 has two shortcomings, I think.

  1. The compilation time is quite long.

For 10k rows, a compilation time of 1.5 minutes seems acceptable. It actually generates and then compiles about 10k small Java files.

But for 100k rows, the compile time is no longer reasonable: it takes ~15 minutes to complete. That becomes very awkward, both for the developer experience and for CI/CD. It is simply too much effort once the rules reach a certain volume.

  2. When the row count gets very large, like 100k rows, the runtime performance improvement is small.

Compared with the big effort of precompiling, the performance gain is not that large, as we can see from the comparison data in the table.

Up to a certain decision table size, precompiling the rules can improve runtime performance dramatically, and it’s worth a try, I think.

However, when the decision table grows to 100k rows, precompiling still does not produce very good results.

And in reality, it’s quite common for the number of keywords or condition values to become very large, so we still need a better solution to tackle 100k-row decision tables.

In my next article, I will show a different approach (I call it lateral thinking) that transforms the dimensions of the facts and rules to improve the performance.

