Automation Performance Testing: the generic approach

Orel Damari · Published in Nerd For Tech · May 2, 2021

For a lot of us, Performance Automation Testing can easily become a huge mess. In this post I’ll take you through a story that ends with the most beautiful design I have ever tackled.

Before everything, I would like to thank Elad Baram for mentoring me through this design. Elad is Head Of Automation at Cato-Networks, the first SASE product ever made. If you are somehow into computer networks, you should read more about this awesome company.

I’m really into pictures of dogs and computers

Story time

Imagine for yourself a startup that runs fast. Really fast. In 4 years it grew from 3 employees to 400, and from $0 to $1B.

That’s the exact moment you don’t want your startup to fall.

If you fall at $1M, OK, no one will hear about it. But now, even five seconds of downtime for an enterprise client can put you on the front page of the New York Times, and not for the good reasons.

At Cato Networks, this was the story. We released a version every week; we ran extremely fast. From zero to hero: in 5 years Cato Networks became a unicorn and set the tone for SASE products.

That’s the exact moment the team starts to realize:

We need to know how well our product performs. WE NEED PERFORMANCE TESTING!

What is Performance Testing?

Your humble servant received a mission — “Orel, build us a performance testing roadmap”. So the first thing Orel did was research what exactly performance testing entails. After a lot of research, I came up with the following major types of performance testing:

  1. Load testing
  2. Spike testing
  3. Stress testing
  4. Endurance testing
  5. Volume testing
  6. Scalability testing

Since the goal of this post is the code design, not explaining how to create a performance testing roadmap, I’m not going to talk more about this topic. You can read more about it here.

For a moment I thought it was the same dog

The Problems

So I built a huge roadmap that could give us great results after 6 months, but management wanted results as fast as possible. We couldn’t afford to wait 6 months to find out when the next performance issue would occur. This led me to the following problems:

  • Generic types of tests — we wanted to easily choose between load testing, spike testing, etc.
  • Generic testing tool — for some scenarios hping3 could be the best tool, and for others it could be iperf, for example
  • Readability and ease of use — you want other people to easily use, change, and configure your test, maybe even people who are not into coding at all
  • Time — you want the results ASAP

This led us to build one test. One test with such a generic design, that it could meet all the above requirements.

It was like building a shoe for every foot size

The Algorithm

We quickly realized that a binary-search-style algorithm could drive any type of performance test.

The algorithm hunts for the breaking point: double the load while tests pass, halve it while they fail, and once you have one passing and one failing value, bisect the gap between them until it is smaller than the precision you need.

For example, if we wanted to find our throughput limit:

  1. We started with a baseline of 100 Mbps.
  2. If it worked, we jumped to 200 Mbps.
  3. If the above step didn’t work, we tried 150 Mbps (the midpoint).
  4. If it worked, we now know that our product works at 150 Mbps and fails at 200 Mbps. If a precision of 50 Mbps is sufficient for us, we stop here and write the results.

I’ll write the code here in Python, because for algorithms it’s easier to read (originally it was written in Java):

def binary_search(baseline: float, precision: float,
                  pass_result: float = None, fail_result: float = None):
    if run_test(baseline):
        pass_result = baseline
    else:
        fail_result = baseline
    if pass_result is None:
        # Nothing has passed yet: keep halving.
        binary_search(baseline / 2, precision, pass_result, fail_result)
    elif fail_result is None:
        # Nothing has failed yet: keep doubling.
        binary_search(baseline * 2, precision, pass_result, fail_result)
    elif fail_result - pass_result > precision:
        # Both bounds are known: bisect the gap until it is small enough.
        binary_search((pass_result + fail_result) / 2, precision,
                      pass_result, fail_result)
    else:
        write_results(pass_result, fail_result)
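To see the search actually converge, here is a self-contained iterative sketch in Java (the language the original was written in). The runTest stub is purely an assumption for illustration, pretending the product starts failing at 170 Mbps:

```java
public class FindLimit {

    // Stub for illustration only: pretend the product handles anything below 170 Mbps.
    static boolean runTest(double baseline) {
        return baseline < 170;
    }

    // Returns {highest known passing value, lowest known failing value},
    // bracketed to within the requested precision.
    static double[] findLimit(double baseline, double precision) {
        Double passResult = null;
        Double failResult = null;
        while (passResult == null || failResult == null
                || failResult - passResult > precision) {
            if (runTest(baseline)) {
                passResult = baseline;
            } else {
                failResult = baseline;
            }
            if (passResult == null) {
                baseline /= 2;                              // nothing passed yet: halve
            } else if (failResult == null) {
                baseline *= 2;                              // nothing failed yet: double
            } else {
                baseline = (passResult + failResult) / 2;   // bisect the gap
            }
        }
        return new double[] {passResult, failResult};
    }

    public static void main(String[] args) {
        double[] limits = findLimit(100, 50);
        // With this stub: 100 passes, 200 fails, 150 passes, and the
        // 150-200 gap equals the precision, so we stop.
        System.out.println(limits[0] + " - " + limits[1]);
    }
}
```

Running it with a baseline of 100 and precision of 50 reports the pass/fail bracket 150.0 / 200.0, exactly like the numbered example above.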

The Design (finally)

We created a json file that looked like this:

{
  "client": str,
  "server_under_test": str,
  "test_type": str,
  "testing_tool": str,
  "precision": float,
  "baseline": float
}
  • client/server_under_test — IPs of the machines
  • test_type — one of the types I mentioned in the “What is Performance Testing?” part
  • testing_tool — hping3/iperf/ixia/something else
  • precision — the difference we allow between the passed test and the failed test; basically, this is the condition to stop the test
  • baseline — the number to start the test from
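For instance, a concrete file could look like this (the values here are hypothetical, matching the throughput example above):

```json
{
  "client": "10.0.0.2",
  "server_under_test": "10.0.0.1",
  "test_type": "load",
  "testing_tool": "iperf",
  "precision": 50.0,
  "baseline": 100.0
}
```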

Here is the simplicity of the design, sketched without a formal UML diagram:

Simplicity of the original design

As you can see, there are three design patterns here: Factory, Builder, and Singleton. Also, using this design, with smart logic inside the PerformanceTest class, we could run all the scenarios requested.

Let’s deep dive into the code

Side note: except for the algorithm, from now on I’m going to write the code in this post in Java. Peter Norvig demonstrated that 16 out of the 23 design patterns in the GoF book are invisible or simpler in dynamic languages such as Python, and heavy pattern code just feels against the Zen of Python anyway. For me, Java is like a piano for design: when we talk about design we should use Java, the same way we sit at a piano when we compose music.

IterationResult

A simple, pretty straightforward Builder pattern that we will use going forward in our code:

public class IterationResult {
    private String server;
    private String client;
    private double baseline;
    private boolean isPassed;

    private IterationResult() {
        // Constructor is private: instances are created only through the Builder.
    }

    public static class Builder {
        private String server;
        private String client;
        private double baseline;
        private boolean isPassed;

        public Builder() {
        }

        public Builder withServer(String server) {
            this.server = server;
            return this; // By returning the builder each time, we create a fluent interface.
        }

        public Builder withClient(String client) {
            this.client = client;
            return this;
        }

        public Builder withBaseline(double baseline) {
            this.baseline = baseline;
            return this;
        }

        public Builder withIsPassed(boolean isPassed) {
            this.isPassed = isPassed;
            return this;
        }

        public IterationResult build() {
            IterationResult iterationResult = new IterationResult();
            iterationResult.server = this.server;
            iterationResult.client = this.client;
            iterationResult.baseline = this.baseline;
            iterationResult.isPassed = this.isPassed;
            return iterationResult;
        }
    }
}

IterationFactory

A simple Factory pattern for creating iterations

Step one- Iteration interface:

public interface Iteration {
    IterationResult run(float baseline, String clientIp, String serverIp);
}

Step two- Creating Iterations:

Hping:

public class HpingIteration implements Iteration {
    @Override
    public IterationResult run(float baseline, String clientIp, String serverIp) {
        String result = SshConnector.Instance.sendCommand("hping3 " + serverIp + " " + baseline, clientIp);
        return new IterationResult.Builder()
                .withServer(serverIp)
                .withClient(clientIp)
                .withBaseline(baseline)
                .withIsPassed(this.parseResultIsPassed(result))
                .build();
    }
}

Iperf:

public class IperfIteration implements Iteration {
    @Override
    public IterationResult run(float baseline, String clientIp, String serverIp) {
        String result = SshConnector.Instance.sendCommand("iperf -c " + serverIp + " -b " + baseline + "m", clientIp);
        return new IterationResult.Builder()
                .withServer(serverIp)
                .withClient(clientIp)
                .withBaseline(baseline)
                .withIsPassed(this.parseResultIsPassed(result))
                .build();
    }
}

Step three- Iteration Factory

(From here on I stop passing the parameters (clientIp/serverIp/baseline/etc.) around; if you have read this far, I hope you get the idea.)

public class IterationFactory {

    public Iteration getIteration(String iterationType) {
        if (iterationType == null) {
            return null;
        }
        if (iterationType.equalsIgnoreCase("curl")) {
            return new CurlIteration();
        } else if (iterationType.equalsIgnoreCase("hping3")) {
            return new HpingIteration();
        } else if (iterationType.equalsIgnoreCase("iperf")) {
            return new IperfIteration();
        }
        return null;
    }
}

The Test

Because we don’t like recursive code, we consumed the above infrastructure in the following way:

public class PerformanceTest extends Test {

    public void run(JsonObject json) {
        String clientIp = json.get("clientIp").getAsString();
        String serverIp = json.get("serverIp").getAsString();
        // ...etc...
        BinarySearch binarySearch = new BinarySearch(baseline, precision, clientIp, serverIp, testType, tool);
        while (binarySearch.isShouldContinue()) {
            binarySearch.runNextIteration();
        }
        writeResults(runDate, binarySearch.getPassedResults(), binarySearch.getFailedResults());
    }
}

Since the BinarySearch object is pretty straightforward to implement, I’ll leave it up to you to work out how I did it; I already wrote the algorithm for you above. Don’t forget to use Singleton.
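As a hint for that Singleton, here is one hedged sketch of what SshConnector.Instance could look like. The enum trick gives Java a thread-safe, lazily initialized singleton for free; the method body is a stub for illustration, not the real SSH logic:

```java
public enum SshConnector {
    Instance;

    // Stub for illustration: the real implementation would open (or reuse)
    // an SSH session to hostIp, run the command, and return its stdout.
    public String sendCommand(String command, String hostIp) {
        return "[" + hostIp + "] " + command;
    }
}
```

Every call site then reads SshConnector.Instance.sendCommand(...), exactly as in the Iteration classes above, with a single connection manager shared by the whole test.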

Conclusions

Sometimes limitations can lead to brilliant ideas.

I need to sleep.

Bye. ❤


Senior Staff Software Engineer at Palo Alto Networks. I love security, computer networks, and pictures of dogs in front of a computer.