Comparing evaluations of Endpoint Detection and Response (EDR) solutions
Laptops and PCs are an important point of entry for cyber attackers because they are operated by humans, who can easily be tricked into clicking on malicious links or email attachments. When those laptops and PCs are used in an enterprise they are called endpoints, and they often run special software to mitigate cyber threats.
But which EDR solution is the best?
MITRE
Thanks to MITRE, an American non-profit organisation, we can now compare the performance of various Endpoint Detection and Response solutions. This evaluation is unique because it puts a well-documented cyber threat in a lab environment and tracks detection throughout the attack path. MITRE published the results, but deliberately without ranking, scoring or rating. Make up your own mind.
This is MITRE:
And apparently they’re not afraid to point out that “cyber” is the fifth domain of warfare, alongside land, sea, air and space.
In the cyber domain they’re famous for creating the MITRE ATT&CK matrix, an information product that helps organisations think about their cyber defence in a more attacker-oriented way: from initial access via privilege escalation and lateral movement to impact.
We approve of us
MITRE doesn’t assign scores in their EDR evaluation, and in this ranking vacuum you can only imagine what most vendors did:
- “Carbon Black outperforms all other EDR solutions” (source)
- “CounterTack Platform leads with fast automated detections” (source)
- “CrowdStrike Falcon […] the most effective EDR solution” (source)
- “[…] Endgame as the first zero training endpoint protection” (source)
- “FireEye Endpoint Security […] the most effective EDR solution” (source)
- “[…] evaluation showcases the effectiveness of SentinelOne’s platform” (source)
- “[…] Windows Defender ATP demonstrated industry-leading optics […]” (source)
- “[…] Cybereason best enables defenders […]” (source)
- “Cortex XDR and Traps Outperform in MITRE Evaluation” (source)
Unbiased wall of charts
Loading the evaluation results into Splunk, via this Python script, leads to the charts below.
The evaluation simulated 136 steps of an advanced persistent threat. For example, the first chart shows that the vendor leading the chart for main detection type “None” failed to detect 60 of the 136 attacker steps.
You can read up on the main detection types here.
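Tallying main detection types per vendor boils down to a simple count over the result records. The sketch below uses a made-up, flattened record layout (vendor, step, detection type) rather than the actual MITRE evaluation JSON schema, purely to illustrate the kind of aggregation behind the chart:

```python
from collections import Counter

# Hypothetical, simplified result records: one per (vendor, attacker step).
# The real MITRE evaluation data is richer; this only illustrates the tally.
results = [
    {"vendor": "VendorA", "step": 1, "detection": "None"},
    {"vendor": "VendorA", "step": 2, "detection": "Telemetry"},
    {"vendor": "VendorA", "step": 3, "detection": "Specific Behavior"},
    {"vendor": "VendorB", "step": 1, "detection": "None"},
    {"vendor": "VendorB", "step": 2, "detection": "None"},
    {"vendor": "VendorB", "step": 3, "detection": "Telemetry"},
]

def tally_detections(records):
    """Count main detection types per vendor."""
    counts = {}
    for r in records:
        counts.setdefault(r["vendor"], Counter())[r["detection"]] += 1
    return counts

counts = tally_detections(results)
# counts["VendorB"]["None"] is 2: two of VendorB's steps went undetected.
```

In Splunk the same aggregation would be a `stats count by vendor, detection` search; the Python version just makes the logic explicit.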
Another dimension
Three other modifiers are included in the evaluation. Did you already read up on the detection types and modifiers here?
The ultimate noise generator
If you only care about raw data and do correlation on your own outside of the EDR solution, this chart ranks vendors based on Telemetry, Enrichment and IOC results:
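A ranking like this is just a sum of three detection-type counts per vendor, sorted descending. The numbers below are invented for illustration; the real ones come from the evaluation data:

```python
# Hypothetical per-vendor counts of detection types (not real results).
counts = {
    "VendorA": {"Telemetry": 40, "Enrichment": 10, "IOC": 5},
    "VendorB": {"Telemetry": 25, "Enrichment": 20, "IOC": 2},
}

def raw_data_score(c):
    """Total of the raw-data detection types: Telemetry + Enrichment + IOC."""
    return c.get("Telemetry", 0) + c.get("Enrichment", 0) + c.get("IOC", 0)

# Highest raw-data count first.
ranking = sorted(counts, key=lambda v: raw_data_score(counts[v]), reverse=True)
```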
The ultimate signal detector in a sea of noise
If you look for EDR solutions based on signal/noise ratio, this chart ranks vendors by General or Specific Behaviour results divided by the number of Telemetry, Enrichment or IOC results:
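The signal/noise ratio described above can be sketched as follows, again with invented per-vendor counts rather than the real evaluation numbers. Behaviour detections count as signal, raw-data detections as noise:

```python
# Hypothetical per-vendor counts of detection types (not real results).
counts = {
    "VendorA": {"General Behavior": 10, "Specific Behavior": 20,
                "Telemetry": 40, "Enrichment": 10, "IOC": 5},
    "VendorB": {"General Behavior": 15, "Specific Behavior": 25,
                "Telemetry": 25, "Enrichment": 20, "IOC": 2},
}

def signal_noise(c):
    """(General + Specific Behavior) / (Telemetry + Enrichment + IOC)."""
    signal = c.get("General Behavior", 0) + c.get("Specific Behavior", 0)
    noise = c.get("Telemetry", 0) + c.get("Enrichment", 0) + c.get("IOC", 0)
    return signal / noise if noise else float("inf")

# Highest signal-to-noise ratio first.
ranking = sorted(counts, key=lambda v: signal_noise(counts[v]), reverse=True)
```

Note that with these made-up numbers the ranking flips compared to the raw-data view: a vendor producing fewer raw events can still score a better ratio.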
Draw your own conclusions
- Download Splunk for free (Windows, macOS, Linux)
- Download and install the free app from Splunkbase or GitHub to play with the data yourself.
There are endless possibilities to slice and dice the EDR evaluation results. If you do, drop me a line with how and why.