The power of Chrome DevTools Protocol — Part IV

German Bisogno
Nov 1, 2022 · 4 min read


Part IV — Performance domain

In the previous article, we covered the powerful features of the Runtime domain in CDP and how to use them in Selenium tests. In this article, we will cover the Performance domain and how we can capture performance metrics from the page during our test run.

You can find the complete code in the accompanying repository. Let’s jump into it!

Using Performance domain

The main purpose of the Performance domain is to get metrics from the page. We will focus on the getMetrics method, which allows us to extract runtime metrics during test execution.

Please find below some of the metrics that can be obtained through this API; if you need more information about them, you can check the Puppeteer documentation:

  • Timestamp: The timestamp when the metrics sample was taken.
  • Documents: Number of documents on the page.
  • Frames: Number of frames on the page.
  • JSEventListeners: Number of event listeners on the page.
  • Nodes: Number of DOM nodes on the page.
  • LayoutCount: The total number of full or partial page layouts.
  • RecalcStyleCount: The total number of page style recalculations.
  • LayoutDuration: Combined durations of all page layouts.
  • RecalcStyleDuration: Combined duration of all page style recalculations.
  • ScriptDuration: Combined duration of JavaScript execution.
  • TaskDuration: Combined duration of all tasks performed by the browser.
  • JSHeapUsedSize: Used JavaScript heap size.
  • JSHeapTotalSize: Total JavaScript heap size.

If you want to get other web vitals KPIs like LCP (Largest Contentful Paint), CLS (Cumulative Layout Shift), or TBT (Total Blocking Time), you can use Google Lighthouse, which won’t be covered in this article.

How do we automate?

Let’s implement a Performance class written in TypeScript, which again extends TraceOperations, as in previous articles. All npm dependencies used in the previous article will be required.

In this class, we use Performance.enable to enable collecting and reporting metrics and Performance.disable to stop it. We also add a getMetrics method that returns the metrics, so we can assert on them in our test or store them in a file.
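As a sketch of what such a class might look like (the CdpSend type is an illustrative stand-in for whatever CDP transport the TraceOperations base class provides, not the exact code from the repository):

```typescript
// Sketch only: CdpSend is an assumption standing in for the CDP
// connection used in the earlier articles of this series.
type CdpSend = (method: string, params?: object) => Promise<any>;

interface Metric {
  name: string;
  value: number;
}

interface GetMetricsResponse {
  metrics: Metric[];
}

class Performance {
  constructor(private readonly send: CdpSend) {}

  // Enable collecting and reporting metrics.
  async enable(): Promise<void> {
    await this.send('Performance.enable');
  }

  // Disable collecting and reporting metrics.
  async disable(): Promise<void> {
    await this.send('Performance.disable');
  }

  // Return the current runtime metrics snapshot.
  async getMetrics(): Promise<GetMetricsResponse> {
    return (await this.send('Performance.getMetrics')) as GetMetricsResponse;
  }
}
```

The CDP method names (Performance.enable, Performance.disable, Performance.getMetrics) are the real protocol commands; only the surrounding plumbing is sketched.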

Notice that Protocol.Performance.GetMetricsResponse is the object that contains the metrics we will return in our getMetrics method. This is what it looks like:

export interface GetMetricsResponse {
    /**
     * Current values for run-time metrics.
     */
    metrics: Metric[];
}

Then, the Metric interface has the following format:

/**
 * Run-time execution metric.
 */
export interface Metric {
    /**
     * Metric name.
     */
    name: string;
    /**
     * Metric value.
     */
    value: number;
}

For more information about the TypeScript definitions for this protocol, please check types/protocol.d.ts.

Developing a test!

Now let’s write a test that uses the Performance domain to capture performance metrics around a user action. The startTrace and stopTrace methods come into action again.
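A minimal sketch of the capture logic, assuming a getMetrics function like the one above is available (the helper name captureMetricsAround is mine, not from the repository):

```typescript
import * as fs from 'fs';

// Sketch only: take a metrics snapshot before and after a user action
// and persist both snapshots so they can be compared afterwards.
async function captureMetricsAround(
  getMetrics: () => Promise<object>,
  action: () => Promise<void>,
): Promise<void> {
  // Snapshot when tracing starts.
  fs.writeFileSync('startTrace.json', JSON.stringify(await getMetrics(), null, 2));
  await action(); // the user action under measurement
  // Snapshot when tracing ends.
  fs.writeFileSync('endTrace.json', JSON.stringify(await getMetrics(), null, 2));
}
```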

As a result, this test will output two files, startTrace.json and endTrace.json: the former logs the metrics when tracing starts and the latter when tracing ends. After running the test, you will have two files whose content will look similar to the following:

{ metrics:
   [ { name: 'Timestamp', value: 1760243.235006 },
     { name: 'AudioHandlers', value: 3 },
     { name: 'Documents', value: 9 },
     { name: 'Frames', value: 4 },
     { name: 'JSEventListeners', value: 1455 },
     { name: 'LayoutObjects', value: 1745 },
     { name: 'MediaKeySessions', value: 0 },
     { name: 'MediaKeys', value: 0 },
     { name: 'Nodes', value: 2778 },
     { name: 'Resources', value: 56 },
     { name: 'ContextLifecycleStateObservers', value: 17 },
     { name: 'V8PerContextDatas', value: 3 },
     { name: 'WorkerGlobalScopes', value: 0 },
     { name: 'UACSSResources', value: 0 },
     { name: 'RTCPeerConnections', value: 0 },
     { name: 'ResourceFetchers', value: 9 },
     { name: 'AdSubframes', value: 0 },
     { name: 'DetachedScriptStates', value: 1 },
     { name: 'ArrayBufferContents', value: 42 },
     { name: 'LayoutCount', value: 37 },
     { name: 'RecalcStyleCount', value: 115 },
     { name: 'LayoutDuration', value: 0.25534 },
     { name: 'RecalcStyleDuration', value: 0.092492 },
     { name: 'DevToolsCommandDuration', value: 0.14261 },
     { name: 'ScriptDuration', value: 0.839995 },
     { name: 'V8CompileDuration', value: 0.063373 },
     { name: 'TaskDuration', value: 2.691882 },
     { name: 'TaskOtherDuration', value: 1.298072 },
     { name: 'ThreadTime', value: 1.760892 },
     { name: 'ProcessTime', value: 2.734375 },
     { name: 'JSHeapUsedSize', value: 17165052 },
     { name: 'JSHeapTotalSize', value: 34803712 },
     { name: 'FirstMeaningfulPaint', value: 0 },
     { name: 'DomContentLoaded', value: 1760241.873843 },
     { name: 'NavigationStart', value: 1760241.082286 } ] }

With this information, you can compare the performance status of the page at any moment during your Test Automation.

With these metrics, you can measure the duration of a test step, for example by using Timestamp to take the initial and final times, read JSHeapUsedSize, which represents the memory used by JavaScript, or evaluate any other metric to ensure the expected performance of your app.
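As an illustration (the helper names metricValue and stepDuration are mine), computing a step duration from two snapshots could look like this:

```typescript
interface Metric {
  name: string;
  value: number;
}

interface GetMetricsResponse {
  metrics: Metric[];
}

// Look up a single metric by name in a Performance.getMetrics response.
function metricValue(response: GetMetricsResponse, name: string): number | undefined {
  return response.metrics.find((m) => m.name === name)?.value;
}

// Elapsed time (in seconds) between two snapshots, using the Timestamp metric.
function stepDuration(start: GetMetricsResponse, end: GetMetricsResponse): number {
  return (metricValue(end, 'Timestamp') ?? 0) - (metricValue(start, 'Timestamp') ?? 0);
}
```

The same metricValue helper can be used to compute, say, the JSHeapUsedSize delta across a step and assert it stays under a chosen budget.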


We explored the Performance domain and some of its features. This domain helps you evaluate metrics from the page and perform the necessary assertions in your Test Automation using the values returned by this API.

And with that, we have completed a set of four articles about CDP and its powerful features, covering the Tracing, Network, Runtime, and Performance domains.

I invite you to explore more domains by yourself and find how useful this protocol can be in your Test Automation!

I hope you have enjoyed this journey! See you next time!

Thanks for reading!