Non-Functional Metrics Evaluation of the MakeMyTrip iOS App

Note: The objective of this blog is to share how we at MakeMyTrip evaluate key metrics of an iOS app in the pre-production phase via an automated solution. We focus on how we capture non-functional metrics (memory, CPU, network) to validate app performance during the development phase of our iOS app.

1. Introduction:

Apple provides various developer tools to check the performance of iOS apps, the most common being Instruments and its templates such as Leaks, Time Profiler, and Allocations.

But all these profiling tools need manual intervention. We can't manually check for performance issues all the time, as it is very cumbersome to maintain the same environment and preconditions for each test run. There should be an automated way to check the performance of an app on a regular basis. Fortunately, there are a few ways in which we can achieve automated performance testing of iOS apps:

  • Running Instruments checks on a periodic basis
  • Automated performance tests using the XCUITest framework

In this post, we will cover automated performance tests using XCUITest.

How iOS Performance Metrics Differ from Android:

iOS is a closed-source operating system developed by Apple specifically for its mobile devices. In contrast, Android is Google's open-source operating system, which offers wide customization to the developer community and to third-party manufacturers. While iOS uses the XNU kernel, developed in C/C++ and Objective-C, Android is built on the Linux kernel, written in C/C++.
On Android we can easily obtain performance metrics via adb commands or the available Google APIs, but to capture the same insight in an iOS app we first have to talk to the kernel (from Objective-C or Swift) and then extract the information using the libraries Apple provides.
On iOS, memory is not further fragmented into native and Dalvik heaps; we only deal with the running memory of the app, and for CPU utilization we capture total CPU usage. On Android, CPU time is further categorized into user and kernel space.

2. What to Measure: ⚙️

Non-functional metrics to measure

Our iOS automation testing framework makes it easy to write reliable user-interface tests that validate app components at the story level. We identified the key tests for which we wanted to evaluate app metrics and wrote them using the XCUITest framework.

After every @test passes, we compute the memory, CPU, and network data used by that individual test case. We then dump these stats into a database by hitting an API (we cannot access the DB directly inside the instrumentation test, so the API acts as the bridge), passing parameters such as device name and app version.

To capture performance insight for each test case, the following common fields are captured and stored in the DB (a sketch of the payload follows the list):

* Test case name
* Device id
* App version
* Time-stamp when data was captured for a particular test case
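A minimal sketch of how these stats can be pushed to the reporting API is shown below. The struct, field names, and endpoint URL are illustrative assumptions, not the actual MakeMyTrip API.

```swift
import Foundation

// Hypothetical payload mirroring the fields listed above plus the captured metrics.
struct NFRMetricsPayload: Codable {
    let testCaseName: String
    let deviceId: String
    let appVersion: String
    let timestamp: Date
    let memUsedApp: UInt64      // bytes used by the app during the test
    let cpuUsedTotal: Float     // total CPU usage in percent
    let mobileTotalData: UInt64 // bytes over the cellular interface
    let wifiTotalData: UInt64   // bytes over the Wi-Fi interface
}

// The test process cannot write to the DB directly, so the stats are POSTed
// to an API that persists them.
func uploadMetrics(_ payload: NFRMetricsPayload,
                   to endpoint: URL,
                   completion: @escaping (Error?) -> Void) {
    var request = URLRequest(url: endpoint)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")

    let encoder = JSONEncoder()
    encoder.dateEncodingStrategy = .iso8601
    request.httpBody = try? encoder.encode(payload)

    URLSession.shared.dataTask(with: request) { _, _, error in
        completion(error)
    }.resume()
}
```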

2.1 Capture Memory Usage Metrics After Every Test Scenario:

  • How much memory is used by the application and by the operating system, how much memory is available to the app, etc.

👉 Method to Capture Memory consumption:

compute Memory usage after every @test pass

Using the mach_task_basic_info() API and ProcessInfo.processInfo (exposed through the mach.h and mach_host.h headers), we capture runtime memory utilization data.

The MemoryNFR metric captures the information below for every test case we run at the story level (a code sketch of how these values can be read follows the list).

memUsedTotal: memory used by device
memUsedFree: memory available for app
memUsedWired: memory used by operating system
memUsedApp: memory used by application itself
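The sketch below shows one way to read such values with the standard Mach APIs: the calling process's resident memory via task_info(), the host's free/wired memory via host_statistics64(), and total device memory via ProcessInfo. It is an illustrative version, not the exact in-house implementation.

```swift
import Darwin
import Foundation

// Resident memory of the calling process (the basis of memUsedApp).
func appResidentMemoryBytes() -> UInt64? {
    var info = mach_task_basic_info()
    var count = mach_msg_type_number_t(MemoryLayout<mach_task_basic_info>.size
                                       / MemoryLayout<natural_t>.size)
    let kr = withUnsafeMutablePointer(to: &info) {
        $0.withMemoryRebound(to: integer_t.self, capacity: Int(count)) {
            task_info(mach_task_self_, task_flavor_t(MACH_TASK_BASIC_INFO), $0, &count)
        }
    }
    return kr == KERN_SUCCESS ? info.resident_size : nil
}

// Host-wide free and wired memory (the basis of memUsedFree / memUsedWired).
func hostMemoryStats() -> (free: UInt64, wired: UInt64)? {
    var stats = vm_statistics64()
    var count = mach_msg_type_number_t(MemoryLayout<vm_statistics64>.size
                                       / MemoryLayout<integer_t>.size)
    let kr = withUnsafeMutablePointer(to: &stats) {
        $0.withMemoryRebound(to: integer_t.self, capacity: Int(count)) {
            host_statistics64(mach_host_self(), HOST_VM_INFO64, $0, &count)
        }
    }
    guard kr == KERN_SUCCESS else { return nil }
    let pageSize = UInt64(vm_kernel_page_size)
    return (free: UInt64(stats.free_count) * pageSize,
            wired: UInt64(stats.wire_count) * pageSize)
}

// Total physical memory of the device, via ProcessInfo as mentioned above.
let totalDeviceMemory = ProcessInfo.processInfo.physicalMemory
```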

Captured memory usage data in the DB:

Memory consumption data metrics in the DB

2.2 Capture CPU Usage Metrics After Every Test Scenario:

  • How much CPU is used by the application (system-space CPU time, user-space CPU time, CPU thread sleep time).

👉 Method to Capture CPU consumption:

compute CPU usage after every @test pass

We capture CPU usage, user-space and system-space CPU time, and CPU thread sleep time per thread, and combine the data across all running threads. Using fields such as Float(threadBasicInfo.cpu_usage) / Float(TH_USAGE_SCALE), threadBasicInfo.user_time.microseconds, threadBasicInfo.system_time.microseconds, and threadBasicInfo.sleep_time, we capture CPU usage data per test case.

The CPU metric captures the information below for every test case we run at the activity and scenario level (a code sketch follows the list).

cpuUsedTotal: total CPU used
usrSpaceTime: user space CPU time
systemSpaceTime: system space CPU time
threadSleepTime: CPU sleep time
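A minimal sketch of this per-thread sampling, assuming the standard task_threads() / thread_info() Mach calls rather than the exact in-house helper:

```swift
import Darwin
import Foundation

struct CPUSample {
    var cpuUsedTotal: Float = 0       // fraction of one core, summed over threads
    var usrSpaceTime: Int64 = 0       // user-space CPU time in microseconds
    var systemSpaceTime: Int64 = 0    // system-space CPU time in microseconds
    var threadSleepTime: Int32 = 0    // seconds the threads have been sleeping
}

// Samples CPU usage of the calling process by iterating its threads
// and combining the thread_basic_info fields mentioned above.
func sampleCPU() -> CPUSample? {
    var threads: thread_act_array_t?
    var threadCount = mach_msg_type_number_t(0)
    guard task_threads(mach_task_self_, &threads, &threadCount) == KERN_SUCCESS,
          let threadList = threads else { return nil }

    var sample = CPUSample()
    for i in 0..<Int(threadCount) {
        var info = thread_basic_info()
        var infoCount = mach_msg_type_number_t(THREAD_INFO_MAX)
        let kr = withUnsafeMutablePointer(to: &info) {
            $0.withMemoryRebound(to: integer_t.self, capacity: Int(infoCount)) {
                thread_info(threadList[i], thread_flavor_t(THREAD_BASIC_INFO), $0, &infoCount)
            }
        }
        guard kr == KERN_SUCCESS else { continue }

        sample.cpuUsedTotal    += Float(info.cpu_usage) / Float(TH_USAGE_SCALE)
        sample.usrSpaceTime    += Int64(info.user_time.seconds) * 1_000_000
                                + Int64(info.user_time.microseconds)
        sample.systemSpaceTime += Int64(info.system_time.seconds) * 1_000_000
                                + Int64(info.system_time.microseconds)
        sample.threadSleepTime += info.sleep_time
    }

    // Release the thread list allocated by task_threads().
    let size = vm_size_t(Int(threadCount) * MemoryLayout<thread_t>.stride)
    vm_deallocate(mach_task_self_, vm_address_t(UInt(bitPattern: threadList)), size)

    return sample
}
```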
CPU consumption data (in %) in the DB

2.3 Capture Network Data Usage Metrics After Every Test Scenario:

  • How much mobile data and Wi-Fi data is used (e.g. under 2G, 3G, or LTE network conditions).
  • RX data (data received by the app), TX data (data transmitted by the app), and total mobile data usage.
  • Wi-Fi RX data (Wi-Fi data received by the app), TX data (Wi-Fi data transmitted by the app), and total Wi-Fi data usage.

👉 Method to Capture Network Data consumption:

compute Network usage after every @test pass

We extract network utilization data for 2G, 3G, LTE, or Wi-Fi, depending on the network the device is connected to at the time of the test. The network type is detected automatically, and by including the arpa/inet.h, net/if.h, ifaddrs.h, and net/if_dl.h headers we capture network statistics for the running test.

The NetworkData metric captures the information below for every test case we run at the activity and scenario level (a code sketch follows the list):

mobileTxData : mobile transmitted data
mobileRxData : mobile received data
mobileTotalData : mobile total data
wifiTxData : wifi transmitted data
wifiRxData : wifi received data
wifiTotalData : wifi total data
networkType : mobile or WiFi
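A minimal sketch of reading these counters with getifaddrs() is below. The interface-name prefixes (en for Wi-Fi, pdp_ip for cellular) are the conventional ones on iOS; the counters are cumulative since boot, so the usage of a single test is the difference between samples taken before and after it.

```swift
import Foundation

struct NetworkDataSample {
    var wifiRxData: UInt64 = 0
    var wifiTxData: UInt64 = 0
    var mobileRxData: UInt64 = 0
    var mobileTxData: UInt64 = 0
    var wifiTotalData: UInt64 { wifiRxData + wifiTxData }
    var mobileTotalData: UInt64 { mobileRxData + mobileTxData }
}

// Sums per-interface byte counters from the link-layer (AF_LINK) entries
// returned by getifaddrs().
func sampleNetworkData() -> NetworkDataSample? {
    var sample = NetworkDataSample()
    var addrs: UnsafeMutablePointer<ifaddrs>?
    guard getifaddrs(&addrs) == 0, let first = addrs else { return nil }
    defer { freeifaddrs(addrs) }

    var cursor: UnsafeMutablePointer<ifaddrs>? = first
    while let ifa = cursor {
        defer { cursor = ifa.pointee.ifa_next }
        guard let addr = ifa.pointee.ifa_addr,
              addr.pointee.sa_family == UInt8(AF_LINK),
              let data = ifa.pointee.ifa_data?.assumingMemoryBound(to: if_data.self)
        else { continue }

        let name = String(cString: ifa.pointee.ifa_name)
        let rx = UInt64(data.pointee.ifi_ibytes)   // bytes received on this interface
        let tx = UInt64(data.pointee.ifi_obytes)   // bytes transmitted on this interface

        if name.hasPrefix("en") {            // Wi-Fi interfaces (en0, en1, ...)
            sample.wifiRxData += rx
            sample.wifiTxData += tx
        } else if name.hasPrefix("pdp_ip") { // cellular interfaces
            sample.mobileRxData += rx
            sample.mobileTxData += tx
        }
    }
    return sample
}
```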
Network data (in bytes) in the DB

2.4 iOS Performance Testing Automated Solution Using XCUITest:

Fragmented test tools were taking too long to deliver results: mobile performance engineers were measuring devices manually with developer tools that did not provide aggregated data over time.

The in-house solution we built captures non-functional metrics (memory, CPU, network, etc.) through XCUITest instrumentation; the metrics are collected directly from the XCUITest scripts.

We captured results from scripts written in XCUITest, validated them against the results of the iOS profiling tools, and found them to be the same.

iOS NFR Automation Framework Architecture
  • *In the above architecture, LOB pages stand for Line of Business pages; Hotels, Flights, etc. are each an LOB for us.*

XCUITest is a testing framework for iOS that makes it easy to write reliable user-interface tests.

Once a UI test succeeds, the focus shifts to capturing NFR data such as memory usage, CPU usage, and network data consumption for that test case and dumping these stats into the database, so that we can compare them after the next release of the app. A sketch of how such a hook can be wired into an XCUITest case is shown below.
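This sketch assumes the helper functions from the earlier snippets (appResidentMemoryBytes, sampleCPU, sampleNetworkData, uploadMetrics) are available in the test target; the test flow, UI identifiers, and endpoint URL are illustrative, not the actual MakeMyTrip test suite.

```swift
import XCTest
import UIKit

class HotelFunnelNFRTests: XCTestCase {
    let app = XCUIApplication()

    override func setUp() {
        super.setUp()
        continueAfterFailure = false
        app.launch()
    }

    func testHotelSearchFunnel() {
        // Story-level UI flow under measurement (identifiers are illustrative).
        app.buttons["Hotels"].tap()
        app.searchFields["City"].tap()
        app.searchFields["City"].typeText("Goa")
        app.buttons["Search"].tap()
        XCTAssertTrue(app.tables["hotelResults"].waitForExistence(timeout: 10))
    }

    override func tearDown() {
        // Only capture and upload NFR data when the UI test itself passed.
        if testRun?.hasSucceeded == true, let memUsed = appResidentMemoryBytes() {
            let net = sampleNetworkData()
            let payload = NFRMetricsPayload(
                testCaseName: name,
                deviceId: UIDevice.current.identifierForVendor?.uuidString ?? "unknown",
                appVersion: Bundle.main.infoDictionary?["CFBundleShortVersionString"] as? String ?? "unknown",
                timestamp: Date(),
                memUsedApp: memUsed,
                cpuUsedTotal: sampleCPU()?.cpuUsedTotal ?? 0,
                mobileTotalData: net?.mobileTotalData ?? 0,
                wifiTotalData: net?.wifiTotalData ?? 0)
            uploadMetrics(payload, to: URL(string: "https://example.internal/nfr")!) { _ in }
        }
        super.tearDown()
    }
}
```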

2.5 Visualization and Reporting:

We compute memory, CPU, and network usage for each test case in each module and run the suite at least 3 to 5 times. The computed data is dumped into the test DB, and we compare the average of the previous version's data with the average of the current version's data available in the DB: essentially an average-to-average comparison for every test.

How we monitor the pass/fail criteria:

Pass/fail criterion for every test case:

If (current version data <= previous version data + 20% of previous version data) then PASS, else FAIL.
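A minimal sketch of this check, assuming the 20% tolerance acts as an upper bound on the previous release's average for the same test case and metric:

```swift
// PASS when the current average does not exceed the previous average
// by more than the allowed tolerance (20% by default).
func isWithinThreshold(current: Double,
                       previous: Double,
                       tolerance: Double = 0.20) -> Bool {
    return current <= previous * (1 + tolerance)
}

// Illustrative values: average app memory (MB) and CPU (%) for one test case.
let memOK = isWithinThreshold(current: 182.0, previous: 160.0) // true  → PASS (within +20%)
let cpuOK = isWithinThreshold(current: 48.0,  previous: 35.0)  // false → FAIL (regression above +20%)
```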

Overall report visualization with pass/fail percentage

To generate the report, the automated suite run compares every test case's data with the same test case's data from the previous version. The comparison is average-to-average for each metric, and if the computed data does not satisfy the pass/fail criterion above, we mark the case as failed and analyse the cause of the increased metric.

Fail Cases Insight:

Fail-case insights after comparing data from the current iOS app release with the previous release.

We can also compare any release's data with any previous release to get deeper insight into failed cases.

👉 Memory Consumption:

Memory consumption fail case info

👉 CPU Consumption:

CPU consumption fail case info

👉 Network Consumption:

Network data consumption fail case info

With the above solution, we can generate greater insight into iOS performance metrics.

Written by Vishvnath Pratap Singh, MakeMyTrip Engineering Team

MakeMyTrip Engineering & Data Science