Software Energy-Efficiency: Resource Adaptation Tactics

Max Meinhardt
May 25, 2023 · 10 min read

Software energy-efficiency resource adaptation tactics focus on adapting hardware and software resource architectures to improve software energy efficiency [1]. In this article, we explore effective tactics for reducing overhead, adapting services, and maximizing application efficiency.

The content of this article builds upon the software energy-efficiency Resource Adaptation category introduced in the first article of this series on software energy efficiency.


1) Reduce Overhead

Overhead refers to surplus software functionality and resources that are either unused or can be reduced in size and scope without compromising performance. It often arises from over-engineering, or from inadequate integration that leaves “zombie” code behind.

1.1) Make Software Cloud-Native (cloud-specific)

Cloud-native software applications comprise microservices, often packaged into containers, that integrate seamlessly with cloud environments. These microservices collaboratively form an application, with each able to scale independently and be improved and iterated on continuously through orchestration and automation. This inherent flexibility contributes to the agility of cloud-native applications, enabling them to adapt and leverage cloud resources for optimized performance, cost efficiency, and energy efficiency.

Furthermore, this architecture enables shared microservices to be hosted across multiple applications, such as a web app and a native mobile app, facilitated by service brokering. It also separates application functionality, allowing different development teams to be assigned to specific services.

A common use case for transforming an application into a cloud-native one involves migrating a web application from a monolithic architecture (e.g., one that relies on a single web server) to a containerized microservices architecture. This entails refactoring the software into multiple microservices. Although this refactoring generates additional software and thus some overhead, the cloud provider’s energy-saving features can outweigh it, yielding a net gain in energy efficiency.
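
As a rough sketch of what one extracted microservice might look like, the following Spring Boot class exposes a single, narrowly scoped endpoint that can be packaged into its own container and scaled independently. The service, endpoint, and class names are illustrative, not taken from any specific migration.

```java
// Hypothetical order-lookup microservice split out of a monolithic web app.
// Endpoint and class names are illustrative only.
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

@SpringBootApplication
@RestController
public class OrderServiceApplication {

    public static void main(String[] args) {
        SpringApplication.run(OrderServiceApplication.class, args);
    }

    // A single, narrowly scoped responsibility: this service can be packaged
    // into its own container and scaled independently of the rest of the app.
    @GetMapping("/orders/{id}")
    public String getOrder(@PathVariable String id) {
        return "{\"orderId\": \"" + id + "\", \"status\": \"SHIPPED\"}";
    }
}
```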

1.2) Adopt Use-Case-Driven Design


The use-case-driven design approach, also known as the “one-client approach” to software architecture, typically entails lower overhead than domain-driven design, which emphasizes scalability. Domain-driven design, however, demands more careful planning and a deep understanding of the product’s long-term business logic, whereas use-case-driven design focuses on specific problem-solving aspects that are often easier to quantify during the early stages of software design.

During the software implementation phase, the use-case design approach prompts developers to address technical details sooner rather than later, enabling faster coding. However, this expedited approach may sacrifice scalability and introduce maintenance challenges down the line. To mitigate these concerns, it is recommended to combine multiple design methodologies, striking a balance that minimizes unnecessary layers of abstraction and redundant functionality while achieving the ultimate goal of efficient and maintainable software architecture.

2) Service Adaptation

In the context of service adaptation, Vos et al. [1] emphasize the importance of selecting services based on energy-related information. For instance, a cloud service broker (CSB) can distinguish itself by incorporating energy-related data into its service APIs, offering comprehensive application-energy data collection, and providing graphical visualization capabilities in its user interface. Ideally, leveraging the energy data from these APIs enables applications to develop a more precise energy model, enhancing the accuracy of static code energy analyzers and runtime energy optimization middleware, such as dynamic voltage and frequency scaling (DVFS) manipulation.
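
Because such energy-reporting APIs are broker-specific, the following is only a hypothetical Java sketch: it queries an assumed broker endpoint for per-service energy metrics that an application could feed into its energy model. The URL, JSON shape, and field names are invented for illustration.

```java
// Hypothetical example: querying a cloud service broker's energy-reporting API.
// The endpoint and response format are invented for illustration.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class EnergyAwareServiceSelector {

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // Assumed broker endpoint that returns per-service energy metrics.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://broker.example.com/api/services/energy"))
                .header("Accept", "application/json")
                .GET()
                .build();

        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());

        // In a real application the JSON would be parsed and fed into an
        // energy model; here we simply print the raw energy report.
        System.out.println(response.body());
    }
}
```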

3) Increase Software Application Efficiency

These tactics are aimed at optimizing the energy efficiency of software applications during their implementation.

3.1) Make Resources Static

Maximizing the use of static resources improves software performance and long-term energy efficiency by minimizing real-time processing. For instance, a static website can load up to 10 times faster than a dynamic website generated by a content management system (CMS) [2]. Accumulated over time, the energy savings can be significant.

3.2) Apply Edge Computing

To mitigate the energy consumption associated with data transmission over the Internet, it is beneficial to bring remote services closer to the devices through the use of edge computing technology.

In the context of web services, a content delivery network (CDN) serves as an early form of edge computing, focusing primarily on caching data closer to the end user. Modern edge computing goes beyond caching and brings cloud services and storage closer as well. This approach offers the advantages of reduced energy usage and network latency, particularly valuable for the increasing number of IoT devices projected to be deployed in the future.

Generic IoT Edge Network

3.2.1) Network Traffic Asymmetry

As per Vailshery [3], the number of IoT devices worldwide is projected to more than double from 2022 to 2030, and many of these devices, such as cameras and sensor-based devices, will require more upstream data transmission than downstream.

Currently, most internet traffic is transmitted downstream to media-streaming devices and PCs. However, failure to consider this reversal of traffic direction when selecting network interconnect devices for an IoT network can lead to network congestion issues and a significant increase in power consumption.

3.3) Apply More Granular Scaling

Granular scaling is essentially the splitting of a workload into more manageable components so that its resource utilization can be optimized. Here are a few examples:

  • Converting a single-threaded firmware architecture into a multi-threaded one, allowing for thread allocation across multiple CPU cores with different frequencies (p-states) and power requirements. This is dependent on the configuration of the CPU power governor (a thread-pool sketch follows this list).
  • Splitting services and functionalities during the code refactoring process when transitioning applications to a cloud-native architecture.
  • In cloud auto-scaling, fine-grained scaling can be achieved by customizing the scaling based on specific types, such as APIs, where certain APIs may experience higher traffic than others.
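
As a minimal illustration of the first bullet, the following Java sketch splits one large single-threaded job into independently schedulable tasks sized to the available cores. The workload itself is a stand-in, and how the resulting threads map to p-states depends on the platform’s power governor.

```java
// Minimal sketch: split a single-threaded workload into independent tasks
// so the OS scheduler and power governor can spread them across cores.
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class GranularWorkload {

    public static void main(String[] args) throws Exception {
        int cores = Runtime.getRuntime().availableProcessors();
        ExecutorService pool = Executors.newFixedThreadPool(cores);

        // Split one large job into smaller, independently schedulable chunks.
        List<Callable<Long>> tasks = new ArrayList<>();
        for (int chunk = 0; chunk < cores; chunk++) {
            final int id = chunk;
            tasks.add(() -> {
                long sum = 0;
                for (long i = 0; i < 10_000_000L; i++) {
                    sum += (i * 31 + id);   // stand-in for real work
                }
                return sum;
            });
        }

        // Run all chunks in parallel and combine their results.
        long total = 0;
        for (Future<Long> result : pool.invokeAll(tasks)) {
            total += result.get();
        }
        pool.shutdown();
        System.out.println("Combined result: " + total);
    }
}
```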

3.4) Choose Optimal Deployment Paradigm (cloud-specific)

The choice of deployment paradigm among containers, virtual machines (VMs), and serverless architectures should be based on the characteristics of the cloud workload to optimize energy utilization, as stated by Vos et al. [1]. For instance, VMs are most effective for stable and predictable workloads, while serverless architectures are suitable for bursty workloads.

3.5) Leverage Caching


Caching is a highly effective way to improve energy efficiency, and it can be optimized by strategically using the caching functionality available in an application’s software framework and operating system. For instance, CPU cache usage can be tuned with profiling tools such as ‘perf’ on Linux and Intel’s VTune, which identify lines of code with a high cache-miss rate.
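
The following Java sketch illustrates the kind of issue such profilers surface: two loops compute the same sum over a matrix, but the column-major version touches memory with a large stride and misses the cache far more often. Timings will vary by platform; the example only demonstrates the access-pattern effect.

```java
// Illustrative sketch of a cache-usage issue a profiler would flag:
// the column-major loop has a large memory stride and poor cache locality.
public class CacheTraversal {

    static final int N = 4096;
    static final int[][] matrix = new int[N][N];

    public static void main(String[] args) {
        long start = System.nanoTime();
        long rowMajorSum = sumRowMajor();
        long rowMajorTime = System.nanoTime() - start;

        start = System.nanoTime();
        long columnMajorSum = sumColumnMajor();
        long columnMajorTime = System.nanoTime() - start;

        System.out.printf("row-major: %d ns, column-major: %d ns (sums %d/%d)%n",
                rowMajorTime, columnMajorTime, rowMajorSum, columnMajorSum);
    }

    // Cache-friendly: consecutive elements of each row are adjacent in memory.
    static long sumRowMajor() {
        long sum = 0;
        for (int i = 0; i < N; i++)
            for (int j = 0; j < N; j++)
                sum += matrix[i][j];
        return sum;
    }

    // Cache-hostile: each access jumps to a different row's backing array.
    static long sumColumnMajor() {
        long sum = 0;
        for (int j = 0; j < N; j++)
            for (int i = 0; i < N; i++)
                sum += matrix[i][j];
        return sum;
    }
}
```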

3.5.1) Web Caching

Web caching involves caching data on both the web server and the client side (e.g., web browser, HTTP or SOAP service request initiator).

3.5.1.1) Optimizing HTTP Cache Headers

Configuring the HTTP cache headers, such as Cache-Control and Expires [4], is an effective web caching technique. Cache-Control enables caching in the browser and intermediate proxies, while Expires sets the expiration date of a cached resource.

However, it’s important for software developers to implement these headers correctly. In some cases, developers find that changes made to a page’s GUI, for example, are not reflected in the browser because it keeps loading the previously cached version. Instead of disabling caching with “Cache-Control: no-cache”, a common but inefficient workaround is to append a random query parameter to the resource URL (e.g., <img src="picture.jpg?123">) so that the request never produces a cache hit. This cache-busting approach wastes energy because every request bypasses the cache, rather than disabling caching only where it is actually unnecessary.
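
As a sketch of doing this deliberately in a Spring Boot application (the framework referenced later in this article), the controller below sets an explicit Cache-Control header on a static image instead of resorting to cache busting. The endpoint and resource path are illustrative.

```java
// Minimal Spring MVC sketch (assuming Spring Boot Web) that sets
// Cache-Control explicitly instead of relying on cache-busting tricks.
import java.util.concurrent.TimeUnit;
import org.springframework.core.io.ClassPathResource;
import org.springframework.core.io.Resource;
import org.springframework.http.CacheControl;
import org.springframework.http.MediaType;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class StaticImageController {

    @GetMapping("/logo")
    public ResponseEntity<Resource> logo() {
        // Cache in the browser and shared proxies for one day, then revalidate.
        return ResponseEntity.ok()
                .cacheControl(CacheControl.maxAge(1, TimeUnit.DAYS).cachePublic())
                .contentType(MediaType.IMAGE_JPEG)
                .body(new ClassPathResource("static/picture.jpg"));
    }
}
```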

3.5.1.2) Web Application-Level Caching

Web application-level caching involves programmatically caching data within the application’s program execution through API calls or custom tag libraries imported into templating technologies like JSP. An example of application-level caching is the Java Object Cache [29], which can be accessed through the Spring Caching Abstraction [31][32] in Spring or Spring Boot.
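
A minimal sketch of the Spring Caching Abstraction in a Spring Boot application might look like the following; the cache name, service class, and simulated lookup are illustrative only.

```java
// Minimal sketch of the Spring Caching Abstraction: the result of an
// expensive lookup is cached so repeated calls avoid recomputation.
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cache.annotation.Cacheable;
import org.springframework.cache.annotation.EnableCaching;
import org.springframework.stereotype.Service;

@SpringBootApplication
@EnableCaching
public class CachingDemoApplication {
    public static void main(String[] args) {
        SpringApplication.run(CachingDemoApplication.class, args);
    }
}

@Service
class ProductCatalog {

    // First call per id executes the method; later calls hit the "products" cache.
    @Cacheable("products")
    public String findProduct(String id) {
        simulateSlowLookup();
        return "{\"productId\": \"" + id + "\"}";
    }

    private void simulateSlowLookup() {
        try {
            Thread.sleep(500);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}
```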

It is crucial to carefully consider and plan the implementation of web application-level caching to avoid redundant caching in the application’s data flow, thereby optimizing energy efficiency.

3.5.1.3) Edge Side Includes

In certain scenarios, Edge Side Includes (ESI) [14], an XML-based markup language, can be used as an energy-efficient alternative to Client Side Includes (CSI) that relies on JavaScript and AJAX. ESI offers comparable load times while avoiding unnecessary requests. When CSI includes uncacheable AJAX response data, it requires multiple requests from the browser to the web server to fully load a page with dynamic fragments. In contrast, when a page is cached in the browser using ESI, it loads quickly, sends AJAX calls to the web server, and refreshes only the areas where response data is applied.

ESI enables the server to break down a dynamic web page into fragments, process them separately, and reassemble them before delivering the complete page to the browser. This method typically requires only one request to the web server, resulting in energy savings. ESI can be utilized on an edge server, such as a CDN edge node, and is also compatible with API calls that do not execute JavaScript.

3.5.2) Cloud Caching (cloud-specific)

Cloud caching involves using a managed web service to set up, operate, and scale a distributed web cache in the cloud. It provides the benefits of a high-performance in-memory cache with reduced administrative burden in managing a distributed cache. Additionally, users can configure the service to receive alarms if the cache becomes hot and access performance metrics through a user account page.

The energy efficiency of cloud caching depends on factors such as cache size, cache item sizes, and frequency of use. Notable examples of in-memory cloud caching solutions include AWS ElastiCache [33], Google Cloud Memorystore [34], and Microsoft Azure Cache for Redis [35]. These services offer energy-efficient caching capabilities tailored for cloud environments.
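
As an application-side sketch, assuming a managed Redis-compatible cache endpoint and the Jedis client, the code below checks the cache before doing expensive work and stores the result with a time-to-live. The hostname and key are placeholders.

```java
// Hedged sketch of using a managed Redis-compatible cache from Java
// via the Jedis client. Hostname and key names are placeholders.
import redis.clients.jedis.Jedis;

public class CloudCacheExample {

    public static void main(String[] args) {
        // Endpoint of the managed cache cluster (placeholder hostname).
        try (Jedis cache = new Jedis("my-cache.example.com", 6379)) {
            String key = "report:2023-05";

            String cached = cache.get(key);
            if (cached == null) {
                // Cache miss: do the expensive work once, then store it
                // with a time-to-live so stale entries expire on their own.
                String report = buildExpensiveReport();
                cache.setex(key, 300, report);
                cached = report;
            }
            System.out.println(cached);
        }
    }

    private static String buildExpensiveReport() {
        return "{\"rows\": 1024}";
    }
}
```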

3.6) Optimize Search and Query Strategies


Optimizing search and query strategies can improve performance and save energy. The effectiveness of these tactics heavily relies on the type of database (relational or NoSQL) and the schema, which should be designed based on how the data will be utilized.

3.6.1) SQL Optimization

Optimizing SQL statements can reduce memory usage and disk access, and there are several techniques to achieve this: for instance, reducing table size, using EXISTS instead of COUNT(*) for existence checks, adding table indexes, filtering with WHERE instead of HAVING, and prefixing a query with EXPLAIN during the optimization phase to inspect and tune its execution plan. SQL query optimization tools such as the SolarWinds Database Performance Analyzer [5] can aid in this process.
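
To make the EXISTS-versus-COUNT point concrete, here is an illustrative JDBC sketch with hypothetical table and column names; running EXPLAIN on the query in your database console shows the difference in execution plans.

```java
// Illustrative JDBC sketch: an existence check with EXISTS lets the database
// stop at the first matching row, whereas COUNT(*) scans every matching row.
// Table, column names, and the JDBC URL are hypothetical.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class ExistsVersusCount {

    // Less efficient alternative (counts all matching rows):
    //   SELECT COUNT(*) FROM orders WHERE customer_id = ?
    // More efficient existence check (stops at the first hit):
    private static final String EXISTS_QUERY =
            "SELECT EXISTS (SELECT 1 FROM orders WHERE customer_id = ?)";

    public static boolean hasOrders(Connection connection, long customerId)
            throws SQLException {
        try (PreparedStatement statement = connection.prepareStatement(EXISTS_QUERY)) {
            statement.setLong(1, customerId);
            try (ResultSet resultSet = statement.executeQuery()) {
                return resultSet.next() && resultSet.getBoolean(1);
            }
        }
    }

    public static void main(String[] args) throws SQLException {
        try (Connection connection =
                     DriverManager.getConnection("jdbc:postgresql://localhost/shop")) {
            System.out.println(hasOrders(connection, 42L));
        }
    }
}
```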

3.6.2) NoSQL Optimization

Similar to SQL databases, there are multiple ways to improve the energy efficiency of NoSQL databases. One is to store records in a variable-length, delimited format, which can significantly reduce their size compared to fixed-length records. Additionally, partitioning flat-file data logically and physically, and keeping it sorted, narrows the scope of records that must be scanned during a search. This improves energy efficiency, especially when the data layout is optimized for filtering methods such as regular expressions.
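
The difference between fixed-length and variable-length delimited records can be shown with a small Java sketch; the field widths and sample record are made up, and the point is simply the storage saved by not padding fields.

```java
// Small sketch contrasting a fixed-length record layout with a
// variable-length, delimited layout. Field widths are made up.
public class RecordLayouts {

    public static void main(String[] args) {
        String id = "42";
        String name = "Anna";
        String city = "Berlin";

        // Fixed-length layout: every field is padded to its maximum width.
        String fixed = pad(id, 12) + pad(name, 64) + pad(city, 64);

        // Variable-length layout: fields are stored as-is with a delimiter.
        String delimited = String.join("|", id, name, city);

        System.out.printf("fixed-length: %d bytes, delimited: %d bytes%n",
                fixed.length(), delimited.length());
    }

    private static String pad(String value, int width) {
        return String.format("%-" + width + "s", value);
    }
}
```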

3.7) Compress Infrequently Accessed Data

Data compression saves energy by reducing file storage and memory space, and the impact can be even greater when compressed data is transmitted over a computer network, especially over long distances. However, compressing frequently accessed data may not be beneficial if the energy required for repeated compression and decompression outweighs the energy saved in storage and transfer over time. Data that is already compressed, such as JPEG images, should not be compressed again.

3.7.1) Application Data Compression

Applications should implement real-time data compression when it is estimated to be advantageous. For example, in a Java EE web application built on the Spring Boot framework, Spring Boot can compress JSON data before transmitting it from the web server to the browser (see Silz [6]). The compression is signaled by the HTTP header Content-Encoding: gzip, and the browser decompresses the data accordingly.
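
As a rough sketch of the underlying mechanism, the following Java snippet gzip-compresses a repetitive JSON payload and prints the size difference; in a real Spring Boot service this is typically delegated to the framework’s response-compression support rather than hand-rolled.

```java
// Minimal sketch of gzip-compressing a JSON payload before it leaves the
// application, as a server framework would when Content-Encoding: gzip
// is negotiated with the client.
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPOutputStream;

public class JsonGzipExample {

    public static void main(String[] args) throws IOException {
        // Repetitive payload so the compression benefit is visible.
        String json = "{\"items\": [" + "\"alpha\", ".repeat(200) + "\"omega\"]}";

        byte[] compressed = gzip(json);
        System.out.printf("uncompressed: %d bytes, gzipped: %d bytes%n",
                json.getBytes(StandardCharsets.UTF_8).length, compressed.length);
    }

    static byte[] gzip(String text) throws IOException {
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        try (GZIPOutputStream gzip = new GZIPOutputStream(buffer)) {
            gzip.write(text.getBytes(StandardCharsets.UTF_8));
        }
        return buffer.toByteArray();
    }
}
```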

Another strategy in web development is to minify JavaScript, CSS, XML, and JSON files. Minification is performed during the build process, so it adds no energy or performance overhead at runtime. An example of a CSS and JavaScript minifier is the YUI Compressor [7].

3.7.2) Cloud Database Compression (cloud-specific)

Certain cloud providers offer built-in compression capabilities. For instance, in the AWS Redshift [8] data warehouse service, table column compression is enabled by default unless explicitly configured otherwise. AWS provides documentation on compression encodings [9], which emphasizes evaluating table access frequency and identifying any incompressible data (such as graphics) before choosing a configuration. It also notes that compressing a table column, particularly a sort key column, can affect the row offsets within storage blocks.

3.8) Optimize Code


This topic has its own article in this series.

Conclusion

In this article, we explored resource adaptation tactics for software energy efficiency. Strategies included reducing overhead, adopting cloud-native approaches, service adaptation, optimizing application efficiency, caching, and compressing data. These tactics improve performance and reduce energy consumption. Implementing these strategies contributes to a sustainable software ecosystem.

Articles in this series

The following articles are part of this comprehensive series on energy-efficient tactics in software architecture and implementation. The first article in the series contains a diagram that is described in the remaining four articles.

References

[1]: Sophie Vos, Patricia Lago, Roberto Verdecchia, Ilja Heitlager. Architectural Tactics to Optimize Software for Energy Efficiency in the Public Cloud. 2022, 11 pages.

[2]: Jason Deszkewicz, “Difference Between Static and Dynamic Web Pages”, https://www.academia.edu/23686425/Difference_Between_Static_and_Dynamic_Web_Pages. accessed on 2022–06–25.

[3]: L. Vailshery. Number of IoT connected devices worldwide 2019–2030. https://www.statista.com/statistics/1183457/iot-connected-devices-worldwide/. accessed on 2022–06–25.

[4]: Heroku Dev Center. Increasing Application Performance with HTTP Cache Headers. https://devcenter.heroku.com/articles/increasing-application-performance-with-http-cache-headers. Last updated on 2022–09–03. accessed on 2022–05–07.

[5]: SolarWinds. Database Performance Analyzer. https://www.solarwinds.com/database-performance-analyzer. accessed on 2022–06–25.

[6]: Karsten Silz. Reducing JSON Data Size. June 24, 2022. https://www.baeldung.com/json-reduce-data-size. accessed on 2022–06–25.

[7]: YUI. YUI Compressor. https://yui.github.io/yuicompressor/

[8]: Amazon Web Services. Compression encodings. Amazon Redshift — Database Developer Guide. https://docs.aws.amazon.com/redshift/latest/dg/c_Compression_encodings.html. accessed on 2022–05–07.

[9]: Amazon Web Services. Amazon Redshift Engineering’s Advanced Table Design Playbook: Compression Encodings. https://aws.amazon.com/blogs/big-data/amazon-redshift-engineerings-advanced-table-design-playbook-compression-encodings/. accessed on 2022–06–26.
