Osaka University Cuts Power Consumption by 13% with Kubernetes and AI
Relying on deep learning, the institution builds a neural-network-based workload allocation optimizer to enable efficient edge computing.
Increasing power consumption is a problem
Over the past few years, edge computing — which places computation and data storage closer to the devices where data is gathered — has become more prevalent due to the widespread adoption of the Internet of Things (IoT). Edge computing over 5G networks can reduce communication time between devices, but it makes management tasks increasingly complex. Additionally, the steady growth in the number of IoT devices and the rising demand for 5G networking have driven up the computing resources needed to operate such systems.