With the advent of beyond-5G applications, the execution of computationally intensive tasks is moving ever closer to the network edge. Combined with the capabilities of Multi-Access Edge Computing (MEC), smart decision-making that accounts for sustainability has become achievable. In this paper, a resource management technique utilizing Reinforcement Learning (RL) at the MEC is presented in order to promote power-efficient solutions. CPU resources at the MEC are managed and distributed among several network services for their individual disposal. A direct relation between CPU resources and power consumption at the MEC is proposed, further establishing the need for efficient resource handling. A Soft Actor-Critic (SAC) approach is leveraged to learn intelligent resource allocation patterns that minimize power expenditure. Furthermore, two baseline algorithms, the Knapsack method and a proportional resource allocation scheme, are implemented to demonstrate the superiority of the proposed RL-based algorithm. The results confirm that the SAC-based RL implementation yields lower power consumption at the MEC server than the two baseline algorithms. These promising results pave the way for deploying RL-based algorithms for efficient performance, thus promoting green technologies at the MEC.
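As a rough illustration of the setting described above (not taken from the paper itself), the proportional allocation baseline and a linear CPU-to-power relation could be sketched as follows; the function names, power coefficients, and demand values are all illustrative assumptions.

```python
# Hypothetical sketch: proportional CPU allocation among network services
# and a simple linear CPU-to-power model. Coefficients and names are
# illustrative assumptions, not values from the paper.

def proportional_allocation(demands, total_cpu):
    """Split total_cpu among services in proportion to their demands."""
    total_demand = sum(demands)
    if total_demand == 0:
        return [0.0] * len(demands)
    return [total_cpu * d / total_demand for d in demands]

def power_consumption(allocated_cpu, p_idle=50.0, p_per_unit=20.0):
    """Assumed linear relation: idle power plus a per-CPU-unit cost (watts)."""
    return p_idle + p_per_unit * sum(allocated_cpu)

# Three services requesting CPU units from an 8-unit MEC server
demands = [2.0, 1.0, 1.0]
alloc = proportional_allocation(demands, total_cpu=8.0)
print(alloc)                     # [4.0, 2.0, 2.0]
print(power_consumption(alloc))  # 50 + 20 * 8 = 210.0
```

Under such a model, an RL agent (e.g. SAC) would learn to allocate fewer CPU units when service demand allows, directly reducing the power term, whereas the proportional scheme always distributes the full capacity.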
Reinforcement learning driven sustainable resource and power management for the MEC
GLOBECOM 2024, IEEE Global Communications Conference, 8-12 December 2024, Cape Town, South Africa
Type:
Conference
City:
Cape Town
Date:
2024-12-08
Department:
Communication systems
Eurecom Ref:
7997
Copyright:
© 2024 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.
See also:
PERMALINK : https://www.eurecom.fr/publication/7997