Summary: | In Federated Learning (FL), a global statistical model is built by having mobile users train the model on their local data and by iteratively aggregating the resulting local model parameters. However, because mobile devices have limited energy and computation capability, model-training performance is often sacrificed in order to meet the objective of local energy minimization. Multi-access Edge Computing (MEC)-enabled FL addresses this tradeoff between model performance and device energy consumption by allowing users to offload a portion of their local dataset to an edge server for model training. Since the edge server has high computation capability, the training time at the edge server is negligible; however, the time required to offload the datasets from the mobile users to the edge server has a significant impact on the total time needed to complete a single round of the FL process. Resource management in MEC-enabled FL is therefore challenging: the objective is to reduce the total time consumption while also conserving the energy of the mobile devices. In this article, we formulate an energy-aware resource-management problem for MEC-enabled FL in which the model training loss and the total time consumption are jointly minimized, subject to the energy limitations of the mobile devices. We then recast the formulated problem as a Generalized Nash Equilibrium Problem (GNEP) to capture the coupling constraints between radio resource management and dataset offloading. To that end, we analyze the impact of dataset offloading and computing-resource allocation on the model training loss, the completion time, and the energy consumption. Finally, we present a convergence analysis of the proposed solution and evaluate its performance against the traditional FL approach. Simulation results demonstrate the efficacy of the proposed approach.
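
As a rough illustration of the joint objective described in the summary, one could write a weighted-sum formulation along the following lines; this is only a sketch, and the symbols (the offloaded dataset fraction $s_k$, CPU frequency $f_k$, bandwidth share $b_k$, tradeoff weight $\lambda$, total bandwidth $B$, and per-device energy budget $E_k^{\max}$) are illustrative placeholders rather than the notation used in the article.

```latex
% Illustrative weighted-sum formulation (placeholder notation, not the article's own):
% minimize the training loss plus a weighted per-round completion time,
% subject to each device's energy budget and a shared-bandwidth constraint.
\begin{align}
  \min_{\{s_k,\; f_k,\; b_k\}} \quad
    & L(\mathbf{w}) \;+\; \lambda\, T_{\mathrm{round}} \\
  \text{s.t.} \quad
    & E_k^{\mathrm{cmp}}(s_k, f_k) + E_k^{\mathrm{off}}(s_k, b_k) \le E_k^{\max},
      && \forall k, \\
    & \textstyle\sum_{k} b_k \le B, \qquad 0 \le s_k \le 1,
      && \forall k.
\end{align}
```

In such a sketch, $T_{\mathrm{round}}$ would be dominated by the slowest device's local computation and dataset-offloading time, and a shared constraint such as $\sum_k b_k \le B$ is exactly the kind of coupling across users that motivates recasting the problem as a GNEP.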