Jointly Learning Optimal Task Offloading and Scheduling Policies for Mobile Edge Computing
Date
2022-09-19

Abstract
This work contributes towards optimizing edge analytics in Mobile Edge Computing (MEC) systems. We consider requests for computing tasks that are generated by users and can be satisfied either locally at their devices or by offloading them to an edge server in their proximity for remote execution. We study a multi-user MEC system with limited energy autonomy for the mobile devices and limited computing capability at both the mobile devices and the edge server, where users can offload part of their computation load. We define a utility over “resource residuals”, which capture the difference between the resources assigned through our decisions and those needed in practice, and we aim to minimize regret, i.e., the difference between the utility obtained by an optimal offline benchmark that knows the system evolution in hindsight and that obtained by our online decision policy. We design an algorithm that jointly learns policies for offloading computations and scheduling them for execution at the shared MEC server. We prove that our algorithm is asymptotically optimal, i.e., it has no regret with respect to the optimal static offline benchmark, and that its performance is independent of the number of devices in the system. Our numerical evaluation shows that the algorithm adapts to unpredictable demand changes, learns to identify resource-limited devices, and learns to share the server’s resources.
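As an illustrative sketch of the regret notion used above (the symbols $u_t$, $x_t$, $x^\ast$, and $T$ are generic placeholders introduced here, not necessarily the paper's own notation), the static regret over a horizon of $T$ decision slots can be written as

% Generic static-regret definition; symbols are assumptions, not the paper's exact notation.
\[
  \mathrm{Regret}_T \;=\; \max_{x^\ast \in \mathcal{X}} \sum_{t=1}^{T} u_t(x^\ast) \;-\; \sum_{t=1}^{T} u_t(x_t),
\]

where $u_t(\cdot)$ is the utility over resource residuals in slot $t$, $x_t$ is the online offloading and scheduling decision, and $x^\ast$ is the best static decision chosen with full hindsight. “No regret” then means that $\mathrm{Regret}_T / T \to 0$ as $T \to \infty$.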