Improved Decision Module Selection for Hierarchical Inference in Resource-Constrained Edge Devices
Date
2023-10-02

Abstract
The Hierarchical Inference (HI) paradigm has recently emerged as an effective method for balancing inference accuracy, data processing, transmission throughput, and offloading cost. This approach proves particularly efficient in scenarios involving resource-constrained edge devices such as microcontroller units (MCUs) tasked with executing tinyML inference. Notably, it outperforms strategies such as local inference execution, inference offloading, and split inference (i.e., inference execution distributed between two endpoints). Building upon the HI paradigm, this work explores different techniques aimed at further optimizing inference task execution. We propose three distinct HI approaches and evaluate their utility for image classification.
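To make the HI decision module concrete, the sketch below illustrates the basic idea under common assumptions: the device first runs a small local (tinyML) model and offloads a sample only when the local prediction is not confident enough. The function names, the confidence threshold, and the placeholder models are hypothetical and do not correspond to the specific approaches proposed in this work.

```python
import numpy as np

# Minimal sketch of a confidence-based HI decision rule (hypothetical names,
# placeholder models). It is not the decision module evaluated in this work.

def local_inference(sample: np.ndarray) -> tuple[int, float]:
    """Run the small on-device model; return (predicted label, confidence).

    Placeholder: a real MCU deployment would invoke e.g. a quantized CNN here.
    """
    scores = np.random.dirichlet(np.ones(10))  # stand-in for a softmax output
    return int(scores.argmax()), float(scores.max())

def offload_inference(sample: np.ndarray) -> int:
    """Offload the sample to a larger remote model (assumed network call)."""
    return 0  # placeholder result from the remote model

def hierarchical_inference(sample: np.ndarray, threshold: float = 0.8) -> int:
    """HI decision module: keep the local prediction when it is confident
    enough; otherwise incur the offloading cost and query the remote model."""
    label, confidence = local_inference(sample)
    if confidence >= threshold:
        return label                      # local result accepted
    return offload_inference(sample)      # low confidence: offload
```

The threshold governs the trade-off at the heart of HI: raising it improves accuracy by offloading more samples, while lowering it reduces transmission and offloading cost.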