Showcasing In-Switch Machine Learning Inference
Date
2023-06-19

Abstract
Recent endeavours have enabled the integration of trained machine learning models, such as Random Forests, into resource-constrained programmable switches for line-rate inference. In this work, we first show how packet-level information can be used to classify individual packets on production-grade hardware with very low latency. We then demonstrate how the newly proposed Flowrest framework improves classification performance over the packet-level approach by exploiting flow-level statistics to classify entire traffic flows within the switch, without considerably increasing latency. We conduct experiments with measurement data in a real-world testbed featuring an Intel Tofino switch and shed light on how Flowrest achieves an F1-score of 99% in a service classification use case, outperforming its packet-level counterpart by 8%.
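To illustrate the distinction the abstract draws, the sketch below shows the kind of per-flow statistics (packet count, mean packet size, mean inter-arrival time) that a flow-level approach like Flowrest can aggregate before feeding a classifier, as opposed to classifying each packet in isolation. This is a minimal, hypothetical Python sketch: the flow key, feature set, and class names are illustrative assumptions, not the actual Flowrest data plane implementation.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class FlowStats:
    """Illustrative per-flow state of the kind a flow-level classifier
    could maintain (not the actual Flowrest switch pipeline)."""
    sizes: list = field(default_factory=list)   # packet sizes in bytes
    times: list = field(default_factory=list)   # packet timestamps in seconds

    def update(self, pkt_size: int, ts: float) -> None:
        # Update flow state on each arriving packet of the flow.
        self.sizes.append(pkt_size)
        self.times.append(ts)

    def features(self) -> dict:
        # Derive flow-level features; a packet-level approach would
        # instead see only one packet's fields at a time.
        iats = [b - a for a, b in zip(self.times, self.times[1:])]
        return {
            "pkt_count": len(self.sizes),
            "mean_size": mean(self.sizes),
            "mean_iat": mean(iats) if iats else 0.0,
        }

# Toy packet trace: (flow key, size in bytes, timestamp in seconds).
flows: dict[str, FlowStats] = {}
packets = [
    ("flowA", 1500, 0.00),
    ("flowA",   40, 0.01),
    ("flowB",   60, 0.02),
    ("flowA", 1500, 0.03),
]
for key, size, ts in packets:
    flows.setdefault(key, FlowStats()).update(size, ts)

feats = flows["flowA"].features()
print(feats)
```

In a Flowrest-style deployment these statistics would be kept in switch registers and evaluated by match-action tables encoding the Random Forest, so the whole flow is classified at line rate without leaving the data plane.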