Open access
Date
2022-11
Type
- Conference Paper
Abstract
Generalizing machine learning (ML) models for network traffic dynamics tends to be considered a lost cause. Hence, for every new task, we design new models and train them on task-specific datasets closely mimicking the deployment environments. Yet, an ML architecture called Transformer has enabled previously unimaginable generalization in other domains. Nowadays, one can download a model pre-trained on massive datasets and only fine-tune it for a specific task and context with comparatively little time and data. These fine-tuned models are now state-of-the-art for many benchmarks.
We believe this progress could translate to networking and propose a Network Traffic Transformer (NTT), a transformer adapted to learn network dynamics from packet traces. Our initial results are promising: NTT seems able to generalize to new prediction tasks and environments. This study suggests there is still hope for generalization through future research.
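The core mechanism behind the Transformer architecture mentioned in the abstract is self-attention: every element of a sequence computes a weighted average over all other elements. The sketch below is a generic, illustrative toy (plain Python, no ML framework) of scaled dot-product self-attention over a made-up "packet trace" of feature vectors; it is not the paper's actual NTT model, and the feature names are hypothetical.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(queries, keys, values):
    """Scaled dot-product attention: each query scores every key,
    then returns a softmax-weighted average of the values."""
    d = len(keys[0])  # key dimensionality, used for scaling
    out = []
    for q in queries:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in keys]
        weights = softmax(scores)
        out.append([sum(w * v[j] for w, v in zip(weights, values))
                    for j in range(len(values[0]))])
    return out

# Toy "packet trace": each row is a hypothetical feature vector,
# e.g. [normalized packet size, normalized inter-arrival time].
trace = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
context = attention(trace, trace, trace)  # self-attention: Q = K = V
```

Because each output row is a convex combination of the value rows, every output coordinate stays within the range of the inputs; in a real Transformer this operation is wrapped with learned projections, multiple heads, and stacked layers.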
Permanent link
https://doi.org/10.3929/ethz-b-000577569
Publication status
published
External links
Book title
HotNets '22: Proceedings of the 21st ACM Workshop on Hot Topics in Networks
Pages / Article No.
Publisher
Association for Computing Machinery
Event
Subject
Transformer; Packet-level modeling
Organisational unit
09477 - Vanbever, Laurent / Vanbever, Laurent
Funding
ETH-03 19-2 - Dependable and Data-Driven Intelligent Networks (ETHZ)
Related publications and datasets
Is supplemented by: https://doi.org/10.3929/ethz-b-000586827
Notes
Conference lecture held on November 14, 2022