Abstract
To maximize the performance of IoT devices in edge computing, an adaptive polling technique is required that efficiently and accurately searches for the workload-optimized polling interval. In this paper, we propose NetAP-ML, which uses a machine learning technique to shrink the search space for finding an optimal polling interval. NetAP-ML minimizes performance degradation during the search process and finds a more accurate polling interval with the random forest regression algorithm. We implement and evaluate NetAP-ML in a Linux system. Our experimental setup consists of varying numbers of virtual machines (2 to 4) and threads (1 to 5). We demonstrate that NetAP-ML provides up to 23% higher bandwidth than the state-of-the-art technique.
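The core idea described above, using random forest regression to narrow the search space of candidate polling intervals before probing them online, can be sketched as follows. This is an illustrative assumption of the approach, not the paper's implementation: the feature set (number of VMs, number of threads, polling interval), the candidate intervals, the synthetic bandwidth model, and the helper `shrink_search_space` are all hypothetical.

```python
# Hypothetical sketch: train a random forest on (workload features,
# polling interval) -> measured bandwidth samples, then use the model to
# shrink the search space to the most promising intervals. The profiling
# data here is synthetic; a real system would use measured bandwidth.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Candidate polling intervals (microseconds) -- assumed values.
INTERVALS = np.array([10, 20, 40, 80, 160, 320])

# Build a synthetic profiling set over the workloads in the evaluation:
# 2-4 virtual machines, 1-5 threads per VM.
X, y = [], []
for vms in range(2, 5):
    for thr in range(1, 6):
        for iv in INTERVALS:
            # Toy bandwidth model: peaks near a workload-dependent interval.
            opt = 20 * vms + 10 * thr
            bw = 1000.0 - abs(iv - opt) + rng.normal(0, 5)
            X.append([vms, thr, iv])
            y.append(bw)

model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)

def shrink_search_space(vms, thr, top_k=2):
    """Return the top_k intervals the model predicts to give the best
    bandwidth for this workload; only these are probed online."""
    preds = model.predict([[vms, thr, iv] for iv in INTERVALS])
    best = INTERVALS[np.argsort(preds)[::-1][:top_k]]
    return sorted(int(iv) for iv in best)

# Narrowed candidate set for a 3-VM, 4-thread workload.
print(shrink_search_space(3, 4))
```

The point of the regression step is that only `top_k` intervals (rather than all candidates) need to be measured on the live workload, which is how the search-time performance degradation is kept small.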
Original language | English |
---|---|
Article number | 1484 |
Journal | Sensors |
Volume | 23 |
Issue number | 3 |
DOIs | |
State | Published - Feb 2023 |
Keywords
- adaptive polling
- edge computing
- I/O virtualization
- machine learning