Cloud platforms have become essential for rapidly deploying application systems online to serve large numbers of users. Resource estimation and workload forecasting are therefore critical in cloud data centers. The complexity of cloud provider environments, where the number of virtual machines varies continually, introduces high variability in workloads and resource usage, making resource prediction difficult for state-of-the-art models that cannot capture these nonlinear characteristics. Estimating and predicting the resource metrics of cloud systems across packet networks subject to unknown external dynamics is further affected by high measurement noise and variance. A natural solution to these problems is the Kalman filter, a variance-minimizing estimator used for system state estimation and efficient, low-latency state prediction. Kalman filters are optimal estimators for highly variable data with Gaussian state-space characteristics, such as internet workloads. This work makes the following contributions: i) it introduces and evaluates a Kalman filter-based model-parameter prediction scheme that uses principal component analysis and an attention mechanism for noisy cloud data; ii) it evaluates the scheme on a Google Cloud benchmark against a state-of-the-art Bi-directional Grid Long Short-Term Memory network model on prediction tasks; iii) it applies these techniques to demonstrate accuracy and stability improvements in a real-time messaging auto-scaler for Apache Kafka. The new scheme improves prediction accuracy by $37\%$ over state-of-the-art Kalman filters on noisy-signal prediction tasks, reduces the prediction error of the neural network model by over $40\%$, and improves Apache Kafka workload-based scaling stability by $58\%$.
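To illustrate the variance-minimizing behavior that motivates the approach, the sketch below runs a minimal scalar Kalman filter over a synthetic noisy workload trace. This is not the paper's PCA/attention-augmented scheme; the synthetic signal and the noise variances (`q`, `r`) are assumptions chosen purely for demonstration.

```python
# Minimal scalar Kalman filter tracking a noisy random-walk "workload" signal.
# Illustrative only: the signal model and noise parameters are assumed, not taken
# from the paper's benchmark or its PCA/attention-based extension.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical true resource demand: a random walk around a base load of 100.
true_load = 100 + np.cumsum(rng.normal(0, 1, 200))
# Observations corrupted by heavy Gaussian measurement noise.
measurements = true_load + rng.normal(0, 10, 200)

# Filter state: current estimate x and its error variance p.
x, p = measurements[0], 1.0
q, r = 1.0, 100.0  # assumed process and measurement noise variances

estimates = []
for z in measurements:
    # Predict step: propagate the estimate and grow its uncertainty.
    p = p + q
    # Update step: blend prediction and measurement via the Kalman gain.
    k = p / (p + r)        # gain -> 1 for trusted measurements, -> 0 for noisy ones
    x = x + k * (z - x)    # variance-minimizing correction toward the observation
    p = (1 - k) * p
    estimates.append(x)

print("last noisy measurement:", round(measurements[-1], 1))
print("last filtered estimate:", round(estimates[-1], 1))
print("true value:", round(true_load[-1], 1))
```

Because the measurement noise variance `r` is much larger than the process noise `q`, the gain stays small and the filter smooths out observation noise while still tracking the drifting load level, which is the property the abstract attributes to Kalman filtering of internet workloads.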