TY - GEN
T1 - Deep neural networks for predicting task time series in cloud computing systems
AU - Bi, Jing
AU - Li, Shuang
AU - Yuan, Haitao
AU - Zhao, Ziyan
AU - Liu, Haoyue
N1 - Publisher Copyright:
© 2019 IEEE.
PY - 2019/5
Y1 - 2019/5
N2 - A large number of cloud services provided by cloud data centers have become an essential part of Internet services. Despite these numerous benefits, cloud providers face challenging issues in accurate large-scale task time series prediction. Such prediction benefits providers because appropriate resource provisioning can then be performed to fully satisfy their service-level agreements with users without wasting computing and networking resources. In this work, we first perform a logarithmic operation before task sequence smoothing to reduce the standard deviation. Then, a Savitzky-Golay (S-G) filter is applied to eliminate extreme points and noise interference in the original sequence. Next, this work proposes an integrated prediction method that combines the S-G filter with Long Short-Term Memory network models to predict task time series at the next time slot. We further adopt a gradient clipping method to eliminate the gradient exploding problem. Furthermore, during model training, we choose the Adam optimizer to achieve the best results. Experimental results demonstrate that the proposed method achieves better prediction results than some commonly used prediction methods.
AB - A large number of cloud services provided by cloud data centers have become an essential part of Internet services. Despite these numerous benefits, cloud providers face challenging issues in accurate large-scale task time series prediction. Such prediction benefits providers because appropriate resource provisioning can then be performed to fully satisfy their service-level agreements with users without wasting computing and networking resources. In this work, we first perform a logarithmic operation before task sequence smoothing to reduce the standard deviation. Then, a Savitzky-Golay (S-G) filter is applied to eliminate extreme points and noise interference in the original sequence. Next, this work proposes an integrated prediction method that combines the S-G filter with Long Short-Term Memory network models to predict task time series at the next time slot. We further adopt a gradient clipping method to eliminate the gradient exploding problem. Furthermore, during model training, we choose the Adam optimizer to achieve the best results. Experimental results demonstrate that the proposed method achieves better prediction results than some commonly used prediction methods.
KW - Cloud data centers
KW - LSTM
KW - Recurrent neural networks
KW - Savitzky-Golay filter
KW - Task time series prediction
UR - https://www.scopus.com/pages/publications/85068768916
U2 - 10.1109/ICNSC.2019.8743188
DO - 10.1109/ICNSC.2019.8743188
M3 - Conference contribution
AN - SCOPUS:85068768916
T3 - Proceedings of the 2019 IEEE 16th International Conference on Networking, Sensing and Control, ICNSC 2019
SP - 86
EP - 91
BT - Proceedings of the 2019 IEEE 16th International Conference on Networking, Sensing and Control, ICNSC 2019
A2 - Zhu, Haibin
A2 - Wang, Jiacun
A2 - Zhou, MengChu
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 16th IEEE International Conference on Networking, Sensing and Control, ICNSC 2019
Y2 - 9 May 2019 through 11 May 2019
ER -