Backpropagation Neural Network with Combination of Activation Functions for Inbound Traffic Prediction

Purnawansyah Purnawansyah, Haviluddin Haviluddin, Herdianti Darwis, Huzain Azis, Yulita Salim

Abstract


Predicting network traffic is crucial for preventing congestion and maintaining a high quality of network services. This research applies a backpropagation neural network to predict inbound traffic levels in order to understand and anticipate internet usage. The architecture consists of one input layer, two hidden layers, and one output layer. The study compares three activation functions, sigmoid, rectified linear unit (ReLU), and hyperbolic tangent (tanh), and three learning rates, 0.1, 0.5, and 0.9, representing low, moderate, and high rates, respectively. The results show that, among single activation functions, sigmoid yields the lowest RMSE and MSE values, yet ReLU is superior at learning the high-traffic pattern with a learning rate of 0.9. When activation functions are combined, ReLU is also more effective when placed in the first hidden layer. Hence, a high learning rate combined with pure ReLU, ReLU-sigmoid, or ReLU-tanh is more suitable and is recommended for predicting peak traffic utilization.
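The abstract only describes the experimental setup; the authors' implementation is not included on this page. The following minimal NumPy sketch is an illustration, under stated assumptions, of such a setup: a one-input-layer, two-hidden-layer, one-output-layer backpropagation network in which each hidden layer's activation (sigmoid, ReLU, or tanh) and the learning rate can be chosen independently, for example the ReLU-sigmoid combination at the high 0.9 rate. The class name TwoHiddenLayerBP, the hidden-layer sizes, the four-lag input window, and the random toy series are hypothetical choices, not taken from the paper.

```python
import numpy as np

# Activation functions paired with their derivatives, written in terms of the
# layer output a (valid for sigmoid, ReLU, and tanh).
ACTIVATIONS = {
    "sigmoid": (lambda z: 1.0 / (1.0 + np.exp(-z)), lambda a: a * (1.0 - a)),
    "relu":    (lambda z: np.maximum(0.0, z),       lambda a: (a > 0).astype(float)),
    "tanh":    (lambda z: np.tanh(z),               lambda a: 1.0 - a ** 2),
}


class TwoHiddenLayerBP:
    """Backpropagation network: one input, two hidden, and one linear output layer."""

    def __init__(self, n_in, n_h1, n_h2, act1="relu", act2="sigmoid", lr=0.9, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.1, (n_in, n_h1)); self.b1 = np.zeros(n_h1)
        self.W2 = rng.normal(0.0, 0.1, (n_h1, n_h2)); self.b2 = np.zeros(n_h2)
        self.W3 = rng.normal(0.0, 0.1, (n_h2, 1));    self.b3 = np.zeros(1)
        self.f1, self.df1 = ACTIVATIONS[act1]   # activation of the first hidden layer
        self.f2, self.df2 = ACTIVATIONS[act2]   # activation of the second hidden layer
        self.lr = lr                            # learning rate (0.1, 0.5, or 0.9 in the study)

    def forward(self, X):
        self.a1 = self.f1(X @ self.W1 + self.b1)        # first hidden layer
        self.a2 = self.f2(self.a1 @ self.W2 + self.b2)  # second hidden layer
        return self.a2 @ self.W3 + self.b3              # linear output for regression

    def train_step(self, X, y):
        y_hat = self.forward(X)
        err = y_hat - y.reshape(-1, 1)
        d3 = err / len(X)                               # gradient of (1/2)*MSE at the output
        d2 = (d3 @ self.W3.T) * self.df2(self.a2)       # backpropagate through hidden layer 2
        d1 = (d2 @ self.W2.T) * self.df1(self.a1)       # backpropagate through hidden layer 1
        self.W3 -= self.lr * (self.a2.T @ d3); self.b3 -= self.lr * d3.sum(axis=0)
        self.W2 -= self.lr * (self.a1.T @ d2); self.b2 -= self.lr * d2.sum(axis=0)
        self.W1 -= self.lr * (X.T @ d1);       self.b1 -= self.lr * d1.sum(axis=0)
        return float(np.mean(err ** 2))                 # MSE for monitoring


if __name__ == "__main__":
    # Toy stand-in for normalised inbound traffic; the paper's data set is not reproduced here.
    rng = np.random.default_rng(1)
    series = rng.random(200)
    X = np.column_stack([series[i:i - 4] for i in range(4)])  # 4 lagged observations as inputs
    y = series[4:]                                            # next value as the target
    net = TwoHiddenLayerBP(n_in=4, n_h1=8, n_h2=8, act1="relu", act2="sigmoid", lr=0.9)
    for _ in range(500):
        mse = net.train_step(X, y)
    print("MSE:", mse, "RMSE:", mse ** 0.5)
```

Swapping act1/act2 among "relu", "sigmoid", and "tanh" and varying lr over 0.1, 0.5, and 0.9 would reproduce the kind of activation-and-learning-rate grid the abstract compares, with MSE and RMSE as the evaluation metrics.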


DOI: http://dx.doi.org/10.17977/um018v4i12021p14-28



Copyright (c) 2021 Knowledge Engineering and Data Science

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.