Facebook Prophet Model with Bayesian Optimization for USD Index Prediction

Ahmad Fitra Hamdani, Daniel Swanjaya, Risa Helilintar

Abstract


Accuracy is the primary focus of prediction research, and optimization is performed to improve the performance of prediction models and thereby their accuracy. This study optimizes the Facebook Prophet model by tuning its hyperparameters with Bayesian Optimization to improve the accuracy of USD Index value prediction. The model is evaluated through repeated prediction experiments over different ranges of historical data. The results show that hyperparameter tuning yields better predictions from the Facebook Prophet model. For the 2014-2023 historical data range, the MAPE decreases from 1.38% before tuning to 1.33% after tuning. Evaluations on other historical data ranges show similar gains: for 2015-2023 the MAPE decreases from 1.39% to 1.20%; for 2016-2023, from 1.12% to 0.80%; for 2017-2023, from 0.80% to 0.76%; for 2018-2023, from 0.75% to 0.70%; and for 2019-2023, from 0.63% to 0.55%. These results demonstrate that hyperparameter optimization using Bayesian Optimization consistently improves the prediction accuracy of the Facebook Prophet model.
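The tuning procedure described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: since the USD Index data and fitted Prophet models are not available here, a synthetic MAPE surface stands in for the real "fit Prophet, forecast, score" objective, and the Bayesian Optimization loop (Gaussian-process surrogate plus expected-improvement acquisition) is written directly in numpy rather than with an optimization library. The hyperparameter name `changepoint_prior_scale` is one of Prophet's documented tunable parameters, but all objective values below are illustrative assumptions.

```python
import math
import numpy as np

# MAPE, the evaluation metric reported in the abstract (in percent).
def mape(actual, predicted):
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return float(np.mean(np.abs((actual - predicted) / actual)) * 100.0)

# Hypothetical stand-in for "fit Prophet with this changepoint_prior_scale,
# forecast, and return the MAPE": a smooth synthetic surface whose minimum
# lies near cps = 0.05. Values are illustrative only.
def objective(cps):
    return 1.38 - 0.5 * math.exp(-(math.log10(cps) + 1.3) ** 2)

def rbf(a, b, length_scale=0.5):
    # Squared-exponential kernel between two 1-D point sets.
    a = np.asarray(a, dtype=float)[:, None]
    b = np.asarray(b, dtype=float)[None, :]
    return np.exp(-0.5 * ((a - b) / length_scale) ** 2)

def gp_posterior(X, y, Xs, noise=1e-6):
    # Gaussian-process surrogate: posterior mean/std at candidate points Xs.
    K_inv = np.linalg.inv(rbf(X, X) + noise * np.eye(len(X)))
    Ks = rbf(X, Xs)
    mu = Ks.T @ K_inv @ np.asarray(y, dtype=float)
    var = np.diag(rbf(Xs, Xs) - Ks.T @ K_inv @ Ks)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best):
    # EI for minimization: expected amount by which a candidate beats `best`.
    z = (best - mu) / sigma
    Phi = 0.5 * (1.0 + np.vectorize(math.erf)(z / math.sqrt(2.0)))
    phi = np.exp(-0.5 * z ** 2) / math.sqrt(2.0 * math.pi)
    return (best - mu) * Phi + sigma * phi

# Search changepoint_prior_scale in [0.001, 0.5], working in log10 space.
grid = np.linspace(-3.0, math.log10(0.5), 200)
X = list(np.linspace(-3.0, math.log10(0.5), 4))   # initial design points
y = [objective(10.0 ** x) for x in X]
for _ in range(10):                               # BO iterations
    mu, sigma = gp_posterior(X, y, grid)
    x_next = grid[int(np.argmax(expected_improvement(mu, sigma, min(y))))]
    X.append(x_next)
    y.append(objective(10.0 ** x_next))

best_cps = 10.0 ** X[int(np.argmin(y))]           # tuned hyperparameter
best_mape = min(y)                                # best (synthetic) MAPE seen
```

In the study's actual setting, `objective` would fit Prophet on each historical data range (2014-2023 through 2019-2023), forecast, and return the MAPE on held-out data; the surrounding loop would be unchanged.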

Keywords


Facebook Prophet, Bayesian Optimization, USD Index, Prediction, Data Mining


DOI: 10.30595/juita.v11i2.17880


This work is licensed under a Creative Commons Attribution 4.0 International License.

ISSN: 2579-8901