
A comparative study of forecasting methods using real-life econometric series data

Abstract

Paper aims

This paper presents a comparative evaluation of different forecasting methods using two artificial neural network models (a Multilayer Perceptron network and a Radial Basis Function Neural Network) and Gaussian process regression.

Originality

Due to the current world scenario, solving economic problems has become extremely important. Artificial neural networks are one of the most promising tools to forecast economic trends and are widely studied in economic analyses. Therefore, given the concerns about the performance of different forecasting methods for solving economic problems, this study contributes an example of the forecasting performance of artificial neural network models compared with Gaussian process regression using the Nelson-Plosser and U.S. macroeconomic real-life data sets.

Research method

Two real-life data sets were used to evaluate the forecasting methods proposed in this paper. These data sets were normalised to values between zero and one. After that, the models were trained and, once built, used to generate forecasts. The forecasts were then compared with the observed values to verify how accurately each fitted model predicted them.

Main findings

The results obtained from the study show that, for all forecasting horizons, multi-layer perceptron networks and Gaussian process regression models had the most satisfactory results. On the other hand, the radial basis functions neural network model was unsuitable for econometric data.

Implications for theory and practice

This study contributes to the discussion about artificial neural networks and Gaussian process regression models for econometric forecasting. Although artificial neural networks are widely used in economic analyses, the results showed that not all models, such as radial basis function neural networks, produce good results. In addition, Gaussian process regression showed promising results for forecasting econometric data.

Keywords
Artificial neural networks; Gaussian process regression; Forecasting; Macroeconomic series

1. Introduction

Over the last few years, the world has been going through several cascading crises, and economic problems have become increasingly concerning in many countries. COVID-19 affected many aspects of society, from the economic to the digital spheres (Robinson et al., 2021). Thus, studies that can contribute to helping governments forecast their indicators and recover their economy are in growing demand. Several studies are developing new theories, methods, and tools, such as neural networks, to improve the modelling of economic problems (Khraisha, 2020).

Time series are observations on economic variables, which can be drawn from various fields of economics and business. Such variables include the dividend price ratio, dividend yield, industrial production growth, treasury bill rate, inflation, long-term return, earnings-price ratio, default yield spread, default return spread, volatility of industrial production growth, volatility of the producer price index (Chen et al., 2016), stock market indices, unemployment rates, and market shares. Out-of-sample forecasts for such variables are often needed to set policy targets. For example, the forecast of a company's market share in the next few months may lead to changes in the budget allocation for advertising.

The quantitative analysis of these phenomena relies on econometric time series models. Econometrics is the application of statistical methods to economic data to give empirical content to economic relationships (Brooks et al., 2019).

Forecasting methods linked to economic problems are used to predict economic variables that are central to political debate in many countries. Some examples include: the projection of the future unemployment rate, which defines the condition of a country's economic equilibrium and is fundamental to social development (Dritsakis & Klazoglou, 2018); industry volatility forecasting, crucial to many essential issues in finance (Chen et al., 2016); the prediction of equity risk premiums, perhaps the most widely studied problem in finance (Gu et al., 2020); and the forecast of variations in the price of products and services in agribusiness, which is an essential part of the gross domestic product of Brazil (Puchalsky et al., 2018). The examples cited above highlight an important question about which model is the most accurate to support political decisions.

A method that has been widely studied in economic analysis is the Artificial Neural Network (ANN). ANN are computational systems that can be implemented in software or hardware and were developed under the influence of biological research on the human brain (Puchalsky et al., 2018). Many researchers agree that ANN are among the best predictors and the best performing non-linear analysis methods (Gu et al., 2020). The ANN architectures commonly used in the field of economics are Backpropagation and Radial Basis Function (RBF) networks.

Thus, several authors and researchers are concerned with the capability of different forecasting methods to solve economic problems. Some studies have compared ANN with traditional forecasting techniques (Gu et al., 2020; Puchalsky et al., 2018; Safari et al., 2016; Yu et al., 2006; Zhang et al., 2019).

In general, ANN outperformed these techniques and was able to capture dynamic non-linear trends, seasonal patterns, and the interactions between them. Therefore, this paper presents a comparative study of the performance of a regression model and ANN. What sets this study apart from many others published in the area is that it uses the Nelson-Plosser (Nelson & Plosser, 1982) and U.S. macroeconomic (Smets & Wouters, 2005) real-life data sets to analyse and evaluate both approaches. Each data set comprises fourteen U.S. economic time series widely used by researchers in this field. Furthermore, unlike most other studies in the field, this one uses Gaussian Process Regression (GPR), which is capable of yielding reliable out-of-sample predictions in the presence of highly non-linear unknown relationships between dependent and explanatory variables.

Regarding the remainder of this paper, section 2 delineates some related works in which econometrics models were compared to ANN. Section 3 consists of a brief background review of forecasting methods, and section 4 presents the accuracy methods used to evaluate the performance of the forecasting methods. The experimental evaluation, final results, and some discussions are described in section 5. Finally, section 6 comprises the conclusion of this paper and suggestions for future work.

2. Related works

As mentioned previously, forecasting techniques are fundamental in several fields of research, and several authors have explored these techniques to determine which method performs better. In this section, some comparative works are presented.

A systematic review was performed of articles published in indexed journals and present in at least one of the following databases: Web of Science, Scopus, or Academic Search Premier (ASP – EBSCO). The inclusion criteria were as follows: containing one or more of the chosen descriptors (i.e., <forecasting>, <method forecasting>, <time series forecasting> or <economic time series>) and published in English between 2016 and 2021. The exclusion criteria were conference abstracts and book chapters.

The first study (Puchalsky et al., 2018) aims to evaluate the performance of Wavelet Neural Networks (WNN) combined with five optimisation techniques to obtain the best time series forecast, applied to two case studies in the agribusiness sector. The performance of the optimisation techniques when training the WNN was compared to the well-established Backpropagation algorithm and the Extreme Learning Machine (ELM) using accuracy measures. In both cases analysed in the study, ELM outperforms most of the methods during validation procedures. However, during short- and long-term tests, the metaheuristic optimisation methods used for training outperformed ELM in almost all cases.

A time series study regarding quarterly observations on Gross Domestic Product (GDP) compares the results of a multi-layer perceptron network with those of the Autoregressive Integrated Moving Average (ARIMA) and regression as benchmark methods (Safi, 2016). Using the Root Mean Square Error, the empirical results show that ANN performs better than the traditional methods in forecasting GDP.

The work of Bandeira et al. (2020) proposes a forecasting application strategy considering two procedures: the combination of state-of-the-art forecasting methods and the selection of forecasting methods based on the accuracy of the models. The authors propose two combination strategies: a simple mean and a weighted mean based on the accuracy of the methods. They used two data sets with different characteristics: a public competition data set and a private data set of spare part demand from an elevator industry. The results showed that the combination of forecasting methods is valuable if a weighting scheme based on performance is employed. However, the combination using a simple mean outperforms other forecasting methods.

In another study, Zhang et al. (2019) present a comparison of the performance of two forecasting approaches. The authors compared econometric models and computational ANN algorithms, and the results show that the ANN approach produces the most accurate predictions for weekly and monthly data. On the other hand, econometric forecasting models produce better one-step-ahead predictions than ANN-based algorithms for daily data.

Finally, dos Santos et al. (2020) explore the use of integrated tools, aiming at the creation of a continuous decision aid system, a Digital Twin, using Discrete Event Simulation to describe and optimise the behaviour of a process with the aid of forecasting models based on the moving average, single exponential smoothing, and double exponential smoothing methods. The proposed approach was applied to a real case concerning a material supply process in Kanban stations of an aeronautical industry. The results of both techniques showed the great versatility of simulation and forecasting methods when used in an integrated way, forming a Digital Twin.

In general, the studies present a comparative evaluation of forecasting methods applied in different contexts and the results on each one’s performance. Nevertheless, to the best of our knowledge, no study has ever compared artificial neural networks and Gaussian process regression performance using Nelson-Plosser (1860–1970) and U.S. macroeconomic (1947–2009) real-life data.

3. Forecasting methods

Forecasting methods predict future values based on a given time series dataset by evaluating historical data and making assumptions on future trends. This can be applied to many areas of the decision-making process, such as operations management, risk management, economics, industrial process control, and demography (dos Santos et al., 2020).

Forecasting is a significant problem spanning many fields, including business and industry, government, economics, environmental sciences, medicine, social science, politics, and finance. Forecasting problems are often classified as short-term, medium-term, and long-term. Short-term forecasting problems involve predicting events within a short time span (days, weeks, and months). Medium-term forecasts can extend from 1 to 2 years into the future, and long-term forecasting problems can extend far beyond that (Montgomery et al., 2015).

In this study, econometric modelling methods are used to forecast over long-term and medium-term horizons using the Nelson-Plosser and U.S. macroeconomic series data.

3.1. Gaussian process regression

Gaussian process regression (GPR), a machine learning regression method developed in recent years, is a non-parametric model based on a Bayesian framework (Rasmussen & Nickisch, 2010). The GPR algorithm can adaptively determine the number of model parameters according to the information provided by training samples, add prior knowledge of the existing objects into the modelling process, and then combine the actual experimental data to obtain the posterior Gaussian process model (Fu et al., 2019).

GPR has been applied in various fields due to many desirable properties, such as the existence of explicit forms, the ease of obtaining and expressing uncertainty in predictions, the ability to capture a wide variety of behaviour through covariance functions, and a natural Bayesian interpretation (Wu & Wang, 2018). As an example, GPR has been used to capture randomness in wind energy, since wind variability is stochastic (Yan et al., 2016).

GPR can be considered a Bayesian non-parametric approach to regression, in which the unknown function is modelled as a realisation of a Gaussian process over a function space (Ballabio et al., 2019). A Gaussian process is a stochastic process fully specified by its mean and covariance functions (Rasmussen & Nickisch, 2010). GPR assumes that the output y of a function f with input x can be expressed as:

$$ y = f(x) + \varepsilon \qquad (1) $$

where $y = (y_1, y_2, \ldots, y_n)^T$ is the $n \times 1$ observation vector affected by noise; $x_i = (x_{i,1}, x_{i,2}, \ldots, x_{i,L})^T$, $x_i \in \mathbb{R}^L$, $i = 1, 2, \ldots, n$, are the input vectors, treated as random variables obeying a Gaussian distribution; $\varepsilon \sim N(0, \sigma_\varepsilon^2)$ is the observation noise, with components independent of each other, where $N$ denotes the normal distribution and $\sigma_\varepsilon^2$ is the error variance.

Gaussian process (GP) regression is a supervised machine learning problem that learns to map the relationship between inputs and the corresponding output values from a training set composed of input-output pairs (Fu et al., 2019). The GPR method first defines a prior distribution in the function space for Bayesian inference and, by learning from the sample data, can describe $f(x)$ indirectly and accurately (Fu et al., 2019).

GPR is based on GP modelling, where a GP is a set of random variables, any finite number of which have a joint Gaussian distribution (Ballabio et al., 2019; Fu et al., 2019), and its statistical properties are uniquely determined by its mean and covariance functions. In GPR, $f(x)$ is distributed as a Gaussian process:

$$ f(x) \sim GP\big(\mu(x),\, k(x, x^*)\big) \qquad (2) $$

where $f(x)$ is defined by its mean $\mu(x)$ and covariance $k(x, x^*)$; $x^*$ denotes the points at which predictions are made; and the variances $\sigma^2$ of the elements of $f(x)$ can be obtained from the diagonal of the covariance matrix.

Let the observation data set be $T = \{(x_i, y_i) \mid i = 1, 2, \ldots, n\}$, where $T$ also represents the training (learning) sample set. Given the training set $T$, the posterior distribution of the predicted value $y^*$ is:

$$ p(y^* \mid T, x^*) \sim N\big(\mu(y^*),\, k(x, x^*)\big) \qquad (3) $$

The covariance function $k$, also known as the GPR kernel, models the dependence between the function values at different values of $x$. In this study, the kernel chosen was the squared exponential covariance function, defined as:

$$ k(x_i, x_j; \theta) = \sigma_f^2 \exp\!\left( -\frac{1}{2} \sum_{l=1}^{L} d_l \big( x_i^l - x_j^l \big)^2 \right) + \sigma_n^2 \delta_{ij} \qquad (4) $$

where $x_i^l$ is the component of the input vector $x_i \in \mathbb{R}^L$ in dimension $l$; $\theta = \{\sigma_f^2, D, \sigma_n^2\}$ is the hyperparameter set of the kernel function, consisting of the signal variance $\sigma_f^2$, the noise variance $\sigma_n^2$, and the covariance function parameters; $D = \mathrm{diag}(d_1, \ldots, d_l, \ldots, d_L)$ is a symmetric matrix whose element $d_l$ reflects the degree of association between the corresponding input variable and the target output variable; and $\delta_{ij}$ is the Kronecker delta.

The GPR model’s building and testing processes are shown in Figure 1. In this study, the forecasting based on the GPR method can be divided into the GPR training model construction stage and the test stage.

Figure 1
GPR modelling and testing process. Source: Fu et al. (2019).

In general, the GPR training process is based on the Bayesian principle: the maximum posterior likelihood estimate of $\theta$ is taken as the optimal hyperparameter solution. Once the training of the GPR is completed, the corresponding covariance matrix can be computed for the prediction inputs $x^*$, and the predicted $y$-values are then obtained from the posterior distribution.
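To make the prediction step concrete, the sketch below implements the posterior mean and variance of Equations 1-4 in NumPy with a squared-exponential kernel. The synthetic series, the fixed hyperparameter values (signal variance, length scale, noise variance), and the function names are illustrative assumptions; the paper optimises the hyperparameters by maximum posterior likelihood rather than fixing them.

```python
# A minimal NumPy sketch of GPR prediction with a squared-exponential kernel,
# assuming fixed hyperparameters and a synthetic one-dimensional series.
import numpy as np

def sq_exp_kernel(X1, X2, signal_var=1.0, length_scale=1.0):
    """Squared-exponential covariance k(x_i, x_j) between two sets of inputs."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
    return signal_var * np.exp(-0.5 * d2 / length_scale ** 2)

def gpr_predict(X_train, y_train, X_test,
                signal_var=1.0, length_scale=1.0, noise_var=0.1):
    """Posterior mean and variance of f(x*) given noisy training data (Equation 3)."""
    K = sq_exp_kernel(X_train, X_train, signal_var, length_scale)
    K += noise_var * np.eye(len(X_train))           # observation noise on the diagonal
    K_s = sq_exp_kernel(X_train, X_test, signal_var, length_scale)
    K_ss = sq_exp_kernel(X_test, X_test, signal_var, length_scale)
    K_inv = np.linalg.inv(K)
    mean = K_s.T @ K_inv @ y_train                  # posterior mean at the test inputs
    cov = K_ss - K_s.T @ K_inv @ K_s                # posterior covariance
    return mean, np.diag(cov)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.linspace(0.0, 10.0, 30).reshape(-1, 1)   # hypothetical normalised inputs
    y = np.sin(X).ravel() + 0.1 * rng.standard_normal(30)
    X_new = np.array([[10.5], [11.0]])              # "x*" points to forecast
    mu, var = gpr_predict(X, y, X_new)
    print(mu, var)
```

In practice, a Cholesky factorisation of the kernel matrix would replace the explicit inverse for numerical stability, and the hyperparameters would be tuned on the training data.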

3.2. Artificial neural network

Artificial Neural Networks, which were originally inspired by research on the human brain, are a computational method that can be implemented in hardware or software (Puchalsky et al., 2018; Samadianfard et al., 2020). They have been used extensively in different application areas, especially for non-linear time series modelling (Chen et al., 2020; Li et al., 2019; Sun et al., 2019). ANN have several advantages over other forecasting models, such as the capacity to fit complex non-linear functions (Büyükşahin & Ertekin, 2019). The flexibility and non-linear learning capabilities of ANN make this method well suited to forecasting research (Gupta et al., 2017). As a prevalent modelling method, ANN have been used to identify complicated non-linear relationships between inputs and outputs (Nourani et al., 2021).

ANN provide a flexible computation framework for non-linear modelling in several applications, since the number of layers and the number of neurons at each layer can easily vary. Moreover, ANN do not demand any prior assumption, such as input data stationarity, and the characteristics of the data largely determine the ANN configuration (Büyükşahin & Ertekin, 2019).

ANN combines several processing layers, using simple elements operating in parallel. It consists of an input layer, one or more hidden layers, and an output layer. Each layer contains several neurons that can modify the inputs with weights and activation functions to obtain the output. In this study, a three-layered structure composed of (i) input layer, (ii) hidden layer, and (iii) output layer is used, as seen in Figure 2.

Figure 2
Neural Network architecture. Source: Authors.

The mathematical formulation of ANN models can be expressed by Equation 5, where, at any given time $t$, $w_{ij}$ and $w_j$ are model weights, $w_0$ is the threshold (bias), $H$ and $N$ are the numbers of hidden and input nodes, respectively, $f$ is the hidden-layer activation function, and $e_t$ is a noise or error term.

$$ y_t = w_0 + \sum_{j=1}^{H} w_j\, f\!\left( w_{0j} + \sum_{i=1}^{N} w_{ij}\, y_{t-i} \right) + e_t \qquad (5) $$

Some well-established ANN techniques applicable to engineering problems are the multi-layer perceptron and the radial basis function network (Sadeghi et al., 2021), both of which are briefly described in the following sections.

3.2.1. Multi-layer Perceptron network

The multi-layer perceptron network (MLP), also called the Backpropagation network, is a feedforward neural network with a supervised learning rule to search for weight values, employing a linear activation function, an approach that tends to solve complex problems (Madhiarasan & Deepa, 2017).

These networks learn linear and non-linear relationships between the input and output vectors because of the hidden-layer neurons and their non-linear transfer function. The most commonly used non-linear transfer function is the hyperbolic tangent sigmoid activation function, applied over the net input of the hidden layer to obtain the respective output. Backpropagation gradient descent is employed to train multi-layer perceptron networks. These networks are fully connected, which induces faster convergence of the network (Madhiarasan & Deepa, 2017).

MLP is a supervised learning algorithm that learns a non-linear function and maps inputs to outputs by training on a dataset (Feng et al., 2020). The topology of the MLP neural network includes an input layer, one hidden layer, and an output layer, and its operation can be divided into two steps: feedforward and backpropagation. In the feedforward step, an input pattern is applied to the input layer, and its effect propagates, layer by layer, through the network until the output is produced. The current network output value is then compared to the expected output, and an error signal is computed for each of the output nodes.
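As an illustration of the feedforward and backpropagation steps described above, the following NumPy sketch evaluates Equation 5 with a tanh hidden layer and applies one gradient descent update per call. The layer sizes, learning rate, and the single training pattern are hypothetical; the actual network configurations used in the study are not reproduced here.

```python
# A minimal NumPy sketch of the MLP forward pass (Equation 5) and one
# backpropagation update on the squared error, under illustrative assumptions.
import numpy as np

def mlp_forward(x, W_hidden, b_hidden, w_out, b_out):
    """y = w0 + sum_j w_j * tanh(w0j + sum_i w_ij * x_i)."""
    h = np.tanh(W_hidden @ x + b_hidden)      # hidden layer with tanh activation
    return w_out @ h + b_out, h

def backprop_step(x, target, W_hidden, b_hidden, w_out, b_out, lr=0.01):
    y, h = mlp_forward(x, W_hidden, b_hidden, w_out, b_out)
    err = y - target                          # output error signal
    # Gradients of 0.5 * err^2 with respect to each parameter group.
    grad_w_out = err * h
    grad_b_out = err
    delta_h = err * w_out * (1.0 - h ** 2)    # tanh'(z) = 1 - tanh(z)^2
    grad_W_hidden = np.outer(delta_h, x)
    grad_b_hidden = delta_h
    return (W_hidden - lr * grad_W_hidden, b_hidden - lr * grad_b_hidden,
            w_out - lr * grad_w_out, b_out - lr * grad_b_out)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n_inputs, n_hidden = 4, 8                 # hypothetical lagged inputs and hidden nodes
    W_h = 0.1 * rng.standard_normal((n_hidden, n_inputs))
    b_h = np.zeros(n_hidden)
    w_o = 0.1 * rng.standard_normal(n_hidden)
    b_o = 0.0
    x, target = rng.random(n_inputs), 0.7     # one normalised training pattern
    for _ in range(100):
        W_h, b_h, w_o, b_o = backprop_step(x, target, W_h, b_h, w_o, b_o)
    print(mlp_forward(x, W_h, b_h, w_o, b_o)[0])
```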

3.2.2. Radial basis functions neural network

A radial basis function neural network (RBF) is a feedforward neural network that uses radial basis functions as its activation functions. Unlike in backpropagation networks, the connections between the input and hidden layers are not weighted, and the transfer functions on the hidden-layer nodes are radial basis functions; as a result, RBF networks generally train faster than MLP (Zhang et al., 2013).

The RBF is a general class of non-linear, three-layer feedforward neural networks comprising (i) an input layer, (ii) a hidden layer with neurons, and (iii) an output layer with one or several nodes (Rani & Victoire, 2018). Instead of using a sigmoid activation function, it computes a linear combination of $n$ radially symmetric basis functions centred at given points. Formally, for a given input $x$, the network output $y$ can be written as:

$$ y = \sum_{i=1}^{n} \omega_i R_i(x) + \omega_0 \qquad (6) $$

where $\omega_i$ are the weights, $\omega_0$ is a bias term, $n$ denotes the number of neurons in the hidden layer, and $R_i$ are the activation functions, given by:

$$ R_i(x) = \varphi\big( \lVert x - c_i \rVert \big) \qquad (7) $$

where $\varphi$ is the radial function providing the non-linear feature of the model, and $c_i$ represents the so-called RBF centres.

The most popular RBF is the Gaussian function:

$$ \varphi(r) = \exp\!\left( -r^2 / \sigma^2 \right) \qquad (8) $$

with $r$ indicating the Euclidean distance between the input vector $x$ and the centre $c_i$, and $\sigma$ being the so-called spread parameter to be determined.
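The sketch below assembles the RBF output of Equations 6-8 in NumPy. Fixing the centres at a subset of the training inputs and solving for the output weights by least squares is an illustrative assumption, since the paper does not detail how its RBF network was trained.

```python
# A minimal NumPy sketch of an RBF network (Equations 6-8) with Gaussian basis
# functions; centres, spread, and the toy series are illustrative assumptions.
import numpy as np

def rbf_design_matrix(X, centres, spread):
    """phi(||x - c_i||) for every input/centre pair, plus a bias column (omega_0)."""
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=-1)
    Phi = np.exp(-d2 / spread ** 2)                        # Equation 8
    return np.hstack([np.ones((len(X), 1)), Phi])

def fit_rbf(X_train, y_train, centres, spread):
    """Least-squares estimate of the output weights for fixed centres and spread."""
    Phi = rbf_design_matrix(X_train, centres, spread)
    weights, *_ = np.linalg.lstsq(Phi, y_train, rcond=None)
    return weights

def predict_rbf(X, centres, spread, weights):
    return rbf_design_matrix(X, centres, spread) @ weights  # Equation 6

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    X = np.linspace(0.0, 1.0, 40).reshape(-1, 1)            # normalised inputs
    y = np.sin(6 * X).ravel() + 0.05 * rng.standard_normal(40)
    centres, spread = X[::5], 0.2                           # hypothetical centres and spread
    w = fit_rbf(X, y, centres, spread)
    print(predict_rbf(np.array([[0.95]]), centres, spread, w))
```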

4. Methodology for data analysis

This section describes the forecasting procedure to develop the ANN and GPR models using two real data sets for the study.

4.1. Data set

Two data sets were considered for the forecasting evaluation: the first was the Nelson-Plosser data set (Table 1), and the second was the U.S. macroeconomic time series (Table 2). Both include fourteen time series.

Table 1
The Nelson-Plosser data set (1860-1970).
Table 2
U.S. macroeconomic series, 1947-2009.

The Nelson-Plosser data set is a collection of U.S. macroeconomic time series that has been modelled dynamically through econometric regression models and is currently described as one of the classic collections of series for forecasting studies. The Nelson-Plosser series are annual and include variables such as Real Gross National Product (GNP), Stock Prices, Real Money, and the Unemployment Rate.

The second data set comprises fourteen U.S. macroeconomic time series observed quarterly, available from January 1947 to January 2009. It contains more recent series than the classic Nelson-Plosser database. Despite covering more recent data on the North American economy, the study was limited to the database available from the Federal Reserve Bank of St. Louis (FRED, 2021).

This study considered the Consumer Price Index (CPI) as the forecasting target (highlighted in grey in Tables 1 and 2) and used the remaining series as input sets to adjust the predictor models.

The original CPI data from Nelson-Plosser covered the years 1860 through 1970, as shown in Figure 3a, and CPI from U.S. macroeconomic covered the years 1947 to 2009, as shown in Figure 3b.

Figure 3
Consumer Price Index. Source: Authors.

The Consumer Price Index, also called the cost-of-living index, measures the average change over time in the prices paid by urban consumers for a representative basket of consumer goods and services, including everything from food items to automobiles to rent (Konny, 2020). As the most widely used measure of inflation, the CPI is an indicator of the effectiveness of government policies.

Figures 4 and 5 show the graphs of the input time series for both data sets.

Figure 4
Nelson-Plosser. Source: Authors.
Figure 5
U.S. Macroeconomic Series. Source: Authors.

4.2. Qualitative analysis

This paper proposes and analyses the use of a selection strategy for choosing the best forecasting model based on the performance of forecasting accuracy measures. There is no specific rule governing the data split in the literature; however, it is generally agreed that most data points should be used for model building (Qi & Zhang, 2008). The selected model is designed using the training intervals and is evaluated on the testing interval for the long-term and medium-term horizons to analyse its performance on future samples.

Three scenarios were proposed for this study. The first and second long-term prediction scenarios spanned 10 and 5 years, and in the third case, a medium-term prediction of 2 years was used for the two data sets. Since U.S. macroeconomic time series are quarterly, four observations cover one year.

4.3. Pre-processing

The data were normalised to values between zero and one, and missing values were replaced by the mean value of each variable series. Normalisation was necessary to ensure that values measured on different scales were standardised.
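A minimal NumPy sketch of this pre-processing step is shown below; the toy two-column matrix stands in for the fourteen-series data sets and is purely illustrative.

```python
# A minimal NumPy sketch of the pre-processing step: missing values replaced by
# each series' mean, then every series rescaled to [0, 1].
import numpy as np

def preprocess(series_matrix):
    """Mean-impute NaNs column-wise, then min-max normalise each column to [0, 1]."""
    X = series_matrix.astype(float).copy()
    col_means = np.nanmean(X, axis=0)
    nan_rows, nan_cols = np.where(np.isnan(X))
    X[nan_rows, nan_cols] = col_means[nan_cols]      # mean imputation per series
    mins, maxs = X.min(axis=0), X.max(axis=0)
    return (X - mins) / (maxs - mins)                # scale each series to [0, 1]

if __name__ == "__main__":
    raw = np.array([[100.0, 2.0],
                    [110.0, np.nan],
                    [130.0, 2.6],
                    [125.0, 3.1]])
    print(preprocess(raw))
```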

4.4. Software

The algorithms were developed using Matlab® R2020a on a computer with 4 GB of RAM, a 3.41 GHz Intel® Core processor, and a 32-bit Windows® operating system.

5. Experimental evaluation

This section formally compares the forecasting capabilities of the proposed GPR, MLP, and RBF models. The comparison among the techniques was based on the most widespread mean value measures for forecasting: (a) the Mean Squared Error (MSE); (b) the Root Mean Squared Error (RMSE); (c) the Mean Absolute Error (MAE); and (d) the Mean Absolute Percentage Error (MAPE).

5.1. Forecasting accuracy measurements

The first measurement, the MSE, is the average of the squared errors. Since the prediction errors are squared in the MSE calculation, larger errors have a much stronger influence on it. This property makes the MSE worthwhile when large errors are undesirable, but it also heavily penalises outliers (Büyükşahin & Ertekin, 2019). Equation 9 shows the MSE metric, where $e_t = y_t - \hat{y}_t$, $y_t$ is the observed value, $\hat{y}_t$ is the forecasted value at time $t$, and $n$ is the forecasting horizon.

$$ MSE = \frac{1}{n} \sum_{t=1}^{n} e_t^2 \qquad (9) $$

The RMSE, also known as the root mean square deviation (RMSD), is a quadratic scoring rule that measures the average magnitude of the error. It is the square root of the average squared difference between prediction and actual observation, as shown in Equation 10.

$$ RMSE = \sqrt{ \frac{1}{n} \sum_{t=1}^{n} e_t^2 } \qquad (10) $$

The RMSE statistic provides information about the performance of a model by allowing a term-by-term comparison of the actual difference between the estimated and the measured values. The smaller the RMSE value, the better the model's performance.

The MAE method measures the average magnitude of errors in a set of predictions without considering their direction. It is the average of absolute differences between prediction and actual observation over the test sample where all individual differences have equal weight, as seen in Equation 11.

$$ MAE = \frac{1}{n} \sum_{t=1}^{n} \lvert e_t \rvert \qquad (11) $$

The mean absolute percentage error is the percentage equivalent of MAE. Equation 12 shows the MAPE:

$$ MAPE = \frac{1}{n} \sum_{t=1}^{n} \left\lvert \frac{e_t}{y_t} \right\rvert \times 100 \qquad (12) $$
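The four measures can be computed directly from the forecast errors, as in the NumPy sketch below; the actual and forecast vectors are hypothetical values used only to illustrate the calculation.

```python
# A minimal NumPy sketch of the accuracy measures in Equations 9-12.
import numpy as np

def forecast_errors(actual, forecast):
    e = actual - forecast                      # e_t = y_t - y_hat_t
    mse = np.mean(e ** 2)                      # Equation 9
    rmse = np.sqrt(mse)                        # Equation 10
    mae = np.mean(np.abs(e))                   # Equation 11
    mape = np.mean(np.abs(e / actual)) * 100   # Equation 12
    return {"MSE": mse, "RMSE": rmse, "MAE": mae, "MAPE": mape}

if __name__ == "__main__":
    actual = np.array([0.42, 0.45, 0.47, 0.50, 0.52])    # hypothetical observed CPI values
    forecast = np.array([0.40, 0.46, 0.49, 0.48, 0.55])  # hypothetical forecasts
    print(forecast_errors(actual, forecast))
```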

5.2. Forecasting results and discussions

5.2.1. Long-term estimate (ten-period-ahead forecast)

In the first scenario, classified as long-term (Montgomery et al., 2015), the MLP and GPR methods performed well for a ten-period-ahead forecast horizon, obtaining good MAPE results. Consequently, their MSE, RMSE, and MAE values also indicate good performance. However, the RBF model did not show satisfactory results compared to the other methods, as shown in Table 3.

Table 3
Results of the ten-period-ahead forecast.

In this case, if we assess only the RMSE value, the RBF model could be a possible solution for a ten-period-ahead forecast horizon. However, when analysing the MAPE value, the RBF model presents a high percentage of forecasting errors. Therefore, comparing all models, the MLP and GPR models are the most suited for this case.

In an investigation of long-term forecast horizons (Torra & Claveria, 2017), the authors found that the RBF network surpasses the GPR model, unlike the results of our study. Thus, in the first scenario, the three models had good forecasting accuracy, but GPR outperformed the ANN models.

Figure 6 shows the results obtained by different methods for a ten-period-ahead forecast horizon. For better visualisation, the graphics below show the ten years prior to the prediction.

Figure 6
Results of the ten-period-ahead forecast. Source: Authors

The two other scenarios that were subjected to the forecast are described in the following sections.

5.2.2. Long-term estimation (five-period-ahead forecast)

In another long-term scenario, the results obtained for a five-period-ahead forecast horizon showed that the MLP and GPR models presented good accuracy values. In contrast, the RBF model once again did not show a satisfactory performance, similar to the first scenario. Table 4 shows the results.

Table 4
Results of the five-period-ahead forecast.

Figure 7 shows the sample forecast graphically for a five-year horizon. For better visualisation, the graphics show the ten years before the start of the prediction. As in the first scenario, the performance of MLP and GPR was good; however, whereas GPR performed slightly better than MLP in the first scenario, in this second scenario MLP had better performance than GPR.

Figure 7
Results of the five-period-ahead forecast. Source: Authors.

Similar to the first scenario, the RBF model did not present a satisfactory result. This time the model performed even worse, presenting a MAPE value quite discrepant in comparison with the other two models. This result shows that not all neural network models outperform the regression model in this forecasting horizon.

5.2.3. Medium-term estimation (two-period-ahead forecast)

The third scenario is classified as medium-term (Montgomery et al., 2015). The forecasting quality can also be observed when analysing the two-period-ahead forecast, whose results are presented in Table 5.

Table 5
Results of the two-period-ahead forecast.

Figure 8 shows the sample forecast graphically for a two-year horizon. For better visualisation, the graphics show the ten years before the start of the prediction. According to the predictive accuracy measures, GPR is the best predictor, followed by MLP, similar to the previous scenarios.

Figure 8
Results of the two-period-ahead forecast. Source: Authors.

This study implies that the MLP and GPR models are the most indicated for econometric data forecasts. The findings of this study complement ongoing research with the knowledge that, although ANN models are generally superior to regression models, both can be successfully applied for estimation (Safari et al., 2016; Zhang et al., 2019). These findings indicate that no particular model is the best in all situations (Zhang et al., 2019).

Regarding the models used in this study (MLP, RBF, and GPR), it is worth mentioning that the effect of the model parameters on the prediction accuracy is not discussed here. Better performance would be expected if factors such as hyperparameters were further optimised and multiple techniques effectively integrated, which is the focus of our future research.

6. Conclusions

Considering the recent worldwide economic problems, ANN are one of the most promising tools to forecast trends and are widely studied in economic analysis.

Several authors voice concerns about the performance of different forecasting methods for solving economic problems. This study examines the forecasting performance of ANN models compared with GPR using econometric Nelson-Plosser and U.S. macroeconomic data sets.

The MLP model is the most extensively used neural network. In this study, the MLP showed results similar to those of GPR and presented a satisfactory performance for all forecasting horizons, indicating that MLP can provide good forecasting accuracy for econometric data. In the first and third scenarios, the GPR model showed good results and surpassed the ANN models, showing that it can also provide good forecasting accuracy for econometric data.

The RBF model presented large error values for all forecast horizons. The high values in all metrics indicate this neural network is unsuitable for econometric data, reinforcing the current literature that no particular model is the best for all situations.

In future works, other forecasting methods should be tested and compared with ANN results when assessing whether ANN models outperform traditional econometric forecasting methods. The design of experiments should be used to delineate ANN and GPR hyperparameters. Another possible course for future studies is the development of a hybrid forecasting method that combines econometrics and ANN. We also suggest that future studies investigate whether the frequency of the series can influence the quality of predictions.

References

  • Ballabio, C., Lugato, E., Fernandez-Ugalde, O., Orgiazzi, A., Jones, A., Borrelli, P., Montanarella, L., & Panagos, P. (2019). Mapping LUCAS topsoil chemical properties at European scale using Gaussian process regression. Geoderma, 355, 113912. http://dx.doi.org/10.1016/j.geoderma.2019.113912
  • Bandeira, S. G., Alcalá, S. G. S., Vita, R. O., & Barbosa, T. M. G. A. (2020). Comparison of selection and combination strategies for demand forecasting methods. Production, 30, e20200009. https://doi.org/10.1590/0103-6513.20200009
  • Brooks, C., Hoepner, A. G. F., Mcmillan, D., Vivian, A., & Wese Simen, C. (2019). Financial data science: the birth of a new financial research paradigm complementing econometrics? The European Journal of Finance, 25(17), 1627-1636. Retrieved in 2021, May 15, from http://www.tandfonline.com/doi/abs/10.1080/1351847X.2019.1662822
  • Büyükşahin, Ü. Ç., & Ertekin, Ş. (2019). Improving forecasting accuracy of time series data using a new ARIMA-ANN hybrid method and empirical mode decomposition. Neurocomputing (Amsterdam), 361, 151-163. http://dx.doi.org/10.1016/j.neucom.2019.05.099
  • Chen, J., Jiang, F., Li, H., & Xu, W. (2016). Chinese stock market volatility and the role of U.S. economic variables. Pacific-Basin Finance Journal, 39, 70-83. http://dx.doi.org/10.1016/j.pacfin.2016.05.013
  • Chen, Z., Lin, X., Xiong, C., & Chen, N. (2020). Modeling the relationship of precipitation and water level using grid precipitation products with a neural network model. Remote Sensing, 12(7), 1096. http://dx.doi.org/10.3390/rs12071096
  • dos Santos, C. H., Lima, R. D. C., Leal, F., de Queiroz, J. A., Balestrassi, P. P., & Montevechi, J. A. B. (2020). A decision support tool for operational planning: a Digital Twin using simulation and forecasting methods. Production, 30, e20200018. Retrieved in 2021, May 15, from http://www.scielo.br/scielo.php?script=sci_arttext&pid=S0103-65132020000100708&nrm=iso
  • Dritsakis, N., & Klazoglou, P. (2018). Forecasting unemployment rates in USA using Box-Jenkins methodology. International Journal of Economics and Financial Issues, 8(1), 9-20. Retrieved in 2021, May 15, from https://doaj.org/article/3e560af0e38c40409586478cc51c7d43
  • Feng, X., Ma, G., Su, S.-F., Huang, C., Boswell, M. K., & Xue, P. (2020). A multi-layer perceptron approach for accelerated wave forecasting in Lake Michigan. Ocean Engineering, 211(1-2), 107526.
  • FRED. (2021). FRED: Federal Reserve Economic Data. St. Louis, MO: Federal Reserve Bank of St. Louis. Retrieved in 2021, May 15, from https://fred.stlouisfed.org/
  • Fu, Q., Shen, W., Wei, X., Zheng, P., Xin, H., & Zhao, C. (2019). Prediction of the diet nutrients digestibility of dairy cows using Gaussian process regression. Information Processing in Agriculture, 6(3), 396-406. http://dx.doi.org/10.1016/j.inpa.2018.11.005
  • Gu, S., Kelly, B., & Xiu, D. (2020). Empirical asset pricing via machine learning. Review of Financial Studies, 33(5), 2223-2273. http://dx.doi.org/10.1093/rfs/hhaa009
  • Gupta, P., Batra, S. S., & Jayadeva (2017). Sparse short-term time series forecasting models via minimum model complexity. Neurocomputing (Amsterdam), 243, 1-11. http://dx.doi.org/10.1016/j.neucom.2017.02.002
  • Khraisha, T. (2020). Complex economic problems and fitness landscapes: assessment and methodological perspectives. Structural Change and Economic Dynamics, 52, 390-407. http://dx.doi.org/10.1016/j.strueco.2019.01.002
  • Konny, C. (2020). Modernizing data collection for the Consumer Price Index. Business Economics (Cleveland, Ohio), 55(1), 45-52. http://dx.doi.org/10.1057/s11369-019-00146-3
  • Li, Y., Liu, R. W., Liu, Z., & Liu, J. (2019). Similarity grouping-guided neural network modeling for maritime time series prediction. IEEE Access: Practical Innovations, Open Solutions, 7(99), 72647-72659. http://dx.doi.org/10.1109/ACCESS.2019.2920436
  • Madhiarasan, M., & Deepa, S. (2017). Comparative analysis on hidden neurons estimation in multi layer perceptron neural networks for wind speed forecasting. Artificial Intelligence Review, 48(4), 449-471. http://dx.doi.org/10.1007/s10462-016-9506-6
  • Montgomery, D. C., Jennings, C., & Kulahci, M. (2015). Introduction to time series analysis and forecasting. Hoboken: Wiley.
  • Nelson, C. R., & Plosser, C. R. (1982). Trends and random walks in macroeconomic time series. Journal of Monetary Economics, 10(2), 139-162. http://dx.doi.org/10.1016/0304-3932(82)90012-5
  • Nourani, V., Paknezhad, N. J., & Tanaka, H. (2021). Prediction Interval Estimation Methods for Artificial Neural Network (ANN)-based modeling of the hydro-climatic processes, a review. Sustainability, 13(4), 1633. http://dx.doi.org/10.3390/su13041633
  • Puchalsky, W., Ribeiro, G. T., da Veiga, C. P., Freire, R. Z., & Santos Coelho, L. (2018). Agribusiness time series forecasting using Wavelet neural networks and metaheuristic optimisation: an analysis of the soybean sack price and perishable products demand. International Journal of Production Economics, 203, 174-189. http://dx.doi.org/10.1016/j.ijpe.2018.06.010
  • Qi, M., & Zhang, G. P. (2008). Trend time-series modeling and forecasting with neural networks. IEEE Transactions on Neural Networks, 19(5), 808-816. http://dx.doi.org/10.1109/TNN.2007.912308
  • Rani R., H. J., & Victoire T., A. A. (2018). Training radial basis function networks for wind speed prediction using PSO enhanced differential search optimiser. PLoS One, 13(5), e0196871. http://dx.doi.org/10.1371/journal.pone.0196871
  • Rasmussen, C. E., & Nickisch, H. (2010). Gaussian Processes for Machine Learning (GPML) Toolbox. Journal of Machine Learning Research, 11, 3011-3015.
  • Robinson, L., Schulz, J., Ragnedda, M., Pait, H., Kwon, K. H., & Khilnani, A. (2021). An Unequal Pandemic: vulnerability and COVID-19. The American Behavioral Scientist, 1-5. https://doi.org/10.1177/00027642211003141
  • Sadeghi, G., Pisello, A. L., Nazari, S., Jowzi, M., & Shama, F. (2021). Empirical data-driven multi-layer perceptron and radial basis function techniques in predicting the performance of nanofluid-based modified tubular solar collectors. Journal of Cleaner Production, 295, 126409. http://dx.doi.org/10.1016/j.jclepro.2021.126409
  • Safari, M.-J.-S., Aksoy, H., & Mohammadi, M. (2016). Artificial neural network and regression models for flow velocity at sediment incipient deposition. Journal of Hydrology (Amsterdam), 541, 1420-1429. http://dx.doi.org/10.1016/j.jhydrol.2016.08.045
  • Safi, S. K. (2016). A comparison of artificial neural network and time series models for forecasting GDP in Palestine. American Journal of Theoretical and Applied Statistics, 5(2), 58. http://dx.doi.org/10.11648/j.ajtas.20160502.13
  • Samadianfard, S., Hashemi, S., Kargar, K., Izadyar, M., Mostafaeipour, A., Mosavi, A., Nabipour, N., & Shamshirband, S. (2020). Wind speed prediction using a hybrid model of the multi-layer perceptron and whale optimisation algorithm. Energy Reports, 6, 1147-1159. Retrieved in 2021, May 15, from https://doaj.org/article/721f11b629bb4820a171614924eab0ce
  • Smets, F., & Wouters, R. (2005). Comparing shocks and frictions in US and euro area business cycles: a Bayesian DSGE approach. Journal of Applied Econometrics, 20(2), 161-183. http://dx.doi.org/10.1002/jae.834
  • Sun, S., Lu, H., Tsui, K.-L., & Wang, S. (2019). Nonlinear vector auto-regression neural network for forecasting air passenger flow. Journal of Air Transport Management, 78, 54-62. http://dx.doi.org/10.1016/j.jairtraman.2019.04.005
  • Torra, S., & Claveria, O. (2017). Regional tourism demand forecasting with machine learning models: Gaussian process regression vs. neural network models in a multiple-input multiple-output setting (Vol. 201701). Barcelona: Regional Quantitative Analysis Group, University of Barcelona.
  • Wu, R., & Wang, B. (2018). Gaussian process regression method for forecasting of mortality rates. Neurocomputing (Amsterdam), 316, 232-239. http://dx.doi.org/10.1016/j.neucom.2018.08.001
  • Yan, J., Li, K., Bai, E., Yang, Z., & Foley, A. (2016). Time series wind power forecasting based on variant Gaussian process and TLBO. Neurocomputing (Amsterdam), 189, 135-144. http://dx.doi.org/10.1016/j.neucom.2015.12.081
  • Yu, R., Leung, P., & Bienfang, P. (2006). Predicting shrimp growth: artificial neural network versus nonlinear regression models. Aquacultural Engineering, 34(1), 26-32. http://dx.doi.org/10.1016/j.aquaeng.2005.03.003
  • Zhang, X., Liu, Y., Yang, M., Zhang, T., Young, A. A., & Li, X. (2013). Comparative study of four time series methods in forecasting typhoid fever incidence in China. PLoS One, 8(5), e63116. http://dx.doi.org/10.1371/journal.pone.0063116
  • Zhang, X., Xue, T., & Eugene Stanley, H. (2019). Comparison of econometric models and artificial neural networks algorithms for the prediction of baltic dry index. IEEE Access: Practical Innovations, Open Solutions, 7(99), 1647-1657. http://dx.doi.org/10.1109/ACCESS.2018.2884877
