Macroeconomic forecasting is a very difficult task due to the lack of an accurate, convincing model of the economy. The most accurate models for economic forecasting, “black box” time series models, assume little about the structure of the economy. Constructing reliable time series models is challenging due to short data series, high noise levels, nonstationarities, and nonlinear effects. This paper describes these challenges and surveys some neural network solutions to them. Important issues include balancing the bias/variance tradeoff and the noise/nonstationarity tradeoff. The methods surveyed include hyperparameter selection (regularization parameter and training window length), input variable selection and pruning, network architecture selection and pruning, new smoothing regularizers, and committee forecasts. Empirical results are presented for forecasting the U.S. Index of Industrial Production. These demonstrate that, relative to conventional linear time series and regression methods, superior performance can be obtained using state-of-the-art neural network models.
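Two of the ingredients the abstract lists — a regularization hyperparameter and committee forecasts — can be sketched with standard tools. This is an illustrative toy, not the paper's models: the synthetic series, lag count, and `alpha` value are all invented, with `alpha` playing the role of the weight-decay regularization parameter.

```python
# Toy sketch of a regularized neural-net forecaster with a committee
# average; data and hyperparameters are illustrative, not the paper's.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
y = np.cumsum(rng.normal(0.1, 1.0, 300))          # synthetic noisy trend

# Lagged inputs: predict y[t] from y[t-4..t-1]
lags = 4
X = np.column_stack([y[i:len(y) - lags + i] for i in range(lags)])
t = y[lags:]
X_tr, X_te, t_tr, t_te = X[:250], X[250:], t[:250], t[250:]

# Committee of nets differing only in their random initial weights;
# alpha is the L2 weight-decay regularization hyperparameter.
committee = [
    MLPRegressor(hidden_layer_sizes=(8,), alpha=0.1,
                 max_iter=3000, random_state=s).fit(X_tr, t_tr)
    for s in range(5)
]
forecast = np.mean([m.predict(X_te) for m in committee], axis=0)
rmse = np.sqrt(np.mean((forecast - t_te) ** 2))
print(round(float(rmse), 3))
```

Averaging the committee members reduces the variance contribution to forecast error, which is exactly the bias/variance balancing the abstract refers to.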

To mitigate the problem of a data-scarce environment, spatial panel data models are becoming increasingly common practice in regional forecasting. Many authors explicitly mention that spatial effects are necessary to capture, for example, regional spillovers (Baltagi et al., 2014). In this context, the study by Mayor and Patuelli (2012) is worth mentioning. They explicitly examine the trade-off between the cross-section and time dimensions. The main message from their exercise is that time series models work better than spatial models when the time dimension is larger than the cross section. In cases where the two dimensions are nearly equal, spatial models outperform more sophisticated time series models. Additionally, Rapach and Strauss (2012) emphasise the outstanding role of labour market data in studying such methodological differences. In this way, future regional economic forecasting literature may be able to improve forecasting performance by simply applying models that are designed for different cross-section and time-dimension setups.


Abstract. This paper applies component-wise boosting to the topic of regional economic forecasting. Using unique quarterly gross domestic product data for one German state for the period from 1996 to 2013, in combination with a large data set of 253 monthly indicators, we show how accurate the forecasts obtained from component-wise boosting are compared to a simple benchmark model. We additionally take a closer look into the algorithm and evaluate whether a stable pattern of selected indicators exists across time and four different forecasting horizons. All in all, boosting is a viable method for forecasting regional GDP, especially one and two quarters ahead. We also find that regional survey results, indicators that mirror the Saxon economy, and the OECD Composite Leading Indicators are frequently selected by the algorithm.
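The core of component-wise (L2) boosting is easy to state: each iteration fits a univariate least-squares base learner to every candidate indicator, keeps only the best-fitting one, and updates the fit along that component with a shrunken step. A minimal sketch, with invented data standing in for the 253 monthly indicators:

```python
# Minimal component-wise L2 boosting: per iteration, fit each single
# indicator to the residuals and update only along the best one.
import numpy as np

def componentwise_boost(X, y, steps=200, nu=0.1):
    n, p = X.shape
    intercept = y.mean()
    resid = y - intercept
    coef = np.zeros(p)
    selected = []
    for _ in range(steps):
        ssr = np.empty(p)
        betas = np.empty(p)
        for j in range(p):
            # Univariate least-squares fit of indicator j to residuals
            b = X[:, j] @ resid / (X[:, j] @ X[:, j])
            betas[j] = b
            ssr[j] = np.sum((resid - b * X[:, j]) ** 2)
        j_best = int(np.argmin(ssr))            # best-fitting component
        coef[j_best] += nu * betas[j_best]      # shrunken update
        resid -= nu * betas[j_best] * X[:, j_best]
        selected.append(j_best)
    return intercept, coef, selected

rng = np.random.default_rng(1)
X = rng.normal(size=(80, 20))                   # 20 candidate indicators
y = 2.0 * X[:, 3] - 1.5 * X[:, 7] + rng.normal(scale=0.5, size=80)
_, coef, sel = componentwise_boost(X, y)
print(sorted(set(sel)))                         # indicators ever selected
```

Because only one indicator enters per step, the selection frequencies over iterations (and over forecast origins) give exactly the kind of "stable pattern of selected indicators" the abstract examines.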


In making this transformation from a MoP to a true MoE, there is much that official forecasters in transition economies can learn about economic forecasting from the experience of industrialized countries such as Canada. A brief overview of official economic forecasting as it is currently practised in Canada is provided in this paper. This includes econometric model-based forecasting at the Department of Finance and the Bank of Canada and the recent prudent approach to fiscal forecasting initiated by the current Finance Minister to reduce the risk of unpleasant deficit surprises and thereby gain more credibility for fiscal policy. The adoption of the prudent approach underlines the perceived weakness of economic forecasting even in advanced industrialized economies. A summary of the advantages of forecasting with macroeconomic models, based on the Canadian experience, is presented in the paper.


For this reason, our objective in this paper is to find out the extent of the influence of variables representing the internal and external monetary conditions, i.e. the domestic interest rate, the external interest rate and the exchange rate, on Tunisian macroeconomic aggregates such as economic activity and inflation. We thus focus on the construction of a monetary conditions index (MCI) for Tunisian economic forecasting. For this purpose, VAR and structural VAR models are used for dynamic analysis. Our contribution is based on three steps. Firstly, we calculate the weights assigned to the interest rate and the exchange rate based on the coefficients estimated for these two variables over the period 1965-2015. The estimation approach adopted is a VAR model using stationary variables. At this level, we begin with a narrow definition to construct the Tunisian MCI. Secondly, a broader definition, including several macroeconomic variables, will be discussed, using a cointegration procedure and a vector error correction model (VECM). Thirdly, we attempt, through a SVAR model, to examine the short-run structural dynamics between the selected variables, tracing the responses of aggregate economic variables to structural shocks.
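Once the weights for the two variables are in hand, an MCI is just a weighted sum of deviations from a base period. A hedged sketch of that final step — the coefficient values, series, and normalization here are invented for illustration, not taken from the paper:

```python
# Illustrative MCI construction: weights are the (normalized) estimated
# coefficients on the interest rate and exchange rate; numbers invented.
import numpy as np

def mci(rate, fx, beta_rate, beta_fx, base=0):
    """Monetary Conditions Index relative to a base period."""
    w = np.abs([beta_rate, beta_fx])
    w_r, w_e = w / w.sum()                  # normalize weights to sum to 1
    return w_r * (rate - rate[base]) + w_e * (fx - fx[base])

rate = np.array([6.0, 6.5, 7.0, 6.8])       # hypothetical interest rate (%)
fx = np.array([100.0, 98.0, 97.0, 99.0])    # hypothetical exchange-rate index
index = mci(rate, fx, beta_rate=0.6, beta_fx=0.4)
print(index.round(2))
```

A rise in the index then signals a tightening of overall monetary conditions relative to the base period, with the VAR-estimated coefficients deciding how much each channel counts.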


Our contribution is strongly connected to a more general issue in (financial) econometrics on what is the appropriate objective (loss) function in econometric inference: see the general discussion and arguments, in much more detail, in Elliott and Timmermann (2016, Chapter 2). Moreover, as argued by Brandt (2010) in his literature review (see also Brandt, 1999; Aït-Sahalia and Brandt, 2001; Brandt and Santa-Clara, 2006; Brandt, Santa-Clara and Valkanov, 2009), besides the obvious intuitive fact that asset allocation is the ultimate object of interest, there are several benefits to focusing directly on portfolio weights given the available predictive information. Ultimately, in contrast to direct utility maximization, the usual two-step statistical approach first requires learning the notoriously complicated data generating process of stock returns, with various misspecification possibilities, before obtaining the necessary input predictions to be plugged into (ad hoc) trading rules. Evaluating the resulting asset allocation decisions with several economic criteria does not change the fact that the underlying inference is not directed at optimizing economic performance. To address this challenge, we develop a customized gradient boosting algorithm to empirically optimize asset allocation decisions in a flexible and single-step manner.
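The single-step idea can be illustrated in miniature: boost the weight function w(x) by functional gradient ascent on an economic objective, never forecasting returns explicitly. This is a stylized sketch, not the authors' algorithm — the quadratic mean-variance utility, the depth-1 stump base learner, and all data are invented here:

```python
# Stylized single-step boosting of a portfolio weight function w(x):
# ascend E[w r] - (gamma/2) E[(w r)^2] directly. Illustrative only.
import numpy as np

def fit_stump(x, g):
    """Least-squares regression stump for pseudo-targets g."""
    best = (np.inf, None)
    for s in np.quantile(x, np.linspace(0.1, 0.9, 9)):
        left, right = g[x <= s].mean(), g[x > s].mean()
        err = np.sum((np.where(x <= s, left, right) - g) ** 2)
        if err < best[0]:
            best = (err, (s, left, right))
    return best[1]

def boost_weights(x, r, steps=200, nu=0.1, gamma=5.0):
    stumps, w = [], np.zeros_like(r)
    for _ in range(steps):
        grad = r - gamma * w * r ** 2       # d utility / d w, pointwise
        s, left, right = fit_stump(x, grad)
        stumps.append((s, nu * left, nu * right))
        w += np.where(x <= s, nu * left, nu * right)
    return stumps, w

rng = np.random.default_rng(2)
x = rng.normal(size=500)                    # a single predictor (signal)
r = 0.02 * x + rng.normal(scale=0.05, size=500)  # returns depend on x
_, w = boost_weights(x, r)
# The boosted weights lean long when the signal is positive
print(w[x > 1].mean() > w[x < -1].mean())
```

The two-step alternative would first estimate E[r | x] and then feed that forecast into a separate trading rule; here the economic objective supplies the pseudo-targets for every boosting iteration, which is precisely the contrast the paragraph draws.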


National Product and Expenditure. The sum total of the preceding discussion clearly indicates a further improvement in the performance of the Irish economy …


The plan of the remainder of this exposition is as follows. Section 3 analyses forecasting models, distinguishing between error correction and equilibrium correction. Somewhat paradoxically, models formerly known as ‘error-correction models’ (see e.g. Davidson, Hendry, Srba and Yeo, 1978) do not in fact ‘error correct’ when equilibria shift, so are actually equilibrium-correction models (EqCMs). A problem with such models is that when the equilibrium shifts, EqCMs still converge to their in-built equilibria. Consequently, this class of model is prone to systematic failure. Shifts in equilibrium means are the most important example of location shifts. Conversely, models with an additional imposed unit root, irrespective of whether the process being modeled has a stochastic trend, do error correct after location shifts. This distinction is at the heart of understanding why, say, the random walk and Box–Jenkins time-series methods can prove hard to beat. The class of EqCMs is huge and includes most widely-used forecasting models, such as regressions, dynamic systems, vector autoregressions (VARs), dynamic stochastic general equilibrium models (DSGEs), vector equilibrium-correction models (VEqCMs: see e.g. Hendry and Juselius, 2000, 2001), autoregressive conditional heteroskedastic (ARCH) processes, and generalized ARCH (GARCH: see Engle, 1982, and Bollerslev, 1986), as well as some other volatility models. Thus, the failure of EqCMs to adjust to equilibrium shifts has far-reaching implications. Although the majority of the macroeconomic forecasting literature is concerned with forecasting the central tendency or conditional first moment of the variable of interest, there is also much interest in forecasting the volatility of financial time series, such as returns on assets, that is, forecasting the conditional variance of such series.
As the reference above to GARCH models as members of the EqCM class suggests, the problems that afflict first-moment forecasts are also relevant for equilibrium-correcting volatility forecasting models (see Clements and Hendry, 2006, pp. 614-617).
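The EqCM-versus-unit-root distinction can be seen in a five-line simulation. Below, an AR(1) process undergoes a shift in its equilibrium mean; a forecaster who keeps the old mean (the EqCM failure mode described above) is systematically biased, while a random-walk forecast adapts after one period. All parameters are invented for illustration:

```python
# Simulated location shift: an "EqCM-type" forecast anchored to the old
# equilibrium mean vs. a random-walk forecast. Parameters illustrative.
import numpy as np

rng = np.random.default_rng(3)
mu0, mu1, rho, n = 0.0, 5.0, 0.8, 400
y = np.empty(n)
y[0] = mu0
for t in range(1, n):
    mu = mu0 if t < 300 else mu1            # equilibrium shifts at t = 300
    y[t] = mu + rho * (y[t - 1] - mu) + rng.normal(scale=0.3)

post = np.arange(305, n)                    # one-step forecasts after shift
eqcm_fc = mu0 + rho * (y[post - 1] - mu0)   # still converges to old mean
rw_fc = y[post - 1]                         # imposed unit root: no fixed mean
eqcm_rmse = np.sqrt(np.mean((y[post] - eqcm_fc) ** 2))
rw_rmse = np.sqrt(np.mean((y[post] - rw_fc) ** 2))
print(eqcm_rmse > rw_rmse)
```

Before the shift the EqCM forecast is the better one, since it exploits mean reversion; after the shift its forecasts keep being pulled toward the obsolete equilibrium, which is exactly why the random walk "can prove hard to beat."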


• These reports are created for local sales taxing jurisdictions (city, county, transit authority or special purpose district) to use for economic forecasting. They show actual local sales and use tax allocation amounts paid to a local sales taxing jurisdiction by certain businesses in specific months. Data is shown for the current calendar year (to-date) and the previous calendar year. The information is confidential and not open to public inspection.


Following consideration of the projections by the conference, a final version will be prepared for the NIEC. The customary pre-budgetary consultations with independent economists and forec…


BVAR: Bayesian vector autoregressive model with six lags for each of seven variables (annual growth rates of real GNP and IPD; the unemployment rate; lagged levels of the money supply M1; …)


Assuming correct specification and exogenous-variable assumptions, structural econometric models should produce more accurate forecasts than multivariate time series methods. This arises …


Standard Deviations of Monthly Estimates from Annual Outturn in Percentage Changes for Merchandise Exports, Merchandise Imports and the Retail Sales Index, Produced by Reference to the …


Let us compare the actual result with the forecast. In December 2015, oil prices fell to 40 US dollars per barrel, and the annual average price for 2015 was 54 US dollars per barrel. That is, prices proved to be lower than the base level of the GDP forecast quoted above; yet GDP, contrary to the logic of Kudrin’s forecast and the forecasts of the authoritative organizations set out above, turned out better rather than worse than predicted. According to the first (preliminary) estimate by Rosstat (the Russian statistics authority), Russian GDP in 2015 dropped by 3.7%, which is already considerably better than all the forecasts mentioned above. The estimate of the downturn was subsequently revised, first to -3% and then to -2.8%. “Petroleum” forecasting suffered a setback.


* This paper was prepared for the international workshop within the program “Improvement of Economic Policy through Think Tank Partnership”, held in Bucharest, Romania, on October 27-29, 2003, and is part of a grant by the U.S. Agency for International Development for the project “Mechanisms of Long-term Growth in the Economies in Transition (Cases of Russia and Romania)”. The research partners of this project are Global Insight (former DRI-WEFA – USA), the Institute of Economic Forecasting (Romania) and the Center for Macroeconomic Analysis and Short-term Economic Forecasting (Russian Federation). The opinions, findings and conclusions or recommendations expressed herein are those of the authors and do not necessarily reflect the views of the U.S. Agency for International Development.


If the literature has reached a consensus, it is that uncertainties have to be incorporated into the forecasting framework. However, this is not trivial, as, depending on the source of uncertainty – about the future, the parameters of the considered model, or the model itself – the forecaster has to evaluate different ways of taking each type of uncertainty into account. Clements and Hendry (1995) examine the frameworks for economic forecasting. Basically, the authors note that in the forecasting context, the forecasting models and procedures might not be the only source of failures; failures might also involve the states of nature related to the properties of the variables to be forecast. They argued that the assumptions of constant, time-invariant and stationary data generating processes, also thought to coincide with a unique model of the economy, were a wrong way of representing the world.


Grouping components together can produce new series with characteristics that differ quite significantly from those of the originating series. In this context, it might be possible to find specific groupings that avoid the problems associated with disaggregate forecasting while still allowing for distinct disaggregate dynamics to be picked up in the process. With this objective we develop a two-stage method that combines statistical learning techniques and traditional economic forecasting evaluation. In the first stage, we use agglomerative hierarchical clustering to reduce the dimension of the problem by choosing a subset of feasible groupings based on the commonality among the different components. In the second stage, we try different selection procedures on the resulting hierarchy to produce the final aggregate forecast. These selection procedures include choosing a single grouping based on some criterion and combining the whole subset of groups.
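The two stages can be sketched on toy data: agglomerative clustering on a correlation distance groups co-moving components, then each group sum is forecast and the group forecasts are added back up to the aggregate. The factor structure, AR(1) group forecaster, and two-cluster cut below are all invented for illustration:

```python
# Toy two-stage sketch: cluster components by correlation, forecast
# each group sum, then aggregate. All series and settings invented.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(4)
T, k = 120, 6
common = rng.normal(size=(T, 2))
# Components 0-2 load on factor 1, components 3-5 on factor 2
load = np.zeros((2, k))
load[0, :3] = 1.0
load[1, 3:] = 1.0
X = common @ load + 0.3 * rng.normal(size=(T, k))

# Stage 1: agglomerative clustering on correlation distance
dist = 1 - np.corrcoef(X.T)
Z = linkage(dist[np.triu_indices(k, 1)], method="average")
groups = fcluster(Z, t=2, criterion="maxclust")

# Stage 2: simple AR(1) forecast of each group sum, then aggregate
def ar1_forecast(s):
    rho = np.corrcoef(s[:-1], s[1:])[0, 1]
    return s.mean() + rho * (s[-1] - s.mean())

agg_fc = sum(ar1_forecast(X[:, groups == g].sum(axis=1))
             for g in np.unique(groups))
print(sorted(np.unique(groups)))
```

Cutting the dendrogram at different heights yields the "subset of feasible groupings" over which the second-stage selection or combination operates.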



The main contributions of the paper may be summarised as follows. First, the class of Functional Signal plus Noise (FSN) models is introduced that provides a new, general method for modelling and forecasting time series of economic functions. The underlying continuous, smooth economic function (or ‘signal’) is a natural cubic spline whose dynamic evolution is driven by a cointegrated VAR for the ordinates (‘y-values’) at the knots of the spline. The natural cubic spline provides flexible cross-sectional fit and results in a linear state space model. Since the model’s state equation is the cointegrated VAR written and parametrised in Error Correction form (see Johansen 1996), we call it the FSN-ECM model. This model achieves dimension reduction, provides a coherent description of the observed price or yield curve and its dynamics as the cross-sectional dimension N becomes large, and can feasibly be estimated and used for forecasting when N is large. Second, under the assumption that the m knot ordinates (or yields) follow a cointegrated I(1) process with cointegrating rank r, a theorem is derived showing that the observed and latent yield curves of the FSN-ECM process with dimension N are I(1) processes with cointegrating rank [N − (m − r)]. Third, the FSN-ECM models are applied to forecasting 36-dimensional yield curves for US Treasury bonds at the one month ahead horizon. The method consistently outperforms both the widely used Diebold and Li (2006) and random walk forecasts on the basis of both mean square forecast error (MSFE) criteria and economically relevant loss functions derived from the realised profits of pairs trading algorithms that each period construct an arbitrage portfolio of discount bonds. Yield spreads are found to provide important information for forecasting the yield curve, but not in the manner prescribed by the Expectations Theory.
The analysis also highlights, in a concrete setting, the dangers of attempts to infer the relative economic value of model forecasts on the basis of their associated MSFEs.
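The dimension reduction in the cross-sectional part of the model is easy to demonstrate: a natural cubic spline through m knot ordinates recovers an N-dimensional smooth curve from far fewer state variables. This sketch uses an invented smooth "yield curve" and knot placement, not the paper's data:

```python
# Cross-sectional sketch of the FSN idea: m = 5 knot ordinates
# reproduce an N = 36 dimensional curve via a natural cubic spline.
# Curve shape and knot positions are invented for illustration.
import numpy as np
from scipy.interpolate import CubicSpline

maturities = np.arange(1, 37)                    # N = 36 monthly maturities
true_curve = 5 - 3 * np.exp(-maturities / 12)    # a smooth "yield curve"

knots = np.array([1, 6, 12, 24, 36])             # m = 5 knot maturities
ordinates = 5 - 3 * np.exp(-knots / 12)          # the m state variables

spline = CubicSpline(knots, ordinates, bc_type="natural")
fitted = spline(maturities)
max_err = float(np.abs(fitted - true_curve).max())
print(round(max_err, 4))
```

In the full model the five ordinates would be driven by the cointegrated VAR in error-correction form, so the whole 36-dimensional curve is forecast by forecasting only the knot values.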


Today, a data centre is the main instrument for providing flexible, scalable IT services to business on the basis of distributed or cloud computing technologies. Building a data centre is always an expensive and resource-intensive project. Therefore, during the development of the project concept it is extremely important to accurately estimate its economic efficiency. This article presents a model for analysing the effectiveness of investment in data centre construction. The model comprises several regressions that capture correlations between the main characteristics of the project (capital and operational expenditures, Net Present Value) and parameters of the data centre under construction (data centre area and the number of racks). The model is based on the results of an analysis of the current state and trends of the data centre market in Russia.
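One such regression can be sketched directly: project financials regressed on floor area and rack count by ordinary least squares. This is illustrative only — the coefficients, noise level, and sample below are invented, not estimates from the Russian market data the article analyses:

```python
# Illustrative OLS regression of capex on data-centre area and rack
# count, in the spirit of the article's model. All numbers invented.
import numpy as np

rng = np.random.default_rng(5)
area = rng.uniform(500, 5000, 40)               # floor area, m^2
racks = rng.integers(50, 500, 40).astype(float)  # number of racks
capex = 0.8 * area + 12.0 * racks + rng.normal(scale=200, size=40)

# Normal-equations fit: capex ≈ b0 + b1*area + b2*racks
A = np.column_stack([np.ones_like(area), area, racks])
b, *_ = np.linalg.lstsq(A, capex, rcond=None)

new_project = np.array([1.0, 2000.0, 200.0])    # 2000 m^2, 200 racks
print(round(float(new_project @ b), 1))         # predicted capex
```

The article's full model would chain several such regressions (capex, opex, NPV) off the same two size parameters, so a planned facility's economic efficiency can be screened before detailed design work begins.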
