Economic Forecasting with an Agent-Based Model

The previous section has demonstrated that the size and detailed structure of the ABM tend to improve its forecasting performance compared to standard models. Another important advantage of our approach is the possibility of breaking down simulation results in a stock-flow consistent way according to national accounting (ESA). In particular, we are able to report results for all economic activities depicted in this model consistent with national accounting rules, in addition to relating them to the main macroeconomic aggregates. Most importantly, for all simulations and forecasts, our model preserves the principle of double-entry bookkeeping. This implies that all financial flows within the model are made explicit and are recorded as an outflow of money (use of funds) for one agent in the model in relation to a certain economic activity, and as an inflow of money (source of funds) for another agent. In principle, we can thus consistently report on the economic activity of every single agent at the micro-level. A more informative aggregation is on a meso-level according to the NACE/CPA classification into 64 industries, which encompasses many variables. This multitude of results consists of all components of GDP on a sectoral level: among others, wages, operating surplus, investment, taxes and subsidies of different kinds, intermediate inputs, exports, imports, final consumption of different agents (household, government), employment, and also economic indicators such as productivity coefficients for capital, labor, and intermediate inputs. Probably the simplest example indicative of this model structure is that it breaks down simulation results into the larger components of GDP.
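The double-entry principle described above can be sketched in a few lines. This is an illustrative toy, not the model's actual code; the `Ledger` class and the agent names are invented for the example.

```python
from collections import defaultdict

class Ledger:
    def __init__(self):
        self.balance = defaultdict(float)   # money stock per agent
        self.flows = []                      # (payer, payee, activity, amount)

    def transfer(self, payer, payee, activity, amount):
        # Double-entry bookkeeping: every flow is recorded once as a
        # use of funds (outflow) and once as a source of funds (inflow).
        self.balance[payer] -= amount
        self.balance[payee] += amount
        self.flows.append((payer, payee, activity, amount))

    def total_money(self):
        # Stock-flow consistency invariant: transfers never create
        # or destroy money.
        return sum(self.balance.values())

ledger = Ledger()
ledger.balance["household"] = 100.0
ledger.transfer("household", "firm", "consumption", 30.0)
ledger.transfer("firm", "household", "wages", 20.0)
assert abs(ledger.total_money() - 100.0) < 1e-9
```

Because every flow is tagged with an economic activity and two counterparties, aggregating the `flows` list by agent, industry, or GDP component reproduces the kind of ESA-consistent breakdowns the text describes.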

COMPLEX AGENT-BASED MODELS: APPLICATION OF A CONSTRUCTIVISM IN THE ECONOMIC RESEARCH

However, the ACE movement has to cope with issues inherent to this research paradigm. For instance, ACE modelling requires the construction of dynamically complete economic models: the modeller specifies initial conditions, and the model must then fully support the playing out of agent interactions over time without further intervention from the modeller. This completeness requires detailed initial specifications for agent data and for the methods determining structural attributes, institutional arrangements, and behavioural dispositions. Consequently, validating ACE model outcomes against empirical data is difficult. ACE experiments provide outcome distributions for theoretical economic systems with explicitly articulated microfoundations. These outcome distributions mostly have a multi-peaked form suggesting multiple equilibria, rather than a central-tendency form permitting simple point predictions [34]. Intensive experimentation must therefore often be conducted over a wide array of plausible initial specifications for ACE models if robust prediction is to be achieved [22].
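The need for intensive experimentation over initial specifications can be illustrated with a toy dynamically complete model whose long-run outcome depends on which basin of attraction the initial condition falls into. The model is a made-up stand-in, not one from the ACE literature; the dynamics and parameters are chosen purely for illustration.

```python
import random

def toy_model(initial_spec, seed, steps=100):
    """A stand-in dynamically complete model: once initialized, it runs
    without modeller intervention. x = 0 is an unstable equilibrium and
    x = +1 / x = -1 are stable ones, so outcome distributions are
    two-peaked rather than centered on a single tendency."""
    rng = random.Random(seed)
    x = initial_spec
    for _ in range(steps):
        x = x + 0.1 * x * (1.0 - abs(x)) + rng.gauss(0.0, 0.01)
    return x

# Intensive experimentation: sweep an array of plausible initial
# specifications, with many random seeds per specification.
specs = [-0.5, -0.1, 0.1, 0.5]
outcomes = [toy_model(s, seed) for s in specs for seed in range(20)]
```

A histogram of `outcomes` shows two peaks near -1 and +1; a point prediction (e.g. the mean, near 0) would describe almost no run of the model.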

Conceptual Agent based Modeling in Supply Chain: An Economic Perspective

was to maximize the total financial benefit; an alternative formulation based on the paths in the scenario tree was also proposed. Longinidis and Georgiadis (2013) presented a mathematical model that integrates financial performance and credit solvency modelling with SCN design decisions under economic uncertainty. The multi-objective Mixed Integer Non-Linear Programming (moMINLP) model enhanced financial performance through economic value added and credit solvency through a valid credit scoring model. Ramezani, Kimiagari, and Karimi (2015) consider a bi-objective logistics design problem integrating the financial and physical flows of a closed-loop supply chain, in which the uncertainty of demand and the return rate are described by a finite set of possible scenarios. Golpîra, Zandieh, Najafi, and Sadi-Nezhad (2017) presented a multi-objective, multi-echelon supply chain network design problem. The proposed framework is green, in that it incorporates product demand uncertainty, environmental uncertainties, and the downstream risk attitude into the problem formulation. Carnovale, Rogers, and Yeniyurt (2018) consider network power and network cohesion and examine the role of these factors in the financial performance of a firm. Meng, Li, Liu, and Chen (2017) present a multi-agent model of four three-level supply chains that apply different types of combined contracts, considering the effects of vertical and horizontal competition between supply chains. Cao and Yu (2018) consider an emission-dependent supply chain comprising a supplier and a manufacturer who has limited capital and obtains a pledged loan by utilizing carbon emission permits. The results show that the capital-constrained manufacturer makes more profit by pledging carbon emission permits to obtain a loan than by having no access to borrowing.

Model based forecasting for demand response strategies

of aggregators is one of the most promising ones [7]. The question of how the aggregators will achieve coordination, however, is still a matter for research, with a number of promising solutions being investigated. An interesting way to exploit the flexibility of a pool of prosumers is to explicitly formulate a common target for their aggregated power profile and give them economic incentives to follow this target. Energy retailers and balance responsible parties, which bid for the purchase of energy in the energy market, would benefit from a reduction of uncertainty in the prosumers' consumption or production. In this paper, we consider prosumers as cooperative agents unable to modify the control algorithm that optimizes the operation of their flexible loads. As such, we are not obliged to choose prosumers' utility functions that generate a unique generalized Nash equilibrium. We rather focus on a distributed control protocol allowing prosumer coordination across multiple voltage levels. In this context, a good coordination protocol must ensure prosumers' privacy while being scalable. Prosumers' privacy is inherently guaranteed if they do not need to share their private information (e.g. size of batteries, desired set-point temperatures in their homes) or their forecasted power profile. Scalability ensures that the computational time of coordination scales near-linearly with an increasing number of agents, allowing for fast control. Most studies on the subject focus on maximizing the welfare of a group of prosumers by maximizing their utility functions. In the mathematical optimization framework, this problem can be modelled as an allocation or exchange problem [8]. In [9] the welfare maximization problem is considered with additional coupling constraints modelling line congestions. The problem is solved using a primal-dual interior point method, assuming that each agent has access to the dual updates of his neighbours.
In [10] the same problem is solved with different multi-step gradient methods. In recent years, other authors proposed decomposition techniques based on the theory of monotone operator splitting [11]. These algorithms are known to have more convenient convergence features than their gradient-based counterparts. An example of such an approach is represented by the proximal algorithms, which are well

Agent–Based Keynesian Macroeconomics An Evolutionary Model Embedded in an Agent–Based Computer Simulation

Turning back to the 'expected utility function': the maximization is subject to an intertemporal budget constraint, which leads to the well-known 'Euler equation' as the optimality condition. This approach generates 'precautionary motives' for savings. Usually the optimization problem must be solved by 'dynamic programming', i.e. 'backward induction'. As mentioned above in chapter 1, it can be assumed that households do not follow such hyper-rational optimization: this can be due to limited computational power, or at least to the fact that in reality agents face 'true uncertainty', so that they cannot calculate expected utility. Finally, individuals can have various savings objectives that differ from consumption smoothing. For example, bequest motives, prestige, economic power, or safety can also justify the accumulation of wealth (Frietsch, 1991). Usually, models of optimal 'intertemporal choice' do not incorporate such motives. Moreover, macroeconomic and microeconomic evidence offers some support for the view that individuals follow rule-of-thumb behavior (Shefrin and Thaler, 1988; Campbell and Mankiw, 1989; Loewenstein, 1988). Several experimental studies find that individuals do not perform 'backward induction' to solve such complex 'intertemporal' decision problems in a rational way (Anderhub, 1998; Carbone and Hey, 1997; Hey and Dardanoni, 1988). The most important reason is that people have limited computational power. Hence there is empirical and experimental evidence justifying the decision not to employ hyper-rational optimizing behavior: we follow this line and design the savings behavior through rule-of-thumb behavior. Surely, it could be the task of further research to develop a more sophisticated yet still manageable approach. Notably, it could be fruitful to use the results of future experimental studies to define reasonable savings heuristics.
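For contrast with the Euler-equation solution u'(c_t) = β(1+r)E_t[u'(c_{t+1})], a rule-of-thumb savings heuristic of the kind the text motivates can be stated in two lines. The propensities c_y and c_w below are illustrative placeholders, not the book's calibrated values.

```python
def rule_of_thumb_consumption(income, wealth, c_y=0.8, c_w=0.05):
    """Rule-of-thumb behavior: consume fixed fractions of current income
    and wealth instead of solving a dynamic program by backward induction."""
    consumption = c_y * income + c_w * wealth
    saving = income - consumption   # residual saving; may be negative
    return consumption, saving

c, s = rule_of_thumb_consumption(income=100.0, wealth=50.0)
```

The heuristic needs no expectations about future income, which is exactly why it remains usable under 'true uncertainty' and limited computational power.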

Forecasting Economic Cycle with a Structural Equation Model: Evidence from Thailand

SEM can be classified into two types: covariance-based (CB-SEM) and variance-based structural equation modeling (i.e., PLS-SEM). The two approaches differ in purpose. The objective of CB-SEM is theory confirmation: it minimizes the difference between the estimated and the sample covariance matrices to evaluate model parameters, and the data applied in the model must be normally distributed and the sample size fairly large. The purpose of PLS-SEM, by contrast, is to forecast the major target constructs: it attempts to maximize the explained variance of the endogenous unobserved variables. The method can estimate a complex model under any data distribution and with a small sample size. Therefore, much of the literature applies PLS-SEM for forecasting purposes with small samples (Castro-González and Leon, 2017; Jabeur and Sghaier, 2018). Because of this outstanding ability, PLS-SEM has become a common statistical method in many fields of science (F. Hair Jr et al., 2014; Hair Jr et al., 2016).

Agent–Based Keynesian Macroeconomics - An Evolutionary Model Embedded in an Agent–Based Computer Simulation

interaction effects within the model. Consequently, the influence of a certain parameter on the model output depends on the levels of possibly many other parameters. Insofar as some of the decision rules of agents have not been investigated in the empirical economic literature, it is not possible to find empirically verified 'domains' (or levels) for all 'main' (or 'peripheral') parameters; some have to be defined ad hoc, primarily based upon 'face validation' runs. But as explained above, one (ad hoc defined) parameter is supposed to influence the reasonable levels of other parameters indirectly. For example, the supply decision of consumer firms features a sensitivity of output growth to marginal profit, defined through the parameter θ. This parameter is supposed to affect consumer goods markets, and therefore future incomes and future investment demand as well. However, this parameter has not been investigated in any empirical study until now. This uncertainty propagates, i.e. it produces undesired interaction effects on other parameters: for instance, it affects the behavioral parameters of the investment decisions of firms, or the parameters defining the savings and consumption decisions of households. Additionally, feedback loops are operating. In the example, the consumption decision, in turn, influences the subsequent supply decisions of consumer goods firms, and hence θ. To sum up, the complexity of the artificial economy of Agent Island indicates that one cannot cut a single parameter out of the model, investigate its level in reality through empirical data, and apply the obtained levels to the simulation model without consideration for the other parameters. As stated in chapter 1, 'everything seems to depend on everything else'.

Health and economic benefits of public financing of epilepsy treatment in India: an agent-based simulation model

We conduct the analysis of IndiaSim output using R (version 3.1.2) and report health and economic outcomes at state and national levels for the first 10 years of the intervention. Health burden is measured in DALYs. Economic impact is measured using ICERs and epilepsy-specific OOP expenditure averted. Financial risk protection is measured by the money-metric value of insurance, the price individuals are willing to pay to avoid the risk of financial shock [25,27]. We report the mean present value for the 10 years, discounted at 3% annually. Costs and expenditures are in 2013 US dollars. We conduct a Latin Hypercube Sampling (LHS) sensitivity analysis and construct 95% uncertainty ranges (URs). Further details on these calculations are in Appendix S1.
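The present-value calculation described (annual values over 10 years, discounted at 3%) is standard; a minimal sketch, assuming the first year is undiscounted (a convention not stated in the text):

```python
def present_value(annual_values, rate=0.03):
    """Discounted present value of a stream of annual values; year 1 is
    taken as undiscounted (an assumed convention for this sketch)."""
    return sum(v / (1.0 + rate) ** t for t, v in enumerate(annual_values))

# e.g. mean present value of a constant benefit of 100 per year for 10 years
pv = present_value([100.0] * 10)
```

The same discounting applies to costs, expenditures averted, and the money-metric value of insurance before they are averaged across LHS draws.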

DSGE model-based forecasting of non-modelled variables

Our paper proposes a simpler two-step estimation approach for an empirical model that consists of a medium-scale DSGE model for a set of core macroeconomic variables and a collection of measurement equations or auxiliary regressions that link the state variables of the DSGE model with the non-core variables of interest to the analyst. In the first step we estimate the DSGE model using the core variables as measurements. Since the DSGE model estimation is fairly tedious and delicate, in real time applications the DSGE model could be re-estimated infrequently, for instance, once a year. Based on the DSGE model parameter estimates, we apply the Kalman filter to obtain estimates of the latent state variables given the most recent information set. We then use the filtered state variables as regressors to estimate simple linear measurement equations with serially correlated idiosyncratic errors. This estimation is quick and can be easily repeated in real time as new information arrives or interest in additional non-core variables arises. An attractive feature of our empirical model for policy makers is that we are linking the non-core variables to the fundamental shocks that are believed to drive business cycle fluctuations. In particular, we are creating a link between monetary policy shocks and non-core variables, which allows us to study the effect of unanticipated changes in monetary policy on a broad set of economic variables.
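The second step, regressing a non-core variable on the filtered DSGE states with serially correlated idiosyncratic errors, can be sketched as a Cochrane-Orcutt-style iteration. The filtered states are taken as given (the paper's first step), and the estimator choice here is an assumption for illustration, not necessarily the paper's exact procedure.

```python
import numpy as np

def fit_auxiliary_regression(states, z, n_iter=50):
    """Regress a non-core variable z on filtered DSGE states, allowing an
    AR(1) idiosyncratic error, via Cochrane-Orcutt iteration.
    states: (T, k) array of filtered state estimates; z: (T,) array."""
    T, k = states.shape
    X = np.column_stack([np.ones(T), states])
    beta = np.linalg.lstsq(X, z, rcond=None)[0]   # initial OLS
    rho = 0.0
    for _ in range(n_iter):
        resid = z - X @ beta
        # AR(1) coefficient of the residuals
        rho = (resid[:-1] @ resid[1:]) / (resid[:-1] @ resid[:-1])
        # quasi-difference to whiten the AR(1) error, then re-estimate
        Xs = X[1:] - rho * X[:-1]
        zs = z[1:] - rho * z[:-1]
        beta = np.linalg.lstsq(Xs, zs, rcond=None)[0]
    return beta, rho
```

Because only this quick regression step is repeated as new data or new non-core variables arrive, the tedious DSGE re-estimation can indeed be left to an infrequent schedule.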

Financialisation and Crisis in an Agent Based Macroeconomic Model

To assess the possibility of short periods of large unemployment even in simulations classified as without large crisis, we check the maximum unemployment registered between t = 101 and t = 500. Table 2 contains the number of simulations with a maximum unemployment rate in the following ranges: less than 20%, between 20% and 30%, between 30% and 40%, between 40% and 50%, between 50% and 60%, and above 60%. According to the data presented in the table, it is almost impossible for an economy not to face some periods of strong crisis when the dividend yield becomes too high (for instance above δ̄ = 0.45). We repeat the previous analysis in the other three settings itemized above. Indeed, the analysis on a time horizon equal to T = 150 showed that the economic system does not present very relevant problems up to δ̄ = 0.35; we therefore decide to check whether the system is stable even in the long run, for a value of δ̄ below this threshold (that is, δ̄ = 0.3). For the same reason we also check for heterogeneity of behaviours between firms and banks: in the short-run analysis we showed that large crises appear when firms do not have enough money to hire workers, either from internal funds or from bank credit; but it could also happen that firms with high dividends reduce their net worth while banks have more money to lend (δ̄_f = 0.45 and
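The tabulation behind Table 2 is a simple binning of each simulation's maximum unemployment rate. A sketch, assuming lower-bound-inclusive ranges (the text does not state the boundary convention):

```python
def tabulate_max_unemployment(max_rates, edges=(0.2, 0.3, 0.4, 0.5, 0.6)):
    """Count simulations per range: <20%, 20-30%, 30-40%, 40-50%,
    50-60%, >60%. Boundaries are assumed lower-bound inclusive."""
    counts = [0] * (len(edges) + 1)
    for u in max_rates:
        # index = number of edges at or below u
        counts[sum(u >= e for e in edges)] += 1
    return counts
```

Each simulation contributes to exactly one bin, so the counts sum to the number of simulations in the batch.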

An Agent Based Decentralized Matching Macroeconomic Model

Acknowledgments: We are very grateful to participants at the following conferences for helpful comments and suggestions: 17th Annual Workshop on Economic Heterogeneous Interacting Agents (WEHIA), University of Pantheon-Assas, Paris II, June 21-23, 2012; 18th International Conference Computing in Economics and Finance (CEF), Prague, June 27-29, 2012; Systemic Risk: Economists meet Neuroscientists, Frankfurt Institute for Advanced Studies (FIAS) and the House of Finance, Frankfurt am Main, September 17-18, 2012; 3rd International Workshop on Managing Financial Instability in Capitalist Economies (MAFIN), Genoa, September 19-21, 2012. The authors acknowledge financial support from the European Community Seventh Framework Programme (FP7/2007-2013) under Socio-economic Sciences and Humanities, grant agreement no. 255987 (FOC-II).

An agent based decentralized matching macroeconomic model

One of the basic characteristics of agent-based models is heterogeneity. Agents may differ in many dimensions such as income, wealth, size, financial fragility, location, information, and so on. One of the main advantages of considering agent heterogeneity is that aggregate regularities are not approximated by the behavior of a "representative agent" (Kirman, 1992; see also Gallegati and Kirman, 1999). The latter assumption, indeed, may lead to some inconsistencies, for instance because real-world data are often not Gaussian distributed, in many cases showing a power-law shape. Consequently, it is usually not possible to reduce the complexity of macroeconomic dynamics to the behavior of a single agent with an "average" behavior, because the average does not represent the behavior of the system. As a consequence, macroeconomic models based on the representative-agent hypothesis suffer from a "fallacy of composition". However, even in mainstream economic models it is possible to introduce a certain degree of heterogeneity; for instance, the recent debate in the DSGE (Dynamic Stochastic General Equilibrium) modelling community focuses on the introduction of financial factors and agents' heterogeneity. Another basic feature, however, that can hardly be introduced in mainstream models and that is instead at the root of agent-based models is the "direct interaction" among heterogeneous agents.
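A quick numerical illustration of why the "average" agent misrepresents a power-law population; the Pareto exponent is chosen arbitrarily for the example:

```python
import numpy as np

# Wealth drawn from a heavy-tailed Pareto distribution (shape a = 1.5,
# minimum 1). NumPy's pareto() returns the Lomax variant, so add 1.
rng = np.random.default_rng(42)
wealth = 1.0 + rng.pareto(1.5, size=100_000)

mean_w = wealth.mean()
median_w = np.median(wealth)
share_below_mean = (wealth < mean_w).mean()
# The mean is pulled up by a few very wealthy agents: most of the
# population sits well below it, so an agent endowed with the mean
# wealth represents almost nobody.
```

With these parameters roughly four agents in five hold less than the mean, which is the "fallacy of composition" in miniature.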

Financial fragility in a basic agent-based model

Agent-based simulation models constitute a further development of the possibilities offered by computing machines, in that aggregate equations no longer constitute the point of departure. Rather, the idea is to describe the behaviour of single components of a system – e.g. economic agents in an economic system – and reconstruct the aggregate behaviour by simulating their interactions. In this way, agent-based computer modelling develops features that are in many respects intermediate between those of verbal descriptions and those of equations-based models (Gilbert and Terna 2000).

Forecasting reduction of forest resources based on GM model

Forest land resources are an important part of forestry resources and the foundation of forestry development; the quality of their protection bears not only on ecological construction but also directly affects national economic and social development [1]. Forest land is non-renewable, yet it is in continuous use. Strengthening forest management is therefore of great significance for developing forest resources, improving the ecological environment, and sustaining economic and social development. The reduction of forest land resources always manifests as a shrinking forest land area. At present, the area of forest land requisitioned in our country increases year by year, so accurate forecasting of forest land requisition is an important way to predict trends in forest resources [2]. The GM model is based on the grey-module concept and has proved a good forecasting tool, for example in predicting weapon-equipment consumption [3]. Grey system theory holds that all random quantities are grey values and grey processes varying within a certain range over a certain period. Rather than seeking statistical laws and probability distributions for grey quantities, the theory extracts regularity from irregular raw data: the data are transformed in a certain way into a regular time series, from which models are built [4-6]. The approach is also widely applied to nonlinear function approximation [7-9].
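The GM(1,1) construction described (accumulate the raw series, fit the grey parameters by least squares on the background values, then difference back) can be sketched as follows; the test series below is invented, not forest-land data:

```python
import numpy as np

def gm11_forecast(x0, steps):
    """Fit a GM(1,1) grey model to the raw series x0 and forecast ahead."""
    x0 = np.asarray(x0, dtype=float)
    n = len(x0)
    x1 = np.cumsum(x0)                        # 1-AGO: accumulated series
    z1 = 0.5 * (x1[1:] + x1[:-1])             # background values
    B = np.column_stack([-z1, np.ones(n - 1)])
    y = x0[1:]
    a, b = np.linalg.lstsq(B, y, rcond=None)[0]  # grey parameters a, b
    # Whitened-equation solution: x1_hat(k) = (x0[0] - b/a) e^{-a k} + b/a
    k = np.arange(n + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.diff(x1_hat, prepend=0.0)     # IAGO restores the raw series
    return x0_hat[n:]                         # out-of-sample forecasts
```

The accumulation step is what turns an irregular raw series into a near-exponential regular one, which is why GM(1,1) works with very few observations.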

DSGE Model-Based Forecasting of Non-modelled Variables

We thank seminar participants at the Board of Governors, the FRB Philadelphia, the University of Richmond, and Texas A&M University for helpful comments. This research was conducted while Schorfheide was visiting the FRB Philadelphia, for whose hospitality he is thankful. Schorfheide gratefully acknowledges financial support from the Alfred P. Sloan Foundation and the National Science Foundation (Grant SES 0617803). The views expressed in this paper do not necessarily reflect those of the Federal Reserve Bank of Philadelphia, the Federal Reserve System, or the National Bureau of Economic Research. NBER working papers are circulated for discussion and comment purposes. They have not been peer-reviewed or subject to the review by the NBER Board of Directors that accompanies official NBER publications.

Financial Regulation in an Agent Based Macroeconomic Model

The model is populated by heterogeneous agents (households, firms and banks) that interact according to a fully decentralized matching mechanism. The matching protocol is common to all markets (goods, labor, credit, deposits) and represents a best-partner choice in a context of imperfect information. The model is useful because it gives rise to emergent macroeconomic properties such as the fluctuation of the unemployment rate, the relevance of leverage cycles and credit constraints for economic performance, the presence of bank defaults, and the role of financial instability. In particular, simulations show that endogenous business cycles emerge as a consequence of the interaction between real and financial factors: when firms' profits are improving, they try to expand production and, if banks extend the required credit, this results in more employment; the decrease of the unemployment rate leads to a rise in wages that, on the one hand, increases aggregate demand, while on the other hand it reduces firms' profits, and this may cause the inversion of the business cycle. Moreover,
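A minimal sketch of one round of such decentralized matching for the goods market, with best-partner choice under imperfect information. The sample size m and the data layout are assumptions for the example, not the paper's specification:

```python
import random

def match_market(demand, sellers, m=3, seed=0):
    """demand: list of buyer ids; sellers: dict id -> (price, stock).
    Each buyer observes only a random subset of m sellers (imperfect
    information) and buys one unit from the cheapest one with stock
    remaining: a 'best partner' choice."""
    rng = random.Random(seed)
    matches = []
    for buyer in demand:
        visible = rng.sample(list(sellers), min(m, len(sellers)))
        candidates = [s for s in visible if sellers[s][1] > 0]
        if not candidates:
            continue  # this buyer stays unmatched in this round
        best = min(candidates, key=lambda s: sellers[s][0])
        price, stock = sellers[best]
        sellers[best] = (price, stock - 1)
        matches.append((buyer, best, price))
    return matches
```

Because the same protocol can rank partners by wage, interest rate, or deposit rate instead of price, one routine serves all four markets, as the text describes.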

The Economic Burden of Multiple Myeloma. Definition of a Model for Forecasting Patients’ Costs

The analysis of this cohort through the GLM identified four determinants useful for building a model to forecast expenditure for MM patients: age, bortezomib use, lenalidomide use, and number of lines of therapy. Age had a positive influence on the burden of expenditure. The explanation of this observation differs between the two groups: in younger patients the observed reduction of costs over the years reflects the impact of the cost of transplantation in the first year, whereas for patients not eligible for transplantation the lower costs reflect the fact that, not benefiting from transplantation, these patients tend to be treated for shorter periods than younger patients. In line with previous literature, bortezomib and lenalidomide were shown to be the most influential variables in total expenditure per patient [22, 24]. Unlike Teitelbaum et al. [22], who focused only on the claims-based burden of new drugs, our study considered direct costs from the database of a single hospital.

A forecasting model of the economic efficiency of data centre construction project

Today, a data centre is the main instrument for providing flexible, scalable IT services to business on the basis of distributed or cloud computing technologies. Building a data centre is always an expensive and resource-intensive project. Therefore, during the development of the project concept it is extremely important to estimate its economic efficiency accurately. This article presents a model for the analysis of the effectiveness of investment in data centre construction. Our model comprises several regressions that capture correlations between the main characteristics of the project (capital and operational expenditures, Net Present Value) and parameters of the data centre under construction (data centre area and the number of racks). The model is based on the results of an analysis of the current state and trends of the data centre market in Russia.
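Of the project characteristics the regressions link, Net Present Value is the standard investment summary; a sketch of its calculation (the discount rate and figures below are illustrative only, not the article's data):

```python
def npv(capex, annual_net_cashflows, rate=0.10):
    """Net Present Value: upfront capital expenditure, then yearly net
    cash flows (revenue minus operational expenditure) discounted at
    `rate`, with the first cash flow arriving one year after the outlay."""
    return -capex + sum(cf / (1.0 + rate) ** t
                        for t, cf in enumerate(annual_net_cashflows, start=1))

# e.g. a project costing 100 upfront that returns 60 per year for 2 years
value = npv(100.0, [60.0, 60.0])
```

A positive value means the discounted returns exceed the construction cost, the efficiency criterion the article's regressions are meant to predict from area and rack count.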

Agent-based models and economic policy

evaluating the impact of macroeconomic policies in order to assess what the agent-based literature can say about the current Great Recession and to provide a straightforward comparison with DSGE models. More specifically, in what follows we classify agent-based models into four macroeconomic policy areas, namely fiscal policy, monetary policy, bank regulation, and central bank independence. Fiscal policy. The Great Recession has renewed interest in employing fiscal policies to tackle economic downturns. An advantage of agent-based models vis-à-vis mainstream ones is the possibility of jointly studying the short- and long-run impact of fiscal policies. Dosi et al. (2010) attempt to do so by developing an ABM that bridges Keynesian theories of demand generation and Schumpeterian theories of technology-fueled economic growth (the K+S model). The model is populated by capital-good firms, consumption-good firms, consumers/workers and a public sector. Capital-good firms perform R&D and sell heterogeneous machine tools to consumption-good firms. Consumers supply labor to firms and fully consume the income they receive. The government levies taxes and provides unemployment benefits. The model is able to endogenously generate growth and business cycles and to replicate an ensemble of stylized facts concerning both macroeconomic dynamics (e.g. cross-correlations, relative volatilities, output distributions) and microeconomic ones (firm size distributions, firm productivity dynamics, firm investment patterns). After having been empirically validated against the output generated, the K+S model is employed to study the impact of fiscal policies (i.e. tax rate and unemployment benefits) on the average GDP growth rate, output volatility and the unemployment rate. The authors find that Keynesian fiscal policies are a necessary condition for economic growth and can be successfully employed to dampen economic fluctuations. Moreover, Dosi et al. (2012) find a strong interaction between income distribution and fiscal policies: the more income distribution is skewed toward profits, the greater the effects of fiscal policies.

Innovation, Finance, and Economic Growth: an agent-based model

Let us now consider the impact of banks and credit on output volatility. Figure 3 shows that volatility is typically high in the first periods of the simulation, and then quickly converges to its long-run stable value. This indicates that the model gets out of its transient phase quite fast and then reaches its long-run stable growth pattern. The presence of banks appears to reduce output volatility. Again, the positive influence of the credit sector stems from the delicate balance between exploration and exploitation, together with the possible emergence of coordination failures. Indeed, when a firm leaves its island, production falls, and output can increase again only when a new (highly productive) island is discovered. If agents move in a coordinated way, a huge drop in output will be followed by spurts of growth when new technologies are discovered and diffused. Banks break such a vicious cycle by allowing agents to leave their island smoothly, without the accumulation of resources otherwise needed in the 'autarkic' scenario.
