Herbert Dawid, Simon Gemkow, Philipp Harting, Sander van der Hoog, and Michael Neugart
This chapter introduces the Eurace@Unibi model, one of the agent-based simulation models that are relatively new additions to the toolbox of macroeconomists, and the research that has been done within this framework. It shows how an agent-based model can be used to identify economic mechanisms and how it can be applied to spatial policy analysis. The assessment is that agent-based models in economics have passed the proof-of-concept phase and it is now time to move beyond that stage. It has been shown that new kinds of insights can be obtained that complement established modeling approaches. The chapter concludes by pointing toward some potentially fruitful areas of agent-based macroeconomic research.
Giulia Iori and James Porter
This chapter discusses a step in the evolution of agent-based model (ABM) research in finance. Agent-based modeling has concentrated on the development of stylized market models, which have been extremely useful for understanding how complex macro-scale phenomena emerge from micro-rules. To develop ABMs further, from proof of concept into robust tools that policy makers can use to control and forecast complex real-world financial markets, it is essential to permit agents to behave as active, data-gathering decision makers with sophisticated learning capabilities. The main focus of the chapter is to show how ABMs of financial markets have evolved from simple zero-intelligence agents that follow arbitrary rules of thumb into sophisticated agents described by microfounded rules of behavior. The chapter then briefly looks at the challenges posed by model calibration and the approaches to it, and provides examples of how ABMs have been successful at offering useful insights for policy making.
Frank Westerhoff and Reiner Franke
With the help of two examples, this chapter illustrates the usefulness of agent-based models as tools for economic policy design. The first example employs a financial market model in which the order flow of speculators, who rely on technical and fundamental analysis, generates intricate price dynamics. The second example employs a Keynesian-type goods market model in which the investment behavior of firms, relying on extrapolative and regressive predictors, generates complex business cycles. The chapter adds a central authority to these two setups and explores the impact of simple intervention strategies on the model dynamics. On the basis of these experiments, it concludes that agent-based models may help us understand how markets function and evaluate the effectiveness of various stabilization policies.
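To fix ideas, the order-flow structure in such chartist-fundamentalist models can be sketched as follows (the reaction coefficients χ, φ, a and the noise term ε_t are illustrative placeholders, not the chapter's own specification): technical traders extrapolate the recent price trend, fundamentalists bet on reversion toward a perceived fundamental value F, and a market maker adjusts the price in the direction of excess demand:

    D_t^C = χ (P_t − P_{t−1})        (technical, extrapolative demand)
    D_t^F = φ (F − P_t)              (fundamental, regressive demand)
    P_{t+1} = P_t + a (D_t^C + D_t^F) + ε_t

A central authority can then be introduced simply as an additional trader whose interventions enter the aggregate order flow.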
Michael Neugart and Matteo Richiardi
The chapter reviews the literature on agent-based labor market models, tracing its roots to the microsimulation literature and surveying a selection of contributions made since the work by Bergmann and Eliasson et al. Agent-based models have been applied to explain stylized facts of labor markets as well as to evaluate labor market policies. They also constitute a major part of agent-based macroeconomic models. Besides reviewing the various results achieved, the chapter discusses modeling choices with respect to agents' behavior and the structure of interaction. The overall assessment is that agent-based labor market models have given us valuable insights into the functioning of labor markets and the consequences of labor market policies, and that they will increasingly become an essential tool of analysis, in particular when the construction of large macro-models is involved.
Vassilios Vassiliadis and Georgios Dounias
The chapter discusses algorithmic trading, which refers to any automated process, consisting of a number of interconnected components, whose main aim is to perform financial transactions of any kind. Its chief advantage lies in the fact that human intervention is kept to a minimum. This is desirable because financial decisions nowadays depend on numerous factors, whereas financial managers can process only a limited amount of information. There are many ways to implement algorithmic trading systems; this chapter aims to highlight the efficiency of biologically inspired methodologies when they are incorporated in such systems. Biologically inspired intelligence comprises a range of algorithms whose common philosophy is based on the behavior of real-world natural systems and networks. In addition, the performance of the applied nature-inspired intelligence (NII) methodologies is compared to traditional benchmark approaches such as random portfolio construction.
Peter Gomber and Kai Zimmermann
The use of computer algorithms in securities trading, or algorithmic trading, has become a central factor in modern financial markets. The desire for cost and time savings within the trading industry spurred buy-side as well as sell-side institutions to implement algorithmic services along the entire securities trading value chain. This chapter traces this algorithmic evolution, highlighting key cornerstones in its development, discussing the main trading strategies, and summarizing the implications for overall securities market quality. In addition, it touches on the contribution of algorithmic trading to the recent market turmoil, the U.S. Flash Crash, including discussions of potential solutions for assuring market reliability and integrity.
This chapter reviews recent developments in the analysis of macroeconomic panel data, which typically involve aggregate variables from various countries. In contrast to the large N, small T framework that characterizes microeconomic panels, the two dimensions of a macroeconomic data set are more balanced, often providing a comparable number of time periods and countries (or regions). Although this is inconsequential for analysis based on the linear static panel data framework, it becomes crucial when estimating a dynamic model. A second important feature of macroeconomic data is cross-section dependence among countries. In many cases this dependence cannot be captured by a simple function of geographical distance, since it also depends on trade relations and the level of economic development. Furthermore, cross-country data often exhibit a much richer pattern of heterogeneity than can be represented by simply letting the intercept vary across countries. While it is often infeasible to allow for individual-specific regression coefficients in a large N, small T panel framework, this may be a reasonable option when analyzing macroeconomic data.
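For concreteness, the dynamic model at issue can be written in its canonical form (notation illustrative):

    y_it = ρ y_i,t−1 + x_it′ β + α_i + ε_it,   i = 1, …, N;  t = 1, …, T,

where α_i is a country-specific effect. With small T, the within-groups estimator of ρ suffers from the well-known Nickell bias of order 1/T; with the more balanced time dimension of macroeconomic panels this bias shrinks, which is one reason the microeconometric large N, small T toolkit need not carry over.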
Marine Carrasco, Jean-Pierre Florens, and Eric Renault
This chapter studies the estimation of φ in linear inverse problems Tφ = r, where r is only observed with error and T may be given or estimated. The unknown element φ belongs to a Hilbert space E. Four examples are relevant for econometrics: density estimation, the deconvolution problem, linear regression with an infinite number of possibly endogenous explanatory variables, and nonparametric instrumental variables estimation. In the first two cases T is given, whereas in the other two it is estimated, at a parametric or nonparametric rate, respectively. The chapter recalls the main results on these models: the concepts of degree of ill-posedness and regularity of φ, regularized estimation, and the rates of convergence usually obtained. The main contributions are, moreover, related to the asymptotic normality of the regularized solution φ̂_α obtained with a regularization parameter α. As α → 0, particular attention is paid to the asymptotic normality of inner products ⟨φ̂_α, ϕ⟩, where ϕ is an element of E. These results can be used to construct (asymptotic) tests on φ.
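As a concrete instance of the regularized estimation discussed here, the Tikhonov scheme (a sketch in the abstract's notation, not the chapter's full treatment) replaces the ill-posed inversion of T by

    φ̂_α = (α I + T*T)⁻¹ T* r̂,

where T* is the adjoint of T, r̂ is the estimated right-hand side, and the regularization parameter α > 0 is sent to zero at a suitable rate as the sample size grows.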
Petr Dostál and Chia-Yang Lin
The chapter focuses on the use of fuzzy logic, a key part of soft computing, among the different methods used to support decision making in business applications. Because the processes concern private corporate attempts at making money or decreasing expenses, the details of applications, successful or not, are not published very often. Fuzzy logic helps decentralize decision-making processes that are to be standardized, reproduced, and documented. It plays a very important role, especially in business, because it helps reduce costs. Fuzzy logic differs from conventional (hard) computing in that it is tolerant of imprecision, uncertainty, partial truth, and approximation; in effect, its role model is the human mind. The guiding principle of fuzzy logic is to exploit this tolerance to achieve tractability, robustness, and low solution cost.
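The contrast with hard computing can be stated in one line: a classical set assigns each element a membership of either 0 or 1, whereas a fuzzy set A assigns a graded membership μ_A(x) ∈ [0, 1]. For example (an illustrative choice of numbers), a temperature of 24°C might belong to the fuzzy set "warm" with degree 0.7, so partial truth is built directly into the formalism.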
Ian Sue Wing and Edward J. Balistreri
This chapter reviews recent applications of computable general equilibrium (CGE) modeling in the analysis and evaluation of policies that affect interactions among multiple markets. At the core of this research is a particular approach to the data and structural representations of the economy, elaborated through the device of a canonical static multiregional model. This template is adapted and extended to shed light on the structural and methodological foundations of simulating dynamic economies, incorporating “bottom-up” representations of discrete production activities, and modeling contemporary theories of international trade with monopolistic competition and heterogeneous firms. These techniques are motivated by policy applications including trade liberalization, development, energy policy and greenhouse gas mitigation, the impacts of climate change and natural disasters, and economic integration and liberalization of trade in services.
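As background, the canonical static model referred to here is commonly posed as a complementarity problem built from three classes of conditions (a generic statement of the standard format, not the chapter's exact notation):

    zero profit:       unit cost c_j(p) ≥ p_j,   complementary to activity level y_j ≥ 0
    market clearance:  supply_i(p, y) ≥ demand_i(p, income),   complementary to price p_i ≥ 0
    income balance:    each agent's expenditure equals the value of its endowments plus net transfers

Weak inequalities with complementary slackness allow inactive sectors and free goods, which is part of what makes the format convenient for policy counterfactuals.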
In this chapter an agent-based model of endogenously evolving migrant networks is developed to identify the determinants of migration and return decisions and to estimate their magnitude. Individuals are connected by links whose strength declines over time and distance. Methodologically, the chapter combines parameterization using data from the Mexican Migration Project with calibration. It is shown that expected earnings, an idiosyncratic home bias, network ties to other migrants, strength of links to the home country, and age have a significant impact on circular migration patterns over time. The model can reproduce spatial patterns of migration as well as the distribution of the number of trips of migrants. It can also be used for computational experiments and policy analysis.
Computational Economics in the Era of Natural Computationalism: The Theory of Self-Reproducing Automata
Shu-Heng Chen, Mak Kaboudan, and Ye-Rong Du
After a brief review of natural computationalism, this introductory chapter presents a new skeleton of computational economics and finance (CEF) along with an overview of the handbook. It begins with a conventional pursuit focusing on the algorithmic or numerical aspect of CEF such as computational efforts devoted to rational expectations, (dynamic) general equilibrium, and volatility. It then moves toward an automata- or organism-based perspective of CEF, involving nature-inspired intelligence, algorithmic trading, automated markets, network- and agent-based computing, and neural computing. As an alternative way to introduce this novel skeleton, the chapter starts with a view of computation or computing, addressing what computational economics intends to compute and what kinds of economics make computation so hard, and then it turns to a view of computing systems in which the Walrasian kind of computational economics is replaced by the Wolframian kind due to computational irreducibility.
Computational Industrial Economics: A Generative Approach to Dynamic Analysis in Industrial Organization
This chapter offers a basic agent-based computational model of industry dynamics that allows us to study the evolving industry structure through entry and exit of heterogeneous firms. The field of modern industrial economics focuses on the structure and performance of the industry in equilibrium when firms make decisions in an optimizing way, typically with perfect foresight. The patterns that arise in the process of adjustment, induced by persistent external shocks, are often ignored for lack of a proper tool for analysis. The model introduced here induces turbulence in market structure through unpredictable shocks to the firms' technological environment. The base model presented here enables the analysis of interactive dynamics between firms as they compete in a changing environment with limited rationality and foresight. A possible extension of the base model, allowing for R&D by firms, is also discussed.
This chapter introduces the concept of financial networks and reviews research in three of the most active research areas of financial systems: interbank payment networks, interbank exposure networks, and asset correlation networks. The financial crisis of 2007-2008 revealed the intertwined nature of modern financial systems. A promising methodology for capturing and modeling connections in the financial system is provided by network theory. The intricate structure of linkages between financial institutions, among sectors of the economy, and across financial systems can conveniently be captured by using a network representation. Empirical research on describing existing networks is presented, as well as new modeling and simulation approaches for financial risk that take into account the complex structure of financial markets and infrastructures.
Average quarterly price changes in six contiguous southern California cities are obtained and used, first, to determine whether price changes in contiguous cities are spatiotemporally contagious, and then to forecast each city’s average prices for four quarters (one year, 2014). In order to capture the contagious effects, a spatiotemporal contagion response measure is proposed and computed. The measure quantifies the responsiveness of residential home-price changes in one location (or city) to lagged price changes in another location. Average home characteristics (such as square footage and number of bedrooms), as well as lagged average quarterly mortgage rates, lagged average quarterly unemployment rates, and lagged average quarterly price changes of all locations, are input variables used to estimate the response measures and produce price forecasts. Models and forecasts are obtained first using genetic programming and then compared to outcomes obtained using linear regressions.
Sanjeev Goyal, Adrien Vigier, and Marcin Dziubinski
Conflict remains a central element in human interaction. Networks—social, economic, and infrastructure—are a defining feature of society. Conflict and networks intersect in a wide range of empirical contexts. The aim of this chapter is to present the general themes, provide a survey of the nascent research, and point to a number of interesting open questions.
Ernesto Dal Bó and Pedro Dal Bó
This chapter examines the general equilibrium effects of conflict using a standard trade model, but with a focus on the effects of domestic conflict. Conflict introduces distortions in an economy that are typically different from other distortions examined by economists. Tax or subsidy schemes on consumption and production, trade policies, and technology policies can then be optimal responses to the presence of conflict. The chapter shows how the different policy instruments can reduce conflict and how they rank relative to one another in terms of welfare.
A. Colin Cameron and Pravin K. Trivedi
This chapter surveys panel data methods for a count dependent variable that takes nonnegative integer values, such as the number of doctor visits. The focus is on short panels, as the literature has concentrated on this case. The survey covers both static and dynamic models with random and fixed effects. The chapter surveys quasi-ML methods based on the Poisson model, as well as richer, more parametric models: negative binomial models, finite mixture models, hurdle models, and with-zeros models.
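The workhorse specification behind these methods is the Poisson panel model with a multiplicative individual effect (notation illustrative):

    E[y_it | x_it, α_i] = α_i exp(x_it′ β),

where the fixed effect α_i can be eliminated by conditioning on Σ_t y_it (the conditional, or fixed effects, Poisson estimator), and the quasi-ML interpretation means that only this conditional mean, not the full Poisson distribution, needs to be correctly specified for consistent estimation of β.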
Jeffrey S. Racine and Christopher F. Parmeter
When comparing two competing approximate models using a particular loss function, the one having the smallest “expected true error” for that loss function is expected to lie closest to the underlying data generating process (DGP) given this loss function and is therefore to be preferred. This chapter considers a data-driven method for testing whether or not two competing approximate models are equivalent in terms of their expected true error (i.e., their expected performance on unseen data drawn from the same DGP). The proposed test is quite flexible with regard to the types of models that can be compared (nested versus non-nested, parametric versus nonparametric) and is applicable in cross-sectional and time-series settings. Moreover, in time-series settings the method overcomes two of the drawbacks associated with dominant approaches, namely, their reliance on only one split of the data and the need for a sufficiently large “hold-out” sample for these tests to possess adequate power.
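In this framework the expected true error of a fitted model m̂ under loss L is, loosely stated (notation illustrative),

    E[L(y*, m̂(x*))],

where (x*, y*) is an independent draw from the same DGP. The test estimates this quantity for each competing model by averaging hold-out errors over many random splits of the sample into training and evaluation subsets, rather than relying on a single split.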
This chapter examines models of diffusion in networks, and specifically how the topology of the network impacts the spreading process. The chapter begins by discussing epidemiological models and how stochastic dominance relations can be used to understand the effect of the degree distribution of the network. The chapter then turns to more sophisticated models of social influence, including threshold models and models of social learning. A key insight that emerges from the collection of models discussed is that not only does network structure matter, but how the network matters depends on the way in which agents influence one another. Network features that facilitate contagion under one model of influence can inhibit diffusion in another. The chapter concludes with thoughts on directions for future research.
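A benchmark result illustrating the role of the degree distribution: in the degree-based mean-field treatment of the SIS epidemic model (a standard result from this literature, stated under mean-field assumptions), an infection with effective transmission rate λ spreads only if λ exceeds the threshold

    λ_c = ⟨k⟩ / ⟨k²⟩,

so networks with heavy-tailed degree distributions, for which ⟨k²⟩ is very large, have a vanishing epidemic threshold. This is one concrete sense in which how the network matters depends on the influence model: the same hubs that make simple contagion easy can be hard to activate under threshold-based social contagion.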