Mark Kritzman, Simon Myrgren, and Sebastien Page
A technique called dynamic programming can be used to identify an optimal rebalancing schedule, one that significantly reduces rebalancing and sub-optimality costs. Dynamic programming provides solutions to multi-stage decision processes in which the decisions made in prior periods affect the choices available in later periods. It delivers the optimal year-by-year decision policy by working backwards from the final year of the 10-year horizon. A test of the relative efficacy of dynamic programming and the MvD heuristic, using data on domestic equities, domestic fixed income, non-US equities, non-US fixed income, and emerging market equities, shows that the MvD heuristic performs quite well compared to the dynamic programming solution in the two-asset case and substantially better than other heuristics. Increasing the number of assets narrows the advantage of dynamic programming over the MvD heuristic, and the ranking reverses at five assets. Dynamic programming is computationally impractical beyond five assets, whereas the MvD heuristic can be extended to as many as 100 assets. The MvD heuristic reduces total costs relative to all of the other heuristics by substantial amounts. The performance of the MvD heuristic improves relative to the dynamic programming solution as more assets are added, but this improvement reflects a growing reliance on an approximation in the dynamic programming approach.
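As a concrete illustration of the backward-induction idea (not the authors' implementation), the following sketch solves a deliberately simplified two-asset version of the problem on a discretized weight grid; the grid, cost parameters, and two-point return scenarios are illustrative assumptions.

```python
import numpy as np

# Minimal backward-induction sketch of the rebalancing problem described above.
# Grid, cost parameters, and return scenarios are illustrative assumptions.

grid = np.linspace(0.0, 1.0, 101)        # possible weights in the risky asset
w_star = 0.60                             # optimal (target) weight
tc = 0.0050                               # proportional rebalancing cost
lam = 0.10                                # penalty on squared deviation from w_star
scenarios = [(0.12, 0.02, 0.5), (-0.08, 0.02, 0.5)]  # (risky ret, safe ret, prob)
T = 10                                    # horizon in years

def drift(w, r_risky, r_safe):
    """Weight in the risky asset after one period of returns."""
    risky, safe = w * (1 + r_risky), (1 - w) * (1 + r_safe)
    return risky / (risky + safe)

# value[i] = expected future cost when entering a period with weight grid[i]
value = np.zeros(len(grid))
for t in reversed(range(T)):
    new_value = np.empty(len(grid))
    for i, w in enumerate(grid):
        best = np.inf
        for a in grid:                    # choose the post-trade weight
            cost = tc * abs(a - w) + lam * (a - w_star) ** 2
            for r_risky, r_safe, p in scenarios:
                w_next = drift(a, r_risky, r_safe)
                j = np.argmin(np.abs(grid - w_next))   # nearest grid point
                cost += p * value[j]
            best = min(best, cost)
        new_value[i] = best
    value = new_value

print("expected total cost starting at w = 0.50:",
      value[np.argmin(np.abs(grid - 0.50))])
```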
Petter N. Kolm and Lee Maclin
This article discusses portfolio optimization with market impact costs, the combination of execution and portfolio risk, and dynamic portfolio analysis. A multi-period portfolio optimization model is proposed that incorporates permanent and temporary market impact costs as well as alpha decay. Five popular algorithmic trading strategies are considered: arrival price, market-on-close, participation, time-weighted average price (TWAP), and volume-weighted average price (VWAP). For a VWAP benchmark, the lowest-risk execution is obtained by trading one's own shares in the same fractional volume pattern as the market, and VWAP execution is expected to result in the lowest temporary market impact costs. The temporary market impact in a rate-of-trading model is a function of one's own rate of trading expressed as a fraction of the absolute trading activity of the market. One popular interpretation of the model is that markets are relatively efficient with respect to the relationship between trading volume and volatility, which are typical inputs of the model: any reduction in impact that results from more trading volume would be offset by an increase in impact due to increased volatility. The lowest absolute rate of trading is realized by distributing one's orders evenly over time; this is a TWAP execution.
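The difference between the two schedules can be made concrete with a small sketch (the intraday volume profile below is hypothetical): TWAP spreads shares evenly over time, while VWAP spreads them in proportion to expected market volume, which keeps the participation rate, one's own trading as a fraction of market activity, constant.

```python
import numpy as np

# Illustrative sketch of TWAP versus VWAP scheduling for a parent order.
# The intraday volume profile below is hypothetical, not market data.

total_shares = 100_000
market_volume = np.array([9, 6, 5, 4, 4, 5, 6, 8, 10, 13], dtype=float)  # millions of shares per bucket

twap_schedule = np.full(len(market_volume), total_shares / len(market_volume))
vwap_schedule = total_shares * market_volume / market_volume.sum()

# Participation rate: own trading as a fraction of market activity in each bucket.
print("TWAP participation:", twap_schedule / (market_volume * 1e6))
print("VWAP participation:", vwap_schedule / (market_volume * 1e6))
```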
Antony Davies, Kajal Lahiri, and Xuguang Sheng
This article illustrates how frameworks built around multidimensional panel data of forecasts can be used not only to test the rational expectations hypothesis correctly, but also to study alternative expectations-formation mechanisms, to distinguish anticipated from unanticipated shocks, and to distinguish forecast uncertainty from disagreement.
Francis Breedon and Robert Kosowski
The article discusses the optimal asset allocation of sovereign wealth funds (SWFs). The main purpose of a commodity-based sovereign wealth fund is to create a permanent income stream out of a temporary one and so allow consumption smoothing over time. The asset allocation framework typically consists of an objective function that implies a preference for the highest return for a given level of risk. The ultimate objective of an SWF is to smooth consumption and achieve intergenerational transfers. The accumulation of financial assets presupposes functioning markets for consumption goods such as food products. Another consideration that may guide the investment behavior of sovereign wealth funds, and that highlights the role of liabilities, is food security: future food imports are a key component of the balance of payments identity. A rigorous analysis of a commodity fund's optimal asset allocation policy must take into account the role of liabilities and therefore requires an analysis of the country's balance of payments. An asset-liability management (ALM) framework examines both assets and financial liabilities, modelling the return on each, and thereby captures the role of liabilities and the resulting additional hedging demands.
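One standard way to formalize this asset-liability perspective, stated here generically rather than as the article's specific model, is to optimize the return on the fund's surplus rather than on its assets alone:
\[
R_S \;=\; R_A - \frac{L}{A}\,R_L,
\qquad
\max_{w}\;\; \mathbb{E}\!\left[R_S(w)\right] - \frac{\gamma}{2}\,\mathrm{Var}\!\left[R_S(w)\right],
\]
where \(R_A\) is the return on the asset portfolio with weights \(w\), \(R_L\) the return on the liabilities (for example, the cost of future food imports), \(L/A\) the liability-to-asset ratio, and \(\gamma\) the fund's risk aversion; the liability term generates the additional hedging demands mentioned above.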
Eric Jacquier and Nicholas Polson
This article looks at the usefulness of Bayesian methods in finance, covering the major topics in the field. It discusses the predictability of the mean of asset returns, central to finance as it relates to the efficiency of financial markets, and reviews the economic relevance of predictability and its impact on optimal allocation. It also describes the Markov chain Monte Carlo (MCMC) and particle filtering algorithms that are important in modern Bayesian financial econometrics; MCMC algorithms have driven a tremendous growth in the use of stochastic volatility models in financial econometrics. The article also reviews some major contributions of Bayesian econometrics to the literature on empirical asset pricing, and discusses many of the other themes in modern Bayesian econometrics, including the use of shrinkage and the interaction between theory and econometrics. It ends with a discussion of a promising recent development in finance: filtering with parameter learning.
This article provides an overview of the Bayesian approach to investment decisions, emphasizing its foundations, its most practical uses, and the computational techniques that are essential to its effective implementation. The Bayesian approach provides a convenient framework for incorporating subjective information and views into an investment decision through the prior distribution. It begins with a statistical model that relates historical data, such as past returns, to important parameters, such as expected future returns. Bayes' theorem is a simple relationship between the probability of an event A conditional on another event B and the probability of B conditional on A. The posterior distribution encapsulates the information content of both the data and the prior and is often the central focus of a Bayesian statistical analysis. The predictive distribution contains all information about the future that is of interest to the investor, combining the information content of the prior distribution with that of the historical data. Estimation risk is the investment risk associated with not knowing the true values of parameters, and it is a central focus of a Bayesian investment analysis. Utility theory and subjective probability are two of the main concepts in the Bayesian approach to investment decisions.
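A minimal sketch of this machinery for a single asset, assuming a conjugate normal prior on the expected return and a known return variance (all numbers illustrative), shows how the posterior and predictive distributions are formed and how estimation risk widens the predictive distribution:

```python
import numpy as np

# Sketch of the Bayesian machinery described above for one asset's expected
# return, with a conjugate normal prior and an assumed-known return variance.

returns = np.array([0.04, -0.02, 0.07, 0.01, 0.03])   # historical annual returns
sigma2 = 0.04                                          # assumed known return variance

mu0, tau2 = 0.05, 0.02 ** 2     # prior: mu ~ N(mu0, tau2), encoding an investor's view

n, rbar = len(returns), returns.mean()
post_var = 1.0 / (1.0 / tau2 + n / sigma2)               # posterior variance of mu
post_mean = post_var * (mu0 / tau2 + n * rbar / sigma2)  # posterior mean of mu

# Predictive distribution of next period's return: N(post_mean, post_var + sigma2).
pred_var = post_var + sigma2

print(f"posterior mean {post_mean:.4f}, posterior sd {post_var**0.5:.4f}")
print(f"predictive sd  {pred_var**0.5:.4f}  (includes estimation risk)")
```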
Mark H. A. Davis
This article gives an account of mathematical techniques for credit risk models where there is contagion between the obligors, i.e., default of one party either directly causes default of other parties or (more commonly) changes other parties' risk of default. Section 2 starts with a general discussion of joint distributions and copulas, mainly to point out that ‘contagion’ is in some sense already built into the copula concept. Section 3 gives a general formulation of the reduced-form model and a taxonomy of models distinguishing between factor, frailty, and contagion models. Section 4 gives some background information about Markov processes, Markov chains, and phase-type distributions as required for the subsequent sections. Section 5 discusses four simple but effective Markov chain-based models with applications in counterparty risk and credit risk for inhomogeneous and homogeneous portfolios. Sections 6 and 7 develop the ‘subsidiary themes’ mentioned above. Section 8 returns to the further development of the Enhanced Risk homogeneous portfolio model, introduced in Section 5.4, in the light of these themes.
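As a toy illustration of the contagion mechanism (not one of the article's models), the following sketch simulates two firms whose constant default intensities jump by a multiplier when the other firm defaults, and estimates the probability that both default within five years; all parameters are assumed for illustration.

```python
import numpy as np

# Toy simulation of contagion: firm B's default intensity jumps when firm A
# defaults, and vice versa. Intensities and the contagion multiplier are
# illustrative assumptions, not the article's models.

rng = np.random.default_rng(0)
base = np.array([0.02, 0.03])   # baseline hazard rates (per year) for firms A, B
jump = 3.0                       # multiplier applied to the survivor's hazard
horizon, n_paths = 5.0, 100_000

both_default = 0
for _ in range(n_paths):
    t_first = rng.exponential(1.0 / base)          # independent candidate times
    i = np.argmin(t_first)                          # identity of the first defaulter
    if t_first[i] > horizon:
        continue                                    # no default before the horizon
    # Survivor's hazard jumps after the first default (memoryless restart).
    t_second = t_first[i] + rng.exponential(1.0 / (jump * base[1 - i]))
    if t_second <= horizon:
        both_default += 1

print("P(both default within 5y) with contagion:", both_default / n_paths)
```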
This article addresses the challenges posed by marginal default distribution models – for illustrative purposes it uses a straightforward Gaussian copula – and details counterparty risk corrections for credit default swaps (CDSs), index CDSs, and collateralised debt obligations. The static copula approach fixes the expected value of the product conditional on counterparty default, but not its distribution. As a consequence, only bounds for the value of the correction are provided, but in many cases these are tight enough to be useful. The article presents a number of numerical examples including a timely reminder that ‘risk-free’ super-senior tranches are particularly prone to counterparty risk.
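For concreteness, a minimal one-factor Gaussian copula sketch of the kind discussed above, joining exponential marginal default-time distributions with a Gaussian dependence structure (hazard rates, correlation, and horizon are illustrative assumptions):

```python
import numpy as np
from scipy.stats import norm

# One-factor Gaussian copula for joint default times: Gaussian dependence
# layered over exponential marginals. All parameters are illustrative.

rng = np.random.default_rng(1)
hazards = np.array([0.02, 0.04, 0.03])   # flat hazard rates for three names
rho, horizon, n_sims = 0.5, 5.0, 200_000

# One-factor construction: X_i = sqrt(rho) * M + sqrt(1 - rho) * eps_i.
M = rng.standard_normal((n_sims, 1))
eps = rng.standard_normal((n_sims, len(hazards)))
X = np.sqrt(rho) * M + np.sqrt(1.0 - rho) * eps

# Map Gaussian marginals to exponential default times: tau_i = -ln(1 - U_i) / h_i.
U = norm.cdf(X)
tau = -np.log(1.0 - U) / hazards

defaults = (tau <= horizon).sum(axis=1)
print("P(all three default within 5y):", (defaults == 3).mean())
```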
Alexander Lipton and Andrew Rennie
This article develops a methodology for valuing the counterparty credit risk inherent in credit default swaps and presents a multi-dimensional extension of Merton's model (Merton 1974), in which the joint dynamics of the firms' values are driven by a multi-dimensional jump-diffusion process. Applying the Fast Fourier Transform and finite-difference methods, it develops a forward induction procedure for calibrating the model and a backward induction procedure for valuing credit derivatives in 1D and 2D. Jump size distributions of two types are considered, namely discrete negative jumps (DNJs) and exponential negative jumps (ENJs), and it is shown that, for joint bivariate dynamics, the model with ENJs produces a noticeably lower implied Gaussian correlation than the model with DNJs, although for both jump specifications the corresponding marginal dynamics fit the market data adequately. Based on these observations, and given the high level of default correlation among financial institutions (above 50 per cent), the model with DNJs, albeit simple, seems to provide a more realistic description of default correlations, and thus of counterparty risk, than the more sophisticated model with ENJs.
Jules H. van Binsbergen, Michael W. Brandt, and Ralph S. J. Koijen
The article addresses the investment problem of a pension fund in which a centralized decision maker, the Chief Investment Officer (CIO), employs multiple asset managers to implement investment strategies in separate asset classes. The investment management division of pension funds is typically structured around traditional asset classes such as equities, fixed income, and alternative investments. The asset allocation decisions are made in at least two stages. Firstly, the CIO allocates capital to the different asset classes, each managed by a different asset manager. Secondly, each manager decides how to allocate the funds made available to him, that is, to the assets within his class. The CIO of the fund therefore faces a tradeoff between the benefits of decentralization, driven by the market timing and stock selection skills of the managers, and the costs of delegation and decentralization. The optimal portfolio of the asset managers can be decomposed into two components. The first component is the standard myopic demand that optimally exploits the risk-return trade-off. The second component minimizes the instantaneous return variance and is therefore labeled the minimum-variance portfolio. The minimum variance portfolio substitutes for the riskless asset in the optimal portfolio of the asset manager. The two components are then weighted by the risk attitude of the asset manager to arrive at the optimal portfolio.
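In a static mean-variance version of the manager's problem, with weights constrained to sum to one because the manager has no access to the riskless asset (a simplification of the dynamic setting studied in the article), the decomposition takes the form
\[
w^{*} \;=\; \underbrace{\frac{1}{\gamma}\,\Sigma^{-1}\mu}_{\text{myopic demand}}
\;+\;\left(1-\frac{1}{\gamma}\,\iota'\Sigma^{-1}\mu\right)
\underbrace{\frac{\Sigma^{-1}\iota}{\iota'\Sigma^{-1}\iota}}_{\text{minimum-variance portfolio}},
\]
where \(\mu\) and \(\Sigma\) are the mean vector and covariance matrix of the returns on the assets in the manager's class, \(\gamma\) is the manager's risk aversion, and \(\iota\) is a vector of ones; the weight on the minimum-variance portfolio shrinks as risk aversion falls, which is the sense in which the two components are weighted by the manager's risk attitude.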
Edward I. Altman
Three main variables affect the credit risk of a financial asset: (i) the probability of default (PD); (ii) the ‘loss given default’ (LGD), which is equal to one minus the recovery rate in the event of default (RR); and (iii) the exposure at default. While significant attention has been devoted by the credit risk literature to the estimation of the first component (PD), much less has been dedicated to the estimation of RR and to the relationship between PD and RR. This article, which presents a detailed review of the way credit risk models developed during the last thirty years have treated the recovery rate and, more specifically, its relationship with the probability of default of an obligor, is organized as follows. Sections 2, 3, and 4 review three different approaches taken by these models, together with their basic assumptions, advantages, drawbacks, and empirical performance. Section 5 examines credit value-at-risk models. Section 6 considers the more recent studies explicitly modelling and empirically investigating the relationship between PD and RR. Section 7 discusses efforts by the Bank for International Settlements to motivate banks to consider ‘downturn LGD’ in the specification of capital requirements under Basel II. Section 8 reviews the very recent efforts by the major rating agencies to provide explicit estimates of recovery given default. Section 9 revisits the issue of procyclicality and Section 10 presents some recent empirical evidence on recovery rates on both defaulted bonds and loans, as well as on the relationship between default and recovery rates. Section 11 concludes.
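In the simplest one-period setting, and treating the three components as independent, these variables combine into the familiar expected-loss expression
\[
\mathrm{LGD} \;=\; 1-\mathrm{RR},
\qquad
\mathbb{E}[\text{loss}] \;=\; \mathrm{PD}\times\mathrm{LGD}\times\mathrm{EAD},
\]
where EAD denotes the exposure at default; the dependence between PD and RR examined in the article breaks exactly this independence assumption.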
The use of equity factor models has increased significantly within the institutional asset management community. They are routinely used to estimate the potential benchmark-relative returns of equity securities and portfolios. Equity factor models offer numerous advantages over simple historical observation: they provide understandable linkages between security characteristics and subsequent returns, and they filter out much of the random noise affecting returns. Most importantly, such models help clarify the distinction between return-generating processes that affect a particular security and processes that are common across many firms. Return models are used to assess whether a particular return-generating process is consistent with equilibrium or instead reveals an anomaly in asset pricing theory. Random matrix theory has been used to demonstrate that blind factor models often incorrectly identify factors where there is no true underlying structure in the observed returns. A variety of pricing models can be used to value options, allowing for the inclusion of stochastic processes for volatility and interest rates. Several techniques have become available to make such models respond more rapidly to changes in financial market conditions.
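A minimal sketch of how such a model separates common from security-specific return variation, using simulated data and ordinary least squares (the factor definitions here are placeholders, not those of any particular commercial model):

```python
import numpy as np

# Linear factor model sketch: a security's return is regressed on common factor
# returns to separate factor (common) exposure from the security-specific
# residual. The data below are simulated for illustration.

rng = np.random.default_rng(2)
T, K = 260, 3                                   # weekly observations, three factors
factors = rng.normal(0.0, 0.02, size=(T, K))    # e.g. market, value, size proxies
true_beta = np.array([1.1, 0.4, -0.2])
returns = 0.0005 + factors @ true_beta + rng.normal(0.0, 0.01, size=T)

# OLS estimate of the factor exposures (betas) and the intercept (alpha).
X = np.column_stack([np.ones(T), factors])
coef, *_ = np.linalg.lstsq(X, returns, rcond=None)
alpha, betas = coef[0], coef[1:]

resid = returns - X @ coef
print("estimated alpha:", round(alpha, 5))
print("estimated betas:", np.round(betas, 3))
print("share of variance explained by common factors:",
      round(1 - resid.var() / returns.var(), 3))
```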
Valérie Chavez‐Demoulin and Paul Embrechts
This article aims to provide the basics any risk manager should know about the modelling of extreme events, from a past–present–future research perspective. Such events are often also referred to as low-probability events or rare events. The article is organised as follows. Section 2 starts with an overview of the credit risk-specific issues within Quantitative Risk Management and shows where relevant Extreme Value Theory-related questions are being asked. Section 3 presents the one-dimensional theory of extremes, whereas Section 4 is concerned with the multivariate case. Section 5 discusses particular applications and gives an outlook on current research in the field, while Section 6 concludes.
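A small peaks-over-threshold sketch of the one-dimensional theory in action, fitting a generalized Pareto distribution to simulated heavy-tailed losses above a high threshold and reading off a tail quantile (the data, the 95% threshold, and the use of scipy are illustrative choices, not the article's):

```python
import numpy as np
from scipy.stats import genpareto, t as student_t

# Peaks-over-threshold sketch: fit a generalized Pareto distribution (GPD) to
# loss exceedances over a high threshold and estimate a far-tail quantile.

rng = np.random.default_rng(3)
losses = student_t.rvs(df=3, size=10_000, random_state=rng)  # heavy-tailed losses

u = np.quantile(losses, 0.95)                   # high threshold
exceedances = losses[losses > u] - u
xi, loc, beta = genpareto.fit(exceedances, floc=0.0)   # shape xi, scale beta

# POT estimator of the 99.9% quantile (VaR):
p, n, n_u = 0.999, len(losses), len(exceedances)
var_999 = u + beta / xi * (((1 - p) * n / n_u) ** (-xi) - 1)
print(f"GPD shape xi = {xi:.2f}, 99.9% VaR estimate = {var_999:.2f}")
```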
This article gives an introduction to the mechanics and techniques used to develop market models for credit, focusing on the single-credit case. The concept of a background filtration was introduced by Jeanblanc and Rutkowski (2000) and has been used by multiple authors since, e.g., Jamshidian (2004) and Brigo and Morini (2005). The theory is worked out in relative detail in Section 2, which also formalises the concept of a background price process. Section 3 shows how this theory can be used to derive Black-type pricing formulas for default swaptions. A full market model is developed in Section 4. It turns out that, in the case where both credit spreads and interest rates are stochastic and dependent, the situation is not as clean as for interest rate market models, because default protection payoffs inherently contain a measure mismatch due to the timing of default and the payout. In the case where credit and interest rates are independent, the situation simplifies considerably; under these simplifications, Section 5 describes in detail the induction steps needed to construct a market model from scratch. Section 6 describes an alternative method of constructing credit default swap (CDS) market models: instead of using forward default intensities as model primitives, it directly describes the dynamics of certain survival probability ratios. Finally, the pricing of constant maturity CDS contracts is examined in Section 7.
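The Black-type result referred to in Section 3 has, in its generic form (stated here without the article's exact notation and conditions), the familiar structure
\[
V_{\text{payer}}(0) \;=\; A(0)\,\bigl[\,s_{0}\,\Phi(d_{1}) - K\,\Phi(d_{2})\,\bigr],
\qquad
d_{1,2} \;=\; \frac{\ln(s_{0}/K) \pm \tfrac{1}{2}\sigma^{2}T}{\sigma\sqrt{T}},
\]
where \(A(0)\) is the value of the defaultable annuity underlying the forward CDS, \(s_0\) the forward default swap spread, \(K\) the strike spread, \(\sigma\) the spread volatility, \(T\) the option expiry, and \(\Phi\) the standard normal distribution function.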
This article explores the component parts of the global fixed income market and focuses on risks arising from interest rates and credit spreads. In fixed income markets, including foreign exchange, over-the-counter (OTC) derivatives dwarf exchange-traded derivatives in notional size. An OTC derivative is a contract between two parties and therefore carries counterparty credit risk: it is not certain that both parties will honor their contractual obligations. Exchange-traded derivatives have minimal counterparty risk, since exchanges use capital cushions and collateral (margin) collection to avoid defaults. Derivatives' notionals, particularly interest rate derivatives' notionals, are many times as large as global fixed income markets. A fixed income instrument consists of a series of cash flows that may occur at future times; the sources of risk are the sizes of the flows, the timing of the flows, and whether or not the flows will actually occur as agreed. The observations about dimension reduction and interpolation indicate that there is little gap between the observable market prices of a discrete number of bonds and a properly fitted, smooth, continuous function describing the yield curve. This class of models allows Monte Carlo simulations of complex instruments to be run in order to characterize the distribution of responses to changes in the yield curve.
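As an illustration of that point, the following sketch fits a smooth curve to a discrete set of observed yields using the Nelson-Siegel form, which is one common parametric choice rather than anything prescribed by the article; the maturities and yields are made up for the example:

```python
import numpy as np
from scipy.optimize import curve_fit

# Fit a smooth, continuous yield curve to a handful of observed yields using
# the Nelson-Siegel parametric form. Observed maturities/yields are illustrative.

def nelson_siegel(tau, b0, b1, b2, lam):
    x = tau / lam
    decay = (1 - np.exp(-x)) / x
    return b0 + b1 * decay + b2 * (decay - np.exp(-x))

maturities = np.array([0.5, 1, 2, 3, 5, 7, 10, 20, 30], dtype=float)
yields = np.array([0.021, 0.023, 0.026, 0.028, 0.031, 0.033, 0.035, 0.038, 0.039])

params, _ = curve_fit(nelson_siegel, maturities, yields,
                      p0=[0.04, -0.02, 0.01, 2.0], maxfev=10_000)
print("fitted (beta0, beta1, beta2, lambda):", np.round(params, 4))
print("interpolated 4-year yield:", round(nelson_siegel(4.0, *params), 4))
```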
Terence C. Mills
This article provides a comprehensive review of the core ideas and models that have proved central to the forecasting of financial time series. Forecasting the levels or, more appropriately, the changes in financial time series can be an extremely difficult exercise, particularly when using just the past history of the series itself. Forecasts other than the “no change” implied by a random walk tend to be associated with considering long forecast horizons, with taking account of the nonlinearity induced by, say, different regimes, or with incorporating wider information sets, particularly long-run, equilibrium relationships. Other features of financial time series, such as volatility and the time between price changes, are more likely to exhibit some degree of forecastability. The moral of this analysis is that one must consider very carefully what features of a financial time series are likely to be predictable and what features will be inherently unforecastable, and consequently concentrate attention on the former.
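The contrast can be illustrated with a small simulated example (not drawn from the article's data): for i.i.d. price changes an AR(1) forecast does no better than a naive benchmark, while for a persistent volatility-like series it does much better.

```python
import numpy as np

# Simulated illustration: price changes behave like random-walk increments and
# are hard to forecast from their own past, whereas a persistent series such as
# (log) volatility is partially forecastable.

rng = np.random.default_rng(4)
T = 2_000

returns = rng.normal(0.0, 0.01, T)            # i.i.d. price changes (random walk)
logvol = np.zeros(T)                          # persistent AR(1) log-volatility
for t in range(1, T):
    logvol[t] = 0.95 * logvol[t - 1] + rng.normal(0.0, 0.1)

def ar1_vs_naive(x):
    """One-step MSE of an AR(1) forecast relative to the unconditional-mean forecast."""
    slope, intercept = np.polyfit(x[:-1], x[1:], 1)        # OLS fit of x_t on x_{t-1}
    mse_ar1 = np.mean((x[1:] - (intercept + slope * x[:-1])) ** 2)
    mse_naive = np.mean((x[1:] - x.mean()) ** 2)
    return round(mse_ar1 / mse_naive, 3)                   # below 1 means forecastable

print("returns: relative MSE of AR(1) forecast:", ar1_vs_naive(returns))   # close to 1
print("log-vol: relative MSE of AR(1) forecast:", ar1_vs_naive(logvol))    # well below 1
```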
Derek W. Bunn and Nektaria V. Karakatsani
This article, which reviews the modeling and forecasting of energy commodities prices, with a particular focus on spot electricity prices, describes the complexities of the market and price formation, and the models that are used for forecasting in the face of these complexities. It also notes the need to adopt methods that allow for adaptation in the face of a changing environment.
Peter Reinhard Hansen and Asger Lunde
This article focuses on some aspects of high-frequency data and their use in volatility forecasting. High-frequency data can be used to construct volatility forecasts. The article reviews two leading approaches to this. One approach is the reduced-form forecast, where the forecast is constructed from a time series model for realized measures, or a simple regression-based approach such as the heterogeneous autoregressive model. The other is based on more traditional discrete-time volatility models that include a modeling of returns. Such models can be generalized to utilize information provided by realized measures. The article also discusses how volatility forecasts, produced by complex volatility models, can benefit from high-frequency data in an indirect manner, through the use of realized measures to facilitate and improve the estimation of complex models.
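A minimal sketch of the regression-based approach mentioned above, the heterogeneous autoregressive (HAR) model, which forecasts next-day realized variance from daily, weekly, and monthly averages of past realized variance (the realized-variance series is simulated for illustration):

```python
import numpy as np

# HAR model sketch: regress realized variance on its daily, weekly (5-day), and
# monthly (22-day) averages. The realized-variance series below is simulated.

rng = np.random.default_rng(5)
T = 1_500
rv = np.exp(rng.normal(-9.0, 0.5, T))          # placeholder daily realized variances
for t in range(1, T):                           # inject some persistence
    rv[t] = 0.5 * rv[t - 1] + 0.5 * rv[t]

def back_mean(x, t, k):
    """Average of x over the k days preceding day t."""
    return x[t - k:t].mean()

rows, target = [], []
for t in range(22, T - 1):
    rows.append([1.0, rv[t - 1], back_mean(rv, t, 5), back_mean(rv, t, 22)])
    target.append(rv[t])
X, y = np.array(rows), np.array(target)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)    # (const, daily, weekly, monthly)
print("HAR coefficients:", np.round(beta, 4))

# One-step-ahead forecast for the last observation in the sample:
x_last = np.array([1.0, rv[-2], back_mean(rv, T - 1, 5), back_mean(rv, T - 1, 22)])
print("forecast vs realized:", round(x_last @ beta, 6), round(rv[-1], 6))
```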
Michael Wolf and Dan Wunderli
This article addresses the problem of fund selection from a statistical point of view, with the analysis based solely on the track records of individual managers. The statistical problems that need to be addressed in order to implement the solution effectively include the non-normality of hedge fund returns, the time series nature of hedge fund returns, and the choice of the individual performance measures. Another issue that needs to be considered is accounting for the dependence across managers in order to improve the power of the statistical method, that is, its ability to detect skilled managers. It would be possible to rank the fund managers simply according to their non-studentized test statistics, that is, according to the "raw" alpha estimates, but ranking by such statistics does not account for the varying risks taken on by the various fund managers. Once the test statistics have been obtained, it is the task of the multiple testing method to compute a cutoff value, denoted by d, from the joint track records of all managers in the investment universe, and then to declare as skilled those managers whose test statistics exceed the cutoff value. This has to be done in a way such that the familywise error rate (FWE) is controlled.
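A stripped-down sketch of the inputs to that procedure, using simulated track records: each manager's test statistic is the studentized alpha estimate, and the cutoff d used here is only a placeholder for the value that a proper bootstrap multiple-testing method controlling the FWE would produce (in practice a HAC standard error would also be used):

```python
import numpy as np

# Studentized alpha estimates as test statistics for fund selection.
# Track records are simulated; the cutoff d is a placeholder, not the output of
# a bootstrap multiple-testing method.

rng = np.random.default_rng(6)
n_managers, T = 50, 120                         # monthly track records
benchmark = rng.normal(0.005, 0.03, T)

stats = []
for i in range(n_managers):
    true_alpha = 0.004 if i < 5 else 0.0        # a few genuinely skilled managers
    fund = benchmark + true_alpha + rng.normal(0.0, 0.02 * (1 + i / 50), T)
    active = fund - benchmark                   # active returns versus the benchmark
    alpha_hat = active.mean()
    se = active.std(ddof=1) / np.sqrt(T)        # i.i.d. s.e.; HAC needed in practice
    stats.append(alpha_hat / se)                # studentized test statistic

d = 3.0                                         # placeholder cutoff
skilled = [i for i, s in enumerate(stats) if s > d]
print("managers declared skilled:", skilled)
```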
Alexander Lipton and Andrew Rennie
This article examines the conventional bond pricing methodology and shows that it does not adequately reflect the nature of the credit risk faced by investors. In particular, it demonstrates that the strippable discounted cash flows valuation assumption, which is normally taken for granted by most analysts, leads to biased estimates of relative value for credit bonds. The article introduces a consistent survival-based valuation methodology that is free of these biases, albeit at the price of abandoning the strippable discounted cash flows assumption, and also develops a robust estimation methodology for survival probability term structures using an exponential splines approximation. This methodology is implemented and tested in a wide variety of market conditions and across a large set of sectors and issuers, from the highest credit quality to highly distressed ones. It concludes that the adoption of the survival-based methodologies advocated in this article by market participants will lead to an increase in the efficiency of the credit markets, just as the adoption of better prepayment models improved the efficiency of the mortgage-backed securities markets twenty years ago.
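In generic terms (stated here under a simple recovery-of-face convention rather than the article's exact specification), a survival-based price of a credit bond replaces strippable discounting with survival-weighted discounting:
\[
P \;=\; \sum_{i=1}^{N} c_i\, Z(t_i)\, Q(t_i)
\;+\; F\, Z(t_N)\, Q(t_N)
\;+\; R\,F \int_{0}^{t_N} Z(t)\,\bigl(-\,\mathrm{d}Q(t)\bigr),
\]
where \(Z(t)\) is the risk-free discount factor, \(Q(t)\) the issuer's survival probability out to time \(t\), \(c_i\) the coupon payments, \(F\) the face value, and \(R\) the assumed recovery rate paid at default; the exponential splines mentioned above serve to parametrise the survival probability term structure \(Q(t)\).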