Weekly Seminars

2007 – 2008

October 5, 2007

Wolfgang Polonik

University of California, Davis, Department of Statistics

Nonparametric Testing for Multivariate Volatility Models

Abstract
A novel nonparametric methodology is presented that facilitates the investigation of different features of a volatility function. We will discuss the construction of tests for (i) heteroscedasticity, (ii) a bathtub shape, or "smile effect", and (iii) a parametric volatility model. Interestingly, the resulting test for a smile effect can be viewed as a nonparametric generalization of the well-known LR-test for constant volatility versus an ARCH model. The tests can also be viewed as tests for the presence of certain stochastic dominance relations between two multivariate distributions. The inference based on those tests may be further enhanced through associated diagnostic plots. We will illustrate our methods via simulations and applications to real financial data. The large sample behavior of our test statistics is also investigated.

This is joint work with Q. Yao, London School of Economics.
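
As a toy illustration of nonparametric volatility testing in this spirit (this is not the authors' test statistic), the following Python sketch smooths squared responses with a kernel to estimate a conditional variance curve, then runs a naive permutation test for heteroscedasticity; the simulated data, bandwidth, and range statistic are all hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data whose conditional variance has a "smile" in x (hypothetical).
n = 500
x = rng.uniform(-2, 2, n)
y = rng.normal(0, 0.5 + x**2, n)

def nw_variance(x0, xs, y2, h):
    """Nadaraya-Watson estimate of E[y^2 | x = x0] with a Gaussian kernel."""
    w = np.exp(-0.5 * ((xs - x0) / h) ** 2)
    return np.sum(w * y2) / np.sum(w)

grid = np.linspace(-2, 2, 41)
h = 0.3
sigma2_hat = np.array([nw_variance(g, x, y**2, h) for g in grid])

# Permutation test: under homoscedasticity the pairing of x and y^2 is
# exchangeable, so the fitted variance curve should be nearly flat.
stat = sigma2_hat.max() - sigma2_hat.min()
perm_stats = []
for _ in range(200):
    yp = rng.permutation(y**2)
    fit = np.array([nw_variance(g, x, yp, h) for g in grid])
    perm_stats.append(fit.max() - fit.min())
pval = np.mean([s >= stat for s in perm_stats])
print(f"range statistic = {stat:.3f}, permutation p-value = {pval:.3f}")
```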

October 11, 2007

Asger Lunde

Aarhus School of Business, Denmark

Bipower Variation with Noisy Data

Abstract
Recent innovations in the financial econometrics literature have allowed us to estimate the ex-post variation of asset prices in the presence of noise and, separately, extract from the variation of asset prices the piece due to jumps when there is no noise. In this paper we combine these two tasks and provide estimates of the variation of the continuous component of prices taking into account the effect of noise. We provide a complete distribution theory for the staggered Bipower variation under flexible parametric assumptions about the noise. Moreover, we suggest a method for estimating the parameters of the noise distribution. The finite sample performance of our estimators is studied using simulation, while empirical work illustrates their use in practice.
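
The flavor of the estimators can be seen in a toy simulation: realized variance picks up both the jump and the noise, while (staggered) bipower variation is designed to be robust to jumps. This sketch is illustrative only; it omits the paper's distribution theory and noise-parameter estimation, and all values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

n = 4680                                   # 5-second returns over a 6.5h day
iv = 1e-4                                  # "true" integrated variance (toy value)
x = np.cumsum(np.sqrt(iv / n) * rng.standard_normal(n))
x[n // 2:] += 0.01                         # one jump in the efficient price
p = x + 5e-5 * rng.standard_normal(n)      # i.i.d. microstructure noise

r = np.diff(p)
mu1 = np.sqrt(2 / np.pi)                   # E|Z| for standard normal Z

def bipower(r, stagger=1):
    """Bipower variation; stagger > 1 skips returns between the two absolute
    factors, weakening the dependence that i.i.d. noise induces in adjacent terms."""
    return np.sum(np.abs(r[stagger:]) * np.abs(r[:-stagger])) / mu1**2

rv = np.sum(r**2)
print(f"true IV            = {iv:.2e}")
print(f"RV (IV+jump+noise) = {rv:.2e}")
print(f"BV                 = {bipower(r, 1):.2e}")
print(f"staggered BV       = {bipower(r, 2):.2e}")
```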

November 2, 2007

Yazhen Wang

National Science Foundation

University of Connecticut

Modeling and Analyzing High-Frequency Financial Data

Abstract
Volatilities of asset returns are central to the theory and practice of asset pricing, portfolio allocation, and risk management. In financial economics, there is extensive research on modeling and forecasting volatility based on Black-Scholes, diffusion, GARCH, and stochastic volatility models and option pricing formulas. Nowadays, thanks to technological innovations, high-frequency financial data are available for a host of different financial instruments on markets of all locations and at scales like individual bids to buy and sell, and the full distribution of such bids. The availability of high-frequency data has stimulated an upsurge of interest in statistical research on better estimation of volatility. This talk will start with a review of low-frequency financial time series and high-frequency financial data. Then I will introduce the popular realized volatility computed from high-frequency financial data and present my work on wavelet methods for analyzing jump and volatility variations and the matrix factor model for handling large volatility matrices. The proposed wavelet-based methodology can cope with both jumps in the price and market microstructure noise in the data, and estimate both volatility and jump variations from the noisy data. The matrix factor model is proposed to produce good estimators of large volatility matrices by attacking the non-synchronization problem in high-frequency price data and reducing the huge dimension (or size) of volatility matrices. Parts of my talk are based on joint work with Jianqing Fan, Qiwei Yao, and Pengfei Li.
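
A standard way to see the effect of microstructure noise on realized volatility is a volatility signature computation: realized volatility recomputed at coarser and coarser sampling frequencies. A minimal Python sketch, with purely hypothetical noise and volatility levels:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy day of prices: diffusion plus i.i.d. microstructure noise.
n = 23400                                    # one observation per second
sigma2 = 1e-4                                # true integrated variance for the day
x = np.cumsum(np.sqrt(sigma2 / n) * rng.standard_normal(n))
p = x + 1e-4 * rng.standard_normal(n)

# Realized volatility at several sampling intervals: the noise bias shrinks
# as sampling gets sparser, at the cost of using fewer observations.
for k in (1, 5, 30, 300):                    # sample every k seconds
    r = np.diff(p[::k])
    print(f"every {k:4d}s: RV = {np.sum(r**2):.2e} (true {sigma2:.0e})")
```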

November 9, 2007

Amir E. Khandani and Andrew W. Lo

Massachusetts Institute of Technology

What Happened to the Quants in August 2007?

Abstract
During the week of August 6, 2007, a number of high-profile and highly successful quantitative long/short equity hedge funds experienced unprecedented losses. Based on empirical results from TASS hedge-fund data as well as the simulated performance of a specific long/short equity strategy, we hypothesize that the losses were initiated by the rapid unwinding of one or more sizable quantitative equity market-neutral portfolios. Given the speed and price impact with which this occurred, it was likely the result of a sudden liquidation by a multi-strategy fund or proprietary-trading desk, possibly due to margin calls or a risk reduction. These initial losses then put pressure on a broader set of long/short and long-only equity portfolios, causing further losses on August 9th by triggering stop-loss and de-leveraging policies. A significant rebound of these strategies occurred on August 10th, which is also consistent with the sudden-liquidation hypothesis. This hypothesis suggests that the quantitative nature of the losing strategies was incidental, and the main driver of the losses in August 2007 was the firesale liquidation of similar portfolios that happened to be quantitatively constructed. The fact that the source of dislocation in long/short equity portfolios seems to lie elsewhere – apparently in a completely unrelated set of markets and instruments – suggests that systemic risk in the hedge-fund industry may have increased in recent years.
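
The simulated strategy in the paper is a short-horizon contrarian rule; the sketch below implements a generic dollar-neutral contrarian portfolio of that type (weights proportional to the negative of the previous day's cross-sectionally demeaned returns). It is a stylized stand-in, not the paper's exact specification, and the returns here are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical daily returns for N stocks over T days (i.i.d. here; the real
# strategy exploits short-run reversals in actual equity returns).
N, T = 100, 250
R = 0.01 * rng.standard_normal((T, N))

def contrarian_weights(r_prev):
    """Long yesterday's losers, short yesterday's winners; weights sum to zero."""
    return -(r_prev - r_prev.mean()) / len(r_prev)

pnl = np.array([contrarian_weights(R[t - 1]) @ R[t] for t in range(1, T)])
print(f"mean daily P&L = {pnl.mean():.2e}, std = {pnl.std():.2e}")
```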

November 16, 2007

Alexander Lindner

Technische Universität München, University of Marburg
University of Braunschweig, Germany

A continuous time GARCH process driven by a Lévy process

Abstract
A continuous time GARCH process which is driven by a Lévy process is introduced. It is shown that this process shares many features with the discrete time GARCH process; in particular, the stationary distribution has heavy tails. Extensions of this process are also discussed. We then turn our attention to some first estimation methods for this process, with particular emphasis on a generalized method of moments estimator. Finally, we also report on how the continuous time GARCH process, when sampled at discrete times, approximates discrete time GARCH processes. The talk is based on joint work with Stephan Haug (TU Munich), Claudia Klüppelberg (TU Munich) and Ross Maller (Australian National University).
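
A minimal simulation sketch of a COGARCH(1,1)-type process driven by a compound Poisson Lévy process, following the general construction described in the talk; the parameter values and the normal jump-size distribution are hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(4)

beta, eta, phi = 0.05, 0.5, 0.1          # hypothetical COGARCH parameters
lam, T = 50.0, 10.0                      # jump rate of L, time horizon

t, sig2, G = 0.0, beta / eta, 0.0
n_jumps = 0
while True:
    dt = rng.exponential(1.0 / lam)      # waiting time to the next jump of L
    if t + dt > T:
        break
    t += dt
    # Between jumps, the variance decays exponentially toward beta/eta.
    sig2 = beta / eta + (sig2 - beta / eta) * np.exp(-eta * dt)
    dL = rng.normal(0.0, 0.1)            # jump of the driving Levy process
    G += np.sqrt(sig2) * dL              # increment of the COGARCH process G
    sig2 += phi * sig2 * dL**2           # GARCH-type feedback of squared jumps
    n_jumps += 1

print(f"{n_jumps} jumps, terminal G = {G:.4f}, terminal sigma^2 = {sig2:.4f}")
```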

November 30, 2007

Amil Dasgupta  

London School of Economics
Centre for Economic Policy Research

The Price Impact of Institutional Herding

Abstract
We present a simple theoretical model of the price impact of institutional herding. In our model, career-concerned fund managers interact with profit-motivated proprietary traders and monopolistic market makers in a pure dealer market. The reputational concerns of fund managers generate endogenous conformism, which, in turn, impacts the prices of the assets they trade. In contrast, proprietary traders trade in a contrarian manner. We show that, in markets dominated by fund managers, assets persistently bought (sold) by fund managers trade at prices that are too high (low) and thus experience negative (positive) long-term returns after uncertainty is resolved. The pattern of equilibrium trade is also consistent with increasing (decreasing) short-term transaction-price paths during or immediately after an institutional buy (sell) sequence. Our results provide a simple and stylized framework within which to interpret the empirical literature on the price impact of institutional herding. In addition, our paper generates several new testable implications. (Joint work with Andrea Prat and Michela Verardo)

December 10, 2007

Dale Rosenthal

The University of Chicago

Signing and Nearly-Gamma Random Variables

Abstract
Many financial events involve delays. I consider data delays and propose metrics for other phenomena: the mean time to deletion from a financial index, the weighted-average prepayment time for a loan portfolio, and the weighted-average default time for a loan portfolio. Under reasonable conditions, these are all nearly-gamma distributed; thus various small-sample approximations are examined. This approach also yields a metric of loan portfolio diversity similar to one used in rating collateralized debt obligations. Finally, the approximations are used to create a model for signing trades. The model is flexible enough to encompass the midpoint, tick, and EMO methods and yields probabilities of correct predictions.
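
The nearly-gamma idea can be illustrated by moment matching: fit a gamma distribution to the first two moments of a weighted average of positive random times and compare quantiles. The weights and component distributions below are hypothetical, and this is not the talk's specific approximation.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# A weighted average of positive random terms, the kind of quantity the talk
# argues is nearly gamma distributed.
w = rng.dirichlet(np.ones(20))
scales = np.linspace(1.0, 3.0, 20)
samples = (rng.exponential(scale=scales, size=(100_000, 20)) * w).sum(axis=1)

m, v = samples.mean(), samples.var()
k, theta = m**2 / v, v / m                # gamma shape/scale by moment matching
q = np.linspace(0.05, 0.95, 7)
print("empirical quantiles:", np.round(np.quantile(samples, q), 3))
print("gamma approximation:", np.round(stats.gamma.ppf(q, a=k, scale=theta), 3))
```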

December 12, 2007

Ilze Kalnina 

London School of Economics

Subsampling High Frequency Data

Abstract
We investigate the use of subsampling for conducting inference about the integrated volatility of a discretely observed diffusion process under an infill asymptotic scheme. We show that the usual subsampling method of Politis and Romano (1994) is inconsistent when applied to our inference question. Recently, a type of subsampling has been used to perform an additive bias correction to obtain a consistent estimator of the integrated volatility of a diffusion process subject to measurement error (Zhang, Mykland, and Aït-Sahalia, 2005). This subsampling scheme is also inconsistent when applied to the inference question above. We propose a general method for conducting inference for integrated volatility in the presence of jumps or microstructure noise by subsampling appropriate consistent estimators. We apply our method to realized volatility and bipower variation of order (r, s). Our method is robust to the leverage effect and finite-activity jumps when the asymptotic distribution of the estimator does not depend on jumps. We also apply our method to the two scales estimator in the presence of endogenous measurement error, as in Kalnina and Linton (2007).
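
A stylized version of the basic idea, inference for integrated volatility by subsampling a consistent estimator, can be sketched with interleaved sparse subgrids, ignoring the jump and noise corrections that are the substance of the paper. Constant volatility and all numbers here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(6)

n, K = 23400, 10
sigma2 = 1e-4
r = np.sqrt(sigma2 / n) * rng.standard_normal(n)   # toy intraday returns
p = np.concatenate([[0.0], np.cumsum(r)])          # log-price path

rv_full = np.sum(r**2)
# The j-th subgrid keeps every K-th price starting at offset j; each subgrid
# realized volatility is a (noisier) consistent estimate of the same IV.
rv_sub = np.array([np.sum(np.diff(p[j::K])**2) for j in range(K)])

# Dispersion across subgrids estimates the subgrid sampling variance; rescale
# by 1/K to approximate the variance at the full sampling frequency.
var_hat = rv_sub.var(ddof=1) / K
se = np.sqrt(var_hat)
print(f"RV = {rv_full:.3e}, 95% CI = [{rv_full - 1.96*se:.3e}, {rv_full + 1.96*se:.3e}]")
print(f"true IV = {sigma2:.3e}")
```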

December 14, 2007

Mathieu Rosenbaum

Universite Paris-Est

Integrated Volatility and Round-off Error

Abstract
We consider a microstructure model for a financial asset, allowing for price discreteness and for a diffusive behavior at large sampling scales. This model, introduced by Delattre and Jacod, consists in the observation of a diffusion process at high frequency n, with round-off errors $\alpha_n$ tending to zero. From this sample we construct an estimator of the integrated volatility of the asset. Our method is based on variational properties of the process. We prove that the accuracy of our estimation procedure is $\alpha_n \vee n^{-1/2}$. We also give limit theorems in the case of a homogeneous diffusion.
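
The effect of round-off on naive realized volatility, which the talk's estimator corrects, is easy to see in simulation (the tick size and volatility below are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(7)

# A diffusion observed at high frequency but rounded down to a fixed tick:
# naive realized volatility computed from the rounded prices is biased.
n, tick = 23400, 0.01
x = 100.0 + np.cumsum(1.0 / np.sqrt(n) * rng.standard_normal(n))  # daily var = 1
y = tick * np.floor(x / tick)              # rounded (observed) prices

for label, s in (("exact  ", x), ("rounded", y)):
    print(f"{label} RV = {np.sum(np.diff(s)**2):.3f}   (true IV = 1.000)")
```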

January 11, 2008

Nassim Taleb

From Practice to Theory, the Origins of Model Error: Preasymptotics and Inverse Problems in Quantitative Finance

2nd Friday on Finance

 

February 22, 2008

Stathis Tompaidis

University of Texas

Pricing American-Style Options by Monte Carlo Simulation: Alternatives to Ordinary Least Squares

Abstract
We investigate the performance of the Ordinary Least Squares (OLS) regression method in Monte Carlo simulation algorithms for pricing American options. We compare OLS regression against several alternatives and find that OLS regression underperforms methods that penalize the size of coefficient estimates. The degree of underperformance of OLS regression is greater when the number of simulation paths is small, when the number of functions in the approximation scheme is large, when European option prices are included in the approximation scheme, and when the number of exercise opportunities is large. Based on our findings, instead of using OLS regression we recommend an alternative method based on a modification of Matching Projection Pursuit. (Joint with Chunyu Yang)
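
The comparison can be sketched with a Longstaff-Schwartz-style algorithm in which the continuation-value regression is done either by OLS or by ridge regression, one simple way of penalizing coefficient size; the paper's recommended method (a modification of Matching Projection Pursuit) is different. All parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(8)

S0, K, r, sig, T, steps, paths = 100.0, 100.0, 0.05, 0.2, 1.0, 50, 20000
dt = T / steps
disc = np.exp(-r * dt)

z = rng.standard_normal((steps, paths))
logS = np.cumsum((r - 0.5 * sig**2) * dt + sig * np.sqrt(dt) * z, axis=0)
S = np.vstack([np.full(paths, S0), S0 * np.exp(logS)])   # simulated GBM paths

def lsm_price(ridge=0.0):
    """Backward induction for an American put; continuation value regressed on
    a cubic in moneyness by OLS (ridge=0) or with an L2 penalty (ridge>0)."""
    cash = np.maximum(K - S[-1], 0.0)
    for t in range(steps - 1, 0, -1):
        cash *= disc
        itm = K - S[t] > 0.0              # regress on in-the-money paths only
        m = S[t][itm] / K
        X = np.column_stack([np.ones_like(m), m, m**2, m**3])
        beta = np.linalg.solve(X.T @ X + ridge * np.eye(4), X.T @ cash[itm])
        exercise = np.maximum(K - S[t][itm], 0.0) > X @ beta
        idx = np.flatnonzero(itm)[exercise]
        cash[idx] = K - S[t][idx]
    return disc * cash.mean()

print(f"OLS   LSM price: {lsm_price(0.0):.4f}")
print(f"ridge LSM price: {lsm_price(50.0):.4f}")
```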

April 4, 2008

Qiwei Yao

London School of Economics

Analysing Time Series with Nonstationarity: Common Factors and Curve Series

Abstract
We introduce two methods for modelling time series exhibiting nonstationarity. The first method is in the form of the conventional factor model. However, the estimation is carried out via expanding the white noise space step by step, thereby solving a high-dimensional optimization problem through many low-dimensional sub-problems. More significantly, it allows the common factors to be nonstationary. Asymptotic properties of the estimation are investigated, and the proposed methodology is illustrated with both simulated and real data sets. The second approach is to accommodate some nonstationary features within a stationary curve (or functional) time series framework. It turns out that stationarity, though defined in a Hilbert space, facilitates the estimation of the dimension of the curve series in terms of a standard eigenanalysis.
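
For flavor, here is a toy factor extraction via eigenanalysis of summed lagged autocovariances, a standard device in this literature that also tolerates a nonstationary common factor. It is not the talk's stepwise white-noise-expansion procedure, and the model sizes are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(9)

# Toy factor model y_t = A x_t + e_t with a random-walk (nonstationary) factor.
T, p = 500, 6
x = np.cumsum(0.1 * rng.standard_normal(T))
A = rng.standard_normal(p)
y = np.outer(x, A) + 0.5 * rng.standard_normal((T, p))

yc = y - y.mean(axis=0)
M = np.zeros((p, p))
for k in range(1, 5):                             # sum over a few lags
    C = yc[k:].T @ yc[:-k] / T                    # lag-k autocovariance
    M += C @ C.T                                  # positive semidefinite sum
vals, vecs = np.linalg.eigh(M)
a_hat = vecs[:, -1]                               # top eigenvector spans loadings

cos = abs(a_hat @ A) / np.linalg.norm(A)
print(f"|cos angle(a_hat, A)| = {cos:.3f}")       # close to 1: direction recovered
```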

April 11, 2008

Kenneth Singleton

Stanford University

Why Do Risk Premiums in Sovereign Credit Markets Covary?

April 18, 2008

Mathieu Kessler

Universidad Politecnica de Cartagena, Spain

Exact filters for discretized diffusions

April 23, 2008

Ronnie Sircar

Princeton University

Convex Risk Measures in Financial Mathematics

CAMP Seminar

April 24, 2008

Ronnie Sircar

Princeton University

Homogeneous Groups and Multiscale Intensity Models for Multiname Credit Derivatives

Abstract
The pricing of basket credit derivatives is contingent upon
1. realistic modeling of the firms’ default times and the correlation   between them; and
2. efficient computational methods for computing the portfolio loss distribution from the firms’ marginal default time distributions.
We revisit intensity-based models and, with the aforementioned issues in mind, we propose improvements
1. via incorporating fast mean-reverting stochastic volatility in the default intensity processes; and
2. by considering a hybrid of a top-down and a bottom-up model with homogeneous groups within the original set of firms.
We present a calibration example from CDO data, and discuss the relative performance of the approach.
This is joint work with Evan Papageorgiou.
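
For item 2 of the first list, a common building block is the convolution recursion that turns marginal (conditionally independent) default probabilities into a portfolio loss distribution. A minimal sketch with unit notionals and zero recovery, both simplifying assumptions:

```python
import numpy as np

rng = np.random.default_rng(10)

p = rng.uniform(0.01, 0.05, size=50)      # hypothetical default probs to horizon

loss = np.zeros(len(p) + 1)
loss[0] = 1.0                             # start: zero names, zero losses
for pi in p:
    # Adding one name: one more unit of loss with prob pi, unchanged with 1 - pi.
    loss[1:] = loss[1:] * (1 - pi) + loss[:-1] * pi
    loss[0] *= 1 - pi

print(f"E[L] = {loss @ np.arange(len(loss)):.3f} (check: {p.sum():.3f})")
print(f"P(L >= 5) = {loss[5:].sum():.4f}")
```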

April 25, 2008

Dag Tjostheim

University of Bergen

Estimation in time series that are both nonlinear and nonstationary

Abstract
Motivated by problems in nonlinear cointegration theory I will look at estimation in time series that are both nonlinear and nonstationary. The models considered are nonlinear generalizations of the random walk. Markov recurrence theory is used to establish asymptotic distributions. The emphasis is on nonparametric estimation, but I will also look at parametric estimation in a nonstationary threshold model.
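
A toy version of nonparametric estimation in such a setting: a threshold process that behaves like a random walk outside a band, with the conditional mean function estimated by a Nadaraya-Watson smoother. The model, parameters, and bandwidth are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(11)

# Mean-reverting inside |x| <= 2, pure random walk outside: a nonstationary
# threshold model in the spirit of the talk's example.
T = 20000
x = np.zeros(T)
for t in range(T - 1):
    drift = -0.5 * x[t] if abs(x[t]) <= 2 else 0.0
    x[t + 1] = x[t] + drift + rng.standard_normal()

def nw(x0, h=0.4):
    """Kernel estimate of m(x0) = E[X_{t+1} | X_t = x0]."""
    w = np.exp(-0.5 * ((x[:-1] - x0) / h) ** 2)
    return np.sum(w * x[1:]) / np.sum(w)

for x0 in (-4.0, -1.0, 0.0, 1.0, 4.0):
    true = 0.5 * x0 if abs(x0) <= 2 else x0
    print(f"m({x0:+.1f}): estimate {nw(x0):+.3f}, true {true:+.3f}")
```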

May 2, 2008

Jianqing Fan

Princeton University

Modeling and Estimation of High-Dimensional Covariance Matrix for Portfolio Allocation and Risk Management

Abstract
Large dimensionality comparable to the sample size is a common feature in modern portfolio allocation and risk management. Motivated by the Capital Asset Pricing Model, we propose to use a multi-factor model to reduce the dimensionality and to estimate the covariance matrix among those assets. Under some basic assumptions, we establish the rate of convergence and asymptotic normality of the proposed covariance matrix estimator. We identify the situations under which the factor approach gains substantially in performance and the cases where the gains are only marginal, compared with the sample covariance matrix. We also introduce the concept of sparse portfolio allocation and propose two efficient algorithms for selecting the optimal subset of the portfolio. The risk of the optimally selected portfolio is thoroughly studied and examined. The performance, in terms of risk and utility, of the sparsely selected portfolio is compared with the classical optimal portfolio of Markowitz (1952).
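
The structure of a factor-based covariance estimator can be sketched as follows: regress returns on observed factors, assemble the estimate as loadings times factor covariance plus a diagonal of residual variances, and compare it with the raw sample covariance. The sizes and the observable-factor setup are hypothetical simplifications.

```python
import numpy as np

rng = np.random.default_rng(12)

n, p, k = 120, 100, 3
f = rng.standard_normal((n, k))                    # observed factor returns
B = rng.standard_normal((p, k))                    # true loadings
R = f @ B.T + rng.standard_normal((n, p))          # asset returns

# Regress each asset on the factors (toy regression without intercepts).
Bhat, *_ = np.linalg.lstsq(f, R, rcond=None)       # shape (k, p)
resid = R - f @ Bhat
Sigma_hat = Bhat.T @ np.cov(f, rowvar=False) @ Bhat \
            + np.diag(resid.var(axis=0, ddof=k))
Sigma_true = B @ B.T + np.eye(p)

err_factor = np.linalg.norm(Sigma_hat - Sigma_true)
err_sample = np.linalg.norm(np.cov(R, rowvar=False) - Sigma_true)
print(f"Frobenius error: factor {err_factor:.1f} vs sample {err_sample:.1f}")
```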

May 9, 2008

Jostein Paulsen

University of Bergen
The University of Chicago

Optimal dividend payments and reinvestments of diffusion processes with both fixed and proportional costs

Abstract
Assets are assumed to follow a diffusion process subject to some conditions. The owners can pay dividends at their discretion, but whenever assets reach zero, they have to reinvest money so that assets never go negative. With each dividend payment there is a fixed and a proportional cost, and likewise with reinvestments. The goal is to maximize the expected value of the discounted net cash flow, i.e. dividends paid minus reinvestments. It is shown that there can be two different solutions, depending on the model parameters and the costs.
1. Whenever assets reach a barrier they are reduced by a fixed amount through a dividend payment, and whenever they reach 0 they are increased to another fixed amount by a reinvestment.
2. There is no optimal policy, but the value function is approximated by policies of the form described in Item 1 for increasing barriers.
We provide criteria to decide whether an optimal solution exists and, when not, show how to calculate the value function. It is discussed how the problem can be solved numerically, and numerical examples are given. The talk is based on a paper with the same title, to appear in the SIAM Journal on Control and Optimization.
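
A policy of the form in Item 1 can be valued by straightforward Monte Carlo, which gives a feel for the objective being maximized; this sketch is not the talk's solution method (which characterizes the optimum analytically and numerically), and all parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(13)

mu, sigma, r = 0.5, 1.0, 0.05
K_div, k_div = 0.1, 0.05              # fixed / proportional dividend cost
K_inv, k_inv = 0.5, 0.10              # fixed / proportional reinvestment cost
barrier, drop, refill = 3.0, 1.5, 1.0
dt, steps, n_paths = 0.01, 3000, 300  # horizon 30, long enough for discounting

values = np.empty(n_paths)
for j in range(n_paths):
    eps = rng.standard_normal(steps)
    x, val = refill, 0.0
    for i in range(steps):
        x += mu * dt + sigma * np.sqrt(dt) * eps[i]
        t = (i + 1) * dt
        if x >= barrier:              # owners take a lump dividend of size `drop`
            val += np.exp(-r * t) * (drop - k_div * drop - K_div)
            x -= drop
        elif x <= 0.0:                # forced reinvestment back up to `refill`
            val -= np.exp(-r * t) * ((1 + k_inv) * (refill - x) + K_inv)
            x = refill
    values[j] = val

se = values.std(ddof=1) / np.sqrt(n_paths)
print(f"policy value ~ {values.mean():.3f} +/- {1.96 * se:.3f}")
```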

May 16, 2008

Richard Thaler   

The University of Chicago

TBA

2nd Friday on Finance

 

May 30, 2008

Sebastian Jaimungal

University of Toronto

Hitting Time Problems with Applications to Finance and Insurance

Abstract
The distribution of the first hitting time of a Brownian motion to a linear boundary is well known. However, if the boundary is no longer linear, this distribution is not, in general, identifiable. Nonetheless, the boundary and distribution satisfy a variety of beautiful integral equations due to Peskir. In this talk, I will discuss how to generalize those equations and how they lead to an interesting partial solution to the inverse problem: given a distribution of hitting times, what is the corresponding boundary? By randomizing the starting point of the Brownian motion, I will show how a kernel estimator of the distribution with gamma kernels can be exactly replicated.
Armed with these tools, there are two natural applications: one to finance and one to insurance. In the financial context, the Brownian motion may drive the value of a firm, and through a structural modeling approach I will show how CDS spread curves can be matched. In the insurance context, suppose an individual's health reduces by one unit per annum with fluctuations induced by a Brownian motion, and once their health hits zero the individual dies. I will show how life-table data can be nicely explained by this model and illustrate how to perturb the distribution for pricing purposes.

This is joint work with Alex Kreinin and Angelo Valov.
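
The one classical closed-form case, the first hitting time of Brownian motion to the linear boundary a + bt, makes a useful sanity check for any numerical scheme. The sketch below compares the analytic distribution with an Euler Monte Carlo, which slightly undershoots because it only monitors the boundary at grid times; parameters are hypothetical.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(14)

a, b, t_star = 1.0, 0.2, 2.0

def F(t):
    """P(tau <= t) for standard BM and the boundary a + b*t, with a > 0."""
    return norm.cdf((-a - b * t) / np.sqrt(t)) + \
           np.exp(-2 * a * b) * norm.cdf((-a + b * t) / np.sqrt(t))

n, steps = 50_000, 2000
dt = t_star / steps
w = np.zeros(n)
hit = np.zeros(n, dtype=bool)
for i in range(1, steps + 1):
    w += np.sqrt(dt) * rng.standard_normal(n)
    hit |= w >= a + b * i * dt           # boundary checked at grid times only

print(f"analytic  P(tau <= {t_star}) = {F(t_star):.4f}")
print(f"simulated P(tau <= {t_star}) = {hit.mean():.4f}")
```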

June 6, 2008

Matheus Grasselli

McMaster University

Indifference pricing for general semimartingales

Abstract
We prove a duality formula for utility maximization with random endowment in general semimartingale incomplete markets.  The analysis is based on the Orlicz space $L^{\hat u}$ naturally associated with the utility function $u$. Using this formulation we prove several key properties of the indifference price $\pi(B)$ for a claim $B$ satisfying conditions weaker than those assumed in the literature.  In particular, the price functional $\pi$ turns out to be, apart from a sign, a convex risk measure on the Orlicz space $L^{\hat u}$. This is joint work with S. Biagini and M. Frittelli.
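
The Orlicz-space duality does not reduce to a few lines, but the simplest special case of an indifference price, exponential utility with no hedging opportunities, where $\pi(B) = \frac{1}{\gamma}\log E[e^{\gamma B}]$, already shows the convex-risk-measure behavior mentioned above. A toy computation (the payoff and risk aversion are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(15)

# Exponential-utility indifference (seller's) price with no hedging:
# pi = (1/gamma) * log E[exp(gamma * B)]. This is only a degenerate special
# case of the general semimartingale duality discussed in the talk.
gamma = 2.0
B = np.maximum(rng.normal(0.0, 1.0, 1_000_000), 0.0)   # a call-like payoff

pi = np.log(np.mean(np.exp(gamma * B))) / gamma
print(f"E[B] = {B.mean():.4f}, indifference price = {pi:.4f}")
# The price exceeds E[B], embedding a risk premium; as gamma -> 0 it collapses
# to the expectation, consistent with the convex-risk-measure interpretation.
```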