High Frequency Data and High Frequency Trading | May 16 – 18, 2013

 

CONFIRMED SPEAKERS

  • Torben Andersen, Yuri Balasanov, Rene Carmona, Alvaro Cartea, Rama Cont, Dobrislav Dobrev, Jean Jacod, Sebastian Jaimungal, Andrei Kirilenko, Ciamac Moallemi, Mark Podolskij, Markus Reiss, Jeffrey Russell, Sasha Stoikov, Viktor Todorov, Stathis Tompaidis, Frederi Viens, Nicholas Westray, Dacheng Xiu, Yong Zeng

SCIENTIFIC COMMITTEE

LOCAL ORGANIZING COMMITTEE

  • Mary King, Nina Bachkova, Roger Lee, Per Mykland

CONFERENCE SITE

  • 5727 S University Ave

ABSTRACTS

Assessing VPIN Measurement of Order Flow Toxicity Using Perfect Trade Classification
Torben Andersen
Northwestern

Abstract

The VPIN, or Volume-synchronized Probability of INformed trading, metric was introduced by Easley, López de Prado and O’Hara (ELO) as a real-time indicator of order flow toxicity. They find the measure useful in predicting return volatility and conclude it may help signal impending market turmoil. The VPIN metric involves decomposing volume into active buys and sells. We use the best-bid-offer (BBO) files from the CME Group to construct (near) perfect trade classification measures for the E-mini S&P 500 futures contract. We investigate the accuracy of ELO’s trade classification and find it inferior to a standard tick rule based on individual transactions. Moreover, when VPIN is constructed from accurate classification, it behaves in a diametrically opposite way to the ELO VPIN. We also find that the latter has forecast power for short-term volatility solely because it generates systematic classification errors that are correlated with trading volume and return volatility. When controlling for trading intensity and volatility, the ELO VPIN measure has no incremental predictive power for future volatility. We conclude that VPIN is not suitable for measuring order flow imbalances.
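
To make the two ingredients above concrete, here is a minimal sketch, assuming hypothetical trade arrays: a standard tick rule classifies individual transactions, and signed volume is then aggregated into equal-volume buckets to form a stylized VPIN series. The bucket size and window length are placeholder choices, not ELO’s specification.

    import numpy as np

    def tick_rule_signs(prices):
        """Standard tick rule: uptick -> buy (+1), downtick -> sell (-1),
        zero tick -> carry over the previous classification."""
        signs = np.empty(len(prices))
        last = 1.0                          # arbitrary sign for the first trade
        signs[0] = last
        for i in range(1, len(prices)):
            d = prices[i] - prices[i - 1]
            if d > 0:
                last = 1.0
            elif d < 0:
                last = -1.0
            signs[i] = last
        return signs

    def vpin(prices, volumes, bucket_volume, n_buckets=50):
        """Stylized VPIN: fill equal-volume buckets with tick-rule-signed
        volume, then average |buy - sell| over a rolling window of buckets."""
        signs = tick_rule_signs(prices)
        imbalances, buy, sell, filled = [], 0.0, 0.0, 0.0
        for s, v in zip(signs, volumes):
            while v > 0:
                take = min(v, bucket_volume - filled)
                if s > 0:
                    buy += take
                else:
                    sell += take
                filled += take
                v -= take
                if filled >= bucket_volume:             # bucket complete
                    imbalances.append(abs(buy - sell) / bucket_volume)
                    buy = sell = filled = 0.0
        imb = np.array(imbalances)
        return np.convolve(imb, np.ones(n_buckets) / n_buckets, mode="valid")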

Risk Management of Low Latency Trading Strategies Using Cox Processes
Yuri Balasanov
University of Chicago and Research Software International

Abstract

We consider a trading strategy that generates a dynamic portfolio, changing with low latency in response to changes in the market. Traditional approaches to calculating risk measures, like VaR or expected shortfall, do not apply to such a fast-changing portfolio. We model the real-time P&L process of a low-latency trading strategy as a Cox process. We use limit theorems for Cox processes to derive an approximation for the distribution of the maximum of the P&L process on a fixed time interval when the intensity of changes is high. In conclusion, we discuss practical applications of the proposed approach.
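
A minimal Monte Carlo sketch of the object of interest: the P&L is modelled as a compound Cox process, i.e. a Poisson process whose intensity is itself stochastic, and the distribution of the running maximum of P&L over a fixed interval is estimated by simulation. The mean-reverting intensity dynamics and Gaussian marks are illustrative assumptions, not the speaker’s model.

    import numpy as np

    rng = np.random.default_rng(0)
    T, dt = 1.0, 1e-3
    steps = int(T / dt)

    def max_pnl_one_path():
        lam, pnl, peak = 500.0, 0.0, 0.0
        for _ in range(steps):
            # illustrative mean-reverting (CIR-type) intensity of trade events
            lam = max(lam + 2.0 * (500.0 - lam) * dt
                      + 30.0 * np.sqrt(max(lam, 0.0) * dt) * rng.standard_normal(),
                      0.0)
            # conditionally on the intensity path, arrivals are Poisson (Cox)
            n = rng.poisson(lam * dt)
            if n:
                pnl += rng.normal(0.0, 1.0, n).sum()    # i.i.d. P&L marks
                peak = max(peak, pnl)
        return peak

    maxima = np.array([max_pnl_one_path() for _ in range(1000)])
    print("99% quantile of max P&L over [0, T]:", np.quantile(maxima, 0.99))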

Structure of Limit Order Books
Rene Carmona
Princeton

Abstract

High-frequency transactions represent an ever-growing proportion of all financial trades. Most markets have now switched to an electronic order book system. We study such an exchange structure and propose continuous-time equations which generalize the self-financing relationships of frictionless markets to electronic markets with limit order books.
The paper has two main contributions. First, it establishes a structural link between price and trade dynamics. Second, by taking a diffusion limit, we bridge the gap between high and low frequency, leading to a more tractable framework that is closer to traditional asset pricing theories. It is likely that this model will find applications beyond the concerns of the high frequency community.

Robust Market Making
Alvaro Cartea
UCL

Abstract

An agent who wishes to make markets by posting limit buy and sell orders faces the problem of modelling the arrival rate and volume of market orders which hit/lift their posted orders. No model can capture the true behaviour of the market’s data generating process (DGP); hence, simplifying assumptions are often made. A natural question then arises: “how can the agent account for the fact that they know their model is inaccurate?”, i.e., how can uncertainty in the Knightian sense be addressed? In this talk, we formulate the question as a robust optimal control problem in which the agent is ambiguity averse to Poisson random measures. Specifically, the agent considers a reference measure (representing the simplified model) and all equivalent measures (representing candidate models) and penalises them according to a quasi relative entropy. Surprisingly, the robust control problem can be reduced to solving a coupled non-linear system of ODEs, which in certain limiting cases can be solved exactly. The optimal postings show that the agent protects themselves from ambiguity in distinct ways depending on where the ambiguity stems from. Interestingly, in some cases the agent behaves as if they had perfect knowledge of the DGP but applied CARA utility; however, in general the ambiguity-averse agent cannot be recast as a risk-averse one. Numerical experiments will illustrate several interesting economic insights into the problem.
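
To illustrate the computational step only, the sketch below integrates a small coupled nonlinear ODE system backward from a terminal condition, the generic shape such control-problem reductions take. The right-hand side here is a hypothetical placeholder, not the system derived in the talk.

    import numpy as np
    from scipy.integrate import solve_ivp

    kappa, lam, T = 1.5, 50.0, 1.0      # placeholder parameters

    def rhs(t, h):
        # hypothetical coupled nonlinear right-hand side, stated only to show
        # the structure; the talk derives the actual system from the control problem
        h0, h1 = h
        return [-lam * np.exp(-kappa * h1) + h0 * h1,
                -lam * np.exp(-kappa * h0) - h1 ** 2]

    # integrate backward in time from the terminal condition h(T) = 0
    sol = solve_ivp(rhs, t_span=(T, 0.0), y0=[0.0, 0.0])
    print("h(0) =", sol.y[:, -1])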

Optimal order placement in limit order markets
Rama Cont
Imperial

Abstract

To execute a trade, participants in electronic equity markets may choose to submit limit orders or market orders across various exchanges where a stock is traded. This order placement decision depends on the “fill rates” for limit orders, which are in turn influenced by the characteristics of the order flow and queue sizes in each limit order book, as well as the structure of transaction fees and rebates across exchanges. We propose a quantitative framework for studying this order placement problem by formulating it as a convex optimization problem. This formulation allows us to study how the interplay between the state of order books, the fee structure, order flow properties and preferences of a trader determines the optimal placement decision. In the case of a single exchange, we derive an explicit solution for the optimal split between limit and market orders. For the general problem of order routing across multiple exchanges, we propose a stochastic algorithm for computing the optimal policy and study the sensitivity of the solution to various parameters using a numerical implementation of the algorithm. Joint work with Arseniy Kukanov.
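
A stylized one-exchange version of the optimization can be written in a few lines: choose the fraction of the order posted as a limit order, trading off the rebate earned on fills against the spread and fee paid on market orders, plus a convex penalty for non-execution risk. The fill probability, costs, and penalty below are illustrative assumptions, not the paper’s model, which derives the split in closed form.

    import numpy as np
    from scipy.optimize import minimize_scalar

    half_spread, fee, rebate = 0.005, 0.003, 0.002
    p_fill = 0.6      # assumed probability that the posted limit order fills
    gamma = 0.05      # assumed convex penalty on non-execution risk

    def expected_cost(limit_frac):
        """Expected cost per share in a stylized model: market orders pay
        half-spread + fee, filled limit orders earn the rebate, and unfilled
        limit orders are cleaned up at market and penalized quadratically."""
        lf = limit_frac
        cost_market = (1 - lf) * (half_spread + fee)
        cost_limit = lf * (p_fill * (-rebate)
                           + (1 - p_fill) * (half_spread + fee))
        risk = gamma * ((1 - p_fill) * lf) ** 2
        return cost_market + cost_limit + risk

    res = minimize_scalar(expected_cost, bounds=(0.0, 1.0), method="bounded")
    print("optimal limit-order fraction:", round(res.x, 3))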

Bayesian estimation and forecasting in stochastic volatility models of low-frequency returns powered up by high-frequency volatility measures
Dobrislav Dobrev

Abstract

We introduce asymptotically exact high-frequency volatility measurement equations in the state space form of stochastic volatility models of low-frequency equity returns and propose a Bayesian estimation approach. Our modeling framework parsimoniously accounts for heterogeneous arrival rates of information surprises associated with jumps in returns and volatility. In particular, we do not assume identical intraday and overnight jump activity, as certain major market-moving news announcements, such as earnings, typically take place outside of trading hours. We assess the incremental information content of high-frequency volatility measures in the context of the trade-off between in-sample model fit and out-of-sample multi-horizon return density forecasting, also taking into account parameter uncertainty. A simulation study and an empirical illustration on stock returns during the financial crisis of 2007-2008 reveal that, compared to model inference without high-frequency data, our approach delivers more accurate short-term return density forecasts and can help reduce underestimation of risk during bad times or overestimation of risk during good times.
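
One concrete instance of such a measurement equation, in a deliberately simplified Gaussian setting: latent log variance follows an AR(1) state equation, log realized variance observes it with noise, and a Kalman recursion filters the state. The talk’s model is far richer (jumps, overnight effects, full Bayesian inference); the parameters below are assumptions for illustration only.

    import numpy as np

    rng = np.random.default_rng(1)
    T, phi, mu, q, r = 500, 0.95, -9.0, 0.05, 0.10   # assumed parameters

    # simulate latent log-variance and noisy log realized-variance measurements
    x = np.empty(T); x[0] = mu
    for t in range(1, T):
        x[t] = mu + phi * (x[t - 1] - mu) + np.sqrt(q) * rng.standard_normal()
    y = x + np.sqrt(r) * rng.standard_normal(T)      # log RV_t = log h_t + noise

    # Kalman filter: state AR(1), measurement y_t = x_t + e_t
    m, P, est = mu, 1.0, np.empty(T)
    for t in range(T):
        # predict
        m = mu + phi * (m - mu)
        P = phi ** 2 * P + q
        # update with the high-frequency volatility measure
        K = P / (P + r)
        m = m + K * (y[t] - m)
        P = (1 - K) * P
        est[t] = m

    print("filter RMSE:", np.sqrt(np.mean((est - x) ** 2)))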

Integrated volatility: a new efficient estimation method in the presence of jumps with high activity
Jean Jacod
Paris

Abstract

Estimation of the integrated volatility is possible at the efficient rate sqrt(n) (with n the number of observations, in a high-frequency setting) when jumps have a degree of activity (or Blumenthal-Getoor index) less than 1: this is well established, and for this one can use truncated realized volatility or multipower variations. Here we present a different method, based on the empirical characteristic function of the returns, evaluated locally in time. This method allows us to obtain estimators with the efficient rate sqrt(n), and even with the efficient asymptotic variance, when the degree of activity of jumps is arbitrarily large. For this we need the (relatively) weak assumption that the jump part occurs as a stochastic integral with respect to a Levy process with “stable-like” behavior of small jumps, the integrand being itself an arbitrary Ito semimartingale. Joint work with Viktor Todorov.
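
A schematic simulation of the idea: over local blocks, -(2/u^2) log of the absolute empirical characteristic function of normalized returns recovers the local variance, and summing over blocks gives integrated volatility; truncated realized volatility is computed for comparison. The frequency u, block size, truncation level, and jump specification are illustrative choices, not those of the paper.

    import numpy as np

    rng = np.random.default_rng(2)
    n, sigma = 23400, 0.2
    dt = 1.0 / n
    dx = sigma * np.sqrt(dt) * rng.standard_normal(n)
    jump_idx = rng.random(n) < 5e-4                 # sprinkle a few jumps
    dx[jump_idx] += 0.01 * rng.standard_normal(jump_idx.sum())

    # ECF estimator: c_hat = -(2/u^2) log|mean cos(u * dx / sqrt(dt))| per block
    u, k = 1.0, 200
    iv_ecf = 0.0
    for j in range(0, n, k):
        block = dx[j:j + k] / np.sqrt(dt)
        ecf = np.abs(np.mean(np.cos(u * block)))
        iv_ecf += -(2.0 / u ** 2) * np.log(ecf) * len(block) * dt

    # truncated realized volatility for comparison
    thr = 4 * sigma * dt ** 0.49
    iv_trv = np.sum(dx[np.abs(dx) <= thr] ** 2)

    print("true IV:", sigma ** 2,
          "ECF:", round(iv_ecf, 4), "TRV:", round(iv_trv, 4))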

The Algorithmic Trading of Multiple Assets using Limit and Market Orders
Sebastian Jaimungal
Toronto

Abstract

We develop a high-frequency trading strategy (with limit and market orders) in a multi-asset economy where the assets are not only correlated, but can also be structurally dependent. To model the structural dependence, the mid-price processes are treated as a reflected Brownian motion on the closure of a no-arbitrage region. The optimal strategy is shown to contain two components. The first is a market-making component, active when the mid-price processes are sufficiently far from the no-arbitrage boundary dictated by the assets’ spreads. The second is a statistical arbitrage component that seeks to profit when the mid-price processes are close to the no-arbitrage boundary. In this paper, we establish a formal framework for such an economy, present theoretical results, discuss a numerical scheme to solve for the value function and optimal control, and perform a simulation study to illustrate some financially intuitive characteristics of the optimal strategy.
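
The structural-dependence ingredient can be visualized with a toy simulation: the spread between two structurally linked mid-prices is modelled as a Brownian motion reflected at the edges of a no-arbitrage band, and one can flag the times at which the statistical arbitrage component of the strategy would engage. The band width and dynamics are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(3)
    lo, hi, sigma, n, dt = -0.05, 0.05, 0.3, 10000, 1e-4

    def reflect(x, lo, hi):
        """Fold x back into [lo, hi] by repeated reflection at the boundaries."""
        width = hi - lo
        y = (x - lo) % (2 * width)
        return lo + (y if y <= width else 2 * width - y)

    spread = np.empty(n)
    spread[0] = 0.0
    for t in range(1, n):
        step = sigma * np.sqrt(dt) * rng.standard_normal()
        spread[t] = reflect(spread[t - 1] + step, lo, hi)

    # near-boundary times, where the stat-arb component would engage
    near = (np.abs(spread - lo) < 0.005) | (np.abs(spread - hi) < 0.005)
    print("fraction of time near the no-arbitrage boundary:", near.mean())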

High Frequency Trading
Andrei Kirilenko
MIT

Abstract

High frequency trading is a recent innovation in financial intermediation that does not fit neatly into a standard liquidity-provision framework. While the net contribution of high frequency trading to market dynamics is still not fully understood, the mere presence of high frequency traders has already shaken the confidence of traditional market participants in the stability and fairness of the financial market system as a whole.

The Welfare Analysis of Dark Pools
Ciamac Moallemi
Columbia

Abstract

We investigate the role of a class of alternative market structures known as electronic crossing networks or dark pools. Relative to traditional ‘lit’ intermediated dealer markets, dark pools offer investors the trade-off of reduced transaction costs in exchange for greater uncertainty of trade. In an equilibrium setting, we analyze the choice between dark and lit venues, and illustrate how this critically depends on the information available to each investor. We establish that while dark pools attract relatively uninformed investors, they still experience adverse selection. Moreover, the informational segmentation created by a dark pool leads to greater transaction costs in lit markets. Finally, in our setting, we establish that the introduction of a dark pool reduces the overall welfare of the market participants. This is joint work with Ramesh Johari (Stanford) and Krishnamurthy Iyer (UPenn).

Edgeworth expansion for functionals of continuous diffusion processes
Mark Podolskij
Heidelberg

Abstract

This talk presents new results on the Edgeworth expansion for high frequency functionals of continuous diffusion processes. We derive asymptotic expansions for weighted functionals of the Brownian motion and apply them to provide the second order Edgeworth expansion for power variation of diffusion processes. Our methodology relies on martingale embedding, Malliavin calculus and stable central limit theorems for semimartingales. Finally, we demonstrate the density expansion for studentized statistics of power variations. Joint work with Nakahiro Yoshida.

Quadratic covariation matrix estimation under microstructure noise: local method of moments and efficiency
Markus Reiss
HU Berlin

Abstract

An efficient estimator is constructed for the quadratic covariation or integrated covolatility matrix of a multivariate continuous martingale based on noisy and non-synchronous observations under high-frequency asymptotics. Our approach relies on an asymptotically equivalent continuous-time observation model where a local generalised method of moments in the spectral domain turns out to be optimal. Asymptotic semiparametric efficiency is established in the Cramér-Rao sense. Main findings are that non-synchronicity of observation times has no impact on the asymptotics and that major efficiency gains are possible under correlation. (Joint work with M. Bibinger, N. Hautsch, P. Malec)

Time Series and Cross-Sectional Properties of Equity Market Liquidity with Applications to the Financial Crisis
Jeffrey Russell
Chicago

Abstract

Financial market liquidity varies over time and is cross-sectionally correlated. Despite a growing literature suggesting that liquidity impacts asset prices, and the importance of co-movement in liquidity to investors holding diversified portfolios, relatively little is understood about the economic sources of this co-movement. Most economic theory is built around risk factors faced by market makers trading individual assets. This paper proposes the idea that commonality or co-movement in liquidity comes from co-movement in the risk factors for liquidity across assets. We show that a factor structure in the risk factors implies the factor structure for liquidity that has been documented in the literature. If the common risk factor is not directly observable (such as asymmetric information), we show that observable risk factors can be constructed by taking cross-sectional averages of the asset-specific liquidity risk variables that are often used to proxy for the unobserved factor. Estimates of the factor models on a sample of S&P 100 stocks allow us to identify which common risk factors are important in determining co-movement in liquidity. We find that inventory risk due to common volatility shocks, market-wide asymmetric information and the drying up of liquidity supply are responsible for a large part of the co-movement in liquidity during the financial crisis. Interestingly, counterparty risk appears to affect liquidity, but does not imply broad co-movement in liquidity.
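
The proposed construction of an observable common factor is straightforward to express in code: average an asset-specific liquidity proxy cross-sectionally and measure how much of each asset’s liquidity it explains. The simulated panel below is a placeholder for data such as S&P 100 spreads.

    import numpy as np

    rng = np.random.default_rng(4)
    T, N = 250, 100                          # days x stocks (placeholder panel)
    common = rng.standard_normal(T)          # unobserved common risk factor
    liquidity = 0.8 * common[:, None] + rng.standard_normal((T, N))

    # observable proxy for the common factor: the cross-sectional average
    factor_hat = liquidity.mean(axis=1)

    # commonality: R^2 of each stock's liquidity on the constructed factor
    betas = liquidity.T @ factor_hat / (factor_hat @ factor_hat)
    resid = liquidity - np.outer(factor_hat, betas)
    r2 = 1 - resid.var(axis=0) / liquidity.var(axis=0)
    print("average commonality R^2:", round(r2.mean(), 3))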

Optimal Asset Liquidation Using Limit Order Book Information
Sasha Stoikov
Cornell

Abstract

We consider an asset liquidation problem at the market microstructure level, where we use limit order book information to construct a measure of the instantaneous supply and demand imbalance in the market. In this context, it is optimal to submit sell orders when this imbalance is low, indicating that a price drop is imminent. Identifying good trading times is equivalent to solving an optimal stopping problem, where the objective is to stop whenever the supply-demand imbalance is small. The solution to such an optimal stopping problem divides the state space into a “trade” and a “no trade” region. We present structural properties of the optimal policy and use a dynamic programming formulation to find good approximations to the optimal trade region. We also investigate the shapes of the trade and no-trade regions under different underlying price processes, input parameters and assumptions on the latency of trade execution. Finally, we efficiently calculate the cost of latency in trade execution and demonstrate that the advantage of observing the limit order book can dissipate quickly as latency increases. In the empirical studies section, we show that our optimal policy significantly outperforms the benchmark TWAP algorithm in liquidating on-the-run U.S. Treasury bonds, saving on average approximately 1/3 of the spread per share liquidated if trades are executed with low latency (~10 milliseconds).
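
A minimal sketch of the signal and a threshold policy, assuming top-of-book sizes as inputs: the imbalance is the normalized difference of bid and ask queue sizes, and a sell is signalled whenever it enters the trade region. The fixed threshold here stands in for the boundary that the paper obtains by solving the optimal stopping problem.

    import numpy as np

    def imbalance(bid_size, ask_size):
        """Instantaneous supply/demand imbalance in [-1, 1] from the top of book."""
        return (bid_size - ask_size) / (bid_size + ask_size)

    def liquidation_signals(bid_sizes, ask_sizes, threshold=-0.3):
        """Emit a 'sell' signal whenever the imbalance enters the trade region,
        i.e. falls below the threshold, suggesting an imminent price drop."""
        imb = imbalance(np.asarray(bid_sizes, float), np.asarray(ask_sizes, float))
        return imb < threshold

    bids = [900, 700, 400, 200, 800]
    asks = [300, 500, 900, 900, 400]
    print(liquidation_signals(bids, asks))   # -> [False False  True  True False]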

Limit Theorems for the Empirical Distribution Function of Scaled Increments of Ito Semimartingales at high frequencies
Viktor Todorov
Northwestern

Abstract

We derive limit theorems for the empirical distribution function of “devolatilized” increments of an Ito semimartingale observed at high frequencies. These “devolatilized” increments are formed by suitably rescaling and truncating the raw increments to remove the effects of stochastic volatility and “large” jumps. We derive the limit of the empirical cdf of the adjusted increments for any Ito semimartingale whose dominant component at high frequencies has activity index 1 < beta <= 2, where beta = 2 corresponds to a diffusion. We further derive an associated CLT in the jump-diffusion case. We use the developed limit theory to construct a feasible and pivotal test for the class of Ito semimartingales with non-vanishing diffusion coefficient against Ito semimartingales with no diffusion component. This is joint work with George Tauchen.
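
A schematic version of the statistic on simulated data: increments are rescaled by a local variance estimate, the large ones are truncated away, and the empirical CDF of the survivors is compared with the standard normal CDF, the limit under a non-vanishing diffusion. The block size and truncation level are illustrative choices.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(5)
    n, sigma = 23400, 0.2
    dt = 1.0 / n
    dx = sigma * np.sqrt(dt) * rng.standard_normal(n)

    k = 100                                    # local window for spot variance
    devol = []
    for j in range(0, n - k, k):
        block = dx[j:j + k]
        c_hat = np.mean(block ** 2) / dt       # local variance estimate
        devol.extend(block / np.sqrt(c_hat * dt))
    devol = np.array(devol)
    devol = devol[np.abs(devol) < 3.0]         # truncate "large" increments

    # compare empirical CDF with the standard normal limit
    grid = np.linspace(-2, 2, 9)
    ecdf = np.array([(devol <= g).mean() for g in grid])
    print(np.round(ecdf - norm.cdf(grid), 3))  # small under the diffusion null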

Optimal VWAP Tracking
Stathis Tompaidis
Texas

Abstract

We consider the problem of finding a strategy that tracks the volume weighted average price (VWAP) of a stock, a key measure of execution quality for large orders used by institutional investors. We obtain the optimal dynamic VWAP tracking strategy in closed form in a model without trading costs, under general price and volume dynamics. We build a model of intraday volume using the Trade and Quote dataset to empirically test the strategy, both without trading costs and when trading has temporary and permanent price effects, and find that the implementation cost is lower than the cost charged by brokerage houses.
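
The benchmark and the simplest static tracker are easy to state in code: VWAP is the volume-weighted mean of prices, and the static strategy splits the parent order along a forecast of the intraday volume curve. The U-shaped profile and simulated data below are illustrative; the paper’s optimal dynamic strategy improves on this static benchmark.

    import numpy as np

    rng = np.random.default_rng(6)
    n = 78                                      # 5-minute bins in a trading day
    t = np.linspace(0, 1, n)
    profile = 1.5 - 2 * t * (1 - t)             # assumed U-shaped volume profile
    price = 100 + np.cumsum(0.02 * rng.standard_normal(n))
    volume = 1e4 * profile * rng.lognormal(0, 0.2, n)   # realized volume

    vwap = np.sum(price * volume) / np.sum(volume)

    # static tracker: split the parent order along the *forecast* volume curve
    shares = 1e5
    child = shares * profile / profile.sum()
    avg_exec = np.sum(price * child) / shares
    print("VWAP:", round(vwap, 4), "avg exec:", round(avg_exec, 4),
          "slippage per share:", round(avg_exec - vwap, 5))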

Dependence in volatility: calibration or estimation?
Frederi Viens
Purdue

Abstract

We will begin by explaining how work by I. Florescu and the presenter, published in 2008, on continuous-time stochastic volatility (SV) tracking using discrete-time stock information can be reinterpreted as a model-free framework for option pricing and model calibration under SV. We will present a recent application of this perspective, with A. Chronopoulou, to long-memory SV calibration based on high-frequency S&P 500 option data, which appears to show that SV memory reflects market-makers’ anxiety about market conditions. This is a calibration, as opposed to an estimation, methodology. We will provide mathematical evidence that traditional estimation methods useful with HF data are not sufficiently robust to be helpful for memory analysis in the financial context. Time permitting, we will touch on two new directions of ongoing work which may provide additional tools for memory selection with HF financial data: one is a calibration technique using sharp call price asymptotics w.r.t. strike prices, the other is bona fide estimation via a generalized method of moments.

Optimal Execution with a VWAP Benchmark
Nicholas Westray
Deutsche Bank

Abstract

We consider the case of optimal liquidation where the aim is to minimize slippage with respect to the benchmark given by the period VWAP (volume weighted average price). To model the benchmark we suggest a new description of the relative volume curve based upon a gamma bridge. The resulting model allows simultaneously for an accurate fit to data and mathematical tractability. We provide closed-form solutions for the resulting optimization, discuss in detail how the resulting strategy can be interpreted, and show how the model can be fit to data.
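
The gamma bridge description of the relative volume curve is short to simulate: normalize the cumulative path of gamma-distributed increments by its terminal value, which yields an increasing path from 0 to 1 with Dirichlet-distributed increments. The shape parameter below is an illustrative choice rather than a fitted value.

    import numpy as np

    rng = np.random.default_rng(7)

    def gamma_bridge(n_steps, shape=5.0):
        """Simulate the relative volume curve on [0, 1] as a gamma bridge:
        increments of a Gamma(shape * dt) process, normalized so the path
        ends exactly at 1 (equivalently, Dirichlet-distributed increments)."""
        increments = rng.gamma(shape / n_steps, 1.0, size=n_steps)
        path = np.concatenate(([0.0], np.cumsum(increments)))
        return path / path[-1]

    curve = gamma_bridge(78)            # e.g. 5-minute bins over one trading day
    print("relative volume at mid-day:", round(curve[39], 3))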

Model-Free Leverage Effect Estimators: A Horse Race at High Frequency
Dacheng Xiu
Chicago

Abstract

We propose a new nonparametric estimator of the leverage effect of the S&P 500 index, which uses data on the S&P 500 index as well as the VIX. The theoretical justification of our estimator exploits the relationship between the VIX and the spot variance of the S&P 500. We show that the naive estimator is inconsistent in general. We derive the asymptotic distribution of our new estimator and show good numerical properties in simulations. As a benchmark, we conduct an asymptotic analysis of alternative nonparametric estimators that use stock prices alone. This situation is more challenging because of the necessity of estimating the unobservable spot variance. We demonstrate in theory and in simulations that over short periods of time, such as one week, the price-only estimators have disastrous properties, even though they perform well over longer time periods such as 5 years. Finally, we provide a weekly time series of the leverage effect from 2003 to 2012, which peaks during the recent financial crisis. Joint work with Ilze Kalnina.
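
A bare-bones sketch of the VIX-based idea: proxy the spot variance by the squared VIX and estimate the leverage effect over a window as the correlation between index returns and changes in that proxy. The simulated inputs stand in for real (S&P 500, VIX) data, and the paper’s bias corrections and asymptotic theory are omitted.

    import numpy as np

    rng = np.random.default_rng(8)
    T, rho = 1000, -0.7                      # sample size and true leverage

    # simulate correlated return / variance-change pairs as a stand-in for
    # (S&P 500 returns, changes in the squared-VIX spot-variance proxy)
    z1 = rng.standard_normal(T)
    z2 = rho * z1 + np.sqrt(1 - rho ** 2) * rng.standard_normal(T)
    returns = 0.01 * z1
    d_var = 0.001 * z2

    def leverage_effect(returns, d_var):
        """Correlation between returns and spot-variance changes (VIX^2 proxy)."""
        return np.corrcoef(returns, d_var)[0, 1]

    print("estimated leverage effect:", round(leverage_effect(returns, d_var), 3))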

Real-time Stochastic Volatility Estimation via Filtering for a Partially-observed Heston Model
Yong Zeng
UMKC

Abstract

This talk first briefly reviews a general partially-observed framework of Markov processes with marked point process observations recently proposed for ultra-high frequency data; the related posterior distribution and filtering equation, which is a stochastic partial differential equation (SPDE) with a recursive structure; and Bayes Estimation via Filtering Equation (BEFE). In recent years, Graphics Processing Units (GPUs) have evolved from rendering graphics (linear algebra-like computations) for electronic games and video applications into low-cost, energy-efficient supercomputing units. With an eye toward harnessing this newly available high-performance computing power, and targeting a Heston stochastic volatility model, we develop a new, easily parallelized, uniformly consistent, recursive algorithm via BEFE for propagating and updating the joint posterior distributions. We show that the recursive algorithm is well suited to GPU parallel computing. We present simulation and empirical results obtained on supercomputers to demonstrate that the recursive algorithm works. Real-time tracking of stochastic volatility is thus made possible. This is joint work with Brent Bundick of Boston College.
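
The flavour of the recursive algorithm can be conveyed by a grid-based Bayes filter: the posterior of a discretized Heston variance is propagated through an (Euler-approximated) transition kernel and updated with each return observation, and the grid-wise operations are exactly the kind that map well onto a GPU. The parameters, grid, and Euler simplification below are illustrative, not the marked-point-process framework of the talk.

    import numpy as np

    rng = np.random.default_rng(9)
    kappa, theta, xi, dt = 3.0, 0.04, 0.3, 1.0 / 252

    # simulate a Heston-type variance path and the observed returns (Euler scheme)
    T = 300
    v = np.empty(T); v[0] = theta
    r = np.zeros(T)
    for t in range(1, T):
        v[t] = max(v[t-1] + kappa * (theta - v[t-1]) * dt
                   + xi * np.sqrt(v[t-1] * dt) * rng.standard_normal(), 1e-6)
        r[t] = np.sqrt(v[t] * dt) * rng.standard_normal()

    # grid-based recursive Bayes filter for the latent variance
    grid = np.linspace(1e-4, 0.2, 200)
    mean = grid + kappa * (theta - grid) * dt          # Euler transition mean
    std = xi * np.sqrt(grid * dt)                      # Euler transition std
    trans = np.exp(-0.5 * ((grid[:, None] - mean[None, :]) / std[None, :]) ** 2)
    trans /= trans.sum(axis=0, keepdims=True)          # column-stochastic kernel

    post = np.full(grid.size, 1.0 / grid.size)
    est = np.empty(T); est[0] = theta
    for t in range(1, T):
        prior = trans @ post                           # propagate (parallelizable)
        like = np.exp(-0.5 * r[t] ** 2 / (grid * dt)) / np.sqrt(grid * dt)
        post = prior * like                            # update with the return
        post /= post.sum()
        est[t] = post @ grid                           # posterior-mean estimate

    print("RMSE of filtered variance:", np.sqrt(np.mean((est - v) ** 2)))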

The Stevanovich Center is supported by the generous philanthropy of University of Chicago Trustee Steve G. Stevanovich, AB ’85, MBA ’90.