Weekly Seminars


 August 26, 2011

High Frequency Trading in Electronic Markets: Implications for Public Policy

Albert S. “Pete” Kyle                                                                           University of Maryland

October 14, 2011

Questions and Observations on Mathematical Models in Finance

Pierre-Louis Lions                                                                               Collège de France
Ecole Polytechnique

October 20, 2011

On Implied Volatility for Options — Some Reasons to Smile and to Correct

Songxi Chen                                                                                     Peking University, Guanghua School of Business
Iowa State University

We analyze the properties of the implied volatility obtained by inverting a single option price via the Black-Scholes formula, which is the commonly used volatility estimator for option data.  We show that the implied volatility is subject to a systematic bias in the presence of pricing errors.  The impact of these errors can be significant even for at-the-money short-maturity options.  We propose a kernel-smoothing-based implied volatility estimator and demonstrate that it can automatically correct for the pricing errors.  The S&P 500 options data are intensively analyzed to demonstrate the approach.  This is joint work with a graduate student, Zheng Xu, Department of Statistics, Iowa State University.
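The single-price implied volatility the abstract starts from is obtained by inverting the Black-Scholes formula numerically. A minimal sketch of that inversion, using bisection on the call price (which is monotone in volatility), is below; the function names and tolerances are illustrative, not from the talk.

```python
import math

def norm_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    # Black-Scholes price of a European call.
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

def implied_vol(price, S, K, T, r, lo=1e-6, hi=5.0, tol=1e-8):
    # Invert Black-Scholes by bisection: the call price is strictly
    # increasing in sigma, so the bracket [lo, hi] shrinks to the root.
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if bs_call(S, K, T, r, mid) < price:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)
```

Note that this inversion treats the observed price as exact; the abstract's point is precisely that a pricing error in `price` propagates into a systematic bias in the recovered volatility.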

October 27, 2011

Multiperiod Corporate Default Prediction — A Forward Intensity Approach

Jin-Chuan Duan                                                                                   National University of Singapore, Risk Management Institute

A forward intensity model for the prediction of corporate defaults over different future periods is proposed.  Maximum pseudo-likelihood analysis is then conducted on a large sample of US industrial and financial firms spanning the period 1991-2010 on a monthly basis.  Several commonly used factors and firm-specific attributes are shown to be useful for prediction at both short and long horizons.  Our implementation also factors in momentum in some variables and documents their importance in default prediction.  The prediction is very accurate for shorter horizons.  The accuracy deteriorates somewhat when the horizon is increased to two or three years, but its performance still remains reasonable.  The forward intensity model is also amenable to aggregation, which allows for an analysis of default behavior at the portfolio and/or economy level.  (Joint work with Jie Sun and Tao Wang.)
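In a forward intensity framework of the broad kind described here, a multiperiod default probability compounds one-period survival probabilities built from the forward intensities. A minimal sketch under a piecewise-constant-intensity assumption (the function name and monthly grid are illustrative, not the authors' specification):

```python
import math

def cumulative_default_prob(forward_intensities, dt=1.0 / 12.0):
    # Given forward default intensities h_1, ..., h_m, one per period of
    # length dt (here: months), survival over the horizon is the product
    # of per-period survival probabilities exp(-h_k * dt), and the
    # cumulative default probability is its complement.
    survival = 1.0
    for h in forward_intensities:
        survival *= math.exp(-h * dt)
    return 1.0 - survival
```

For a constant intensity h over m periods this reduces to 1 - exp(-h * m * dt), the familiar exponential-default formula; the model's content lies in letting each forward intensity depend on common factors and firm-specific attributes.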

October 31, 2011

Detecting Financial Bubbles in Real Time

Philip Protter                                                                                       Columbia University

After the 2007 credit crisis, financial bubbles have once again emerged as a topic of current concern.  An open problem is to determine in real time whether or not a given asset’s price process exhibits a bubble.  To do this, one needs to use a mathematical theory of bubbles, which we have recently developed and will briefly explain.  The theory uses the arbitrage-free martingale pricing technology.  This allows us to answer this question based on the asset’s price volatility.  We limit ourselves to the special case of a risky asset’s price being modeled by a Brownian driven stochastic differential equation.  Such models are ubiquitous both in theory and in practice.  Our methods use sophisticated volatility estimation techniques combined with the method of reproducing kernel Hilbert spaces.  We illustrate these techniques using several stocks from the alleged internet dot-com episode of 1998 – 2001, where price bubbles were widely thought to have existed.  Our results support these beliefs.  We then consider the special case of the recent IPO of LinkedIn.  The talk is based on several joint works with Robert Jarrow, Kazuhiro Shimbo, and Younes Kchia.

November 7, 2011

The Role of Mathematics in Financial Engineering

Pierre-Louis Lions                                                                               Ecole Polytechnique
Collège de France

November 10, 2011

The Leverage Effect Puzzle: Disentangling Sources of Bias in High Frequency Inference

Yacine Aït-Sahalia                                                                               Princeton University

The leverage effect refers to the generally negative correlation between an asset’s return and changes in its volatility.  A natural estimate uses the empirical correlation between the daily returns and the changes of daily volatility estimated from high-frequency data.  The puzzle lies in the fact that such an intuitively natural estimate yields nearly zero correlation for most assets tested, despite the many economic reasons for expecting the estimated correlation to be negative.  To better understand the sources of the puzzle, we analyze the different asymptotic biases involved in high frequency estimation of the leverage effect, including biases due to discretization errors, to smoothing errors in estimating spot volatilities, to estimation error, and to market microstructure noise.  This decomposition enables us to propose a bias correction method for estimating the leverage effect.  (Joint work with Jianqing Fan and Yingying Li.)
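The "natural estimate" the abstract starts from can be written down in a few lines: the sample correlation between daily returns and day-over-day changes in an estimated daily volatility series. A minimal sketch (function name illustrative; no bias correction, which is the talk's actual contribution):

```python
import math

def natural_leverage_estimate(returns, vols):
    # Sample correlation between daily returns r_t and volatility
    # changes dv_t = v_{t+1} - v_t, where vols holds estimated daily
    # volatilities (e.g. from high-frequency data).
    dv = [vols[t + 1] - vols[t] for t in range(len(vols) - 1)]
    r = returns[:len(dv)]
    n = len(dv)
    mr = sum(r) / n
    mv = sum(dv) / n
    cov = sum((a - mr) * (b - mv) for a, b in zip(r, dv)) / n
    sr = math.sqrt(sum((a - mr) ** 2 for a in r) / n)
    sv = math.sqrt(sum((b - mv) ** 2 for b in dv) / n)
    return cov / (sr * sv)
```

On real data this naive estimator is exactly where the puzzle appears: discretization, spot-volatility smoothing, and microstructure noise each bias it toward zero.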

November 10, 2011

Complex Trading Mechanisms

Patricia Lassus                                                                                            geodesiXs

I will discuss mechanisms with a richer message space for bidders/traders, which allow them to express conditions on the size of the overall auction/trade they participate in, or on the price impact of their bid/order.  These mechanisms can be used in a one-shot auction (e.g. for corporate or government debt underwriting) or on a continuous trading platform (e.g. for trading equities, bonds, or other asset classes).

November 11, 2011

Searching for Outperformance: Myth or Reality?

Olivier Scaillet                                                                                     Swiss Finance Institute
Université de Genève


November 22, 2011

On Trivial and Non-trivial Optimal Barrier Solutions of the Dividend Problem for a Diffusion under Constant and Proportional Transaction Cost

Lihua Bai                                                                                                   Nankai University


December 1, 2011

Implied Volatility Smirk under Asymmetric Dynamics

José Santiago Fajardo Barbachan                                                      FGV

In this paper, focusing on Lévy processes with exponential dampening controlling the skewness, we obtain a result that allows us to relate the implied volatility skew to the asymmetric dynamics of the underlying.  Moreover, with this result in mind, we propose alternative specifications for the implied volatility and test them using S&P 500 options data, obtaining a very good fit.  Although more general data-generating processes, including stochastic volatility models, exist in the literature, by focusing on a particular class we can gain more insight into how this particular process generates the skew.  More exactly, the market symmetry parameter is deeply connected with the risk-neutral excess kurtosis, which allows us to relate the risk-neutral skewness and kurtosis to the implied volatility skew.

January 19, 2012

About Microstructure Noise: A Statistical Approach

Jean Jacod                                                                                           Université Paris VI


February 2, 2012

On Volatility Matrix Estimation in a Multivariate Semimartingale Model with Microstructure Noise

Markus Bibinger                                                                                  Humboldt-Universität zu Berlin

We consider a multivariate discretely observed semimartingale corrupted by microstructure noise and aim at estimating the (co)volatility matrix.  A concise insight into state-of-the-art approaches for integrated covolatility estimation in the presence of noise and non-synchronous observation schemes will be provided to reveal some intrinsic fundamental features of the statistical model and strategies for estimation.  In an idealized simplified model an asymptotic equivalence result gives rise to a new local parametric statistical approach attaining asymptotic efficiency.  We highlight that a multivariate local likelihood approach allows for efficiency gains for integrated volatility estimation by the information contained in correlated components.

February 9, 2012

The Estimation of the Leverage Effect with High-Frequency Data

D. Christina Wang                                                                                       The University of Chicago

The leverage effect has become an extensively studied phenomenon, describing the (usually) negative relation between stock returns and their volatility.  Although this characteristic of stock returns is well acknowledged, most studies of the phenomenon are based on cross-sectional calibration with parametric models.  On the statistical side, most previous work is over daily or longer return horizons, and usually does not specify what parameter is being studied.  In this talk, we provide non-parametric estimation for a class of stochastic measures of leverage effect.  The theory covers both the cases with and without microstructure noise, and studies the statistical properties of the estimators when the log price process is a quite general continuous semimartingale.  Volatility is allowed to be stochastic, and our asymptotics reflect a high-frequency data sampling regime.  The consistency and limit distribution of the estimators are derived, and simulation results are presented which corroborate the asymptotic properties.  This estimator also provides the opportunity to study high-frequency regression, which leads to the prediction of volatility using not only previous volatility but also the leverage effect.  The work also shows a theoretical connection between skewness and leverage effect, which further leads to the prediction of skewness.  Furthermore, adopting similar ideas, it is easy to extend the study to other important aspects of stock returns, such as the volatility of volatility.

March 8, 2012

Parametric Inference, Testing and Dynamic State Recovery from Option Panels with Fixed Time Span

Viktor Todorov                                                                                    Northwestern University

We develop a new parametric estimation procedure for option panels observed with error.  Our inference techniques exploit asymptotic approximations under the assumption of an ever increasing set of observed option prices in the moneyness-maturity (cross-sectional) dimension, but with a fixed time span.  The framework allows for considerable heterogeneity over time in the quality of the information inherent in the option data.  We develop consistent estimators of the parameter vector as well as the dynamic realization of the state vector that governs the option price dynamics.  We show that the estimators converge stably to a mixed-Gaussian law and provide feasible estimators for the limiting covariance matrix.  We also provide feasible semiparametric tests for the option price dynamics based on the distance between the diffusive (stochastic) volatility state extracted from the options and the one obtained nonparametrically from high-frequency return data for the underlying asset.  In addition, we construct formal tests for the fit of the option pricing model for a specific region of the volatility surface over a given time period as well as for the stability of the risk-neutral dynamics, or parameter vector, over time.  In an empirical application to S&P 500 index options we extend the double-jump stochastic volatility model of Duffie, Pan and Singleton (2000), popular in option pricing applications, to allow for time-varying risk premia of extreme events, i.e., jumps, as well as a more flexible relation between the risk premia and the level of risk.  We show that both extensions provide a significantly improved characterization, both statistically and economically, of observed option prices.

April 5, 2012

Mixed Frequency Vector Autoregressive Models

Eric Ghysels                                                                                               University of North Carolina at Chapel Hill   

Many time series are sampled at different frequencies.  When we study co-movements between such series we usually analyze the joint process sampled at a common low frequency.  This has consequences in terms of potentially mis-specifying the co-movements and hence the analysis of impulse response functions, a commonly used tool for economic policy analysis.  We introduce a class of mixed frequency VAR models that allows us to measure the impact of high frequency data on low frequency data and vice versa.  Our approach does not rely on latent processes/shocks representations.  As a consequence, the mixed frequency VAR is an alternative to commonly used state space models for mixed frequency data.  State space models involve latent processes, and therefore rely on filtering to extract hidden states that are used in order to predict future outcomes; hence, they are parameter-driven models.  Mixed frequency VAR models, by contrast, are observation-driven models, as they are formulated exclusively in terms of observable data and do not involve latent processes, and thus avoid the need to formulate measurement equations, filtering, etc.  We also explicitly characterize the mis-specification of a traditional common low frequency VAR and its implied mis-specified impulse response functions.  The class of mixed frequency VAR models can also characterize the timing of information releases for a mixture of sampling frequencies and the real-time updating of predictions caused by the flow of high frequency information.  We also propose various parsimonious parameterizations, in part inspired by recent work on MIDAS regressions.  Various estimation procedures for mixed frequency VAR models are proposed, both classical and Bayesian.  Numerical and empirical examples quantify the consequences of ignoring mixed frequency data.

April 20 – 22, 2012

Workshop on Functional Programming in Quantitative Finance

Organizer: Niels Nygaard


May 3 and 4, 2012

Conference on Asymptotics in Finance

Organizers: Henri Berestycki and Roger Lee


May 10, 2012

Conference on Macroeconomic Fragility

Organized by Becker Friedman Institute


May 11, 2012

Time Stepping and Numerical Sensitivity Analysis for SDE

Jonathan Goodman                                                                                 New York University, Courant Institute of Mathematical Sciences

This talk discusses some computational issues related to stochastic differential equations (SDEs).  A specific error measure is necessary to design and compare computational methods for SDEs.  We discuss the microscopic total variation, which measures the L1 error in the joint PDF of all the time step values.  This is a path measure (unlike weak error) and is independent of a coupling (unlike strong error).  We also consider the coupling distance between the exact and approximate joint PDFs.  This is closer to strong error in spirit and in results, but also allows methods with no natural coupling.

Suppose Ω(θ) is a smooth domain that depends on parameters θ.  Let Xt satisfy an SDE and let f(θ) = E[V(Xτ)], where τ = τ(θ) is the hitting time of ∂Ω(θ).  We discuss a good Monte Carlo estimator of ∇θf(θ).  This is used in stochastic optimization for computing optimal stopping rules.  We present computational experiments showing that an affine invariant (in θ space) version of Robbins-Monro can be more effective than the original method or some other variants.
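For readers unfamiliar with the Robbins-Monro iteration the abstract mentions: it drives a parameter to a root of an expectation using only noisy evaluations, with a decaying step size a/n. A minimal sketch on a toy problem (the function names, step-size schedule, and test problem are illustrative assumptions, not the speaker's affine invariant variant):

```python
import random

def robbins_monro(noisy_grad, x0, steps=5000, gain=1.0):
    # Classic Robbins-Monro stochastic approximation:
    #   x_{n+1} = x_n - (gain / n) * g_n,
    # where g_n is an unbiased but noisy estimate of the gradient
    # (or, more generally, of a function whose root is sought).
    x = x0
    for n in range(1, steps + 1):
        x -= (gain / n) * noisy_grad(x)
    return x
```

Run on the quadratic objective E[(x - 3)^2]/2, whose noisy gradient is (x - 3) plus Gaussian noise, the iterates converge toward 3; the talk's point is that an affine invariant reparameterization of this scheme can outperform the plain version.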

May 17, 2012

On Some Conformally Invariant Fully Nonlinear Equations

Yanyan Li


May 24, 2012

Realized Copula

Ostap Okhrin                                                                                         Humboldt-Universität zu Berlin

We introduce the notion of realized copula.  Based on assumptions of the marginal distributions of daily stock returns and a copula family, realized copula is defined as the copula structure materialized in realized covariance estimated from within-day high-frequency data.  Copula parameters are estimated in a method-of-moments type of fashion through Hoeffding’s lemma.  Applying this procedure day by day gives rise to a time series of copula parameters that is suitably approximated by an autoregressive time series model.  This allows us to capture time-varying dependency in our framework.  Studying a portfolio risk-management application, we find that time-varying realized copula is superior to standard benchmark models in the literature. (With Matthias R. Fengler.)
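The realized covariance input that the abstract builds on is itself simple to compute: it sums outer products of synchronized within-day return vectors. A minimal sketch (function name illustrative; the copula-matching step through Hoeffding's lemma is not shown):

```python
def realized_cov_matrix(intraday_returns):
    # Realized (co)variance matrix from a day's synchronized intraday
    # return vectors: the sum over intervals k of the outer product
    # r_k r_k'. Entry [i][j] estimates the integrated covariance of
    # assets i and j over the day.
    d = len(intraday_returns[0])
    rc = [[0.0] * d for _ in range(d)]
    for r in intraday_returns:
        for i in range(d):
            for j in range(d):
                rc[i][j] += r[i] * r[j]
    return rc
```

Applying this day by day yields the daily covariance series from which, per the abstract, a copula parameter is backed out each day in a method-of-moments fashion.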

June 4 – 6, 2012

Conference on Matching Problems: Economics meets Mathematics

(Joint with the Becker Friedman Institute)

Organizers: Robert McCann, Pierre-Andre Chiappori, Scott Duke Kominers
