Weekly Seminars

2013 – 2014

October 17, 2013   Thursday
4:00 PM
5727 S. University Ave, Room 112

Haakon Gjessing

Norwegian Institute of Public Health

Statistical Efficiency, Design and Correction for Multiple Testing in Genetic Association Studies

Abstract
Statistical power is central to all planning of studies searching for disease genes, particularly since multiple testing is often a serious problem.  The available resources for data collection and genotyping are limited, and the problem is to choose a combination of family data and independent controls that gives maximal power for a given total number of individuals.  We show that relative efficiency is a much simpler concept to work with than power in this setting, how the change-of-variance function can be used to select among designs, and how an extended score test can be used to minimize the efficiency loss in multiple testing.
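
As a toy illustration of why relative efficiency is simpler to work with than power (my own sketch, not from the talk, with hypothetical effect sizes and design variances): power depends jointly on the sample size, the effect size, and the multiplicity-corrected level, whereas the relative efficiency of two designs is a single variance ratio.

```python
# Toy comparison (hypothetical numbers throughout): power of a two-sided
# z-test under Bonferroni correction, versus the relative efficiency of two
# designs, which is one ratio independent of the level and sample size.
from scipy.stats import norm

def power(theta, v, n, alpha):
    """Power of a two-sided z-test for effect theta with estimator variance v/n."""
    z = norm.ppf(1 - alpha / 2)
    se = (v / n) ** 0.5
    return norm.cdf(theta / se - z) + norm.cdf(-theta / se - z)

theta = 0.2             # hypothetical effect size (e.g., a log odds ratio)
vA, vB = 1.0, 1.4       # hypothetical per-subject variances of two designs
alpha = 0.05 / 500_000  # Bonferroni-corrected level for a genome-wide scan

for n in (1000, 2000, 4000):   # power varies nonlinearly with n and alpha ...
    print(n, round(power(theta, vA, n, alpha), 3),
             round(power(theta, vB, n, alpha), 3))

# ... while the relative efficiency of design A to design B is one number:
# design B needs vB/vA times as many subjects to match design A at any power.
print("relative efficiency:", vB / vA)
```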

October 24, 2013   Thursday
12:00 PM
5727 S. University Ave, Room 112

Eric Ghysels

University of North Carolina, Chapel Hill

Estimating Volatility Risk Factors Using Large Panels of Filtered or Realized Volatilities


November 7, 2013   Thursday
4:00 PM
5727 S. University Ave, Room 112

David Li

American International Group (AIG)

A Transformed Copula Function Approach to Credit Portfolio Modeling

Abstract
We present a fundamental modification to the popular copula function approach to credit portfolio modeling introduced by Li (2000).  The original approach simply uses a copula function to create a joint survival time distribution whose individual survival time distributions are already risk-neutralized and given from a single-name perspective only.  Based on Bühlmann’s equilibrium pricing model (1980), under certain assumptions on the aggregate risk, or on the multivariate Esscher and Wang transforms, we find that the covariance between each individual risk and the market or aggregate risk should be included in the measure change.  In the Gaussian copula model it is shown that we simply need to adjust the asset return by subtracting a term associated with the covariance risk.

This discovery allows us to theoretically link credit portfolio modeling with classical equity portfolio modeling in the CAPM setting, and can help solve some practical problems encountered in credit portfolio modeling.
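
To fix ideas, here is a minimal sketch of the one-factor Gaussian copula of Li (2000) with a covariance-type mean adjustment of the kind the abstract describes; the market-price-of-risk parameter and the exact form of the adjustment are hypothetical, not taken from the talk.

```python
# One-factor Gaussian copula for correlated default times, with a
# hypothetical covariance adjustment: the latent asset return is shifted by
# a term tied to its covariance with the common market factor.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n_names, n_sims = 100, 50_000
rho = 0.3            # asset correlation with the common market factor
hazard = 0.02        # flat risk-neutral hazard rate for every name
lam = 0.1            # hypothetical market price of aggregate risk

M = rng.standard_normal((n_sims, 1))            # market factor
Z = rng.standard_normal((n_sims, n_names))      # idiosyncratic factors
X = np.sqrt(rho) * M + np.sqrt(1 - rho) * Z     # latent asset returns

# Hypothetical adjustment: subtract lam * Cov(X_i, M) = lam * sqrt(rho)
X_adj = X - lam * np.sqrt(rho)

# Map latent variables to default times via the exponential marginal
U = norm.cdf(X_adj)
tau = -np.log(1 - U) / hazard

# Portfolio loss distribution over a 5-year horizon
defaults = (tau < 5.0).sum(axis=1)
print("mean defaults:", defaults.mean(),
      " 99% quantile:", np.quantile(defaults, 0.99))
```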


November 14, 2013   Thursday
4:00 PM
5727 S. University Ave, Room 112

Qiying Wang

University of Sydney

On the asymptotics of Nadaraya-Watson estimator:  Toward a unified approach

Abstract
This paper investigates the asymptotics of the Nadaraya-Watson estimator, providing a framework and a unified approach for stationary and non-stationary time series. It also establishes an extension to multivariate regression with non-stationary time series and provides a brief overview of the asymptotic results in relation to non-linear cointegrating regression.
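
For reference, the estimator itself is m_hat(x) = sum_i K((x_i - x)/h) y_i / sum_i K((x_i - x)/h); a minimal self-contained sketch with a Gaussian kernel:

```python
# Nadaraya-Watson kernel regression with a Gaussian kernel.
import numpy as np

def nadaraya_watson(x_grid, x, y, h):
    """Kernel regression estimate of E[y | x] at the points in x_grid."""
    # Pairwise kernel weights, shape (len(x_grid), len(x))
    w = np.exp(-0.5 * ((x_grid[:, None] - x[None, :]) / h) ** 2)
    return (w * y).sum(axis=1) / w.sum(axis=1)

rng = np.random.default_rng(1)
x = rng.uniform(0, 2 * np.pi, 500)
y = np.sin(x) + 0.3 * rng.standard_normal(500)    # noisy regression data
grid = np.linspace(0, 2 * np.pi, 100)
m_hat = nadaraya_watson(grid, x, y, h=0.3)
print(np.max(np.abs(m_hat - np.sin(grid))))       # uniform estimation error
```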

November 19, 2013   Tuesday
5:00 PM
5727 S. University Ave, Room 112

Bruno Dupire

Head of Quantitative Research, Bloomberg

Functional Ito Calculus and Financial Applications

Abstract
We briefly present the Functional Ito Calculus, which gives a natural setting for defining the Greeks of path-dependent options and a generalized PDE for their price, even in the case of non-Markov dynamics. It leads to a variational calculus on volatility surfaces and a fine decomposition of the volatility risk, as well as links with super-replication strategies. We examine a few practical examples and analyze whether some popular structures can be hedged.
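
To convey the flavor of the "vertical" (space) derivative central to this calculus, here is a sketch (my illustration, not Dupire's construction) of a functional delta for an arithmetic Asian call: bump the current value of the path while freezing its observed history, and difference two Monte Carlo prices computed with common random numbers.

```python
# Finite-difference "functional delta" for an arithmetic Asian call:
# sensitivity of the price to a bump in the current spot, holding the
# already-observed fixings fixed.
import numpy as np

rng = np.random.default_rng(2)
sigma, r, K = 0.2, 0.0, 100.0
n_dates, k_past = 12, 4            # monthly fixings; 4 already observed
dt = 1.0 / 12
past_sum = 4 * 101.0               # hypothetical sum of the observed fixings

# Common random numbers so the bumped and unbumped prices share one sample
z = rng.standard_normal((200_000, n_dates - k_past))
steps = np.exp((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z)

def price(s_now):
    """MC price of the Asian call given the frozen history and current spot."""
    fixings = s_now * np.cumprod(steps, axis=1)          # future fixings
    avg = (past_sum + fixings.sum(axis=1)) / n_dates
    return np.exp(-r * dt * (n_dates - k_past)) * np.maximum(avg - K, 0).mean()

s, eps = 100.0, 0.5
delta = (price(s + eps) - price(s - eps)) / (2 * eps)    # functional delta
print("functional delta:", delta)
```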

January 7, 2014   Tuesday
4:00 PM
5727 S. University Ave, Room 112

Maryam Farboodi

The University of Chicago

Intermediation and Voluntary Exposure to Counterparty Risk

Abstract
I develop a model of the financial sector in which intermediation among banks interacts with the debt nature of bank liabilities to generate excessive systemic risk.  The central idea is to explore the possibility that certain financial institutions can use their lending and borrowing decisions to tilt the division of surplus in their own favor by capturing intermediation spreads, even if the implied change in the structure of the financial system hurts the total surplus of the economy.  The paper predicts that there is excessive exposure among banks that make risky investments and too little exposure among those that mainly provide funding.  Inefficiency arises because the financial institutions that intermediate among other institutions are exposed to excessive counterparty risk: replacing them with certain other banks would mitigate the extent of failure, when failure is inevitable, without hurting the optimal level of investment.  However, the equilibrium intermediators choose to overexpose themselves to other risky banks, and suffer the cost of failure through contagion, if they can absorb enough rents when they survive.

January 16, 2014   Thursday
4:00 PM
5727 S. University Ave, Room 112

Christina Dan Wang

Princeton University

The Estimation of Leverage Effect in High Frequency Data


January 23, 2014   Thursday
4:00 PM
5727 S. University Ave, Room 112

Marcelo Alvisio

The University of Chicago

Option Pricing using Perturbation Methods 


January 30, 2014   Thursday
4:00 PM
5727 S. University Ave, Room 112

Jean Jacod

Université Paris VI

Backward Stochastic Differential Equations driven by Point Processes: An Elementary Approach


February 17, 2014   Monday
4:00 PM
Eckhart Hall, Room 133
joint with Department of Statistics

Mathieu Rosenbaum

Université Paris VI and École Polytechnique

Limit Theorems for nearly unstable Hawkes Processes

Abstract
Because of their tractability and their natural interpretation in terms of market quantities, Hawkes processes are nowadays widely used in high-frequency finance.  In practice, however, statistical estimation results seem to show that very often only nearly unstable Hawkes processes are able to fit the data properly.  By nearly unstable, we mean that the L1 norm of their kernel is close to unity.  In this work we study such processes, for which the stability condition is almost violated.

Our main result states that, after suitable rescaling, they asymptotically behave like integrated Cox-Ingersoll-Ross models.  Thus, modeling financial order flows as nearly unstable Hawkes processes may be a good way to reproduce both their high- and low-frequency stylized facts.  We then extend this result to the Hawkes-based price model introduced by Bacry et al.  We show that under a similar criticality condition, this process converges to a Heston model.  Again, we recover well-known stylized facts of prices, both at the microstructure level and at the macroscopic scale.  (Joint work with Thibault Jaisson, École Polytechnique, Paris.)
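
A minimal simulation sketch (standard Ogata thinning, not code from the paper) makes the near-instability concrete: with an exponential kernel phi(t) = a * beta * exp(-beta * t), the L1 norm of the kernel is a, and the mean event rate mu / (1 - a) blows up as a approaches 1.

```python
# Ogata thinning for a Hawkes process with exponential kernel.
import numpy as np

rng = np.random.default_rng(3)

def simulate_hawkes(mu, a, beta, horizon):
    """Event times of a Hawkes process with baseline rate mu on [0, horizon]."""
    events, t, exc = [], 0.0, 0.0   # exc = lambda(t) - mu, decays between events
    while True:
        lam_bar = mu + exc          # valid bound: intensity can only decay
        w = rng.exponential(1.0 / lam_bar)
        exc *= np.exp(-beta * w)    # decay the excitation over the waiting time
        t += w
        if t > horizon:
            return np.array(events)
        if rng.uniform() * lam_bar <= mu + exc:   # accept w.p. lambda(t)/lam_bar
            events.append(t)
            exc += a * beta         # each event kicks the intensity up by a*beta

mu, beta, horizon = 1.0, 2.0, 200.0
for a in (0.5, 0.9, 0.99):          # kernel L1 norm approaching instability
    n = len(simulate_hawkes(mu, a, beta, horizon))
    print(f"a={a}: {n} events (mean rate mu/(1-a) = {mu / (1 - a):.0f})")
```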


March 13, 2014   Thursday
12:00 PM
5727 S. University Ave, Room 112

Mathias Vetter

Philipps-Universität Marburg

Estimating the entire quadratic covariation in case of asynchronous observations

Abstract
In this talk we consider the estimation of the quadratic (co)variation of a semimartingale from discrete observations that are irregularly spaced, under high-frequency asymptotics. In the univariate setting, standard results are generalized to the case of irregular observations. In the two-dimensional setup with non-synchronous observations, we derive a stable central limit theorem for the Hayashi-Yoshida estimator in the presence of jumps. We reveal how idiosyncratic and simultaneous jumps affect the asymptotic distribution. Observation times generated by Poisson processes are discussed explicitly.
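
For concreteness, a minimal sketch of the Hayashi-Yoshida estimator (the jump and irregular-time asymptotics are the talk's contribution; the estimator itself is as below): sum the products of returns over all pairs of overlapping observation intervals, with no synchronization or interpolation of the two series.

```python
# Hayashi-Yoshida covariance estimator for non-synchronous observations.
import numpy as np

def hayashi_yoshida(t_x, x, t_y, y):
    """HY estimate of <X, Y> from irregular, non-synchronous observations."""
    dx, dy = np.diff(x), np.diff(y)
    hy = 0.0
    for i in range(len(dx)):
        # intervals (t_y[j], t_y[j+1]] overlapping (t_x[i], t_x[i+1]]
        overlap = (t_y[1:] > t_x[i]) & (t_y[:-1] < t_x[i + 1])
        hy += dx[i] * dy[overlap].sum()
    return hy

# Synthetic check: correlated Brownian motions sampled at random times
rng = np.random.default_rng(4)
n = 20_000
t = np.linspace(0, 1, n)
dt = t[1] - t[0]
dw = rng.standard_normal((2, n - 1)) * np.sqrt(dt)
b1 = np.concatenate([[0], np.cumsum(dw[0])])
b2 = np.concatenate([[0], np.cumsum(0.6 * dw[0] + 0.8 * dw[1])])  # <B1,B2> = 0.6t
idx_x = np.sort(rng.choice(n, 3000, replace=False))   # asynchronous sampling
idx_y = np.sort(rng.choice(n, 3000, replace=False))
print(hayashi_yoshida(t[idx_x], b1[idx_x], t[idx_y], b2[idx_y]))  # approx 0.6
```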

April 3, 2014   Thursday
4:00 PM
5727 S. University Ave, Room 112

Jian Sun

Morgan Stanley

Implied Remaining Variance in Derivative Pricing

Abstract
In this note, we give a way to calculate a swaption implied volatility curve in closed form via the well-known quadratic root formula.  The closed-form expression has 3 free parameters, which parsimoniously govern the assumed dynamics of implied volatility under the forward swap measure.  Preliminary empirical work suggests the curve fits the swaptions market well (though not perfectly).  Unlike previous models of stochastic implied volatility, the current model has no implications for the dynamics of instantaneous volatility.
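
The abstract does not reproduce the formula, so the following is only an illustrative stand-in under stated assumptions: suppose the model pins down the implied remaining variance w at log-moneyness z as the positive root of a quadratic A w^2 + B w + C = 0 whose coefficients depend on three parameters. The coefficient functions below are hypothetical, chosen only to show the shape of such a closed-form curve.

```python
# Illustrative three-parameter volatility curve from a quadratic root
# (hypothetical coefficient functions, not the talk's formula).
import numpy as np

def remaining_variance(z, p1, p2, p3):
    """Positive root of A*w^2 + B*w + C = 0 at log-moneyness z (illustrative)."""
    A = p1
    B = -(p2 + p3 * z**2)        # hypothetical coefficient functions
    C = -(z**2)
    disc = B**2 - 4 * A * C      # nonnegative here since A > 0 and C <= 0
    return (-B + np.sqrt(disc)) / (2 * A)

z = np.linspace(-0.5, 0.5, 11)           # log-moneyness grid
w = remaining_variance(z, p1=2.0, p2=0.04, p3=0.5)
T = 1.0                                   # time remaining, in years
print(np.sqrt(w / T))                     # implied volatility curve
```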

April 10, 2014   Thursday
4:00 PM
5727 S. University Ave, Room 112

Ilze Kalnina

Université de Montréal

Model-Free Leverage Effect Estimators at High Frequency

Paper


April 17, 2014   Thursday
12:00 PM
5727 S. University Ave, Room 112
joint with Department of Statistics

Philippe Rigollet

Princeton University

The Statistical Price to Pay for Computational Efficiency in Sparse PCA

Abstract
Computational limitations of statistical problems have largely been ignored or simply overcome by ad hoc relaxation techniques.  If optimal methods cannot be computed in reasonable time, what is the best possible statistical performance of a computationally efficient procedure?  Building on average-case reductions, we establish these fundamental limits in the context of sparse principal component analysis and quantify the statistical price to pay for computational efficiency.  Our results can be viewed as complexity-theoretic lower bounds, conditional on the assumption that some instances of the planted clique problem cannot be solved in randomized polynomial time.  [Joint work with Quentin Berthet.]
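
One concrete member of the computationally efficient class (a generic illustration, not the reduction argument of the talk) is the truncated power method for sparse PCA: power iterations on the sample covariance with hard thresholding to the k largest-magnitude coordinates at each step.

```python
# Truncated power method for sparse PCA on a spiked covariance model.
import numpy as np

def truncated_power(S, k, n_iter=200):
    """k-sparse leading eigenvector estimate of covariance matrix S."""
    p = S.shape[0]
    v = np.zeros(p)
    v[np.argsort(np.diag(S))[-k:]] = 1 / np.sqrt(k)   # diagonal-based start
    for _ in range(n_iter):
        w = S @ v
        keep = np.argsort(np.abs(w))[-k:]             # hard-threshold to k entries
        v = np.zeros(p)
        v[keep] = w[keep]
        v /= np.linalg.norm(v)
    return v

# Spiked model: x = sqrt(theta) * g * u + noise, with a k-sparse spike u
rng = np.random.default_rng(5)
n, p, k, theta = 300, 500, 10, 3.0
u = np.zeros(p); u[:k] = 1 / np.sqrt(k)
X = np.sqrt(theta) * rng.standard_normal((n, 1)) * u + rng.standard_normal((n, p))
S = X.T @ X / n
v = truncated_power(S, k)
print("overlap |<v, u>|:", abs(v @ u))   # close to 1 when the spike is strong
```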

May 5, 2014   Monday
4:00 PM
Eckhart Hall, Room 133
joint with Department of Statistics

Rainer Dahlhaus

Universität Heidelberg

Volatility Decomposition and Online Volatility-Estimation with Nonlinear Market Microstructure Noise Models

Abstract
A technique for online estimation of spot volatility for high-frequency data is developed. The method uses a price model with time shift in combination with a nonlinear market microstructure noise model. A benefit of the model is that it leads to an identifiable decomposition of spot volatility into spot volatility per transaction and the trading intensity, thus highlighting the influence of trading intensity on volatility. The online algorithm uses a computationally efficient particle filter. It works directly on the transaction data and updates the volatility estimate immediately after the occurrence of a new transaction. It also allows for the approximation of the unknown efficient prices. For volatility estimation a nonparametric recursive EM algorithm is used. We neither assume that the transaction times are equidistant nor do we use interpolated prices. For the theoretical investigation of the estimates we present a framework with infill asymptotics. [Joint work with Jan C. Neddermeyer and Sophon Tunyavetchakit.]
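
A generic bootstrap particle filter conveys the online flavor (this is a simplified stand-in with hypothetical parameters and additive Gaussian noise on returns, not the authors' nonlinear microstructure model or their recursive EM step):

```python
# Bootstrap particle filter for spot variance: log spot variance follows an
# AR(1), and each observed return is the efficient return plus noise.
import numpy as np

rng = np.random.default_rng(6)
phi, s_state = 0.98, 0.15          # AR(1) persistence and innovation sd
log_vbar = np.log(1e-4)            # long-run log spot variance
s_noise = 0.003                    # microstructure noise sd (hypothetical)

def step(particles, y):
    """One online update: propagate, reweight by the new return, resample."""
    particles = (log_vbar + phi * (particles - log_vbar)
                 + s_state * rng.standard_normal(len(particles)))
    # Simplification: return | variance ~ N(0, v + s_noise^2)
    var_y = np.exp(particles) + s_noise**2
    w = np.exp(-0.5 * y**2 / var_y) / np.sqrt(var_y)
    w /= w.sum()
    particles = particles[rng.choice(len(particles), len(particles), p=w)]
    return particles, np.exp(particles).mean()    # filtered spot variance

# Simulate transaction returns and filter them one at a time
T, n_part = 2000, 1000
h = np.empty(T); h[0] = log_vbar
for t in range(1, T):
    h[t] = log_vbar + phi * (h[t-1] - log_vbar) + s_state * rng.standard_normal()
y = np.exp(h / 2) * rng.standard_normal(T) + s_noise * rng.standard_normal(T)

particles = log_vbar + 0.5 * rng.standard_normal(n_part)
for t in range(T):
    particles, v_hat = step(particles, y[t])
print("final spot vol: estimate", v_hat**0.5, "truth", np.exp(h[-1] / 2))
```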

May 9, 2014   Friday
4:00 PM
at CME, joint with UIC

Neil Shephard

Harvard University

Econometric Analysis of Low Latency Financial Data

Event registration: http://www.cvent.com/d/z4qxnw


May 22, 2014   Thursday
4:00 PM
5727 S. University Ave, Room 112

Rudolf Beran

University of California, Davis

The Unbearable Transparency of Stein Estimation

Paper

Abstract
Charles Stein (1956) discovered that, under quadratic loss, the usual unbiased estimator for the mean vector of a multivariate normal distribution is inadmissible if the dimension p of the mean vector exceeds two.  It has since been claimed that Stein’s results and the subsequent James-Stein estimator are counter-intuitive, even paradoxical, and not very useful. In response to such doubts, various authors have presented alternative derivations of Stein shrinkage estimators.  Surely Stein himself did not find his results paradoxical.  This talk argues that assertions of “paradoxical” or “counter-intuitive” or “not practical” have overlooked essential arguments and remarks in Stein’s beautifully written 1956 paper.  Among these overlooked aspects are the asymptotic geometry of quadratic loss in high dimensions that makes Stein estimation transparent; the asymptotic optimality results that can be associated with Stein estimation; his explicit mention of practical multiple shrinkage estimators; and the foreshadowing of Stein confidence balls.  These ideas prove fundamental for studies of modern regularization estimators that rely on multiple shrinkage, whether implicitly or overtly.
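
The estimator at issue is easy to state and test numerically; a minimal sketch of the James-Stein estimator and the risk gap Stein discovered:

```python
# For Y ~ N_p(theta, I) with p > 2, shrinking toward the origin by a
# data-determined factor dominates the unbiased estimator Y under
# quadratic loss.
import numpy as np

rng = np.random.default_rng(7)
p, n_sims = 20, 100_000
theta = np.full(p, 0.5)                       # true mean vector
Y = theta + rng.standard_normal((n_sims, p))

# James-Stein: shrink Y toward 0 by 1 - (p - 2) / ||Y||^2
norm2 = (Y**2).sum(axis=1, keepdims=True)
js = (1 - (p - 2) / norm2) * Y

risk_mle = ((Y - theta)**2).sum(axis=1).mean()    # approx p = 20
risk_js = ((js - theta)**2).sum(axis=1).mean()    # strictly smaller for p > 2
print("risk of Y:", risk_mle, " risk of James-Stein:", risk_js)
```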

May 23, 2014   Friday
1:30 PM
5727 S. University Ave, Room 112

Rudolf Beran

University of California, Davis

Hypercube Estimators: Penalized Least Squares, Submodel Selection, and Numerical Stability

Abstract
Hypercube estimators for the mean vector in a general linear model include algebraic equivalents to penalized least squares estimators with multiple quadratic penalties and to submodel least squares estimators.  Penalized least squares estimators necessarily break down numerically for certain penalty matrices.  Equivalent hypercube estimators resist this source of numerical instability.  Under conditions, adaptation over a class of candidate hypercube estimators, so as to minimize estimated quadratic risk, also minimizes asymptotic risk under the general linear model.  Numerical stability of hypercube estimators assists trustworthy adaptation.  Hypercube estimators have broad applicability to any statistical methodology that involves penalized least squares.  Notably, they extend to general designs the risk reduction achieved by Stein’s (1966) multiple shrinkage estimators for balanced observations on a k-way array of means.
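
A minimal numerical sketch of the algebraic equivalence (my illustration of the setup as described, with a hypothetical roughness penalty): a quadratically penalized least squares fit of a mean vector is a linear map y -> A y with symmetric A whose eigenvalues lie in [0, 1], and the same estimator can be computed stably in the penalty's eigenbasis.

```python
# Penalized least squares for a mean vector and its hypercube form.
import numpy as np

rng = np.random.default_rng(8)
n = 50
# Hypothetical quadratic penalty: squared second differences (roughness)
D = np.diff(np.eye(n), n=2, axis=0)
lam = 100.0
W = lam * D.T @ D

# Penalized LS: minimize ||y - eta||^2 + eta' W eta  =>  eta_hat = A y
A = np.linalg.solve(np.eye(n) + W, np.eye(n))

eigs = np.linalg.eigvalsh((A + A.T) / 2)
print(eigs.min() >= 0, eigs.max() <= 1 + 1e-12)   # eigenvalues in [0, 1]

# The equivalent hypercube form applies shrinkage factors directly in the
# penalty's eigenbasis, which stays numerically stable as lam grows large.
vals, vecs = np.linalg.eigh(W)
shrink = 1.0 / (1.0 + vals)                       # factors in (0, 1]
y = np.sin(np.linspace(0, 3, n)) + 0.2 * rng.standard_normal(n)
eta_hat = vecs @ (shrink * (vecs.T @ y))
print(np.allclose(eta_hat, A @ y))                # same estimator, two routes
```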