Optimization-Conscious Econometrics Summer School

Topics at the Intersection of Econometrics, Causal Inference, and Optimization

About

The aim of the Summer School is to equip graduate students with the tools to carry out frontier research at the intersection of optimization and econometrics.

The OCE Summer School will be held on the University of Chicago campus June 3-6, 2024, and students are also expected to attend the OCE Conference III on June 7-8, 2024.

Students may apply for full funding to support accommodations and travel. Please contact event organizer Guillaume Pouliot with any questions at guillaumepouliot@uchicago.edu.

Application Requirements

The Summer School is designed for graduate students focusing on econometrics who want to strengthen their background in optimization. Qualified applicants should have completed their second-year field courses in econometrics.

Students are also expected to attend the OCE Conference III on June 7-8.

Registration

Fill out this form to apply to the Summer School and for funding to attend the OCE Summer School and Conference.

As part of the application form, student applicants will be required to submit their transcript and a short statement of purpose explaining how they expect the school to help in their future research.

Application Form

Speakers

Guillaume A. Pouliot

Assistant Professor at Chicago Harris

Guillaume Pouliot is an Assistant Professor at Chicago Harris. His research focuses on developing statistical methods for nonstandard problems in public policy and economics, extending machine learning methods for applications in public policy, and studying problems at the interface of econometrics and optimization.

Pouliot received his PhD from Harvard University. Previously, he received his B.A. (Honors) in economics as well as his M.S. (concurrent) in statistics from the University of Chicago.

Pierre E. Jacob

Professor of Statistics, ESSEC Business School

Pierre E. Jacob is a professor of statistics at ESSEC Business School, in Paris, France.

His research pertains to statistical inference and time series analysis. Jacob develops Monte Carlo methods to compare models or to estimate latent variables. At ESSEC he teaches courses on forecasting and statistics in general.

Jacob received his PhD from Université Paris Dauphine. Prior to joining ESSEC, he was an Associate Professor in the Department of Statistics at Harvard University.

Pepe Montiel

Associate Professor of Economics at Cornell University

Professor Montiel received his Ph.D. in Economics from Harvard University in 2013. He spent three years (2013-2016) as an Assistant Professor at New York University’s Department of Economics. Before moving to Cornell, he was an Assistant Professor at Columbia University’s Department of Economics for six years (2016-2022).

Jose Blanchet

Professor of Management Science and Engineering at Stanford University

Professor Blanchet is a faculty member in the Management Science and Engineering Department at Stanford University, where he earned his Ph.D. in 2004. Prior to joining the Stanford faculty, Jose was a professor in the IEOR and Statistics Departments at Columbia University (2008-2017), and before that he was a faculty member in the Statistics Department at Harvard University (2004-2008). Jose is a recipient of the 2009 Best Publication Award given by the INFORMS Applied Probability Society and of the 2010 Erlang Prize. He also received a PECASE award given by NSF in 2010.

Blaise Melly

Department of Economics, University of Bern

Blaise Melly is a Professor of Econometrics at the University of Bern. He obtained his Ph.D. from the University of St. Gallen and previously served as an Assistant Professor in the Economics Department at Brown University. His research focuses on developing microeconometric methods to analyze the impact of policy variables on outcome distributions. Additionally, he has applied interests in labor economics, particularly in economic heterogeneity and inequality.

OCE Summer School Schedule

June 3

Time Location Description Presenter
8:30 a.m. SHFE 201 (Grad Lounge) Breakfast
8:40 a.m. SHFE 203 Welcome and Introductory Remarks
9:00 – 10:30 a.m. SHFE 203 Lecture Guillaume Pouliot
10:30 – 11:00 a.m. SHFE 201 (Grad Lounge) Coffee Break
11:00 a.m. – 12:30 p.m. SHFE 203 Lecture Pierre E. Jacob
12:30 – 1:30 p.m. SHFE 201 (Grad Lounge) Lunch
1:30 – 3:00 p.m. SHFE 203 Lecture Pepe Montiel
3:00 – 3:30 p.m. SHFE 201 (Grad Lounge) Coffee Break
3:30 – 5:00 p.m. SHFE 203 Lecture Blaise Melly

June 4

Time Location Description Presenter
8:30 a.m. SHFE 201 (Grad Lounge) Breakfast
9:00 – 10:30 a.m. SHFE 203 Lecture Guillaume Pouliot
10:30 – 11:00 a.m. SHFE 201 (Grad Lounge) Coffee Break
11:00 a.m. – 12:30 p.m. SHFE 203 Lecture Pierre E. Jacob
12:30 – 1:30 p.m. SHFE 201 (Grad Lounge) Lunch
1:30 – 3:00 p.m. SHFE 203 Lecture Pepe Montiel
3:00 – 3:30 p.m. SHFE 201 (Grad Lounge) Coffee Break
3:30 – 5:00 p.m. SHFE 203 Lecture Blaise Melly

June 5

Time Location Description Presenter
8:30 a.m. SHFE 201 (Grad Lounge) Breakfast
9:00 – 10:30 a.m. SHFE 203 Lecture  Guillaume Pouliot
10:30 – 11:00 a.m. SHFE 201 (Grad Lounge) Coffee Break
11:00 a.m. – 12:30 p.m. SHFE 203 Lecture Pierre E. Jacob
12:30 – 1:30 p.m. SHFE 201 (Grad Lounge) Lunch
1:30 – 3:00 p.m. SHFE 203 Lecture Pepe Montiel
3:00 – 3:30 p.m. SHFE 201 (Grad Lounge) Coffee Break
3:30 – 5:00 p.m. SHFE 203 Lecture Jose Blanchet

June 6

Time Location Description Presenter
8:30 a.m. SHFE 201 (Grad Lounge) Breakfast
9:00 – 10:30 a.m. SHFE 203 Lecture Guillaume Pouliot
10:30 – 11:00 a.m. SHFE 201 (Grad Lounge) Coffee Break
11:00 a.m. – 12:30 p.m. SHFE 203 Lecture Pierre E. Jacob
12:30 – 1:30 p.m. SHFE 201 (Grad Lounge) Lunch
1:30 – 3:00 p.m. SHFE 203 Lecture Pepe Montiel
3:00 – 3:30 p.m. SHFE 201 (Grad Lounge) Coffee Break
3:30 – 5:00 p.m. SHFE 203 Lecture Jose Blanchet
5:00 p.m. SHFE 203 Closing Comments  


Curriculum

Guillaume Pouliot

Quantile Regression and Randomization Inference

Modern Introduction to Quantile Regression

We parallel the best linear predictor development of OLS à la Chamberlain, but for quantile regression. We consider interpretation, standard inference, and their pitfalls.

Quantile Regression Through the Lens of Linear Programming

We consider quantile regression (QR) as a linear program (LP). We use QR as an example to introduce key concepts in LP duality. We derive the key QR identity as a direct consequence of LP duality.
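
To make the LP formulation concrete, here is a minimal Python sketch (illustrative only, not course material) that casts the τ-th regression quantile as a linear program and hands it to an off-the-shelf solver; the simulated data and solver choice are arbitrary.

```python
# Quantile regression as a linear program: min_b sum_i rho_tau(y_i - x_i'b),
# rewritten with nonnegative slacks u, v so that y - X b = u - v.
import numpy as np
from scipy.optimize import linprog

def quantile_regression_lp(X, y, tau=0.5):
    n, p = X.shape
    # Decision vector: [beta (free), u >= 0, v >= 0]
    c = np.concatenate([np.zeros(p), tau * np.ones(n), (1 - tau) * np.ones(n)])
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])      # X beta + u - v = y
    bounds = [(None, None)] * p + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    return res.x[:p]

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
y = X @ np.array([1.0, 2.0]) + rng.standard_t(df=3, size=200)
print(quantile_regression_lp(X, y, tau=0.5))          # roughly [1.0, 2.0]
```
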
Randomization Inference

We consider exact tests based on invariance properties of test statistics and study their asymptotic robustness.

Modern Topics in Randomization Inference

We consider the connection between randomization inference and Fisher tests, as well as other advanced topics in randomization inference.
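
For intuition, a minimal sketch of a Fisher randomization test in a completely randomized experiment follows; the difference in means as test statistic and the simulated data are illustrative choices, not the course's.

```python
# Fisher randomization test of the sharp null of no treatment effect:
# outcomes are held fixed and the treatment assignment is re-randomized.
import numpy as np

def randomization_p_value(y, d, n_draws=10_000, seed=0):
    rng = np.random.default_rng(seed)
    observed = y[d == 1].mean() - y[d == 0].mean()
    draws = np.empty(n_draws)
    for b in range(n_draws):
        d_perm = rng.permutation(d)                    # re-randomized assignment
        draws[b] = y[d_perm == 1].mean() - y[d_perm == 0].mean()
    return np.mean(np.abs(draws) >= np.abs(observed))  # two-sided p-value

rng = np.random.default_rng(1)
d = rng.permutation(np.repeat([0, 1], 50))
y = 0.5 * d + rng.normal(size=100)
print(randomization_p_value(y, d))
```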

Pierre Jacob

Markov Chain Monte Carlo and Couplings
Introduction to Markov Chain Monte Carlo (MCMC)
Introduction to Markov chain Monte Carlo (MCMC) and its relevance in data analysis, for example in parameter inference, hypothesis testing and model choice. Connections are made between MCMC algorithms and iterative optimization methods.
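
As a point of reference for the lectures, here is a minimal random-walk Metropolis-Hastings sketch targeting a standard normal; the target, step size, and starting point are illustrative assumptions.

```python
# Random-walk Metropolis-Hastings targeting N(0, 1) (unnormalized log-density).
import numpy as np

def log_target(x):
    return -0.5 * x**2

def rw_metropolis(n_iter=5_000, step=1.0, x0=10.0, seed=0):
    rng = np.random.default_rng(seed)
    chain, x = np.empty(n_iter), x0
    for t in range(n_iter):
        proposal = x + step * rng.normal()
        # Accept with probability min(1, target(proposal) / target(current))
        if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
            x = proposal
        chain[t] = x
    return chain

chain = rw_metropolis()
print(chain[1000:].mean(), chain[1000:].std())   # roughly 0 and 1 after burn-in
```
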
MCMC Theory
Convergence to stationarity and rates, law of large numbers and central limit theorem for MCMC, with an emphasis on the usefulness of couplings and Poisson equations as proof techniques.
Implementable Couplings
How to use couplings in practice to diagnose convergence of Markov chains, to parallelize computation completely or to estimate (without any bias) the asymptotic variance in the central limit theorem for Markov chains.
Designing Couplings
How to construct algorithms that generate pairs of Markov chains that meet exactly after a random number of steps. This will cover the most popular MCMC algorithms as well as more recent sampling algorithms designed for distributions supported on manifolds.
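
A basic ingredient for such constructions is a maximal coupling, which draws a pair (X, Y) with the prescribed marginals while making the event X = Y as likely as possible. The sketch below (illustrative; the lectures may use different constructions) couples two Gaussians.

```python
# Sampling from a maximal coupling of two distributions p and q: coupled MCMC
# kernels built from such couplings give the two chains a positive chance of
# taking the same value at each step, which is what allows them to meet exactly.
import numpy as np
from scipy import stats

def maximal_coupling(p, q, rng):
    """p, q: frozen scipy.stats distributions exposing .rvs and .pdf."""
    x = p.rvs(random_state=rng)
    if rng.uniform() * p.pdf(x) <= q.pdf(x):
        return x, x                       # the two draws coincide
    while True:                           # otherwise draw y from the residual of q
        y = q.rvs(random_state=rng)
        if rng.uniform() * q.pdf(y) > p.pdf(y):
            return x, y

rng = np.random.default_rng(0)
p, q = stats.norm(0.0, 1.0), stats.norm(1.0, 1.0)
pairs = [maximal_coupling(p, q, rng) for _ in range(10_000)]
print(np.mean([x == y for x, y in pairs]))   # close to the overlap of p and q
```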

Pepe Montiel

Statistical Decision Theory
A Brief Introduction to Statistical Decision Theory

We introduce basic concepts in Statistical Decision Theory (e.g., Decision Problems, Risk Function, Decision Rules), and present a general definition of Bayes and Minimax Decision Rules. We discuss some general approaches to find Minimax Rules. 

Statistical Decision Theory Applied to a Simple Prediction Problem

We use the framework of Statistical Decision Theory to analyze a simple prediction problem. We use this simple framework to illustrate the benefits of introducing $\ell_2$ regularization. We discuss the role of leave-one-out cross-validation in fine-tuning predictions.
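
As a minimal illustration (with an arbitrary simulated design and tuning grid), the sketch below computes ridge ($\ell_2$-regularized) predictions and selects the penalty by leave-one-out cross-validation, using the shortcut identity that holds exactly for ridge.

```python
# Ridge regression with leave-one-out CV computed via the exact identity
# loo_resid_i = resid_i / (1 - H_ii), where H = X (X'X + lam I)^{-1} X'.
import numpy as np

def ridge_loocv_mse(X, y, lam):
    n, p = X.shape
    H = X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T)   # hat matrix
    resid = y - H @ y
    loo_resid = resid / (1.0 - np.diag(H))                    # leave-one-out residuals
    return np.mean(loo_resid**2)

rng = np.random.default_rng(0)
n, p = 100, 20
X = rng.normal(size=(n, p))
beta = np.zeros(p); beta[:3] = [2.0, -1.0, 0.5]
y = X @ beta + rng.normal(size=n)
grid = [0.01, 0.1, 1.0, 10.0, 100.0]
scores = {lam: ridge_loocv_mse(X, y, lam) for lam in grid}
print(min(scores, key=scores.get), scores)    # penalty with the smallest LOO error
```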

Minimax Optimal Best Linear Predictors

We consider the basic problem of finding the best linear predictor of an outcome variable Y in terms of covariates X, but from a minimax perspective. In particular, we allow for the prediction error to be computed under a class of distributions, and we look for the predictors that protect against worst-case performance. We present existing results in the literature showing that the square-root lasso and related estimators solve this “worst-case” best-linear prediction problem.
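
For reference, a minimal sketch of the square-root lasso objective follows; it assumes the cvxpy package is available, and the simulated design and penalty level are illustrative choices.

```python
# Square-root lasso: min_b ||y - X b||_2 / sqrt(n) + lam * ||b||_1.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, p = 100, 50
X = rng.normal(size=(n, p))
beta_true = np.zeros(p); beta_true[:3] = [3.0, -2.0, 1.5]
y = X @ beta_true + rng.normal(size=n)

beta = cp.Variable(p)
lam = 1.1 * np.sqrt(2 * np.log(p) / n)   # illustrative penalty, of order sqrt(log p / n)
objective = cp.Minimize(cp.norm2(y - X @ beta) / np.sqrt(n) + lam * cp.norm1(beta))
cp.Problem(objective).solve()
print(np.round(beta.value[:5], 2))       # large coefficients recovered, the rest near 0
```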

Benign Overfitting in Prediction Problems

We use the simple prediction problem introduced in Course 2 to discuss the recent literature on “benign overfitting” and the “double-descent” phenomenon. 
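
A minimal simulation (with an illustrative data-generating process and feature grid) that typically displays the double-descent pattern uses minimum-norm least squares as the number of included features crosses the sample size:

```python
# Test error of minimum-ell_2-norm ("ridgeless") least squares as the number of
# features p grows past the sample size n; the error typically peaks near p = n
# (the interpolation threshold) and then decreases again.
import numpy as np

rng = np.random.default_rng(0)
n, n_test, p_max = 50, 2_000, 200
Z, Z_test = rng.normal(size=(n, p_max)), rng.normal(size=(n_test, p_max))
beta = rng.normal(size=p_max) / np.sqrt(p_max)         # signal spread over all features
y = Z @ beta + 0.5 * rng.normal(size=n)
y_test = Z_test @ beta + 0.5 * rng.normal(size=n_test)

for p in [10, 25, 45, 50, 55, 75, 150, 200]:
    b_hat = np.linalg.pinv(Z[:, :p]) @ y               # min-norm interpolator once p >= n
    mse = np.mean((y_test - Z_test[:, :p] @ b_hat) ** 2)
    print(f"p = {p:3d}   test MSE = {mse:.2f}")
```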

Jose Blanchet

Optimal Transport Methods in Distributionally Robust Optimization
Introduction to Optimal Transport Methods in Distributionally Robust Optimization

We will study encompassing formulations based on optimal transport, their duality representations, and their reduction to convex (deterministic) optimization problems.
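
As a small illustration of the optimal-transport ingredient (not of the full DRO formulations), the one-dimensional empirical Wasserstein distance reduces to matching sorted observations:

```python
# Empirical p-Wasserstein distance between two equal-size samples on the real
# line: the optimal coupling matches order statistics, so sorting suffices.
import numpy as np

def wasserstein_1d(x, y, p=2):
    x, y = np.sort(x), np.sort(y)
    return np.mean(np.abs(x - y) ** p) ** (1.0 / p)

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=5_000)
y = rng.normal(0.5, 1.0, size=5_000)
print(wasserstein_1d(x, y))   # close to 0.5, the W_2 distance between the two normals
```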

Optimal Transport Methods in Distributionally Robust Optimization, Hypothesis Testing and Robust Estimation

We will study connections between distributionally robust optimization, optimal choice of regularization, confidence regions and statistical guarantees, and will explore the differences and similarities between distributionally robust estimators and robust estimators in statistics.

Blaise Melly

Quantile Treatment Effects

Quantile Treatment Effects (QTE)

We explore the differences between conditional and unconditional QTE and their connections with quantile regression and unconditional quantile regression. We also briefly consider extensions to panel data.
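
For concreteness, a minimal sketch (simulated data, random assignment) estimates unconditional QTE as differences of sample quantiles between treated and control units:

```python
# Unconditional quantile treatment effects under random assignment, estimated
# as Q_tau(Y | D = 1) - Q_tau(Y | D = 0); the conditional QTE and the
# regression-based estimators discussed in the lectures require more structure.
import numpy as np

rng = np.random.default_rng(0)
n = 4_000
d = rng.integers(0, 2, size=n)                     # randomized treatment
y0 = rng.normal(size=n)                            # control potential outcome
y1 = y0 + 1.0 + 0.5 * np.maximum(y0, 0.0)          # effect grows in the upper tail
y = np.where(d == 1, y1, y0)

for tau in (0.1, 0.5, 0.9):
    qte = np.quantile(y[d == 1], tau) - np.quantile(y[d == 0], tau)
    print(f"tau = {tau:.1f}   QTE estimate = {qte:.2f}")
```
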
Inference on QTE with High-dimensional Data

We discuss estimation and inference on QTE after model selection using Neyman orthogonality.