Title: Topics in Time Series Analysis with Macroeconomic Applications
Chair: Associate Professor Edward L. Ionides
Cognate Member: Assistant Professor Matias D. Cattaneo
Member: Professor Naisyin Wang, Associate Professor Yves F. Atchade
Abstract: Time series analysis is widely used in many real-world applications. In statistics, signal processing, econometrics and mathematical finance, a time series is a sequence of data points, typically measured at successive time instants spaced at uniform intervals. Examples of time series are the daily closing value of the Dow Jones index or the annual flow volume of the Nile River at Aswan. Time series analysis comprises methods for analyzing time series data in order to extract meaningful statistics and other characteristics of the data. Time series forecasting is the use of a model to predict future values based on previously observed values. In this thesis, we focus on two settings in economics: panel data (i.e., multiple time series) and state-space models.
Many investigations have used panel methods to study the relationships between fluctuations in economic activity and mortality. A broad consensus has emerged on the overall procyclical nature of mortality: perhaps counter-intuitively, mortality typically rises above its trend during economic expansions. This consensus has been clouded by inconsistent reports on the specific age groups and causes of death involved. We show that these inconsistencies result, in part, from the trend specifications used in previous panel models. Standard econometric panel analysis involves fitting regression models by ordinary least squares, with standard errors that are robust to temporal autocorrelation. The model specifications include a fixed effect, and possibly a linear trend, for each time series in the panel. We propose an alternative methodology based on nonlinear detrending. Applying our methodology to US data, we obtain more precise and consistent results than previous studies.
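The detrending approach described above can be sketched in a toy simulation. The sketch below is illustrative only, not the thesis methodology: it uses a Hodrick-Prescott-style smoother as one possible nonlinear detrending choice, a hypothetical two-unit panel with a known cyclical coefficient, and pooled OLS on the detrended series (detrending each series separately also absorbs the unit fixed effects).

```python
import numpy as np

def hp_detrend(y, lam=1600.0):
    """Remove a smooth nonlinear trend via a Hodrick-Prescott-style smoother.
    The trend tau solves (I + lam * D'D) tau = y, where D takes second
    differences; the returned series is the deviation from trend."""
    T = len(y)
    D = np.zeros((T - 2, T))
    for t in range(T - 2):
        D[t, t], D[t, t + 1], D[t, t + 2] = 1.0, -2.0, 1.0
    trend = np.linalg.solve(np.eye(T) + lam * D.T @ D, y)
    return y - trend

def panel_ols(x_panel, y_panel, detrend=hp_detrend):
    """Pooled OLS of detrended mortality on detrended economic activity.
    Detrending each unit's series separately removes the unit fixed effect
    and any unit-specific smooth trend before pooling."""
    xs = np.concatenate([detrend(x) for x in x_panel])
    ys = np.concatenate([detrend(y) for y in y_panel])
    return (xs @ ys) / (xs @ xs)

# Hypothetical two-unit panel with a known cyclical effect beta_true = 0.5.
rng = np.random.default_rng(0)
T, beta_true = 300, 0.5
x_panel, y_panel = [], []
for _ in range(2):
    t = np.arange(T)
    cycle = rng.normal(size=T)                   # cyclical fluctuation
    x = 0.01 * t + np.sin(t / 40.0) + cycle      # activity: smooth trend + cycle
    y = (2.0 + beta_true * cycle                 # mortality: fixed effect + effect
         + 0.5 * np.sin(t / 50.0)                # of the cycle + its own trend
         + 0.3 * rng.normal(size=T))
    x_panel.append(x)
    y_panel.append(y)

beta_hat = panel_ols(x_panel, y_panel)
print(beta_hat)
```

Because the trends here are smooth but nonlinear, a specification with only a fixed effect and a linear trend would leave trend residue in both series; detrending nonlinearly before pooling recovers the cyclical coefficient.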
Iterated filtering is based on a sequence of particle filtering operations, and it facilitates likelihood-based inference in Dynamic Stochastic General Equilibrium (DSGE) models. Economics produces examples of quantities measured with low error (e.g., GDP, the unemployment rate), which leads to technical difficulties for particle methods. Numerous researchers have studied filtering for dynamic economic models. The recent economic turmoil makes reassessment of structural models an urgent problem, and testing macroeconomic models against recent data is directly relevant in this context. We will compare particle Markov chain Monte Carlo (PMCMC) and iterated filtering (MIF) for estimating a DSGE model using simulated data.
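The technical difficulty caused by low measurement error can be illustrated on a toy linear-Gaussian state-space model (not the DSGE model of the thesis). In a bootstrap particle filter, a very precise measurement density concentrates the weights on a handful of particles, which shows up as a low effective sample size (ESS). A minimal sketch, with all model settings chosen for illustration:

```python
import numpy as np

def bootstrap_pf(y, J=1000, phi=0.9, sig_w=1.0, sig_v=0.1, seed=1):
    """Bootstrap particle filter for the toy state-space model
        x_t = phi * x_{t-1} + w_t,   w_t ~ N(0, sig_w^2)
        y_t = x_t + v_t,             v_t ~ N(0, sig_v^2).
    Returns a log-likelihood estimate and the mean effective sample size
    (ESS).  Small sig_v (precise measurements) concentrates the weights
    on few particles and drives the ESS down."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, sig_w, size=J)
    loglik, ess = 0.0, []
    for yt in y:
        x = phi * x + rng.normal(0.0, sig_w, size=J)      # propagate particles
        logw = (-0.5 * np.log(2 * np.pi * sig_v**2)
                - 0.5 * ((yt - x) / sig_v) ** 2)          # measurement log-density
        m = logw.max()
        w = np.exp(logw - m)                              # stable weights
        loglik += m + np.log(w.mean())                    # log-likelihood increment
        wn = w / w.sum()
        ess.append(1.0 / np.sum(wn ** 2))                 # effective sample size
        x = rng.choice(x, size=J, p=wn)                   # multinomial resampling
    return loglik, float(np.mean(ess))

# Simulate data from the model with precise measurements (sig_v = 0.1).
rng = np.random.default_rng(0)
x, ys = 0.0, []
for _ in range(100):
    x = 0.9 * x + rng.normal(0.0, 1.0)
    ys.append(x + rng.normal(0.0, 0.1))
ys = np.array(ys)

ll_precise, ess_precise = bootstrap_pf(ys, sig_v=0.1)  # true, precise noise level
ll_broad, ess_broad = bootstrap_pf(ys, sig_v=1.0)      # artificially inflated noise
```

Running the filter with the true, small measurement noise yields a much lower mean ESS than running it with artificially inflated noise, which is one face of the degeneracy problem that both PMCMC and MIF must contend with on economic data.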
There is a trade-off between the number of new parameter values sampled per filtering operation and the number of filtering operations needed. PMCMC sits at one extreme of this trade-off (only one new parameter value per filtering operation; thousands of filtering operations needed). Traditional MIF sits at the other extreme (one new parameter value per particle per time point; around 50 filtering operations needed). We will propose p-MIF schemes, with p (0 < p < 1) new parameter values per particle per time point on average and around 50 filtering operations needed, as intermediate algorithms between these two very different extremes. This is shown to perform better for this sort of problem than either existing method. We will also apply p-MIF to re-evaluate a DSGE model using US data.
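The intermediate perturbation scheme can be sketched on the same toy AR(1) state-space model used above, estimating the autoregressive coefficient. This is a simplified illustration of the idea, not the thesis algorithm: each particle carries its own parameter copy, and at each time point only a random fraction p of the particles receive a random-walk perturbation (p = 1 would mimic a basic MIF update); the perturbation scale is cooled across filtering iterations. All tuning constants here are arbitrary choices for the sketch.

```python
import numpy as np

def p_mif(y, p=0.5, iters=30, J=500, sig_w=1.0, sig_v=0.5,
          theta0=0.5, sigma0=0.2, cool=0.95, seed=2):
    """Simplified p-MIF-style sketch for the AR(1) coefficient phi in
        x_t = phi * x_{t-1} + w_t,   y_t = x_t + v_t.
    Each particle carries (phi_j, x_j); with probability p a particle's
    phi_j is perturbed at each time point, and (phi, x) pairs are
    resampled jointly by likelihood weight.  The estimate is updated
    after each filtering pass and the perturbation scale is cooled."""
    rng = np.random.default_rng(seed)
    theta, sigma = theta0, sigma0
    for _ in range(iters):
        th = np.full(J, theta)
        x = rng.normal(0.0, sig_w, size=J)
        for yt in y:
            move = rng.random(J) < p                         # fraction p perturbed
            th = th + move * rng.normal(0.0, sigma, size=J)
            x = th * x + rng.normal(0.0, sig_w, size=J)      # propagate states
            logw = -0.5 * ((yt - x) / sig_v) ** 2            # weight (up to const)
            w = np.exp(logw - logw.max())
            wn = w / w.sum()
            idx = rng.choice(J, size=J, p=wn)                # joint resampling
            th, x = th[idx], x[idx]
        theta = float(np.mean(th))                           # update estimate
        sigma *= cool                                        # cool perturbations
    return theta

# Simulate data with phi = 0.8 and run the sketch with p = 0.5.
rng = np.random.default_rng(0)
x, ys = 0.0, []
for _ in range(100):
    x = 0.8 * x + rng.normal(0.0, 1.0)
    ys.append(x + rng.normal(0.0, 0.5))
ys = np.array(ys)

theta_hat = p_mif(ys, p=0.5)
```

Setting p = 1 recovers a perturb-every-particle update at the MIF end of the trade-off, while smaller p injects less parameter noise per filtering pass; the p-MIF idea is to tune p between these extremes while keeping the number of filtering operations on the MIF scale.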