Title: Reduced Rank Ridge Regression and Its Kernel Extensions
Advisor: Associate Professor Ji Zhu
Committee Members: Professor Naisyin Wang, Associate Professor Kerby Shedden
Abstract: In multivariate linear regression, it is often assumed that the response matrix is intrinsically of lower rank. This may arise from the correlation structure among the predictor variables or from the coefficient matrix itself having low rank. To accommodate both possibilities, we propose a reduced rank ridge regression for multivariate linear regression. Specifically, we combine the ridge penalty with a reduced rank constraint on the coefficient matrix, which yields a computationally straightforward algorithm. Numerical studies indicate that the proposed method consistently outperforms relevant competitors. A novel extension of the proposed method to the reproducing kernel Hilbert space (RKHS) setting is also developed.
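To illustrate the idea of combining a ridge penalty with a rank constraint, below is a minimal NumPy sketch. It solves the multivariate ridge problem and then projects the ridge fit onto its top singular directions to enforce low rank; the function name and this particular projection step are illustrative assumptions for exposition, not necessarily the exact algorithm developed in the thesis.

```python
import numpy as np

def reduced_rank_ridge(X, Y, lam, rank):
    """Illustrative sketch: ridge regression followed by a rank constraint.

    X : (n, p) predictor matrix
    Y : (n, q) response matrix
    lam : ridge penalty parameter (lambda > 0)
    rank : target rank for the coefficient matrix
    """
    n, p = X.shape
    # Multivariate ridge solution: B_ridge = (X'X + lam * I)^{-1} X'Y
    B_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ Y)
    # SVD of the ridge fitted values X @ B_ridge
    U, s, Vt = np.linalg.svd(X @ B_ridge, full_matrices=False)
    # Project onto the top-`rank` right singular directions,
    # giving a coefficient matrix of rank at most `rank`
    P = Vt[:rank].T @ Vt[:rank]  # (q, q) projection matrix
    return B_ridge @ P

# Example usage on synthetic low-rank data (hypothetical sizes)
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 8))
B_true = rng.standard_normal((8, 2)) @ rng.standard_normal((2, 5))  # rank-2 truth
Y = X @ B_true + 0.1 * rng.standard_normal((50, 5))
B_hat = reduced_rank_ridge(X, Y, lam=1.0, rank=2)
```

The ridge step shrinks the coefficients to handle correlated predictors, while the projection step enforces the reduced rank constraint, reflecting the two assumptions the abstract mentions.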