Doctor of Philosophy
Regularization techniques have become a principled tool in model-based statistics and artificial intelligence research. In most situations, however, regularization terms lack a clear interpretation, particularly of how they relate to the loss function and the data matrix in a given statistical model. In this work, we propose a robust minimax formulation that interprets the relationship between the data and the regularization term for a large class of loss functions. We show that various regularization terms essentially correspond to different distortions of the original data matrix. This provides a unified framework for understanding existing regularization terms, designing novel regularization terms based on perturbation-analysis techniques, and inspiring new generic algorithms. To show how minimax-related concepts apply to real-world learning tasks, we develop a new fault-tolerant classification framework that combats class noise in general multi-class classification problems; further, by studying the relationship between the class of majorizable functions and the minimax framework, we develop an accurate, efficient, and scalable algorithm for solving a large family of learning formulations. We also extend this work to several important matrix-decomposition learning tasks and validate it on real-world applications, including structure from motion (with missing data) and latent-structure dictionary learning. Comprising a unified formulation, a scalable algorithm, and promising applications to many real-world learning problems, this work contributes to the understanding of the hidden robustness in many learning models. As we show, many classical statistical machine learning models can be unified under this formulation, and accurate, efficient, and scalable algorithms follow from our analysis.
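To make the central claim concrete, the sketch below numerically illustrates one well-known instance of the regularization-as-data-distortion correspondence: least squares under worst-case column-wise perturbations of the data matrix coincides with l1-regularized least squares. This is a standard robust-optimization identity, not necessarily the dissertation's exact formulation; all names (`X`, `y`, `w`, `c`) and the random setup are assumptions for illustration.

```python
import numpy as np

# Assumed identity being illustrated: for any fixed w,
#     max_{||D_j||_2 <= c_j}  ||y - (X + D) w||_2
#   = ||y - X w||_2 + sum_j c_j |w_j|,
# where D_j denotes column j of the perturbation D. The worst-case D
# aligns every column against the residual direction, so the minimax
# objective equals the l1-regularized least-squares objective.

rng = np.random.default_rng(0)
n, p = 50, 8
X = rng.standard_normal((n, p))           # data matrix
w = rng.standard_normal(p)                # an arbitrary fixed weight vector
y = X @ w + rng.standard_normal(n)        # responses with noise
c = rng.uniform(0.1, 1.0, size=p)         # per-column perturbation budgets

r = y - X @ w                             # residual at w
u = r / np.linalg.norm(r)                 # unit residual direction
# Adversarial perturbation: column j is D_j = -c_j * sign(w_j) * u,
# which satisfies ||D_j||_2 = c_j.
D = -np.outer(u, c * np.sign(w))

worst_case = np.linalg.norm(y - (X + D) @ w)
regularized = np.linalg.norm(r) + np.sum(c * np.abs(w))
print(worst_case, regularized)            # the two objectives coincide
```

Minimizing either side over `w` then yields the same solution, which is the sense in which the l1 penalty encodes a specific distortion of the data matrix.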