|Computational Mathematics Seminar Series|
|On structured sparsity learning with affine sparsity constraints|
|Hongbo Dong, Washington State University|
|Digital Media Center 1034|
|March 19, 2019 - 03:30 pm|
We introduce a new constraint system, namely affine sparsity constraints (ASC), as a general optimization framework for structured sparse variable selection in statistical learning. Such a system arises when there are nontrivial logical conditions on the sparsity of certain unknown model parameters to be estimated. One classical example is the heredity principle in regression models, where an interaction term of predictor variables may be introduced into the model "only if" the corresponding linear terms are already in the model. Formally, extending a cardinality constraint, an ASC system is defined by a system of linear inequalities on binary indicators, which represent the nonzero patterns of the unknown parameters in estimation. We study some fundamental properties of such a system, including set closedness and set convergence of approximations, using tools from polyhedral theory and variational analysis. We also study conditions under which optimization with ASC can be reduced to integer programs or to mathematical programs with complementarity constraints (MPCC), for which algorithms and efficient implementations already exist. Finally, we focus on regression under the heredity principle; building on the preceding results, we derive nonconvex penalty formulations that directly extend convex penalties proposed in the literature for this problem.
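To give a concrete sense of how the heredity principle becomes an ASC, consider a regression model with linear coefficients \(\beta_1, \beta_2\) and an interaction coefficient \(\beta_{12}\). The following is a minimal sketch in our own notation (the speaker's formalism may differ):

```latex
% Binary indicators encode nonzero patterns: \delta_j = 1 iff \beta_j \neq 0,
% linked to the parameters by the complementarity-type conditions
%   \beta_j (1 - \delta_j) = 0, \qquad \delta_j \in \{0, 1\}.
% Strong heredity ("interaction only if both linear terms are present")
% is then the affine system of inequalities on the indicators:
\delta_{12} \le \delta_1, \qquad \delta_{12} \le \delta_2 .
```

Under this sketch, a cardinality constraint \(\sum_j \delta_j \le k\) is the special case with no coupling between indicators, and the heredity inequalities above are the additional affine conditions that make the sparsity pattern "structured."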
Hongbo Dong received his PhD in Applied Mathematical and Computational Sciences from the University of Iowa in 2011. After two years of postdoctoral training at the University of Wisconsin-Madison, he joined Washington State University in 2013 as an Assistant Professor in the Department of Mathematics and Statistics, where he was promoted with tenure in 2019. He works in the area of mathematical optimization and is primarily interested in problems with mixed discrete and continuous structures, as well as applications in statistics and data science. His work has been published in both optimization and statistics journals.
|Refreshments will be served at 03:00 pm|