

Virtual presentation / top 25% paper

Learning Sparse Group Models Through Boolean Relaxation

Yijie Wang · Yuan Zhou · Xiaoqing Huang · Kun Huang · Jie Zhang · Jianzhu Ma

Keywords: [ General Machine Learning ] [ Convex relaxation ] [ Small sample size ] [ Cardinality-constrained program ] [ Structured sparsity ]


Abstract:

We introduce an efficient algorithmic framework for learning sparse group models, formulated as the natural convex relaxation of a cardinality-constrained program with Boolean variables. We provide theoretical techniques to characterize the conditions under which the relaxation achieves the exact integral optimal solution, as well as a rounding algorithm that produces a feasible integral solution when the optimal solution of the relaxation is fractional. We demonstrate the power of our equivalence condition by applying it to two ensembles of random problem instances that are challenging and widely used in the literature, and prove that our method achieves exactness with overwhelming probability and nearly optimal sample complexity. Empirically, we use synthetic datasets to demonstrate that our proposed method significantly outperforms state-of-the-art group sparse learning models in terms of individual and group support recovery when the number of samples is small. Furthermore, we show that our method also outperforms competing approaches in cancer drug response prediction.
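The abstract describes a relax-then-round pattern: Boolean support indicators in a cardinality-constrained program are relaxed to the box [0, 1], the relaxed problem is optimized, and a rounding step recovers a feasible integral support when the relaxed solution is fractional. The sketch below is not the paper's algorithm; it is a minimal toy illustration of that pattern for plain sparse regression, with a hypothetical function name and a crude rescaling in place of an exact projection.

```python
import numpy as np

def boolean_relaxation_sketch(X, y, k, n_iters=500, lr=0.01):
    """Toy relax-then-round sketch (NOT the paper's method).

    Relax binary support indicators u in {0,1}^p to the box [0,1]^p,
    run projected gradient steps on 0.5/n * ||X(u*w) - y||^2 subject to
    sum(u) <= k, then round u to an integral support of size k.
    """
    n, p = X.shape
    u = np.full(p, k / p)   # fractional start with sum(u) = k
    w = np.zeros(p)
    for _ in range(n_iters):
        r = X @ (u * w) - y              # residual of the relaxed model
        w -= lr * u * (X.T @ r) / n      # gradient step in the weights
        u = np.clip(u - lr * w * (X.T @ r) / n, 0.0, 1.0)  # box projection
        if u.sum() > k:                  # crude surrogate for the exact
            u *= k / u.sum()             # projection onto sum(u) <= k
    # rounding step: keep the k largest relaxed indicators
    support = np.argsort(u)[-k:]
    u_int = np.zeros(p)
    u_int[support] = 1.0
    return u_int, w * u_int
```

The rounding rule here (top-k by relaxed indicator value) is one simple choice; the paper's rounding algorithm and its exactness guarantees are specific to the sparse group setting and are not reproduced by this sketch.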
