Pith: machine review for the scientific record

arXiv:1805.06258 · v1 · submitted 2018-05-16 · stat.ML · cs.LG

Recognition: unknown

Structured nonlinear variable selection

Authors on Pith: no claims yet
Classification: stat.ML · cs.LG
Keywords: nonlinear, selection, variable, algorithm, derivatives, models, problem, properties
0 comments
read the original abstract

We investigate structured sparsity methods for variable selection in regression problems where the target depends nonlinearly on the inputs. We focus on general nonlinear functions not limiting a priori the function space to additive models. We propose two new regularizers based on partial derivatives as nonlinear equivalents of group lasso and elastic net. We formulate the problem within the framework of learning in reproducing kernel Hilbert spaces and show how the variational problem can be reformulated into a more practical finite dimensional equivalent. We develop a new algorithm derived from the ADMM principles that relies solely on closed forms of the proximal operators. We explore the empirical properties of our new algorithm for Nonlinear Variable Selection based on Derivatives (NVSD) on a set of experiments and confirm favourable properties of our structured-sparsity models and the algorithm in terms of both prediction and variable selection accuracy.
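The abstract notes that the ADMM-based algorithm relies solely on closed forms of the proximal operators. As an illustrative sketch only (not the paper's NVSD implementation, which applies these penalties to partial derivatives in an RKHS), the proximal operators of a group-lasso penalty and an elastic-net penalty have the following well-known closed forms:

```python
import numpy as np

def prox_group_lasso(v, lam):
    """Prox of lam * ||v||_2 over one group: block soft-thresholding.
    Shrinks the whole group toward zero, setting it exactly to zero
    when its norm falls below lam."""
    norm = np.linalg.norm(v)
    if norm == 0.0:
        return np.zeros_like(v)
    return max(0.0, 1.0 - lam / norm) * v

def prox_elastic_net(v, lam, alpha):
    """Prox of lam * (alpha * ||v||_1 + (1 - alpha)/2 * ||v||_2^2):
    componentwise soft-thresholding followed by uniform shrinkage."""
    soft = np.sign(v) * np.maximum(np.abs(v) - lam * alpha, 0.0)
    return soft / (1.0 + lam * (1.0 - alpha))
```

Because both operators are elementwise or groupwise closed forms, each ADMM iteration that uses them costs only a vector pass, which is what makes such splitting schemes practical.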

This paper has not been read by Pith yet.

discussion (0)


Forward citations

Cited by 1 Pith paper

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. Locally Near Optimal Piecewise Linear Regression in High Dimensions via Difference of Max-Affine Functions

stat.ML · 2026-05 · unverdicted · novelty 7.0

    ABGD parametrizes piecewise linear functions as difference of max-affine functions and converges linearly to an epsilon-accurate solution with O(d max(sigma/epsilon,1)^2) samples under sub-Gaussian noise, which is min...
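The cited work parametrizes piecewise linear functions as a difference of two max-affine functions. A minimal sketch of evaluating such a parametrization (function and variable names here are illustrative, not taken from that paper):

```python
import numpy as np

def max_affine(x, A, b):
    """Evaluate max_i (a_i . x + b_i), a convex piecewise linear
    function given by k affine pieces (A: k x d, b: length k)."""
    return np.max(A @ x + b)

def diff_max_affine(x, A1, b1, A2, b2):
    """Difference of two max-affine functions: any continuous
    piecewise linear function can be written in this form."""
    return max_affine(x, A1, b1) - max_affine(x, A2, b2)
```

For example, with pieces x and -x in the first term and the zero function in the second, the difference reduces to |x|, a simple non-convex-representable-as-convex-difference case.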