Variational Analysis of Sparsity Optimization
Luke, Russell
Abstract
Many problems of interest where more than one solution is possible seek, among these, the one that is sparsest. The objective that most directly accounts for sparsity, the $\ell_0$ metric, is usually avoided since this leads to a combinatorial optimization problem. The function $\|x\|_0$ is often viewed as the limit of the $\ell_p$ metrics. Naturally, there have been some attempts to use this as an objective for $p$ small, though this is a nonconvex function for $p < 1$. We propose instead a scaled and shifted Fermi-Dirac entropy with two parameters, one controlling the smoothness of the approximation and the other the steepness of the metric. Our proposed metric is a convex relaxation for which a strong duality theory holds, yielding dual methods for metrics approaching the desired $\|\cdot\|_0$ function. Without smoothing, we propose a dynamically reweighted subdifferential descent method with ``exact'' line search that is finitely terminating for constraints that are well-separated. This algorithm is shown to recapture in a special case certain well-known ``greedy'' algorithms. Consequently we are able to provide an explicit algorithm whose fixed point, under the appropriate assumptions, is the sparsest possible solution. The variational perspective yields general strategies to make the algorithm more robust.
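For concreteness, the two objects the abstract refers to can be written out; the following is a sketch, with the first identity being the standard pointwise limit behind the phrase ``limit of the $\ell_p$ metrics'' and the second the classical Fermi-Dirac entropy that the proposed metric scales and shifts (the particular scaling and shift used in the paper are specified there, not reproduced here):
\[
  \|x\|_0 \;=\; \#\{\, j : x_j \neq 0 \,\} \;=\; \lim_{p \searrow 0} \sum_{j=1}^{n} |x_j|^{p},
\]
since $|t|^p \to 1$ for $t \neq 0$ and $|t|^p = 0$ for $t = 0$ as $p \searrow 0$; and
\[
  \phi(t) \;=\; t \ln t + (1 - t)\ln(1 - t), \qquad t \in [0,1],
\]
the (negative) Fermi-Dirac entropy, a convex function whose scaled and shifted variants carry the two parameters mentioned above: one governing the smoothness of the approximation, the other the steepness of the metric.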