pith. machine review for the scientific record.

arxiv: 1602.02068 · v2 · submitted 2016-02-05 · 💻 cs.CL · cs.LG · stat.ML

Recognition: unknown

From Softmax to Sparsemax: A Sparse Model of Attention and Multi-Label Classification

Authors on Pith: no claims yet
classification 💻 cs.CL · cs.LG · stat.ML
keywords loss, classification, softmax, sparsemax, attention, function, multi-label, propose
read the original abstract

We propose sparsemax, a new activation function similar to the traditional softmax, but able to output sparse probabilities. After deriving its properties, we show how its Jacobian can be efficiently computed, enabling its use in a network trained with backpropagation. Then, we propose a new smooth and convex loss function which is the sparsemax analogue of the logistic loss. We reveal an unexpected connection between this new loss and the Huber classification loss. We obtain promising empirical results in multi-label classification problems and in attention-based neural networks for natural language inference. For the latter, we achieve a similar performance as the traditional softmax, but with a selective, more compact, attention focus.
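As a rough illustration of the activation the abstract describes: sparsemax is usually formulated as the Euclidean projection of the score vector onto the probability simplex, computed by sorting, thresholding, and clipping, and its Jacobian acts only on the nonzero coordinates. The sketch below is a minimal NumPy rendering of that standard formulation under those assumptions, not the authors' reference implementation, and the helper name sparsemax_jvp is introduced here purely for illustration.

```python
import numpy as np

def sparsemax(z):
    """Euclidean projection of the score vector z onto the probability
    simplex, yielding a sparse probability vector (exact zeros for
    low-scoring entries). Minimal sketch; not the authors' code."""
    z = np.asarray(z, dtype=float)
    z_sorted = np.sort(z)[::-1]                  # scores in decreasing order
    cumsum = np.cumsum(z_sorted)
    k = np.arange(1, z.size + 1)
    support = 1.0 + k * z_sorted > cumsum        # sorted coordinates that stay positive
    k_z = k[support][-1]                         # support size k(z)
    tau = (cumsum[support][-1] - 1.0) / k_z      # threshold tau(z)
    return np.maximum(z - tau, 0.0)

def sparsemax_jvp(p, v):
    """Jacobian-vector product of sparsemax at output p applied to v
    (hypothetical helper). On the support S the Jacobian is
    diag(s) - s s^T / |S|, so the product only involves the nonzero
    coordinates of p."""
    p, v = np.asarray(p, dtype=float), np.asarray(v, dtype=float)
    s = (p > 0).astype(float)
    v_bar = (s @ v) / s.sum()                    # mean of v over the support
    return s * (v - v_bar)

# Large score gaps give exact zeros, unlike softmax:
print(sparsemax([3.0, 1.0, 0.2]))   # -> [1. 0. 0.]
print(sparsemax([1.0, 1.0, 0.2]))   # -> [0.5 0.5 0. ]
```

The example shows the qualitative difference the abstract claims: softmax would assign nonzero probability to every score, whereas the projection step pushes low scores to exactly zero, which is what makes the resulting attention weights selective and compact.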

This paper has not been read by Pith yet.

discussion (0)


Forward citations

Cited by 2 Pith papers

Reviewed papers in the Pith corpus that reference this work, sorted by Pith novelty score.

  1. Selectivity and Shape in the Design of Forward-Forward Goodness Functions

    cs.LG 2026-03 unverdicted novelty 7.0

    Shape- and peak-sensitive goodness functions for Forward-Forward deliver up to 72pp gains over sum-of-squares, reaching 98.2% on MNIST and 89% on Fashion-MNIST.

  2. HuggingFace's Transformers: State-of-the-art Natural Language Processing

    cs.CL 2019-10 accept novelty 6.0

    Hugging Face releases an open-source Python library that supplies a unified API and pretrained weights for major Transformer architectures used in natural language processing.