pith. machine review for the scientific record.

arxiv: 1805.06201 · v1 · submitted 2018-05-16 · 💻 cs.CL · cs.LG

Recognition: unknown

Contextual Augmentation: Data Augmentation by Words with Paradigmatic Relations

Authors on Pith: no claims yet
classification 💻 cs.CL cs.LG
keywords: words, augmentation, sentences, model, contextual, data, language, other
Original abstract

We propose a novel data augmentation method for labeled sentences, called contextual augmentation. We assume an invariance: sentences remain natural even when their words are replaced with other words that hold paradigmatic relations. We stochastically replace words with substitutes predicted by a bi-directional language model at each word position. Words predicted from the context are numerous yet appropriate for augmenting the original words. Furthermore, we retrofit the language model with a label-conditional architecture, which allows the model to augment sentences without breaking label compatibility. Through experiments on six text classification tasks, we demonstrate that the proposed method improves classifiers based on convolutional or recurrent neural networks.
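The replacement procedure the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: `predict_candidates` is a hypothetical stand-in for the (label-conditional) bi-directional language model, which would return context-appropriate substitute words for a given position.

```python
import random

def contextual_augment(tokens, predict_candidates, replace_prob=0.15, rng=None):
    """Stochastically replace tokens with context-predicted substitutes.

    predict_candidates(tokens, i) stands in for a bi-directional language
    model: it returns candidate words for position i given the surrounding
    context. In the paper this model is label-conditional, so candidates
    would also respect the sentence's class label.
    """
    rng = rng or random.Random(0)
    augmented = list(tokens)
    for i in range(len(tokens)):
        if rng.random() < replace_prob:
            candidates = predict_candidates(tokens, i)
            if candidates:
                augmented[i] = rng.choice(candidates)
    return augmented

# Toy stand-in for the language model's top-k predictions (illustrative only).
def toy_predictor(tokens, i):
    substitutes = {"good": ["great", "fine"], "movie": ["film"]}
    return substitutes.get(tokens[i], [])

augmented = contextual_augment("a good movie".split(), toy_predictor, replace_prob=1.0)
```

In practice the candidate set would come from a trained language model's output distribution rather than a lookup table, and `replace_prob` controls how aggressively each sentence is perturbed.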

This paper has not been read by Pith yet.

discussion (0)


Forward citations

Cited by 1 Pith paper

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. Text and Code Embeddings by Contrastive Pre-Training

    cs.CL · 2022-01 · unverdicted · novelty 6.0

    Contrastive pre-training on unsupervised data at scale creates text and code embeddings that set new state-of-the-art results on classification and semantic search benchmarks.