Diversity regularized autoencoders for text generation

In this paper, we propose a simple yet powerful text generation model, called the diversity regularized autoencoder (DRAE). The key novelty of the proposed model lies in its ability to take as input sentences modified by various operations such as insertions, deletions, substitutions, and masking. Because this noise-injection strategy encourages the encoder to learn a smooth and continuous latent distribution, the proposed model can generate more diverse and coherent sentences. We also adopt a Wasserstein generative adversarial network with a gradient penalty to achieve stable adversarial training of the prior distribution. We evaluate the proposed model with quantitative, qualitative, and human evaluations on two public datasets. Experimental results demonstrate that our model with the noise-injection strategy produces more natural and diverse sentences than several baseline models. Furthermore, we find that our model achieves a synergistic effect, performing grammar correction and paraphrase generation in an unsupervised way.
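The noise-injection strategy described in the abstract can be illustrated with a minimal sketch. This is a hypothetical implementation, not the paper's exact noising scheme: the token-level operations (insertion, deletion, substitution, masking), the noise rate `p`, the `<mask>` symbol, and the placeholder vocabulary are all assumptions for illustration.

```python
import random

def noise_sentence(tokens, p=0.1, mask_token="<mask>", vocab=None, seed=None):
    """Apply random insertions, deletions, substitutions, and masking
    to a token list. Hypothetical sketch of the noising step; the
    paper's exact operations and rates are not specified here."""
    rng = random.Random(seed)
    vocab = vocab or ["the", "a", "it"]  # placeholder vocabulary (assumption)
    out = []
    for tok in tokens:
        r = rng.random()
        if r < p:              # deletion: drop the token entirely
            continue
        elif r < 2 * p:        # substitution: replace with a random word
            out.append(rng.choice(vocab))
        elif r < 3 * p:        # masking: replace with a mask symbol
            out.append(mask_token)
        else:                  # keep the token unchanged
            out.append(tok)
        if rng.random() < p:   # insertion: add a random word afterwards
            out.append(rng.choice(vocab))
    return out

noisy = noise_sentence("the cat sat on the mat".split(), p=0.2, seed=0)
print(noisy)
```

Feeding such corrupted sentences to the encoder while reconstructing the clean originals is what, per the abstract, smooths the latent distribution and yields the unsupervised grammar-correction behavior.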
Publisher
Association for Computing Machinery
Issue Date
2020-04
Language
English
Citation

35th Annual ACM Symposium on Applied Computing, SAC 2020, pp.883 - 891

DOI
10.1145/3341105.3373998
URI
http://hdl.handle.net/10203/299597
Appears in Collection
AI-Conference Papers (학술대회논문, conference papers)
