DGST: a Dual-Generator Network for Text Style Transfer

Xiao Li, Guanyi Chen, Chenghua Lin, Ruizhe Li

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

Abstract

We propose DGST, a novel and simple Dual-Generator network architecture for text Style Transfer. Our model employs only two generators, and does not rely on any discriminators or parallel corpora for training. Both quantitative and qualitative experiments on the Yelp and IMDb datasets show that our model achieves competitive performance compared to several strong baselines with more complicated architecture designs.
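To make the abstract's claim concrete, here is a minimal toy sketch of the dual-generator idea: two generators, one per transfer direction, trained without a discriminator by requiring that a round trip through both reconstructs the input (a cycle-reconstruction objective). The dictionary-rewrite "generators" and all names below are hypothetical illustrations of the data flow only, not the paper's actual neural models or exact training objective.

```python
# Hypothetical illustration of a dual-generator setup: g_xy maps style
# X -> Y, g_yx maps style Y -> X, and training would penalise failures
# of the round trip g_yx(g_xy(x)) ~ x. The token-substitution
# "generators" here are toy stand-ins, not the models from the paper.

POS_TO_NEG = {"good": "bad", "great": "awful", "love": "hate"}
NEG_TO_POS = {v: k for k, v in POS_TO_NEG.items()}

def g_xy(sentence: str) -> str:
    """Toy generator: rewrite positive-style tokens as negative-style."""
    return " ".join(POS_TO_NEG.get(t, t) for t in sentence.split())

def g_yx(sentence: str) -> str:
    """Toy generator: rewrite negative-style tokens back to positive."""
    return " ".join(NEG_TO_POS.get(t, t) for t in sentence.split())

def cycle_loss(sentence: str) -> int:
    """Token-level reconstruction error after a round trip X -> Y -> X."""
    orig = sentence.split()
    recon = g_yx(g_xy(sentence)).split()
    mismatches = sum(a != b for a, b in zip(orig, recon))
    return mismatches + abs(len(orig) - len(recon))

x = "i love this great place"
print(g_xy(x))        # transferred sentence in the target style
print(cycle_loss(x))  # 0: the round trip reconstructs the input
```

In the actual model both generators are neural sequence-to-sequence networks and the reconstruction signal drives learning; the point of the sketch is only that the architecture needs no discriminator and no parallel sentence pairs.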
Original language: English
Title of host publication: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Editors: Bonnie Webber, Trevor Cohn, Yulan He, Yang Liu
Publisher: Association for Computational Linguistics
Pages: 7131-7136
Number of pages: 6
DOIs
Publication status: Published - 15 Nov 2020
Event: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Duration: 15 Nov 2020 → …
Internet address: https://2020.emnlp.org/


