A Dual-Attention Hierarchical Recurrent Neural Network for Dialogue Act Classification

Ruizhe Li, Chenghua Lin, Matthew Collinson, Xiao Li, G. Chen

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

Abstract

Recognising dialogue acts (DA) is important for many natural language processing tasks such as dialogue generation and intention recognition. In this paper, we propose a dual-attention hierarchical recurrent neural network for DA classification. Our model is partially inspired by the observation that conversational utterances are normally associated with both a DA and a topic, where the former captures the social act and the latter describes the subject matter. However, such a dependency between DAs and topics has not been utilised by most existing systems for DA classification. With a novel dual task-specific attention mechanism, our model is able to capture information about both the DAs and topics of utterances, as well as the interactions between them. Experimental results show that by modelling topic as an auxiliary task, our model can significantly improve DA classification, yielding better or comparable performance to the state-of-the-art method on three public datasets.
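The abstract only describes the architecture at a high level; the sketch below is one plausible reading of it, not the authors' released implementation. It assumes a shared bidirectional GRU utterance encoder, two task-specific attention poolings (one producing a DA-oriented view of each utterance, one a topic-oriented view), a conversation-level GRU over the combined views so the two task signals can interact across the dialogue, and separate DA and topic classification heads. All class names, layer sizes, and the concatenation scheme are illustrative assumptions.

```python
# A minimal PyTorch sketch of a dual-attention hierarchical RNN, under the
# assumptions stated above; it is NOT the paper's official implementation.
import torch
import torch.nn as nn


class TaskAttention(nn.Module):
    """Additive attention pooling with a learned task-specific query vector."""
    def __init__(self, hidden: int):
        super().__init__()
        self.query = nn.Parameter(torch.randn(hidden))  # task-specific query
        self.proj = nn.Linear(hidden, hidden)

    def forward(self, states: torch.Tensor) -> torch.Tensor:
        # states: (batch, seq_len, hidden) -> pooled: (batch, hidden)
        scores = torch.tanh(self.proj(states)) @ self.query   # (batch, seq_len)
        weights = torch.softmax(scores, dim=-1)
        return (weights.unsqueeze(-1) * states).sum(dim=1)


class DualAttentionHRNN(nn.Module):
    def __init__(self, vocab: int, emb: int, hid: int, n_da: int, n_topic: int):
        super().__init__()
        self.embed = nn.Embedding(vocab, emb)
        self.utt_rnn = nn.GRU(emb, hid, batch_first=True, bidirectional=True)
        self.da_attn = TaskAttention(2 * hid)     # DA-specific pooling
        self.topic_attn = TaskAttention(2 * hid)  # topic-specific pooling
        # Conversation-level GRU over the concatenated DA+topic utterance
        # vectors, letting the two task signals interact across the dialogue.
        self.conv_rnn = nn.GRU(4 * hid, hid, batch_first=True)
        self.da_head = nn.Linear(hid, n_da)
        self.topic_head = nn.Linear(hid, n_topic)

    def forward(self, dialogue: torch.Tensor):
        # dialogue: (batch, n_utts, n_tokens) of token ids
        b, u, t = dialogue.shape
        tokens = self.embed(dialogue.reshape(b * u, t))    # (b*u, t, emb)
        states, _ = self.utt_rnn(tokens)                   # (b*u, t, 2*hid)
        da_vec = self.da_attn(states)                      # (b*u, 2*hid)
        topic_vec = self.topic_attn(states)                # (b*u, 2*hid)
        utts = torch.cat([da_vec, topic_vec], dim=-1).view(b, u, -1)
        ctx, _ = self.conv_rnn(utts)                       # (b, u, hid)
        # Per-utterance logits for both tasks.
        return self.da_head(ctx), self.topic_head(ctx)
```

Consistent with the abstract's framing of topic as an auxiliary task, a model of this shape would typically be trained with a joint objective, e.g. the DA cross-entropy plus a weighted topic cross-entropy; the weighting is a training detail the abstract does not specify.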
Original language: English
Title of host publication: Proceedings of the 23rd Conference on Computational Natural Language Learning
Place of Publication: Hong Kong, China
Publisher: Association for Computational Linguistics
Pages: 383-392
Number of pages: 10
DOIs
Publication status: Published - 3 Nov 2019
Event: 23rd Conference on Computational Natural Language Learning - Hong Kong, China
Duration: 3 Nov 2019 – 4 Nov 2019
https://www.conll.org/2019

Conference

Conference: 23rd Conference on Computational Natural Language Learning
Country/Territory: China
City: Hong Kong
Period: 3/11/19 – 4/11/19
Internet address: https://www.conll.org/2019
