Abstract
Open Information Extraction (OIE) is the challenging task of extracting relation tuples from an unstructured corpus. While several OIE algorithms have been developed over the past decade, only a few employ deep learning techniques. In this paper, we present a novel neural OIE model that leverages Recurrent Neural Networks (RNNs) with Gated Recurrent Units (GRUs). Moreover, we integrate contextual word embeddings into the model, which further improves performance. The results demonstrate that our proposed neural OIE model outperforms the existing state of the art on two datasets.
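The abstract does not specify the exact architecture, but neural OIE is commonly cast as per-token sequence labelling over a sentence. The sketch below illustrates that general setup under stated assumptions: a bidirectional GRU encoder running over pre-computed contextual word embeddings (e.g., ELMo-style vectors of dimension 1024) followed by a per-token classifier over BIO-style tuple tags. The class name `GRUTagger`, the embedding and hidden dimensions, and the tag set size are illustrative choices, not details taken from the paper.

```python
# Minimal sketch (not the authors' code): a BiGRU sequence tagger for OIE
# that consumes pre-computed contextual word embeddings.
import torch
import torch.nn as nn

class GRUTagger(nn.Module):
    def __init__(self, emb_dim=1024, hidden_dim=256, num_tags=8):
        super().__init__()
        # Bidirectional GRU encoder over the contextual embedding sequence.
        self.encoder = nn.GRU(emb_dim, hidden_dim, num_layers=2,
                              batch_first=True, bidirectional=True)
        # Per-token projection to BIO tag scores (tag set size is assumed).
        self.classifier = nn.Linear(2 * hidden_dim, num_tags)

    def forward(self, contextual_embeddings):
        # contextual_embeddings: (batch, seq_len, emb_dim)
        hidden, _ = self.encoder(contextual_embeddings)
        return self.classifier(hidden)       # (batch, seq_len, num_tags)

# Usage with dummy inputs: one sentence of 12 tokens.
model = GRUTagger()
dummy = torch.randn(1, 12, 1024)             # stand-in for contextual embeddings
tag_scores = model(dummy)
predicted_tags = tag_scores.argmax(dim=-1)   # per-token tag indices
```

In this formulation, the predicted tag sequence is decoded into relation tuples (predicate and argument spans) in a post-processing step; how the paper performs that decoding is not stated in the abstract.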
| Original language | English |
| --- | --- |
| Title of host publication | NLDB 2019: International Conference on Applications of Natural Language to Information Systems |
| Editors | E. Métais et al. |
| Publisher | Springer |
| Pages | 359–367 |
| Number of pages | 9 |
| Volume | 11608 |
| ISBN (Electronic) | 978-3-030-23281-8 |
| ISBN (Print) | 978-3-030-23280-1 |
| DOIs | |
| Publication status | Published - 2019 |
Keywords
- Open Information Extraction
- Word embeddings
- RNN
- GRU
- LSTM