CRNN: a joint neural network for redundancy detection

Fu, Xinyu and Ch’ng, Eugene and Aickelin, Uwe and See, Simon (2017) CRNN: a joint neural network for redundancy detection. In: 3rd IEEE International Conference on Smart Computing (Smartcomp 2017), 29-31 May 2017, Hong Kong, China.


Abstract

This article proposes a novel framework for detecting redundancy in supervised sentence categorisation. Unlike traditional singleton neural networks, our model combines a character-aware convolutional neural network (Char-CNN) with a character-aware recurrent neural network (Char-RNN) to form a convolutional recurrent neural network (CRNN). The model benefits from the Char-CNN in that only salient features are selected and fed into the integrated Char-RNN, which in turn learns long-sequence semantics through its gated update mechanism. We compare our framework against state-of-the-art text classification algorithms on four popular benchmark corpora. Our model achieves competitive precision, recall, and F1 score on the Google News dataset; on the 20 Newsgroups dataset, it obtains the best precision, recall, and F1 score; on the Brown Corpus, it obtains the best F1 score with precision and recall nearly equivalent to the top competitor; and on the question classification collection, CRNN produces the best recall and F1 score and comparable precision. We also analyse the impact of three different RNN hidden recurrent cells on performance and runtime efficiency, observing that the minimal gated unit (MGU) achieves the best runtime with performance comparable to GRU and LSTM. For the TF-IDF-based algorithms, we experiment with word2vec, GloVe, and sent2vec embeddings and report their performance differences.
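To make the pipeline in the abstract concrete, the following is a minimal PyTorch sketch of a Char-CNN feeding a Char-RNN for sentence classification. It is not the authors' implementation: the character vocabulary size, embedding width, convolution and pooling settings, hidden size, class count, and the choice of a GRU cell are all illustrative assumptions.

```python
# Minimal sketch of a Char-CNN -> Char-RNN (CRNN-style) classifier.
# All hyper-parameters here are illustrative assumptions, not values
# from the paper.
import torch
import torch.nn as nn

class CRNNSketch(nn.Module):
    def __init__(self, vocab_size=70, embed_dim=16,
                 conv_channels=64, hidden_size=128, num_classes=4):
        super().__init__()
        # Character embedding: each character index -> dense vector.
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Char-CNN: 1-D convolution over the character sequence extracts
        # local n-gram features; max-pooling keeps only the most salient
        # ones before they reach the recurrent layer.
        self.conv = nn.Conv1d(embed_dim, conv_channels,
                              kernel_size=5, padding=2)
        self.pool = nn.MaxPool1d(kernel_size=2)
        # Char-RNN: a GRU reads the pooled feature sequence and models
        # long-range dependencies across the sentence.
        self.rnn = nn.GRU(conv_channels, hidden_size, batch_first=True)
        self.fc = nn.Linear(hidden_size, num_classes)

    def forward(self, char_ids):          # (batch, seq_len) of char indices
        x = self.embed(char_ids)          # (batch, seq_len, embed_dim)
        x = x.transpose(1, 2)             # (batch, embed_dim, seq_len)
        x = torch.relu(self.conv(x))      # local character-level features
        x = self.pool(x)                  # keep salient features, halve length
        x = x.transpose(1, 2)             # (batch, seq_len/2, conv_channels)
        _, h = self.rnn(x)                # final hidden state (1, batch, hidden)
        return self.fc(h.squeeze(0))      # class logits

# Toy usage: a batch of 2 "sentences" of 100 random character indices.
logits = CRNNSketch()(torch.randint(0, 70, (2, 100)))
```

Swapping `nn.GRU` for `nn.LSTM` reproduces one of the paper's other cell variants; the MGU cell, which the abstract reports as the fastest of the three, has no built-in PyTorch module and would need a custom cell.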

Item Type: Conference or Workshop Item (Paper)
Additional Information: ISBN 978-1-5090-6517-2 © 2017 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Keywords: Logic gates, Training, Redundancy, Recurrent neural networks, Benchmark testing, Computational modeling
Schools/Departments: University of Nottingham Ningbo China > Faculty of Science and Engineering > School of Computer Science
University of Nottingham, UK > Faculty of Science > School of Computer Science
Identification Number (DOI): 10.1109/SMARTCOMP.2017.7946996
Depositing User: Aickelin, Professor Uwe
Date Deposited: 03 May 2017 14:46
Last Modified: 12 Oct 2017 22:41
URI: http://eprints.nottingham.ac.uk/id/eprint/42463
