Da'u, Aminu and Salim, Naomie (2019) Aspect extraction on user textual reviews using multi-channel convolutional neural network. PeerJ Computer Science, 2019, pp. 1-16. ISSN 2376-5992
Official URL: http://dx.doi.org/10.7717/peerj-cs.191
Abstract
Aspect extraction is a subtask of sentiment analysis that identifies opinion targets in opinionated text. Existing approaches to aspect extraction typically rely on handcrafted features and on linear or integrated network architectures. Although these methods can achieve good performance, they are time-consuming and often very complicated. In real-life systems, a simple model with competitive results is generally more effective and preferable to a complicated one. In this paper, we present a multichannel convolutional neural network for aspect extraction. The model consists of a deep convolutional neural network with two input channels: a word embedding channel, which encodes the semantic information of the words, and a part-of-speech (POS) tag embedding channel, which facilitates the sequential tagging process. To obtain the vector representation of words, we initialized the word embedding channel with pretrained word2vec vectors and the POS channel with one-hot vectors of the POS tags. Both the word embedding and the POS embedding vectors were fed into the convolutional layer and concatenated into a one-dimensional vector, which is then pooled and processed by a softmax function for sequence labeling. We conducted a series of experiments on four different datasets; the results show better performance than the baseline models.
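The two-channel architecture described in the abstract can be sketched in plain NumPy. This is a minimal illustration, not the authors' implementation: the dimensions, filter width, tag set, and random weights below are stand-in assumptions, and the per-token softmax stands in for the paper's pooling and softmax stage.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not the paper's hyperparameters)
seq_len, word_dim, n_pos, n_labels = 8, 50, 5, 3  # tokens, word2vec dim, POS tags, tag labels
n_filters, width = 16, 3                          # conv filters and window size

# Channel 1: word embeddings (random stand-in for pretrained word2vec)
word_emb = rng.normal(size=(seq_len, word_dim))
# Channel 2: one-hot POS tag vectors
pos_ids = rng.integers(0, n_pos, size=seq_len)
pos_emb = np.eye(n_pos)[pos_ids]

def conv1d_same(x, filters, width):
    """1-D convolution over the token axis with 'same' padding, ReLU activation."""
    pad = width // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))
    out = np.empty((x.shape[0], filters.shape[0]))
    for t in range(x.shape[0]):
        window = xp[t:t + width].ravel()  # flatten the local window
        out[t] = filters @ window
    return np.maximum(out, 0.0)

W_word = rng.normal(scale=0.1, size=(n_filters, width * word_dim))
W_pos = rng.normal(scale=0.1, size=(n_filters, width * n_pos))

# Convolve each channel separately, then concatenate the feature maps per token
features = np.concatenate([conv1d_same(word_emb, W_word, width),
                           conv1d_same(pos_emb, W_pos, width)], axis=1)

# Per-token softmax over sequence labels (e.g. aspect / non-aspect tags)
W_out = rng.normal(scale=0.1, size=(2 * n_filters, n_labels))
logits = features @ W_out
probs = np.exp(logits - logits.max(axis=1, keepdims=True))
probs /= probs.sum(axis=1, keepdims=True)

print(probs.shape)        # one label distribution per token
print(probs.sum(axis=1))  # each row sums to 1
```

In a trained model the filter weights and output projection would be learned by backpropagation; the sketch only shows how the two channels are convolved independently and joined before the labeling layer.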
| Item Type: | Article |
|---|---|
| Uncontrolled Keywords: | aspect extraction, convolutional neural network, deep learning |
| Subjects: | Q Science > QA Mathematics > QA75 Electronic computers. Computer science |
| Divisions: | Computing |
| ID Code: | 88973 |
| Deposited By: | Yanti Mohd Shah |
| Deposited On: | 26 Jan 2021 08:36 |
| Last Modified: | 26 Jan 2021 08:36 |