Mickel Hoang
Chalmers University of Technology, Gothenburg, Sweden
Oskar Alija Bihorac
Chalmers University of Technology, Gothenburg, Sweden
Jacobo Rouces
Språkbanken, University of Gothenburg, Sweden
In: Proceedings of the 22nd Nordic Conference on Computational Linguistics (NoDaLiDa), September 30 - October 2, Turku, Finland
Linköping Electronic Conference Proceedings 167:20, pp. 187–196
NEALT Proceedings Series 42:20, pp. 187–196
Published: 2019-10-02
ISBN: 978-91-7929-995-8
ISSN: 1650-3686 (print), 1650-3740 (online)
Sentiment analysis has become very popular in both research and business due to the increasing amount of opinionated text from Internet users. Standard sentiment analysis classifies the overall sentiment of a text, but it does not capture other important information, such as the entity, topic, or aspect within the text towards which the sentiment is directed. Aspect-based sentiment analysis (ABSA) is a more complex task that consists of identifying both sentiments and aspects. This paper shows the potential of using contextual word representations from the pre-trained language model BERT, together with a fine-tuning method that uses additional generated text, to solve out-of-domain ABSA and outperform previous state-of-the-art results on SemEval-2015 Task 12 subtask 2 and SemEval-2016 Task 5. To the best of our knowledge, no previous work has addressed out-of-domain ABSA for aspect classification.
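As a rough illustration of how fine-tuning BERT for aspect classification could look, the sketch below feeds a review and a candidate aspect description to BERT as a sentence pair and trains a binary classifier on top. This is a minimal sketch using the HuggingFace transformers API, not the authors' implementation; the model name, aspect string, and label scheme are assumptions for illustration only.

```python
# Minimal sketch (not the paper's code): fine-tuning BERT as a sentence-pair
# classifier, where the second "sentence" describes a candidate aspect category.
# Requires the `transformers` and `torch` packages; all names below are
# illustrative placeholders.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # aspect present vs. not present
)

review = "The waiter was rude but the pasta was excellent."
aspect = "food quality"  # hypothetical aspect-category string

# BERT receives the pair as: [CLS] review [SEP] aspect [SEP]
inputs = tokenizer(review, aspect, return_tensors="pt", truncation=True)
labels = torch.tensor([1])  # 1 = the aspect is discussed in the review

outputs = model(**inputs, labels=labels)
loss = outputs.loss   # cross-entropy loss for this training example
loss.backward()       # gradients for one fine-tuning step
```

In a full training loop, an optimizer step would follow each backward pass, and pairs for every aspect category (positive and negative) would be generated for each review.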
BERT
ASPECT-BASED SENTIMENT ANALYSIS
SENTIMENT ANALYSIS
PRE-TRAINED LANGUAGE MODEL