Modeling public mood and emotion: Stock market trend prediction with anticipatory computing approach

Mu Yen Chen, Chien Hsiang Liao, Ren Pao Hsieh

Research output: Contribution to journal › Article › peer-review

52 Citations (Scopus)

Abstract

As science and technology continue to advance, digital media on the Internet, such as articles, commentary, videos, and animations, are becoming increasingly important. English semantic analysis rests on a number of basic technologies, and many applications have gradually grown out of them. In contrast, the basic technologies of Chinese semantic analysis have not yet been uniformly or completely organized. Chinese semantic analysis is more difficult than English semantic analysis because the true meaning of Chinese words and sentences is hard to judge. This study collects articles related to individual stocks from popular news sites in Taiwan. After the data are preprocessed, each word is converted into a word feature vector using Word2Vec with the Skip-gram model, and a lexicon stores the words most relevant to each keyword. In the prediction stage, this study calculates the impact of new articles on the stock price according to the fully trained lexicon. Finally, this study uses a deep learning approach, LSTM (Long Short-Term Memory), to evaluate the final results. The aim of this study is to adopt anticipatory computing to explore public mood and emotion in news articles, predict future stock market trends, and provide a reference model for related industries.
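The abstract outlines a pipeline of Skip-gram Word2Vec features, a keyword lexicon, and an LSTM forecaster, but gives no implementation details. The sketch below is a minimal, hypothetical illustration of such a pipeline in Python using gensim and Keras; the placeholder corpus, lexicon size, impact score, and network shape are assumptions for illustration, not the authors' actual configuration.

```python
# Minimal sketch of the pipeline described in the abstract:
# Skip-gram Word2Vec features -> keyword lexicon -> LSTM trend model.
# All parameters (vector size, lexicon size, window lengths) are illustrative.
import numpy as np
from gensim.models import Word2Vec
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

# 1. Train Skip-gram Word2Vec (sg=1) on tokenized news articles (placeholder corpus).
tokenized_articles = [["台積電", "營收", "成長"], ["股價", "下跌", "利空"]]
w2v = Word2Vec(tokenized_articles, vector_size=100, window=5, sg=1, min_count=1)

# 2. Build a lexicon: the words most similar to a seed keyword (e.g. a stock name).
def build_lexicon(model, keyword, topn=50):
    return [word for word, _ in model.wv.most_similar(keyword, topn=topn)]

lexicon = build_lexicon(w2v, "股價", topn=5)

# 3. Score a new article by how strongly it overlaps with the lexicon
#    (a stand-in for the paper's article-impact calculation).
def article_impact(tokens, lexicon):
    hits = sum(1 for t in tokens if t in lexicon)
    return hits / max(len(tokens), 1)

# 4. Feed daily sequences of impact scores and prices into an LSTM
#    to predict the next-day trend (1 = up, 0 = down).
def build_lstm(timesteps, n_features):
    model = Sequential([
        LSTM(32, input_shape=(timesteps, n_features)),
        Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model

# Example shapes: 5-day windows of [impact_score, closing_price] pairs.
X = np.random.rand(200, 5, 2)          # placeholder features
y = np.random.randint(0, 2, size=200)  # placeholder up/down labels
lstm = build_lstm(timesteps=5, n_features=2)
lstm.fit(X, y, epochs=3, batch_size=16, verbose=0)
```

In practice the placeholder corpus and random arrays would be replaced by the tokenized Taiwanese news articles and historical stock prices described in the abstract.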

Original language: English
Pages (from-to): 402-408
Number of pages: 7
Journal: Computers in Human Behavior
Volume: 101
Publication status: Published - December 2019

All Science Journal Classification (ASJC) codes

  • Arts and Humanities (miscellaneous)
  • Human-Computer Interaction
  • General Psychology
