TY - GEN
T1 - Rethinking Relation between Model Stacking and Recurrent Neural Networks for Social Media Prediction
AU - Hsu, Chih Chung
AU - Tseng, Wen Hai
AU - Yang, Hao Ting
AU - Lin, Chia Hsiang
AU - Kao, Chi Hung
N1 - Funding Information:
This study was supported in part by the Ministry of Science and Technology (MOST), Taiwan, under Grants MOST 107-2218-E-020-002-MY3, 109-2637-H-020-002, 109-2218-E-006-032, and 109-2634-F-007-013; partly by the Young Scholar Fellowship Program (EINSTEIN Program) of MOST under Grant MOST 108-2636-E-006-012; and partly by the Higher Education Sprout Project of the Ministry of Education (MOE) to the Headquarters of University Advancement at National Cheng Kung University (NCKU). We thank the SMP Challenge organizers for providing the large-scale social media dataset.
Publisher Copyright:
© 2020 ACM.
PY - 2020/10/12
Y1 - 2020/10/12
N2 - Popularity prediction of social posts is one of the most critical issues in social media analysis and understanding. In this paper, we discover a more dominant feature representation of text information and propose a single ensemble learning model to obtain popularity scores for the social media prediction challenge. Most social media prediction techniques predict the popularity score of social posts with a single model, such as a deep learning-based or ensemble learning-based approach. However, it is well known that model stacking is a more effective strategy for boosting performance on various regression tasks. In this paper, we also show that model stacking can be formulated as a simple recurrent neural network problem with comparable performance in predicting popularity scores. First, a single strong baseline is proposed based on a deep neural network with a prediction branch. Then, partial feature maps of the last layer of our strong baseline are used to establish a new branch with an isolated predictor. Multiple predictions are easily obtained by repeating these two steps. These preliminary predicted scores then form the input of a recurrent unit that learns the final predicted scores, called the Recurrent Stacking Model (RSM). Our experiments show that the proposed ensemble learning approach outperforms other state-of-the-art methods. Furthermore, the proposed RSM is superior to our ensemble learning approach, verifying that the model stacking problem can be transformed into the training problem of a recurrent neural network.
AB - Popularity prediction of social posts is one of the most critical issues in social media analysis and understanding. In this paper, we discover a more dominant feature representation of text information and propose a single ensemble learning model to obtain popularity scores for the social media prediction challenge. Most social media prediction techniques predict the popularity score of social posts with a single model, such as a deep learning-based or ensemble learning-based approach. However, it is well known that model stacking is a more effective strategy for boosting performance on various regression tasks. In this paper, we also show that model stacking can be formulated as a simple recurrent neural network problem with comparable performance in predicting popularity scores. First, a single strong baseline is proposed based on a deep neural network with a prediction branch. Then, partial feature maps of the last layer of our strong baseline are used to establish a new branch with an isolated predictor. Multiple predictions are easily obtained by repeating these two steps. These preliminary predicted scores then form the input of a recurrent unit that learns the final predicted scores, called the Recurrent Stacking Model (RSM). Our experiments show that the proposed ensemble learning approach outperforms other state-of-the-art methods. Furthermore, the proposed RSM is superior to our ensemble learning approach, verifying that the model stacking problem can be transformed into the training problem of a recurrent neural network.
UR - http://www.scopus.com/inward/record.url?scp=85106995090&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85106995090&partnerID=8YFLogxK
U2 - 10.1145/3394171.3417332
DO - 10.1145/3394171.3417332
M3 - Conference contribution
AN - SCOPUS:85106995090
T3 - MM 2020 - Proceedings of the 28th ACM International Conference on Multimedia
SP - 4585
EP - 4589
BT - MM 2020 - Proceedings of the 28th ACM International Conference on Multimedia
PB - Association for Computing Machinery, Inc
T2 - 28th ACM International Conference on Multimedia, MM 2020
Y2 - 12 October 2020 through 16 October 2020
ER -