Visual weather property prediction by multi-task learning and two-dimensional RNNs

Wei Ta Chu, Yu Hsuan Liang, Kai Chia Ho

Research output: Article, peer-reviewed

Abstract

We employ convolutional neural networks to extract visual features and develop recurrent neural networks for weather property estimation using only image data. Four common weather properties are estimated: temperature, humidity, visibility, and wind speed. Building on the success of previous work on temperature prediction, we extend it in two respects. First, exploiting the effectiveness of deep multi-task learning, we jointly estimate the four weather properties from the same visual information. Second, we observe that weather property estimation considering temporal evolution can be conducted from two perspectives, day-wise or hour-wise, and propose a two-dimensional recurrent neural network to unify them. In the evaluation, we show that better prediction accuracy is obtained compared to state-of-the-art models. We believe the proposed approach is the first visual weather property estimation model trained with multi-task learning.
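The two ideas in the abstract, shared visual features feeding one regression head per weather property, and a recurrence that runs over both a day axis and an hour axis, can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the shapes, the tanh update rule, and the function names `multi_task_heads` and `rnn_2d` are assumptions made for exposition.

```python
import numpy as np

rng = np.random.default_rng(0)

def multi_task_heads(features, head_weights):
    """Multi-task learning sketch: the same visual feature vectors are
    mapped by a separate linear head to each weather property."""
    return {name: features @ w for name, w in head_weights.items()}

def rnn_2d(grid, Wd, Wh, Wx, b):
    """Two-dimensional RNN sketch over a (days, hours, feat) grid.

    The hidden state at cell (d, t) combines the state from the same hour
    on the previous day (day-wise context) and the previous hour of the
    same day (hour-wise context):
        h[d, t] = tanh(Wd @ h[d-1, t] + Wh @ h[d, t-1] + Wx @ x[d, t] + b)
    """
    D, T, _ = grid.shape
    H = b.shape[0]
    h = np.zeros((D, T, H))
    for d in range(D):
        for t in range(T):
            prev_day = h[d - 1, t] if d > 0 else np.zeros(H)
            prev_hour = h[d, t - 1] if t > 0 else np.zeros(H)
            h[d, t] = np.tanh(Wd @ prev_day + Wh @ prev_hour + Wx @ grid[d, t] + b)
    return h
```

In use, CNN features for each image would populate the day-by-hour grid, the 2D recurrence would produce a hidden state per cell, and the four heads would read each property off that shared state, so one forward pass yields temperature, humidity, visibility, and wind speed jointly.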

Original language: English
Article number: 584
Journal: Atmosphere
Volume: 12
Issue number: 5
DOIs
Publication status: Published - 2021

All Science Journal Classification (ASJC) codes

  • Environmental Science (miscellaneous)

