Attention Network Block Module Drives Deep Neural Network and Their Applications

  • 王 麒詳

Student thesis: Doctoral Thesis


In recent years, deep neural networks (DNNs) have been successfully applied to many different tasks, such as computer vision (CV), natural language processing (NLP), and recommender systems (RS). Consequently, many studies focus on increasing model performance in practice. In general, three approaches can improve a DNN: (1) deeper and wider network architectures, (2) automatic architecture search, and (3) convolutional attention blocks. The attention network block module is a flexible and lower-cost approach; it enables the model to extract more efficient features. We therefore designed novel attention network block modules to improve models on RS and CV tasks. For RS, we developed two attention network modules, FuzzAttention and Topic Diversity Discovering (TDD), to improve the user preference representation and thus increase recommendation quality. For CV, we designed Uncertainty Attention (UA), an attention network that discovers potential information in the uncertain regions of feature maps. We evaluated the designed attention network blocks in combination with existing models on RS and CV tasks. In the experimental results, our attention modules outperformed the original models and other attention approaches. These results show that the attention network is an efficient way to increase a model's performance.
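To illustrate the general idea of a convolutional attention block described above, the following is a minimal sketch of a channel-attention ("squeeze-and-excitation" style) block in NumPy. This is an illustrative example only, not the thesis's FuzzAttention, TDD, or UA modules; the function name, weight shapes, and reduction ratio are assumptions for the sketch.

```python
import numpy as np

def channel_attention(feature_map, w1, w2):
    """Generic channel-attention block (illustrative sketch).

    feature_map: array of shape (C, H, W)
    w1: reduction weights, shape (C, C // r)
    w2: expansion weights, shape (C // r, C)
    """
    # Squeeze: global average pool each channel to one descriptor
    z = feature_map.mean(axis=(1, 2))               # shape (C,)
    # Excite: a small bottleneck produces per-channel weights in (0, 1)
    hidden = np.maximum(z @ w1, 0.0)                # ReLU, shape (C // r,)
    scale = 1.0 / (1.0 + np.exp(-(hidden @ w2)))    # sigmoid, shape (C,)
    # Recalibrate: rescale each channel of the original feature map,
    # so informative channels are emphasized and others suppressed
    return feature_map * scale[:, None, None]
```

Because the block only reweights existing channels, it can be inserted after any convolutional stage of an existing model at low extra cost, which is the property the abstract highlights.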
Date of Award: 2020
Original language: English
Supervisor: Jung-Hsien Chiang (Supervisor)