Dynamic Neural Networks: Apply Neural Mutual Information for Text Classification

  • 陳 冠君

Student thesis: Doctoral Thesis

Abstract

Learning text representations is important for text classification, text generation, and other Natural Language Processing (NLP) tasks. Recently, diverse model structures have been proposed to learn text representations, but such manually designed models admit a practically infinite number of combinations, so it is unclear which structure is optimal. Neural Architecture Search (NAS) techniques were developed to address this problem. However, most NAS techniques aim to achieve high classification accuracy on a dataset rather than focusing on learning input representations, which can themselves benefit classification accuracy. Hence, we compute the mutual information between the input representation and the output representation of each layer of the neural network and maximize it; by maximizing this mutual information, we learn better text representations. We propose a method that applies NAS to search the model structure and uses mutual information as the objective function for text classification. Our method outperforms other models on text classification and achieves state-of-the-art results in the scarce-data setting.
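The abstract does not specify how the mutual information between a layer's input and output representations is estimated or maximized. Below is a minimal sketch of one common approach, an InfoNCE-style lower bound on mutual information with a bilinear critic; the critic, the estimator choice, and all names in the code are illustrative assumptions, not the thesis's exact objective.

```python
# Sketch: maximize an InfoNCE lower bound on I(X; Y) between a layer's
# input representations X and its output representations Y.
# The bilinear critic and all hyperparameters here are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class InfoNCECritic(nn.Module):
    """Bilinear critic scoring (input, output) representation pairs."""
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.W = nn.Parameter(torch.randn(in_dim, out_dim) * 0.01)

    def forward(self, x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
        # scores[i, j] = x_i^T W y_j; diagonal entries are positive pairs.
        return x @ self.W @ y.t()

def mi_lower_bound(scores: torch.Tensor) -> torch.Tensor:
    """InfoNCE lower bound on I(X; Y) from a batch of pairwise scores."""
    batch = scores.size(0)
    labels = torch.arange(batch)
    # Cross-entropy against the diagonal: each x_i should match its own y_i.
    nce = F.cross_entropy(scores, labels)
    return torch.log(torch.tensor(float(batch))) - nce

# Toy usage: push one layer to preserve information about its input.
layer = nn.Linear(128, 64)
critic = InfoNCECritic(128, 64)
opt = torch.optim.Adam(list(layer.parameters()) + list(critic.parameters()), lr=1e-3)

x = torch.randn(32, 128)              # stand-in for input text representations
y = layer(x)                          # the layer's output representations
loss = -mi_lower_bound(critic(x, y))  # negate: maximizing MI = minimizing -bound
loss.backward()
opt.step()
```

In a NAS setting such as the one the abstract describes, a bound of this kind could serve as (part of) the objective scored for each candidate layer or architecture, though the thesis's actual search procedure and loss composition are not given here.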
Date of Award: 2020
Original language: English
Supervisor: Kuo-Jung Lee
