DDNAS: Discretized Differentiable Neural Architecture Search for Text Classification

Kuan Chun Chen, Cheng Te Li, Kuo Jung Lee

Research output: Article › peer-review

1 Citation (Scopus)

Abstract

Neural Architecture Search (NAS) has shown promising capability in learning text representation. However, existing text-based NAS neither performs a learnable fusion of neural operations to optimize the architecture nor encodes the latent hierarchical categorization behind text input. This article presents a novel NAS method, Discretized Differentiable Neural Architecture Search (DDNAS), for text representation learning and classification. With the continuous relaxation of architecture representation, DDNAS can use gradient descent to optimize the search. We also propose a novel discretization layer via mutual information maximization, which is imposed on every search node to model the latent hierarchical categorization in text representation. Extensive experiments conducted on eight diverse real datasets exhibit that DDNAS can consistently outperform the state-of-the-art NAS methods. While DDNAS relies on only three basic operations, i.e., convolution, pooling, and none, to be the candidates of NAS building blocks, its promising performance is noticeable and extensible to obtain further improvement by adding more different operations.
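The continuous relaxation described in the abstract can be illustrated with a minimal sketch: each candidate operation (convolution, pooling, none) is mixed by a softmax over learnable architecture parameters, and the searched architecture is later discretized by keeping the highest-weighted operation. This is a generic DARTS-style illustration, not DDNAS's exact implementation; all function names and the 1-D input shape are assumptions.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over architecture parameters
    e = np.exp(x - x.max())
    return e / e.sum()

def conv_op(x):
    # illustrative 1-D convolution: averaging kernel of width 3
    kernel = np.ones(3) / 3.0
    return np.convolve(x, kernel, mode="same")

def pool_op(x):
    # illustrative max pooling, window 3, stride 1, edge padding
    padded = np.pad(x, 1, mode="edge")
    return np.array([padded[i:i + 3].max() for i in range(len(x))])

def none_op(x):
    # "none" removes the edge's contribution entirely
    return np.zeros_like(x)

OPS = [conv_op, pool_op, none_op]

def mixed_op(x, alpha):
    """Continuous relaxation: softmax-weighted mixture of candidate ops."""
    weights = softmax(alpha)
    return sum(w * op(x) for w, op in zip(weights, OPS))

def discretize(alpha):
    """After search, keep only the operation with the largest weight."""
    return OPS[int(np.argmax(alpha))]
```

Because `mixed_op` is differentiable in `alpha`, the architecture parameters can be optimized jointly with the network weights by gradient descent, which is what makes the search differentiable.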

Original language: English
Article number: 88
Journal: ACM Transactions on Intelligent Systems and Technology
Volume: 14
Issue number: 5
DOIs
Publication status: Published - Oct 3 2023

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Artificial Intelligence

