Wang2020TextNAS.md

Title

TextNAS: A Neural Architecture Search Space Tailored for Text Representation

Author

Yujing Wang, Yaming Yang, Yiren Chen, Jing Bai, Ce Zhang, Guinan Su, Xiaoyu Kou, Yunhai Tong, Mao Yang, Lidong Zhou

Abstract

Learning text representation is crucial for text classification and other language-related tasks. There is a diverse set of text representation networks in the literature, and finding the optimal one is a non-trivial problem. Recently, emerging Neural Architecture Search (NAS) techniques have demonstrated good potential to solve this problem. Nevertheless, most existing works on NAS focus on the search algorithms and pay little attention to the search space. In this paper, we argue that the search space is also an important human prior for the success of NAS in different applications. Thus, we propose a novel search space tailored for text representation. Through automatic search, the discovered network architecture outperforms state-of-the-art models on various public datasets for text classification and natural language inference tasks. Furthermore, some of the design principles found in the automatically discovered network agree well with human intuition.

Bib

@article{Wang_Yang_Chen_Bai_Zhang_Su_Kou_Tong_Yang_Zhou_2020,
  title   = {TextNAS: A Neural Architecture Search Space Tailored for Text Representation},
  author  = {Wang, Yujing and Yang, Yaming and Chen, Yiren and Bai, Jing and Zhang, Ce and Su, Guinan and Kou, Xiaoyu and Tong, Yunhai and Yang, Mao and Zhou, Lidong},
  journal = {Proceedings of the AAAI Conference on Artificial Intelligence},
  volume  = {34},
  number  = {05},
  pages   = {9242--9249},
  year    = {2020},
  month   = apr,
  url     = {https://ojs.aaai.org/index.php/AAAI/article/view/6462},
  doi     = {10.1609/aaai.v34i05.6462}
}