Article type: Research Article
Authors: He, Jia-long [a] | Zhang, Xiao-Lin [a]; * | Wang, Yong-Ping [a] | Zhang, Huan-Xiang [a] | Gao, Lu [a] | Xu, En-Hui [b]
Affiliations: [a] School of Information Engineering, Inner Mongolia University of Science and Technology, Baotou, China | [b] China Nanhu Academy of Electronics and Information Technology, Jiaxing, China
Correspondence: [*] Corresponding author. Xiao-Lin Zhang, School of Information Engineering, Inner Mongolia University of Science and Technology, Baotou, China. E-mail: [email protected].
Abstract: In recent years, contrastive learning has been highly successful in unsupervised representation learning and has attracted considerable attention in supervised tasks. In supervised settings, the discrete nature of natural language makes the construction of sample pairs difficult, and models are not robust to adversarial samples, so making contrastive learning effective for text classification while guaranteeing model robustness remains a challenge. This paper presents a contrastive adversarial learning framework built on data augmentation with labeled insertion data. Specifically, adversarial samples are generated by adding perturbations to the word-embedding matrix and serve as positive examples for contrastive learning, while external semantic information is introduced to construct negative examples. Contrastive learning improves the sensitivity and generalization ability of the model, and adversarial training improves its robustness, thereby raising classification accuracy. In addition, momentum contrast from unsupervised learning is introduced into the text classification task to increase the number of sample pairs. Experimental results on several datasets show that the proposed approach outperforms the baseline methods, and further experiments verify the effectiveness of the framework under low-resource conditions.
Keywords: Contrastive learning, adversarial training, text classification
DOI: 10.3233/JIFS-230787
Journal: Journal of Intelligent & Fuzzy Systems, vol. 45, no. 2, pp. 3473-3484, 2023
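The abstract above describes generating adversarial views of the word embeddings as contrastive positives alongside a standard classification objective. The following is a minimal illustrative sketch of that general idea in PyTorch, assuming an FGSM-style embedding perturbation, in-batch negatives, and an InfoNCE loss; the toy encoder, epsilon, temperature, and loss weighting are assumptions for illustration, not the authors' exact configuration (which also uses external semantic negatives and momentum contrast).

    # Sketch only: adversarial positives for contrastive learning on text.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class TinyTextClassifier(nn.Module):
        """Toy encoder: embedding lookup + mean pooling + linear head (assumed architecture)."""
        def __init__(self, vocab_size=1000, dim=64, num_classes=2):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, dim)
            self.head = nn.Linear(dim, num_classes)

        def forward_from_embeddings(self, emb):
            pooled = emb.mean(dim=1)             # (B, dim) sentence representation
            return pooled, self.head(pooled)     # features, class logits

    model = TinyTextClassifier()
    tokens = torch.randint(0, 1000, (8, 16))     # batch of 8 toy "sentences"
    labels = torch.randint(0, 2, (8,))

    # Clean forward pass, keeping the embedding tensor in the autograd graph.
    emb = model.embed(tokens)
    feats, logits = model.forward_from_embeddings(emb)

    # FGSM-style perturbation of the embeddings -> adversarial "positive" view.
    ce_loss = F.cross_entropy(logits, labels)
    grad, = torch.autograd.grad(ce_loss, emb, retain_graph=True)
    adv_emb = emb + 1e-2 * grad.sign()
    adv_feats, _ = model.forward_from_embeddings(adv_emb)

    # InfoNCE: each clean feature is pulled toward its adversarial counterpart
    # and pushed away from the other in-batch examples, which act as negatives.
    a = F.normalize(feats, dim=-1)
    p = F.normalize(adv_feats, dim=-1)
    sims = a @ p.t() / 0.1                       # (B, B) similarity matrix, temperature 0.1
    contrastive_loss = F.cross_entropy(sims, torch.arange(a.size(0)))

    total_loss = ce_loss + contrastive_loss      # joint objective (equal weighting assumed)
    total_loss.backward()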