Article type: Research Article
Authors: Shi, Xuefeng [a] | Hu, Min [a],* | Ren, Fuji [b],* | Shi, Piao [a]
Affiliations: [a] School of Computer and Information, Hefei University of Technology, Anhui, China | [b] School of Computer Science and Engineering, University of Electronic Science and Technology of China, Sichuan, China
Correspondence: [*] Corresponding authors: Min Hu, School of Computer and Information, Hefei University of Technology, Anhui, China. E-mail: [email protected]. Fuji Ren, School of Computer Science and Engineering, University of Electronic Science and Technology of China, Sichuan, China. E-mail: [email protected].
Abstract: Active Learning (AL) is a technique widely employed to minimize the time and labor costs of annotating data. By querying and extracting specific instances to train the model, the relevant task's performance is improved maximally within limited iterations. However, little work has been conducted to fully fuse features from different hierarchies to enhance the effectiveness of active learning. Inspired by the idea of information compensation found in many well-known deep learning models (such as ResNet), this work proposes a novel TextCNN-based Two-way Active Learning model (TCTWAL) to extract task-relevant texts. TextCNN requires little hyper-parameter tuning, works well with static vectors, and achieves excellent results on various natural language processing (NLP) tasks, which is also beneficial to human-computer interaction (HCI) and AL-related tasks. In the proposed AL model, candidate texts are measured on both global and local features by the TCTWAL framework, which is built on a modified TextCNN. In addition, the query strategy is strongly enhanced by maximum normalized log-probability (MNLP), which is sensitive to longer sentences. As a result, the selected instances are characterized by general global information and abundant local features simultaneously. To validate the effectiveness of the proposed model, extensive experiments are conducted on three widely used text corpora, and the results are compared with those of eight manually designed instance query strategies. The results show that our method outperforms the compared baselines in terms of accuracy, macro precision, macro recall, and macro F1 score. In particular, for the classification results on the AG's News corpus, the improvements in the four indicators after 39 iterations are 40.50%, 45.25%, 48.91%, and 45.25%, respectively.
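The abstract does not spell out the exact form of the MNLP criterion, but in the active learning literature it is usually the sum of a candidate's token log-probabilities divided by its length, so that longer sentences are not penalized merely for having more terms in the sum; the pool items with the lowest normalized score (least confidence) are then sent for annotation. The sketch below illustrates that selection rule under these assumptions; the function names and input format are illustrative and are not taken from the paper's implementation.

```python
import numpy as np

def mnlp_scores(token_logprobs_per_text):
    """Length-normalized log-probability for each unlabeled candidate.

    token_logprobs_per_text: list of 1-D arrays, each holding the model's
    log-probabilities for the tokens (or labels) of one candidate text.
    Dividing by length puts short and long texts on an equal footing.
    """
    return np.array(
        [np.asarray(lp).sum() / len(lp) for lp in token_logprobs_per_text]
    )

def select_for_annotation(token_logprobs_per_text, budget):
    """Return indices of the `budget` least-confident candidates,
    i.e. those with the lowest normalized log-probability."""
    scores = mnlp_scores(token_logprobs_per_text)
    return np.argsort(scores)[:budget]
```

In a pool-based AL loop, the selected indices would be labeled, moved from the unlabeled pool to the training set, and the classifier retrained before the next query round.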
Keywords: Active learning, TextCNN, maximum normalized log-probability, global information, local feature
DOI: 10.3233/IDA-230332
Journal: Intelligent Data Analysis, vol. 28, no. 5, pp. 1189-1211, 2024