Issue title: Special section: Selected papers of LKE 2019
Guest editors: David Pinto, Vivek Singh and Fernando Perez
Article type: Research Article
Authors: Kolesnikova, Olga [a],* | Gelbukh, Alexander [b]
Affiliations: [a] Superior School of Computer Sciences, Instituto Politécnico Nacional, Mexico City, Mexico | [b] Center for Computing Research, Instituto Politécnico Nacional, Mexico City, Mexico
Correspondence: [*] Corresponding author. O. Kolesnikova, Superior School of Computer Sciences, Instituto Politécnico Nacional, 07738 Mexico City, Mexico. E-mail: [email protected].
Abstract: In this work, we report the results of our experiments on distinguishing the semantics of verb-noun collocations in a Spanish corpus. The semantics was represented by four lexical functions of Meaning-Text Theory; each lexical function specifies a universal semantic concept found in any natural language. Knowledge of collocations and their semantic content is important for natural language processing, because collocations encode restrictions on how words can be combined. We experimented with word2vec embeddings and six supervised machine learning methods commonly used across a wide range of natural language processing tasks. Our objective was to study the ability of word2vec embeddings to represent the context of collocations in a way that discriminates among lexical functions. Unlike previous work with word embeddings, we trained word2vec on a lemmatized corpus with stopwords removed, on the assumption that the resulting vectors would capture a more accurate semantic characterization. The experiments were performed on a collection of 1,131 Excelsior newspaper issues. The experimental results showed that the word2vec representation of collocations outperformed the classical bag-of-words context representation implemented in a vector space model and fed into the same supervised learning methods.
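A minimal sketch of the pipeline outlined in the abstract, not the authors' code: it assumes gensim (>= 4.0) for word2vec, scikit-learn for one of the supervised classifiers, a toy lemmatized, stopword-free corpus in place of the Excelsior data, and a simple vector-averaging scheme for representing a collocation and its context; the example collocations and lexical-function labels are hypothetical.

```python
import numpy as np
from gensim.models import Word2Vec
from sklearn.svm import SVC

# Toy lemmatized Spanish sentences with stopwords already removed.
corpus = [
    ["dar", "paso", "importante", "gobierno"],
    ["tomar", "decisión", "rápido", "empresa"],
    ["hacer", "pregunta", "difícil", "periodista"],
]

# Train word2vec on the lemmatized corpus (gensim 4.x parameter names assumed).
w2v = Word2Vec(corpus, vector_size=50, window=3, min_count=1, epochs=50)

def collocation_vector(verb, noun, context):
    """Represent a verb-noun collocation and its context by averaging
    the word2vec vectors of all lemmas present in the vocabulary
    (an assumed representation scheme, for illustration only)."""
    words = [w for w in [verb, noun, *context] if w in w2v.wv]
    return np.mean([w2v.wv[w] for w in words], axis=0)

# Hypothetical labelled examples: (verb, noun, context lemmas, lexical function).
train = [
    ("dar", "paso", ["importante", "gobierno"], "Oper1"),
    ("tomar", "decisión", ["rápido", "empresa"], "Oper1"),
    ("hacer", "pregunta", ["difícil", "periodista"], "Oper2"),
]
X = np.vstack([collocation_vector(v, n, c) for v, n, c, _ in train])
y = [lf for *_, lf in train]

# One of several supervised methods one might plug in at this stage.
clf = SVC(kernel="linear").fit(X, y)
print(clf.predict(X[:1]))
```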
Keywords: Word embeddings, word2vec, supervised machine learning, lexical function, Meaning-Text Theory
DOI: 10.3233/JIFS-179866
Journal: Journal of Intelligent & Fuzzy Systems, vol. 39, no. 2, pp. 1993-2001, 2020