Article type: Research Article
Authors: Gałka, Jakub* | Jaciów, Paweł
Affiliations: AGH University of Science and Technology, Department of Electronics, Kraków, Poland
Correspondence: [*] Corresponding author. Jakub Gałka, AGH University of Science and Technology, Department of Electronics, Kraków, Poland. E-mail: [email protected].
Abstract: In this paper, we propose a new method for jointly optimizing accuracy and computational efficiency in i-vector speaker identification using a fuzzy cluster tree. We introduce the design assumptions and an algorithm for constructing this new type of fuzzy i-vector tree. The solution was evaluated on the NIST 2014 i-Vector Speaker Recognition Machine Learning Challenge dataset. For the discussed tree configurations, a 15% relative equal error rate reduction and a 74% reduction in computation time were achieved compared to the baseline, at the cost of only a 5.5% relative identification rate loss.
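The abstract describes pruning the i-vector identification search with a fuzzy cluster tree. As a rough illustration of the general idea only, and not the authors' algorithm, the hypothetical Python sketch below builds a single level of k-means centroids with soft, overlapping speaker assignments, then scores a probe i-vector against only the speakers in its closest branches. All names (`FuzzyIVectorTree`, `overlap`, `beam`) and the clustering and scoring choices (k-means, cosine similarity) are illustrative assumptions.

```python
import numpy as np

def cosine_score(a, b):
    """Cosine similarity between two i-vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

class FuzzyIVectorTree:
    """Hypothetical one-level fuzzy cluster tree over enrolled i-vectors.

    Each enrolled speaker is assigned to its `overlap` nearest clusters
    (a soft, 'fuzzy' assignment), so speakers near cluster boundaries are
    not lost when the search is pruned to the best-matching branches.
    """

    def __init__(self, n_clusters=8, overlap=2):
        self.n_clusters = n_clusters
        self.overlap = overlap

    def fit(self, speaker_ids, ivectors):
        # Plain k-means for the top-level centroids (illustrative only).
        X = np.asarray(ivectors, dtype=float)
        rng = np.random.default_rng(0)
        self.centroids = X[rng.choice(len(X), self.n_clusters, replace=False)]
        for _ in range(20):
            d = np.linalg.norm(X[:, None] - self.centroids[None], axis=2)
            labels = d.argmin(axis=1)
            for k in range(self.n_clusters):
                if np.any(labels == k):
                    self.centroids[k] = X[labels == k].mean(axis=0)
        # Soft assignment: each speaker is stored in `overlap` buckets.
        self.buckets = {k: [] for k in range(self.n_clusters)}
        for sid, x in zip(speaker_ids, X):
            d = np.linalg.norm(self.centroids - x, axis=1)
            for k in np.argsort(d)[: self.overlap]:
                self.buckets[k].append((sid, x))
        return self

    def identify(self, probe, beam=2):
        """Score the probe only against speakers in the `beam` closest clusters."""
        probe = np.asarray(probe, dtype=float)
        d = np.linalg.norm(self.centroids - probe, axis=1)
        best_sid, best_score, seen = None, -np.inf, set()
        for k in np.argsort(d)[:beam]:
            for sid, x in self.buckets[k]:
                if sid in seen:
                    continue
                seen.add(sid)
                s = cosine_score(probe, x)
                if s > best_score:
                    best_sid, best_score = sid, s
        return best_sid, best_score

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    enrolled = rng.normal(size=(100, 400))  # 100 speakers, 400-dim i-vectors
    tree = FuzzyIVectorTree(n_clusters=8, overlap=2).fit(range(100), enrolled)
    sid, score = tree.identify(enrolled[42] + 0.1 * rng.normal(size=400))
    print(sid, round(score, 3))
```

In this toy setup the cost per probe drops roughly in proportion to the fraction of buckets searched, which is the trade-off the abstract quantifies; the paper's actual construction and scoring back-end should be taken from the article itself.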
Keywords: speaker recognition, fast identification, i-vector, fuzzy clustering, decision trees
DOI: 10.3233/JIFS-181359
Journal: Journal of Intelligent & Fuzzy Systems, vol. 37, no. 4, pp. 4937-4949, 2019