Article type: Research Article
Authors: Demertzis, Konstantinos [a],* | Iliadis, Lazaros [b] | Kikiras, Panagiotis [c] | Pimenidis, Elias [d]
Affiliations: [a] School of Science and Technology, Informatics Studies, Hellenic Open University, Greece | [b] School of Civil Engineering, Democritus University of Thrace, Kimmeria, Xanthi, Greece | [c] Department of Electrical and Computer Engineering, University of Thessaly, Volos, Greece | [d] Computer Science and Creative Technologies, University of the West of England, Bristol, UK
Correspondence: [*] Corresponding author: Konstantinos Demertzis, School of Science and Technology, Informatics Studies, Hellenic Open University, Greece. E-mail: [email protected].
Abstract: Training a model with batch learning requires that all data be stored in a single central repository. This approach is intrusive, as users must expose their privacy and hand sensitive data over to central entities for preprocessing. In contrast to this centralized approach, intelligent models can be trained on decentralized data through the federated learning (FEDL) mechanism. This process keeps privacy and the protection of sensitive information under the control of the user or organization, while a single universal model is employed for all users. That universal model is produced by applying average aggregation methods to the collective training updates, which raises serious concerns about the effectiveness of the universal approach and, therefore, about the validity of FEDL architectures in general: averaging flattens the unique needs of individual users without considering the local events each of them has to manage. This paper proposes an innovative hybrid explainable semi-personalized federated learning model that utilizes Shapley Values and Lipschitz Constant techniques to create personalized intelligent models, based on the needs and events that each individual user is required to address in a federated format. Explanations are the set of characteristics of the interpretable system that, for a given example, contributed to reaching a conclusion, and they describe the operation of the model at both the local and global levels. Retraining is suggested only for those features whose degree of change is considered important enough for the evolution of the model's functionality. (A brief illustrative sketch of these aggregation and selective-retraining ideas follows the citation details below.)
Keywords: Decentralized learning, federated learning, privacy-preserving architecture, explainable AI, local and global interpretability, Shapley values, Lipschitz constant
DOI: 10.3233/ICA-220683
Journal: Integrated Computer-Aided Engineering, vol. 29, no. 4, pp. 335-350, 2022
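The following is a minimal sketch, not the authors' implementation, of the two mechanisms referred to in the abstract: FedAvg-style weighted averaging of client parameters, and a selective-retraining rule that flags only the features whose attribution (for example, a Shapley-value importance score computed on local data) has shifted beyond a threshold between rounds. The function names, the threshold value, and the toy data are illustrative assumptions.

import numpy as np

def fed_avg(client_weights, client_sizes):
    # Weighted average of client parameter vectors (classic FedAvg aggregation).
    sizes = np.asarray(client_sizes, dtype=float)
    stacked = np.stack(client_weights)              # shape: (n_clients, n_params)
    return (stacked * sizes[:, None]).sum(axis=0) / sizes.sum()

def select_features_to_retrain(prev_importance, curr_importance, threshold=0.1):
    # Indices of features whose importance shifted by more than `threshold`.
    # prev/curr could hold per-feature Shapley values aggregated over local data;
    # the threshold is a tunable assumption.
    delta = np.abs(np.asarray(curr_importance) - np.asarray(prev_importance))
    return np.flatnonzero(delta > threshold)

# Toy usage: three clients with different local data volumes.
clients = [np.array([0.2, 1.1, -0.5]),
           np.array([0.4, 0.9, -0.3]),
           np.array([0.1, 1.3, -0.6])]
sizes = [100, 250, 50]
print("global weights:", fed_avg(clients, sizes))

prev_phi = np.array([0.30, 0.05, 0.40])   # last round's per-feature attributions
curr_phi = np.array([0.55, 0.06, 0.41])   # this round's attributions
print("features to retrain:", select_features_to_retrain(prev_phi, curr_phi))

In this sketch only feature 0 would be flagged for retraining, since its attribution changed by 0.25 while the others moved by at most 0.01.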