MAFD: A Federated Distillation Approach with Multi-head Attention for Recommendation Tasks

Aming Wu, Young Woo Kwon

Research output: Chapter in Book/Report/Conference proceeding · Conference contribution · peer-review

Abstract

The key challenges that recommendation systems must overcome are data isolation and privacy protection. Federated learning can efficiently train global models on decentralized data while preserving privacy. In real-world applications, however, it is difficult to achieve high prediction accuracy due to device heterogeneity, data scarcity, and the limited generalization capacity of models. In this research, we introduce a personalized federated knowledge distillation model based on a multi-head attention mechanism for recommendation systems. Specifically, we first employ federated distillation to improve the performance of student models and introduce a multi-head attention mechanism to enrich user encoding information. Next, we incorporate the Wasserstein distance into the combined distillation objective to reduce the distribution gap between teacher and student networks, and we use an adaptive learning rate technique to accelerate convergence. We show that the proposed approach achieves better effectiveness and robustness in benchmark experiments.
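The exact objective is defined in the paper; as a rough illustration of the idea described above, a per-example distillation loss that adds a Wasserstein penalty on the teacher-student output gap might look like the following sketch. The function names, the weighting hyperparameters `alpha` and `beta`, the temperature value, and the use of the closed-form 1-D Wasserstein-1 distance are illustrative assumptions, not the authors' implementation.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def wasserstein_1d(p, q):
    """Closed-form Wasserstein-1 distance between two 1-D empirical
    distributions: mean absolute difference of the sorted samples."""
    return sum(abs(a - b) for a, b in zip(sorted(p), sorted(q))) / len(p)

def distillation_loss(student_logits, teacher_logits, label_index,
                      alpha=0.5, temperature=2.0, beta=0.1):
    """Hypothetical combined objective: hard-label cross-entropy,
    temperature-softened teacher matching, and a Wasserstein term
    penalizing the teacher-student distribution gap."""
    # Hard-label cross-entropy on the student's own predictions.
    hard = -math.log(softmax(student_logits)[label_index] + 1e-12)
    # Soft-label term: KL(teacher || student) at temperature T.
    s = softmax(student_logits, temperature)
    t = softmax(teacher_logits, temperature)
    soft = sum(ti * (math.log(ti + 1e-12) - math.log(si + 1e-12))
               for ti, si in zip(t, s))
    # Wasserstein penalty on the gap between output distributions.
    gap = wasserstein_1d(s, t)
    return (1 - alpha) * hard + alpha * (temperature ** 2) * soft + beta * gap
```

All three terms are non-negative, so a smaller loss simultaneously means better label fit and a student output distribution closer to the teacher's.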

Original language: English
Title of host publication: Proceedings of the 38th ACM/SIGAPP Symposium on Applied Computing, SAC 2023
Publisher: Association for Computing Machinery
Pages: 1221-1224
Number of pages: 4
ISBN (Electronic): 9781450395175
State: Published - 27 Mar 2023
Event: 38th Annual ACM Symposium on Applied Computing, SAC 2023 - Tallinn, Estonia
Duration: 27 Mar 2023 - 31 Mar 2023

Publication series

Name: Proceedings of the ACM Symposium on Applied Computing

Conference

Conference: 38th Annual ACM Symposium on Applied Computing, SAC 2023
Country/Territory: Estonia
City: Tallinn
Period: 27/03/23 - 31/03/23

Keywords

  • federated learning
  • multi-head attention
  • recommendation systems
  • Wasserstein distance
