
0x00 Main Challenges in FL

  • device heterogeneity in terms of storage, computation, and communication capabilities;
  • data heterogeneity arising due to non-IID distribution of data;
  • model heterogeneity arising from situations where different clients need models specifically customized to their environment.

0x01 Personalization

Two steps:

  • In the first step, a global model is built in a collaborative fashion.

  • In the second step, the global model is personalized for each client using the client’s private data.
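The two steps above can be sketched with a toy linear-regression setup; this is a minimal illustration, not any specific algorithm from the literature, and all client counts, sizes, and learning rates are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 3 clients, each holding a small private dataset for
# linear regression; all dimensions and learning rates are illustrative.
true_w = rng.normal(size=5)
clients = []
for _ in range(3):
    X = rng.normal(size=(20, 5))
    clients.append((X, X @ true_w + rng.normal(scale=0.1, size=20)))

def local_sgd(w, X, y, lr=0.01, steps=50):
    """A few steps of full-batch gradient descent on squared error."""
    for _ in range(steps):
        w = w - lr * 2 * X.T @ (X @ w - y) / len(y)
    return w

# Step 1: build a global model collaboratively (FedAvg-style rounds).
w_global = np.zeros(5)
for _ in range(10):
    updates = [local_sgd(w_global, X, y) for X, y in clients]
    w_global = np.mean(updates, axis=0)   # server averages client models

# Step 2: each client personalizes the global model on its private data.
personalized = [local_sgd(w_global, X, y) for X, y in clients]
```

Step 2 is just further local training starting from `w_global`; the point is that the global model supplies a strong initialization for clients with little data.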

To make federated learning personalization useful in practice, the following three objectives must all be addressed simultaneously, rather than independently:

(1) developing improved personalized models that benefit a large majority of clients;
(2) developing an accurate global model that benefits clients who have limited private data for personalization;
(3) attaining fast model convergence in a small number of training rounds.

0x02 Techniques

A. Adding User Context

To enable personalization, the global model must be highly adaptable, so that it can readily incorporate each user's contextual information.

However, most public datasets contain no context information!

This makes experimental study harder, and building new datasets from scratch would be very expensive.

B. Transfer Learning

Fine-tune the parameters of the model's top layers on local data.

Here the top layers are those closest to the output.
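A minimal sketch of this idea, assuming a toy two-layer model where the first layer (`W1`) is the frozen federated feature extractor and only the top layer (`w2`) is fine-tuned; all names, sizes, and the sine target are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical global model: a frozen feature extractor (W1) plus an
# output layer (w2) that each client fine-tunes on private data.
W1 = rng.normal(scale=0.5, size=(5, 8))    # shared base, frozen after federation
w2_global = rng.normal(scale=0.1, size=8)  # top layer, to be personalized

def features(X):
    return np.tanh(X @ W1)                 # frozen base layers

def fine_tune_top(w2, X, y, lr=0.05, steps=200):
    """Gradient descent on the top layer only; W1 stays fixed."""
    H = features(X)
    for _ in range(steps):
        w2 = w2 - lr * 2 * H.T @ (H @ w2 - y) / len(y)
    return w2

# One client's private data (illustrative).
X_local = rng.normal(size=(30, 5))
y_local = np.sin(X_local[:, 0])            # some client-specific target
w2_local = fine_tune_top(w2_global.copy(), X_local, y_local)
```

Freezing the base layers keeps the cost of personalization low and avoids overfitting the shared representation to a small local dataset.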

C. Multi-task Learning

Multiple tasks are trained at the same time; the federated aspect is that parameters are shared across similar tasks.

The drawback is that every node must participate in every aggregation round.
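One simple way to realize "sharing across similar tasks" is a pairwise proximity penalty between per-client models; this is a toy variant for illustration (not MOCHA or any specific MTL method), and the task data, `lam`, and learning rate are all made up.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical multi-task setup: each client k has its own related task and
# its own model w_k; a pairwise penalty lam * sum ||w_k - w_j||^2 shares
# statistical strength across tasks.
d, lam, lr = 4, 0.1, 0.01
tasks = []
for _ in range(3):
    X = rng.normal(size=(25, d))
    w_true = np.array([1.0, 0.5, 0.0, 0.0]) + 0.2 * rng.normal(size=d)
    tasks.append((X, X @ w_true + rng.normal(scale=0.1, size=25)))

ws = [np.zeros(d) for _ in tasks]
for _ in range(300):                                  # every node, every round
    for k, (X, y) in enumerate(tasks):
        grad = 2 * X.T @ (X @ ws[k] - y) / len(y)     # task loss gradient
        grad += 2 * lam * sum(ws[k] - ws[j] for j in range(len(ws)) if j != k)
        ws[k] = ws[k] - lr * grad
```

Note that every `w_k` is updated in every round, which is exactly the participation requirement mentioned above.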

D. Meta-Learning

Meta-learning derives a highly adaptable base model from many learning tasks; this base model is then adapted to each specific task.

This process closely resembles personalization in federated learning, so meta-learning methods can be carried over to the federated setting.
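The "adaptable base model" idea can be sketched with a Reptile-style meta-update on toy 1-D tasks (here, lines with a shared slope but per-task offsets); the task family, step sizes, and iteration counts are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical task family: y = x + offset, with a random offset per task.
def sample_task():
    offset = rng.uniform(-1, 1)
    X = rng.uniform(-2, 2, size=(16, 1))
    return X, X[:, 0] + offset

def adapt(w, b, X, y, lr=0.05, steps=10):
    """Inner loop: a few gradient steps on one task's data."""
    x = X[:, 0]
    for _ in range(steps):
        err = w * x + b - y
        w, b = w - lr * 2 * np.mean(err * x), b - lr * 2 * np.mean(err)
    return w, b

# Outer loop: nudge the meta-model toward each task's adapted weights
# (the Reptile meta-update), yielding an initialization that adapts fast.
w, b, eps = 0.0, 0.0, 0.1
for _ in range(300):
    X, y = sample_task()
    w_a, b_a = adapt(w, b, X, y)
    w, b = w + eps * (w_a - w), b + eps * (b_a - b)
```

In the federated analogy, each sampled task plays the role of a client, and the meta-model plays the role of the global model that clients then personalize.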

F. Base + Personalization Layers

The base layers are trained federatively, while the personalization layers are trained locally.

The difference from transfer learning: there, the personalized layers are obtained by fine-tuning the federated top layers (federation + local fine-tuning); here, they are trained locally from the start (purely local training).

The latter removes the influence of other nodes' data on the personalization layers.
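A minimal sketch of this split, assuming a toy two-layer model: the base weights `W` are sent to the server and averaged, while each client's head `v_k` never leaves the client. All sizes, targets, and learning rates are illustrative, not taken from any particular paper.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical model: federated base layer W, strictly local heads v_k.
d, h, n_clients = 5, 6, 3
W = rng.normal(scale=0.3, size=(d, h))                           # federated base
vs = [rng.normal(scale=0.1, size=h) for _ in range(n_clients)]   # local heads

data = []
for _ in range(n_clients):
    X = rng.normal(size=(40, d))
    data.append((X, np.tanh(X[:, 0])))                # client-specific target

for _ in range(50):                                   # federated rounds
    new_Ws = []
    for k, (X, y) in enumerate(data):
        Wk = W.copy()
        for _ in range(5):                            # local training steps
            H = np.tanh(X @ Wk)
            err = H @ vs[k] - y
            grad_v = 2 * H.T @ err / len(y)
            grad_H = 2 * np.outer(err, vs[k]) / len(y)
            vs[k] = vs[k] - 0.05 * grad_v             # head is never uploaded
            Wk = Wk - 0.05 * X.T @ (grad_H * (1 - H**2))
        new_Ws.append(Wk)
    W = np.mean(new_Ws, axis=0)                       # server averages base only
```

Because only `W` is aggregated, each head `v_k` is shaped solely by its own client's data, which is the point made above.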

G. Mixture of Global and Local Models

Each node trains a local model and a global model simultaneously, and a personalized model is obtained from the two.

Related paper: Federated Learning of a Mixture of Global and Local Models

It proposes a new way of computing gradient updates: during model aggregation, not all parameters are aggregated.
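A sketch in the spirit of that formulation, under my own simplifying assumptions: each client keeps a local model `x_i`, the "global" model is simply their average, and a penalty `lam/2 * ||x_i - x_bar||^2` pulls the local models together. `lam = 0` gives purely local models, large `lam` recovers a single global model, and intermediate values yield the mixture. All problem sizes and constants here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical clients: each has its own linear-regression data.
d, lam, lr = 4, 1.0, 0.02
parts = []
for _ in range(3):
    A = rng.normal(size=(30, d))
    w_i = rng.normal(size=d)
    parts.append((A, A @ w_i + rng.normal(scale=0.1, size=30)))

xs = [np.zeros(d) for _ in parts]
for _ in range(400):
    x_bar = np.mean(xs, axis=0)                      # implicit global model
    for i, (A, y) in enumerate(parts):
        grad_f = 2 * A.T @ (A @ xs[i] - y) / len(y)  # local loss gradient
        # Gradient step blends the local loss with a pull toward the average;
        # only this averaged direction plays the role of "aggregation".
        xs[i] = xs[i] - lr * (grad_f + lam * (xs[i] - x_bar))
```

Each final `x_i` is the client's personalized model: a compromise between fitting its own data and agreeing with the other clients.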