Federated Learning
▶ Introduction
 

Federated learning (FL) involves training statistical models across remote devices or siloed data centers, such as mobile phones or hospitals, while keeping the data localized: on-device learning agents collaboratively train a global model without ever sharing their local datasets. However, training over heterogeneous and potentially massive networks introduces novel challenges that require a fundamental departure from standard approaches to large-scale machine learning, distributed optimization, and privacy-preserving data analysis.

Figure 1. Schematic of the FL process [4]



In a typical federated learning setting, the following four steps are performed iteratively:

  • Step 1. Local training: Each user equipment (UE) trains a local model using its available local data samples.
  • Step 2. Upload local model: Each UE transmits its local model parameters (or gradient updates) to the server.
  • Step 3. Update global model: The server constructs a global model from the parameters of the local models. This can be a simple (e.g., weighted) average of the local model parameters, or a different aggregation method.
  • Step 4. Broadcast global model: The updated global model is broadcast back to all UEs for the next round of local training.
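The four steps above can be sketched as a minimal federated-averaging round. This is an illustrative toy, not any specific system discussed here: the linear-regression model, the synthetic per-UE datasets, and the function names `local_training` and `aggregate` are all assumptions made for the sketch.

```python
import numpy as np

def local_training(w_global, X, y, lr=0.1, epochs=5):
    """Step 1: a UE refines the broadcast global weights on its local data."""
    w = w_global.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # least-squares gradient
        w -= lr * grad
    return w

def aggregate(local_weights, sample_counts):
    """Step 3: the server averages local models, weighted by dataset size."""
    total = sum(sample_counts)
    return sum((n / total) * w for w, n in zip(local_weights, sample_counts))

rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0])
w_global = np.zeros(2)

# Two UEs with private datasets that never leave the device.
datasets = []
for n in (40, 60):
    X = rng.normal(size=(n, 2))
    y = X @ w_true + 0.01 * rng.normal(size=n)
    datasets.append((X, y))

for _ in range(20):  # iterative FL rounds
    # Steps 1-2: each UE trains locally and uploads its model.
    local_models = [local_training(w_global, X, y) for X, y in datasets]
    # Steps 3-4: the server aggregates and (implicitly) broadcasts the result.
    w_global = aggregate(local_models, [len(y) for _, y in datasets])
```

After a few rounds `w_global` approaches the weights that generated the data, even though the server only ever sees model parameters, never raw samples.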


Figure 2. Illustration of the iterative training process in FL [5]

▶ Research Challenges

     
  • Federated learning in edge networks and smart cities
  • Resource allocation and learning convergence analysis in federated learning
  • Crowdsourcing and game-theoretic formulations for resource allocation in federated learning
  • Clustering and hierarchical federated learning
  • Blockchain-assisted federated learning
  • Model compression in federated learning
  • Model generation for federated learning


▶ References

     
    1. J. Konečný, H. B. McMahan, D. Ramage, and P. Richtárik, "Federated optimization: Distributed machine learning for on-device intelligence," arXiv preprint arXiv:1610.02527, 2016.
    2. P. Kairouz et al., "Advances and open problems in federated learning," arXiv preprint arXiv:1912.04977, 2019.
    3. T. Li, A. K. Sahu, A. Talwalkar, and V. Smith, "Federated learning: Challenges, methods, and future directions," IEEE Signal Processing Magazine, vol. 37, no. 3, pp. 50-60, 2020.
    4. W. Y. B. Lim, N. C. Luong, D. T. Hoang, Y. Jiao, Y.-C. Liang, Q. Yang, D. Niyato, and C. Miao, "Federated learning in mobile edge networks: A comprehensive survey," IEEE Communications Surveys & Tutorials, 2020.
    5. S. R. Pandey, N. H. Tran, M. Bennis, Y. K. Tun, A. Manzoor, and C. S. Hong, "A crowdsourcing framework for on-device federated learning," IEEE Transactions on Wireless Communications, DOI: 10.1109/TWC.2020.2971981.


▶ Achievements

     
    1. Shashi Raj Pandey, Nguyen H. Tran, Mehdi Bennis, Yan Kyaw Tun, Aunas Manzoor and Choong Seon Hong, "A Crowdsourcing Framework for On-Device Federated Learning," IEEE Transactions on Wireless Communications, DOI:10.1109/TWC.2020.2971981
    2. Nguyen H. Tran, Wei Bao, Albert Zomaya, Minh N. H. Nguyen and Choong Seon Hong, “Federated Learning over Wireless Networks: Optimization Model Design and Analysis,” IEEE International Conference on Computer Communications (INFOCOM 2019), April 29 - May 2, 2019, Paris, France.
    3. Minh N. H. Nguyen, Nguyen H. Tran, Yan Kyaw Tun, Zhu Han, Choong Seon Hong, “Toward Multiple Federated Learning Services Resource Sharing in Mobile Edge Networks” Submitted to IEEE Transactions on Mobile Computing (Major Revision)
    4. Canh Dinh, Nguyen H. Tran, Minh N. H. Nguyen, Choong Seon Hong, Wei Bao, Albert Zomaya, Vincent Gramoli, “Federated Learning over Wireless Networks: Convergence Analysis and Resource Allocation,” arXiv:1910.13067, 2019. Submitted to IEEE/ACM Transactions on Networking (Major Revision)
    5. Huong Tra Le, Nguyen H. Tran, Yan Kyaw Tun, Zhu Han, Choong Seon Hong, "Auction Based Incentive Design for Efficient Federated Learning in Cellular Wireless Networks," IEEE Wireless Communications and Networking Conference (WCNC 2020), May 25-28, 2020
    6. Latif Ullah Khan, Madyan Alsenwi, Zhu Han, Choong Seon Hong, "Self-Organizing Federated Learning over Wireless Networks: A Socially Aware Clustering Approach," The International Conference on Information Networking (ICOIN 2020), January 7-10, 2020, Barcelona, Spain
    7. Minh N. H. Nguyen, Huy Q. Le, Choong Seon Hong, "Decentralized Operation of Multiple Federated Learning Services in Multi-access Edge Computing," Korea Computer Congress (KCC 2020), July 2-4, 2020. (Best Paper Award)
    8. Minh N. H. Nguyen and Choong Seon Hong, "A Multiple Federated Learning Services Orchestrator in Edge Computing," Korea Software Congress (KSC 2019), December 18-20, 2019. (Best Paper Award)
    9. Umer Majeed, Choong Seon Hong, "Blockchain-assisted Ensemble Federated Learning for Automatic Modulation Classification in Wireless Networks," Korea Computer Congress (KCC 2020), July 2-4, 2020.
    10. 김유준, Choong Seon Hong, "Learning Quantization Parameters in the Federated Learning Process," Korea Computer Congress (KCC 2020), July 2-4, 2020.