Short Course・Beginner to Intermediate・2 Hours 58 Minutes

Federated Learning

Instructors: Daniel J. Beutel, Nicholas D. Lane

Flower Labs
  • Beginner to Intermediate
  • 2 Hours 58 Minutes
  • 13 Video Lessons
  • 8 Code Examples
  • Instructors: Daniel J. Beutel, Nicholas D. Lane
    • Flower Labs

What you'll learn

  • Explore the components of federated learning systems and learn to customize, tune, and orchestrate them for better model training.

  • Leverage federated learning to enhance LLMs by effectively managing key privacy and efficiency challenges.

  • Learn how techniques like parameter-efficient fine-tuning and differential privacy are crucial for making federated learning secure and efficient.

About this course

Join Federated Learning! In this two-part course series, you will use Flower, a popular open-source framework, to build a federated learning system in part one, and learn about federated fine-tuning of LLMs with private data in part two.

Federated learning allows models to be trained across multiple devices or organizations without sharing data, improving privacy and security. Federated learning also has many practical uses, such as training next-word prediction models on mobile keyboards without transmitting sensitive keystrokes to a central server.

First, you’ll learn about the federated training process, how to tune and customize it, how to increase data privacy, and how to manage bandwidth usage in federated learning.

Then, you’ll learn to apply federated learning to LLMs. You’ll explore challenges like data memorization and the computational resources required by LLMs, and learn techniques for efficiency and privacy enhancement, such as Parameter-Efficient Fine-Tuning (PEFT) and Differential Privacy (DP).

This two-part course series is self-contained. If you already know what federated learning is, you can start directly with part two of the course.

In detail, here’s what you’ll do in part one: 

  • Learn how federated learning is used to train a variety of models, ranging from those for processing speech and vision all the way to large language models, across distributed data while offering key data privacy options to users and organizations.
  • Learn how to train AI on distributed data by building, customizing, and tuning a federated learning project using Flower and PyTorch (a minimal client sketch follows this list).
  • Gain intuition on how to think about Privacy Enhancing Technologies (PETs) in the context of federated learning, and work through an example using Differential Privacy, which protects individual data points from being traced back to their source.
  • Learn about two types of differential privacy – central and local – along with the dual approach of clipping and noising to protect private data (sketched after this list).
  • Explore the bandwidth requirements of federated learning and how to reduce them by shrinking the update size and lowering the communication frequency (a back-of-the-envelope estimate follows this list).
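
The course builds this kind of project hands-on. As a rough illustration of the moving parts, here is a minimal sketch of a Flower client wrapping a PyTorch model; the model, data loaders, and training loop are placeholders rather than the course's actual code.

```python
# Minimal sketch of a Flower client wrapping a PyTorch model (illustrative only;
# the model, data, and training loop here are placeholders, not the course code).
from collections import OrderedDict

import flwr as fl
import torch
import torch.nn as nn


class FlowerClient(fl.client.NumPyClient):
    def __init__(self, model: nn.Module, train_loader, test_loader):
        self.model = model
        self.train_loader = train_loader
        self.test_loader = test_loader

    def get_parameters(self, config):
        # Send the current model weights to the server as NumPy arrays.
        return [p.detach().cpu().numpy() for p in self.model.state_dict().values()]

    def set_parameters(self, parameters):
        # Load the global weights received from the server into the local model.
        keys = self.model.state_dict().keys()
        state_dict = OrderedDict({k: torch.tensor(v) for k, v in zip(keys, parameters)})
        self.model.load_state_dict(state_dict)

    def fit(self, parameters, config):
        # One round of local training on private data; only weights leave the device.
        self.set_parameters(parameters)
        optimizer = torch.optim.SGD(self.model.parameters(), lr=0.01)
        loss_fn = nn.CrossEntropyLoss()
        for x, y in self.train_loader:
            optimizer.zero_grad()
            loss_fn(self.model(x), y).backward()
            optimizer.step()
        return self.get_parameters(config), len(self.train_loader.dataset), {}

    def evaluate(self, parameters, config):
        # Evaluate the received global model on local test data.
        self.set_parameters(parameters)
        loss_fn = nn.CrossEntropyLoss()
        loss, correct, total = 0.0, 0, 0
        with torch.no_grad():
            for x, y in self.test_loader:
                out = self.model(x)
                loss += loss_fn(out, y).item()
                correct += (out.argmax(dim=1) == y).sum().item()
                total += len(y)
        return loss / max(total, 1), total, {"accuracy": correct / max(total, 1)}
```

On the server side, a strategy such as Flower's FedAvg aggregates the weights returned by clients at the end of each round.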
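
The clipping-and-noising recipe mentioned above can be summarized in a few lines. The sketch below is a generic illustration using NumPy, with made-up values for the clip norm and noise scale; it is not the exact mechanism used in the course.

```python
# Illustrative central-DP-style update protection: clip each client's update,
# then add Gaussian noise to the aggregate. Constants here are arbitrary examples.
import numpy as np

CLIP_NORM = 1.0         # maximum L2 norm allowed per client update (example value)
NOISE_MULTIPLIER = 0.5  # noise scale relative to the clip norm (example value)


def clip_update(update: np.ndarray, clip_norm: float = CLIP_NORM) -> np.ndarray:
    """Scale the update down so its L2 norm is at most clip_norm."""
    norm = np.linalg.norm(update)
    return update * min(1.0, clip_norm / (norm + 1e-12))


def aggregate_with_central_dp(client_updates: list[np.ndarray]) -> np.ndarray:
    """Average clipped updates and add noise once at the server (central DP)."""
    clipped = [clip_update(u) for u in client_updates]
    mean_update = np.mean(clipped, axis=0)
    noise = np.random.normal(
        0.0, NOISE_MULTIPLIER * CLIP_NORM / len(client_updates), size=mean_update.shape
    )
    return mean_update + noise


# Local DP differs mainly in where the noise is added: each client would noise its
# own clipped update before sending it, instead of trusting the server to do so.
```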
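
To build intuition for the bandwidth discussion, the back-of-the-envelope estimate below shows how much data one round of federated training can move and how smaller updates shrink it. All numbers are illustrative assumptions, not figures from the course.

```python
# Back-of-the-envelope bandwidth estimate for federated training.
# All numbers below are illustrative assumptions, not course figures.

def round_traffic_mb(num_params: int, bytes_per_param: int, num_clients: int) -> float:
    """Traffic for one round: the server sends the model down and each client sends an update back."""
    update_mb = num_params * bytes_per_param / 1e6
    return 2 * update_mb * num_clients  # download + upload per client

model_params = 10_000_000  # e.g. a ~10M-parameter model
full_precision = round_traffic_mb(model_params, 4, num_clients=100)  # float32 updates
quantized = round_traffic_mb(model_params, 1, num_clients=100)       # 8-bit quantized updates

print(f"float32 updates : {full_precision:,.0f} MB per round")
print(f"8-bit updates   : {quantized:,.0f} MB per round")
# Doing more local training per round reduces how many rounds (and therefore how
# much total traffic) are needed to reach a target accuracy.
```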

In the second part, you’ll learn how to train powerful models with your own data in a federated way, a process called federated LLM fine-tuning:

  • Understand the importance of safely training LLMs using private data.
  • Learn about the limitations of current training data and how Federated LLM Fine-tuning can help overcome these challenges.
  • Build an LLM that is fine-tuned with private medical data to answer complex questions, where you’ll see the benefits of federated methods when using private data.
  • Learn how federated LLM fine-tuning works and how it simplifies access to private data, reduces bandwidth with Parameter-Efficient Fine-Tuning (PEFT), and strengthens the privacy of training data with Differential Privacy (a LoRA sketch follows this list).
  • Understand how LLMs can leak training data and how federated LLM fine-tuning can lower this risk.
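
As a rough idea of why PEFT cuts communication so sharply, the sketch below attaches LoRA adapters to a Hugging Face model so that only a small fraction of parameters is trained, and only those adapter weights would need to be exchanged in a federated round. The model name and LoRA settings are example choices, not the course's configuration.

```python
# Illustrative LoRA setup with Hugging Face PEFT: only the small adapter weights
# are trainable, so a federated round only needs to exchange those.
# Model name and hyperparameters are example choices, not the course configuration.
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

base_model = AutoModelForCausalLM.from_pretrained("gpt2")  # example base model

lora_config = LoraConfig(
    r=8,                        # rank of the low-rank adapter matrices
    lora_alpha=16,              # adapter scaling factor
    lora_dropout=0.05,
    target_modules=["c_attn"],  # attention projection module in GPT-2
    task_type="CAUSAL_LM",
)

model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()  # shows trainable params as a small fraction of the total
```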

Who should join?

Anyone who has a basic background in Python and machine learning, has an understanding of LLMs, and wants to learn how to build models, including large language models, on private distributed data using the Flower framework.

Course Outline

13 Lessons・8 Code Examples
  • Introduction

    Video・4 mins

  • Why Federated Learning

    Video with code examples・18 mins

  • Federated Training Process

    Video with code examples・16 mins

  • Tuning

    Video with code examples・10 mins

  • Data Privacy

    Video with code examples・9 mins

  • Bandwidth

    Video with code examples・9 mins

  • Conclusion

    Video・1 min

Instructors

Daniel J. Beutel

Co-Founder & CEO of Flower Labs

Nicholas Lane

Co-founder and Chief Scientific Officer of Flower Labs

Course access is free for a limited time during the DeepLearning.AI learning platform beta!
