Short Course

Pretraining LLMs

Instructors: Sung Kim, Lucy Park

Upstage
  • Beginner
  • 1 Hour 21 Minutes
  • 8 Video Lessons
  • 6 Code Examples

What you'll learn

  • Gain in-depth knowledge of the full process of pretraining an LLM, from data preparation to model configuration and performance assessment.

  • Explore various options for configuring your model’s architecture, including modifying Meta’s Llama models to create larger or smaller versions and initializing weights either randomly or from other models.

  • Learn innovative pretraining techniques like Depth Upscaling, which can reduce training costs by up to 70% (a sketch of the idea follows this list).
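
The Depth Upscaling idea can be sketched in a few lines: duplicate the layer stack of a pretrained model, drop the overlapping middle layers, and continue pretraining the deeper model instead of training it from scratch. The sketch below is a minimal illustration assuming a Llama-style model from Hugging Face Transformers; the checkpoint name, layer counts, and overlap size are illustrative stand-ins, not the course's exact recipe.

```python
# Depth upscaling sketch: build a deeper model by stacking two offset copies
# of a pretrained model's layers. All names and sizes here are illustrative.
import copy
from transformers import LlamaForCausalLM

base = LlamaForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")
n = base.config.num_hidden_layers   # e.g., 32
m = 8                               # overlap dropped from each copy

# Target config: two copies of the base stack minus the overlap, 2*(n-m) layers.
config = copy.deepcopy(base.config)
config.num_hidden_layers = 2 * (n - m)
model = LlamaForCausalLM(config)

# Reuse the base model's embeddings, final norm, and output head.
model.model.embed_tokens.load_state_dict(base.model.embed_tokens.state_dict())
model.model.norm.load_state_dict(base.model.norm.state_dict())
model.lm_head.load_state_dict(base.lm_head.state_dict())

# Fill the new stack with the first n-m layers, then the last n-m layers,
# of the base model, so every layer starts from trained weights.
donor = list(base.model.layers[: n - m]) + list(base.model.layers[m:])
for dst, src in zip(model.model.layers, donor):
    dst.load_state_dict(src.state_dict())
```

Because every layer starts from trained weights, the upscaled model typically needs far less continued pretraining than a same-size model initialized randomly, which is the intuition behind the cost savings.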

About this course

In Pretraining LLMs, you’ll explore pretraining, the first step of training large language models. You’ll learn the essential steps to pretrain an LLM, understand the associated costs, and discover how starting with smaller, existing open-source models can be more cost-effective.

Pretraining teaches an LLM to predict the next token over vast text datasets, and it produces a base model that requires further fine-tuning for optimal performance and safety. In this course, you’ll learn to pretrain a model from scratch and also to take a model that’s already been pretrained and continue pretraining it on your own data.
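
A minimal sketch may help make the next-token objective concrete. The snippet below is illustrative rather than the course’s notebook: it assumes the Hugging Face Transformers library, and the gpt2 checkpoint and prompt are stand-ins.

```python
# Next-token prediction: the causal language modeling objective behind
# pretraining. The model and prompt here are illustrative stand-ins.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("Pretraining teaches a model to predict the next", return_tensors="pt")

# Passing labels=input_ids makes the model compute the cross-entropy loss of
# predicting each token from the tokens before it; pretraining minimizes this.
loss = model(**inputs, labels=inputs["input_ids"]).loss
print(float(loss))

# A base model simply continues the text; it isn't yet tuned to follow instructions.
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=8)[0]))
```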

In detail: 

  • Explore scenarios where pretraining is the optimal choice for model performance. Compare text generation across different versions of the same model to understand the performance differences between base, fine-tuned, and specialized pretrained models.
  • Learn how to create a high-quality training dataset using web text and existing datasets, which is crucial for effective model pretraining.
  • Prepare your cleaned dataset for training and learn how to package it for use with the Hugging Face library (see the packing sketch after this list).
  • Explore ways to configure and initialize a model for training and see how these choices impact the speed of pretraining.
  • Learn how to configure and execute a training run, enabling you to train your own model (a training sketch appears below).
  • Learn how to assess your trained model’s performance and explore common evaluation strategies for LLMs, including important benchmark tasks used to compare different models’ performance.
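
To make the packaging step concrete, here is a minimal sketch using the Hugging Face datasets library. The corpus file name, tokenizer, and sequence length are assumptions for illustration, not the course’s exact setup.

```python
# Pack a cleaned text corpus into fixed-length training examples.
from itertools import chain

from datasets import load_dataset
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
raw = load_dataset("text", data_files={"train": "cleaned_corpus.txt"})["train"]

tokenized = raw.map(
    lambda batch: tokenizer(batch["text"]),
    batched=True,
    remove_columns=["text"],
)

seq_len = 512

def pack(batch):
    # Concatenate every document's token ids, then slice them into
    # equal-length blocks so no compute is wasted on padding.
    ids = list(chain.from_iterable(batch["input_ids"]))
    total = (len(ids) // seq_len) * seq_len
    chunks = [ids[i : i + seq_len] for i in range(0, total, seq_len)]
    return {"input_ids": chunks, "labels": [c.copy() for c in chunks]}

packed = tokenized.map(pack, batched=True, remove_columns=tokenized.column_names)
```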

After taking this course, you’ll be equipped with the skills to pretrain a model—from data preparation and model configuration to performance evaluation.
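
As a rough picture of what a training run can look like, here is a minimal sketch using the Hugging Face Trainer, continuing from the packed dataset in the previous sketch; every hyperparameter shown is an illustrative placeholder.

```python
# A short pretraining run over the packed dataset from the previous sketch.
from transformers import AutoModelForCausalLM, Trainer, TrainingArguments

model = AutoModelForCausalLM.from_pretrained("gpt2")

args = TrainingArguments(
    output_dir="pretraining-run",
    per_device_train_batch_size=8,
    gradient_accumulation_steps=4,
    learning_rate=5e-4,
    warmup_steps=100,
    max_steps=1000,
    logging_steps=50,
    save_steps=500,
)

# The packed examples already carry labels, so the default collator suffices.
trainer = Trainer(model=model, args=args, train_dataset=packed)
trainer.train()
```

After training, perplexity on held-out text and public benchmark suites (for example, those run by EleutherAI’s lm-evaluation-harness) are common ways to compare models, as the Evaluation lesson discusses.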

Who should join?

This course is ideal for AI enthusiasts, data scientists, and machine learning engineers who want to learn the complete process of pretraining LLMs. Basic knowledge of Python and large language models is recommended.

Course Outline

8 Lessons・6 Code Examples
  • Introduction

    Video・6 mins

  • Why Pre-training

    Video with code examples・12 mins

  • Data Preparation

    Video with code examples・22 mins

  • Packaging Data for Pretraining

    Video with code examples・8 mins

  • Model Initialization

    Video with code examples・12 mins

  • Training in Action

    Video with code examples・9 mins

  • Evaluation

    Video with code examples・7 mins

  • Conclusion

    Video・1 min

Instructors

Sung Kim

CEO of Upstage

Lucy Park

Chief Scientific Officer of Upstage

Course access is free for a limited time during the DeepLearning.AI learning platform beta!
