Large Language Models: Foundation Models from the Ground Up (edX)

Prerequisites
Intermediate-level experience with Python and an understanding of deep learning concepts. Completing the LLM: Application through Production course is highly recommended, but not strictly required, before taking this course.

MOOC List is learner-supported. When you buy through links on our site, we may earn an affiliate commission.

This course dives into the details of foundation models in natural language processing (NLP). You will learn the innovations that led to the proliferation of transformer-based models, including encoder models such as BERT, decoder models such as GPT, and encoder-decoder models such as T5, as well as the key breakthroughs behind applications such as ChatGPT. You will also learn how transfer learning techniques such as few-shot learning and knowledge distillation are used to improve large language models (LLMs). The course concludes with an overview of new LLM developments, such as multi-modal models and LLM decision making, looking toward the future in this ever-changing, fast-paced landscape.

This course is part of the Large Language Models Professional Certificate.


What you'll learn

- Understand the theory behind foundation models, including attention, decoders, and encoders, and how these innovations led to GPT-4.

- Learn how to leverage transfer learning techniques such as one-shot and few-shot learning, as well as knowledge distillation, to reduce the size of LLMs while retaining performance.

- Gain insight into the direction this domain is headed, including new applications and topics of current LLM research and development.


Syllabus


Module 1 - Transformer Architecture: Attention & Transformer Fundamentals

Module 2 - Inside the Transformer I: Encoder Models

Module 3 - Inside the Transformer II: Decoder Models

Module 4 - Transfer Learning & Knowledge Distillation

Module 5 - Future Directions of LLMs



Course Auditing
91.00 EUR
