Generative AI Language Modeling with Transformers (Coursera)

Basic knowledge of Python and PyTorch. You should also be familiar with machine learning and neural network concepts.

MOOC List is learner-supported. When you buy through links on our site, we may earn an affiliate commission.

This course provides you with an overview of how to use transformer-based models for natural language processing (NLP). In this course, you will learn to apply transformer-based models for text classification, focusing on the encoder component. You’ll learn about positional encoding, word embedding, and attention mechanisms in language transformers and their role in capturing contextual information and dependencies.
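The attention mechanism named above can be summarized in one formula, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. Below is a minimal PyTorch sketch of it (not the course's own code; the function name and toy dimensions are illustrative):

```python
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d_k ** 0.5  # similarity of each query to each key
    weights = F.softmax(scores, dim=-1)            # each row is a distribution over keys
    return weights @ v, weights

# Toy self-attention: 4 tokens with 8-dimensional embeddings attend to each other.
x = torch.randn(4, 8)
out, attn = scaled_dot_product_attention(x, x, x)  # Q = K = V = x
```

Each output row is a weighted mixture of all token embeddings, which is how attention captures contextual dependencies across a sequence.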

MOOC List is learner-supported. When you buy through links on our site, we may earn an affiliate commission.

Additionally, you will be introduced to multi-head attention and gain insight into decoder-based language modeling with generative pre-trained transformers (GPT) for language translation, including training the models and implementing them in PyTorch.
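The defining trick in decoder-based (GPT-style) modeling is the causal mask, which blocks each position from attending to future tokens. A minimal sketch using PyTorch's built-in layers (dimensions are illustrative, not the course's exact code):

```python
import torch
import torch.nn as nn

seq_len, d_model = 5, 16

# Upper-triangular boolean mask: True = "may not attend here" (future positions).
causal_mask = torch.triu(torch.ones(seq_len, seq_len, dtype=torch.bool), diagonal=1)

# A single self-attention layer with the causal mask behaves like a decoder block.
layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4, batch_first=True)
x = torch.randn(1, seq_len, d_model)   # (batch, seq, d_model) token embeddings
y = layer(x, src_mask=causal_mask)     # each position sees only itself and the past
```

During generation, the model predicts the next token from the last position's output, appends it, and repeats.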

Further, you’ll explore encoder-based models with bidirectional encoder representations from transformers (BERT) and train them using masked language modeling (MLM) and next sentence prediction (NSP).
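MLM training hides a fraction of the input tokens and asks the model to recover them. A common recipe (used by the original BERT paper) selects ~15% of tokens; of those, 80% become a `[MASK]` id, 10% a random token, and 10% stay unchanged. A sketch of that data preparation, assuming illustrative ids (not the course's helper functions):

```python
import torch

def mask_tokens(input_ids, mask_token_id, vocab_size, mlm_prob=0.15):
    """BERT-style MLM masking. Labels are -100 (ignored by the loss) except at masked spots."""
    labels = input_ids.clone()
    masked = torch.bernoulli(torch.full(input_ids.shape, mlm_prob)).bool()
    labels[~masked] = -100                      # loss is computed only on masked tokens

    input_ids = input_ids.clone()
    # 80% of selected tokens -> [MASK]
    replace = torch.bernoulli(torch.full(input_ids.shape, 0.8)).bool() & masked
    input_ids[replace] = mask_token_id
    # Half of the remainder (10% overall) -> random token; the rest stay unchanged.
    random = torch.bernoulli(torch.full(input_ids.shape, 0.5)).bool() & masked & ~replace
    input_ids[random] = torch.randint(vocab_size, input_ids.shape)[random]
    return input_ids, labels

ids = torch.randint(5, 100, (2, 12))            # toy batch of token ids
masked_ids, labels = mask_tokens(ids, mask_token_id=4, vocab_size=100)
```

Leaving some selected tokens unchanged keeps the model from relying on the `[MASK]` symbol ever appearing at inference time.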

Finally, you will apply transformers for translation by gaining insight into the transformer architecture and performing its PyTorch implementation.
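For translation, PyTorch ships the full encoder-decoder architecture as `nn.Transformer`. A minimal forward pass with teacher forcing, assuming toy vocabulary sizes and dimensions (illustrative only):

```python
import torch
import torch.nn as nn

d_model, src_vocab, tgt_vocab = 32, 100, 120
src_emb = nn.Embedding(src_vocab, d_model)       # source-language embeddings
tgt_emb = nn.Embedding(tgt_vocab, d_model)       # target-language embeddings
model = nn.Transformer(d_model=d_model, nhead=4,
                       num_encoder_layers=2, num_decoder_layers=2,
                       batch_first=True)
generator = nn.Linear(d_model, tgt_vocab)        # projects decoder states to token logits

src = torch.randint(src_vocab, (1, 7))           # source sentence token ids
tgt = torch.randint(tgt_vocab, (1, 5))           # target ids shifted right (teacher forcing)
tgt_mask = nn.Transformer.generate_square_subsequent_mask(5)  # causal mask for the decoder

out = model(src_emb(src), tgt_emb(tgt), tgt_mask=tgt_mask)
logits = generator(out)                          # one distribution per target position
```

The encoder reads the full source sentence bidirectionally, while the masked decoder predicts the target one token ahead at every position.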

The course offers practical exposure through hands-on activities that enable you to apply your knowledge in real-world scenarios.

This course is part of a specialized program tailored for individuals interested in Generative AI engineering.

This course requires a working knowledge of Python, PyTorch, and machine learning.


What you'll learn

- Explain the concept of attention mechanisms in transformers, including their role in capturing contextual information.

- Describe language modeling with the decoder-based GPT and encoder-based BERT.

- Implement positional encoding, masking, attention mechanisms, and document classification, and create LLMs like GPT and BERT.

- Use transformer-based models and PyTorch functions for text classification, language translation, and modeling.


Syllabus


Fundamental Concepts of Transformer Architecture

In this module, you will learn techniques for positional encoding and how to implement positional encoding in PyTorch. You will learn how the attention mechanism works, how to apply it to word embeddings and sequences, and how self-attention supports simple language modeling for predicting the next token. In addition, you will learn about the scaled dot-product attention mechanism with multiple heads and how the transformer architecture enhances the efficiency of attention mechanisms. You will also learn how to implement a series of encoder layer instances in PyTorch. Finally, you will learn how to use transformer-based models for text classification, including creating the text pipeline, building the model, and training it.
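The pieces this module names, positional encoding, stacked encoder layers, and a classification head, compose into a small text classifier. A sketch with PyTorch's built-in encoder stack (class names and dimensions are illustrative, not the course's own code):

```python
import math
import torch
import torch.nn as nn

class PositionalEncoding(nn.Module):
    """Fixed sinusoidal positional encoding, as in the original Transformer paper."""
    def __init__(self, d_model, max_len=512):
        super().__init__()
        pos = torch.arange(max_len).unsqueeze(1)
        div = torch.exp(torch.arange(0, d_model, 2) * (-math.log(10000.0) / d_model))
        pe = torch.zeros(max_len, d_model)
        pe[:, 0::2] = torch.sin(pos * div)   # even dimensions use sine
        pe[:, 1::2] = torch.cos(pos * div)   # odd dimensions use cosine
        self.register_buffer("pe", pe)

    def forward(self, x):                    # x: (batch, seq, d_model)
        return x + self.pe[: x.size(1)]

class TextClassifier(nn.Module):
    """Embedding -> positional encoding -> stacked encoder layers -> mean pool -> linear head."""
    def __init__(self, vocab_size, d_model=32, nhead=4, num_layers=2, num_classes=2):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, d_model)
        self.pos = PositionalEncoding(d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)  # series of encoder layers
        self.head = nn.Linear(d_model, num_classes)

    def forward(self, ids):
        h = self.encoder(self.pos(self.emb(ids)))
        return self.head(h.mean(dim=1))      # pool over tokens, then classify

model = TextClassifier(vocab_size=100)
logits = model(torch.randint(100, (3, 10)))  # batch of 3 sequences, 10 tokens each
```

Mean pooling is one simple readout; a `[CLS]`-token readout, as in BERT, is a common alternative.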


Advanced Concepts of Transformer Architecture

In this module, you will learn about decoders and GPT-like models for language translation, train the models, and implement them using PyTorch. You will also gain knowledge of encoder models with Bidirectional Encoder Representations from Transformers (BERT) and pretrain them using masked language modeling (MLM) and next sentence prediction (NSP), and you will perform data preparation for BERT using PyTorch. Finally, you will learn about the applications of transformers for translation by understanding the transformer architecture and performing its PyTorch implementation. The hands-on labs in this module will give you practice using the decoder model, encoder model, and transformers for real-world applications.
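Training such models ties the pieces together: the decoder predicts each next target token under a causal mask, and cross-entropy compares its logits to the gold tokens. A minimal single training step, assuming pre-embedded toy tensors (illustrative, not the lab code):

```python
import torch
import torch.nn as nn

d_model, tgt_vocab = 32, 50
model = nn.Transformer(d_model=d_model, nhead=4, num_encoder_layers=1,
                       num_decoder_layers=1, batch_first=True)
proj = nn.Linear(d_model, tgt_vocab)
opt = torch.optim.Adam(list(model.parameters()) + list(proj.parameters()), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

src = torch.randn(2, 6, d_model)                 # pre-embedded source batch
tgt = torch.randn(2, 4, d_model)                 # embedded target, shifted right
gold = torch.randint(tgt_vocab, (2, 4))          # target token ids to predict
tgt_mask = nn.Transformer.generate_square_subsequent_mask(4)

logits = proj(model(src, tgt, tgt_mask=tgt_mask))
loss = loss_fn(logits.reshape(-1, tgt_vocab), gold.reshape(-1))
opt.zero_grad()
loss.backward()
opt.step()                                       # one gradient update
```

A full loop repeats this over batches and epochs, typically with a learning-rate warmup schedule.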




Course Auditing
45.00 EUR
