You will learn about the types of generative AI and their real-world applications. You will gain the knowledge to differentiate between generative AI architectures and models, such as Recurrent Neural Networks (RNNs), Transformers, Generative Adversarial Networks (GANs), Variational Autoencoders (VAEs), and Diffusion Models, and to explain the differences in their training approaches. You will also be able to explain the use of large language models (LLMs), such as Generative Pre-trained Transformer (GPT) and Bidirectional Encoder Representations from Transformers (BERT).
You will also learn about the tokenization process, common tokenization methods, and the use of tokenizers for word-based, character-based, and subword-based tokenization. You will be able to explain how data loaders are used to train generative AI models and list the PyTorch libraries for preparing and handling data within data loaders. This knowledge will help you use the generative AI libraries available in Hugging Face and prepare you to implement tokenization and create an NLP data loader.
For this course, basic knowledge of Python and PyTorch, along with an awareness of machine learning and neural networks, is an advantage, though not strictly required.
What you'll learn
- Differentiate between generative AI architectures and models, such as RNNs, Transformers, VAEs, GANs, and Diffusion Models.
- Describe how LLMs, such as GPT, BERT, BART, and T5, are used in language processing.
- Implement tokenization to preprocess raw textual data using NLP libraries and tokenizers such as NLTK, spaCy, and Hugging Face's BertTokenizer and XLNetTokenizer (a comparison sketch follows this list).
- Create an NLP data loader using PyTorch to perform tokenization, numericalization, and padding of text data.
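To make the tokenization outcome concrete, here is a minimal sketch comparing word-based tokenization (NLTK) with subword-based tokenization (BERT's WordPiece tokenizer). The sample sentence is made up, and the NLTK data downloads reflect an assumption about your environment; this is not course material, just an illustration of the techniques named above.

```python
# Minimal sketch: word-based vs. subword-based tokenization.
import nltk
from transformers import BertTokenizer

# NLTK's word_tokenize needs the "punkt" data (newer NLTK versions use
# "punkt_tab"); downloading both here is an assumption about your setup.
nltk.download("punkt", quiet=True)
nltk.download("punkt_tab", quiet=True)
from nltk.tokenize import word_tokenize

text = "Tokenization handles unfamiliar words like 'unbelievability'."

# Word-based tokenization: each word (and punctuation mark) is one token.
print(word_tokenize(text))

# Subword-based tokenization: rare words are split into smaller pieces,
# marked with "##" continuation prefixes by BERT's WordPiece tokenizer.
bert_tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
print(bert_tokenizer.tokenize(text))
```

Subword tokenization is the usual choice for LLMs because it keeps the vocabulary small while still representing words the tokenizer has never seen.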
Syllabus
Generative AI Architecture
In this module, you will learn about the significance of generative AI models and how they are used to generate various types of content across a wide range of fields. You will explore the architectures and models commonly used in generative AI and the differences in their training approaches, and you will see how large language models (LLMs) are used to build NLP-based applications. Finally, you will build a simple chatbot using the Hugging Face transformers library.
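As a preview of that hands-on exercise, here is a minimal chatbot sketch using the Hugging Face transformers library. The model choice (facebook/blenderbot-400M-distill) and the loop structure are assumptions for illustration, not necessarily what the course uses.

```python
# Minimal sketch of a chatbot loop built on Hugging Face transformers.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumed conversational model; any seq2seq chat model would work similarly.
model_name = "facebook/blenderbot-400M-distill"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

while True:
    user_input = input("You: ")
    if user_input.lower() in {"quit", "exit"}:
        break
    # Encode the user's message, generate a reply, and decode it back to text.
    inputs = tokenizer(user_input, return_tensors="pt")
    reply_ids = model.generate(**inputs, max_new_tokens=60)
    print("Bot:", tokenizer.decode(reply_ids[0], skip_special_tokens=True))
```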
Data Preparation for LLMs
In this module, you will learn to prepare data for training large language models (LLMs) by implementing tokenization. You will learn about tokenization methods and the use of tokenizers, as well as the purpose of data loaders and how to use the DataLoader class in PyTorch. You will implement tokenization using libraries such as NLTK, spaCy, BertTokenizer, and XLNetTokenizer, and you will create a data loader with a collate function that processes batches of text.
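Here is a minimal sketch of what such a data loader can look like: a PyTorch DataLoader whose collate function tokenizes, numericalizes, and pads each batch. Using BertTokenizer for all three steps is one possible approach; the sample sentences are invented for illustration.

```python
# Minimal sketch: an NLP DataLoader with a collate function that
# tokenizes, numericalizes, and pads each batch of raw text.
from torch.utils.data import DataLoader
from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

# Invented sample corpus standing in for a real dataset.
texts = [
    "Generative AI models create new content.",
    "Tokenization splits raw text into tokens.",
    "Data loaders feed batches to the model.",
]

def collate_fn(batch):
    # Tokenize each string, map tokens to IDs (numericalization), and
    # pad every sequence to the longest one in the batch.
    encoded = tokenizer(batch, padding=True, truncation=True, return_tensors="pt")
    return encoded["input_ids"], encoded["attention_mask"]

loader = DataLoader(texts, batch_size=2, shuffle=True, collate_fn=collate_fn)

for input_ids, attention_mask in loader:
    print(input_ids.shape)  # (batch_size, padded_sequence_length)
```

Padding inside the collate function, rather than over the whole dataset, keeps each batch only as long as its longest sequence, which saves memory during training.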