MOOC List is learner-supported. When you buy through links on our site, we may earn an affiliate commission.
You will explore other tools and programming languages for interacting with these LLMs, including running LLMs via Hugging Face Candle and Mozilla llamafile.
What you'll learn
- Local Large Language Models (LLMs)
- Tools for running LLMs locally like Llamafile
Syllabus
Local LLMOps
This week, you will learn mitigation strategies for LLM risks, evaluate task performance, and operationalize workflows by identifying risks in notebooks and deploying an LLM application.
Production Workflows and Performance of LLMs
This week, you will explore different types of generative AI applications, including API-based, embedded-model, and multi-model systems. You'll learn the fundamentals of building robust applications using techniques like Retrieval Augmented Generation (RAG) to improve context. Through hands-on exercises, you'll gain experience evaluating the real-world performance of large language models using Elo ratings coded in Python, Rust, R, and Julia. Then you'll explore production LLM workflows using tools like SkyPilot, LoRAX, and Ludwig for fine-tuning models like Mistral-7B. Finally, you'll gain hands-on experience testing an application locally and deploying it to the cloud.
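The Elo approach mentioned above scores models through head-to-head comparisons: each model holds a rating, and after every pairwise judgment the winner gains points and the loser sheds them, weighted by how surprising the outcome was. A minimal sketch in Python (the model names, starting ratings, and K-factor below are illustrative assumptions, not values from the course):

```python
# Minimal Elo rating sketch for pairwise LLM comparisons.

def expected_score(r_a: float, r_b: float) -> float:
    """Probability that A beats B implied by the current ratings."""
    return 1.0 / (1.0 + 10 ** ((r_b - r_a) / 400))

def update_elo(r_a: float, r_b: float, score_a: float, k: float = 32):
    """Return updated ratings; score_a is 1.0 (A wins), 0.5 (tie), or 0.0."""
    e_a = expected_score(r_a, r_b)
    new_a = r_a + k * (score_a - e_a)
    new_b = r_b + k * ((1.0 - score_a) - (1.0 - e_a))
    return new_a, new_b

# Hypothetical example: two models start at 1000 and "model_a"
# wins one head-to-head comparison.
ratings = {"model_a": 1000.0, "model_b": 1000.0}
ratings["model_a"], ratings["model_b"] = update_elo(
    ratings["model_a"], ratings["model_b"], score_a=1.0
)
print(ratings)  # model_a gains 16 points, model_b loses 16
```

Because an upset win over a higher-rated model moves ratings more than an expected win, repeating this update over many pairwise judgments converges to a leaderboard-style ranking.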
Responsible Generative AI
This week, you will learn the foundations of generative AI and responsible deployment strategies so you can benefit from the latest advancements while maintaining safety, accuracy, and oversight. By directly applying concepts through hands-on labs and peer discussions, you will gain practical experience putting AI into production.