Open Source LLMOps Solutions (Coursera)

Learn the fundamentals of large language models (LLMs) and put them into practice by deploying your own solutions based on open source models. By the end of this course, you will be able to leverage state-of-the-art open source LLMs to create AI applications using a code-first approach.


You will start by gaining an in-depth understanding of how LLMs work, including model architectures like transformers and advancements like sparse expert models. Hands-on labs will walk you through launching cloud GPU instances and running pre-trained models like Code Llama, Mistral, and Stable Diffusion.

The highlight of the course is a guided project where you will fine-tune a model like LLaMA or Mistral on a dataset of your choice. You will use SkyPilot to easily scale model training on low-cost spot instances across cloud providers. Finally, you will containerize your model for efficient deployment using model servers like LoRAX and vLLM.
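As an illustration of the deployment step, here is a minimal sketch of loading a model with vLLM for offline inference; the model ID is just an example and would typically be replaced with your own fine-tuned checkpoint, and it assumes a GPU with enough memory is available.

```python
# Minimal vLLM inference sketch (model ID is a placeholder; swap in
# your own fine-tuned checkpoint or another Hugging Face model ID).
from vllm import LLM, SamplingParams

llm = LLM(model="mistralai/Mistral-7B-Instruct-v0.2")  # assumes a suitable GPU
params = SamplingParams(temperature=0.7, max_tokens=128)

outputs = llm.generate(["Explain what LLMOps means in one paragraph."], params)
for out in outputs:
    print(out.outputs[0].text)
```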

By the end of the course, you will have first-hand experience leveraging open source LLMs to build AI solutions. The skills you gain will enable you to further advance your career in AI.

This course is part of the Large Language Model Operations (LLMOps) Specialization.


What you'll learn

- Run local large language models

- Fine-tune LLMs

- Use open-source generative AI


Syllabus


Getting Started with Open Source Ecosystem

This week, you will learn how to leverage pre-trained natural language processing models to build NLP applications. You will explore popular open source models like BERT, learn how to access them with libraries like Hugging Face Transformers, and use them for tasks such as text classification, question answering, and text generation. A key skill will be using large language models to synthetically augment datasets: by feeding the model examples and extracting the text it generates, you can create more training data. Through hands-on exercises, you will build basic NLP pipelines in Python that use pre-trained models for tasks like sentiment analysis. By the end of the week, you will have practical experience using state-of-the-art NLP techniques to create capable language applications.
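A basic pipeline of this kind can be only a few lines of Python. The sketch below uses the Transformers `pipeline` helper for sentiment analysis; the model ID shown is just an example checkpoint, and any compatible Hugging Face model could be substituted.

```python
# Sketch of a basic NLP pipeline built on a pre-trained model.
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",  # example checkpoint
)

print(classifier("Open source LLM tooling keeps getting better."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```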


Using Local LLMs from Llamafile to Whisper.cpp

This week, you will run language models locally to keep your data private and avoid network latency and API fees. You will work with the Mixtral model and llamafile.
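As a rough sketch of what local inference looks like, the snippet below queries a llamafile running in server mode. It assumes the server exposes the llama.cpp OpenAI-compatible endpoint on its default port 8080; the file name in the comment is hypothetical.

```python
# Sketch: query a local llamafile server (assumes something like
# ./mixtral-8x7b-instruct.llamafile is running in server mode on localhost:8080).
import requests

resp = requests.post(
    "http://localhost:8080/v1/chat/completions",
    json={
        "model": "local",  # many local servers ignore this field
        "messages": [
            {"role": "user", "content": "Summarize why local inference helps privacy."}
        ],
        "max_tokens": 128,
    },
    timeout=120,
)
print(resp.json()["choices"][0]["message"]["content"])
```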


Applied Projects

This week you will use models in the browser with Transformers.js and ONNX. You will gain experience porting models to the ONNX Runtime and running them in the browser. You will also use the Cosmopolitan project to build a phrase generator that runs portably across different systems.
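A common first step toward browser deployment is exporting a model to ONNX. The sketch below uses the Optimum library for that export; the model ID is only an example, and the resulting files could then be served to Transformers.js or onnxruntime-web.

```python
# Sketch: export a small Hugging Face model to ONNX with Optimum
# (example model ID; the output directory name is arbitrary).
from optimum.onnxruntime import ORTModelForSequenceClassification
from transformers import AutoTokenizer

model_id = "distilbert-base-uncased-finetuned-sst-2-english"
model = ORTModelForSequenceClassification.from_pretrained(model_id, export=True)
tokenizer = AutoTokenizer.from_pretrained(model_id)

model.save_pretrained("onnx-model")      # writes model.onnx plus config files
tokenizer.save_pretrained("onnx-model")
```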


Recap and Final Challenges

This week you will focus on completing several external labs and hands-on examples that will help you feel comfortable running local LLMs, connecting to them through APIs with Python, and building solutions with the Rust programming language.
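For the API-connection part, one common pattern is to point the OpenAI Python client at a local server. The snippet below is a sketch under that assumption; the base URL and model name are placeholders to adjust for whichever server you run (llamafile, vLLM, Ollama, etc.).

```python
# Sketch: reuse the OpenAI Python client against a local, OpenAI-compatible server.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed-locally")

reply = client.chat.completions.create(
    model="local-model",  # placeholder; local servers often accept any name
    messages=[{"role": "user", "content": "Give me one project idea using a local LLM."}],
)
print(reply.choices[0].message.content)
```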




Course Auditing
45.00 EUR/month
Prerequisites: Basic programming experience and understanding of Bash and Linux.
