Basics of Machine Learning (edX)

Prerequisites: basic linear algebra (vectors, matrices), basics of stochastics and statistics (mean, variance, random variables, normal distribution), and basic programming skills in Python.

MOOC List is learner-supported. When you buy through links on our site, we may earn an affiliate commission.

"Basics of Machine Learning" introduces participants to the fundamental concepts and tools of machine learning, including probability density estimation, linear regression, classification techniques, ensemble methods, and deep neural networks.


"Basics of Machine Learning" is designed to give participants a comprehensive understanding of the fundamental concepts and tools of machine learning. The course covers probability density estimation; linear regression; classification techniques including linear discriminants, logistic regression, and support vector machines; and ensemble methods such as bagging and boosting. It also introduces the basics of deep neural networks, laying the groundwork for more advanced learning techniques.

Throughout the course, students will gain a solid foundation in the fundamental approaches of machine learning. By working on practical exercises, participants will cement their understanding of the techniques covered and gain valuable hands-on experience.

By the end of the course, students will have the knowledge and skills required to confidently utilize machine learning tools and techniques in their own projects, providing a strong foundation for further study or professional development in this rapidly evolving field.


What you'll learn

- Definition of Statistical Machine Learning

- Probability density estimation

- Definition and behavior of linear discriminant models

- Linear regression

- Logistic regression

- Support Vector Machines

- Ensemble Methods

- Basics of Neural Networks


Syllabus


Week 1: Introduction, Definitions, and Core Principles

In the first week, we will provide an overview of the course and introduce the fundamental concepts of machine learning. Students will learn about the different types of learning, such as supervised, unsupervised, and reinforcement learning, as well as the key steps involved in developing a machine learning model, from data preprocessing to model evaluation and optimization.
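The preprocessing-to-evaluation workflow described above can be sketched end to end. Scikit-learn and the Iris dataset are assumptions made here for illustration; the course does not prescribe a particular library or dataset.

```python
# Minimal supervised-learning workflow: split, preprocess, fit, evaluate.
# scikit-learn and the Iris data are illustrative choices, not course material.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_iris(return_X_y=True)

# 1. Hold out a test set so evaluation uses examples the model never saw.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# 2. Preprocessing (feature scaling) and model fitting chained in a pipeline,
#    so the scaler is fit only on the training data.
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=200))
model.fit(X_train, y_train)

# 3. Evaluation on the held-out test set.
accuracy = model.score(X_test, y_test)
print(f"test accuracy: {accuracy:.2f}")
```

Chaining the scaler and the classifier in one pipeline is what keeps test-set information from leaking into preprocessing, one of the pitfalls this first week's workflow discussion addresses.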


Week 2: Probability Density Estimation

In week two, students will delve into probability density estimation, an essential technique for understanding the underlying structure of data. We will cover various methods, such as parametric and non-parametric approaches, and how they can be used for building machine learning models.
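The parametric/non-parametric contrast can be sketched in a few lines of NumPy; the synthetic Gaussian sample, the kernel bandwidth, and the evaluation grid below are illustrative choices, not course material.

```python
# Parametric vs. non-parametric density estimation on the same 1-D sample.
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.5, size=500)   # synthetic 1-D sample

# Parametric: assume a Gaussian family and fit its two parameters by
# maximum likelihood (sample mean and the biased 1/N standard deviation).
mu_hat = data.mean()
sigma_hat = data.std()

def gaussian_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Non-parametric: kernel density estimation, averaging a Gaussian kernel of
# bandwidth h centered on every sample (h = 0.4 is an illustrative choice).
def kde(x, samples, h=0.4):
    return gaussian_pdf(x[:, None], samples[None, :], h).mean(axis=1)

x = np.linspace(-3.0, 7.0, 200)
p_param = gaussian_pdf(x, mu_hat, sigma_hat)   # smooth, two parameters
p_kde = kde(x, data)                           # flexible, data-driven shape
```

The parametric fit is compact but only as good as the assumed family; the KDE adapts to the data's shape at the cost of choosing a bandwidth.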


Week 3: Linear Discriminants

During the third week, we will focus on linear discriminants and their use in classifying data points. Students will learn about the concept of decision boundaries and how to derive them using linear discriminant functions. We will also discuss how to solve decision problems by minimizing a least-squares objective, the limitations of the resulting linear classifiers, and introduce strategies for handling non-linearly separable data.
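Fitting a linear discriminant by minimizing a least-squares objective can be sketched in NumPy. The ±1 target coding and the bias trick below are one common convention, not necessarily the course's exact formulation, and the Gaussian blobs are synthetic.

```python
# Two-class linear discriminant fit by least squares on synthetic data.
import numpy as np

rng = np.random.default_rng(1)
# Two Gaussian blobs: class -1 around (0, 0), class +1 around (3, 3).
X = np.vstack([rng.normal(0.0, 1.0, (50, 2)),
               rng.normal(3.0, 1.0, (50, 2))])
t = np.concatenate([-np.ones(50), np.ones(50)])

# Bias trick: append a constant 1 feature so the bias becomes part of w.
Phi = np.hstack([X, np.ones((100, 1))])

# Minimize ||Phi w - t||^2; lstsq solves the normal equations stably.
w, *_ = np.linalg.lstsq(Phi, t, rcond=None)

# The decision boundary is {x : w^T [x, 1] = 0}; classify by the sign.
pred = np.sign(Phi @ w)
train_accuracy = (pred == t).mean()
```

This also hints at the limitation discussed in the week: the squared error penalizes points that are "too correct" (far on the right side of the boundary), which is one motivation for the probabilistic classifiers of later weeks.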


Week 4: Linear Regression

In week four, students will be introduced to linear regression, a fundamental technique for modeling continuous data. We will cover the basics of simple and multiple linear regression, discuss the concept of least squares estimation, and explore regularization as a measure against overfitting.
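Least squares and its regularized variant can be written in closed form. The sketch below uses ridge (L2) regularization; the data, the penalty strength `lam`, and the absence of an intercept are illustrative choices.

```python
# Ordinary least squares vs. ridge regression in closed form.
import numpy as np

rng = np.random.default_rng(2)
X = rng.uniform(-1.0, 1.0, (50, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + rng.normal(0.0, 0.1, 50)   # noisy linear data

def ridge_fit(X, y, lam):
    # w = (X^T X + lam I)^(-1) X^T y ; lam = 0 recovers ordinary least squares.
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

w_ols = ridge_fit(X, y, lam=0.0)
w_ridge = ridge_fit(X, y, lam=1.0)

# The L2 penalty shrinks the weight vector toward zero, which is how
# regularization trades a little bias for lower variance (less overfitting).
print(np.linalg.norm(w_ols), np.linalg.norm(w_ridge))
```

With clean, well-conditioned data like this the shrinkage mostly adds bias; regularization pays off when features are many, correlated, or noisy.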


Week 5: Logistic Regression

The fifth week will be dedicated to logistic regression, a powerful technique for binary classification tasks. Students will learn how to derive the logistic regression model, perform iterative optimization using first- and second-order methods, apply regularization, and explore the relations between generative and discriminative methods.
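The first-order optimization mentioned above can be sketched as plain gradient descent on the regularized negative log-likelihood; the step size, iteration count, penalty strength, and synthetic data are all illustrative choices.

```python
# Logistic regression trained by gradient descent (a first-order method).
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(-1.0, 1.0, (50, 2)),
               rng.normal(1.0, 1.0, (50, 2))])
y = np.concatenate([np.zeros(50), np.ones(50)])   # class labels 0 / 1
Phi = np.hstack([X, np.ones((100, 1))])           # bias trick

w = np.zeros(3)
lr, lam = 0.1, 0.01                 # step size and L2 penalty (illustrative)
for _ in range(500):
    p = sigmoid(Phi @ w)            # predicted P(y = 1 | x)
    # Gradient of the L2-regularized negative log-likelihood.
    grad = Phi.T @ (p - y) / len(y) + lam * w
    w -= lr * grad                  # first-order update

accuracy = ((sigmoid(Phi @ w) > 0.5) == y).mean()
```

A second-order method such as Newton's (iteratively reweighted least squares) would replace the fixed step by a Hessian solve and converge in far fewer iterations on this convex objective.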


Week 6: Support Vector Machines

In week six, we will explore support vector machines (SVMs), a versatile and robust family of algorithms for classification tasks. Students will learn about the key concepts behind SVMs, such as the maximum margin and kernel functions, and gain hands-on experience implementing SVMs using popular machine learning libraries.
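The margin and kernel ideas can be tried out in a few lines; scikit-learn is an assumption here (the course mentions "popular machine learning libraries" without naming one), and the two-moons dataset is an illustrative choice.

```python
# Linear vs. kernelized SVM on data that is not linearly separable.
from sklearn.datasets import make_moons
from sklearn.svm import SVC

# Two interleaving half-moons: no straight line separates the classes.
X, y = make_moons(n_samples=200, noise=0.1, random_state=0)

# A linear SVM finds the maximum-margin hyperplane in the input space;
# an RBF kernel implicitly maps the data into a higher-dimensional space
# where a separating hyperplane does exist.
linear_svm = SVC(kernel="linear", C=1.0).fit(X, y)
rbf_svm = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X, y)

print("linear:", linear_svm.score(X, y))
print("rbf:   ", rbf_svm.score(X, y))
```

The gap between the two scores on this dataset is the kernel trick in action: the same maximum-margin machinery, applied in a transformed feature space.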


Week 7: Ensemble Methods

During the seventh week, we will delve into ensemble methods, which combine multiple models to improve the overall performance of a machine learning system. Students will explore popular techniques such as bagging and boosting, and learn how to implement the AdaBoost algorithm.
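A compact version of AdaBoost with decision stumps can be written in NumPy; the {-1, +1} label coding, the blob data, and the number of boosting rounds below are illustrative choices, not the course's reference implementation.

```python
# AdaBoost with decision stumps as weak learners, NumPy only.
import numpy as np

def fit_stump(X, y, w):
    """Pick the (feature, threshold, polarity) stump with least weighted error."""
    best = (0, 0.0, 1, np.inf)                    # feature, threshold, polarity, error
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = pol * np.where(X[:, j] < thr, -1, 1)
                err = w[pred != y].sum()
                if err < best[3]:
                    best = (j, thr, pol, err)
    return best

def adaboost(X, y, rounds=20):
    n = len(y)
    w = np.full(n, 1.0 / n)                       # start with uniform weights
    ensemble = []
    for _ in range(rounds):
        j, thr, pol, err = fit_stump(X, y, w)
        err = max(err, 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)     # stump's vote weight
        pred = pol * np.where(X[:, j] < thr, -1, 1)
        w *= np.exp(-alpha * y * pred)            # upweight misclassified points
        w /= w.sum()
        ensemble.append((alpha, j, thr, pol))
    return ensemble

def predict(ensemble, X):
    score = sum(a * p * np.where(X[:, j] < t, -1, 1)
                for a, j, t, p in ensemble)
    return np.sign(score)

rng = np.random.default_rng(4)
X = np.vstack([rng.normal(0.0, 0.7, (100, 2)),
               rng.normal(2.0, 0.7, (100, 2))])
y = np.concatenate([-np.ones(100), np.ones(100)])
model = adaboost(X, y)
accuracy = (predict(model, X) == y).mean()
```

The two moves that define AdaBoost are both visible here: each round reweights the sample to focus on previous mistakes, and each stump's vote is weighted by how well it did.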


Week 8: Neural Network Basics

In the final week, students will be introduced to the foundations of deep learning and neural networks. We will cover the basics of artificial neurons, feedforward networks, and backpropagation. This week will provide the groundwork for more advanced topics in deep learning, preparing students for further study or simple practical applications in the field.
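The three ingredients of the week (artificial neurons, a feedforward pass, and backpropagation) fit in a tiny NumPy network. The XOR task, the layer sizes, and the learning rate below are illustrative choices, not course material.

```python
# A minimal feedforward network trained by backpropagation on XOR.
import numpy as np

rng = np.random.default_rng(5)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)    # XOR targets

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 8 sigmoid units feeding a single sigmoid output unit.
W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)

def loss():
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    return ((out - y) ** 2).mean()

initial_loss = loss()
lr = 1.0
for _ in range(10000):
    # Forward pass: each layer applies an affine map and a nonlinearity.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass (backpropagation): the chain rule applied layer by
    # layer for a squared-error loss; sigmoid'(z) = s * (1 - s).
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(axis=0)

final_loss = loss()
pred = (sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int)
```

XOR is the classic example of a problem no single linear unit can solve; the hidden layer is what makes it learnable, which is the core message of this final week.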




Course Auditing
90.00 EUR
