MOOC List is learner-supported. When you buy through links on our site, we may earn an affiliate commission.
Computer vision and AI at the edge are becoming instrumental in powering everything from factory assembly lines and retail inventory management to hospital urgent-care medical imaging equipment such as X-ray and CT scanners. This program teaches fluency in these cutting-edge technologies. The course introduces students to the Intel® Distribution of OpenVINO™ Toolkit, which allows developers to deploy pre-trained deep learning models through a high-level C++ or Python Inference Engine API integrated with application logic. The toolkit supports models based on convolutional neural networks (CNNs), extends workloads across Intel® hardware (including accelerators), and maximizes performance.
What is Edge AI? In Edge AI, the AI algorithms run locally on a hardware device, without requiring a connection to a remote server. The device processes the data it generates into real-time insights within milliseconds. Edge AI processing today is focused on moving the inference part of the AI workflow onto the device, keeping data constrained to the device.
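The idea above can be sketched in a few lines of plain Python. The "model" here is a hypothetical linear scorer standing in for a real deep learning network, and the sensor reading is invented for illustration; the point is that the whole inference happens on-device, with no network round trip.

```python
import time

# Hypothetical "model": fixed weights for a tiny linear classifier.
# A real edge deployment would run an optimized deep learning model instead.
WEIGHTS = [0.4, -0.2, 0.7]
BIAS = -0.1

def infer(reading):
    """Score one sensor reading locally; no network round trip involved."""
    score = sum(w * x for w, x in zip(WEIGHTS, reading)) + BIAS
    return score > 0  # e.g., "anomaly detected"

start = time.perf_counter()
result = infer([0.9, 0.1, 0.5])
elapsed_ms = (time.perf_counter() - start) * 1000

print(result)                      # inference decision, computed on-device
print(f"{elapsed_ms:.3f} ms")      # local latency, typically sub-millisecond
```

Because the data never leaves the device, latency is bounded by local compute rather than network conditions, which is what makes real-time insights feasible.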
What You Will Learn
Lesson 1
Leveraging Pre-Trained Models
- Leverage a pre-trained model for computer vision inferencing
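The value of a pre-trained model is that inference only needs to load and apply parameters someone else already trained. The course uses OpenVINO pre-trained models; the sketch below is a pure-Python stand-in for that workflow, with an invented JSON model file and a toy linear `predict`, to show the load-then-infer pattern without any training code.

```python
import json
import os
import tempfile

# Pretend a training job already produced these parameters ("pre-trained").
pretrained = {"weights": [0.5, 0.25], "bias": 0.1}

# Persist them to disk, as a stand-in for a downloaded model file.
path = os.path.join(tempfile.mkdtemp(), "model.json")
with open(path, "w") as f:
    json.dump(pretrained, f)

# At inference time, we only load and apply the model -- no training needed.
with open(path) as f:
    model = json.load(f)

def predict(x):
    return sum(w * v for w, v in zip(model["weights"], x)) + model["bias"]

print(predict([2.0, 4.0]))  # → 2.1
```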
Lesson 2
The Model Optimizer
- Convert pre-trained models into the framework-agnostic intermediate representation with the Model Optimizer
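Conceptually, the Model Optimizer takes a framework-specific model description and emits one shared intermediate representation (in OpenVINO's case, an .xml topology plus a .bin weights file). The toy converter below illustrates only that idea: two invented "framework" formats converge on one common form. It is not the real tool or the real IR schema.

```python
# Toy converter: maps two hypothetical framework-specific layer descriptions
# into one shared, framework-agnostic "IR" form. The real OpenVINO Model
# Optimizer emits an .xml/.bin pair; this only illustrates the concept.

def to_ir(framework, layers):
    ir = []
    for layer in layers:
        if framework == "frameworkA":    # hypothetical source format A
            ir.append({"op": layer["type"].lower(), "name": layer["id"]})
        elif framework == "frameworkB":  # hypothetical source format B
            ir.append({"op": layer["kind"], "name": layer["label"]})
        else:
            raise ValueError(f"unsupported framework: {framework}")
    return ir

a = to_ir("frameworkA", [{"type": "CONV", "id": "c1"}])
b = to_ir("frameworkB", [{"kind": "conv", "label": "c1"}])
print(a == b)  # both source formats converge on the same IR → True
```

Once everything downstream consumes only the IR, the inference runtime no longer cares which framework trained the model.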
Lesson 3
The Inference Engine
- Perform efficient inference on deep learning models through the hardware-agnostic Inference Engine
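"Hardware-agnostic" means the calling code names a target device and the engine picks the matching backend; the application logic never changes. The dispatcher below is a conceptual sketch of that pattern with trivial stub backends, not real device plugins or the actual Inference Engine API.

```python
# Conceptual sketch of a hardware-agnostic inference API: the caller names a
# device, the engine selects the matching backend, and the calling code stays
# identical. Backends here are trivial stubs, not real device plugins.

def cpu_backend(x):
    return [v * 2 for v in x]   # stand-in for CPU-executed inference

def accelerator_backend(x):
    return [v * 2 for v in x]   # same result on "different hardware"

BACKENDS = {"CPU": cpu_backend, "ACCEL": accelerator_backend}

def run_inference(device, x):
    return BACKENDS[device](x)

print(run_inference("CPU", [1, 2]))    # → [2, 4]
print(run_inference("ACCEL", [1, 2]))  # → [2, 4]  (same call, same output)
```

Swapping hardware then becomes a one-string configuration change rather than a code change, which is exactly what the device-name parameter buys you.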
Lesson 4
Deploying an Edge App
- Deploy an app on the edge, including sending information through MQTT, and analyze model performance and use cases
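MQTT is a lightweight publish/subscribe protocol: the edge app publishes inference results to a named topic on a broker, and any subscriber to that topic receives them. A real deployment would use an MQTT client library and a broker; the in-process stand-in below, with an invented topic name and payload, illustrates only the topic-based pub/sub pattern.

```python
from collections import defaultdict

# In-process stand-in for MQTT's topic-based publish/subscribe pattern.
# A real edge app would use an MQTT client library and a broker instead.
class MiniBroker:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, payload):
        for callback in self.subscribers[topic]:
            callback(payload)

broker = MiniBroker()
received = []
broker.subscribe("edge/person_count", received.append)

# The edge app publishes a (hypothetical) inference statistic per frame.
broker.publish("edge/person_count", {"frame": 1, "count": 3})
print(received)  # → [{'frame': 1, 'count': 3}]
```

Publishing only small derived statistics (counts, alerts) rather than raw video is also how an edge app keeps the underlying data constrained to the device.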