Natural Language Processing with Attention Models (Coursera)


MOOC List is learner-supported. When you buy through links on our site, we may earn an affiliate commission.

This course is for students of machine learning or artificial intelligence, as well as software engineers, who want a deeper understanding of how NLP models work and how to apply them.

In Course 4 of the Natural Language Processing Specialization, offered by DeepLearning.AI, you will:

a) Translate complete English sentences into German using an encoder-decoder attention model,

b) Build a Transformer model to summarize text,

c) Use T5 and BERT models to perform question-answering, and

d) Build a chatbot using a Reformer model.

Course 4 of 4 in the Natural Language Processing Specialization


Syllabus


WEEK 1

Neural Machine Translation

Discover some of the shortcomings of a traditional seq2seq model and how to address them by adding an attention mechanism, then build a Neural Machine Translation model with attention that translates English sentences into German.
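
For a concrete picture of the attention mechanism this week introduces, here is a minimal NumPy sketch of scaled dot-product attention. It is an illustration under general assumptions, not the course's own assignment code (the course uses its own framework and function names).

```python
import numpy as np

def scaled_dot_product_attention(queries, keys, values):
    """Attend each query over the keys; return the weighted values and the weights.

    queries: (len_q, d_k), keys: (len_k, d_k), values: (len_k, d_v)
    """
    d_k = queries.shape[-1]
    scores = queries @ keys.T / np.sqrt(d_k)        # query-key similarity, scaled
    scores -= scores.max(axis=-1, keepdims=True)    # stabilize the softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over key positions
    return weights @ values, weights

# Toy example: 2 decoder positions attending over 3 encoder positions.
rng = np.random.default_rng(0)
q, k, v = rng.normal(size=(2, 4)), rng.normal(size=(3, 4)), rng.normal(size=(3, 4))
context, attn = scaled_dot_product_attention(q, k, v)
print(context.shape, attn.shape)  # (2, 4) (2, 3)
```

In a seq2seq translator, the queries come from the decoder state and the keys and values from the encoder outputs, which is what lets the decoder focus on the relevant source words at each step instead of compressing the whole sentence into one vector.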


WEEK 2

Text Summarization

Compare RNNs and other sequential models to the more modern Transformer architecture, then create a tool that generates text summaries.
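
As a rough, hands-on illustration of Transformer-based summarization (not the course's own assignment code), the sketch below uses the Hugging Face transformers library with a publicly available distilled BART checkpoint; the model name is only an example and is downloaded on first use.

```python
from transformers import pipeline

# Public distilled BART checkpoint fine-tuned for summarization; chosen only as an example.
summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

article = (
    "The Transformer architecture replaces recurrence with self-attention, "
    "so the model can look at every token in the input at once and capture "
    "long-range dependencies more directly than an RNN."
)
summary = summarizer(article, max_length=30, min_length=5, do_sample=False)
print(summary[0]["summary_text"])
```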


WEEK 3

Question Answering

Explore transfer learning with state-of-the-art models like T5 and BERT, then build a model that can answer questions.
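
To make "transfer learning for question answering" concrete, here is a short sketch using the Hugging Face transformers library and a public BERT-style checkpoint fine-tuned on SQuAD. It is an illustration only; the course builds its question-answering models with a different framework.

```python
from transformers import pipeline

# Public extractive-QA checkpoint, used purely as an example.
qa = pipeline("question-answering",
              model="distilbert-base-cased-distilled-squad")

answer = qa(
    question="Which models does the week explore?",
    context="Week 3 explores transfer learning with pretrained models such as "
            "T5 and BERT to build a question-answering system.",
)
print(answer["answer"], round(answer["score"], 3))
```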


WEEK 4

Chatbot

Examine some unique challenges Transformer models face and their solutions, then build a chatbot using a Reformer model.
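
As a loose illustration only: the Reformer's memory-saving techniques (such as locality-sensitive-hashing attention and reversible layers) are built into the Hugging Face transformers implementation, so a pretrained Reformer checkpoint can generate text in a few lines. The sketch below is plain language-model generation, not the course's chatbot, and the checkpoint name is just a public example.

```python
from transformers import ReformerModelWithLMHead, ReformerTokenizer

# Public Reformer checkpoint trained on a single long novel, used only as an example.
name = "google/reformer-crime-and-punishment"
tokenizer = ReformerTokenizer.from_pretrained(name)
model = ReformerModelWithLMHead.from_pretrained(name)

prompt = tokenizer("A few months later", return_tensors="pt").input_ids
output = model.generate(prompt, do_sample=True, temperature=0.7, max_length=100)
print(tokenizer.decode(output[0]))
```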

