In the fifth course of the Deep Learning Specialization, you will become familiar with sequence models and their exciting applications, such as speech recognition, music synthesis, chatbots, machine translation, and natural language processing (NLP).
By the end, you will be able to build and train Recurrent Neural Networks (RNNs) and commonly used variants such as GRUs and LSTMs; apply RNNs to character-level language modeling; gain experience with natural language processing and word embeddings; and use HuggingFace tokenizers and transformer models to solve different NLP tasks such as NER and question answering.
Week 1: Recurrent Neural Networks
Discover recurrent neural networks (RNNs) and several of their variants, including LSTMs, GRUs, and Bidirectional RNNs, all of which perform exceptionally well on temporal data.
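To make the recurrence concrete, here is a minimal NumPy sketch of a single forward step of a vanilla RNN cell; the function and parameter names (`rnn_cell_forward`, `Wax`, `Waa`, and so on) and the toy shapes are illustrative choices, not taken from the course materials:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=0, keepdims=True))  # subtract max for numerical stability
    return e / e.sum(axis=0, keepdims=True)

def rnn_cell_forward(x_t, a_prev, Wax, Waa, ba, Wya, by):
    """One time step of a vanilla RNN: the next hidden state mixes the
    current input with the previous hidden state -- the recurrence that
    lets the network carry information across a sequence."""
    a_next = np.tanh(Wax @ x_t + Waa @ a_prev + ba)  # hidden-state update
    y_t = softmax(Wya @ a_next + by)                 # per-step prediction
    return a_next, y_t

# Toy shapes: 3-dim inputs, 5-dim hidden state, 2 output classes, batch of 4.
rng = np.random.default_rng(0)
n_x, n_a, n_y, m = 3, 5, 2, 4
Wax, Waa, ba = rng.standard_normal((n_a, n_x)), rng.standard_normal((n_a, n_a)), np.zeros((n_a, 1))
Wya, by = rng.standard_normal((n_y, n_a)), np.zeros((n_y, 1))
a, y = rnn_cell_forward(rng.standard_normal((n_x, m)), np.zeros((n_a, m)),
                        Wax, Waa, ba, Wya, by)
print(a.shape, y.shape)  # (5, 4) (2, 4)
```

An LSTM or GRU replaces the single tanh update with gated updates, which makes it easier to carry information across long sequences.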
Week 2: Natural Language Processing and Word Embeddings
Use word vector representations and embedding layers to train recurrent neural networks with outstanding performance across a wide variety of applications, including sentiment analysis, named entity recognition, and neural machine translation.
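As a rough sketch of how an embedding layer slots into such a model, here is a small Keras sentiment classifier; the vocabulary size, embedding dimension, and layer choices are illustrative assumptions:

```python
import tensorflow as tf

vocab_size, embed_dim = 10_000, 50  # illustrative sizes
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(None,), dtype="int32"),  # sequences of token ids
    tf.keras.layers.Embedding(vocab_size, embed_dim),     # maps each id to a learned word vector
    tf.keras.layers.LSTM(64),                             # reads the embedded sequence
    tf.keras.layers.Dense(1, activation="sigmoid"),       # sentiment probability
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```

Pretrained vectors such as GloVe can be dropped in by initializing the `Embedding` weights with the pretrained matrix and setting `trainable=False`.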
Week 3: Sequence Models and the Attention Mechanism
Augment your sequence models with an attention mechanism, an algorithm that helps your model decide where to focus its attention given a sequence of inputs, and explore speech recognition and how to deal with audio data.
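Attention comes in several variants; as a compact stand-in, the scaled dot-product form below shows the core computation (all names and shapes are illustrative): score each input position against a query, normalize the scores with a softmax, and take the weighted sum of the inputs.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())  # stabilized softmax over a 1-D score vector
    return e / e.sum()

def attention(query, keys, values):
    """Score each input position against the query, normalize the scores,
    and return the weighted sum of values: a context vector focused on
    the most relevant positions."""
    scores = keys @ query / np.sqrt(keys.shape[-1])  # (T,) alignment scores
    weights = softmax(scores)                        # attention distribution
    return weights @ values, weights

T, d = 6, 8  # 6 input positions, 8-dim states
rng = np.random.default_rng(1)
context, weights = attention(rng.standard_normal(d),
                             rng.standard_normal((T, d)),
                             rng.standard_normal((T, d)))
print(context.shape, weights.round(2))  # (8,) and 6 weights summing to 1
```

The `weights` vector is a probability distribution over input positions, which is what "deciding where to focus" means operationally.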
Week 4: Transformers
Build the transformer architecture and its attention models, and tackle natural language processing tasks such as named entity recognition (NER) and question answering (QA).
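For a taste of the Week 4 tooling, the HuggingFace `transformers` library exposes both tasks through its `pipeline` API; the sketch below relies on the library's default checkpoints, which are downloaded on first use:

```python
# Requires: pip install transformers (plus a backend such as PyTorch).
from transformers import pipeline

# Named entity recognition: label spans as persons, organizations, locations, ...
ner = pipeline("ner", aggregation_strategy="simple")
print(ner("HuggingFace is based in New York City."))

# Question answering: extract the answer span from a context passage.
qa = pipeline("question-answering")
print(qa(question="Where is HuggingFace based?",
         context="HuggingFace is based in New York City."))
```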