Size: 3.00 GB
Natural Language Processing with Sequence-to-sequence (seq2seq), Attention, CNNs, RNNs, and Memory Networks!
What you’ll learn
- Build a text classification system (can be used for spam detection, sentiment analysis, and similar problems)
- Build a neural machine translation system (can also be used for chatbots and question answering)
- Build a sequence-to-sequence (seq2seq) model
- Build an attention model
- Build a memory network (for question answering based on stories)
- Understand what deep learning is for and how it is used
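To give a flavor of one item above, the core of an attention model can be sketched in a few lines of plain Python. This is an illustrative toy (dot-product attention over small vectors), not code from the course:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def attention(query, keys, values):
    """Dot-product attention: score each key against the query,
    turn scores into weights, and return the weighted sum of values."""
    scores = [dot(query, k) for k in keys]
    weights = softmax(scores)
    dim = len(values[0])
    return [sum(w * v[i] for w, v in zip(weights, values)) for i in range(dim)]

# The query matches the first key more strongly, so the context
# vector leans toward the first value.
context = attention([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0]], [[10.0, 0.0], [0.0, 10.0]])
```

In a real seq2seq model the queries, keys, and values are learned hidden states and the weights tell the decoder which encoder positions to focus on; the mechanics are exactly this weighted sum.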
Requirements
- Decent Python coding skills, especially with tools for data science (NumPy, Matplotlib)
- Preferable to have experience with RNNs, LSTMs, and GRUs
- Preferable to have experience with Keras
- Preferable to understand word embeddings
Description
It’s hard to believe it’s been over a year since I released my first course on Deep Learning with NLP (natural language processing). A lot of cool stuff has happened since then, and I’ve been deep in the trenches learning, researching, and accumulating the best and most useful ideas to bring them back to you. Topics covered include:
- text classification (examples are sentiment analysis and spam detection)
- neural machine translation
- question answering
- bidirectional RNNs
- seq2seq (sequence-to-sequence)
- memory networks
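As a taste of the first topic, text classification can be reduced to a bag-of-words logistic regression trained by gradient descent. The sketch below uses only the standard library; the data and function names are illustrative, not from the course:

```python
import math

def tokenize(text):
    return text.lower().split()

def train_bow_classifier(texts, labels, epochs=200, lr=0.5):
    """Logistic regression over bag-of-words counts,
    trained with per-example gradient descent on log loss."""
    vocab = sorted({w for t in texts for w in tokenize(t)})
    index = {w: i for i, w in enumerate(vocab)}
    weights = [0.0] * len(vocab)
    bias = 0.0
    for _ in range(epochs):
        for text, label in zip(texts, labels):
            counts = [0.0] * len(vocab)
            for w in tokenize(text):
                counts[index[w]] += 1.0
            z = bias + sum(wt * c for wt, c in zip(weights, counts))
            p = 1.0 / (1.0 + math.exp(-z))
            err = p - label  # gradient of log loss w.r.t. z
            bias -= lr * err
            weights = [wt - lr * err * c for wt, c in zip(weights, counts)]
    return vocab, index, weights, bias

def predict(model, text):
    """Probability that the text belongs to the positive class;
    unseen words are simply ignored."""
    vocab, index, weights, bias = model
    z = bias + sum(weights[index[w]] for w in tokenize(text) if w in index)
    return 1.0 / (1.0 + math.exp(-z))

# Toy spam-detection data (label 1 = spam, 0 = ham).
texts = ["win free money now", "free prize win",
         "meeting at noon", "lunch meeting tomorrow"]
labels = [1, 1, 0, 0]
model = train_bow_classifier(texts, labels)
```

Neural approaches in the course replace the bag-of-words features with learned word embeddings and the linear model with deeper networks, but the train/predict loop has the same shape.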
This course assumes that you already have:
- Decent Python coding skills
- Understand RNNs, CNNs, and word embeddings
- Know how to build, train, and evaluate a neural network in Keras
Tips for success:
- Watch it at 2x.
- Take handwritten notes. This will drastically increase your ability to retain the information.
- Write down the equations. If you don’t, I guarantee it will just look like gibberish.
- Ask lots of questions on the discussion board. The more the better!
- The best exercises will take you days or weeks to complete.
- Write code yourself, don’t just sit there and look at my code. This is not a philosophy course!
Who this course is for:
- Students in machine learning, deep learning, artificial intelligence, and data science
- Professionals in machine learning, deep learning, artificial intelligence, and data science
- Anyone interested in state-of-the-art natural language processing