Description
Price: 5.00 USD | Size: 3.06 GB | Duration: 11.58+
BRAND: Expert TRAINING | ENGLISH | INSTANT DOWNLOAD
Complete guide on deriving and implementing word2vec, GloVe, word embeddings, and sentiment analysis with recursive nets
What you'll learn
Understand and implement word2vec
Understand the CBOW method in word2vec
Understand the skip-gram method in word2vec
Understand the negative sampling optimization in word2vec
Understand and implement GloVe using gradient descent and alternating least squares
Use recurrent neural networks for parts-of-speech tagging
Use recurrent neural networks for named entity recognition
Understand and implement recursive neural networks for sentiment analysis
Understand and implement recursive neural tensor networks for sentiment analysis
Use Gensim to obtain pretrained word vectors and compute similarities and analogies
Understand important foundations for OpenAI ChatGPT, GPT-4, DALL-E, Midjourney, and Stable Diffusion
Ever wondered how AI technologies like OpenAI ChatGPT, GPT-4, DALL-E, Midjourney, and Stable Diffusion really work? In this course, you will learn the foundations of these groundbreaking applications.
In this course we are going to look at NLP (natural language processing) with deep learning.
Previously, you learned about some of the basics, like how many NLP problems are just regular machine learning and data science problems in disguise, and simple, practical methods like bag-of-words and term-document matrices.
These allowed us to do some pretty cool things, like detect spam emails, write poetry, spin articles, and group together similar words.
In this course, I'm going to show you how to do even more awesome things. We'll learn not just 1, but 4 new architectures.
First up is word2vec.
In this course, I'm going to show you exactly how word2vec works, from theory to implementation, and you'll see that it's merely the application of skills you already know.
Word2vec is interesting because it magically maps words to a vector space where you can find analogies, like:
- king - man = queen - woman
- France - Paris = England - London
- December - November = July - June
For those beginners who find algorithms tough and just want to use a library, we will demonstrate the use of the Gensim library to obtain pretrained word vectors, compute similarities and analogies, and apply those word vectors to build text classifiers.
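To give you a taste, here is a minimal sketch of that Gensim workflow (illustrative only, not the course's code); it assumes the gensim package is installed, and "glove-wiki-gigaword-50" is one of the pretrained vector sets available through gensim's downloader:
    # A minimal sketch of loading pretrained word vectors and querying them with Gensim.
    import gensim.downloader as api

    # Download and load a small set of pretrained word vectors (a KeyedVectors object).
    vectors = api.load("glove-wiki-gigaword-50")

    # Cosine similarity between two words.
    print(vectors.similarity("france", "england"))

    # Analogy: king - man + woman should land near "queen".
    print(vectors.most_similar(positive=["king", "woman"], negative=["man"], topn=3))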
We are also going to look at the GloVe method, which likewise finds word vectors but uses a technique called matrix factorization, a popular algorithm for recommender systems.
Amazingly, the word vectors produced by GloVe are just as good as the ones produced by word2vec, and it's way easier to train.
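As a rough illustration of the matrix factorization idea (a toy sketch, not the implementation we build in the course), you can fit word and context vectors by gradient descent so that their dot products approximate the log co-occurrence counts; the co-occurrence matrix below is fake and only there to make the sketch runnable:
    # Toy sketch: factorize a co-occurrence matrix X so that W[i].U[j] + b[i] + c[j] ~ log X[i,j].
    import numpy as np

    V, D = 100, 10                                    # vocabulary size and embedding dimension (made up)
    rng = np.random.default_rng(0)
    X = rng.poisson(1.0, size=(V, V)).astype(float)   # fake co-occurrence counts; a real X comes from a corpus

    logX = np.log(X + 1)
    fX = np.minimum(X / 100.0, 1.0) ** 0.75           # GloVe weighting f(x) with x_max = 100, alpha = 0.75

    W = rng.normal(scale=0.1, size=(V, D))            # target word vectors
    U = rng.normal(scale=0.1, size=(V, D))            # context word vectors
    b = np.zeros(V)                                   # target word biases
    c = np.zeros(V)                                   # context word biases

    lr = 0.001
    for epoch in range(50):
        pred = W @ U.T + b[:, None] + c[None, :]
        err = fX * (pred - logX)                      # weighted residuals
        gW, gU = err @ U, err.T @ W                   # gradients (ALS would instead solve for W and U in turn)
        W -= lr * gW
        U -= lr * gU
        b -= lr * err.sum(axis=1)
        c -= lr * err.sum(axis=0)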
We will also look at some classical NLP problems, like parts-of-speech tagging and named entity recognition, and use recurrent neural networks to solve them. You'll see that just about any problem can be solved using neural networks, but you'll also learn the dangers of having too much complexity.
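To make "recurrent network for tagging" concrete, here is a tiny illustrative forward pass in Numpy (dimensions and weights are made up): each word updates a hidden state, and each hidden state produces a score for every tag:
    # Illustrative only: a simple recurrent forward pass that scores a tag for each word.
    import numpy as np

    V, D, H, K = 5000, 50, 64, 12              # vocab size, embedding dim, hidden dim, number of tags (made up)
    rng = np.random.default_rng(0)
    We = rng.normal(scale=0.1, size=(V, D))    # word embeddings
    Wx = rng.normal(scale=0.1, size=(D, H))    # input-to-hidden weights
    Wh = rng.normal(scale=0.1, size=(H, H))    # hidden-to-hidden weights
    Wo = rng.normal(scale=0.1, size=(H, K))    # hidden-to-tag weights

    def tag_scores(word_ids):
        # Return a (T, K) array of unnormalized tag scores for a sentence of T word indices.
        h = np.zeros(H)
        scores = []
        for t in word_ids:                     # walk through the sentence, carrying the hidden state
            h = np.tanh(We[t] @ Wx + h @ Wh)
            scores.append(h @ Wo)
        return np.array(scores)

    print(tag_scores([12, 431, 7, 2048]).shape)   # (4, 12): one row of tag scores per word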
Lastly, you'll learn about recursive neural networks, which finally help us solve the problem of negation in sentiment analysis. Recursive neural networks exploit the fact that sentences have a tree structure, and we can finally get away from naively using bag-of-words.
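In a nutshell, the recursive idea looks like this (again an illustrative sketch with made-up dimensions): combine the children's vectors bottom-up along the parse tree, so a word like "not" can modify the representation of the whole phrase it attaches to:
    # Illustrative only: combine child vectors bottom-up along a binary parse tree.
    import numpy as np

    D, K = 10, 5                               # word-vector dimension, number of sentiment classes (made up)
    rng = np.random.default_rng(0)
    W = rng.normal(scale=0.1, size=(2 * D, D)) # combines [left; right] child vectors into a parent vector
    Wo = rng.normal(scale=0.1, size=(D, K))    # maps a node vector to sentiment scores

    def node_vector(node):
        # node is either a word vector (leaf) or a (left_subtree, right_subtree) tuple.
        if isinstance(node, tuple):
            left, right = node_vector(node[0]), node_vector(node[1])
            return np.tanh(np.concatenate([left, right]) @ W)
        return node                            # leaf: the word vector itself

    # "not good" as a two-word tree; in practice the word vectors come from word2vec or GloVe.
    not_vec, good_vec = rng.normal(size=D), rng.normal(size=D)
    root = node_vector((not_vec, good_vec))
    sentiment_scores = root @ Wo               # unnormalized class scores for the whole phrase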
All of the materials required for this course can be downloaded and installed for FREE. We will do most of our work in Numpy, Matplotlib, and Theano. I am always available to answer your questions and help you along your data science journey.
This course focuses on "how to build and understand", not just "how to use". Anyone can learn to use an API in 15 minutes after reading some documentation. It's not about "remembering facts", it's about "seeing for yourself" via experimentation. It will teach you how to visualize what's happening in the model internally. If you want more than just a superficial look at machine learning models, this course is for you.
See you in class!
"If you can't implement it, you don't understand it"
- Or as the great physicist Richard Feynman said: "What I cannot create, I do not understand".
- My courses are the ONLY courses where you will learn how to implement machine learning algorithms from scratch
- Other courses will teach you how to plug in your data into a library, but do you really need help with 3 lines of code?
- After doing the same thing with 10 datasets, you realize you didn't learn 10 things. You learned 1 thing, and just repeated the same 3 lines of code 10 times...
Suggested Prerequisites:
- calculus (taking derivatives)
- matrix addition, multiplication
- probability (conditional and joint distributions)
- Python coding: if/else, loops, lists, dicts, sets
- Numpy coding: matrix and vector operations, loading a CSV file
- neural networks and backpropagation, be able to derive and code gradient descent algorithms on your own
- Can write a feedforward neural network in Theano or TensorFlow
- Can write a recurrent neural network / LSTM / GRU in Theano or TensorFlow from basic primitives, especially the scan function
- Helpful to have experience with tree algorithms
WHAT ORDER SHOULD I TAKE YOUR COURSES IN?:
- Check out the lecture "Machine Learning and AI Prerequisite Roadmap" (available in the FAQ of any of my courses, including the free Numpy course)
UNIQUE FEATURES
- Every line of code explained in detail; email me any time if you disagree
- No wasted time "typing" on the keyboard like other courses; let's be honest, nobody can really write code worth learning about in just 20 minutes from scratch
- Not afraid of university-level math; get important details about algorithms that other courses leave out
Who this course is for:
- Students and professionals who want to create word vector representations for various NLP tasks
- Students and professionals who are interested in state-of-the-art neural network architectures like recursive neural networks
- SHOULD NOT: Anyone who is not comfortable with the prerequisites.