Word Embeddings in Natural Language Processing

In this video, we will understand what Word Embeddings are in NLP.


Word Embeddings in NLP

NLP is short for Natural Language Processing, where we build different types of applications by combining raw text with deep learning concepts. The main problem faced during model training is that the computer does not understand text; it can only deal with numbers. Converting raw text into a numeric form that also carries information about the context of the corpus is exactly what word embeddings achieve.

Word embeddings are also referred to as featurized vector representations of raw text.

King - Man + Woman = Queen 
The above analogy seems reasonable to us, but can a computer also infer it? The answer is yes, provided there is a way to represent the words as vectors. This is exactly what embeddings help us achieve.
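A minimal way to try this analogy in code, assuming the gensim library and its downloadable "glove-wiki-gigaword-50" pretrained vectors (both are assumptions added here, not part of the original video):

```python
import gensim.downloader as api

# Load small pretrained GloVe vectors (downloaded on first use)
vectors = api.load("glove-wiki-gigaword-50")

# Each word is a dense 50-dimensional vector
print(vectors["king"].shape)  # (50,)

# king - man + woman should land near "queen" in the vector space
print(vectors.most_similar(positive=["king", "woman"], negative=["man"], topn=3))
```

The top result is typically "queen", which is exactly the kind of contextual relationship that simple numeric encodings cannot express.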

One of the methods used in the earlier days of NLP is one-hot encoding, but its main drawback is that it cannot capture the context of the corpus. It is also computationally expensive, because a one-hot vector has one entry for every word in the vocabulary, so the vectors become very high-dimensional.
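As a sketch of why one-hot vectors fall short (the toy corpus and helper function below are illustrative assumptions, not from the original):

```python
import numpy as np

corpus = ["the king rules the land", "the queen rules the land"]
vocab = sorted({word for sentence in corpus for word in sentence.split()})
index = {word: i for i, word in enumerate(vocab)}

def one_hot(word):
    # The vector length equals the vocabulary size, so it grows with the corpus
    vec = np.zeros(len(vocab))
    vec[index[word]] = 1.0
    return vec

# Any two distinct words are orthogonal, so their similarity is always zero:
# one-hot vectors carry no information about context or meaning.
print(np.dot(one_hot("king"), one_hot("queen")))  # 0.0
```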

Some of the state-of-the-art architectures that use word embeddings are: 
1) CBOW and Skip-gram based: Word2Vec, GloVe (see the training sketch after this list) 
2) Transformers: BERT, GPT 
3) LSTM based: ELMo 
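
As referenced above, here is a minimal sketch of training CBOW/Skip-gram embeddings with gensim's Word2Vec class on a toy corpus (the corpus and hyperparameters are illustrative assumptions):

```python
from gensim.models import Word2Vec

sentences = [
    ["the", "king", "rules", "the", "land"],
    ["the", "queen", "rules", "the", "land"],
    ["the", "man", "walks", "with", "the", "woman"],
]

# sg=0 trains CBOW, sg=1 trains Skip-gram; vector_size is the embedding dimension
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=100)

print(model.wv["king"].shape)                # (50,)
print(model.wv.similarity("king", "queen"))  # similarity learned from shared contexts
```

On a corpus this small the learned similarities are not meaningful; in practice Word2Vec is trained on millions of sentences or loaded as pretrained vectors.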

Read the full article: Word Embeddings in NLP
https://www.geeksforgeeks.org/word-embeddings-in-nlp/