Drawbacks of Word2Vec

  1. Same Word → Same Vector (No Context Understanding)

The biggest problem with Word2Vec is that a word always gets the same vector, no matter how the context changes.

Example sentences:

  1. He deposited money in the bank.
  2. He sat near the river bank.

Here the word bank has two meanings:

  • Financial institution
  • Side of a river

But Word2Vec gives the same vector to “bank” in both sentences.

So the model cannot tell which meaning is intended.

Word2Vec simply has no way to adjust a word's representation based on the sentence it appears in.
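The problem can be sketched in a few lines. This is a toy lookup table with made-up (not trained) vectors, but the structure is exactly what Word2Vec gives you: the lookup depends on the word alone, so the sentence argument is ignored.

```python
# Toy embedding table; the numbers are illustrative, not trained values.
embeddings = {
    "bank":  [0.21, -0.53, 0.88],
    "money": [0.34, -0.49, 0.72],
    "river": [-0.61, 0.15, 0.40],
}

def word2vec_lookup(word, sentence):
    # A Word2Vec-style model ignores the sentence entirely:
    # the result depends only on the word.
    return embeddings[word]

v1 = word2vec_lookup("bank", "He deposited money in the bank")
v2 = word2vec_lookup("bank", "He sat near the river bank")
print(v1 == v2)  # True: identical vector despite different meanings
```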

2. Cannot Handle Polysemy (Multiple Meanings)

Many words in English have multiple meanings.

Example:

  • Bat
    • Animal (a flying mammal)
    • Cricket bat

Word2Vec stores only one vector for the word, so the different senses get blended together.
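A small sketch of this blending effect, using hypothetical sense vectors for bat (the numbers are illustrative). Squeezing two distinct senses into one vector produces a blend that is not especially close to either sense:

```python
import math

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

# Hypothetical sense vectors for "bat" (illustrative numbers).
bat_animal  = [0.9, 0.1, 0.0]   # the flying mammal
bat_cricket = [0.0, 0.1, 0.9]   # the sports equipment

# Word2Vec is forced into ONE vector, roughly a blend of both senses:
bat_static = [(a + c) / 2 for a, c in zip(bat_animal, bat_cricket)]

# Both similarities land well below 1.0: the single vector has
# drifted away from both meanings.
print(cosine(bat_static, bat_animal))
print(cosine(bat_static, bat_cricket))
```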

3. Does Not Capture Long-Distance Relationships

Word2Vec works with small context windows.

Example:

The boy who was playing in the garden is very happy today.

Words like boy and happy are related, but they sit far apart in the sentence.

With a small window (e.g. 5 words), Word2Vec never sees them together during training, so it may fail to capture the relationship.
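This is easy to check by generating the (center, context) pairs the way skip-gram training does. With a window of 2, boy and who co-occur, but boy and happy never appear in the same training pair:

```python
sentence = "the boy who was playing in the garden is very happy today".split()

def skipgram_pairs(tokens, window=2):
    # Collect (center, context) pairs the way skip-gram training does:
    # each word is paired only with neighbours inside the window.
    pairs = set()
    for i, center in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if i != j:
                pairs.add((center, tokens[j]))
    return pairs

pairs = skipgram_pairs(sentence, window=2)
print(("boy", "who") in pairs)    # True: adjacent words co-occur
print(("boy", "happy") in pairs)  # False: 9 tokens apart, never paired
```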


4. Static Embeddings

Word2Vec creates fixed embeddings during training.

Once trained:

  • word → vector never changes

But language meaning depends on context.

Modern models such as ELMo and BERT therefore produce dynamic, context-dependent embeddings.