Efficient Estimation of Word Representations in Vector Space

E906311

Efficient Estimation of Word Representations in Vector Space is an influential 2013 paper by Tomas Mikolov, Kai Chen, Greg Corrado, and Jeffrey Dean that introduced the word2vec models (Continuous Bag-of-Words and Skip-gram) for efficiently learning distributed word embeddings from large text corpora, significantly advancing natural language processing.


Observed surface forms (1)

Distributed Representations of Words and Phrases and their Compositionality

Statements (47)

Predicate Object
instanceOf conference paper
scientific paper
affiliationOfAuthors Google
approach shallow neural networks
author Greg Corrado
Jeffrey Dean
Kai Chen
Tomas Mikolov
citationStatus highly cited
datasetUsed Google News corpus
demonstratedProperty semantic regularities in word embeddings
syntactic regularities in word embeddings
word analogies in vector space
designedFor large-scale text corpora
embeddingType word embeddings
evaluationTask word analogy
word similarity
field computational linguistics
machine learning
natural language processing
impact influenced development of modern word embedding methods
widely adopted in NLP research and applications
influenced GloVe
deep learning for NLP
fastText
neural machine translation
introducedTerm word2vec
language English
mainContribution efficient training of distributed word representations
introduction of word2vec models
popularization of neural word embeddings
optimizationGoal efficient estimation of word vectors from large datasets
proposedModel Continuous Bag-of-Words model
Skip-gram model
publicationYear 2013
relatedConcept distributed representations
neural language models
vector space semantics
shortTitle word2vec paper
task language modeling
learning distributed word representations
technique hierarchical softmax
negative sampling
title Efficient Estimation of Word Representations in Vector Space
trainingObjective predicting context words from target word
predicting target word from context words
trainingSpeed significantly faster than previous neural language models
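
The demonstratedProperty statements above refer to the paper's analogy arithmetic: the offset vector("King") - vector("Man") + vector("Woman") lies closest to vector("Queen"). A minimal sketch of that nearest-neighbour query, using hypothetical hand-built toy vectors (real word2vec embeddings are learned from a corpus such as Google News, not hand-crafted):

```python
import numpy as np

# Hypothetical toy embeddings, hand-built so that the analogy holds;
# real word2vec vectors are trained on large text corpora.
emb = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.9, 0.1, 0.8]),
    "man":   np.array([0.1, 0.9, 0.1]),
    "woman": np.array([0.1, 0.1, 0.9]),
}

def analogy(a, b, c, emb):
    """Word closest by cosine similarity to vec(b) - vec(a) + vec(c),
    excluding the three query words (the paper's analogy protocol)."""
    target = emb[b] - emb[a] + emb[c]
    best, best_sim = None, -np.inf
    for word, vec in emb.items():
        if word in (a, b, c):
            continue
        sim = vec @ target / (np.linalg.norm(vec) * np.linalg.norm(target))
        if sim > best_sim:
            best, best_sim = word, sim
    return best

print(analogy("man", "king", "woman", emb))  # -> queen
```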
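
The proposedModel, technique, and trainingObjective rows describe Skip-gram training, where a target word's vector is nudged toward its observed context words and away from sampled noise words (negative sampling). A minimal sketch under illustrative settings: vocabulary size, dimension, learning rate, and the uniform noise distribution are all simplifications here, and hierarchical softmax is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

V, D = 10, 8                                 # illustrative vocab size / dimension
W_in = rng.normal(scale=0.1, size=(V, D))    # target-word ("input") vectors
W_out = rng.normal(scale=0.1, size=(V, D))   # context-word ("output") vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sgns_step(target, context, k=3, lr=0.05):
    """One SGD step of Skip-gram with negative sampling: raise the score
    of the true (target, context) pair, lower it for k noise words."""
    negatives = [n for n in rng.integers(0, V, size=k) if n != context]
    for c, label in [(context, 1.0)] + [(n, 0.0) for n in negatives]:
        score = sigmoid(W_in[target] @ W_out[c])   # predicted P(pair is real)
        grad = lr * (label - score)
        w_old = W_out[c].copy()
        W_out[c] += grad * W_in[target]
        W_in[target] += grad * w_old
```

After repeated steps on an observed pair, its score rises, which is the sense in which the Skip-gram objective "predicts context words from the target word"; swapping the roles of target and context gives the CBOW direction.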

Referenced by (2)

Full triples — surface form annotated when it differs from this entity's canonical label.

Tomas Mikolov notableWork Efficient Estimation of Word Representations in Vector Space
Tomas Mikolov notableWork Efficient Estimation of Word Representations in Vector Space (surface form: Distributed Representations of Words and Phrases and their Compositionality)