Sebastian Ruder

word embeddings

Posts about different aspects of word embeddings.

AAAI 2019 Highlights: Dialogue, reproducibility, and more
events

This post discusses highlights of AAAI 2019. It covers dialogue, reproducibility, question answering, the Oxford style debate, invited talks, and a diverse set of research papers.

  • Sebastian Ruder
11 min read
EMNLP 2018 Highlights: Inductive bias, cross-lingual learning, and more
events

This post discusses highlights of EMNLP 2018. It focuses on talks and papers dealing with inductive bias, cross-lingual learning, word embeddings, latent variable models, language models, and datasets.

  • Sebastian Ruder
11 min read
A Review of the Neural History of Natural Language Processing
language models

This post expands on the Frontiers of Natural Language Processing session organized at the Deep Learning Indaba 2018. It discusses major recent advances in NLP focusing on neural network-based methods.

  • Sebastian Ruder
29 min read
Word embeddings in 2017: Trends and future directions
word embeddings

Word embeddings are an integral part of current NLP models, but no approach has yet superseded the original word2vec. This post focuses on the deficiencies of word embeddings and how recent approaches have tried to resolve them.

  • Sebastian Ruder
17 min read
Highlights of EMNLP 2017: Exciting datasets, return of the clusters, and more
natural language processing

This post discusses highlights of the 2017 Conference on Empirical Methods in Natural Language Processing (EMNLP 2017), including exciting datasets, new cluster-based methods, distant supervision, data selection, and character-level models.

  • Sebastian Ruder
10 min read
A survey of cross-lingual word embedding models
cross-lingual

Monolingual word embeddings are pervasive in NLP. To represent meaning and transfer knowledge across different languages, cross-lingual word embeddings can be used. Such methods learn representations of words in a joint embedding space.

  • Sebastian Ruder
41 min read
Highlights of EMNLP 2016: Dialogue, deep learning, and more
natural language processing

This post discusses highlights of the 2016 Conference on Empirical Methods in Natural Language Processing (EMNLP 2016), including work on reinforcement learning, dialogue, sequence-to-sequence models, semantic parsing, and natural language generation.

  • Sebastian Ruder
4 min read
On word embeddings - Part 3: The secret ingredients of word2vec
word embeddings

Word2vec is a pervasive tool for learning word embeddings. Its success, however, is mostly due to particular architecture choices. Transferring these choices to traditional distributional methods makes them competitive with popular word embedding approaches.

  • Sebastian Ruder
9 min read
On word embeddings - Part 2: Approximating the Softmax
word embeddings

The softmax layer is a core part of many current neural network architectures. When the number of output classes is very large, such as in the case of language modelling, computing the softmax becomes very expensive. This post explores approximations to make the computation more efficient.

  • Sebastian Ruder
33 min read
On word embeddings - Part 1
word embeddings

Word embeddings, popularized by word2vec, are pervasive in current NLP applications. The history of word embeddings, however, goes back much further. This post explores that history in the context of language modelling.

  • Sebastian Ruder
15 min read
Sebastian Ruder © 2019