

Poster

Don't Settle for Average, Go for the Max: Fuzzy Sets and Max-Pooled Word Vectors

Vitalii Zhelezniak · Aleksandar D Savkov · April Shen · Francesco Moramarco · Jack Flann · Nils Hammerla

Great Hall BC #50

Keywords: [ jaccard index ] [ max-pooling ] [ word vector compositionality ] [ bag-of-words ] [ fuzzy sets ] [ distributed representations ] [ sentence representations ] [ word vectors ] [ unsupervised learning ]


Abstract:

Recent literature suggests that averaged word vectors followed by simple post-processing outperform many deep learning methods on semantic textual similarity (STS) tasks. Furthermore, when averaged word vectors are trained in a supervised fashion on large corpora of paraphrases, they achieve state-of-the-art results on standard STS benchmarks. Inspired by these insights, we push the limits of word embeddings even further. We propose a novel fuzzy bag-of-words (FBoW) representation for text that contains all the words in the vocabulary simultaneously, each with a different degree of membership derived from similarities between word vectors. We show that max-pooled word vectors are a special case of FBoW and should be compared via the fuzzy Jaccard index rather than cosine similarity. Finally, we propose DynaMax, a completely unsupervised and non-parametric similarity measure that dynamically extracts and max-pools good features depending on the sentence pair. This method is both efficient and easy to implement, yet outperforms current baselines on STS tasks by a large margin and is even competitive with supervised word vectors trained to directly optimise cosine similarity.
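To make the DynaMax idea concrete, here is a minimal NumPy sketch based solely on the description in the abstract: the "universe" of features is built dynamically from the word vectors of the two sentences being compared, each sentence's membership vector is obtained by max-pooling its similarities to the universe, and the result is scored with the fuzzy Jaccard index. Function and variable names are illustrative, and the clipping of memberships to be non-negative is an assumption motivated by fuzzy-set semantics; details may differ from the authors' implementation.

```python
import numpy as np

def fuzzify(s, u):
    """Fuzzy membership of sentence s over universe u.

    s: (num_words, dim) word vectors of one sentence.
    u: (num_features, dim) universe vectors.
    For each universe vector, max-pool its dot products with all
    words in s, then clip negatives so memberships are non-negative
    (assumed here, in line with fuzzy-set membership degrees).
    """
    m = np.max(s @ u.T, axis=0)
    return np.maximum(m, 0.0)

def dynamax_jaccard(x, y):
    """DynaMax similarity of two sentences via the fuzzy Jaccard index.

    x: (m, dim) word vectors of sentence 1; y: (n, dim) of sentence 2.
    The universe is built dynamically from both sentences, so the
    pooled features depend on the sentence pair being compared.
    """
    u = np.vstack((x, y))                 # dynamic universe for this pair
    mx, my = fuzzify(x, u), fuzzify(y, u)
    union = np.maximum(mx, my).sum()
    if union == 0.0:                      # guard against empty memberships
        return 0.0
    return np.minimum(mx, my).sum() / union

# Illustrative usage with random stand-ins for pretrained word vectors.
rng = np.random.default_rng(0)
sent1 = rng.normal(size=(5, 300))         # 5 words, 300-dim embeddings
sent2 = rng.normal(size=(7, 300))         # 7 words, 300-dim embeddings
print(dynamax_jaccard(sent1, sent2))
```

Note that the measure is entirely unsupervised and non-parametric: nothing is trained, and the only inputs are the pretrained word vectors of the two sentences.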
