Distributional Co-Occurrence Word Vector

From GM-RKB

A Distributional Co-Occurrence Word Vector (also referred to as a word embedding) is a word vector drawn from a distributional word vector space, in which each word's vector is derived from its co-occurrence statistics with other words in some corpus.
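The idea above can be sketched with a minimal count-based example: each word is represented by its row of co-occurrence counts within a fixed context window, and words that appear in similar contexts end up with similar vectors. The toy corpus, window size, and helper names below are illustrative assumptions, not part of the original page.

```python
from collections import Counter

# Hypothetical toy corpus (illustrative only).
corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the rug".split(),
    "a cat chased a dog".split(),
]

vocab = sorted({w for sent in corpus for w in sent})
window = 2  # symmetric context window size (an assumed setting)

# Count co-occurrences of each word with context words within the window.
counts = {w: Counter() for w in vocab}
for sent in corpus:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if j != i:
                counts[w][sent[j]] += 1

def vector(word):
    """The word's distributional vector: its row of co-occurrence counts."""
    return [counts[word][c] for c in vocab]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = sum(a * a for a in u) ** 0.5
    nv = sum(b * b for b in v) ** 0.5
    return dot / (nu * nv) if nu and nv else 0.0

# "cat" and "dog" occur in similar contexts, so their vectors are
# more similar to each other than to an unrelated word like "rug".
sim_cat_dog = cosine(vector("cat"), vector("dog"))
sim_cat_rug = cosine(vector("cat"), vector("rug"))
```

In practice such raw count vectors are usually reweighted (e.g. with PMI) and dimensionality-reduced, but the underlying signal is the same corpus co-occurrence statistics.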


