word2vec Model Instance

From GM-RKB
=== 2014 ===
* ([[2014_LookingforHyponymsinVectorSpace|Rei & Briscoe, 2014]]) ⇒ [[Marek Rei]], and [[Ted Briscoe]]. ([[2014]]). “[http://www.aclweb.org/anthology/W14-1608 Looking for Hyponyms in Vector Space].” In: Proceedings of CoNLL-2014.  
** QUOTE: The [[Window Baseline-based WVSM|window-based]], [[Dependency Parse-based WVSM|dependency-based]] and [[word2vec Model Instance|word2vec vector set]]s were all trained on 112M words from the [[British National Corpus]], with preprocessing steps for [[lower-casing]] and [[lemmatising]]. Any numbers were grouped and substituted by more generic tokens.
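The preprocessing described in the quote (lower-casing, then grouping numbers and substituting a more generic token) can be sketched as follows. This is an illustration, not Rei & Briscoe's code: the `<num>` token name is an assumption, and the lemmatising step is omitted since it would need an external lemmatiser.

```python
import re

def preprocess(tokens):
    """Lower-case tokens and replace any number with a generic token,
    as described in the Rei & Briscoe (2014) quote. Lemmatising is
    omitted from this sketch (it needs an external lemmatiser)."""
    out = []
    for tok in tokens:
        tok = tok.lower()
        # group digit sequences (optionally with , or . separators)
        if re.fullmatch(r"\d+(?:[.,]\d+)*", tok):
            tok = "<num>"  # generic number token; the name is an assumption
        out.append(tok)
    return out

print(preprocess(["The", "BNC", "has", "112", "million", "words"]))
# → ['the', 'bnc', 'has', '<num>', 'million', 'words']
```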


=== 2013 ===

Revision as of 20:45, 23 December 2019

A word2vec Model Instance is a continuous dense distributional word vector space model produced by a word2vec system.
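A word2vec system produces such an instance by training word vectors with skip-gram or CBOW. A minimal skip-gram-with-negative-sampling sketch in Python follows; the toy corpus, embedding dimension, and hyperparameters are illustrative assumptions, not taken from this page, and the reference word2vec implementation differs in many details (subsampling, unigram-table negative sampling, learning-rate decay).

```python
import numpy as np

# Toy corpus; in the 2014 reference above, 112M words of the BNC were used.
corpus = ("the quick brown fox jumps over the lazy dog "
          "the quick red fox sleeps near the lazy dog").split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 16  # vocabulary size, embedding dimension (assumed)

rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, D))   # input (word) vectors
W_out = rng.normal(scale=0.1, size=(V, D))  # output (context) vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

lr, window, k = 0.05, 2, 3  # learning rate, context window, negatives
for _ in range(50):
    for pos, word in enumerate(corpus):
        w = idx[word]
        for cpos in range(max(0, pos - window),
                          min(len(corpus), pos + window + 1)):
            if cpos == pos:
                continue
            # one positive context pair plus k uniform negative samples
            pairs = [(idx[corpus[cpos]], 1.0)] + \
                    [(int(rng.integers(V)), 0.0) for _ in range(k)]
            for t, label in pairs:
                g = lr * (label - sigmoid(W_in[w] @ W_out[t]))
                w_old = W_in[w].copy()
                W_in[w] += g * W_out[t]
                W_out[t] += g * w_old

# W_in is the model instance: one continuous dense vector per vocabulary word.
print(W_in.shape)
```

Each row of `W_in` is the dense distributional vector for one vocabulary word; the matrix as a whole is the word vector space model the definition refers to.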


