Hyperspace Analogue to Language Algorithm

From GM-RKB

A Hyperspace Analogue to Language Algorithm is a word space modeling algorithm that represents word meaning by accumulating distance-weighted co-occurrence counts within a fixed-width window that slides over a text corpus.



References

2015

  • http://en.wikipedia.org/wiki/Semantic_memory#Hyperspace_Analogue_to_Language_.28HAL.29
    • The Hyperspace Analogue to Language (HAL) model[1][2] considers context only as the words that immediately surround a given word. HAL computes an NxN matrix, where N is the number of words in its lexicon, using a 10-word reading frame that moves incrementally through a corpus of text. As in SAM (see above), any time two words are simultaneously in the frame, the association between them is increased; that is, the corresponding cell in the NxN matrix is incremented. The amount by which the association is incremented varies inversely with the distance between the two words in the frame (specifically, [math]\displaystyle{ \Delta = 11 - d }[/math], where [math]\displaystyle{ d }[/math] is the distance between the two words in the frame). As in LSA (see above), the semantic similarity between two words is given by the cosine of the angle between their vectors (dimension reduction may be performed on this matrix as well). In HAL, then, two words are semantically related if they tend to appear with the same words. Note that this may hold true even when the words being compared never actually co-occur (e.g., "chicken" and "canary").
  1. Lund, K., Burgess, C. & Atchley, R. A. (1995). Semantic and associative priming in a high-dimensional semantic space. Cognitive Science Proceedings (LEA), 660-665.
  2. Lund, K. & Burgess, C. (1996). Producing high-dimensional semantic spaces from lexical co-occurrence. Behavior Research Methods, Instruments & Computers, 28(2), 203-208.
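The procedure described above can be sketched in Python. This is a minimal illustration, not a reference implementation: `hal_matrix` records, for each word, distance-weighted counts of the words that follow it within the reading frame (incrementing by 11 - d for a 10-word window), and a word's vector is taken as the concatenation of its row and column, following the row-plus-column scheme described for HAL. The function names and the tiny example corpus are invented for illustration.

```python
import math

def hal_matrix(tokens, window=10):
    """Build a HAL-style co-occurrence matrix.

    m[i][j] accumulates (window + 1) - d each time word j follows
    word i at distance d <= window (so 11 - d for a 10-word frame).
    """
    vocab = sorted(set(tokens))
    index = {w: i for i, w in enumerate(vocab)}
    n = len(vocab)
    m = [[0.0] * n for _ in range(n)]
    for i, w in enumerate(tokens):
        for d in range(1, window + 1):
            if i + d >= len(tokens):
                break
            follower = tokens[i + d]
            m[index[w]][index[follower]] += (window + 1) - d
    return vocab, m

def word_vector(vocab, m, word):
    """A word's HAL vector: its row (words that follow it)
    concatenated with its column (words it follows)."""
    i = vocab.index(word)
    return list(m[i]) + [row[i] for row in m]

def cosine(u, v):
    """Cosine of the angle between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

# Toy corpus: "cat" and "dog" never co-occur within the frame,
# yet they receive similar vectors because they share contexts.
tokens = "the cat sat the dog sat".split()
vocab, m = hal_matrix(tokens, window=2)
sim = cosine(word_vector(vocab, m, "cat"), word_vector(vocab, m, "dog"))
```

The toy example mirrors the "chicken" / "canary" point in the passage: "cat" and "dog" are assigned a high cosine similarity purely because they appear with the same surrounding words.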
