2015 JointLearningofCharacterandWord

From GM-RKB

Subject Headings: Character-Enhanced Word Embedding (CWE) Model; Character Embedding System; Word Embedding System

Notes

Cited By

Quotes

Abstract

Most word embedding methods take a word as a basic unit and learn embeddings according to words' external contexts, ignoring the internal structures of words. However, in some languages such as Chinese, a word is usually composed of several characters and contains rich internal information. The semantic meaning of a word is also related to the meanings of its composing characters. Hence, we take Chinese for example, and present a character-enhanced word embedding model (CWE). In order to address the issues of character ambiguity and non-compositional words, we propose multiple-prototype character embeddings and an effective word selection method. We evaluate the effectiveness of CWE on word relatedness computation and analogical reasoning. The results show that CWE outperforms other baseline methods which ignore internal character information. The codes and data can be accessed from https://github.com/Leonard-Xu/CWE
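The core idea in the abstract is that a Chinese word's representation can be enhanced by the embeddings of its component characters. A minimal sketch of that additive composition, assuming a 50/50 average of the word embedding and the mean character embedding (the `word_emb`/`char_emb` dictionaries and the fallback for non-compositional words are illustrative, not the paper's exact implementation):

```python
import numpy as np

def cwe_word_vector(word, word_emb, char_emb):
    """Compose a word representation from its word embedding and the
    mean of its characters' embeddings (character-enhanced scheme).

    Sketch only: CWE is trained jointly inside a CBOW-style model, and
    also handles character ambiguity via multiple-prototype character
    embeddings, which this toy function omits.
    """
    chars = [char_emb[c] for c in word if c in char_emb]
    if not chars:
        # Non-compositional word: fall back to the word embedding alone,
        # mirroring the paper's word selection idea of excluding words
        # whose meaning is not built from their characters.
        return word_emb[word]
    return 0.5 * (word_emb[word] + np.mean(chars, axis=0))

# Toy usage with hand-made 2-d embeddings:
word_emb = {"智能": np.array([1.0, 0.0])}
char_emb = {"智": np.array([0.0, 2.0]), "能": np.array([0.0, 0.0])}
vec = cwe_word_vector("智能", word_emb, char_emb)  # 0.5*([1,0] + [0,1])
```

In the full model this composed vector replaces the plain word vector in the context window, so gradients flow into both the word and character embeddings during training.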

References

BibTeX

@inproceedings{2015_JointLearningofCharacterandWord,
  author    = {Xinxiong Chen and
               Lei Xu and
               Zhiyuan Liu and
               Maosong Sun and
               Huan-Bo Luan},
  editor    = {Qiang Yang and
               Michael J. Wooldridge},
  title     = {Joint Learning of Character and Word Embeddings},
  booktitle = {Proceedings of the Twenty-Fourth International Joint Conference on
               Artificial Intelligence (IJCAI 2015)},
  pages     = {1236--1242},
  publisher = {AAAI Press},
  year      = {2015},
  url       = {http://ijcai.org/Abstract/15/178},
}


Author: Xinxiong Chen, Lei Xu, Zhiyuan Liu, Maosong Sun, Huan-Bo Luan
Title: Joint Learning of Character and Word Embeddings
Year: 2015