NAACL 2015 Proceedings

Updated June 1, 2015 · Machine Learning
iB37 (web) 2015-05-31 18:14
Tags: Conference events · Resources · Natural language processing · PDF · Conference
The NAACL 2015 Proceedings are finally available on the ACL Anthology: http://t.cn/R2JyN9J. One of the three previously unreleased best papers, "Unsupervised Morphology Induction Using Word Embeddings" [Soricut & Och, NAACL'15], is here: http://t.cn/R2JyN9M

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing in which words from the vocabulary (and possibly phrases built from them) are mapped to vectors of real numbers in a space of low dimension relative to the vocabulary size (a "continuous space"). There are several methods for generating this mapping, including neural networks, dimensionality reduction on the word co-occurrence matrix, and explicit representations based on the contexts in which words appear. When used as the underlying input representation, word and phrase embeddings have been shown to improve performance on NLP tasks such as syntactic parsing and sentiment analysis.
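The co-occurrence-matrix approach mentioned above can be illustrated in a few lines. The following is a minimal sketch, not the method of the Soricut & Och paper: the toy corpus, window size, and embedding dimension k are assumptions chosen only for illustration. It builds a symmetric word co-occurrence matrix and applies a truncated SVD to obtain low-dimensional word vectors.

import numpy as np

# Toy corpus (illustrative assumption, not from the paper)
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "the cat chased the dog",
]

# Build the vocabulary and a word-to-index map
tokens = [sentence.split() for sentence in corpus]
vocab = sorted({w for sent in tokens for w in sent})
index = {w: i for i, w in enumerate(vocab)}

# Count symmetric co-occurrences within a fixed context window
window = 2
cooc = np.zeros((len(vocab), len(vocab)))
for sent in tokens:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if i != j:
                cooc[index[w], index[sent[j]]] += 1.0

# Truncated SVD: keep the top-k components as low-dimensional embeddings
k = 3
U, S, Vt = np.linalg.svd(cooc, full_matrices=False)
embeddings = U[:, :k] * S[:k]  # each row is a k-dimensional word vector

# Cosine similarity between two word vectors
def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

print("cat ~ dog:", cosine(embeddings[index["cat"]], embeddings[index["dog"]]))

Neural-network methods such as word2vec learn the vectors by prediction rather than by factorizing explicit counts, but both families produce the same kind of low-dimensional representation described above.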

NAACL is the North American Chapter of the Association for Computational Linguistics.
