Performance analysis of FFNN-based language model in contrast with n-gram

Kwang Ho Kim, Donghyun Lee, Minkyu Lim, Minho Ryang, Gil Jin Jang, Jeong Sik Park, Ji Hwan Kim

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review

Abstract

In this paper, we analyze the performance of a feed-forward neural network (FFNN)-based language model in contrast with an n-gram model. The probabilities of the n-gram language model were estimated from the statistics of word sequences. The FFNN-based language model was structured with three hidden layers, 500 hidden units per hidden layer, and 30-dimensional word embeddings. The FFNN-based language model outperforms the n-gram model by 1.5% in terms of word error rate (WER) on the English Wall Street Journal (WSJ) domain.
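The abstract only sketches the two models. As a rough illustration, the snippet below pairs a maximum-likelihood n-gram estimate (the count-ratio formulation implied by "statistics of word sequences") with an FFNN language model matching the stated configuration: 30-dimensional embeddings and three hidden layers of 500 units each. The framework (PyTorch), tanh activation, context length, and all identifier names are assumptions for illustration, not details from the chapter.

```python
from collections import Counter
import torch
import torch.nn as nn

def ngram_mle(tokens, n=3):
    """Maximum-likelihood n-gram probabilities:
    P(w | history) = count(history + w) / count(history).
    This count-based formulation is the textbook estimate; the
    chapter does not state its exact estimation or smoothing."""
    grams = Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))
    hists = Counter()
    for g, c in grams.items():
        hists[g[:-1]] += c                       # history counts from n-gram counts
    return {g: c / hists[g[:-1]] for g, c in grams.items()}

class FFNNLM(nn.Module):
    """FFNN language model with the configuration stated in the abstract:
    30-dim word embeddings, three hidden layers of 500 units each, and a
    softmax over the vocabulary. Context length (3 preceding words) and
    the tanh activation are assumptions."""
    def __init__(self, vocab_size, context=3, emb_dim=30, hidden=500):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        layers, in_dim = [], context * emb_dim
        for _ in range(3):                       # three hidden layers
            layers += [nn.Linear(in_dim, hidden), nn.Tanh()]
            in_dim = hidden
        self.hidden = nn.Sequential(*layers)
        self.out = nn.Linear(hidden, vocab_size)  # softmax applied in the loss

    def forward(self, context_ids):              # (batch, context) word indices
        x = self.emb(context_ids).flatten(1)     # concatenate context embeddings
        return self.out(self.hidden(x))          # unnormalized next-word logits
```

Trained with cross-entropy over next-word targets, such a model yields the conditional probabilities that an n-gram model derives from counts, which is the comparison the chapter evaluates in terms of WER.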

Original language: English
Title of host publication: Natural Language Dialog Systems and Intelligent Assistants
Publisher: Springer International Publishing
Pages: 253-256
Number of pages: 4
ISBN (Electronic): 9783319192918
ISBN (Print): 9783319192901
State: Published - 29 Oct 2015

Keywords

  • Feed forward neural network
  • Language model
  • N-Gram
  • Performance analysis
