Abstract
In this paper, we analyze the performance of a feed-forward neural network (FFNN)-based language model in contrast with an n-gram language model. The probabilities of the n-gram language model were estimated from the statistics of word sequences. The FFNN-based language model consisted of three hidden layers with 500 hidden units per layer and 30-dimensional word embeddings. The FFNN-based language model outperforms the n-gram model by 1.5% in terms of WER on the English WSJ domain.
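The architecture in the abstract (30-dimensional embeddings, three hidden layers of 500 units, softmax over the vocabulary) can be sketched as a forward pass in NumPy. This is a minimal illustration, not the authors' implementation: the vocabulary size, context length, activation function, and all parameter values are assumptions for demonstration.

```python
import numpy as np

# Sketch of the FFNN language model described in the abstract.
# Assumed values (not stated in the paper): vocabulary size,
# 2-word context, tanh activations, random untrained weights.
rng = np.random.default_rng(0)

VOCAB = 10_000   # assumed vocabulary size
EMB = 30         # embedding dimension (from the abstract)
HID = 500        # hidden units per layer (from the abstract)
CTX = 2          # assumed number of context words

embeddings = rng.normal(scale=0.1, size=(VOCAB, EMB))
W = [rng.normal(scale=0.1, size=(CTX * EMB, HID)),
     rng.normal(scale=0.1, size=(HID, HID)),
     rng.normal(scale=0.1, size=(HID, HID))]
b = [np.zeros(HID) for _ in range(3)]
W_out = rng.normal(scale=0.1, size=(HID, VOCAB))
b_out = np.zeros(VOCAB)

def next_word_probs(context_ids):
    """Return P(w | context) from the feed-forward LM sketch."""
    h = embeddings[context_ids].reshape(-1)   # concatenate context embeddings
    for Wi, bi in zip(W, b):
        h = np.tanh(h @ Wi + bi)              # three hidden layers
    logits = h @ W_out + b_out
    e = np.exp(logits - logits.max())         # numerically stable softmax
    return e / e.sum()

probs = next_word_probs([42, 7])
```

Unlike an n-gram model, whose probabilities come from count-based statistics over observed word sequences, this network maps the context words into a continuous embedding space, which lets it generalize to word sequences never seen in training.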
Original language | English |
---|---|
Title of host publication | Natural Language Dialog Systems and Intelligent Assistants |
Publisher | Springer International Publishing |
Pages | 253-256 |
Number of pages | 4 |
ISBN (Electronic) | 9783319192918 |
ISBN (Print) | 9783319192901 |
DOIs | |
State | Published - 29 Oct 2015 |
Keywords
- Feed forward neural network
- Language model
- N-Gram
- Performance analysis