Recurrent neural networks for language modeling

Emil Sauer Lynge

Abstract: The goal of the thesis is to explore the mechanisms and tools that enable efficient development of recurrent neural networks, how to train them, and what they can accomplish with regard to character-level language modelling. Specifically, Gated Recurrent Units and Long Short-Term Memory are the focal points of the training and language modelling (see the sketch after this record). The choice of data sets, hyperparameters, and visualization methods aims to reproduce parts of [KJL15]. More broadly, the RNN as a concept is explored through computational graphs and backpropagation. Several concrete software tools written in Python 3 are developed as part of the project and discussed briefly in the thesis.
Type: Master's thesis [Academic thesis]
Year: 2016
Publisher: Technical University of Denmark, Department of Applied Mathematics and Computer Science
Address: Richard Petersens Plads, Building 324, DK-2800 Kgs. Lyngby, Denmark, compute@compute.dtu.dk
Series: DTU Compute M.Sc.-2016
Note: Supervisor: Ole Winther, olwi@dtu.dk, DTU Compute
Electronic version(s): [pdf]
Publication link: http://www.compute.dtu.dk/English.aspx
BibTeX data: [bibtex]
IMM Group(s): Intelligent Signal Processing
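To illustrate the gated recurrence and character-level modelling the abstract refers to, below is a minimal single-step GRU sketch in Python 3 (the project's language). This is not code from the thesis; the dimensions and names (V, H, Wz, Wr, Wh, Wy, gru_step, next_char_probs) are all illustrative assumptions.

    import numpy as np

    V, H = 50, 64                      # assumed vocabulary and hidden sizes
    rng = np.random.default_rng(0)
    Wz, Wr, Wh = (rng.normal(0, 0.01, (H, H + V)) for _ in range(3))
    Wy = rng.normal(0, 0.01, (V, H))   # hidden-to-output weights

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def gru_step(h, x):
        # One GRU update: the gates decide how much of h to keep vs. rewrite.
        hx = np.concatenate([h, x])
        z = sigmoid(Wz @ hx)                                # update gate
        r = sigmoid(Wr @ hx)                                # reset gate
        h_tilde = np.tanh(Wh @ np.concatenate([r * h, x]))  # candidate state
        return (1 - z) * h + z * h_tilde

    def next_char_probs(h):
        # Softmax over the vocabulary gives next-character probabilities.
        logits = Wy @ h
        e = np.exp(logits - logits.max())
        return e / e.sum()

    h = np.zeros(H)                    # initial hidden state
    x = np.zeros(V); x[7] = 1.0        # a one-hot encoded input character
    h = gru_step(h, x)
    p = next_char_probs(h)             # distribution over the next character

Training such a model would backpropagate through a sequence of these steps, which is where the computational-graph view mentioned in the abstract applies.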