
Relating RNN layers with the spectral WFA ranks in sequence modelling

Liza, Farhana Ferdousi and Grzes, Marek (2019) Relating RNN layers with the spectral WFA ranks in sequence modelling. In: ACL Workshop on Deep Learning and Formal Languages: Building Bridges, 2 August 2019, Florence, Italy. (Unpublished) (KAR id:74240)

PDF Pre-print (373kB), Language: English

Abstract

We analyse Recurrent Neural Networks (RNNs) to understand the significance of multiple LSTM layers. We argue that Weighted Finite-state Automata (WFA) trained with a spectral learning algorithm are helpful for analysing RNNs. Our results suggest that multiple LSTM layers help RNNs learn distributed hidden states but have a smaller impact on their ability to learn long-term dependencies. The analysis is based on empirical results; however, relevant theory is discussed, wherever possible, to justify and support our conclusions.
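
The spectral learning algorithm referred to in the abstract estimates a WFA from a Hankel matrix of string statistics, and the (numerical) rank of that matrix bounds the number of WFA states. The snippet below is a minimal, illustrative sketch of that rank computation only, not the paper's implementation; the function names and the toy distribution are hypothetical.

```python
import numpy as np

def hankel_rank(prob, prefixes, suffixes, tol=1e-6):
    """Numerical rank of the Hankel block H[p, s] = prob(p + s)."""
    H = np.array([[prob(p + s) for s in suffixes] for p in prefixes])
    singular_values = np.linalg.svd(H, compute_uv=False)
    # Count singular values above a relative tolerance.
    return int(np.sum(singular_values > tol * singular_values[0]))

# Toy distribution over {a, b}*: probability decays geometrically with length.
def toy_prob(w):
    return 0.5 * (0.25 ** len(w))

basis = ["", "a", "b", "aa", "ab", "ba", "bb"]
print(hankel_rank(toy_prob, basis, basis))  # rank 1: a one-state WFA suffices
```

In a spectral learning pipeline, the probabilities would come from empirical sequence counts rather than a closed-form function, and the truncated SVD of the Hankel matrix would also supply the WFA transition operators.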

Item Type: Conference or workshop item (Speech)
Uncontrolled keywords: deep learning, NLP, LSTM, WFA, spectral learning
Subjects: Q Science > QA Mathematics (inc Computing science) > QA 76 Software, computer programming > QA76.76 Computer software
Divisions: Divisions > Division of Computing, Engineering and Mathematical Sciences > School of Computing
Depositing User: Marek Grzes
Date Deposited: 04 Jun 2019 11:14 UTC
Last Modified: 16 Feb 2021 14:04 UTC
Resource URI: https://kar.kent.ac.uk/id/eprint/74240 (The current URI for this page, for reference purposes)
Grzes, Marek: https://orcid.org/0000-0003-4901-1539
