Neural nets 101 - Recurrent Neural Networks

A brief intro into recurrent neural networks.

NN08 - recurrent networks, BPTT

In this notebook:

  • recurrent networks
  • backpropagation through time (BPTT)
  • time-series models
  • sequence models

Recurrent networks

<img src="rnn.png" width="300" height="300"> Examples include Hopfield networks, Boltzmann machines, echo-state networks, long short-term memory (LSTM) networks, Nonlinear AutoRegressive networks with eXogenous inputs (NARX), and Elman and Jordan networks.

Recurrent networks can be trained by standard backprop and its variations, e.g. by mapping input/output vectors to input+context/output pairs (possibly using a sliding-window approach).
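As a minimal sketch of the sliding-window idea, the helper below (hypothetical, not from the original notebook) turns a plain sequence into fixed-size input windows paired with the next value as the target:

```python
import numpy as np

def sliding_windows(seq, k):
    """Map a sequence to (input window of length k, next value) training pairs."""
    X = np.array([seq[i:i + k] for i in range(len(seq) - k)])
    y = np.array([seq[i + k] for i in range(len(seq) - k)])
    return X, y

X, y = sliding_windows([0, 1, 2, 3, 4, 5], k=3)
# X rows: [0,1,2], [1,2,3], [2,3,4]; targets y: [3, 4, 5]
```

Each row of `X` can then be fed to an ordinary feed-forward net, which is how a non-recurrent model can be trained on sequence data.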

Backpropagation through time (BPTT)

Given a data sequence x with target values t: (x0,t0), (x1,t1), (x2,t2), …, unfold the recurrent network through time (e.g. k=3 below).

<img src="bptt.png" width="400" height="400">

  • Train the unfolded net with backprop, but process the sequence in order, obtaining o<sub>0</sub>, o<sub>1</sub>, o<sub>2</sub>, …
  • s<sub>0</sub> is normally a vector of zeros
  • Each training example has the form (s<sub>t-1</sub>, x<sub>t</sub>, s<sub>t</sub>, x<sub>t+1</sub>, s<sub>t+1</sub>, x<sub>t+2</sub>, t<sub>t+2</sub>)
  • Typically use online learning
  • After each example, average the weights across the unfolded copies so that every copy keeps the same shared U, V, W




Written on January 8, 2018