Backpropagator's Review

Copyright 1996-2001 by Donald R. Tveter; commercial use is prohibited. Short quotations are permitted if proper attribution is given. This material CAN be posted elsewhere on the net if the posted files are not altered in any way, but please let me know where it is posted. The main location is: http://dontveter.com/bpr/bpr.html

Up to Backpropagator's Review

Recurrent Networks

Last Change to This File: June 19, 1999

Recurrent networks can be used to classify time series or predict future values. They work by taking the output and/or hidden layer unit values, copying them down to a short-term memory on the input layer, and feeding them back in at the next time step. For my Professional Basis of AI Backprop software I have a web page that gives a more detailed explanation if you need it; it also includes a little about NARX networks.
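Here is a minimal sketch of that copy-down mechanism in Python, in the style of an Elman network. This is just an illustration, not the Basis of AI Backprop code; the layer sizes and names are made up. The hidden layer values are saved as "context" units and appended to the next input:

    import numpy as np

    rng = np.random.default_rng(0)
    n_in, n_hidden, n_out = 3, 5, 2

    # The hidden layer sees the real input plus the context units.
    W_h = rng.normal(0, 0.5, (n_hidden, n_in + n_hidden))
    b_h = np.zeros(n_hidden)
    W_o = rng.normal(0, 0.5, (n_out, n_hidden))
    b_o = np.zeros(n_out)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def run_sequence(xs):
        """Feed a sequence through the net, carrying hidden values forward."""
        context = np.zeros(n_hidden)           # short-term memory, starts empty
        outputs = []
        for x in xs:
            full_input = np.concatenate([x, context])
            hidden = sigmoid(W_h @ full_input + b_h)
            outputs.append(sigmoid(W_o @ hidden + b_o))
            context = hidden                   # copy down for the next time step
        return outputs

    xs = [rng.random(n_in) for _ in range(4)]
    for t, y in enumerate(run_sequence(xs)):
        print(f"t={t}: {y}")

Copying the output units down instead of (or in addition to) the hidden units gives the Jordan-style variant; the bookkeeping is the same.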

The ability of a recurrent network to remember the past is somewhat limited: in some experiments I did, the influence extended only to about 6 inputs back in time, and even at 6 the effects were slight. Lin and others show in an online paper how a conventional recurrent network can be modified to retain the memory of past inputs for a longer time.
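The heart of that modification, as I read it, is to feed the network several delayed copies (taps) of the inputs and of its own outputs, NARX-style, so information from further back reaches the net directly instead of having to survive many passes through the hidden layer. A small sketch of the delay-line bookkeeping, with made-up tap counts (the net itself can be any feedforward net):

    from collections import deque
    import numpy as np

    input_delays, output_delays = 3, 3     # assumed tap counts, for illustration
    n_in = 2

    x_taps = deque([np.zeros(n_in)] * input_delays, maxlen=input_delays)
    y_taps = deque([np.zeros(1)] * output_delays, maxlen=output_delays)

    def narx_input(x_t):
        """Assemble the feedforward input vector from the delay lines."""
        x_taps.appendleft(x_t)
        return np.concatenate(list(x_taps) + list(y_taps))

    def record_output(y_t):
        """Push the newest network output into its delay line."""
        y_taps.appendleft(np.atleast_1d(y_t))

    # At each step: form the input, run your net on it, then record the
    # output so it is available as a delayed tap at later steps.
    for t in range(5):
        v = narx_input(np.random.random(n_in))
        y = v.mean()                       # stand-in for the network's output
        record_output(y)
        print(f"t={t}: input length {v.size}, output {y:.3f}")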

For classifying time series, see the article by Kalman and Kwasny, where they show that averaging the network's output over the entire series of inputs can improve classification performance.
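The averaging itself is trivial; the point is to classify from the mean output vector rather than from the last time step alone. A tiny sketch with made-up numbers, where each row would be the output of a recurrent net at one time step:

    import numpy as np

    per_step_outputs = np.array([
        [0.30, 0.70],    # hypothetical outputs, one row per time step
        [0.55, 0.45],
        [0.80, 0.20],
        [0.75, 0.25],
    ])

    avg = per_step_outputs.mean(axis=0)
    print("averaged output:", avg)
    print("predicted class:", int(np.argmax(avg)))

Here the final step alone would pick class 0, and the average does too, but on noisy series the average smooths out steps where the net wavers.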

If you have any questions or comments, write me.

To Don's Home Page