
Recursive Generalized Neural Networks (RGNN) for the Modeling of a Load Sensing Pump

Travis Wiens
Nutaksas Research
Auckland, New Zealand
travis@nutaksas.com

Rich Burton, Greg Schoenau, Doug Bitner
University of Saskatchewan
Saskatoon, Canada


Greetings from Canada…

2


and New Zealand.

3


White Box Models

● Linearized models, valid for small fluctuations only
● Require a physical understanding of the system
● Parameter determination is expensive and difficult
● Aid understanding of the system

4


White Box Models

● Krus, P. 1988.
● Erkkila, M. 1999.
● Kim, S.D. and Cho, H.S. 1988.
● Wu, D., et al. 2002.
5


Neural Network Black Box Models

● Difficulty in selecting the form:
  ● Number of layers
  ● Number of neurons
  ● Activation function
● Difficulty in training:
  ● Approximate derivatives
  ● Training time
● Does not aid understanding

6


Neural Network Black Box Models

● McNamara, J., et al., 1997.
● Xu, X.P., et al., 1994.
● Watton, J. and Xue, Y., 1997.
● Chen, Y., et al., 1997.
● Lamontagne, D., et al., 2003.
● Li, L., et al., 2006 and 2007.
7


Artificial Neural Network

● Large number of simple units
● Complex ensemble behavior
● Universal approximator

8


Single Neuron

[Diagram: inputs u_1 … u_N, each multiplied by a weight w_1 … w_N, summed, and passed through tanh to give the output y]

y = tanh(u_1·w_1 + u_2·w_2 + … + u_N·w_N)
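A minimal NumPy sketch of this single neuron (illustrative only; the code the authors reference on the final slide is MATLAB):

```python
import numpy as np

def neuron(u, w):
    """Single neuron: weighted sum of the inputs, squashed by tanh."""
    return np.tanh(np.dot(u, w))

# Example with three inputs and arbitrary illustrative weights.
u = np.array([0.5, -1.0, 2.0])   # inputs u_1 .. u_N
w = np.array([0.1, 0.4, -0.3])   # weights w_1 .. w_N
y = neuron(u, w)                 # tanh(u_1*w_1 + u_2*w_2 + u_3*w_3)
```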

9


Neural Network Black Box Models

● Difficulty in selecting the form:
  ● Number of layers
  ● Number of neurons
  ● Activation function
● Solution: Generalized Neural Network

10


Recurrent Generalized Neural Network

● Werbos, 1990
● Maximum number of weights without recurrence
● All other connections have a delay (differs from Werbos)
● No layers
● One structure parameter: the number of neurons (see the sketch below)
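The slide leaves the exact wiring implicit; here is a hypothetical NumPy sketch of one time step under one common reading of Werbos-style generalized networks: ordered neurons, undelayed weights from the inputs and from lower-numbered neurons, one-step-delayed weights for all other connections, tanh hidden neurons, and a linear output neuron (per slide 22). All names and shapes are illustrative assumptions, not the authors' MATLAB code.

```python
import numpy as np

def rgnn_step(u, x_prev, W_in, W_fwd, W_rec):
    """One time step of a recurrent generalized neural network (sketch).

    u      : external inputs at this step, shape (n_in,)
    x_prev : neuron outputs from the previous step, shape (n,)
    W_in   : input weights, shape (n_in, n), no delay
    W_fwd  : forward neuron weights, shape (n, n); only the strictly
             lower triangle is used (lower-numbered neuron -> higher)
    W_rec  : recurrent weights, shape (n, n), one-step delay
    """
    n = len(x_prev)
    x = np.zeros(n)
    for i in range(n):
        a = u @ W_in[:, i] + x[:i] @ W_fwd[:i, i] + x_prev @ W_rec[:, i]
        x[i] = a if i == n - 1 else np.tanh(a)  # last neuron is linear
    return x

# Driving the network over a record: one input sample per time step,
# reading the last neuron as the output.
rng = np.random.default_rng(0)
n_in, n = 2, 5
W_in = 0.1 * rng.standard_normal((n_in, n))
W_fwd = 0.1 * rng.standard_normal((n, n))
W_rec = 0.1 * rng.standard_normal((n, n))
x = np.zeros(n)
for u_t in 0.1 * rng.standard_normal((100, n_in)):
    x = rgnn_step(u_t, x, W_in, W_fwd, W_rec)
y = x[-1]  # network output at the final step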

11


Recurrent Generalized Neural Network

12


Neural Network Black Box Models

● Difficulty in training:
  ● Approximate derivatives
  ● Training time
● Solution: Complex Method

13


Complex Method of Optimization

● Box, M.J., 1965; Andersson, J., 2001; Xu, X.P., 1994
● No derivatives required
● No knowledge of the function required (linear or non-linear)
● Allows bounded parameters

14


Complex Method of Optimization

1) Generate a population (random or selected parameters)
2) Evaluate the "fitness" of each individual: the error on the training data
3) Reflect the worst individual through the centroid of the others (its distance plus 30%), checking the bounds
4) If the new individual is still the worst, move it toward the best individual (these steps are sketched below)
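A sketch of these four steps in NumPy. This is a generic implementation of Box's method, not the authors' code; the population size, retry cap, and halving step toward the best individual are assumptions the slide leaves open.

```python
import numpy as np

def complex_method(fitness, lb, ub, n_pop=20, n_iter=500, seed=0):
    """Box's complex method: derivative-free, bound-constrained maximization."""
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    # 1) Generate a random population inside the bounds.
    pop = lb + rng.random((n_pop, len(lb))) * (ub - lb)
    # 2) Evaluate the fitness of each individual.
    fit = np.array([fitness(p) for p in pop])
    for _ in range(n_iter):
        worst, best = np.argmin(fit), np.argmax(fit)
        # 3) Reflect the worst point through the centroid of the others,
        #    overshooting its distance by 30%, then clip to the bounds.
        centroid = (pop.sum(axis=0) - pop[worst]) / (n_pop - 1)
        new = np.clip(centroid + 1.3 * (centroid - pop[worst]), lb, ub)
        f_new = fitness(new)
        # 4) If the reflected point would still be worst, move it halfway
        #    toward the best individual (capped retries so it terminates).
        for _ in range(20):
            if f_new > fit[worst]:
                break
            new = 0.5 * (new + pop[best])
            f_new = fitness(new)
        pop[worst], fit[worst] = new, f_new
    return pop[np.argmax(fit)], fit.max()

# Example: maximize a simple bowl-shaped fitness within bounds.
p_best, f_best = complex_method(lambda p: -np.sum(p**2), [-5, -5], [5, 5])
```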

15


Problem

● Model a load sensing pump
● Experimental data:
  ● Source pressure, Ps
  ● Control piston pressure, Pc
  ● Pump flow, Q
● 12,000 data points sampled at 200 Hz
● Problem: model Q(Pc, Ps) (illustrative data-handling sketch below)
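An illustrative setup for this record in NumPy. The file name, column order, and train/verification split below are assumptions; the actual data is available via the links on the final slide.

```python
import numpy as np

# 12,000 samples recorded at 200 Hz (60 s of operation).
data = np.loadtxt("pump_data.csv", delimiter=",")  # assumed columns: Ps, Pc, Q
ps, pc, q = data[:, 0], data[:, 1], data[:, 2]
u = np.column_stack([pc, ps])      # network inputs: Pc and Ps
t = np.arange(len(q)) / 200.0      # time vector at 200 Hz

# Hold part of the record out for verification (split size assumed).
n_train = 8000
u_train, q_train = u[:n_train], q[:n_train]
u_verif, q_verif = u[n_train:], q[n_train:]
```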

20


Load Sensing Pump

21


RGNN/Complex Parameters

● 30 neurons
● 810 weights
● Activation functions: tanh for hidden neurons, linear for the output
● Bounds: forward weights ±5, feedback weights ±1
● Fitness: negative RMS error; for unstable parameter sets, the number of finite output values (sketched below)
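A sketch of that fitness scheme, assuming q_pred is the flow predicted by simulating the RGNN (e.g. with rgnn_step above) over the training record; the -1e9 offset is an assumption used only to rank every unstable candidate below every stable one.

```python
import numpy as np

def rgnn_fitness(q_pred, q_target):
    """Fitness for the complex method (higher is better).

    Stable runs score the negative RMS flow error, so maximizing the
    fitness minimizes the error. Runs that blow up (non-finite outputs)
    are instead scored by how many outputs stayed finite, keeping them
    below all stable runs while ranking 'less unstable' ones higher.
    """
    finite = np.isfinite(q_pred)
    if finite.all():
        return -np.sqrt(np.mean((q_pred - q_target) ** 2))
    return -1e9 + finite.sum()
```

With this fitness, the bounds on this slide (±5 forward, ±1 feedback) would become the lb/ub vectors passed to the complex method sketched earlier.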

22


Results

[Flow graphs]

23


Results

24


Results

25


Comparison

• Li, 2007
  – Error on training data: 2.56e-05 m³/s
  – Error on verification data: 2.29e-05 m³/s
  – Training time: 10-12 hours
  – Required considerable "tweaking"

• RGNN
  – Error on training data: 1.18e-05 m³/s
  – Error on verification data: 1.52e-05 m³/s
  – Training time: 2.5 hours
  – Minimal "tweaking"

26


Results

● Initialization effects

27


Conclusion

● Few parameters
● Little setup needed
● Faster training
● Initialization important

28


More Information

● Travis Wiens: travis@nutaksas.com
● Code/data:
  ● Nutaksas Blog: http://blog.nutaksas.com
  ● Matlab Central File Exchange: http://www.mathworks.com/matlabcentral/

29
