Journal Neurocomputers, № 1, 2014
Article in issue:
Analysis of a spiking recurrent neural network as part of a liquid state machine
Authors:
E. N. Benderskaya - Ph.D. (Eng.), Associate Professor, Department of Computer Systems and Software Technologies, St. Petersburg State Polytechnical University. E-mail: helen.bend@gmail.com
K. V. Nikitin - Post-graduate Student, Department of Computer Systems and Software Technologies, St. Petersburg State Polytechnical University. E-mail: execiter@mail.ru
Abstract:
This paper presents a component analysis of a promising new model for dynamic pattern recognition: the spiking recurrent neural network (RNN). The model under consideration is part of a liquid state machine (LSM), a pattern recognition system belonging to the new class of reservoir computers. The spiking RNN is analyzed in order to later develop methods for solving dynamic pattern recognition tasks. First, the scheme of a pattern recognition system based on the liquid state machine is considered. It consists of a reservoir (the spiking RNN) and several readouts. Input signals, modelled as continuous functions of time u(t), are fed to the RNN input. In response to the input signals the RNN moves to a new state x^M(t), which is then passed to the readouts. The readout outputs are continuous functions of time y(t); these outputs are the outputs of the whole recognition system. The spiking RNN acts as a nonlinear filter L^M with fading memory applied to the input functions u(t). The readouts are modelled by functions f(·) whose parameters can be tuned during training. Next, the structure of the spiking RNN is considered. It is a random three-dimensional grid whose elements are spiking neurons and whose connections are synapses. The spiking RNN has many parameters characterizing the type of neurons, the type of synapses, connectivity, delays, and noise. In the base LSM model, the RNN consists of integrate-and-fire (IaF) spiking neurons; to enhance the capabilities of the LSM, more complex neurons such as the Izhikevich neuron can be used. Connections are represented by electrical and chemical synapses: electrical synapses transmit signals in analog form, while chemical synapses transmit signals as spike trains. Chemical synapses realize a nonlinear filter with constant delay. Special kinds of synapses can be used to model short-term and long-term memory: dynamic synapses and synapses with some form of plasticity (spike-timing-dependent plasticity, STDP).
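The pipeline described above (continuous input u(t) → spiking reservoir → filtered state x^M(t) → tunable readout f(·) producing y(t)) can be sketched as a toy simulation. All dimensions, constants, and the training task below are illustrative assumptions, not values taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy constants (illustrative assumptions, not the paper's parameters):
N, T = 100, 200  # number of LIF neurons, number of time steps
W = (rng.random((N, N)) < 0.1) * rng.normal(0.0, 0.5, (N, N))  # sparse recurrent weights
w_in = rng.normal(0.0, 1.0, N)                                 # input weights

u = np.sin(2 * np.pi * np.arange(T) / 50.0)  # continuous input signal u(t)

v = np.zeros(N)           # membrane potentials of leaky integrate-and-fire neurons
spikes = np.zeros(N)
state = np.zeros((T, N))  # low-pass-filtered spike trains: the liquid state x^M(t)
for t in range(T):
    v = 0.9 * v + w_in * u[t] + W @ spikes  # leaky integration of input + recurrence
    spikes = (v >= 1.0).astype(float)       # fire where the threshold is crossed
    v[spikes > 0] = 0.0                     # reset fired neurons
    prev = state[t - 1] if t > 0 else 0.0
    state[t] = 0.8 * prev + spikes          # exponential filter -> continuous state

# Readout f(.): a tunable linear map, here trained offline by least squares
target = np.roll(u, 1)                      # toy task: reproduce the delayed input
W_out, *_ = np.linalg.lstsq(state, target, rcond=None)
y = state @ W_out                           # output y(t) of the whole system
```

Note that only the readout weights W_out are trained; the random reservoir is left untouched, which is the defining design choice of reservoir computing.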
Desired input and output signals are often given in continuous form. There is therefore a problem of transforming continuous signals into spike trains and, conversely, spike trains into continuous signals. The main analysis results, such as the mathematical models of the recognition system, are collected in a summary table. This table shows the complexity of the spiking RNN model at different levels: on the one hand, the large dimension of the equation system; on the other, the many adjustable parameters. The model is considered analytically insoluble, so developing methods for the structural-parametric synthesis of spiking RNNs requires many experiments. One possible approach is cybernetic physics. A spiking neural network modelling environment is needed that allows varying most of the parameters, applying arbitrary input signals, and reading out all output signals. The investigation should include building a parameter hierarchy of the spiking RNN, estimating the working ranges of all parameters, and performing experiments in sequence according to an adjustable real-time research plan.
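One common way to bridge the continuous and spiking domains is rate coding; the sketch below is only an illustration of that general idea (the paper does not commit to this particular scheme). The signal value is treated as a per-step firing probability, and decoding is done with a causal exponential kernel, a simple synaptic-filter model:

```python
import numpy as np

rng = np.random.default_rng(1)

T = 1000
t = np.arange(T)
u = 0.5 + 0.5 * np.sin(2 * np.pi * t / 200.0)  # continuous signal scaled to [0, 1]

# Encode: Poisson-like spike train, firing probability u(t) per time step
spikes = (rng.random(T) < u).astype(float)

# Decode: convolve spikes with a causal exponential kernel of time constant tau
tau = 20.0
decoded = np.zeros(T)
for i in range(1, T):
    decoded[i] = decoded[i - 1] * np.exp(-1.0 / tau) + spikes[i]
decoded /= tau  # normalize the filter gain so decoded tracks u

# decoded approximates u up to stochastic-encoding noise and the filter's lag
err = np.mean((decoded - u) ** 2)
```

Larger tau averages over more spikes (less noise) but increases the lag of the decoded signal, the same fading-memory trade-off the abstract attributes to chemical synapses.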
Pages: 17-22