Radiotekhnika Publishing House: scientific and technical literature. Books and journals of the publishing houses IPRZHR, RS-PRESS, and SCIENCE-PRESS.

Tel.: +7 (495) 625-9241

Self-reproducing artificial intelligent neural networks language (SAINNL)

Keywords:

S.D. Ionov – Postgraduate Student, Institute of Mathematics and Mechanics (IMM), Ural Branch of the Russian Academy of Sciences. E-mail: progsdi@gmail.com


The article «Self-reproducing artificial intelligent neural networks language (SAINNL)» presents a descriptive language for a specific type of neural network, the self-reproducing artificial intelligent neural network (SAINN). The article consists of seven parts. At the beginning, the author defines the requirements for the language and reviews existing analogs. Next, a general description of the SAINNL elements is given. The third part covers static and dynamic generation, which form the basis of the growth functions at compile time and at runtime. The fourth and fifth parts describe string definitions and comments, respectively, both useful for developers. In the sixth part, the proposed components are used in examples of SAINNL descriptions of separate blocks as well as fully functional networks, in particular the perceptron. In conclusion, the constructed SAINNL is summarized, and it is noted that the language can be applied in the development of algorithms based on SAINN.
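The perceptron named above as the fully functional example network can also be illustrated outside SAINNL. The following minimal Python sketch of Rosenblatt's learning rule (reference 7) is for orientation only: the function names and the AND-gate training data are choices of this sketch, not part of the article or of SAINNL itself.

```python
# Minimal Rosenblatt perceptron learning rule: a single threshold unit
# whose weights are nudged by the prediction error on each sample.
def train_perceptron(samples, epochs=20, lr=1.0):
    """samples: list of (inputs, target) pairs with target in {0, 1}."""
    n = len(samples[0][0])
    w = [0.0] * n  # one weight per input
    b = 0.0        # bias (threshold)
    for _ in range(epochs):
        for x, t in samples:
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = t - y  # -1, 0, or +1
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Logical AND is linearly separable, so the rule converges on it.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
```

After training, `predict(w, b, x)` reproduces the AND truth table; linearly inseparable targets such as XOR would not converge with a single unit.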
References:

  1. Ionov S.D. Vosproizvodyashchiesya iskusstvennye neyronnye seti: obrabotka i assotsiativnoe svyazyvanie signalov // Sovremennye problemy matematiki: book of abstracts / IMM UrO RAN. Yekaterinburg. 2013. Pp. 313–316.
  2. Smith R.G. NeuronC User's Manual. Department of Neuroscience, University of Pennsylvania. USA, Pennsylvania. 1992. http://retina.anatomy.upenn.edu/~rob/ncman2.html
  3. Weitzenfeld A. The Neural Simulation Language: A System for Brain Modeling. USA, Massachusetts: MIT. 2002. 367 p.
  4. Gleeson P. NeuroML: A Language for Describing Data Driven Models of Neurons and Networks with a High Degree of Biological Detail // PLoS Computational Biology. V. 6. № 6. 2010. 19 p.
  5. Ionov S.D. Self-reproducing Artificial Intelligent Neural Network Language. Electronic text data. Yekaterinburg. 2012. https://bitbucket.org/ionsphere/axis4-neural/wiki/Language
  6. Ionov S.D. Programmirovanie neyrosetevogo emulyatora mashiny T'yuringa // XII All-Russian scientific conference «Neyrokomp'yutery i ikh primenenie»: book of abstracts / MGPPU. Moscow. 2014. Pp. 44–45.
  7. Rosenblatt F. The Perceptron: A Probabilistic Model for Information Storage and Organization in the Brain // Psychological Review. V. 65. № 6. USA, New York. 1958. Pp. 386–408.
  8. Widrow B. Adaptive Switching Circuits // IRE WESCON Convention Record. USA, California. 1960. Pp. 96–104.
