Journal «Neurocomputers» № 3, 2020
Article in issue:
Neural network synapses analysis technology for input features study
Type of article: scientific article
DOI: 10.18127/j19998554-202003-05
UDC: 004.853; 004.855.5
Authors:

R.V. Isakov – Ph.D. (Eng.), Associate Professor, Department of Biomedical and Electronic Means and Technologies, Federal State Budgetary Educational Institution of Higher Education «Vladimir State University named after A.G. and N.G. Stoletovs»

E-mail: Isakov-RV@mail.ru

Abstract:

The development of digital computer technology has allowed humanity to accumulate huge amounts of useful practical data, in various forms, from different areas of life. Artificial neural networks (ANN) are now widely used as a universal means of data analysis for a broad range of tasks. The vast majority of this work addresses the classification problem.

The main problem here is that the neural network model works like a "black box".

Therefore, for the further development and practical application of neural network models, it is important to solve the inverse problem: determining which input features have had the greatest impact on the result of the trained neural network. Many different neural network topologies exist today, but the most studied and theoretically justified is the multilayer perceptron. The Widrow-Lehr formula can be used to estimate the number of neurons in the hidden layers of homogeneous neural networks, but the task is to find a balance between the accuracy and the generalizing ability of the neural network model.
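The abstract does not reproduce the formula itself. As an illustrative sketch only, one common reading of the Widrow rule of thumb says the training set should contain roughly N_w / eps examples, where N_w is the number of weights and eps the target error rate; for a single hidden layer this can be inverted to budget the hidden-layer size. The function name and default below are illustrative assumptions, not the author's procedure:

```python
def hidden_neurons(n_inputs, n_outputs, n_samples, target_error=0.1):
    """Rough hidden-layer size from the Widrow rule of thumb:
    a training set of P examples supports about N_w = P * eps weights.
    For one hidden layer of h neurons (with bias terms):
        N_w = h * (n_inputs + 1) + n_outputs * (h + 1)
    Solving N_w = n_samples * target_error for h gives the estimate.
    """
    budget = n_samples * target_error                     # affordable weight count
    h = (budget - n_outputs) / (n_inputs + n_outputs + 1) # solve for h
    return max(1, round(h))                               # at least one neuron

print(hidden_neurons(n_inputs=10, n_outputs=2, n_samples=1000))
```

Such an estimate is only a starting point; as the abstract notes, the final size must still be tuned experimentally against generalization.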

Given the many factors that are difficult to take into account analytically, it is necessary to conduct a series of computational experiments to find the optimal size of the ANN.

To quantify the effectiveness of classifiers, criteria such as sensitivity, specificity, and informativeness are used.
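For concreteness, these criteria can be computed directly from confusion-matrix counts. Here informativeness is taken to be Youden's index Se + Sp - 1, a common convention in the Russian-language literature; this definition is an assumption, since the abstract does not define the term:

```python
def classifier_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity and informativeness (Youden's J)
    from true/false positive and negative counts."""
    sensitivity = tp / (tp + fn)        # true-positive rate
    specificity = tn / (tn + fp)        # true-negative rate
    informativeness = sensitivity + specificity - 1.0
    return sensitivity, specificity, informativeness

se, sp, j = classifier_metrics(tp=90, fn=10, tn=80, fp=20)
print(round(se, 3), round(sp, 3), round(j, 3))
```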

The results of numerous computational experiments on training neural networks with different numbers of outputs showed the superiority of the modular principle of ANN construction.

A neural network that has passed the training and testing procedure is already able to recognize certain states of an object (classification), but the information about the discovered dependencies is stored in the synaptic connections in implicit form. Extracting this information yields new knowledge about the nature of the revealed regularities, makes it possible to identify significant factors, construct analytical dependencies, and understand the logic of ANN decision-making. To do this, it is necessary to solve the inverse problem of neural network analysis: determining which ANN inputs have the greatest impact on the output. This is possible because the mathematical dependencies underlying the neural network are known. The analysis must take into account the influence of positive and negative connections and their distribution across the network.
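The abstract does not give the author's exact procedure for this weight analysis. A generic sketch in the spirit of Garson-style weight decomposition, for a single-hidden-layer perceptron with one output, might look as follows; all names and the normalization scheme are illustrative assumptions:

```python
def input_significance(w_hidden, w_output):
    """Weight-based estimate of input significance for one hidden layer.
    w_hidden[j][i] : weight from input i to hidden neuron j
    w_output[j]    : weight from hidden neuron j to the output
    The sign of each result shows a positive or negative influence;
    the magnitudes, normalized over all inputs, show relative importance.
    """
    n_inputs = len(w_hidden[0])
    raw = [0.0] * n_inputs
    for j, row in enumerate(w_hidden):
        denom = sum(abs(w) for w in row) or 1.0      # per-neuron normalization
        for i, w in enumerate(row):
            raw[i] += (w / denom) * w_output[j]      # signed contribution
    total = sum(abs(r) for r in raw) or 1.0
    return [r / total for r in raw]                  # signed relative shares

sig = input_significance(w_hidden=[[2.0, -0.5], [1.0, 0.1]],
                         w_output=[1.0, -0.5])
print([round(s, 3) for s in sig])
```

In this toy example the first input carries a positive net influence and the second a negative one, which is exactly the kind of sign-and-magnitude information the abstract proposes to extract from trained synapses.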

It is also possible to build an analytical dependence of the output on the found key features.

Modeling of neural networks was performed using the pyBrain library in the Python environment.

To study the capabilities of the developed technology for analyzing ANN synapses, computational experiments were carried out that simulated the entire analysis process, from data collection to the construction of an analytical dependence. Both positive and negative, linear and nonlinear, relationships were identified.

The results of the computational experiments showed that ANN training allows a large number of unrelated (noise) parameters to be fed to the input without significant loss of classification efficiency.

This technology will allow feedforward neural networks to be used not only for classification but also to search for new knowledge about hidden regularities in the phenomena under study. The ability to synthesize analytical dependencies from the neural network speeds up classification by reducing the number of calculations and makes the neural network model easier to interpret.

The set of significant features found as a result of the proposed neural network analysis can be used in the construction of models of other types or to optimize the input variables of neural network modules.

Pages: 45-55
For citation

Isakov R.V. Neural network synapses analysis technology for input features study. Neurocomputers. 2020. V. 22. № 3. P. 45–55. DOI: 10.18127/j19998554-202003-05

References
  1. Bunker R.P., Thabtah F. A machine learning framework for sport result prediction. Applied Computing and Informatics. 2019. V.15. Iss. 1. P. 27-33. DOI: 10.1016/j.aci.2017.09.005
  2. Malone C., Fennell L., Folliard T., Kelly C. Using a neural network to predict deviations in mean heart dose during the treatment of left-sided deep inspiration breath hold patients. Physica Medica. 2019. V. 65. P. 137-142. DOI: 10.1016/j.ejmp.2019.08.014
  3. Heo S., Lee J.H. Parallel neural networks for improved nonlinear principal component analysis. Computers & Chemical Engineering. 2019. V. 127. P. 1-10. DOI: 10.1016/j.compchemeng.2019.05.011
  4. Dong Y., Fu Z., Peng Y., Zheng Y., Yan H., Li X. Precision fertilization method of field crops based on the Wavelet-BP neural network in China. Journal of Cleaner Production. 2019. № 118735.
  5. Zhao S., Liang H., Du P., Pan Y. Adaptive Neural Network Control for A Class of Discrete-Time Nonlinear Interconnected Systems With Unknown Dead-Zone. Journal of the Franklin Institute. 2019. V. 356. Iss. 18. P. 11345-11363. DOI: 10.1016/j.jfranklin.2019.08.024
  6. Voyevoda A.A., Romannikov D.O. Formirovaniye struktury neyronnoy seti posredstvom dekompozitsii iskhodnoy zadachi na primere zadachi upravleniya robotom manipulyatorom. Izvestiya SPBGETU LETI. 2018. № 9. S. 27-32. (in Russian)
  7. Posyagin A.I., Yuzhakov A.A. Razrabotka dvukhsloynoy neyronnoy seti dlya samomarshrutiziruyushchegosya analogotsifrovogo preobrazovatelya na osnove neyronnoy seti. Elektrotekhnika. 2013. №11. S. 10-13. (in Russian)
  8. Chervyakov N.I., Tikhonov E.E. Primeneniye neyronnykh setey dlya zadach prognozirovaniya i problemy identifikatsii modeley prognozirovaniya na neyronnykh setyakh. Neyrokompyutery: razrabotka, primeneniye. 2003. № 10-11. S. 25-31. (in Russian)
  9. Manzhula V.G., Fedyashov D.S. Neyronnyye seti Kokhonena i nechetkiye neyronnyye seti v intellektualnom analize dannykh. Fundamentalnyye issledovaniya. 2011. №4. S. 108-114. (in Russian)
  10. Semeykin V.D. Upravleniye setyu peredachi dannykh s pomoshchyu iskusstvennykh neyronnykh setey. T-COMM: Telekommunikatsii i transport. 2013. T.7. № 7. S. 118-121. (in Russian)
  11. Qatawneh Z., Alshraideh M., Almasri N., Tahat L., Awidi A. Clinical decision support system for venous thromboembolism risk classification. Applied Computing and Informatics. 2019. V.15. Iss. 1. P. 12-18.
  12. Dash Ch.S.K., Behera A.K., Dehuri S., Cho S. Building a novel classifier based on teaching learning based optimization and radial basis function neural networks for non-imputed database with irrelevant features. Applied Computing and Informatics. 2019. DOI: 10.1016/j.aci.2019.03.001
  13. Tharwat A. Classification assessment methods. Applied Computing and Informatics. 2018. DOI:10.1016/j.aci.2018.08.003
  14. Jiang J., Zhang H., Pi D., Dai C. A novel multi-module neural network system for imbalanced heartbeats classification. Expert Systems with Applications: X. 2019. V. 1. DOI: 10.1016/j.eswax.2019.100003
  15. Binkhonain M., Zhao L. A review of machine learning algorithms for identification and classification of non-functional requirements. Expert Systems with Applications: X. 2019. V. 1. DOI: 10.1016/j.eswax.2019.100001
  16. Hecht-Nielsen R. Kolmogorov's Mapping Neural Network Existence Theorem. IEEE First Annual Int. Conf. on Neural Networks. San Diego. 1987. V. 3. P. 11-13.
  17. Muller B., Reinhart J. Neural Networks: an introduction. Springer-Verlag. Berlin Heidelberg. 1990. P.104-112.
  18. Widrow B., Lehr M.A. 30 years of adaptive neural networks: perceptron, madaline, and backpropagation. Proceedings of the IEEE. 1990. V. 78. № 9. P. 1415-1442.
  19. Isakov R.V., Suntsova O.V. Issledovaniye iskusstvennykh neyronnykh setey v zadache identifikatsii lichnosti po elektrokardiosignalu, zaregistrirovannomu ustroystvom CardioQVARK. Neyrokompyutery: razrabotka, primeneniye. 2016. № 3. S. 31-38. (in Russian)
  20. Isakov R.V., Al-Mabruk M.A., Lukianova Yu.A., Sushkova L.T. Rezultaty issledovaniya neyronnykh setey v zadachakh raspoznavaniya patologicheskikh izmeneniy elektricheskoy aktivnosti serdtsa. Biomeditsinskaya radioelektronika. 2010. №7. S. 9-13. (in Russian)
  21. Fiziologiya cheloveka: v 3-kh tomakh. T. 1. Per. s angl. / Pod red. R. Shmidta, G. Tevsa. M.: Mir. 1996. 323 s. (in Russian)
  22. Schaul T., Bayer J., Wierstra D., Yi S., Felder M., Sehnke F., Rückstieß T., Schmidhuber J. PyBrain. Journal of Machine Learning Research. 2010. V. 11. P.743-746.
Date of receipt: December 13, 2019