I.N. Sinitsyn1, A.I. Ivanov2, A.V. Bezyaev3, I.A. Filipov4
1 Federal Research Center “Informatics and Management” of RAS (Moscow, Russia)
2 Penza Research Electrotechnical Institute JSC (Penza, Russia)
3 Penza Branch of Atlas Scientific and Technical Center JSC (Penza, Russia)
1 sinitsin@dol.ru, 2 bio.ivan.penza@mail.ru, 3 re.wo1f@mail.ru
The purpose of the article is to improve the quality of statistical analysis of data from small samples of about 16 experiments. A technology is developed that reduces sample-size requirements through a neural-network combination of several statistical criteria. The reduction in sample size is achieved by increasing the complexity of the statistical processing algorithms and by representing classical statistical criteria as equivalent artificial neurons. The article examines a neural-network combination of three classical criteria: Pearson's chi-square test (1900), the normalized fourth statistical moment (kurtosis) test (1931), and the Geary test (1935), aimed at testing the hypothesis that a small sample follows a normal and/or uniform distribution law.
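To make the three combined criteria concrete, the following is a minimal Python sketch of the underlying statistics for a sample of 16 observations. The binning rule and function names are illustrative assumptions, not the authors' exact formulation.

```python
import math
import random

def pearson_chi2(sample, bins=4):
    """Pearson chi-square statistic against equal-width bins over the
    sample range (an assumed binning rule for illustration)."""
    lo, hi = min(sample), max(sample)
    width = (hi - lo) / bins or 1.0
    counts = [0] * bins
    for x in sample:
        counts[min(int((x - lo) / width), bins - 1)] += 1
    expected = len(sample) / bins
    return sum((c - expected) ** 2 / expected for c in counts)

def kurtosis(sample):
    """Normalized fourth statistical moment (close to 3 for a normal law)."""
    n = len(sample)
    m = sum(sample) / n
    m2 = sum((x - m) ** 2 for x in sample) / n
    m4 = sum((x - m) ** 4 for x in sample) / n
    return m4 / m2 ** 2

def geary(sample):
    """Geary statistic: mean absolute deviation over standard deviation
    (close to sqrt(2/pi) ~ 0.7979 for a normal law)."""
    n = len(sample)
    m = sum(sample) / n
    mad = sum(abs(x - m) for x in sample) / n
    sd = math.sqrt(sum((x - m) ** 2 for x in sample) / n)
    return mad / sd

random.seed(1)
sample = [random.gauss(0.0, 1.0) for _ in range(16)]
print(pearson_chi2(sample), kurtosis(sample), geary(sample))
```

Each statistic alone is weak on 16 observations; the article's point is that feeding all three into a neural-network combination extracts more decision power than any single criterion.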
It is proposed to improve the technology by increasing the complexity of the previously used binary neurons (perceptrons). Instead of binary neurons, neurons with piecewise-linear excitation functions are used, which requires configuring not only the parameters of the input functional that enriches the data but also the parameters of the output piecewise-linear quantizer. A simple algorithm for tuning the piecewise-linear functions of artificial neurons is proposed, which makes it possible to predict, from each neuron's response, that neuron's probability of error.
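A possible shape of such a neuron is sketched below in Python: a piecewise-linear excitation function over the criterion statistic, with a quantile-based rule for placing its knots. The quantile tuning rule and all names are assumptions for illustration, not the article's tuning algorithm.

```python
def piecewise_linear(x, knots, values):
    """Piecewise-linear excitation: linear interpolation between
    (knot, value) pairs, clamped outside the knot range.
    Knots must be sorted in ascending order."""
    if x <= knots[0]:
        return values[0]
    if x >= knots[-1]:
        return values[-1]
    for i in range(1, len(knots)):
        if x <= knots[i]:
            t = (x - knots[i - 1]) / (knots[i] - knots[i - 1])
            return values[i - 1] + t * (values[i] - values[i - 1])

def tune_knots(stat_values, levels=4):
    """Assumed tuning rule: place knots at empirical quantiles of the
    statistic observed on training samples, so each output level is hit
    with roughly equal probability (which makes per-level error
    probabilities easy to estimate from the same training data)."""
    s = sorted(stat_values)
    n = len(s)
    return [s[min(round(q * (n - 1)), n - 1)]
            for q in (k / levels for k in range(levels + 1))]

# Example: a symmetric triangular excitation over three knots.
knots, values = [0.0, 1.0, 2.0], [0.0, 1.0, 0.0]
print(piecewise_linear(0.5, knots, values))
```

The multi-level (rather than binary) response is what lets each neuron report not just a decision but a graded confidence that a second layer can exploit.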
Previously, neural-network generalization of statistical criteria was performed by single-layer networks, and a positive effect was achieved by discrete procedures for detecting and correcting errors in the redundant output code of the single-layer network. The transition to piecewise-linear functions makes it possible to build two-layer neural networks. It is shown that second-layer neurons perform a more efficient convolution of the input redundancy in a continuous-discrete state space. The probability of missing erroneous states can be reduced to 28% relative to classical data correction performed in discrete space only.
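The contrast between the two ways of collapsing redundancy can be sketched in Python as follows. This is an illustrative toy, with assumed names and weights, not the article's second-layer construction.

```python
def discrete_correct(bits):
    """Classical discrete correction: majority vote over the redundant
    binary code emitted by first-layer binary neurons. All information
    about how confident each neuron was is already lost at this point."""
    return int(sum(bits) > len(bits) / 2)

def continuous_convolve(levels, weights, threshold):
    """Second-layer neuron: weighted sum of the multi-level
    (piecewise-linear) first-layer responses, thresholded once at the
    end, so near-boundary responses contribute weaker votes."""
    return int(sum(w * v for w, v in zip(weights, levels)) > threshold)

# Three first-layer neurons: two fairly confident "yes", one weak "no".
bits = [1, 1, 0]                 # binarized responses
levels = [0.9, 0.8, 0.1]         # graded responses before binarization
print(discrete_correct(bits))
print(continuous_convolve(levels, [1.0, 1.0, 1.0], 1.5))
```

Because the second layer sees graded responses, two confident votes and one weak dissent resolve differently from two marginal votes and one confident dissent, even though both binarize to the same code word; this is the extra discriminating power of the continuous-discrete state space.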
Sinitsyn I.N., Ivanov A.I., Bezyaev A.V., Filipov I.A. Algorithm for setting the parameters of the output piecewise-linear functions of Geary artificial neurons, oriented toward collapsing code redundancy. Highly Available Systems. 2024. V. 20. № 3. P. 19−27. DOI: https://doi.org/10.18127/j20729472-202403-02 (in Russian)
- R 50.1.033-2001. Recommendations on standardization. Applied statistics. Rules for testing the agreement of an empirical distribution with a theoretical one. Part I. Chi-square type criteria. Moscow: Gosstandart of Russia, 2001. 140 p. (in Russian).
- R 50.1.037-2002. Applied statistics. Rules for testing the agreement of an empirical distribution with a theoretical one. Part II. Nonparametric criteria. Moscow: Gosstandart of Russia, 2002. 123 p. (in Russian).
- Kobzar A.I. Applied Mathematical Statistics. For Engineers and Scientists. Moscow: Fizmatlit, 2006. 816 p. (in Russian).
- Geary R.C. The ratio of the mean deviation to the standard deviation as a test of normality. Biometrika. 1935. V. 27. P. 310–322.
- Ivanov A.I. Artificial mathematical molecules: improving the accuracy of statistical estimates on small samples (MathCAD programs). Preprint. Penza: Penza State University Publishing, 2020. 36 p. (in Russian).
- Ivanov A.I. Neural-network multicriteria statistical analysis of small samples: A handbook. Penza: Penza State University Publishing, 2022. 160 p. (in Russian).
- Morelos-Zaragoza R. The Art of Error Correcting Coding. Moscow: Tekhnosfera, 2007. 320 p. (Russian translation).
- Volchikhin V.I., Ivanov A.I., Bezyaev A.V., Filipov I.A. Recognition of small samples with a given data distribution using artificial neurons that predict the confidence probabilities of their own decisions. University Proceedings. Volga Region. Technical Sciences. 2023. № 4. P. 31–39 (in Russian).
- Ivanov A.I. Bionics: "on-the-fly" learning using genetically differently pretrained artificial neurons. Security Systems. 2023. № 4. P. 122–125 (in Russian).
- Haykin S. Neural Networks: A Comprehensive Course. Moscow: Williams Publishing House, 2006. 1104 p. (Russian translation).
- Nikolenko S., Kadurin A., Arkhangelskaya E. Deep Learning: Immersion into the World of Neural Networks. St. Petersburg: Piter Publishing House, 2018 (in Russian).
- Aggarwal Charu. Neural Networks and Deep Learning. St. Petersburg: Dialektika, 2020. 756 p. (Russian translation).
- Ivanov A.P., Ivanov A.I., Bezyaev A.V., Kupriyanov E.N., Bannykh A.G., Perfilov K.A., Lukin V.S., Savinov K.N., Polkovnikova S.A., Serikova Yu.I., Malygin A.Yu. A review of new statistical criteria for testing the hypothesis of normality and uniformity of small-sample data distributions. Reliability and Quality of Complex Systems. 2022. № 2. P. 33–44 (in Russian).
- GOST R 52633.5-2011. Information protection. Information protection techniques. Automatic training of neural-network "biometrics-to-access-code" converters (in Russian).
- Ivanov A.I. Biometric personal identification by the dynamics of subconscious movements: A monograph. Penza: Penza State University Publishing, 2000. 178 p. (in Russian).
- Yazov Yu.K., Volchikhin V.I., Ivanov A.I., Funtikov V.A., Nazarov I.G. Neural-network protection of personal biometric data. Ed. Yu.K. Yazov. Moscow: Radiotekhnika, 2012. 157 p. (in Russian).