Journal Neurocomputers. No. 3. 2019.
Article in issue:
Neural network development on the basis of knowledge of environment exposures
Type of article: scientific article
DOI: 10.18127/j19998554-201903-01
UDC: 519.764
Authors:

K. R. Akhmetzyanov – Post-graduate Student, Department of Automatics and Telemechanics, Perm National Research Polytechnic University

E-mail: kirill94a@mail.ru

A. A. Yuzhakov – Dr.Sc. (Eng.), Professor, Head of Department of Automatics and Telemechanics, Perm National Research Polytechnic University

E-mail: uz@at.pstu.ru

Abstract:

The article consists of an introduction, four main sections, and a conclusion. The introduction substantiates the relevance of the study.

The first section presents the results of previous studies on selecting a neural network architecture and improving the recognition accuracy of the selected network. It also describes the shortcomings of this network that the authors' own neural network is intended to eliminate.

The second section describes the idea of developing an original neural network based on transferring knowledge about the environment.

The third section outlines two approaches to developing the authors' own neural network. For the first approach, the proposed network architecture and recognition results are described, and its shortcomings are identified; the second approach eliminates these shortcomings.

The fourth section describes the conditions for conducting experiments with the developed neural network architecture and presents the results of these experiments.

The conclusion summarizes the results of the study and identifies directions for further research aimed at increasing classification accuracy.
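The knowledge-transfer idea mentioned in the abstract can be illustrated with a minimal sketch. This is not the article's actual architecture; all names, shapes, and the synthetic data below are hypothetical. The sketch emulates transfer of knowledge as a frozen, pre-trained feature extractor whose outputs feed a small classifier head that alone is trained on the new task.

```python
import numpy as np

rng = np.random.default_rng(0)

def pretrained_features(x, W_frozen):
    """Frozen 'source-domain' layer: its weights are never updated."""
    return np.maximum(0.0, x @ W_frozen)  # ReLU features

# Frozen weights stand in for knowledge carried over from a source network.
W_frozen = rng.standard_normal((4, 8))

# Trainable head for the new (target) task: logistic regression on the features.
W_head = np.zeros(8)

def predict(x):
    logits = pretrained_features(x, W_frozen) @ W_head
    return 1.0 / (1.0 + np.exp(-logits))

# Tiny synthetic target-domain dataset (label is a simple linear rule).
X = rng.standard_normal((64, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# Train only the head; the transferred features stay fixed.
lr = 0.1
for _ in range(200):
    p = predict(X)
    grad = pretrained_features(X, W_frozen).T @ (p - y) / len(y)
    W_head -= lr * grad

accuracy = float(np.mean((predict(X) > 0.5) == y))
```

The design point the sketch conveys is that only `W_head` receives gradient updates, so the cost of adapting to the new task is a fraction of training the full network.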

Pages: 5-13
References
  1. Cherdak: nauka, tekhnologii, budushchee [Electronic resource]. URL: https://chrdk.ru/news/podschitano-obshchee-kolichestvoplastika (accessed: 16.12.2018).
  2. Raspberry Pi – Teach, learn, and make with Raspberry Pi [Electronic resource]. URL: https://www.raspberrypi.org/ (accessed: 16.12.2018).
  3. Akhmetzyanov K.R., Yuzhakov A.A. Sravnenie svertochnykh nejronnykh setej dlya zadach sortirovki musornykh otkhodov. Izvestiya SPbGETU LETI. 2018. No. 6. P. 27–32.
  4. Krizhevsky A., Sutskever I., Hinton G.E. ImageNet classification with deep convolutional neural networks. Proceedings of the 25th International Conference on Neural Information Processing Systems (NIPS'12). Curran Associates Inc. Lake Tahoe, Nevada, USA. 2012. V. 1. P. 1106–1114.
  5. Iandola F.N., Han S., Moskewicz M.W., Ashraf Kh., Dally W.J., Keutzer K. SqueezeNet: AlexNet-level accuracy with 50x fewer parameters and <0.5 MB model size. arXiv preprint arXiv:1602.07360. 2016 [Electronic resource]. URL: https://arxiv.org/pdf/1602.07360.pdf (accessed: 16.12.2018).
  6. Howard A.G., Zhu M., Chen B., Kalenichenko D., Wang W., Weyand T., Andreetto M., Adam H. MobileNets: Efficient convolutional neural networks for mobile vision applications. arXiv preprint arXiv:1704.04861. 2017 [Electronic resource]. URL: https://arxiv.org/pdf/1704.04861.pdf (accessed: 16.12.2018).
  7. Jia Y., Shelhamer E., Donahue J., Karayev S., Long J., Girshick R., Guadarrama S., Darrell T. Caffe: Convolutional architecture for fast feature embedding. arXiv preprint arXiv:1408.5093. 2014 [Electronic resource]. URL: https://arxiv.org/pdf/1408.5093.pdf (accessed: 16.12.2018).
  8. UKBench Dataset [Electronic resource]. URL: https://archive.org/details/ukbench (accessed: 30.12.2017).
  9. Akhmetzyanov K.R., Yuzhakov A.A. Uvelichenie tochnosti svertochnoj nejronnoj seti za schet vozrastaniya kolichestva dannykh. Nejrokomp'yutery: razrabotka, primenenie. 2018. No. 7. P. 14–19.
  10. Wang J., Perez L. The effectiveness of data augmentation in image classification using deep learning [Electronic resource]. URL: http://cs231n.stanford.edu/reports/2017/pdfs/300.pdf (accessed: 16.12.2018).
  11. Vasconcelos C.N., Vasconcelos B.N. Convolutional neural network committees for melanoma classification with classical and expert knowledge base image transforms data augmentation. arXiv preprint arXiv:1702.07025. 2017 [Electronic resource]. URL: https://arxiv.org/pdf/1702.07025.pdf (accessed: 16.12.2018).
  12. Zhong Z., Zheng L., Kang G., Li S., Yang Y. Random erasing data augmentation. arXiv preprint arXiv:1708.04896. 2017 [Electronic resource]. URL: https://arxiv.org/pdf/1708.04896.pdf (accessed: 16.12.2018).
  13. Krizhevsky A., Sutskever I., Hinton G.E. ImageNet classification with deep convolutional neural networks. Proceedings of the 25th International Conference on Neural Information Processing Systems (NIPS'12). Curran Associates Inc. Lake Tahoe, Nevada, USA. 2012. V. 1. P. 1106–1114.
  14. Simonyan K., Zisserman A. Very deep convolutional networks for large-scale image recognition. arXiv preprint arXiv:1409.1556. 2014 [Electronic resource]. URL: https://arxiv.org/pdf/1409.1556.pdf (accessed: 16.12.2018).
  15. Sabour S., Frosst N., Hinton G.E. Dynamic routing between capsules. arXiv preprint arXiv:1710.09829v2. 2017 [Electronic resource]. URL: https://arxiv.org/pdf/1710.09829.pdf (accessed: 16.12.2018).
  16. Hinton G.E., Sabour S., Frosst N. Matrix capsules with EM routing. Proceedings of the 6th International Conference on Learning Representations (ICLR'2018). Vancouver, Canada. 2018 [Electronic resource]. URL: https://openreview.net/pdf?id=HJWLfGWRb (accessed: 16.12.2018).
  17. Long M., Chu H., Wang J., Jordan M.I. Unsupervised domain adaptation with residual transfer networks. arXiv preprint arXiv:1602.04433v2. 2017 [Electronic resource]. URL: https://arxiv.org/pdf/1602.04433.pdf (accessed: 16.12.2018).
  18. Yan H., Ding Y., Li P., Wang Q., Xu Y., Zuo W. Mind the class weight bias: Weighted maximum mean discrepancy for unsupervised domain adaptation. arXiv preprint arXiv:1705.00609v1. 2017 [Electronic resource]. URL: https://arxiv.org/pdf/1705.00609.pdf (accessed: 16.12.2018).
  19. Long M., Cao Y., Wang J., Jordan M.I. Learning transferable features with deep adaptation networks. arXiv preprint arXiv:1502.02791v2. 2015 [Electronic resource]. URL: https://arxiv.org/pdf/1502.02791.pdf (accessed: 16.12.2018).
  20. Long M., Zhu H., Wang J., Jordan M.I. Deep transfer learning with joint adaptation networks. arXiv preprint arXiv:1605.06636v2. 2017 [Electronic resource]. URL: https://arxiv.org/pdf/1605.06636.pdf (accessed: 16.12.2018).
  21. Hochreiter S., Schmidhuber J. Long short-term memory. Neural Computation. 1997. V. 9. No. 8. P. 1735–1780. DOI: 10.1162/neco.1997.9.8.1735.
  22. Li F.-F., Fergus R., Perona P. One-shot learning of object categories. IEEE Transactions on Pattern Analysis and Machine Intelligence. 2006. V. 28. No. 4. P. 594–611.
  23. Koch G., Zemel R., Salakhutdinov R. Siamese neural networks for one-shot image recognition [Electronic resource]. URL: https://www.cs.cmu.edu/~rsalakhu/papers/oneshot1.pdf (accessed: 16.12.2018).
Date of receipt: June 27, 2019.