Journal Neurocomputers №2, 2021
Article in issue:
Implementation aspects of neural arithmetic adder
Type of article: scientific article
DOI: https://doi.org/10.18127/j19998554-202102-01
UDC: 004.896
Authors:

A.V. Demidovskij, E.A. Babkin

Higher School of Economics (Nizhny Novgorod, Russia)

Abstract:

The construction of integrated sub-symbolic systems is an important scientific direction in which the expression of symbolic rules in the form of a neural network plays a key role. At the same time, there is a pressing task of creating neural architectures that solve complex intellectual tasks without a preliminary training stage, in order to model computational processes in prospective massively parallel computational environments. The first step towards solving this task is the creation of a neural network capable of producing an exact solution for a specifically selected motivating problem that incorporates various intellectual operations on symbolic structures. The task of Multi-Attribute Linguistic Decision Making is an appropriate example of such a motivating problem. Linguistic assessment aggregation is a key element of fuzzy decision-making models and includes several stages: translation of assessments from the 2-tuple form to a numerical representation, application of aggregation operators, and reverse translation of the numerical results into 2-tuple structures. Linguistic assessments are encoded and decoded according to the rules defined by the Tensor Representations framework. The current work continues the authors' research dedicated to the implementation of arithmetic operations in neural form. A neural design is proposed that performs the arithmetic sum of two numbers encoded with Tensor Representations, without a training stage. The proposed method was implemented and analyzed with the help of the Keras framework. The design of a neural primitive that takes distributed representations of symbolic structures as input confirms the hypothesis that various symbolic rules, such as the aggregation of linguistic assessments during decision making, can be expressed in the form of neural architectures. The proposed primitive is based on the analysis of arbitrary symbolic structures and can be used as a neural adder. Such a network is easy to extend and maintain, and there is considerable potential for reusing it to implement other sub-symbolic operations at the neural level. Moreover, the generative approach to network creation, which manipulates structures at the tensor level, can be applied in a wide range of cognitive systems.
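As a minimal illustration of the stages named above, the following Python sketch (not the implementation from the paper; the one-hot codes, dimensions, and function names are assumptions for the example) shows the standard 2-tuple translation functions, tensor-product binding of a filler to a role in the sense of Smolensky's framework [14], and a Keras model whose summation weights are fixed by construction rather than learned:

    # A minimal sketch of the building blocks described in the abstract.
    # Assumptions: one-hot fillers/roles and the element-wise Add layer
    # stand in for the actual distributed codes and for the carry-aware
    # arithmetic primitive designed in the paper.
    import numpy as np
    from tensorflow import keras

    def to_2tuple(beta):
        """Delta: numeric value beta -> 2-tuple (index of the closest
        linguistic term, symbolic translation alpha in [-0.5, 0.5))."""
        i = int(round(beta))
        return i, beta - i

    def from_2tuple(i, alpha):
        """Delta^-1: 2-tuple (s_i, alpha) -> its numeric value."""
        return i + alpha

    # Tensor-product binding of a filler (digit) to a positional role:
    filler = np.eye(10)[7]             # toy distributed code for digit 7
    role = np.eye(3)[0]                # role vector for position 0 (units)
    binding = np.outer(filler, role)   # Smolensky's binding f ⊗ r
    recovered = binding @ role         # exact unbinding for orthonormal roles

    # A two-input network whose weights are set by design, with no training:
    dim = 10
    a = keras.Input(shape=(dim,))
    b = keras.Input(shape=(dim,))
    adder = keras.Model([a, b], keras.layers.Add()([a, b]))
    result = adder.predict([filler[None, :], np.eye(10)[2][None, :]], verbose=0)

The element-wise Add layer above only illustrates the "constructed, not trained" principle; the carry-aware arithmetic sum over encoded digit structures is the subject of the primitive proposed in the paper.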

Pages: 5–14
For citation:

Demidovskij A.V., Babkin E.A. Implementation aspects of neural arithmetic adder. Neurocomputers. 2021. V. 23. № 2. P. 5–14. DOI: https://doi.org/10.18127/j19998554-202102-01 (in Russian).

References
  1. Besold T.R., Garcez A.D.A., Bader S., Bowman H., Domingos P., Hitzler P., de Penning L. Neural-symbolic learning and reasoning: A survey and interpretation. arXiv. cs. 1711.03902. 2017.
  2. Besold T.R., Kuhnberger K.U. Towards integrated neural–symbolic systems for human-level AI: Two research programs helping to bridge the gaps. Biologically Inspired Cognitive Architectures. 2015. V. 14. P. 97–110.
  3. Gallant S.I., Okaywe T.W. Representing objects, relations, and sequences. Neural Computation. 2013. V. 25. № 8. P. 2038–2078.
  4. Fodor J.A., Pylyshyn Z.W. Connectionism and cognitive architecture: A critical analysis. Cognition. 1988. V. 28. № 1–2. P. 3–71.
  5. Pinkas G., Lima P., Cohen S. Representing, binding, retrieving and unifying relational knowledge using pools of neural binders. Biologically Inspired Cognitive Architectures. 2013. V. 6. P. 87–95.
  6. Yousefpour A., Nguyen B.Q., Devic S., Wang G., Kreidieh A., Lobel H., Jue J.P. Failout: Achieving Failure-Resilient Inference in Distributed Neural Networks. arXiv. cs. 2002.07386. 2020.
  7. Wei C., Liao H. A multigranularity linguistic group decision-making method based on hesitant 2-tuple sets. International Journal of Intelligent Systems. 2016. V. 31. № 6. P. 612–634.
  8. Demidovskij A.V., Babkin E.A. Developing a distributed linguistic decision-making system. Business Informatics. 2019. V. 13. № 1. P. 18–32.
  9. Golmohammadi D. Neural network application for fuzzy multi-criteria decision making problems. International Journal of Production Economics. 2011. V. 131. № 2. P. 490–504.
  10. Demidovskij A. Implementation Aspects of Tensor Product Variable Binding in Connectionist Systems. In Proceedings of SAI Intelligent Systems Conference. 2019.
  11. Demidovskij A. Towards Automatic Manipulation of Arbitrary Structures in Connectivist Paradigm with Tensor Product Variable Binding. Studies in Computational Intelligence. 2019. V. 856. P. 97–110.
  12. Demidovskij A., Babkin E. Designing a Neural Network Primitive for Conditional Structural Transformations. Lecture Notes in Computer Science. 2020. V. 12412. P. 117–133.
  13. Pinkas G. Reasoning, nonmonotonicity and learning in connectionist networks that capture propositional knowledge. Artificial Intelligence. 1995. V. 77. № 2. P. 203–247.
  14. Smolensky P. Tensor product variable binding and the representation of symbolic structures in connectionist systems. Artificial Intelligence. 1990. V. 46. № 1–2. P. 159–216.
  15. Smolensky P., Legendre G. The harmonic mind: From neural computation to optimality-theoretic grammar (Cognitive architecture). V. 1. MIT press. 2006. 590 p.
  16. Smolensky P., Goldrick M., Mathis D. Optimization and quantization in gradient symbol systems: a framework for integrating the continuous and the discrete in cognition. Cognitive Science. 2014. V. 38. № 6. P. 1102–1138.
  17. Cho P.W., Goldrick M., Smolensky P. Incremental parsing in a continuous dynamical system: Sentence processing in Gradient Symbolic Computation. Linguistics Vanguard. 2017. V. 3. № 1. P. 1–9.
  18. McCoy R.T., Linzen T., Dunbar E., Smolensky P. RNNs Implicitly Implement Tensor Product Representations. arXiv. cs. 1812.08718. 2018.
  19. Soulos P., McCoy T., Linzen T., Smolensky P. Discovering the compositional structure of vector representations with role learning networks. arXiv. cs. 1910.09113. 2019.
  20. Huang Q., Smolensky P., He X., Deng L., Wu D. Tensor product generation networks for deep NLP modeling. arXiv. cs. 1709.09118. 2017.
  21. Palangi H., Smolensky P., He X., Deng L. Question-answering with grammatically-interpretable representations. In Thirty-Second AAAI Conference on Artificial Intelligence. 2018. P. 5350–5357.
  22. Demidovskij A. Comparative Analysis of MADM Approaches: ELECTRE, TOPSIS and Multi-level LDM Methodology. In Proceedings of 2020 XXIII International Conference on Soft Computing and Measurements (SCM). 2020. P. 190–193.
  23. Demidovskij A., Babkin E. Towards Designing Linguistic Assessments Aggregation as a Distributed Neuroalgorithm. In Proceedings of 2020 XXIII International Conference on Soft Computing and Measurements (SCM). 2020. P. 161–164.
  24. Browne A., Sun R. Connectionist inference models. Neural Networks. 2001. V. 14. P. 1331–1355.
  25. Demidovskij A. Automatic Construction of Tensor Product Variable Binding Neural Networks for Neural-Symbolic Intelligent Systems. In Proceedings of 2020 International Conference on Electrical, Communication, and Computer Engineering. 2020.
  26. Demidovskij A., Babkin E. Designing arithmetic neural primitive for sub-symbolic aggregation of linguistic assessments. J. Phys.: Conf. Ser. 2020. V. 1680. P. 1–7.
  27. Chollet F. Keras: Deep Learning framework. URL: https://keras.io
  28. de Penning H.L.H., Garcez A.S.D.A., Lamb L.C., Meyer J.J.C. A neural-symbolic cognitive agent for online learning and reasoning. In Twenty-Second International Joint Conference on Artificial Intelligence. 2011. P. 1653–1658.
Date of receipt: 16.02.2021
Approved after review: 04.03.2021
Accepted for publication: 15.03.2021