A.N. Kolesenkov – Ph. D. (Eng.), Associate Professor, RGRTU. E-mail: email@example.com
Y.V. Konkin – Ph. D. (Eng.), Associate Professor, RGRTU. E-mail: firstname.lastname@example.org
According to Takens' theorem, future values of a nonlinear time series (NTS) depend on its previous values, so it is advisable to use neural network methods to solve the NTS forecasting problem.
One of the most popular examples of NTS prediction is forecasting the behavior of currency and equity markets: a sequence of quotations over a certain period of time is a classic example of an NTS.
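The delay-embedding idea behind Takens' theorem can be sketched as follows: each window of past values becomes one training input, and the next value becomes its target. This is a minimal illustrative sketch (the function name, window size, and toy series are assumptions, not the authors' code):

```python
import numpy as np

def delay_embed(series, window):
    """Build (input, target) training pairs from a 1-D time series:
    each window of `window` past values predicts the next value,
    following the delay-embedding idea behind Takens' theorem."""
    series = np.asarray(series, dtype=float)
    X = np.array([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X, y

# Example: quotations for 8 time steps, window of 3 past values
X, y = delay_embed([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0], window=3)
# X[0] = [1, 2, 3] is paired with target y[0] = 4, and so on
```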
An algorithm for NTS prediction using neural networks is proposed.
At the second stage, preprocessing of the initial data involves scaling and forming an array for network training. Scaling is performed by finding the smallest and largest values and computing a scaling factor from them.
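The scaling step described above (find the minimum and maximum, derive a scaling factor) can be sketched as min-max scaling; the target range [-1, 1] and function names here are assumptions chosen to match a tanh-activated network:

```python
import numpy as np

def minmax_scale(series, lo=-1.0, hi=1.0):
    """Scale a series into [lo, hi]: find the smallest and largest
    values and compute a scaling factor from them."""
    series = np.asarray(series, dtype=float)
    s_min, s_max = series.min(), series.max()
    factor = (hi - lo) / (s_max - s_min)   # the scaling factor
    return lo + (series - s_min) * factor, s_min, factor

def minmax_unscale(scaled, s_min, factor, lo=-1.0):
    """Invert the scaling to map network outputs back to quotations."""
    return s_min + (np.asarray(scaled) - lo) / factor
```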
At the third stage, the neural network is configured for prediction: its architecture is described by specifying the number of neurons in the input layer, hidden layer, and output; parameters such as the number of training epochs are defined; and the neural network is initialized.
The network architecture typically depends on the task; in most cases this study used a neural network of the «multilayer perceptron» class.
When a multilayer perceptron is used, the task of building the network architecture reduces to selecting and comparatively analyzing different neuron activation functions, as well as different training methods.
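The configuration described above (input, hidden, and output layer sizes plus a selectable activation function) can be sketched as follows; the initialization scale, function names, and the activation table are illustrative assumptions:

```python
import numpy as np

# Selectable activations: (function, derivative in terms of the activation value)
ACTIVATIONS = {
    "tanh":    (np.tanh,                       lambda a: 1.0 - a**2),
    "sigmoid": (lambda x: 1.0 / (1.0 + np.exp(-x)), lambda a: a * (1.0 - a)),
}

def init_mlp(n_in, n_hidden, n_out=1, seed=0):
    """Initialize weights for an input -> hidden -> output perceptron."""
    rng = np.random.default_rng(seed)
    return {
        "W1": rng.normal(0, 0.5, (n_in, n_hidden)),
        "b1": np.zeros(n_hidden),
        "W2": rng.normal(0, 0.5, (n_hidden, n_out)),
        "b2": np.zeros(n_out),
    }

def forward(net, X, act="tanh"):
    """One forward pass: nonlinear hidden layer, linear output for regression."""
    f, _ = ACTIVATIONS[act]
    h = f(X @ net["W1"] + net["b1"])
    return h @ net["W2"] + net["b2"]
```

Keeping the output layer linear while varying only the hidden-layer activation makes the tanh-vs-sigmoid comparison discussed below a one-parameter switch.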
As the neuron activation function for time series analysis, it was decided to use the hyperbolic tangent or the sigmoid function.
Experiments with multilayer perceptrons having different numbers of neurons in the hidden layer showed that convergence of the back-propagation algorithm to a mean square error of 0.001 within at most 10000 iterations is guaranteed only when the hyperbolic tangent is used as the neuron activation function.
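A self-contained sketch of such an experiment is given below: a one-hidden-layer perceptron with tanh activations trained by gradient descent with back-propagation until the mean square error drops below 0.001 or 10000 iterations are reached. The toy series, layer sizes, and learning rate are assumptions for illustration, not the authors' experimental settings:

```python
import numpy as np

rng = np.random.default_rng(42)
t = np.arange(120, dtype=float)
series = np.sin(0.2 * t)                      # toy stand-in for an NTS
window = 5
X = np.array([series[i:i + window] for i in range(len(series) - window)])
y = series[window:].reshape(-1, 1)

n_hidden, lr = 8, 0.05
W1 = rng.normal(0, 0.5, (window, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.5, (n_hidden, 1));      b2 = np.zeros(1)

for epoch in range(10000):
    h = np.tanh(X @ W1 + b1)                  # hidden layer, tanh activation
    out = h @ W2 + b2                         # linear output
    err = out - y
    mse = float(np.mean(err**2))
    if mse < 0.001:                           # convergence criterion
        break
    # Back-propagation of the error through both layers
    d_out = 2.0 * err / len(X)
    d_h = (d_out @ W2.T) * (1.0 - h**2)       # derivative of tanh
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)
```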
Applying neural networks to the NTS forecasting problem showed that artificial neural networks have the lowest mean absolute error compared with the other prediction algorithms investigated, as well as higher performance.