eONPUIR

Definition of the influence of the choice of the pruning procedure parameters on the quality of training of a multilayer perceptron.

dc.contributor.author Galchonkov, Oleg
dc.contributor.author Nevrev, Alexander
dc.contributor.author Shevchuk, Bohdan
dc.contributor.author Baranov, Nikolay
dc.date.accessioned 2022-12-08T12:44:13Z
dc.date.available 2022-12-08T12:44:13Z
dc.date.issued 2022
dc.identifier.citation Galchonkov, O., Nevrev, A., Shevchuk, B., Baranov, N. (2022). Definition of the influence of the choice of the pruning procedure parameters on the quality of training of a multilayer perceptron. Eastern-European Journal of Enterprise Technologies, 1 (9 (115)), 75–83. doi: https://doi.org/10.15587/1729-4061.2022.253103 en
dc.identifier.other UDC 681.3.07: 004.8
dc.identifier.other doi: https://doi.org/10.15587/1729-4061.2022.253103
dc.identifier.uri http://dspace.opu.ua/jspui/handle/123456789/13156
dc.description.abstract Pruning connections in a fully connected neural network removes redundancy in the network's structure and thus reduces the computational complexity of its implementation while preserving the classification performance on input images. However, the choice of the parameters of the pruning procedure has not been sufficiently studied, and it depends essentially on the configuration of the neural network. Since any neural network configuration contains one or more multilayer perceptrons, universal recommendations for choosing the pruning parameters can be developed for them. One of the most promising methods for practical implementation is considered: iterative pruning, which uses preprocessing of the input signals to regularize the learning process of the neural network. For a specific multilayer perceptron configuration and the MNIST (Modified National Institute of Standards and Technology) dataset, a database of handwritten digit samples proposed by the US National Institute of Standards and Technology as a standard for comparing image recognition methods, the dependences of the handwritten digit classification accuracy and the learning rate on the learning step, the pruning interval, and the number of connections removed at each pruning iteration were obtained. It is shown that the best set of parameters of the learning procedure with pruning improves classification quality by about 1 % compared with the worst set in the studied range. The convex nature of these dependences allows a constructive approach to finding a neural network configuration that provides the highest classification accuracy at the minimum computational cost during implementation. en
dc.language.iso en_US en
dc.publisher Eastern-European Journal of Enterprise Technologies en
dc.subject multilayer perceptron en
dc.subject neural network en
dc.subject pruning en
dc.subject learning curve en
dc.subject weight coefficients en
dc.subject image classification en
dc.title Definition of the influence of the choice of the pruning procedure parameters on the quality of training of a multilayer perceptron. en
dc.type Article in Scopus en
opu.citation.journal Eastern-European Journal of Enterprise Technologies en
opu.citation.volume 9(115) en
opu.citation.firstpage 75 en
opu.citation.lastpage 83 en
opu.citation.issue 1 en
opu.staff.id galchonkov@op.edu.ua en

