Please use this identifier to cite or link to this item: http://dspace.opu.ua/jspui/handle/123456789/15233
Full metadata record
DC Field	Value	Language
dc.contributor.author	Galchonkov, O.	-
dc.contributor.author	Baranov, O.	-
dc.contributor.author	Antoshchuk, S.	-
dc.contributor.author	Maslov, O.	-
dc.contributor.author	Babych, M.	-
dc.date.accessioned	2025-05-20T19:11:40Z	-
dc.date.available	2025-05-20T19:11:40Z	-
dc.date.issued	2024	-
dc.identifier.citation	Galchonkov O. DEVELOPMENT OF A NEURAL NETWORK WITH A LAYER OF TRAINABLE ACTIVATION FUNCTIONS FOR THE SECOND STAGE OF THE ENSEMBLE CLASSIFIER WITH STACKING / O. Galchonkov, O. Baranov, S. Antoshchuk, O. Maslov, M. Babych // Eastern-European Journal of Enterprise Technologies. 2024. 5(9(131)). P. 6–13.	en
dc.identifier.uri	http://dspace.opu.ua/jspui/handle/123456789/15233	-
dc.description.abstract	One of the promising directions for improving the quality of object recognition in images and for parallelizing computations is the use of ensemble classifiers with stacking. A neural network at the second stage makes it possible to achieve a resulting classification quality significantly higher than that of each first-stage network separately. The classification quality of the entire ensemble classifier with stacking depends on the efficiency of the first-stage neural networks, their number, and the classification quality of the second-stage neural network. This paper proposes a neural network architecture for the second stage of the ensemble classifier that combines the approximating properties of traditional neurons and trainable activation functions. Gaussian radial basis functions (RBFs), summed with trainable weights, were chosen to implement the trainable activation functions. Experimental studies showed that, on the CIFAR-10 data set, the best results are obtained with six RBFs. A comparison with a multilayer perceptron (MLP) in the second stage showed a reduction in classification errors of 0.45–1.9 %, depending on the number of first-stage neural networks. At the same time, the proposed second-stage neural network architecture had 1.69–3.7 times fewer trainable coefficients than the MLP. This result is explained by the fact that using an output layer of ordinary neurons made it possible not to introduce many trainable activation functions for each output signal of the first stage, but to limit the architecture to only one. Since the results were obtained on the universal CIFAR-10 data set, a similar effect can be expected on a large number of similar practical data sets.	en
dc.language.iso	en_US	en
dc.subject	multilayer perceptron	en
dc.subject	neural network	en
dc.subject	ensemble classifier	en
dc.subject	weighting coefficients	en
dc.subject	classification of objects in images	en
dc.title	DEVELOPMENT OF A NEURAL NETWORK WITH A LAYER OF TRAINABLE ACTIVATION FUNCTIONS FOR THE SECOND STAGE OF THE ENSEMBLE CLASSIFIER WITH STACKING	en
dc.type	Article	en
opu.citation.firstpage	6	en
opu.citation.lastpage	13	en
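The abstract describes a second-stage layer whose activation functions are trainable sums of Gaussian RBFs. As a minimal NumPy sketch of that idea (the parameterization, initialization, and names here are illustrative assumptions, not the paper's exact implementation):

```python
import numpy as np

def rbf_activation(x, weights, centers, widths):
    """Trainable activation: a weighted sum of K Gaussian RBFs.

    x: (n,) pre-activation values; weights, centers, widths: (K,)
    learnable parameters of the K Gaussian components.
    """
    # Broadcast each input against the K RBF centers: (n, K)
    diffs = x[:, None] - centers[None, :]
    rbfs = np.exp(-(diffs ** 2) / (2.0 * widths[None, :] ** 2))
    # Sum the K components with their learned weights: (n,)
    return rbfs @ weights

rng = np.random.default_rng(0)
K = 6  # the abstract reports six RBFs gave the best CIFAR-10 results
weights = rng.normal(size=K)          # learned mixing weights
centers = np.linspace(-2.0, 2.0, K)   # assumed evenly spaced initial centers
widths = np.full(K, 0.5)              # assumed common initial width

x = np.array([-1.0, 0.0, 1.0])
y = rbf_activation(x, weights, centers, widths)
```

In training, `weights` (and optionally `centers` and `widths`) would be updated by backpropagation alongside the ordinary output-layer neurons mentioned in the abstract.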
Appears in Collections: 2024

Files in This Item:
File	Description	Size	Format
311778-Article Text-726512-1-10-20241023.pdf		871.04 kB	Adobe PDF


All items in this digital repository are protected by copyright, with all rights reserved.