Please use this identifier to cite or link to this item: http://dspace.opu.ua/jspui/handle/123456789/13155
Title: Using a neural network in the second stage of the ensemble classifier to improve the quality of classification of objects in images
Authors: Galchonkov, Oleg
Babych, Mykola
Zasidko, Andrii
Poberezhnyi, Serhii
Keywords: multilayer perceptron
neural network
ensemble classifier
weights
classification of images
Issue Date: 2022
Publisher: Eastern-European Journal of Enterprise Technologies
Citation: Galchonkov, O., Babych, M., Zasidko, A., Poberezhnyi, S. (2022). Using a neural network in the second stage of the ensemble classifier to improve the quality of classification of objects in images. Eastern-European Journal of Enterprise Technologies, 3 (9 (117)), 15–21. doi: https://doi.org/10.15587/1729-4061.2022.258187
Abstract: Object recognition in images is used in many practical areas. Very often, progress in its application largely depends on the ratio between the quality of object recognition and the required amount of calculations. Recent advances in recognition are related to the development of computationally demanding neural network architectures that are trained on large data sets over a very long time on state-of-the-art computers. For many practical applications, it is not possible to collect such large datasets for training, and only computing machines with limited computing power can be used. Therefore, the search for solutions that meet these practical constraints is relevant. This paper reports an ensemble classifier that uses stacking in the second stage. The use of significantly different classifiers in the first stage and a multilayer perceptron in the second stage has made it possible to significantly improve the ratio between the quality of classification and the required volume of calculations when training on small data sets. The current study showed that using a multilayer perceptron in the second stage reduces the error compared to a second stage based on majority voting. On the MNIST dataset, the error reduction was 29‒39 %; on the CIFAR-10 dataset, 13‒17 %. A comparison of the proposed architecture of the ensemble classifier with a transformer-type classifier demonstrated a decrease in the volume of calculations together with a reduction in error. For the CIFAR-10 dataset, an error reduction of 8 % was achieved while requiring 22 times fewer calculations; for the MNIST dataset, the error reduction was 62 % with a 50-fold advantage in the volume of calculations.
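
A minimal sketch of the two-stage stacking scheme described in the abstract, using scikit-learn. The particular base classifiers, the digits toy dataset (standing in for MNIST), and all hyperparameters below are illustrative assumptions, not the configuration used by the authors; the sketch only contrasts an MLP meta-classifier with the majority-voting baseline mentioned in the abstract.

# Two-stage ensemble: diverse first-stage classifiers, MLP meta-classifier (stacking),
# compared against simple majority voting over the same first stage.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.ensemble import StackingClassifier, VotingClassifier, RandomForestClassifier
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# First stage: significantly different classifiers (hypothetical choices).
base_learners = [
    ("svm", SVC(probability=True, random_state=0)),
    ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
    ("knn", KNeighborsClassifier(n_neighbors=5)),
]

# Second stage A: a multilayer perceptron trained on the stacked class probabilities.
stacked = StackingClassifier(
    estimators=base_learners,
    final_estimator=MLPClassifier(hidden_layer_sizes=(64,), max_iter=2000, random_state=0),
    stack_method="predict_proba",
    cv=5,
)

# Second stage B: plain majority voting, the baseline the paper compares against.
voting = VotingClassifier(estimators=base_learners, voting="hard")

for name, clf in [("stacking + MLP", stacked), ("majority voting", voting)]:
    clf.fit(X_train, y_train)
    print(f"{name}: test accuracy = {clf.score(X_test, y_test):.4f}")

Error reductions of the magnitude reported in the abstract depend on the specific first-stage classifiers and datasets and should not be expected from this toy setup.
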
URI: http://dspace.opu.ua/jspui/handle/123456789/13155
Appears in Collections: Статті каф. ІС

Files in This Item:
File: 258187-Article Text-599354-1-10-20220630.pdf
Size: 328.15 kB
Format: Adobe PDF


All items in the electronic archive are protected by copyright, with all rights reserved.