Please use this identifier to cite or link to this item:
http://dspace.opu.ua/jspui/handle/123456789/15317
Title: | REDUCING THE VOLUME OF COMPUTATIONS WHEN BUILDING ANALOGS OF NEURAL NETWORKS FOR THE FIRST STAGE OF AN ENSEMBLE CLASSIFIER WITH STACKING |
Authors: | Galchonkov, O.; Baranov, O.; Chervonenko, P.; Babilunga, O. |
Keywords: | multilayer perceptron; neural network; ensemble classifier; weighting coefficients; classification of objects in images |
Issue Date: | 2024 |
Citation: | Galchonkov O. REDUCING THE VOLUME OF COMPUTATIONS WHEN BUILDING ANALOGS OF NEURAL NETWORKS FOR THE FIRST STAGE OF AN ENSEMBLE CLASSIFIER WITH STACKING / O. Galchonkov, O. Baranov, P. Chervonenko, O. Babilunga // Eastern-European Journal of Enterprise Technologies. – 2024. – 2(9-128). – P. 27–35. |
Abstract: | The object of research in this work is ensemble classifiers with stacking, intended for the classification of objects in images when only small sets of labeled data are available for training. To improve the quality of classification at the first stage of such a classifier, it is necessary to include more primary classifiers that differ in heterogeneous structured processing. However, the number of known neural networks with appropriate characteristics is limited. One approach to solving this problem is to build analogs of known neural networks that make classification errors on different images than the base network. The disadvantage of the known methods for constructing such analogs is the need to perform additional floating-point operations. The current paper proposes and investigates a new method for forming analogs through random cyclic shifts of the rows or columns of input images. This has made it possible to completely eliminate additional floating-point operations. The effectiveness of this method is explained by the structured processing of input images in the base neural networks. The use of analogs obtained by the proposed method does not impose additional restrictions in practice, because heterogeneity of structured processing in the base neural networks is a typical requirement for them in an ensemble classifier with stacking. Simulation on the CIFAR-10 data set demonstrated that the proposed technique for constructing analogs provides comparable classification quality for the ensemble classifier. Using MLP-Mixer analogs provided an improvement of 4.6 %, and CCT analogs an improvement of 5.9 %. |
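The core idea in the abstract, forming an analog of a base network by feeding it cyclically shifted copies of the input images, can be illustrated with a minimal sketch. The sketch below assumes NumPy image batches of shape (N, H, W, C) and a Keras-style `base_model.predict`; the function names, shift range, and the way first-stage outputs are collected are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch (assumptions noted above), not the authors' implementation.
import numpy as np

def make_analog_input(images, shift, axis):
    """Cyclically shift every image by `shift` positions along `axis`
    (1 = rows, 2 = columns). np.roll only rearranges data in memory,
    so no additional floating-point operations are introduced."""
    return np.roll(images, shift=shift, axis=axis)

rng = np.random.default_rng(0)

# Each analog gets its own random, fixed (shift, axis) pair; range 1..31 is
# an assumption matching 32x32 CIFAR-10 images.
analog_shifts = [(int(rng.integers(1, 32)), int(rng.choice([1, 2])))
                 for _ in range(3)]

def first_stage_outputs(base_model, images):
    """Collect first-stage predictions: the base network on the original
    images plus one prediction per shifted 'analog' view. The stacked
    result would be fed to the second-stage (stacking) classifier."""
    outputs = [base_model.predict(images)]
    for shift, axis in analog_shifts:
        outputs.append(base_model.predict(make_analog_input(images, shift, axis)))
    return np.stack(outputs, axis=0)
```

Because the shifts only permute pixel positions, each analog sees a differently structured view of the same data, which is what makes its classification errors fall on different samples than those of the base network.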
URI: | http://dspace.opu.ua/jspui/handle/123456789/15317
Appears in Collections: | 2024
Files in This Item:
File | Description | Size | Format
---|---|---|---
Reducing_the_volume_of_computations_when_building_.pdf | | 573.56 kB | Adobe PDF