Abstract:
The object of research in this work is stacking ensemble classifiers intended for classifying objects in images when only small sets of labeled training data are available. To improve classification quality, the first stage of such a classifier should include more primary classifiers that differ in how they perform structured processing. However, the number of known neural networks with suitable characteristics is limited. One approach to this problem is to build analogs of known neural networks that make classification errors on different images than the base network does. The disadvantage of the known methods for constructing such analogs is the need to perform additional floating-point operations.
The current paper proposes and investigates a new method that forms analogs through random cyclic shifts of the rows or columns of input images, which eliminates the additional floating-point operations entirely. The effectiveness of this method is explained by the structured processing of input images in the base neural networks. Using analogs obtained by the proposed method imposes no additional restrictions in practice, because heterogeneity of structured processing in the base networks is already a typical requirement for them in a stacking ensemble classifier.
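As a minimal sketch of the idea, the snippet below applies a random cyclic shift to either the rows or the columns of an input image; the function name, the randomization scheme, and the use of NumPy are illustrative assumptions rather than the authors' implementation. Note that such a shift only rearranges pixel data in memory and involves no floating-point arithmetic, consistent with the stated motivation.

```python
import numpy as np

def random_cyclic_shift(image: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """Return a copy of `image` (H, W, C) with its rows or columns
    cyclically shifted by a random non-zero offset.
    Illustrative sketch only, not the paper's exact procedure."""
    axis = int(rng.integers(0, 2))                # 0: shift rows, 1: shift columns
    offset = int(rng.integers(1, image.shape[axis]))  # non-zero shift amount
    return np.roll(image, shift=offset, axis=axis)

# Example: build a shifted "analog" view of a CIFAR-10-sized image (32x32x3)
rng = np.random.default_rng(0)
img = rng.random((32, 32, 3)).astype(np.float32)
shifted = random_cyclic_shift(img, rng)
```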
The simulation for the CIFAR-10 data set demonstrated that the proposed technique for constructing analogs yields comparable classification quality for the ensemble classifier. Using MLP-Mixer analogs improved classification quality by 4.6 %, and CCT analogs by 5.9 %.