A Hybrid Flip-and-Hide Data Augmentation Strategy for Improving Convolutional Neural Network Performance in Low-Resource Image Classification

Abstract

Deep learning systems rely heavily on large and diverse datasets to achieve strong generalization performance. However, many real-world applications—particularly in developing regions—lack access to extensive image datasets, resulting in reduced model accuracy and increased overfitting. This research proposes a novel hybrid data augmentation strategy, termed Flip-and-Hide, designed to support effective training of Convolutional Neural Networks (CNNs) in low-resource environments. The technique combines two complementary transformations: spatial flipping and region hiding. Spatial flipping introduces geometric variability, while region hiding selectively masks random patches within an image to improve model robustness to occlusions and background noise. We evaluate the Flip-and-Hide technique on two popular benchmark datasets—Fashion-MNIST and CIFAR-10—using a baseline CNN architecture, comparing the results against three experimental conditions: (i) no augmentation, (ii) flipping only, and (iii) Hide-and-Seek region masking. Experimental findings show that the combined approach delivers more stable learning with reduced validation loss, while achieving an accuracy improvement of up to 4.8% over models trained with no augmentation. Statistical significance is confirmed through repeated trials across multiple epochs. This research demonstrates that simple yet strategically combined augmentations can significantly enhance image classification performance without additional dataset acquisition or computational overhead. The proposed technique offers a practical and scalable solution for institutions and researchers operating in data-constrained settings, particularly within developing countries where access to high-volume datasets remains limited.
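The two transformations described above can be sketched as a single augmentation function. This is a minimal illustration, not the authors' implementation: the parameter names (`flip_prob`, `grid`, `hide_prob`, `fill`) and their default values are assumptions, chosen to mirror the abstract's description of a horizontal flip combined with Hide-and-Seek-style grid-patch masking.

```python
import numpy as np

def flip_and_hide(image, rng, flip_prob=0.5, grid=4, hide_prob=0.25, fill=0.0):
    """Sketch of a Flip-and-Hide augmentation (parameters are illustrative).

    image: H x W (or H x W x C) float array.
    rng:   a numpy Generator, so results are reproducible.
    """
    out = image.copy()
    # Spatial flipping: mirror the image left-right with probability flip_prob.
    if rng.random() < flip_prob:
        out = out[:, ::-1].copy()
    # Region hiding: split the image into a grid x grid lattice of patches
    # and replace each patch with a fill value with probability hide_prob,
    # simulating occlusion (Hide-and-Seek-style masking).
    h, w = out.shape[:2]
    ph, pw = h // grid, w // grid
    for i in range(grid):
        for j in range(grid):
            if rng.random() < hide_prob:
                out[i * ph:(i + 1) * ph, j * pw:(j + 1) * pw] = fill
    return out
```

In a training pipeline, such a function would typically be applied per sample at batch-construction time, so each epoch sees a different random combination of flips and hidden regions at no extra data-storage cost.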

Authors

    Nagaye Wisdom, Yaw Afriyie
