YOLOv12-based Detection for Early Breast Cancer Screening with a Portable Ultrasound System
DOI: https://doi.org/10.15282/mekatronika.v7i1.12523
Keywords: Breast cancer, early screening, portable ultrasound, YOLO, object detection
Abstract
This study explores the application of deep learning to breast lesion detection in ultrasound images using the YOLOv12 object detection model. Leveraging a compact wireless ultrasound probe and an Android-based inference pipeline, the system was developed to enable portable, AI-assisted screening in resource-limited settings. The model was trained and evaluated on an annotated ultrasound dataset and compared against its predecessor, YOLOv11. YOLOv12 achieved a mean average precision (mAP) of 90.6% and an F1 score of 88.65%, outperforming YOLOv11 in both accuracy and inference speed. Processing time was also reduced, with YOLOv12 achieving per-image detection times between 1.34 and 4.42 seconds, compared to YOLOv11's slower range. These results support YOLOv12's suitability for real-time deployment on mobile platforms. Visual analyses across several test images show that YOLOv12 offers more consistent detection across varying lesion sizes and positions. The system's lightweight design, combined with its robust performance, makes it a promising tool for expanding diagnostic access in rural and underserved regions. Future work will focus on multiclass lesion classification, expanded datasets, and clinical usability studies to further validate its application in real-world healthcare environments.
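The F1 score reported above is the harmonic mean of detection precision and recall. A minimal sketch of that calculation, using illustrative precision/recall values only (the study's underlying per-class counts are not given here):

```python
def f1_score(precision: float, recall: float) -> float:
    """F1 as the harmonic mean of precision and recall,
    the metric reported alongside mAP in the evaluation."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Illustrative (hypothetical) values chosen to land near the
# reported F1 of 88.65%; not the study's actual measurements.
print(round(f1_score(0.90, 0.8735), 4))
```

Because F1 is a harmonic mean, it is pulled toward the weaker of the two quantities, so a detector cannot trade a very low recall for high precision (or vice versa) without the score dropping.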
License
Copyright (c) 2025 The Author(s)

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.

