INVESTIGATION OF CODE SMELLS IN ECLIPSE FRAMEWORK USING SONARQUBE: AN EMPIRICAL ANALYSIS

Authors

  • Simon Kawuma, Software and Informatics Engineering Department, Mbarara University of Science and Technology, Uganda
  • David Sabiiti Bamutura, Computer Science Department, Mbarara University of Science and Technology, Uganda
  • Aggrey Obbo, Software and Informatics Engineering Department, Mbarara University of Science and Technology, Uganda
  • Evarist Nabaasa, Computer Science Department, Mbarara University of Science and Technology, Uganda

DOI:

https://doi.org/10.15282/ijsecs.11.2.2025.14.0146

Keywords:

Eclipse, Public APIs, Internal APIs, Code Smell, Software Quality

Abstract

The Eclipse Framework provides both public and internal Application Programming Interfaces (APIs). Public APIs are widely supported and their use is encouraged, while internal APIs are considered immature and subject to frequent change. However, the quality of neither kind of API is guaranteed: many users report code smells, which can lead to application failures if left unresolved. Since some studies indicate that not all code smells can be easily fixed, users often face the choice of either addressing these issues themselves or abandoning the problematic APIs. To address this, we conducted an empirical investigation using the SonarQube static code analysis tool on 28 major Eclipse releases, aiming to identify code-smell-free APIs. Our study provides a dataset of 218K code-smell-free public APIs and 321K code-smell-free internal APIs. We found that 87.3% of public APIs and 91.5% of internal APIs in the analyzed releases are free from code smells, highlighting the value of these cleaner alternatives for application stability and long-term usability. Furthermore, we found that the number of code smells increases proportionally as the Eclipse Framework evolves. Across the studied releases, the average number of code smells per release is 147K and the average technical debt is 2,744 days. Results from this study can be used by both interface providers and users as a starting point to recognize code-smell-free interfaces and to estimate the effort needed to fix code smells in each version of Eclipse.
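The kind of per-release analysis described above is typically driven by a small SonarQube project configuration. The sketch below shows a minimal, hypothetical setup for scanning one Eclipse release against a local SonarQube server; the project key, source and binary paths, and server URL are illustrative placeholders, not values taken from the paper.

```properties
# sonar-project.properties — minimal configuration for one Eclipse release
# (illustrative sketch; keys like sonar.projectKey and sonar.sources are
# standard SonarQube analysis parameters, values here are placeholders)
sonar.projectKey=eclipse-release-4.2
sonar.projectName=Eclipse 4.2
sonar.sources=plugins
sonar.java.binaries=bin
sonar.sourceEncoding=UTF-8
```

With this file in the release's root directory, running `sonar-scanner -Dsonar.host.url=http://localhost:9000` uploads the analysis, after which the server's measures (including the "code smells" count and "technical debt" effort) can be read per release from the SonarQube UI or Web API.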

References

[1] Tourwé T, Mens T. Automated support for framework-based software. In: Proceedings of the International Conference on Software Maintenance (ICSM 2003). IEEE; 2003. p. 148–157.

[2] Konstantopoulos D, Marien J, Pinkerton M, Braude E. Best principles in the design of shared software. In: Proceedings of the 33rd Annual IEEE International Computer Software and Applications Conference (COMPSAC 2009). IEEE; 2009. Vol. 2. p. 287–292.

[3] Businge J, Kawuma S, Openja M, Bainomugisha E, Serebrenik A. How stable are Eclipse application framework internal interfaces? In: Proceedings of the 26th IEEE International Conference on Software Analysis, Evolution and Reengineering (SANER). IEEE; 2019. p. 117–127.

[4] de Rivières J. How to use the Eclipse APIs [Internet]. Eclipse Foundation; 2023 Dec [cited 2024 Jan]. Available from: https://www.eclipse.org/articles/article/?file=Article-API-Use/index.html

[5] jBPM Team. The jBPM APIs [Internet]. Red Hat; 2023 Dec [cited 2024 Jan]. Available from: https://docs.jboss.org/jbpm/v5.0/userguide/ch05.html#d0e2099

[6] Oracle. Why developers should not write programs that call “sun” packages [Internet]. Oracle; 2023 Dec [cited 2024 Jan]. Available from: https://www.oracle.com/java/technologies/faq-sun-packages.html

[7] Bechtold S, Herges R, Lang S, et al. JUnit 5 user guide [Internet]. JUnit; 2023 Dec [cited 2024 Jan]. Available from: https://junit.org/junit5/docs/current/user-guide/#api-evolution

[8] Eclipse Foundation. Provisional API guidelines [Internet]. Eclipse Wiki; 2024 Jan [cited 2024 Jan]. Available from: https://wiki.eclipse.org/Provisional_API_Guidelines

[9] Businge J, Serebrenik A, van den Brand M. Eclipse API usage: the good and the bad. Software Quality Journal. 2015;23:107–141.

[10] Hora A, Valente MT, Robbes R, Anquetil N. When should internal interfaces be promoted to public? In: Proceedings of the 24th ACM SIGSOFT International Symposium on Foundations of Software Engineering. ACM; 2016. p. 278–289.

[11] Businge J, Serebrenik A, van den Brand M. Analyzing the Eclipse API usage: putting the developer in the loop. In: Proceedings of the 17th European Conference on Software Maintenance and Reengineering. IEEE; 2013. p. 37–46.

[12] Johannes D, Khomh F, Antoniol G. A large-scale empirical study of code smells in JavaScript projects. Software Quality Journal. 2019;27:1271–1314.

[13] Gupta A, Suri B, Misra S. Code bad smells in Java source code: a systematic literature review. In: Proceedings of ICCSA 2017. Springer; 2017. p. 665–682.

[14] Walker A, Das D, Cerny T. Automated code-smell detection in microservices through static analysis: a case study. Applied Sciences. 2020;10(21):7800.

[15] Martins ADF, Melo CS, Monteiro JM, Machado JC. Class change proneness prediction using software metrics and code smells. In: Proceedings of ICEIS. 2020. p. 140–147.

[16] Alfadel M, Aljasser K, Alshayeb M. Relationship between design patterns and code smells: an empirical study. PLoS One. 2020;15(4):e0231731.

[17] Li F, Zhang M, Wang S, Zhang J, Gu Q. On the relative value of imbalanced learning for code smell detection. Journal of Systems and Software. 2023;198:111566.

[18] Paiva T, Damasceno A, Figueiredo E, Sant’Anna C. On the evaluation of code smells and detection tools. Journal of Software Engineering Research and Development. 2017;5:1–28.

[19] Gradišnik M, Hericko M. Impact of code smells on defect rate: a literature review. CEUR Workshop Proceedings. 2018;2217:27–30.

[20] Menshawy RS, Yousef AH, Salem A. Code smells and detection techniques: a survey. In: Proceedings of MIUCC 2021. IEEE; 2021. p. 78–83.

[21] Mansoor U, Kessentini M, Maxim BR, Deb K. Multi-objective code-smells detection using good and bad design examples. Software Quality Journal. 2017;25:529–552.

[22] Doğan E, Tüzün E. Towards a taxonomy of code review smells. Information and Software Technology. 2022;142:106737.

[23] Tahir A, Yamashita A, Licorish S, Dietrich J, Counsell S. How developers discuss code smells on Stack Overflow. In: Proceedings of EASE 2018. ACM; 2018. p. 68–78.

[24] Campbell A. SonarQube documentation [Internet]. 2022 Jan [cited 2024 Jan]. Available from: https://scm.thm.de/sonar/documentation/user-guide/metric-definitions/

[25] Kawuma S, Businge J, Bainomugisha E. Stable alternatives for unstable Eclipse interfaces. In: Proceedings of ICPC 2016. IEEE; 2016. p. 1–10.

[26] Kawuma S, Nabaasa E. Identification of promoted Eclipse unstable interfaces using clone detection technique. 2018.

[27] Guerrouj L, Azad A, Antoniol G, Guéhéneuc YG. Investigating the relation between lexical smells and change- and fault-proneness. Software Quality Journal. 2017;25:641–670.

[28] Seref B, Tanriover O. Software code maintainability: a literature review. International Journal of Software Engineering and Applications. 2016;7(3):1–16.

[29] Jafari AJ, Costa DE, Abdalkareem R, Shihab E, Tsantalis N. Dependency smells in JavaScript projects. IEEE Transactions on Software Engineering. 2021;48(10):3790–3807.

[30] Lacerda G, Petrillo F, Pimenta M, Guéhéneuc YG. Code smells and refactoring: a tertiary systematic review. Journal of Systems and Software. 2020;167:110610.

[31] Agnihotri M, Chug A. Software metrics, code smells and refactoring techniques: a systematic survey. International Journal of Information Processing Systems. 2020;16(4):915–934.

[32] dos Santos HM, Durelli VH, Souza M, Figueiredo E, Silva LT, Durelli RS. Cleangame: gamifying the identification of code smells. In: Proceedings of the Brazilian Symposium on Software Engineering. 2019. p. 437–446.

[33] Meananeatra P. Identifying refactoring sequences for improving software maintainability. In: Proceedings of ASE 2012. IEEE; 2012. p. 406–409.

[34] Taibi D, Janes A, Lenarduzzi V. How developers perceive smells in source code. Information and Software Technology. 2017;92:223–235.

[35] Yamashita A, Moonen L. Do developers care about code smells? In: Proceedings of WCRE 2013. IEEE; 2013. p. 242–251.

[36] Kim DJ. An empirical study on the evolution of test smells. In: Proceedings of ICSE Companion 2020. IEEE; 2020. p. 149–151.

[37] Jain S, Saha A. An empirical study on research and developmental opportunities in refactoring practices. In: SEKE 2019. 2019. p. 313–318.

[38] Békefi BF, Szabados K, Kovács A. A case study on the effects and limitations of refactoring. In: Proceedings of Informatics 2019. IEEE; 2019. p. 213–218.

[39] Vidal S, Berra I, Zulliani S, Marcos C, Pace JAD. Assessing the refactoring of brain methods. ACM Transactions on Software Engineering and Methodology. 2018;27(1):1–43.

[40] AbuHassan A, Alshayeb M, Ghouti L. Software smell detection techniques: a systematic literature review. Journal of Software Engineering and Process. 2021;33(3):e2320.

[41] Kaur A, Dhiman G. A review on search-based tools and techniques to identify bad code smells in object-oriented systems. International Journal of Applied Engineering Research. 2019;14:909–921.

[42] Tufano M, Palomba F, Bavota G, et al. When and why your code starts to smell bad. In: Proceedings of ICSE 2015. IEEE; 2015. p. 403–414.

[43] Sharma T, Efstathiou V, Louridas P, Spinellis D. Code smell detection by deep direct-learning and transfer-learning. Journal of Systems and Software. 2021;176:110936.

[44] Khleel NAA, Nehéz K. Deep convolutional neural network model for bad code smells detection. International Journal of Electrical Engineering and Computer Science. 2022;26(3):1725–1735.

[45] Dewangan S, Rao RS, Mishra A, Gupta M. Code smell detection using ensemble machine learning algorithms. Applied Sciences. 2022;12(20):10321.

[46] Das AK, Yadav S, Dhal S. Detecting code smells using deep learning. In: TENCON 2019. IEEE; 2019. p. 2081–2086.

[47] Zhang Y, Dong C. MARS: detecting brain class and brain method code smells. Journal of Software Engineering and Process. 2024;36(1):e2403.

[48] Mhawish MY, Gupta M. Predicting code smells and analysis of predictions using machine learning techniques. Journal of Computer Science and Technology. 2020;35(6):1428–1445.

[49] Fawad M, Rasool G, Palma F. Android source code smells: a systematic literature review. Software: Practice and Experience. 2024;54(2):345–372.

[50] Wu Z, Chen X, Lee SJ. A systematic literature review on Android-specific smells. Journal of Systems and Software. 2023;201:111677.

[51] Hurtado Alegría JA, Bastarrica MC, Bergel A. Avispa: a tool for analyzing software process models. Journal of Software Engineering and Process. 2014;26(4):434–450.

[52] Eclipse Foundation. Eclipse project archived downloads [Internet]. 2021 Jan [cited 2024 Jan]. Available from: https://archive.eclipse.org/eclipse/downloads/index.php

[53] Eclipse Foundation. Eclipse IDE for Java developers [Internet]. [cited 2024 Jan]. Available from: https://www.eclipse.org/downloads/

[54] Lenarduzzi V, Sillitti A, Taibi D. Analyzing forty years of software maintenance models. In: Proceedings of ICSE-C 2017. IEEE; 2017. p. 146–148.

[55] Lenarduzzi V, Sillitti A, Taibi D. A survey on code analysis tools for software maintenance prediction. In: SEDA 2018. Springer; 2020. p. 165–175.

[56] Marcilio D, Bonifácio R, Monteiro E, Canedo E, Luz W, Pinto G. Are static analysis violations really fixed? In: Proceedings of ICPC 2019. IEEE; 2019. p. 209–219.

[57] Vassallo C, Panichella S, Palomba F, Proksch S, Gall HC, Zaidman A. How developers engage with static analysis tools in different contexts. Empirical Software Engineering. 2020;25:1419–1457.

[58] Lavazza L, Tosi D, Morasca S. An empirical study on the persistence of SpotBugs issues in open-source software evolution. In: Proceedings of QUATIC 2020. Springer; 2020. p. 144–151.

[59] Businge J, Kawuma S, Bainomugisha E, Khomh F, Nabaasa E. Code authorship and fault-proneness of open-source Android applications. In: Proceedings of PROMISE 2017. ACM; 2017. p. 33–42.

[60] Lenarduzzi V, Lomio F, Huttunen H, Taibi D. Are SonarQube rules inducing bugs? In: Proceedings of SANER 2020. IEEE; 2020. p. 501–511.

[61] Campbell A. Sonar rules [Internet]. 2022 Jan [cited 2024 Jan]. Available from: https://rules.sonarsource.com/java/

[62] Kawuma S, Nabaasa E. An empirical study of bugs in Eclipse stable internal interfaces. 2022.

[63] Kawuma S, Bamutura DS, Obbo A, Mabirizi V, Kabarungi M, Nabaasa E. Eclipse application programming interfaces: how buggy are they? VFAST Transactions on Software Engineering. 2025;13(2):228–244.

Published

2026-01-26

How to Cite

Simon Kawuma, David Sabiiti Bamutura, Aggrey Obbo, and Evarist Nabaasa, “INVESTIGATION OF CODE SMELLS IN ECLIPSE FRAMEWORK USING SONARQUBE: AN EMPIRICAL ANALYSIS”, IJSECS, vol. 11, no. 2, pp. 176–187, Jan. 2026, doi: 10.15282/ijsecs.11.2.2025.14.0146.
