Static analysis tools remain an integral part of defect detection in software practice, and a variety of tools have been designed to analyze code in different programming languages such as Java, C++, and Python. This paper presents an empirical comparison of popular static analysis tools for identifying software defects across several datasets of Java, C++, and Python code. The study compares SonarQube, PMD, Checkstyle, and FindBugs on these datasets, using the evaluation metrics Precision, Recall, and F1-score to measure the performance of each tool. The results show that SonarQube performs considerably better than the other tools at defect detection across all three programming languages. These findings are consistent with existing studies that likewise identify SonarQube as an effective tool for software defect detection. The study contributes insight into how static analysis tools perform across different programming languages and clarifies the strengths and weaknesses of each tool. It also discusses the implications for software development researchers and practitioners, and future directions in this area. Our aim is to provide a recommendation guideline that enables software developers, practitioners, and researchers to choose the right static analysis tool for detecting errors in their code, and to encourage researchers to investigate and improve static analysis tools so as to enhance the quality and reliability of software systems and of software development practice.
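The evaluation metrics named above are the standard Precision, Recall, and F1-score. As a minimal sketch of how they are computed from a tool's output, assume hypothetical counts of true positives, false positives, and false negatives (these numbers are illustrative, not the study's data):

```python
# Standard defect-detection metrics computed from confusion-matrix counts.
# All counts below are hypothetical, for illustration only.

def precision(tp: int, fp: int) -> float:
    # Fraction of reported defects that are real defects.
    return tp / (tp + fp)

def recall(tp: int, fn: int) -> float:
    # Fraction of real defects that the tool actually reported.
    return tp / (tp + fn)

def f1_score(p: float, r: float) -> float:
    # Harmonic mean of precision and recall.
    return 2 * p * r / (p + r)

# Example: a tool flags 50 defects, 40 of them genuine, while
# 20 real defects go unreported (hypothetical numbers).
tp, fp, fn = 40, 10, 20
p = precision(tp, fp)            # 0.8
r = recall(tp, fn)               # ~0.667
print(round(f1_score(p, r), 3))  # 0.727
```

A tool that reports many spurious warnings scores low on Precision even if its Recall is high, which is why the harmonic-mean F1-score is a common single-number summary when comparing tools.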