Automated Static Analysis Tools (ASATs) are part of software development best practices. ASATs warn developers about potential problems in their code. On the one hand, ASATs encode best practices, so their use should have a noticeable effect on software quality. On the other hand, ASATs suffer from false positive warnings, which developers have to inspect and then ignore or mark as invalid. In this article, we ask whether ASATs have a measurable impact on external software quality, using PMD for Java as an example. We investigate the relationship between the ASAT warnings emitted by PMD and defects, both per change and per file. Our case study includes data on the history of each file as well as the differences between changed files and the project in which they are contained. We investigate whether files that induce a defect have more static analysis warnings than the rest of the project. Moreover, we investigate the impact of two different sets of ASAT rules. We find that bug-inducing files contain fewer static analysis warnings than the other files of the project at that point in time. However, this can be explained by the overall decreasing warning density. When compared with all other changes, we find a statistically significant difference in one metric for all rules and in two metrics for a subset of rules. However, the effect size is negligible in all cases, showing that the actual difference in warning density between bug-inducing changes and other changes is small at best.