KPI-driven Code Analysis

KPI-driven code analysis (KPI = Key Performance Indicator) is a method of analyzing software source code and related IT systems to gain insight into business-critical aspects of the development of a software system, such as team performance, time to market, risk management, failure prediction, and more. KPI-driven code analysis, developed at the Hasso Plattner Institute, is a static program analysis of source code for the purpose of improving software quality. It is not, however, limited to the source code itself: other information sources, such as coding activities, are also included to create a comprehensive picture of the quality and development progress of a software system.


Mode of operation

KPI-driven code analysis is a fully automated process, which enables team activities and modifications to the overall source code of a software system to be monitored in real time. In this way, negative trends become evident as soon as they arise. This “early warning system” offers a powerful instrument for reducing costs and increasing development speed: every newly introduced increase in complexity is discovered in good time, so its impact can be minimized. Instead of spending valuable time trying to reduce legacy complexity, developers can use their time for new functionality, helping the team increase productivity.
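A minimal sketch, in Python, of what such an automated early-warning check could look like; the KPI names, values, and threshold below are invented for illustration and are not part of the method as published:

```python
# Illustrative sketch: flag a KPI whose newest value regresses noticeably
# against its recent history ("higher is worse" is assumed for these KPIs).

from statistics import mean

def kpi_alerts(history, tolerance=0.10):
    """history maps a KPI name to a chronological list of measured values;
    returns KPIs whose latest value is more than `tolerance` above the
    average of the preceding measurements."""
    alerts = []
    for name, values in history.items():
        if len(values) < 2:
            continue  # not enough data to detect a trend
        baseline = mean(values[:-1])
        latest = values[-1]
        if baseline > 0 and (latest - baseline) / baseline > tolerance:
            alerts.append((name, baseline, latest))
    return alerts

# Example: KPIs sampled after each nightly analysis run (invented figures).
history = {
    "avg_cyclomatic_complexity": [11.2, 11.4, 11.3, 13.9],
    "open_defects_per_kloc": [0.8, 0.7, 0.9, 0.85],
}
for name, baseline, latest in kpi_alerts(history):
    print(f"WARNING: {name} rose from {baseline:.2f} to {latest:.2f}")
```

In practice such a check would be fed continuously by the automated analysis runs rather than by hard-coded values, and alerts would be routed to the project dashboard.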


The human factor

The “human factor” is included in KPI-driven code analysis, which means that it also records which code was contributed by which developer and when. In this way, the quality of the software delivered by each individual developer can be determined, and problems in employee qualification, direction, or motivation can be identified early so that appropriate measures can be introduced to resolve them.
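As an illustration only (the record layout and figures below are invented and not prescribed by the method), per-developer indicators could be aggregated from commit records roughly as follows:

```python
# Illustrative sketch: per-developer figures aggregated from commit records.

from collections import defaultdict

commits = [
    {"author": "alice", "lines_changed": 420, "defects_attributed": 1},
    {"author": "bob",   "lines_changed": 130, "defects_attributed": 3},
    {"author": "alice", "lines_changed": 250, "defects_attributed": 0},
]

stats = defaultdict(lambda: {"lines": 0, "defects": 0})
for c in commits:
    stats[c["author"]]["lines"] += c["lines_changed"]
    stats[c["author"]]["defects"] += c["defects_attributed"]

for author, s in stats.items():
    # defects per thousand changed lines as a simple per-developer indicator
    rate = 1000 * s["defects"] / s["lines"] if s["lines"] else 0
    print(f"{author}: {s['lines']} lines changed, {rate:.1f} defects/kLOC")
```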


Sources considered

In order to determine the key performance indicators (KPIs), i.e. figures which are crucial to the productivity and success of software development projects, numerous data sources related to the software code are evaluated. For this purpose, KPI-driven code analysis borrows methods from data mining and business intelligence that are otherwise used in accounting and customer analytics. KPI-driven code analysis extracts data from the following sources and consolidates it in an analysis data model; the values of the key performance indicators are then calculated on this data model (a simplified consolidation is sketched after the list). The data sources include, in particular:

* Revision control, also known as version control. In this system, every step of each individual developer is tracked for the entire life cycle of the software system. The data describes “which developer changed what when” and provides a basis for answering the question of what effort or development cost has been invested in which areas of code. Prominent revision control systems are Subversion, Git, Perforce, Mercurial, Synergy, ClearCase, …
* Software test systems. These provide information on which parts of the source code have already been tested. With this information, it becomes obvious where there are gaps in testing, possibly even where these gaps were intentionally left (due to the significant cost and effort involved in setting up tests).
* Bug tracking systems (bug trackers). This information can be used in combination with the information provided by the revision control system to help draw conclusions on the error rate of particular areas of code.
* Issue tracking systems. The information produced by these systems, in conjunction with the information from revision control, enables conclusions to be drawn regarding development activity related to specific technical requirements. In addition, precise data on time investment can be utilized for the analysis.
* Performance profilers (profiling). The data on the performance of the software system help to analyze which areas of code consume the most CPU resources.
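The consolidation step can be sketched as follows. All input structures and figures here are hypothetical; real data would be read from systems such as Git or a bug tracker through their own interfaces. Two sources are joined into a small per-file data model on which an indicator such as defect density is computed:

```python
# Illustrative sketch: consolidating two of the sources named above, revision
# control (which developer changed what when) and the bug tracker, into a
# per-file analysis data model.

from collections import defaultdict

# Simplified commit log
changes = [
    {"file": "billing/invoice.py", "author": "alice", "churn": 300},
    {"file": "billing/invoice.py", "author": "bob",   "churn": 120},
    {"file": "ui/dialog.py",       "author": "bob",   "churn": 40},
]

# Defects that were traced back to a file (simplified bug-tracker export)
defects = [
    {"file": "billing/invoice.py"},
    {"file": "billing/invoice.py"},
]

model = defaultdict(lambda: {"churn": 0, "defects": 0, "authors": set()})
for ch in changes:
    entry = model[ch["file"]]
    entry["churn"] += ch["churn"]
    entry["authors"].add(ch["author"])
for d in defects:
    model[d["file"]]["defects"] += 1

for path, entry in model.items():
    density = entry["defects"] / entry["churn"] * 1000 if entry["churn"] else 0
    print(f"{path}: {density:.1f} defects per 1000 changed lines, "
          f"{len(entry['authors'])} contributing developers")
```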


Analysis results

Due to the many influencing factors which feed into the analysis data model, methods of optimizing the source code can be identified, as well as needs for action in the areas of employee qualification, employee direction, and development processes:

* Knowledge of where source code needs to be reworked because it is too complex or has inferior runtime performance (a simple detection sketch for two of these findings is given after the list):
** Deep nesting, which exponentially increases the number of control flow paths.
** Huge, monolithic code units in which several aspects have been mixed together, so that changing one aspect requires changes at several points.
** Unnecessary multi-threading. Multi-threading is a significant source of errors; the run-time behavior of multi-threaded code is hard to comprehend, so the cost and effort required to extend or maintain it is correspondingly high. As a general rule, unnecessary multi-threading should therefore be avoided.
* Identification of insufficient exception handling. If there are too few try-catch blocks in the code, or if nothing is executed in the catch block, the consequences of program errors can be serious.
* Identification of which sections of source code have been altered since the last software test, i.e. where tests must be performed and where they are not needed. This information enables software tests to be planned more intelligently: new functionality can be tested more intensively, or resources can be saved.
* Knowledge of how much cost and effort will be required for the development or extension of a particular software module:
** When extending existing software modules, a recommendation for action could be to undertake code refactoring.
** Newly developed functionality can be analyzed to ascertain whether a target/actual comparison of the costs has been performed and why deviations occurred. If the causes of deviations from the plan are identified, measures can be implemented to increase the accuracy of future planning.
* By tracing which developer (or team) produced which source code and examining the software created over a sustained period, deficiencies can be classified as one-off slips in quality, as evidence of a need for improved employee qualification, or as a sign that the software development process requires further optimization.

Finally, the analysis data model of KPI-driven code analysis provides IT project managers, at a very early stage, with a comprehensive overview of the status of the software produced, the skills and effort of the employees, and the maturity of the software development process. One method of representing the analysis data is so-called software maps.
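Two of the findings above, deep nesting and exception handlers in which nothing is executed, can be illustrated with a small static check. The sketch below uses Python's standard-library ast module on an invented code sample; the sample, names, and thresholds are assumptions for illustration, not part of the original method:

```python
# Illustrative sketch: maximum nesting depth of control-flow statements and
# exception handlers whose body does nothing, detected on Python source.

import ast

SAMPLE = """
def process(items):
    for item in items:
        if item:
            for part in item:
                if part > 0:
                    print(part)
    try:
        risky()
    except Exception:
        pass
"""

NESTING_NODES = (ast.If, ast.For, ast.While, ast.With, ast.Try)

def max_nesting(node, depth=0):
    """Deepest nesting of control-flow statements below the given node."""
    deepest = depth
    for child in ast.iter_child_nodes(node):
        next_depth = depth + 1 if isinstance(child, NESTING_NODES) else depth
        deepest = max(deepest, max_nesting(child, next_depth))
    return deepest

def empty_handlers(tree):
    """Line numbers of exception handlers whose body is only `pass`."""
    found = []
    for node in ast.walk(tree):
        if isinstance(node, ast.ExceptHandler):
            if all(isinstance(stmt, ast.Pass) for stmt in node.body):
                found.append(node.lineno)
    return found

tree = ast.parse(SAMPLE)
print("maximum nesting depth:", max_nesting(tree))
print("empty except blocks at lines:", empty_handlers(tree))
```

A production analysis would of course operate on the project's real languages and repositories rather than on an embedded sample.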


See also

* Program analysis (computer science)
* Dynamic program analysis
* Shape analysis (software)
* Formal semantics of programming languages
* Formal verification
* Software testing
* Code audit
* Documentation generator
* List of tools for static code analysis


External links


* Interactive Rendering of Complex 3D-Treemaps
* ViewFusion: Correlating Structure and Activity Views for Execution Traces (http://www.hpi.uni-potsdam.de/doellner/publications/year/2012/2126/TTD2012.html)
* A Visual Analysis and Design Tool for Planning Software Reengineerings
* Visualization of Execution Traces and its Application to Software Maintenance (http://www.hpi.uni-potsdam.de/doellner/publications/year/2011/2048/BOH11.html)
* Understanding Complex Multithreaded Software Systems by Using Trace Visualization
* Visualizing Massively Pruned Execution Traces to Facilitate Trace Exploration (http://www.hpi.uni-potsdam.de/doellner/publications/year/2009/812/BKD09.html)
* Projecting Code Changes onto Execution Traces to Support Localization of Recently Introduced Bugs

Program analysis