Open Access proceedings, Journal of Physics: Conference Series, 2012
Experiences with Software Quality Metrics in the EMI middleware

M Alandes1, E M Kenny2, D Meneses1 and G Pucciani1

1 European Organization for Nuclear Research, CERN CH-1211, Genève 23, Switzerland
2 Trinity College Dublin, College Green, Dublin 2, Ireland

E-mail: [email protected]

Abstract. The EMI Quality Model has been created to define, and later review, the quality of the EMI (European Middleware Initiative) software products and processes. A quality model is based on a set of software quality metrics and helps to set clear and measurable quality goals for software products and processes. The EMI Quality Model follows the ISO/IEC 9126 Software Engineering – Product Quality standard to identify a set of characteristics that need to be present in the EMI software. For each software characteristic, such as portability, maintainability or compliance, a set of associated metrics and KPIs (Key Performance Indicators) is identified. This article presents how the EMI Quality Model and the EMI Metrics have been defined in the context of the software quality assurance activities carried out in EMI. It also describes the measurement plan and presents some of the metrics reports that have been produced for the EMI releases and updates. Finally, it covers the tools and techniques that any software project can use to extract "code metrics" on the status of its software products and "process metrics" on the quality of its development and support processes, such as reaction time to critical bugs, requirements tracking and delays in product releases.

1. Introduction

According to the ISO 9001 standard, the quality of something can be determined by comparing a set of inherent characteristics with a set of requirements. If the inherent characteristics meet all requirements, high or excellent quality is achieved; if they do not, the level of quality is low or poor. Quality is, therefore, a question of degree.
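The ISO 9001 view of quality as a degree of compliance can be made concrete with a small sketch. The characteristic names, values and thresholds below are invented examples, not data from EMI:

```python
# Illustrative sketch of quality as a degree of compliance (ISO 9001 view).
# Characteristic values and required thresholds are invented examples.
def quality_degree(characteristics, requirements):
    """Fraction of requirements met by the inherent characteristics."""
    met = sum(
        1
        for name, required in requirements.items()
        if characteristics.get(name, 0) >= required
    )
    return met / len(requirements)

characteristics = {"reliability": 0.99, "maintainability": 0.70, "portability": 0.40}
requirements = {"reliability": 0.95, "maintainability": 0.80, "portability": 0.30}

# Two of the three requirements are met, so quality is a degree, not a yes/no.
degree = quality_degree(characteristics, requirements)
```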
As a result, the central quality question is: how well does this set of inherent characteristics comply with this set of requirements? In short, the quality of something depends on a set of inherent characteristics, a set of requirements, and how well the former complies with the latter.

Software Quality Engineering (SQE) is the process that evaluates, assesses and improves the quality of software. Software quality is often defined as the degree to which software meets requirements for reliability, maintainability, transportability and so on, as contrasted with the functional, performance and interface requirements that are satisfied as a result of software engineering. A quality model helps to evaluate software product and process quality and to set quality goals for software products and processes.

The EMI project (European Middleware Initiative) comprises 28 software development teams, called product teams (PTs), who develop the 56 EMI software products. The EMI PTs come from major middleware providers such as ARC, dCache, gLite and UNICORE, who have been developing software in the grid domain for the past several years. The EMI Quality Model helps to evaluate the quality of the EMI software products while taking into account the existing working methods and tools of the PTs.

2. EMI Quality Model

The EMI Quality Model uses the ISO/IEC 9126 Software Engineering – Product Quality standard to identify a set of characteristics that need to be present in EMI software products and processes in order to meet EMI quality requirements. EMI quality requirements are based on the quality requirements of DCIs (Distributed Computing Infrastructures), such as the UMD Quality Criteria from the EGI project, and on internal project objectives that influence qualitative aspects of the EMI software, as specified in the EMI Description of Work.

2.1. Quality Requirements

EMI quality requirements are defined by taking into account internal EMI quality criteria and quality criteria coming from EMI users, such as EGI, as defined in the UMD Quality Criteria [R3]. The UMD Quality Criteria are summarized below:

1. Functional Description: all products must provide a document with a brief functional description of the product.
2. Release Notes: all products must provide a document with the release notes.
3. User Documentation: all products must provide a document describing how to use them.
4. Online help (man pages): all products with end-user command line tools must include man pages or online help.
5. API Documentation: the public API of products must be documented.
6. Administrator Documentation: products must provide an administrator guide describing installation, configuration and operation of the system.
7. Service Reference Card: for each of the services that a product runs, its characteristics must be documented in a reference card.
8. Software License: products must have a license compatible with their use in the EGI infrastructure.
9. Release changes testing: changes in a release of a product must be tested.
10. Source Code Availability: products should provide their source code.
11. Source Distribution: technology providers should provide buildable source distributions of products.
12. Binary Distribution: products must be available in the native packaging format of the supported platform.
13. Backwards compatibility: minor/revision releases of a product must be backwards compatible.
14. Service control and status: services run by the product must provide a mechanism for starting, stopping and querying their status.
15. Log files: all services should create log files where the service administrator can trace the most relevant actions taken.
16. Service Reliability: services must maintain good performance and reliability over long periods of time under normal operations.
17. Service Robustness: services should not produce unexpected results or become uncontrollable when taxed beyond normal capacity.
18. Automatic configuration: products that provide configuration tools covering typical deployments must ensure that the tools work as documented.
19. World writeable files: products must not create world-writeable files or directories.
20. Directory traversal attacks testing: products should ensure that directory traversal exploits are not possible through their interfaces.
21. Incident Tracking: EMI must enroll as 3rd level support in the EGI Helpdesk.

EMI internal quality criteria are defined as complementary to those of the EMI users and are based on the EMI project objectives as described in the EMI Description of Work:

• EMI Objective 1: Simplify and organize the different middleware service implementations by delivering a streamlined, coherent, tested and standard-compliant distribution able to meet and exceed the requirements of EGI, PRACE and other distributed computing infrastructures and their user communities.
• EMI Objective 2: Increase the interoperability, manageability, usability and efficiency of the services by developing or integrating new functionality as needed, following existing and new requirements of EGI, PRACE and other infrastructures and their user communities.
• EMI Objective 3: Support efficient, reliable operations of EGI, PRACE and other DCIs by reactively and proactively supporting and maintaining the middleware distribution and providing users with increasingly user-friendly, maintainable, reliable, stable and scalable software.
• EMI Objective 4: Strengthen the participation and support of user communities in the definition and evolution of middleware services by promoting the EMI achievements, objectives and plans, and move the EMI middleware towards a more sustainable model by expanding the collaboration with national and international research agencies, scientific research programs and industrial providers.

2.2. Quality Characteristics

Once the EMI software product quality requirements are defined, the software product quality characteristics that realise those requirements can be determined. To do this, the quality characteristics from ISO/IEC 9126 are analysed within the context and objectives of the EMI project. Two more characteristics have been taken into account as well: compliance with the EPEL and Debian repositories. One of the main goals of the EMI project is to provide a sustainable model at the end of the project, and being able to deliver middleware packages into the EPEL and Debian repositories is fundamental to moving towards an open-source-like model where middleware developers can distribute their packages to their user communities through EPEL and Debian.

For each of the defined characteristics, the following areas have been analysed:

• Importance for EMI: how much attention should be paid to the characteristic to meet EMI quality requirements. Possible values are High, Medium and Low.
• Risks: the possible effects of the problems caused when the characteristic is not present in the EMI software.
• Indicators: which indicators can be used to make the presence of the characteristic visible.
• Measures: which measures are necessary to control the characteristic.
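The per-characteristic analysis above (importance, risks, indicators, measures) can be sketched as a small record type. The field values shown for Interoperability are abridged from this text, except the risk description, which is an invented illustration:

```python
from dataclasses import dataclass, field

# Sketch of the per-characteristic analysis: importance plus the risks,
# indicators and measures identified for each ISO/IEC 9126 characteristic.
@dataclass
class CharacteristicAnalysis:
    name: str
    importance: str                      # "High", "Medium" or "Low"
    risks: list = field(default_factory=list)
    indicators: list = field(default_factory=list)
    measures: list = field(default_factory=list)

interoperability = CharacteristicAnalysis(
    name="Interoperability",
    importance="High",
    # Invented example of a risk when the characteristic is absent:
    risks=["services cannot be combined across middleware stacks"],
    indicators=["KPI KJRA1.2 Number of Interoperable Interface Usage"],
    measures=["count interoperable interface usages per release"],
)
```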
Table 1 shows a summary of the thorough analysis of each characteristic:

| Characteristic | Subcharacteristic | Quality Requirement | Importance |
| Functionality | Suitability | EMI Objective 2, UMD 14,15,18,19,20,21 | High |
| Functionality | Accuracy | EMI Objective 3 | Low |
| Functionality | Interoperability | EMI Objective 2 | High |
| Functionality | Security | EMI Objective 1, UMD 19,20 | High |
| Functionality | Functionality compliance | EMI Objective 1 | High |
| Reliability | Maturity | EMI Objective 3, UMD 16,17 | High |
| Reliability | Recoverability | EMI Objective 3, UMD 16,17 | Medium |
| Usability | Understandability | EMI Objective 1, UMD 1,2,3,4,5,6,7,10 | High |
| Usability | Operability | EMI Objective 3, UMD 14,18 | High |
| Efficiency | Resource utilisation | EMI Objective 3, UMD 17 | Low |
| Maintainability | Changeability | EMI Objective 2 | High |
| Maintainability | Stability | EMI Objective 3, UMD 13,16,17 | High |
| Maintainability | Testability | EMI Objective 1, UMD 9 | High |
| Maintainability | Maintainability compliance | EMI Objective 3 | High |
| Portability | Adaptability | EMI Objectives 1 and 4 | High |
| Portability | Installability | EMI Objective 1, UMD 8,11,12 | High |
| Portability | Replaceability | EMI Objective 2 | Medium |
| Portability | Co-existence | EMI Objective 1 | High |
| EPEL and Debian Compliance | | EMI sustainability objective | High |

Table 1 - Quality Characteristics vs. Quality Requirements

Identifying the characteristics that need to be present in the software to meet the existing quality requirements, and understanding what needs to be done to measure whether they are present, is the basis of the quality model. In the next section, we define the metrics needed to measure the presence of the software characteristics.

3. Metrics and KPIs

EMI metrics are calculated to measure the presence of those quality characteristics evaluated as highly important for the EMI middleware. KPIs are also calculated; they are defined in the EMI Description of Work, are normally calculated every quarter, and also relate to the analysed software characteristics. Table 2 presents a summary of the metrics that are needed to evaluate each quality characteristic.
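Because only the characteristics rated as highly important get metrics, Table 1 is effectively a lookup structure. A minimal sketch (a subset of the rows shown, in an ad hoc dictionary layout of our own choosing):

```python
# Table 1 as a lookup structure (subset of rows). Each key is
# (characteristic, subcharacteristic); each value is
# (quality requirement, importance).
TABLE_1 = {
    ("Functionality", "Suitability"): ("EMI Objective 2, UMD 14,15,18,19,20,21", "High"),
    ("Functionality", "Accuracy"): ("EMI Objective 3", "Low"),
    ("Reliability", "Recoverability"): ("EMI Objective 3, UMD 16,17", "Medium"),
    ("Maintainability", "Testability"): ("EMI Objective 1, UMD 9", "High"),
}

def high_importance(table):
    """Subcharacteristics rated High, i.e. the ones that need metrics."""
    return [sub for (_, sub), (_, imp) in table.items() if imp == "High"]
```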
| Metrics | Quality characteristic |
| Number of technical objectives; Number of user requirements; Number of development tasks | Suitability |
| Number of GGUS tickets related to lack of accuracy | Accuracy |
| KPI KJRA1.2 Number of Interoperable Interface Usage | Interoperability |
| Number of EMI security assessments; Number of fixed security vulnerabilities; Number of EGI SVG tickets still open after the defined deadline | Security |
| KPI KJRA1.1 Number of Adopted Open Standard Interfaces | Functionality compliance |
| KPI KSA1.4 Number of urgent changes | Maturity |
| Number of services providing high-availability setups | Recoverability |
| Number of missing mandatory documents; Number of EMT tasks tracking documentation issues | Understandability |
| Number of services providing service control and status mechanisms; Number of services providing configuration tools | Operability |
| Number of GGUS tickets related to resource utilisation issues | Resource utilisation |
| KPI KSA1.2 Incident Resolution Time; KPI KSA1.5 Change Application Time | Changeability |
| KPI KSA1.1 Number of incidents; KPI KSA1.3 Number of problems | Stability |
| Number of Test Plans; Number of Test Reports per released EMI software product; Number of mandatory tests per EMI software product; Number of RfCs tracking a defect with an associated regression test; Number of RfCs tracking a new feature with an associated functionality test; Number of development tasks tracking a new feature with an associated functionality test; Number of passed certification checks | Testability |
| KPI KJRA1.3 Number of Reduced lines of code; KPI KJRA1.4 Number of reduced released products | Maintainability compliance |
| Number of supported platforms; Number of standard installation tools per supported platform | Adaptability |
| Number of standard package formats per supported platform per released product | Installability |
| KPI KNA2.4 Number of EMI products included in standard repositories, Linux distributions, etc. | Co-existence |
| RPMlint and Lintian | EPEL and Debian compliance |

Table 2 - EMI Metrics

Metrics in the EMI project are described in detail in the EMI Metrics Specification and are divided into:

• Process-related metrics: related to software changes and user support. They use information stored in the tracking and user support tools, as explained in the upcoming sections.
• Product-related metrics: related to the software itself, such as RPMlint results or SLOC.

4. Tools

To cope with the task of regularly producing reports that include all the many metrics that need to be calculated, a high level of automation is needed. Moreover, the sources of information within the EMI project are very heterogeneous: the product teams use different tracking tools and programming languages. A common layer on top of the existing tools is therefore necessary to automate the calculation of metrics in an easy way. The following subsections describe the tools and dashboards that the QA team has developed to automate the generation of metrics reports.

4.1. ETICS plugins

ETICS is the tool that provides the build and packaging infrastructure for the EMI project. The ETICS plugin framework provides the ability to collect metrics during build and test execution. RPMlint (common RPM problems) and SLOC (number of lines of code) are two examples of the ETICS plugins in use. Figure 1 and figure 2 show a graphical representation of RPMlint and SLOC measurements taken for EMI software products.

The metrics plugins are executed during some of the build steps in ETICS. The data generated by the plugins is stored in the ETICS repository, and once stored it can be queried at any time. To generate charts or statistics, the ETICS repository is queried through a web service and the data is converted into a specific XML format.
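The step from stored metric records to a chart dataset can be sketched as follows. The XML layout below is invented for illustration; the real format produced by the ETICS web service is not specified here:

```python
import xml.etree.ElementTree as ET
from collections import defaultdict

# Sketch of turning exported metric records into a per-product dataset,
# in the spirit of the ETICS web-service query described above.
# The XML schema is an invented example, not the real ETICS format.
SAMPLE = """
<metrics>
  <record product="emi-ui" metric="rpmlint-errors" value="3"/>
  <record product="emi-ui" metric="sloc" value="120000"/>
  <record product="dcache" metric="rpmlint-errors" value="0"/>
</metrics>
"""

def dataset(xml_text, metric):
    """Map product -> summed value for one metric, ready for charting."""
    values = defaultdict(int)
    for rec in ET.fromstring(xml_text).iter("record"):
        if rec.get("metric") == metric:
            values[rec.get("product")] += int(rec.get("value"))
    return dict(values)
```

A chart extension would then only need to render the returned dictionary, which is why new charts are cheap to add once the data is in a common format.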
A chart generation framework is used later in the process. It processes the XML data in several ways, as defined by its extensions, producing the datasets for the different charts. Its extensibility makes it easy to produce new charts from the data collected during the builds.

Figure 1 – RPMlint errors and warnings per EMI software product
Figure 2 – SLOC trend for the EMI software products released in EMI 1 Update 10

4.2. RfC Dashboard

The RfC (Request for Change) Dashboard is a tool that offers a single entry point to track software changes, such as defects and new features, for all EMI products. The EMI product teams use different tracking tools of their choice. The EMI QA policies define a common release process that is followed by all product teams, including the minimum set of states that need to be present in the tracking tools. Figure 3 shows how the different tracking tools of the middleware providers involved in EMI map to the states defined by the policies.

Figure 3 - Mapping of product team tracker states

The different tracking tools export their data in an XML file that is used by the RfC Dashboard. In this way the software changes of all products can be tracked in a single place and metrics can be calculated for all of them. The RfC Dashboard uses PHP and HTML forms to provide input to a Python-based query engine, which produces tabulated results from the XML files. Figure 4 is a snapshot of the RfC Dashboard showing a query that retrieves results from different product teams.

Figure 4 - EMI RfC Dashboard

Figure 5 shows the number of reported problems per EMI product, classified by priority. Metrics like this one can easily be generated from the data collected in the RfC Dashboard.

Figure 5 - Number of problems per EMI product, classified by priority

4.3. Verification Dashboard

The EMI Verification Dashboard automates the verification of EMI releases in terms of quality.
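The state normalisation that makes a single dashboard possible can be sketched as a mapping from each tracker's native states onto the common states required by the QA policies. The tracker and state names below are invented examples, not the actual mapping from Figure 3:

```python
# Sketch of RfC Dashboard state normalisation: each tracker's native
# states map onto the common states defined by the EMI QA policies.
# Tracker and state names here are invented examples.
STATE_MAP = {
    "savannah": {"open": "Open", "in progress": "In Progress", "done": "Closed"},
    "trac": {"new": "Open", "accepted": "In Progress", "closed": "Closed"},
}

def normalise(tracker, state):
    """Translate a tracker-specific state into a common policy state."""
    return STATE_MAP[tracker][state.lower()]

def problems_per_priority(rfcs):
    """Count RfCs per priority, as in the problems-per-product chart."""
    counts = {}
    for rfc in rfcs:
        counts[rfc["priority"]] = counts.get(rfc["priority"], 0) + 1
    return counts
```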
EMI releases are verified against the Production Release Criteria, the set of mandatory criteria defined in the EMI QA policies. Most of the checks are done automatically, but some others, such as the documentation review, are done manually. The EMI Verification Dashboard displays all this information for each EMI release, assisting the quality control team in carrying out this task. Figure 6 presents a snapshot of the dashboard.

The EMI Verification Dashboard retrieves information from the Savannah tool (where EMI releases are tracked) and presents different views to users and to other applications, such as the RfC Dashboard. It is written in Python, using the Django web framework to present the information through a web interface using standard HTML and CSS. As a storage backend it uses a MySQL database, but thanks to Django's abstraction other databases could be used. For retrieving and parsing the information from Savannah, the BeautifulSoup library is used, which copes with inconsistencies in the source HTML code.

The EMI Verification Dashboard is not only a very useful tool for automating quality control checks but also a way to easily calculate process and product metrics on various aspects of the software. Figure 7 shows graphics on metrics calculated from the data stored in the Verification Dashboard. Thanks to the dashboard it is possible to produce statistics and trend diagrams on testing, packaging, documentation and certification.

Figure 6 - EMI Verification Dashboard

5. Measurement plan

EMI major releases have five major phases from the quality measurement perspective:

• EMI major release planning: when the different work area plans containing the technical objectives to be achieved are written and user requirements are gathered.
• EMI major release software coding and testing: the preparation of the release. The result is a set of packages per software product with the corresponding test and certification reports.
• EMI major release availability: the moment when software documentation and software repositories are ready and all the new required functionality is available to the users.
• EMI major release maintenance: once the release has been made available, software changes to fix defects or introduce new features are released for as long as the EMI major release is supported.
• EMI major release user support: once the release has been made available, user support is provided for as long as the EMI major release is supported.

Figure 7 – Metrics calculated with the EMI Verification Dashboard information

Metrics are associated with the different phases as presented in table 3, and should be calculated periodically as presented in the Frequency column.

| Phase | Deliverable | Metrics | Frequency | Metrics Report Name |
| EMI major release planning and development | Work Area Plans, user requirements | Number of technical objectives; Number of user requirements; Number of total tasks | Every major release | EMI_X_planning_MetricsReport, where X is the EMI major release 1, 2 or 3 |
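The report-naming convention from table 3 can be sketched as a small helper. The function name is ours, not EMI's; only the `EMI_X_planning_MetricsReport` pattern and the release numbers 1, 2 and 3 come from the table:

```python
# Sketch of the planning-phase report name from Table 3. Planning
# metrics are produced once per EMI major release (1, 2 or 3).
def planning_report_name(major_release):
    if major_release not in (1, 2, 3):
        raise ValueError("EMI major releases are 1, 2 or 3")
    return f"EMI_{major_release}_planning_MetricsReport"
```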
