ICSSEA 2012

From Maisqual Wiki


We gave a talk during the International Conference on Software & Systems Engineering and their Applications held in Paris in October 2012. Slides can be downloaded from this wiki.

A French translation of the article has also been published in the Génie Logiciel quarterly. It can be downloaded from Academia.edu.

Abstract

Quality has a price. But non-quality is even more expensive. Knowing the cost and consequences of software assets, being able to understand and control the development process of a service, or quickly evaluating the quality of external developments are of primary importance for every company relying on software. Standards and tools have tried, with varying degrees of success, to address these concerns, but many difficulties remain: the diversity of software projects, the measurement process itself -- from the selection of goals and metrics to the presentation of data -- and the user's understanding of the resulting reports. The SQuORE business intelligence tool addresses these situations with a novel decision-based approach to software project quality assessment, providing a more reliable, more intuitive, and more context-aware view of quality. This in turn allows all project actors to share a common vision of the project's progress and performance, which enables efficient improvement of both the product and the process. This position paper presents how SQuORE solves the quality dilemma, and showcases two real-life examples of industrial projects: a unit testing improvement program, and a fully-featured software project management model.


Keywords

  • software quality,
  • key performance indicators,
  • trend analysis,
  • measurement,
  • quality models,
  • process evaluation,
  • business intelligence,
  • project management.


Bibliography

  1. B. W. Boehm, J. R. Brown, and M. Lipow. Quantitative evaluation of software quality. In Proceedings of the 2nd International Conference on Software Engineering, pages 592-605, San Francisco, California, United States, 1976. IEEE Computer Society Press.
  2. CMMI Product Team. CMMI for Development, Version 1.3. Technical report, Carnegie Mellon University, 2010.
  3. Martin Fowler. Martin Fowler on Technical Debt, 2004.
  4. Maurice H. Halstead. Elements of Software Science (Operating and Programming Systems Series). Elsevier Science Inc., New York, NY, USA, 1977.
  5. Ahmed E. Hassan and Tao Xie. Software Intelligence: The Future of Mining Software Engineering Data. In Proc. FSE/SDP Workshop on the Future of Software Engineering Research (FoSER 2010), pages 161-166. ACM, 2010.
  6. ISO/IEC. ISO/IEC 15504-1 - Information technology - Process assessment. Software Process: Improvement and Practice, 2(1):35-50, 2004.
  7. ISO/IEC. ISO/IEC 25000 - Software engineering - Software product Quality Requirements and Evaluation (SQuaRE) - Guide to SQuaRE. Systems Engineering, page 41, 2005.
  8. ISO. ISO/DIS 26262-1 - Road vehicles - Functional safety - Part 1: Glossary. Technical report, International Organization for Standardization / Technical Committee 22 (ISO/TC 22), 2009.
  9. ISO/IEC. ISO/IEC 9126 - Software engineering - Product quality. 2001.
  10. Ho-Won Jung, Seung-Gweon Kim, and Chang-Shin Chung. Measuring software product quality: A survey of ISO/IEC 9126. IEEE Software, 21:88-92, 2004.
  11. Stephen H. Kan. Metrics and Models in Software Quality Engineering. Addison-Wesley Longman Publishing Co., Inc., Boston, MA, USA, 2nd edition, 2002.
  12. C. Kaner and Walter P. Bond. Software engineering metrics: What do they measure and how do we know? In 10th International Software Metrics Symposium, METRICS 2004, pages 1-12, 2004.
  13. Jean-Louis Letouzey. The SQALE method for evaluating Technical Debt. In 2012 Third International Workshop on Managing Technical Debt, pages 31-36, 2012.
  14. Jean-Louis Letouzey and Thierry Coq. The SQALE Analysis Model: An analysis model compliant with the representation condition for assessing the quality of software source code. In 2010 Second International Conference on Advances in System Testing and Validation Lifecycle (VALID), pages 43-48, 2010.
  15. Nancy Leveson and Clark S. Turner. An Investigation of the Therac-25 Accidents. IEEE Computer, 26(7):18-41, 1993.
  16. J.L. Lions. ARIANE 5 Flight 501 failure. Technical report, 1996.
  17. T. J. McCabe. A complexity measure. IEEE Transactions on Software Engineering, SE-2(4):308-320, 1976.
  18. J.A. McCall. Factors in Software Quality: Preliminary Handbook on Software Quality for an Acquisition Manager. Information Systems Programs, General Electric Company, 1977.
  19. National Aeronautics and Space Administration. Mars Climate Orbiter Mishap Investigation Report. Technical report, National Aeronautics and Space Administration, Washington, DC, 2000.
  20. Peter G. Neumann. Cause of AT&T network failure. The Risks Digest, 9(62), 1990.
  21. National Institute of Standards and Technology (NIST), Department of Commerce. Software errors cost U.S. economy $59.5 billion annually, 2002.
  22. RTCA. DO-178B: Software Considerations in Airborne Systems and Equipment Certification. Technical report, Radio Technical Commission for Aeronautics (RTCA), 1982.
  23. Mary Shaw. Prospects for an engineering discipline of software. IEEE Software, 7(6):15-24, 1990.
  24. The Standish Group International, Inc. CHAOS Report. Technical report, 2004.
  25. United States General Accounting Office. Patriot Missile Defense: Software Problem Led to System Failure at Dhahran, Saudi Arabia. Technical report, United States General Accounting Office, 1992.