Eclipse

This is the homepage of the work on the Eclipse Quality Model.


Model decomposition

Maisqual Quality Model decomposition

Product

Analysability

Goal:

Question:

Decomposition:

  • Class Analysability Index: Sum of class Analysability technical debts divided by the total number of classes
    • Number of attributes
    • Number of methods
    • Depth of Inheritance Tree
    • Weighted Methods per Class
    • Comment rate
  • Function Analysability Index: Sum of function Analysability technical debts divided by the total number of functions
    • Control flow complexity
      • Maximum nested structures
      • Non-cyclic paths
      • Cyclomatic complexity
    • Data flow complexity
      • Number of parameters: Number of formal parameters in the function
      • Halstead's Program Vocabulary: Distinct number of operators and operands (n1 + n2) [Halstead76].
      • Halstead's Program Length: Total number of operators and operands (N1 + N2) [Halstead76].
    • Comment rate
  • Number of Analysability Non Compliances: Number of non-conformities to Analysability rules found in the artefact.
  • Compliance to Analysability Practices: Ratio of conformant analysability practices to the number of checked analysability practices (see the sketch after this list).
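
To make these definitions concrete, here is a minimal sketch in Python of the index and Halstead computations. The inputs (per-class debt values, operator and operand counts) are assumed to come from a static analyser; all names are illustrative, not part of the model.

  # Sketch of the Analysability measures defined above.
  # Per-class debts and operator/operand counts are assumed inputs.

  def halstead_vocabulary(n1, n2):
      """Program Vocabulary: distinct operators (n1) plus distinct operands (n2)."""
      return n1 + n2

  def halstead_length(N1, N2):
      """Program Length: total operators (N1) plus total operands (N2)."""
      return N1 + N2

  def analysability_index(class_debts):
      """Sum of per-class Analysability technical debts divided by the total number of classes."""
      return sum(class_debts) / len(class_debts)

  print(halstead_vocabulary(12, 30))           # 42 distinct tokens
  print(analysability_index([2.0, 0.5, 1.5]))  # 1.333... per class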

Changeability

Goal:

Question:

Decomposition:

  • Class Changeability Index: Sum of class Changeability technical debts divided by the total number of classes
    • Number of public attributes
    • Number of public methods
    • Weighted methods per class: Sum of the Cyclomatic Complexities of the functions and methods defined in the class.
  • Function Changeability Index: Sum of function Changeability technical debts divided by the total number of functions
    • Control flow complexity
      • Maximum nested structures
      • Non-cyclic paths
      • Cyclomatic complexity
    • Data flow complexity
      • Number of parameters: Number of formal parameters in the function
      • Halstead's Program Vocabulary: Distinct number of operators and operands (n1 + n2) [Halstead76].
      • Halstead's Program Length: Total number of operators and operands (N1 + N2) [Halstead76].
    • Function code cloning
  • Number of Changeability Non Compliances: Number of non-conformities to changeability rules found in the artefact.
  • Compliance to Changeability Practices: Ratio of conformant changeability practices to the number of checked changeability practices (see the sketch after this list).
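
As with Analysability, a minimal Python sketch of the Weighted Methods per Class measure and the function-level index; the per-method cyclomatic complexities and per-function debts are assumed to be provided by an analyser.

  # Sketch: Weighted Methods per Class (WMC), i.e. the sum of the
  # cyclomatic complexities of the methods defined in the class.

  def weighted_methods_per_class(method_complexities):
      """Sum of the cyclomatic complexities of a class's methods."""
      return sum(method_complexities)

  def changeability_index(function_debts):
      """Sum of per-function Changeability technical debts divided by the total number of functions."""
      return sum(function_debts) / len(function_debts)

  # A class whose methods have complexities 1, 3 and 7 has WMC = 11.
  print(weighted_methods_per_class([1, 3, 7]))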

Reusability

Goal: Measure the ability for others to reuse the software.

Question: How easy is it to reuse the software artefact?

Metrics:

  • Class Reusability Index: Sum of class Reusability technical debts divided by the total number of classes
    • Number of public attributes
    • Number of public methods
    • Weighted Methods per Class
    • Class Javadoc: number of comment lines in the file minus the number of comment lines in the file's functions. TODO
    • Class comment rate TODO
  • Number of Reusability Non Compliances: Number of non-conformities to Reusability rules found in the artefact.
  • Compliance to Reusability Practices: Ratio of conformant Reusability practices to the number of checked Reusability practices (see the sketch after this list).
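
A short Python sketch of the two derived measures; the raw line counts and practice checks are assumed to come from the analysis tooling, and the names are illustrative.

  # Sketch: Class Javadoc count and compliance ratio, as defined above.

  def class_javadoc_lines(file_comment_lines, function_comment_lines):
      """Comment lines in the file minus the comment lines inside the file's functions."""
      return file_comment_lines - function_comment_lines

  def compliance_ratio(conformant, checked):
      """Ratio of conformant practices to the number of checked practices."""
      return conformant / checked

  print(class_javadoc_lines(120, 85))  # 35 class-level comment lines
  print(compliance_ratio(18, 20))      # 0.9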

Reliability

Goal: Measure the capability of the software product to avoid failure as a result of faults in the software.

Question: How well does the software artefact avoid failure?

Metrics:

  • File Reliability Index: Sum of file Reliability technical debts divided by the total number of files
    • Ratio of fix-related commits on the file.
    • Number of public data members in classes
    • Number of test cases for classes. (NOT AVAILABLE FOR NOW)
  • Function Reliability Index: Sum of function Reliability technical debts divided by the total number of functions
    • Control Flow complexity
    • Data Flow complexity
    • Number of test cases for the current function. (NOT AVAILABLE FOR NOW)
  • Registered bugs (NOT AVAILABLE FOR NOW)
    • Mean time to fix bugs.
    • Number of open bugs at the time of analysis.
    • Defect density: total number of bugs in the product divided by the size in thousands of lines of code (KLOC).
  • Number of Reliability Non Compliances: Number of non-conformities to Reliability rules found in the artefact.
  • Compliance to Reliability Practices: Ratio of conformant Reliability practices to the number of checked Reliability practices (see the sketch after this list).
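
The defect density measure reduces to a simple ratio; a minimal Python sketch, with the bug count and code size as assumed inputs:

  # Sketch: defect density in bugs per thousand lines of code (KLOC).

  def defect_density(total_bugs, lines_of_code):
      """Total number of bugs divided by the size in KLOC."""
      return total_bugs / (lines_of_code / 1000.0)

  print(defect_density(42, 150000))  # 0.28 bugs per KLOC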

Process

Change and Configuration Management

Goal: Measure how the history and consistency of the software product are managed.

Question: Is any user able to retrieve a consistent and complete set of files from the project?

Metrics:

  • Completeness
    • Is there a change management tool?
    • Is there a configuration management tool?
  • Consistency and completeness
    • Is everything included to build the project?
  • Collaborative work
    • Are there
    • How many branches are there?
  • Integrated CCM
    • Are there conventions to track commits against CRs?
    • Are there hooks to ensure coding or naming conventions?

Intellectual Property Management

Goal: Make sure that the intellectual property rights are correct.

Question: To what extent are intellectual property rights respected?

Metrics:

  • User IP log
  • Commits IP log
  • Ratio of licences provided.

Test Management

Goal: Make sure that the software has enough testing.

Question: How thoroughly is the software tested?

Metrics:

  • Are there tests?
  • Number of tests
  • Test coverage
  • Continuous testing?

Project Planning

Goal: Make sure the project has predictable outputs in time, content, and quality.

Question:

Metrics:

  • Number of milestones
  • Predictability
    • Dates announcement
    • Dates achievement

Build and Release Management

Goal: Make sure that

Question:

Metrics:

  • Continuous Integration:
    • Has a CI (continuous integration) server
    • Has a CT (continuous testing) server
  • Number of failed builds
  • Total number of builds

Community

Activity

Goal: Measure the recent activity of the project.

Question: How much has the project been modified recently?

Metrics:

  • Number of commits, computed on the last week, last month, and last three months.
  • Number of committed files, computed on the last week, last month, and last three months.
  • Number of emails exchanged on the dev mailing list, computed on the last week, last month, and last three months.
  • Number of emails exchanged on the user mailing list, computed on the last week, last month, and last three months.
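
All four activity counts are rolling-window counts over the same three periods. A minimal Python sketch of the windowing, with the event timestamps (commits, emails) assumed to come from the SCM log or mailing-list archive:

  # Sketch: counting events over the last week, month, and three months.
  from datetime import datetime, timedelta

  def activity_counts(event_dates, now):
      """Number of events falling in the last 7, 30 and 90 days."""
      windows = {"week": 7, "month": 30, "three months": 90}
      return {name: sum(1 for d in event_dates if now - d <= timedelta(days=days))
              for name, days in windows.items()}

  commits = [datetime(2013, 5, 30), datetime(2013, 5, 10), datetime(2013, 3, 15)]
  print(activity_counts(commits, now=datetime(2013, 6, 1)))
  # {'week': 1, 'month': 2, 'three months': 3}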

Diversity

Goal: Measure the diversity of actors in the project.

Question: How many different users are involved in the project?

Metrics:

  • Number of distinct committers to the SCM repository, computed on the last week, last month, and last three months.
  • Number of distinct authors on the user mailing list, computed on the last week, last month, and last three months.
  • Number of distinct authors on the dev mailing list, computed on the last week, last month, and last three months.

Responsiveness

Goal: Measure the time needed to answer requests.

Question: How fast do users get answers to their support requests?

Metrics:

  • Median response time on the dev mailing list: the median of time to first answer to a request on the dev mailing list, computed on the last week, last month, and last three months.
  • Median response time on the user mailing list: the median of time to first answer to a request on the user mailing list, computed on the last week, last month, and last three months.
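
A minimal Python sketch of the median computation, assuming each answered thread yields a (request, first answer) timestamp pair extracted from the mailing-list archive:

  # Sketch: median time to first answer on a mailing list, in hours.
  from datetime import datetime
  from statistics import median

  def median_response_time(threads):
      """Median of (first answer - request) over all answered threads."""
      hours = [(answer - request).total_seconds() / 3600.0
               for request, answer in threads]
      return median(hours)

  threads = [(datetime(2013, 6, 1, 9), datetime(2013, 6, 1, 11)),   # 2 h
             (datetime(2013, 6, 2, 8), datetime(2013, 6, 2, 20))]   # 12 h
  print(median_response_time(threads))  # 7.0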

Support

Goal: Measure the ability to get support on the project.

Question: How many requests are satisfied?

Metrics:

  • Response ratio on the dev mailing list: the average number of answers to a single request on the dev mailing list, computed on the last week, last month, and last three months.
  • Response ratio on the user mailing list: the average number of answers to a single request on the user mailing list, computed on the last week, last month, and last three months.
  • Number of subjects on the dev mailing list: the number of threads on the dev mailing list, computed on the last week, last month, and last three months.
  • Number of subjects on the user mailing list: the number of threads on the user mailing list, computed on the last week, last month, and last three months.
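
A minimal Python sketch of the response ratio, under the assumption that a thread of n messages counts as one request and n - 1 answers:

  # Sketch: average number of answers to a single request.

  def response_ratio(thread_sizes):
      """Answers per request: a thread of n messages has n - 1 answers."""
      answers = sum(size - 1 for size in thread_sizes)
      return answers / len(thread_sizes)

  print(response_ratio([1, 3, 4]))  # (0 + 2 + 3) / 3 = 1.666...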