Test dashboard (CMMI)

The Test dashboard displays five reports that let you monitor test activities, report on progress, find gaps in test coverage, and identify test areas that might require additional investigation. The data covers testing that has occurred in the past four weeks.

In this topic

  • Data that appears in the dashboard

  • Required activities for tracking testing

  • Monitoring test progress

  • Determining gaps in testing

  • Monitoring test failures and regressions

  • Customizing the test dashboard

Use this dashboard to answer the following questions:

  • Is the authoring of test cases on track?

  • Has the team defined test cases for all requirements?

  • What are the proportions of test cases that are passing, failing, and blocked?

  • Do test failure metrics indicate a problem that requires further investigation?

  • What is the status of last night's build?

  • What are the most recent check-ins?

Requirements

  • You can access the Test dashboard and all other dashboards only if your team project portal has been enabled and is configured to use SharePoint Server Enterprise Edition. For more information, see Dashboards.

  • To view the dashboard, you must be assigned the Read permission in SharePoint Products for the team project, or belong to a group that has been assigned that permission. To modify, copy, or customize a dashboard, you must be assigned the Members permission in SharePoint Products for the team project, or belong to a group that has been assigned that permission.

    To view a work item, you must be a member of the Readers group or your View work items in this node permission must be set to Allow. To create or modify a work item, you must be a member of the Contributors group or your Edit work items in this node permission must be set to Allow.

    For more information, see Add users to team projects.

  • To modify a report in Office Excel, you must be a member of the TfsWarehouseDataReaders security role in SQL Server Analysis Services, and you must be assigned the Members permission in SharePoint Products for the team project or belong to a group that has been assigned that permission. For more information, see Grant permissions to view or create reports in TFS.

  • The Test Plan Progress, Test Case Readiness, Requirement Test Status, and Test Activity reports are available only when the team creates test plans and runs tests as described in Plan Manual Tests using Team Web Access.

Data that appears in the Test dashboard

You can use the Test dashboard to understand how well the team is progressing in testing the requirements. To learn about the Web Parts that are displayed on the Test dashboard, refer to the illustration and the table that follow.

Test Dashboard

Note: The burndown, progress, and trend charts and reports (Web Parts 1 through 5) do not appear when the server that hosts Analysis Services for the team project is not available.

Web Part 1: Test Plan Progress Excel Report

Data displayed: Stacked area graph of the test results of all tests, grouped by their most recent recorded outcome during the past four weeks. Outcomes include Never Run, Blocked, Failed, and Passed.

Related topic: Test Plan Progress Report

Web Part 2: Test Case Readiness Excel Report

Data displayed: Stacked area graph that shows how many test cases have been in the Design or Ready state for the most recent four weeks.

Related topic: Test Case Readiness Report

Web Part 3: Requirement Test Status Excel Report

Data displayed: Horizontal bar chart that shows the count of test results for each combination of test case and test configuration that is defined for each requirement. The chart groups the test results according to their most recent test run: Passed (green), Failed (red), Blocked (purple), or Not Run (gray).

Related topic: Requirement Test Status Excel Report (CMMI)

Web Part 4: Test Activity Excel Report

Data displayed: Line chart that shows the cumulative count of all test results recorded for all manual test cases during the most recent four weeks.

Related topic: Test Activity Excel Report

Web Part 5: Failure Analysis Excel Report

Data displayed: Stacked area graph that shows the cumulative count of all failed test results, grouped by failure type, during the most recent four weeks. Failure types include Regression, New Issue, and Known Issue.

Related topic: Failure Analysis Excel Report

Web Part 6: Import Events Web Part

Data displayed: List of upcoming events. This list is derived from a SharePoint Web Part.

Related topic: Not applicable

Web Part 7: Project Work Items

Data displayed: Count of active, resolved, and closed work items. You can open the list of work items by choosing a number. This list is derived from a Team Web Access Web Part.

Related topic: CMMI process template work item types and workflow

Web Part 8: Recent Builds Web Part

Data displayed: List of recent builds and their build status. You can view more details by choosing a specific build. This list is derived from a Team Web Access Web Part.

Legend:

  • Build In Progress: Build in progress

  • Build Not Started: Build not started

  • Build Succeeded: Build succeeded

  • Build Failed: Build failed

  • Build Stopped: Build stopped

  • Build Partially Succeeded: Build partially succeeded

Related topic: Managing and Reporting on Builds

Web Part 9: Recent Checkins Web Part

Data displayed: List of the most recent check-ins. You can view more details by choosing a specific check-in. This list is derived from a Team Web Access Web Part.

Related topic: Develop code and manage pending changes

Required activities for tracking testing

For the reports in the Test dashboard to be useful and accurate, the team must perform the following activities:

  • Define test cases and requirements, and create Tested By links from test cases to requirements.

  • Define test plans, and assign test cases to test plans.

  • For manual tests, mark the results of each validation step in the test case as passed or failed.

    Important

    Testers must mark each validation step in a test case with a status. The overall result for a test reflects the status of all the validation steps that the tester marked; a test is marked as failed if the tester marked any validation step as failed or left it unmarked. (A minimal sketch of this rule appears after this list.)

    For automated tests, each test is automatically marked as passed or failed.

  • (Optional) To support filtering, assign Iteration and Area paths to each test case.
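
The failed-if-unmarked rule described in the Important note is easy to get wrong when scripting against exported results. The following Python sketch illustrates the aggregation; the step records and field names are hypothetical stand-ins, not an actual TFS schema.

```python
# Sketch of the overall-outcome rule described in the Important note.
# The step records and their field names are hypothetical; adapt them
# to however your tooling exports test step results.

def overall_outcome(steps):
    """Return 'Passed' only if every validation step is marked Passed.

    A test is Failed if any validation step is marked Failed or left
    unmarked; action-only (non-validation) steps are ignored.
    """
    for step in steps:
        if not step.get("is_validation"):
            continue                          # action step, no mark required
        if step.get("outcome") != "Passed":   # Failed or unmarked
            return "Failed"
    return "Passed"

# Example: a single unmarked validation step fails the whole test.
steps = [
    {"is_validation": True, "outcome": "Passed"},
    {"is_validation": False},                  # action step
    {"is_validation": True, "outcome": None},  # unmarked validation step
]
print(overall_outcome(steps))  # -> Failed
```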

Monitoring test progress

You can use the first three reports in the Test dashboard to monitor test progress and answer these questions:

Report: Test Case Readiness

Questions answered:

  • How many test cases has the test team defined?

  • How many test cases are ready to run today?

  • How many test cases must the team still write and review?

  • Does the overall number of test cases appear to be sufficient for the number of requirements that the team is implementing?

  • What percentage of test cases can the test team run today?

  • Will the team be able to prepare all the test cases by the end of the iteration?

Notes:

  • Healthy progress shows a steady increase in the number of Test Cases that the team is designing and moving to the Ready state.

  • Unhealthy progress shows that few or no Test Cases are ready to be run.

    When all test cases remain in the Design state for a long time, an issue might be blocking progress. You might want to investigate the cause of the blockage.

  • A gap in testing might develop if the number of Test Cases does not appear sufficient.

    The number of Test Cases that are defined for a project should be equal to or larger than the number of Requirements that the team is implementing.

Report: Test Plan Progress

Questions answered:

  • How many test cases are passing?

  • How many test cases are failing?

  • How many test cases are blocked?

  • How many test cases have never been run?

  • What percentage of test cases are passing across all test plans?

  • How much testing has the team completed?

  • Is the team likely to finish the testing on time?

Notes:

  • As the development cycle progresses, more Test Cases should pass, and fewer Test Cases should stay in other states.

  • Unhealthy progress occurs when too many Test Cases fail. Depending on where you are in the product cycle, you might investigate why so many test cases are failing.

  • If the number of Test Cases that are failing or have never been run stays flat, you might want to investigate the specific causes that affect each area.

Report: Requirement Test Status

Questions answered:

  • Are test cases being run for each requirement?

  • If test cases are blocked or not being run, does the team understand the blocking issues, and are they being addressed?

Notes:

  • Healthy progress shows that most Test Cases for each Requirement are passing.

  • Unhealthy progress is indicated when too many Test Cases for a specific Requirement are in a Never Run, Blocked, or Failed state. You might want to investigate the causes that keep the Test Cases that are defined for a Requirement from passing.

Determining gaps in testing

You can use the Requirement Test Status report to determine whether tests cover all the requirements and to answer the following questions (an illustrative sketch follows the list):

  • Which requirements have a low overall count of test cases?

  • Which requirements have a high overall count of test cases that are blocked or have never been run?

  • Does the test case coverage for each requirement meet expectations?

  • Which requirements have a high rate of test failures?

  • What is the average number of test cases that are defined for each requirement?
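
If you export the underlying rows, for example from the Excel report or the relational warehouse, you can compute these answers directly. The sketch below is illustrative only; the row format, requirement names, and thresholds are assumptions, not an actual TFS schema.

```python
# Illustrative gap analysis over exported (requirement, test case, outcome)
# rows. The row format and thresholds are hypothetical; adapt to your export.
from collections import defaultdict

rows = [
    # (requirement, test case, most recent outcome)
    ("Checkout",  "TC1", "Passed"),
    ("Checkout",  "TC2", "Failed"),
    ("Search",    "TC3", "Never Run"),
    ("Search",    "TC4", "Blocked"),
    ("Reporting", "TC5", "Passed"),
]

by_req = defaultdict(list)
for req, _test_case, outcome in rows:
    by_req[req].append(outcome)

total = sum(len(v) for v in by_req.values())
print(f"Average test cases per requirement: {total / len(by_req):.1f}")

for req, outcomes in by_req.items():
    n = len(outcomes)
    unexercised = sum(o in ("Never Run", "Blocked") for o in outcomes)
    failed = sum(o == "Failed" for o in outcomes)
    if n < 2:                 # thresholds are a team choice, not a rule
        print(f"{req}: low test case count ({n})")
    if unexercised / n > 0.5:
        print(f"{req}: {unexercised} of {n} test cases blocked or never run")
    if failed / n > 0.3:
        print(f"{req}: high failure rate ({failed} of {n})")
```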

Monitoring test failures and regressions

By monitoring test failures, you can identify and address problems in the code early. You can use the last two reports in the Test dashboard to gain better insight into the number of tests that are failing.

Report: Manual Test Activity

Questions answered:

  • Is the number of tests that the team has never run decreasing?

  • Is the team minimizing the overall number of blocked tests?

  • Are fewer tests failing over time?

  • Are more tests passing?

  • Does the test activity contain spikes that you cannot account for?

Notes:

The Manual Test Activity report indicates the results for each Test Case run for each test configuration and for all test plans. Spikes that occur might be early indicators of problems in either the test activity or the quality of the code that the team is checking in.

You might want to check the metrics for recent builds, bug status, and code churn to determine whether any of them can help explain the changes.

Report: Test Failure Analysis

Questions answered:

  • How many tests are regressing?

  • Is the team keeping the overall number of regressions or test failures within expected ranges or team goals?

  • Is the team addressing new and known issues in a timely manner as they are identified?

Notes:

A healthy Test Failure Analysis report shows moderate numbers of new issues, known issues, and regressions. If any spikes occur in these areas, the team might need to investigate further. Spikes might indicate problems in either the test activity or the quality of the code that the team is checking in.

Also, you might want to check the metrics for recent builds, bug status, and code churn to determine whether any of them can help explain the changes.
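
The three failure types follow a common pattern: a failure of a test that previously passed is a regression, a failure already linked to an open bug is a known issue, and anything else is a new issue. The Python sketch below illustrates that classification with hypothetical inputs; it is a simplification for reasoning about the chart, not the report's actual logic.

```python
# Simplified classification of failing tests into the three failure
# types that the report charts. All inputs are hypothetical.

def classify_failure(test_id, previous_outcomes, known_issue_tests):
    """Classify a failing test as Known Issue, Regression, or New Issue.

    previous_outcomes: most recent prior outcome keyed by test id.
    known_issue_tests: ids of tests whose failures are linked to open bugs.
    """
    if test_id in known_issue_tests:
        return "Known Issue"
    if previous_outcomes.get(test_id) == "Passed":
        return "Regression"   # was passing, now failing
    return "New Issue"

previous = {"TC1": "Passed", "TC2": "Failed"}
known = {"TC2"}
for failing_test in ("TC1", "TC2", "TC3"):
    print(failing_test, "->", classify_failure(failing_test, previous, known))
# TC1 -> Regression, TC2 -> Known Issue, TC3 -> New Issue
```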

Customizing the Test dashboard

Here’s how you can customize the Test dashboard:

  • Change the filters of each report in Office Excel to focus on specific product areas or iterations.

  • Filter the Manual Test Activity report in Office Excel for specific test plans or on Test Cases that are either manual or automated.

  • Add existing Excel reports such as Bug Status, Code Churn, and Code Coverage to the dashboard.

  • Create and add reports in Office Excel that show progress by specific members of the team. For an example, see Bugs by Assignment Excel Report.

For more information about how to work with and customize reports in Office Excel, see the documentation for Office Excel on the Microsoft Web site.

See Also

Concepts

Running Manual Tests using Team Web Access

Test Case Readiness Report

Test Plan Progress Report

Other Resources

Dashboards (CMMI)