Test dashboard (CMMI)
The Test dashboard displays five reports that let you monitor test activities, report on progress, find gaps in test coverage, and identify test areas that might require additional investigation. The data covers testing that has occurred in the past four weeks.
Requirements
You can access the Test dashboard and all other dashboards only if your team project portal has been enabled and configured to use SharePoint Server Enterprise Edition. For more information, see Dashboards.
To view the dashboard, you must be assigned, or belong to a group that has been assigned, the Read permission in SharePoint Products for the team project. To modify, copy, or customize a dashboard, you must be assigned, or belong to a group that has been assigned, the Members permission in SharePoint Products for the team project.
To view a work item, you must be a member of the Readers group or your View work items in this node permission must be set to Allow. To create or modify a work item, you must be a member of the Contributors group or your Edit work items in this node permission must be set to Allow.
For more information, see Add users to team projects.
To modify a report in Office Excel, you must be a member of the TfsWarehouseDataReaders security role in SQL Server Analysis Services and you must be assigned or belong to a group that has been assigned the Members permission in SharePoint Products for the team project. For more information, see Grant permissions to view or create reports in TFS.
The Test Plan Progress, Test Case Readiness, Requirement Test Status, and Test Activity reports are available only when the team creates test plans and runs tests as described in Plan Manual Tests using Team Web Access.
Data that appears in the Test dashboard
You can use the Test dashboard to understand how well the team is progressing in testing the requirements. To learn about the Web Parts that are displayed on the Test dashboard, refer to the illustration and the table that follow.
The burndown, progress, and trend charts and reports do not appear when the server that hosts Analysis Services for the team project is not available.
Web Part | Data displayed
---|---|
Test Plan Progress | Stacked area graph of the test results of all tests, grouped into their most recent recorded outcome during the past four weeks. Outcomes include Never Run, Blocked, Failed, and Passed.
Test Case Readiness | Stacked area graph that shows how many test cases have been in the Design or Ready state for the most recent four weeks.
Requirement Test Status | Horizontal bar chart that shows the count of test results for each combination of test case and test configuration that is defined for each requirement. The chart groups the test results according to their most recent test run, where the options are Passed (green), Failed (red), Blocked (purple), or Not Run (gray).
Manual Test Activity | Line chart that shows the cumulative count of all results run for all manual test cases during the most recent four weeks.
Test Failure Analysis | Stacked area graph that shows the cumulative count of all failed outcome results for tests, sorted by failure type, during the most recent four weeks. Failure types include Regression, New Issue, and Known Issue.
Upcoming Events | List of upcoming events. This list is derived from a SharePoint Web Part.
Project Work Items | Count of active, resolved, and closed work items. You can open the list of work items by choosing a number. This list is derived from a Team Web Access Web Part.
Recent Builds | List of recent builds and their build status. You can view more details by choosing a specific build. This list is derived from a Team Web Access Web Part. Legend icons indicate build status: not started, in progress, succeeded, failed, stopped, or partially succeeded.
Recent Check-ins | List of the most recent check-ins. You can view more details by choosing a specific check-in. This list is derived from a Team Web Access Web Part.
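The active, resolved, and closed counts that the work-item Web Part surfaces can be reproduced with a flat-list work item query. The following WIQL is a sketch of such a query using standard system fields; the Web Part's actual query is not documented here:

```sql
SELECT [System.Id], [System.Title], [System.State]
FROM WorkItems
WHERE [System.TeamProject] = @project
  AND [System.State] IN ('Active', 'Resolved', 'Closed')
ORDER BY [System.State]
```

Grouping the results by [System.State] yields the three counts shown on the dashboard.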
Required activities for tracking testing
For the reports in the Test dashboard to be useful and accurate, the team must perform the following activities:
Define test cases and requirements, and create Tested By links from test cases to requirements.
For manual tests, mark the results of each validation step in the test case as passed or failed.
Important
Testers must mark a test step with a status if it is a validation test step. The overall result for a test reflects the status of all the validation steps. Therefore, the test has a status of failed if the tester marked any validation step as failed or left it unmarked.
For automated tests, each test is automatically marked as passed or failed.
(Optional) To support filtering, assign Iteration and Area paths to each test case.
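The pass/fail rule described above can be sketched in a few lines of Python. This is an illustrative model only, not the logic that TFS itself runs: one failed or unmarked validation step fails the whole test.

```python
def overall_result(validation_steps):
    """Return a test's overall outcome from its validation steps.

    `validation_steps` maps each validation step name to its recorded
    status: "passed", "failed", or None when the tester left it unmarked.
    Per the rule above, any failed or unmarked step fails the whole test.
    """
    if any(status != "passed" for status in validation_steps.values()):
        return "failed"
    return "passed"
```

For example, `overall_result({"step 1": "passed", "step 2": None})` returns "failed" because step 2 was never marked.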
Monitoring test progress
You can use the first three reports in the Test dashboard to monitor test progress and answer these questions:
Report | Questions answered | Notes
---|---|---|
Test Case Readiness | |
Test Plan Progress | |
Requirement Test Status | |
Determining gaps in testing
You can use the Requirement Test Status report to determine whether tests are covering all the code and to answer the following questions:
Which requirements have a low overall count of test cases?
Which requirements have a high overall count of test cases that are blocked or have never been run?
Does the test case coverage for each requirement meet expectations?
Which requirements have a high rate of test failures?
What is the average number of test cases that are defined for each requirement?
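The arithmetic behind the last two questions can be sketched as follows. The requirement names, outcomes, and low-coverage threshold below are illustrative only; the report itself derives this data from Tested By links in the TFS warehouse.

```python
# Hypothetical mapping from requirement to the most recent outcomes of
# its test cases; the real report reads this data from TFS.
coverage = {
    "Requirement A": ["Passed", "Passed", "Failed"],
    "Requirement B": ["Never Run"],
    "Requirement C": [],
}

# Requirements with a low overall count of test cases
# (the threshold of 2 is an arbitrary example).
low_coverage = sorted(req for req, cases in coverage.items() if len(cases) < 2)

# Average number of test cases defined per requirement.
average_cases = sum(len(cases) for cases in coverage.values()) / len(coverage)
```

Here `low_coverage` flags Requirements B and C, and `average_cases` is 4/3, since four test cases are spread across three requirements.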
Monitoring test failures and regressions
By monitoring test failures, you can identify and address problems in the code early. You can use the last two reports in the Test dashboard to gain better insight into the number of tests that are failing.
Report | Questions answered | Notes
---|---|---|
Manual Test Activity | | The Manual Test Activity report indicates the results of each test case run for each test configuration and for all test plans. Spikes can be early indicators of problems in either the test activity or the quality of code that the team is checking in. You might want to check the metrics for recent builds, bug status, and code churn to determine whether any of them help explain the changes.
Test Failure Analysis | | A healthy Test Failure Analysis report shows moderate numbers of new issues, known issues, and regressions. If spikes occur in these areas, the team might need to investigate further. Spikes might indicate problems in either the test activity or the quality of code that the team is checking in. You might also want to check the metrics for recent builds, bug status, and code churn to determine whether any of them help explain the changes.
Customizing the Test dashboard
Here’s how you can customize the Test dashboard:
Change the filters of each report in Office Excel to focus on specific product areas or iterations.
Filter the Manual Test Activity report in Office Excel for specific test plans, or for test cases that are either manual or automated.
Add existing Excel reports such as Bug Status, Code Churn, and Code Coverage to the dashboard.
Create and add reports in Office Excel that show progress by specific members of the team. For an example, see Bugs by Assignment Excel Report.
For more information about how to work with and customize reports in Office Excel, see the Microsoft Web site.
See Also
Concepts
Running Manual Tests using Team Web Access