To function effectively, each dashboard requires a reliable source of data. Datasets serve as the foundational structures for organizing and presenting this data. Continuous Testing provides a diverse array of datasets to support various use cases, ensuring that users have access to relevant and structured information. For more information about datasets, see Managing Datasets.
The following are the Continuous Testing out-of-the-box datasets:
- Reservation details
- Session details
- Test execution details
Known Issue
In a dashboard, the preview data does not match the actual data. The values displayed by the Show Data option do not align with the actual attribute and metric values.
Reservation details
The reservation details dataset comprehensively records all reservations made within the cloud environment, enabling cloud administrators to track resource utilization, manage costs, monitor compliance, and optimize resource allocation efficiently. It has attributes and metrics like reservation ID, reservation date, reservation duration, and reservation status. This dataset helps you gain insights into business scenarios such as:
- How many reservations without a session exist per project and user? What is the amount of time spent without a session per project and user?
- What is the trend of reservations per user for a selected project?
The following are the components of this dataset:
Attributes
Attribute Name | Description |
---|---|
Calendar month | Gregorian calendar month displayed in the format ‘Mon YYYY’ |
Reservation | Unique number of reservation |
Reservation date | Date when a reservation was made or initiated within the cloud environment |
Reservation environment | Environment in which the reservation is made |
Reservation project | The project for which the device is reserved |
Reserved By | The project or user who reserved the device |
Session flag | Flag type of the session |
Sys_source | Unique identifier for the source |
Metrics
Metric Name | Description | Formula |
---|---|---|
Rank of scheduled duration | Rank assigned to the scheduled duration within the context of a device reservation | <Rank ASC="False" />([Scheduled duration (Hours)]) |
Scheduled Duration | Scheduled time for which a resource or service is reserved | Scheduled end timestamp - Scheduled start timestamp |
Scheduled Duration (Hours) | Time duration from the scheduled reservation start date to the scheduled reservation end date, in hours | ([Reservation Scheduled Duration] / 3600.0) |
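The duration and rank metrics above boil down to simple arithmetic on the scheduled start and end timestamps. The following Python sketch illustrates that arithmetic under assumed field names (reservation, scheduled_start, and scheduled_end are illustrative, not the dataset's actual column names):

```python
# Minimal sketch: deriving the Reservation details duration and rank metrics
# from raw reservation records. Field names are illustrative only.
from datetime import datetime

reservations = [
    {"reservation": "R-1001",
     "scheduled_start": datetime(2024, 5, 1, 9, 0),
     "scheduled_end": datetime(2024, 5, 1, 13, 0)},
    {"reservation": "R-1002",
     "scheduled_start": datetime(2024, 5, 1, 10, 0),
     "scheduled_end": datetime(2024, 5, 1, 11, 30)},
]

for r in reservations:
    # Scheduled Duration: elapsed time between the scheduled start and end timestamps.
    duration = r["scheduled_end"] - r["scheduled_start"]
    # Scheduled Duration (Hours): the same duration expressed in hours (seconds / 3600).
    r["scheduled_duration_hours"] = duration.total_seconds() / 3600.0

# Rank of scheduled duration: rank reservations by descending scheduled duration,
# mirroring the ASC="False" rank in the metric formula.
ranked = sorted(reservations, key=lambda r: r["scheduled_duration_hours"], reverse=True)
for rank, r in enumerate(ranked, start=1):
    r["rank_of_scheduled_duration"] = rank

for r in ranked:
    print(r["reservation"], r["scheduled_duration_hours"], r["rank_of_scheduled_duration"])
```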
Session details
The session details dataset provides comprehensive insights into user activity and session management within the cloud environment. It enables cloud administrators to monitor user behavior, identify security threats, troubleshoot issues, and ensure compliance with access policies and regulations. This dataset helps you gain insights into business scenarios such as:
- Is there any device that is not used for any tests in my cloud? Should I encourage my testers to use it, or replace it with another device?
- Is there any device that is used for many tests in my cloud? Should I add more devices of this model/version?
- What kind of test is run for each project in the cloud?
- What is the trend of my license usage in the cloud?
The following are the components of this dataset:
Attributes
Attribute Name | Description |
---|---|
Calendar date | Gregorian calendar date displayed in the format ‘M/D/YYYY’ |
Calendar month | Gregorian calendar month displayed in the format ‘Mon YYYY’ |
Device | Name of the device |
Lagging count of months | Count of months from the first data record to the current month |
Reservation | Unique number of reservation |
Scheduled start date | Date when a session is planned or scheduled to begin |
Session | Id of the session |
Session environment | Environment in which a session is taking place |
Session project | Project associated with the session’s activities |
Session status | Status of the session |
Session type | Type of the session |
Sys_source | Unique identifier for the source |
Metrics
Metric Name | Description | Formula |
---|---|---|
Rank of session duration | Rank assigned to the session duration within the context of a session | RankBreakBy={@auto}([Session duration (Hours)]) |
Rank of session duration description | Description of the session duration rank | <Rank ASC="False" />([Session duration (Hours)]) |
Session Duration | Time that a session remains active or in progress | Session end timestamp - Session start timestamp |
Session Duration (Hours) | Time duration from session creation to session close, in hours | ([Session Duration] / 3600.0) |
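The session metrics follow the same pattern, with the rank optionally restarted per group (the break-by in the rank formula). The following Python sketch illustrates this under assumed field names (session, project, start, and end are illustrative, not the dataset's actual column names):

```python
# Minimal sketch: deriving the Session details duration metric and a per-project
# rank, similar in spirit to a rank with a break-by. Field names are illustrative only.
from datetime import datetime
from itertools import groupby

sessions = [
    {"session": "S-1", "project": "Web",
     "start": datetime(2024, 5, 1, 9, 0), "end": datetime(2024, 5, 1, 10, 30)},
    {"session": "S-2", "project": "Web",
     "start": datetime(2024, 5, 1, 11, 0), "end": datetime(2024, 5, 1, 11, 20)},
    {"session": "S-3", "project": "Mobile",
     "start": datetime(2024, 5, 1, 9, 0), "end": datetime(2024, 5, 1, 12, 0)},
]

for s in sessions:
    # Session Duration (Hours): elapsed time between session start and end,
    # converted to hours (seconds / 3600).
    s["duration_hours"] = (s["end"] - s["start"]).total_seconds() / 3600.0

# Rank of session duration: rank sessions by descending duration. A break-by
# restarts the ranking within each group, here per session project.
sessions.sort(key=lambda s: s["project"])
for _, group in groupby(sessions, key=lambda s: s["project"]):
    members = sorted(group, key=lambda s: s["duration_hours"], reverse=True)
    for rank, s in enumerate(members, start=1):
        s["rank_within_project"] = rank

for s in sessions:
    print(s["session"], s["project"], round(s["duration_hours"], 2), s["rank_within_project"])
```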
Test execution details
The Test execution details dataset offers vital insights into software testing, aiding teams in monitoring progress, identifying improvement areas, and assessing product quality. As a Quality Assurance Manager, you can use it to track test execution, assess test case effectiveness, and gauge application stability. Key scenarios include analyzing test coverage, monitoring release stability, identifying defects’ root causes, optimizing resource allocation, and driving continuous improvement initiatives. This dataset helps you gain insights into business scenarios such as:
- View the trend of test executions across the test application platform
- View the test execution status by devices and the execution report details (see the sketch after this list)
- Understand the test execution overview at various application levels and the test device execution details
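As an illustration of the second scenario, the sketch below groups executions by device and status, assuming the dataset is exported as simple rows; the field names (test, device, status) are illustrative, not the actual export schema:

```python
# Minimal sketch: summarizing test execution status by device, similar to a
# dashboard grid that crosses Test device with Test execution status.
# Field names are illustrative only.
from collections import Counter

executions = [
    {"test": "Login flow", "device": "Galaxy S5", "status": "Passed"},
    {"test": "Checkout flow", "device": "Galaxy S5", "status": "Failed"},
    {"test": "Login flow", "device": "iPad Air", "status": "Passed"},
    {"test": "Search flow", "device": "iPad Air", "status": "Passed"},
]

# Count executions per (device, status) pair.
by_device_status = Counter((e["device"], e["status"]) for e in executions)

for (device, status), count in sorted(by_device_status.items()):
    print(f"{device}: {status} = {count}")
```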
The following are the components of this dataset:
Attributes
Attribute Name | Description |
---|---|
Application Build Version | The build version of the test application |
Application Platform Name | Name of the mobile platform |
Application Release Version | The release version of the test application |
Calendar Month | Gregorian calendar month displayed in the format ‘Mon YYYY’ |
Date | Gregorian calendar date displayed in the format ‘D/M/YYYY’ |
Device Name | Name of the device, such as iPad Air or Galaxy S5 |
Month | Gregorian calendar month displayed in the format ‘Mon YYYY’ |
Project | Name of the project |
Sys_source | Unique identifier for the source |
Test | Name of the test |
Test application | Application that is being tested |
Test device | Name of the test device |
Test Device Category | Category of the test device, such as Phone or Tablet |
Test Device Manufacturer | Manufacturer of the device on which the test is run |
Test Device Model | Model of the device on which the test is run |
Test Device OS | Operating system version of the device on which the test is run |
Test Device Screen size | Screen size of the device on which the test is run |
Test Execution Cause | Root cause for the test execution failure |
Test Execution environment | Environment in which a test is taking place |
Test execution report url | URL of the test execution report (combination of the test URL and name) |
Test execution status | Current status of the test execution |
Test number | Unique identifier for the test |