
My Project Settings for Analytics Module


Overview

The settings & preferences on the 'Analytics' tab of the 'My Project-level Settings & Preferences' screen allow you to configure the default test cycle displayed on the Analytics Dashboard, the landing view within the Analytics module, and how the counts of run definitions are displayed on the Dashboard.

Analytics Module Preferences & Settings

The settings & preferences are grouped into the following sections:

Criteria for Auto-selection of Default Test Cycle for display on the Test Cycle Analytics Dashboard:

The 'Default Test Cycle' setting allows you either to choose a specific test cycle to land on whenever you go to the Analytics Dashboard, or to have the most recently generated test cycle displayed by default in the 'Analytics Dashboard' within the Worksoft SaaS portal.


By default, the 'Test Cycle for which analytical reports got generated most recently' option is selected. If you wish to have a specific test cycle shown all the time, select that test cycle from the dropdown.
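
To illustrate how this resolution behaves (a minimal sketch in Python with hypothetical field names, not Worksoft's actual implementation):

    def default_test_cycle(cycles, pinned_cycle_id=None):
        """Return the pinned test cycle if one is configured; otherwise
        the cycle whose analytical reports were generated most recently.
        (Illustrative only; 'id' and 'reports_generated_at' are
        hypothetical field names.)"""
        if pinned_cycle_id is not None:
            return next(c for c in cycles if c["id"] == pinned_cycle_id)
        return max(cycles, key=lambda c: c["reports_generated_at"])

    cycles = [
        {"id": "TC-1", "reports_generated_at": "2024-03-01T10:00"},
        {"id": "TC-2", "reports_generated_at": "2024-03-05T16:30"},
    ]
    print(default_test_cycle(cycles)["id"])          # TC-2 (most recent)
    print(default_test_cycle(cycles, "TC-1")["id"])  # TC-1 (pinned)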

Landing View within Analytics module:

(a) Default View: The 'Default Tab' setting allows you to choose the tab you want to land on whenever you go to the home screen of the 'Analytics' module within the Worksoft SaaS portal.


The default is 'Test Cycle Dashboard'. Alternatively, you can choose to land on the 'Analytics Tree View' instead.

(b) Link to the Analytics 'Dashboard' or 'Tree' View from the Test Cycle Outcome Email Notifications:

When you click on the report hyperlink within the Test Cycle Outcome Email Notifications, if you want to be taken directly to the 'Dashboard' view and NOT the 'Tree' view, check off the checkbox labeled "Direct me to the Analytics 'Dashboard' view when I click the hyperlink for the Reports within the Test Cycle Outcome Email Notifications". If you leave this checkbox unchecked, you will be taken to the Analytics 'Tree' view instead.

Count of Run Definitions to display on Test Cycle Dashboard:

By default, the option to use the counts of the run definitions as of the initial generation of the Analytics reports is selected.

You can instead select the other option, 'Use the counts as of the most recent generation of Analytics for a Test Cycle', from the dropdown, based on your preference.

Configure Worksheets to be included within the ‘Test Cycle Level Test Outcomes’ > ‘Transactional Details Report’ (View limited to Project Lead only)

By default, only the "Failed Transaction Log" sheet, along with the Test Cycle Summary information, is available in the 'Transactions Detail Report'. To have additional sheets included in the 'Transactions Detail Report', select those sheets on the 'Analytics' tab in 'My Project Settings' and perform a save; the respective sheets can then be viewed in the 'Transactions Detail Report'.

Please note that this setting is available only for the "Project Lead" role.


Configure 'Test Cycle Level Test Outcomes' > 'Consolidated Triage Report' (View limited to Project Lead only)

By default, only Test Run Level Annotations are auto-selected for printing into the following sheets:

  • "TC_Level_Failed_Runs_Gap_Issues"
  • "Test_Cycle_Level_Net_New_Issues"
  • "Test_Cycle_Level_Unique_Issues"

Additionally, if you want the "Test Run Level User Notes" to also be printed in these 3 sheets, check off the appropriate options from the last 3 checkboxes.

Please note that this setting is available only for the "Project Lead" role.


Configure 'Custom Raw Data Reports' to display in the Dashboard (View limited to Project Lead only)

By default, all the sheets available in the Raw Data Reports dropdown are visible, with "Test Outcomes by Run Definition" as the default report in the Analytics Dashboard. If you do not wish to see some of these reports in the dropdown, uncheck the checkboxes available next to those options. Once you perform "Save", the changes will be reflected in the Analytics Dashboard.

Please note that this setting is available only for the "Project Lead" role.


Auto Triage Process (View limited to Project Lead only)

On enabling the first setting, Worksoft AI/ML will auto-assign the issue(s) to the failed test run(s) only when the test run that is used as the basis for auto-assignment of the Root Cause for Failure has an Issue linked from the Issue Store.
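
The effect of this rule can be pictured as follows (a hedged sketch; the function and field names are hypothetical and not Worksoft's API):

    def auto_assign_root_cause(failed_run, basis_run):
        """Assign issues to a failed run only when the run used as the
        basis for the Root Cause for Failure has at least one Issue
        linked from the Issue Store. (Illustrative sketch only.)"""
        linked_issues = basis_run.get("issue_store_links", [])
        if linked_issues:
            failed_run["auto_assigned_issues"] = list(linked_issues)
        return failed_run

    basis = {"issue_store_links": ["ISSUE-42"]}
    print(auto_assign_root_cause({}, basis))
    # {'auto_assigned_issues': ['ISSUE-42']}
    print(auto_assign_root_cause({}, {}))  # {} -- nothing auto-assigned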

By default, this setting is not selected. Please note that this setting is available only for the "Project Lead" role.


Configure Test Cycle Outcomes and Triage Comparison Report (View limited to Project Lead only)


Please note that these settings are available only for the "Project Lead" role.

Configuration of Worksheets to be Included in the Test Cycle Comparison Reports:

  • Include 'Flakiness Summary': When this option is checked off, the 'Flakiness Summary' sheet will be included in the generated report.
  • Print Separate 'Detail' worksheet for each Target Cycle compared to the Baseline: If you don't select this option, a single sheet will include the detail for the baseline test cycle as well as each of the target test cycles.

Columns to Include in the ‘Outcomes & Triage Summary & Detail’ Worksheets of the Test Cycle Comparison Reports:

  • User Notes for Test Runs
  • Traceability to Application Modules
  • Traceability to Manual Test Cases
  • Traceability to Issues in Atlassian Jira, if integrated
  • Impacted Testing Platform

In your project, you may or may not use (populate) the 'User Notes' for the Tests that you schedule for execution. If you do, you may want to check off the checkbox "User Notes for Test Runs". 

In your project, if you build traceability from the automated tests to the application modules and/or manual test cases and/or Issues in your company's Atlassian Jira account, you can check off the second through the fourth checkboxes. 

Within your project, if you routinely execute the same test on multiple testing platforms, and you want a test run on a specific testing platform within one test cycle to be compared only to a run of the same test on the same platform within a second test cycle, you must check off the last checkbox.

Key Data Attributes that will be used for comparison within Test Cycle Comparison Reports:

  • Include 'Flakiness Summary'
  • User Notes of the Test Run
  • Testing Context Run Definition Key
  • Impacted Testing Platform
  • Application Module (Label)
  • Manual Test Case (Label)

Significance of the Key Data Attributes: The Target Test Cycles will be compared with the Baseline Test Cycle by using the combination of the Key Data Attributes (composite key) you choose.

Please note that, sometimes, the Key Data Attributes that you select above may not be sufficient to perform a comparison between the Baseline and Target Test Cycles. For example, if someone manually executes the same test more than once, all the key information will be the same. In such cases, Worksoft SaaS will generate and use a sequence number within each key, ordered by the datetime of execution.
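
The composite-key matching described above can be sketched roughly as follows (illustrative Python only; attribute and field names are hypothetical, not Worksoft's implementation):

    from collections import defaultdict

    def build_comparison_keys(runs, key_attributes):
        """Group test runs by the chosen composite key; duplicates are
        disambiguated with a sequence number ordered by execution
        datetime, as described above."""
        groups = defaultdict(list)
        for run in runs:
            composite = tuple(run[attr] for attr in key_attributes)
            groups[composite].append(run)
        keyed = {}
        for composite, members in groups.items():
            members.sort(key=lambda r: r["executed_at"])
            for seq, run in enumerate(members, start=1):
                keyed[composite + (seq,)] = run
        return keyed

    # The same test executed twice yields identical key information,
    # so the sequence number keeps the two runs distinct.
    runs = [
        {"run_definition_key": "RD-101", "platform": "Chrome/Windows",
         "executed_at": "2024-01-10T09:00", "outcome": "Passed"},
        {"run_definition_key": "RD-101", "platform": "Chrome/Windows",
         "executed_at": "2024-01-10T11:30", "outcome": "Failed"},
    ]
    keyed = build_comparison_keys(runs, ["run_definition_key", "platform"])
    # Keys: ('RD-101', 'Chrome/Windows', 1) and ('RD-101', 'Chrome/Windows', 2)

Entries with the same composite key and sequence number in the Baseline and a Target test cycle are then compared against each other.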

"Do not show the ‘Advanced Options’ to the Users when requesting the generation of the Test Cycle Comparison Reports": When this setting is checked off, will prevent users that attempt to generate the report to override/deviate from the project level values for the above mentioned settings. The project level settings will be used to generate the report.

Configure Auto Purging of Reports (View limited to Project Lead only)

The settings under this section control how long after their generation the functional testing test cycle outcome reports and, if applicable, the performance test cycle reports are retained before being automatically purged. This feature helps keep the Analytics Tree View decluttered; without it, a long list of test cycles could show up within your tree view, making it difficult to find the specific test cycle you may be looking for. A large backlog of retained test cycles can also make report generation for new test cycles slower.


You can configure these settings separately for Quality > Test Cycle Test Outcomes and Performance > Test Cycle Test Outcomes. It is possible that you don't run Performance Test Cycles as often as you do Quality Test Cycles, so the number of test cycles that get generated in a certain period can vary between the two. To account for this, the system allows you to set the retention period for each separately.

  • Setting: "# of days after which Quality > Test Cycle Test Outcomes, Test Cycle Clipboard and Test Cycle Comparison Reports will be auto purged" - default = 15 days; max duration = 90 days
  • Setting: "# of days after which Performance > Test Cycle Test Outcomes, and Scheduling & Preprocessing Reports will be auto purged" - default = 90 days; max duration = 365 days

