
Creating & Maintaining "Application Test Driver Inputs" Data File



Worksoft SaaS provides a powerful feature to create and schedule test cycles, known as the "Test Scheduler" (refer to the related article for more details). Before using this feature, you are expected to create at least two data files: one containing the Test Cycle details (Test Cycle Runner Inputs) and another containing the Application Run Definition inputs (Application Test Driver Inputs).

This article describes the features available in Worksoft SaaS that ease the process of creating and maintaining the "Application Test Driver Inputs" data file. Typically, this file is updated in the following sequence:
  1. One or more tests are automated
  2. These tests are then certified using a local browser and then the primary browser in the Functional Cloud
  3. Testing context and other details are created and added to the Application Test Driver Inputs file
  4. These tests are scheduled for a few days to test for flakiness and to train the Worksoft SaaS machine learning feature
  5. These tests are certified against other browsers/devices
  6. The testing context details of these combinations are added to an existing Application Test Driver Inputs file, or a new one is created
  7. New test cycles are scheduled as per the requirement
This article shows you how to use this feature to help with steps 3 and 6 (it is assumed that you have completed the other steps above before attempting these two.)

There are two ways you can create the file and populate the testing context details at the same time. The first is to initiate the process from the Test Runs home page, and the other is from the Testing Contexts page. In both cases the steps remain the same, but the data populated into the data file can differ. When you add rows to the data file from the

Test Runs page, the following data will be added:
  • Testing Context Key (testingContextKey)
  • Run Definition Name (runDefinitionNameMnemonicForDeveloper)
  • Run Definition Status (runDefinitionStatus)
  • Testing Platform (testingPlatformMnemonicForDeveloper)
  • User Alias (userAliasToScheduleThisRun), only if the run is executed using a User Alias
  • Label Overrides (if any of these are applied to the Run directly, Scenario/RD labels will be ignored)
  • User Defined Overrides (if the selected run is executed using QaCONNECT and UDV overrides are used)
Testing Context page, the following data will be added:
  • Testing Context Key (testingContextKey)
  • Run Definition Name (runDefinitionNameMnemonicForDeveloper)
  • Run Definition Status (runDefinitionStatus)
  • Testing Platform (testingPlatformMnemonicForDeveloper)
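
To make the column lists above concrete, here is a minimal sketch (in Python, purely illustrative: the data file itself is created through the Worksoft SaaS UI, and every value shown below is hypothetical) of the data a row carries depending on where you initiate the process:

# Illustrative sketch only; the actual file is created and managed through
# the Worksoft SaaS UI. The column identifiers come from the lists above,
# and every value shown is hypothetical.
row_added_from_test_runs_page = {
    "testingContextKey": "TCK-001234",                        # hypothetical
    "runDefinitionNameMnemonicForDeveloper": "Product Search",
    "runDefinitionStatus": "WIP",
    "testingPlatformMnemonicForDeveloper": "Chrome-Windows",  # hypothetical
    "userAliasToScheduleThisRun": "qa_user_1",  # only if the run was executed
                                                # using a User Alias
    # ...plus any Label Overrides and User Defined Variable overrides
    # attached to the selected run
}

# A row added from the Testing Context page carries only the first four
# fields; any other columns must be filled in manually afterwards.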
Creating "Application Test Driver Inputs" data file:

You have created the automation and completed certification, and you are now ready to start scheduling test cycles. As a first step, you need to create the Application Test Driver Inputs file and populate it with the required inputs. To accomplish this, follow the steps below:

Step 1: If you have test runs available to use for adding data rows, you can start from the Test Runs home page. Select the runs you would like to be considered and add them to "My Selections" (to add to My Selections, use the "Switch to Select mode" toggle and then add the required runs using the Add to 'My Selections' icon.) Once done, click the blue icon beside the text "Create/Enhance the Test Scheduler's "Application Test Driver Inputs"" to start the process.


If you don't have test runs and want to use testing context(s), navigate to the Testing Contexts tab of the specific Run Definition and select the testing contexts you would like to add. Once done, click the "Add to the Test Scheduler Input" button.



Step 2: On the overlay, do the following:
  • Select "New" for "Target Data File Type"

  • Select the first option for "How do you want rows added" (the second option is meant for when you already have other data files to use; we will discuss this under the Maintaining "Application Test Driver Inputs" section)

  • Click on "Continue"


This takes you to a new tab, the "New Data File" page, where you can review the entries and make any changes before saving. The number of rows populated will be equal to the number of test runs or testing contexts selected in the first step.



Maintaining (appending rows to) the "Application Test Driver Inputs" data file:

When you have automated more test cases and are ready to add them to the Application Test Driver Inputs data file, you can initiate the process from either the Test Runs page or the Testing Contexts page of a Run Definition. Follow these steps to accomplish this.

Step 1: From the Test Runs page, select the required test runs and add them to "My Selections" before clicking the blue icon, OR select the required testing contexts and click the button to initiate the process.

Step 2: On the overlay that shows up, do the following:
  • Select the "Currently existing" option for "Target Data File Type"

  • In "Target Data File", select the data file to which you intend to add the new rows

  • For "How do you want rows added", you used the first option in the earlier section; here you can use either the second or the third option, based on what is required

    • Option 2 allows you to select specific rows in the file selected in "Target Data File Type" and reuse their content, replacing only the testing context key and testing platform name. The advantage of this option is that you avoid having to re-type all the other details you have already populated in the pre-existing rows. For example: you have a Product Search test case automated and earlier added a testing context specific to the Chrome browser. You then manually added more rows to cover 5 different search criteria, using a UDV override to select a specific row from the application test data files. That means you have 5 pre-existing rows for Chrome for one test case. Now, when you use these five rows for cloning, 5 new rows will be added with only the testing context key and platform name modified as per the inputs selected (see the sketch after these steps).

    • Option 3 provides the same functionality as Option 2, with the only change being that the source data file can be different, i.e. you can select pre-existing rows from a file other than the one to which you intend to add the rows. This comes in handy when you maintain two data files, one for functional testing and the other for cross-browser testing.

  • If you selected Option 3 in the previous step, you will see the field "Source Data File", where you select the source data file from which the pre-existing rows will be copied.

  • In the field "Specify a Row Filter", specify the row numbers of the pre-existing rows you would like to use as the source. This is a list of row numbers delimited by "|" (the pipe symbol); for example, 2|5|9 selects rows 2, 5, and 9.

  • Click on "Continue"
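
Conceptually, Options 2 and 3 behave like the sketch below (Python, illustrative only; the function and variable names are not part of the product). The pipe-delimited row filter selects the source rows, each selected row is cloned, and only the testing context key and platform name are replaced:

def parse_row_filter(row_filter: str) -> list[int]:
    # "2|5|9" selects pre-existing rows 2, 5 and 9 as the cloning source
    return [int(n) for n in row_filter.split("|")]

def clone_rows(source_rows, row_filter, new_context_key, new_platform):
    selected = [source_rows[n - 1] for n in parse_row_filter(row_filter)]
    clones = []
    for row in selected:
        clone = dict(row)  # batch labels, UDV overrides, etc. stay as-is
        clone["testingContextKey"] = new_context_key                 # replaced
        clone["testingPlatformMnemonicForDeveloper"] = new_platform  # replaced
        clones.append(clone)
    return clones  # appended at the bottom of the target data file's grid

For Option 2, the source rows come from the target data file itself; for Option 3, they come from the separate "Source Data File" you select.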


This takes you to a new tab, in the context of editing the data file selected in "Target Data File". The new records are populated at the bottom of the grid, highlighted in either light yellow or light red.

Yellow Colour: Rows are highlighted in yellow when the inputs selected (i.e. test runs or testing contexts) have a matching record among the rows selected through the combination of data file and row filter. Matching is based on the combination of Run Definition and status. To explain this with an example, assume you selected a test run or testing context key that belongs to the Run Definition "Product Search" in the "WIP" state, and then selected a row from the "Functional Browser Runs" data file that belongs to the same Run Definition, "Product Search", with status "WIP". The newly added row is highlighted in light yellow, indicating that the values in the pre-existing row were copied over to the new row, with the testing context key and platform name replaced.

Red Colour: Taking the above example, if the corresponding rows in the data file are missing for the selected test runs or testing contexts, that selection is highlighted in red. For such rows, only the bare minimum information is populated, and you will need to add all other details yourself.
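
The matching rule described above can be summarized with a minimal sketch (Python, conceptual only; the field names are the column identifiers introduced earlier in this article):

def highlight_colour(selection, filtered_rows):
    # A selection matches when some pre-existing row (from the data file
    # and row filter combination) has the same Run Definition and status.
    for row in filtered_rows:
        if (row["runDefinitionNameMnemonicForDeveloper"]
                == selection["runDefinitionNameMnemonicForDeveloper"]
                and row["runDefinitionStatus"] == selection["runDefinitionStatus"]):
            return "yellow"  # the matching row's values are copied, then the
                             # testing context key and platform name are replaced
    return "red"  # no match: only bare-minimum columns are populated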



The colour coding of the rows helps you quickly identify the rows you may want to review. In the picture above, you can see two rows in yellow and one in red. You can also see that all details are populated for row #41 (yellow) because these values were available in the pre-existing row.

The picture below shows the content of row #43, for which the matching data file row is missing from the specified filter criteria; you can see that some values, such as batchLabel, are missing. At the same time, you can also see that the user defined variable overrides are populated, because the test run has these details attached.



In addition to the colour coding, the system also presents the following messages to help you review any exceptions.



In the popup shown above, the first message indicates why, and how many, rows are shown in red. The second message states that some of the specified data file rows do not have any corresponding test runs/testing contexts selected, and so they were dropped from being added to the data file.






