After you design tests in the Test Cases module, you create a test scenario tree. A test scenario tree enables you to organize your testing process by grouping test scenarios into folders and arranging them at different hierarchical levels in the Test Scenario module. To do this, examine your application, system environment, and testing process to outline a testing strategy for achieving your goals.
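To picture this, the short Python sketch below models a test scenario tree as nested folders; it is a conceptual illustration only, and the folder and scenario names are made-up examples, not defaults or structures from the tool.

# Illustrative sketch only: a test scenario tree modeled as nested folders.
# Folder and scenario names are hypothetical, not product defaults.
test_scenario_tree = {
    "Online Store": {                                   # root folder for the application
        "Sanity": ["Login", "Home Page Loads"],
        "Regression": {
            "Payments": ["Transfer Funds", "Invalid Amount"],
            "Accounts": ["Create Account", "Close Account"],
        },
        "Function": ["Report Export"],
    }
}

def print_tree(node, indent=0):
    # Walk the nested folders and print scenarios at each hierarchical level.
    if isinstance(node, dict):
        for folder, children in node.items():
            print("  " * indent + folder + "/")
            print_tree(children, indent + 1)
    else:  # a plain list of test scenarios inside this folder
        for scenario in node:
            print("  " * indent + scenario)

print_tree(test_scenario_tree)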

 

After you define test scenarios, you can begin to run and verify your tests. You can then use Auto Prudent test to view and analyze the results of your tests and confirm that the test scenario workflow is working as expected.

 

The following are examples of general categories of test sets you can create, along with a description of each:


Sanity: Checks the entire application at a basic level, focusing on breadth rather than depth, to verify that the application is functional and stable. This set includes fundamental tests that contain positive checks, validating that the application is functioning properly.

Regression: Tests the system in a more in-depth manner than a sanity set. This set can include both positive and negative checks. Negative tests attempt to fail an application to demonstrate that the application is not functioning properly.

Advanced: Tests both breadth and depth. This set covers the entire application and also tests the application's advanced options. You can run this set when there is ample time for testing.

Function: Tests a subsystem of an application. This could be a single feature or a group of features.



Adding Tests to a Test Scenario

 

After you define a test scenario, you can add test case instances to it.

To add a test to a test scenario:

1. Make sure the Test Scenario module is displayed.


If the Test Scenario module is not displayed, click the Test Scenario button on the Top Menu.



2. Display the test plan tree if it is not already displayed.


Click the Select Tests button. The right pane displays the test plan tree.



3. Choose the test cases in the right pane and move them to the left pane, as displayed below.



4. To ensure that the added test cases are in the correct order and can run seamlessly one by one, verify the environment against which you want to execute them. You can rearrange the test cases by dragging them up and down; the sketch below illustrates the resulting sequential run.
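As referenced in step 4, the following Python sketch illustrates the idea of running an ordered list of test case instances one after another. The TestCase class, the run() callable, and the statuses are illustrative assumptions for this sketch, not the tool's actual API.

# Conceptual sketch only: the test cases pulled into a scenario run one by one,
# in the same order they appear in the left pane. Names and statuses are illustrative.
from dataclasses import dataclass
from typing import Callable

@dataclass
class TestCase:
    name: str
    run: Callable[[], str]          # assumed to return "Passed" or "Failed"

def run_scenario(ordered_test_cases):
    # Execute each test case in list order and collect the statuses.
    results = {}
    for test_case in ordered_test_cases:
        results[test_case.name] = test_case.run()
    return results

# Rearranging test cases in the UI corresponds to reordering this list.
scenario = [
    TestCase("Login", lambda: "Passed"),
    TestCase("Create Order", lambda: "Passed"),
    TestCase("Logout", lambda: "Passed"),
]
print(run_scenario(scenario))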


Adding Intelligence to Test Cases

 

After you add test cases to a test scenario, you may need to add intelligence to the test cases and their workflow. You can define a dependency for a test case so that whenever the parent test case fails, the tagged dependent test cases do not run and their status is marked as Failed, as sketched below.
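The Python sketch below illustrates this dependency behavior conceptually: when the parent fails, its tagged dependents are not executed and are marked Failed. The class, function names, and statuses are assumptions made for the illustration and are not the tool's implementation.

# Conceptual sketch only: if a parent test case fails, its tagged dependents are
# not executed and are marked as Failed. Names and statuses are illustrative.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class TestCase:
    name: str
    run: Callable[[], str]                      # assumed to return "Passed" or "Failed"
    dependents: List["TestCase"] = field(default_factory=list)

def execute(test_case, results):
    # Run the parent; skip dependents when it fails, marking them Failed.
    status = test_case.run()
    results[test_case.name] = status
    for child in test_case.dependents:
        if status == "Failed":
            results[child.name] = "Failed"      # dependent is not run
        else:
            execute(child, results)
    return results

# Example: "View Dashboard" is tagged as dependent on "Login".
login = TestCase("Login", lambda: "Failed")
login.dependents.append(TestCase("View Dashboard", lambda: "Passed"))
print(execute(login, {}))                       # {'Login': 'Failed', 'View Dashboard': 'Failed'}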

 

To add a dependency to a test case in a test scenario:

1. Once the test cases are added to the test scenario, two toggle buttons are displayed on each test case. The first button enables Data Driven mode for the test case, and the second button tags the dependency for that test case.


 

2. To add a dependency, toggle the dependency button ON. A dropdown is displayed.


 

3. Choose the test on which the current test case depends, as shown below. The dependent test case becomes a child of the parent test case and is indented beneath it.


 

To make a test case Data Driven:


1. Once the test cases are added to the test scenario, two toggle buttons are displayed on each test case. Toggle the Data Driven button ON; a spreadsheet icon is displayed.


 

2. Click the spreadsheet icon. A pop-up is displayed, as shown, with the fields meant for Data Driven input.


 

3. Enter the values in the fields, or drag and drop a spreadsheet containing the data, and the data is imported. Click Submit to save the data. The test case is now ready for data-driven iterations, as sketched below.
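As mentioned in step 3, the following Python sketch illustrates what a data-driven iteration means conceptually: the same test case logic runs once per imported data row. The field names (username, amount) and the CSV content are illustrative assumptions, not the tool's actual schema or import format.

# Conceptual sketch only: a data-driven test case repeats once per imported data row.
# The field names and CSV content below are made-up examples, not the tool's schema.
import csv
from io import StringIO

# Stand-in for the values typed into the pop-up or imported from a dragged-in spreadsheet.
imported_data = StringIO("username,amount\nalice,100\nbob,250\n")

def run_data_driven(test_step, rows):
    # One data-driven iteration per row of data.
    return [test_step(row) for row in rows]

def transfer_funds(row):
    # Stand-in for the real test case logic; each iteration consumes one row.
    return f"Transferred {row['amount']} for {row['username']}: Passed"

rows = list(csv.DictReader(imported_data))
for outcome in run_data_driven(transfer_funds, rows):
    print(outcome)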