
TestDrive-Gold

Introduction
TestDrive-Gold is designed to deal with standard Windows controls and any
application which has implemented the Microsoft Active Accessibility Standard,
including languages such as .NET, Visual Basic, Delphi, Visual Lansa, Coolplex,
JWalk™ or typical ‘screen-scrape’ products. The major elements - record and
playback, maintaining & editing scripts, storage, analysis and presentation of
results - have all been designed to offer maximum efficiency and functionality with
minimum complexity.
TestDrive-Gold also integrates with our market-leading testing suite, TestBench.
In this configuration, transactions are followed from arrival on the server, through
all database effects, program calls and other interactions to final results – true end-to-end
testing of complete client/server and e-commerce transactions.
In summary, TestDrive-Gold provides:-
• Recording of windows, input and navigations to form Scripts.
• Re-play of those Scripts with automatic verification of the actual results against
those expected or against pre-defined rules.
• Optionally when integrated with TestBench, TestDrive-Gold ensures that a
consistent environment is set at the start of each test execution.
• Optionally when integrated with TestBench, TestDrive-Gold triggers Test_IT to
ensure that the database and reports are verified along with the screens.

Platforms
The TestDrive-Gold repository can run on the following platforms:-
• IBM iSeries (includes integration with TestBench)
• Oracle (includes integration with TestBench)
• SQL Server
• Access database (for evaluation purposes only)
Browser Requirements
TestDrive-Gold can be used with Microsoft Internet Explorer version 5.5 or
later with a recent service pack. No other web browser products are
supported.
Getting Started
TestDrive-Gold is installed when TestBench-PC is installed. TestBench-PC
provides a GUI interface into some of the features that can also be accessed by
TestBench on the iSeries. Once a validation code has been registered,
TestDrive-Gold is launched by clicking on the TestDrive-Gold button at the
bottom of TestBench-PC. See the TestBench-PC User Guide for more
information.

The Main Panel


Edit User Preferences

Options
The Options control the settings that affect the recording and playing back of
scripts using TestDrive-Gold. Access this panel from the Tools menu, by clicking
the Options button on the toolbar or by right clicking the name of the currently
opened script.
Web and GUI Tab
All options with no symbol to their left are applicable to both Web and GUI
applications. Those options with a world symbol apply only to Web
applications, those with a document symbol are relevant to GUI applications
only.
Screen Analysis
TestDrive-Gold has many options which enable it to record and playback scripts and
analyse the content of screens for applications written in a wide range of languages. It
uses Microsoft Active Accessibility (MSAA) and Class Rules to define how to interact
with the elements on screen, while Messages help to control when screen pictures will
be taken.
If you are using TestDrive-Gold for the first time over a new application you are likely to
achieve a good result because by default it uses Microsoft Active Accessibility to analyse
screen contents. This means that TestDrive-Gold will enumerate the contents of every
screen using Microsoft Active Accessibility and every control which implements this
standard will be analysed without the need for any additional setup. If any specific
window components do not support this standard, Class Rules can be
created to retrieve the data by sending and receiving Microsoft Standard and Common
Control messages (MSCC). Any controls which do not support either of these techniques
will be recorded as a panel and their contents will not be retrieved. In summary:

1 Enumerate and analyse items using MSAA

2 Use Class Rules messages for anything which does not support MSAA

When using Class Rules, TestDrive-Gold will enumerate all of the window controls on
every screen and then use the Class Rule definitions to retrieve their content. This does
mean that controls without a window handle will not be included in the contents list.
Class Rules
When TestDrive-Gold is recording and playing back scripts and the MSAA standard
cannot be used, it looks at the names of the window components on each screen
and must convert these to an internal type so that it knows how to interact with the
component and obtain the information that is listed on the Content screen. It does
this by obeying the Class Rules listed below. For example, a class name of
“Thunder*Option Button” will be treated as a Radio Button.
Pattern Matching
The following characters are allowed for Class Rules that use pattern matching:
Character        Matches
?                Any single character
*                Zero or more characters
#                Any single digit (0-9)
[Charlist]       Any single character in Charlist
[!Charlist]      Any single character not in Charlist
If the class name that you want to match includes any of the special characters
above, you must enclose that character in square brackets. For example, dialogs
have the class name #32770, so the Class Rule for dialogs would be [#]32770.
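As an illustration of these matching rules, the following VB script sketch converts a Class Rule pattern into a regular expression and tests a class name against it. It is only a sketch of the wildcard semantics described above, not TestDrive-Gold's own matching code, and the class name used with the Thunder pattern is an invented example.

' Illustrative sketch only: evaluate a Class Rule pattern against a class name
' by translating the wildcard characters above into a regular expression.
Function MatchesClassRule(pattern, className)
    Dim re, rx, i, ch
    rx = "^"
    i = 1
    Do While i <= Len(pattern)
        ch = Mid(pattern, i, 1)
        Select Case ch
            Case "?"
                rx = rx & "."                  ' any single character
            Case "*"
                rx = rx & ".*"                 ' zero or more characters
            Case "#"
                rx = rx & "[0-9]"              ' any single digit
            Case "["
                rx = rx & "["                  ' copy a [Charlist] / [!Charlist] group
                i = i + 1
                If Mid(pattern, i, 1) = "!" Then
                    rx = rx & "^"
                    i = i + 1
                End If
                Do While Mid(pattern, i, 1) <> "]"
                    rx = rx & Mid(pattern, i, 1)
                    i = i + 1
                Loop
                rx = rx & "]"
            Case Else
                If InStr("\^$.|+()", ch) > 0 Then rx = rx & "\"   ' escape regex specials
                rx = rx & ch
        End Select
        i = i + 1
    Loop
    rx = rx & "$"
    Set re = New RegExp
    re.Pattern = rx
    re.IgnoreCase = True                       ' class name matching assumed case-insensitive
    MatchesClassRule = re.Test(className)
End Function

' The dialog rule from the example above matches the literal class name #32770:
WScript.Echo MatchesClassRule("[#]32770", "#32770")                                ' True
' "Thunder*Option Button" matches any class name of that shape (invented name):
WScript.Echo MatchesClassRule("Thunder*Option Button", "ThunderRT6Option Button")  ' True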

Priority of Class Rules


Each Class Rule has a sequence number. Every time TestDrive-Gold tries to
find the internal type of a child window it will iterate through the Class Rules
starting from the one with the lowest sequence. Low sequence means high
priority.
The order of the Class Rules is very important and the more specific rules must
be higher up in the list.
Example: Class Rule “Thunder*RadioButton” must go before “Thunder*Button”.
Importing and Exporting Class Rules
These functions enable new or missing Class Rules to be added to an existing list
en masse, avoiding the need to key each one individually.
When rules are exported they are added to a file with the extension .rul; the
location of this file can be selected by the user at the time of export. All Class
Rules will be added to the file, but if only some rules are required the file can be
opened using an editor such as Notepad and modified. When the Import option is
selected you are prompted to locate the file containing the Class Rules. If the
format of any of the rules being imported is invalid, an error message is
displayed and the rule must be corrected before trying the request again. Once
all errors have been fixed, a message box is displayed containing the number of
new rules that will be added and the number of existing rules that will be updated;
the Class Name is used to identify a matching rule. At this point click Cancel to
stop the import or Proceed to continue. The changes to the Class Rules will not,
however, be stored in the database until the Accept button is clicked on the Class
Rules screen.
Messages
TestDrive-Gold knows when to take a new screen picture by listening to the
application under test and monitoring for when it becomes “quiet”, i.e. there is no
message or CPU activity. Some applications send certain messages constantly and
therefore these need to be ignored. These messages are defined in the following
table which should only be modified with assistance from your support contact.

Validation Rules
By default when a script is re-played in TestDrive-Gold, every screen which
actually appears is compared against the expected screen stored in the script
and any differences are highlighted. Validation Rules provide an alternative to
this method in situations where either there is no single correct answer, or the
correct answer might differ depending on other variables. They also enable more
complex checks to be performed on the information that is displayed on screen.
Using the following display, a library of Validation Rules can be created at the
Project level. These rules can then be used by individual scripts; see the later
section for more information on creating and using Validation Rules.
Validation Functions
Validation Functions are used to perform more complex processing that cannot be
easily achieved by the Validation Rule wizard. They enable small programs to be
created, the results of which can be used by a Validation Rule to compare against a
screen field in order to validate it. For example, a Function could be created to
calculate today’s date or to retrieve the correct salesperson code for a specific order
number. There are two types of Validation Function. A Custom Function must be
entirely created by the user and keyed into TestDrive-Gold via a VB script.
Alternatively an SQL Function can be created using a wizard which guides the user
through checking values in a database file.
All of these Functions are created at the Project level and displayed on the screen
below. They can then be used by any scripts within that Project. See the later section
for more information on using Validation Functions within Validation Rules.
Custom Functions
Custom Functions enable the user to write a VB script to perform more complex
processing that is not possible with a simple Validation Rule. When a Custom
Function is created or edited the following screen is displayed.
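As an illustration of the kind of VB script that might be keyed into this screen, the sketch below implements the ‘calculate today’s date’ example mentioned earlier. The function name and the convention of returning the result as the function’s return value are assumptions made for this example; the exact conventions required by TestDrive-Gold should be taken from the Custom Function screen itself.

' Hypothetical Custom Function: return today's date formatted as DD/MM/YYYY so
' that a Validation Rule can compare it with a date field shown on screen.
' The function name and return convention are assumptions for illustration only.
Function TodaysDate()
    Dim d
    d = Date                                   ' current system date
    TodaysDate = Right("0" & Day(d), 2) & "/" & _
                 Right("0" & Month(d), 2) & "/" & _
                 Year(d)
End Function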
SQL Functions
SQL Functions use a wizard to assist the user in performing a check against a
database table. When an SQL Function is created or edited the following is the first
screen to be displayed.
Export to Custom Function
Converts an existing SQL Function into a Custom Function so that the
underlying VB script can be modified.
There are five possible values for the option which determines how the return value
should be treated.
Just return the value The contents of the selected record and field will be returned. In
the above example this would be the First Name of the salesperson being retrieved. If
there is more than one record which matches the selection, the contents of the first one
are returned.
Return the number of records A numeric value containing the total number of records
which match the selection criteria is returned.
Return the average of this field A numeric value containing the average of all of the
field values on records that match the selection criteria is returned.
Return the minimum value of this field A numeric value containing the lowest value of
all of the field values on records that match the selection criteria is returned.
Return the maximum value of this field A numeric value containing the highest value
of all of the field values on records that match the selection criteria is returned.
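In database terms, these five options broadly correspond to the following kinds of query. The VB script sketch below illustrates them using ADO; the data source name, the SALESPERSONS table and its columns are invented names for this example, and the wizard generates the actual query from your selections.

' Illustrative sketch only: the five return-value options expressed as ADO queries.
' The DSN, table (SALESPERSONS) and columns (FIRSTNAME, REGION, SALES) are invented.
Dim conn, rs
Set conn = CreateObject("ADODB.Connection")
conn.Open "DSN=TestData"                       ' assumed ODBC data source

' Just return the value - the selected field from the first matching record
Set rs = conn.Execute("SELECT FIRSTNAME FROM SALESPERSONS WHERE REGION = 'NORTH'")
WScript.Echo rs.Fields(0).Value

' Return the number of records
Set rs = conn.Execute("SELECT COUNT(*) FROM SALESPERSONS WHERE REGION = 'NORTH'")
WScript.Echo rs.Fields(0).Value

' Return the average / minimum / maximum of this field
Set rs = conn.Execute("SELECT AVG(SALES) FROM SALESPERSONS WHERE REGION = 'NORTH'")
WScript.Echo rs.Fields(0).Value
Set rs = conn.Execute("SELECT MIN(SALES) FROM SALESPERSONS WHERE REGION = 'NORTH'")
WScript.Echo rs.Fields(0).Value
Set rs = conn.Execute("SELECT MAX(SALES) FROM SALESPERSONS WHERE REGION = 'NORTH'")
WScript.Echo rs.Fields(0).Value

conn.Close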

The Function definition has now been completed but can be modified at any time using
the Back and Next buttons. Clicking Next on the above screen will display the final
screen shown below, from where it is possible to run a test on the SQL Function to
ensure that it is working as expected prior to plugging it into a Validation Rule.
All of the parameters which have been defined as ‘Runtime’ are listed and for each one a
test value must be keyed. At this point click the ‘Test’ button. The file will be interrogated
and the results returned. If there is a problem returning the desired results, a red cross
is displayed together with an error code and reason.
Click Finish to store the Function in the database at the Project level.
Recording

There are two ways to select the application to be tested. All applications which are
currently running on the PC are listed and one of these can be selected by clicking
the ‘Attach to running application’ radio button and then highlighting the correct
application in the list. Alternatively click the ‘Open new application’ option which will
cause the specified application to be launched prior to recording. You must specify
the location of the application to be launched or the initial URL for browser
applications. You can select from a list of all previously launched applications or
browse for new ones. Your previous selection is remembered to make repeated
testing over the same application easier.
During the recording process, depending on your product Options, TestDrive-Gold
will either remain in full screen mode or reduce to a compact view as shown below.
The expand/collapse icon in the top right corner of the form can be used to toggle
between these two views. As each new screen appears a picture is taken and every
input action is recorded. The ‘Take Picture’ button can be clicked on if significant
changes have been made on a screen and you would like to record this
intermediate stage for comparison during playback.
Notes

• Browser windows are displayed without the menu options and toolbars. However,
the Back option can be selected from the menu that appears after right clicking
within the browser window.

• During the recording process, five icons in the status panel of TestDrive-Gold
indicate the type of activity which is being monitored for. Typically it is when all
activity stops and the application goes quiet that a new picture is taken. Therefore, if
pictures are not being taken at the correct times, noting down which icons are active
and conveying this information back to your support contact will help to set the
Options correctly. In order from left to right, these icons represent:

- CPU Activity

- Messages


- Windows APIs

- Ajax

- Web Navigation
• If a control has a scroll bar, the action of clicking and holding the up or down arrows to
scroll the contents list will be re-played in real time as the time delay between the down
and up click is recorded. However, this may not cause exactly the same result on
playback due to differences in application response times. Therefore, whenever possible
it is preferable to use an alternative method to scroll the list, for example drag the rocker
button, use single clicks on the up and down arrows or click within the scroll bar to move
one page at a time.

• No additional load is placed on the server while a Script is being recorded.

• If Alt/Tab is used while recording to access another application, only the Alt is recorded
because after this point focus is no longer on the application being recorded. This will not
replay correctly and therefore this action should be avoided while in record mode.

• If a browser window is maximized or minimized while in record mode, this action is not
repeated on playback and therefore if a specific browser size is required it should be set
prior to initiating the record process.

• When an Adobe document is launched from a browser, there is a setting inside Adobe
which determines whether the document is hosted inside a browser, or simply hosted
inside Adobe Reader. Only PDF documents hosted inside a browser can be recorded
and analysed correctly. Also, only Version 7 or later of Adobe Reader is supported.

• Please note that when recording on Windows Vista, there is a specific animation
function which must be switched off to enable TestDrive-Gold to work correctly. If this
option is switched on, a message is displayed. The following steps can be followed to
disable the option.

i. Open the Control Panel.

ii. Select ‘Performance Information and Tools’.

iii. Click ‘Adjust Visual Effects’.

iv. Uncheck ‘Animate controls and elements inside windows’.

v. Click ‘OK’.

Drag and Drop
TestDrive-Gold supports dragging a file from another application that is not part
of the script into the application that is being recorded. It intercepts the drop
operation and records this as a specific input type. There is no link or
connection to the source of the drag operation, as this may not be available at
playback time; however, the input is recorded with the full path of the file being
dropped. At playback time the drag is simulated and the drop over the target
application occurs. The input can be edited in order to drop a different file.
There are two types of drag and drop, the Win32 implementation and the OLE
method. The first is fully supported but there is currently no support for the
latter.

Script Structure
While a Script is open it will be shown in the format illustrated below.
Picture
An image of the recorded screen is displayed. If an input action or element in
the Contents list is highlighted, the target element is surrounded by an orange
rectangle on the display. Conversely, clicking an item on the picture will also
highlight that element in the Contents list.
There are 3 buttons at the top left of the panel that determine how the picture is
displayed.

• 100% - Displays the picture at full size

• Page Width - Displays the picture at the width of the panel

• Best Fit - Resizes the picture so that the whole screen is visible within the
panel
Right clicking an item on the Expected Picture displays a floating menu with three options.
Edit Displays the Edit Element screen; see the later section for more information.
Checking Enables the element to either be checked or not checked for differences on
playback.
Create Validation Rule Automatically stores one of the core properties of a screen field within
a tracked field and then defines a Validation Rule which uses this or any other tracked fields or
functions. The following screen is displayed.
Property Use the drop down list to select the core property of the element that will be stored
within the specified tracked field and can then be utilized in the subsequent Validation Rule.
Tracked Field Either select an existing tracked field from the list or key in the name of a new
one that will be used to store the value of the element property.
Element Checking Define whether or not the element on the actual screen will still be
validated against the expected value on playback. It is quite likely that this kind of checking
will no longer be required if a Validation Rule is being created to check the contents of this
element. When the OK button is clicked, the Validation Rule definition screen is displayed.
Please see the later section for more information.
Two additional icons above the screen image control the mode that the panel operates in.

• Mouse icon – Default view whereby clicking on an element in the screen image will
display relevant information about that element in all other panels.

• Outline icon – Markup mode which enables sections of the screen to be annotated.
Markups
You may wish to annotate specific areas of any of the recorded screens, for
example to highlight information which is incorrect and needs to be changed, or for
training or documentation purposes. To do this, click on the outline icon above the
screen picture. Then use the mouse to drag a box around the area of the screen
that you wish to add a comment about. When the mouse is released, a comments
box is displayed as shown below.
Contents
The elements included on the highlighted screen are presented in list form. Click on
an element to highlight it on the screen picture with an orange rectangle and to
populate the Expected Element Properties panel. The Filter button on the main
toolbar can be used to modify the appearance and contents of the element list. See
the Main Panel section near the beginning of this document for more information.

Right clicking an item in the list displays a floating menu with two options. The ‘Edit’ option
displays the Edit Element screen; see the later section for more information. ‘Checking’
enables the element to either be checked or not checked for differences on playback.
If an element is not being checked it is displayed with a yellow cross icon in this list.
Expected Element Properties
The properties pane displays the properties for the currently selected object in
the Contents pane. An additional section beneath the main properties panel is
populated when a specific property is highlighted. This enables the full list of
values to be viewed for any properties that are too large to fit on the main
display. Such properties are displayed with three dots at the end to indicate that
there is more information that cannot be shown.
The properties that appear in bold are those that will be checked for differences
on playback. These are different for each element type and are considered the
core properties for that type. The only non-core properties that can be optionally
checked are the size and location; this is controlled via the TestDrive-Gold
Options.

Edit Element
Right click an element either on the screen picture or within the Expected
Contents list and select the ‘Edit’ option in order to display the following
screen.
Track Value Tab
Use this tab to store the contents of this element for use later in this script or in another
script in an Action Map. See the section on Tracked Fields for more information.
Store Start Value The initial contents of this field when the screen was first displayed
will be captured and stored in the specified tracked field.
Store End Value The final contents of this field after all keystrokes have been replayed
will be captured and stored in the specified tracked field.
Track Field The name of the tracked field which has been defined at the Project level
and which will be used to store the contents of this screen element.
Store Whole Field The entire contents of the screen element will be stored.
Store Part Field Only the specified subset of the screen element value will be stored.
Expected Input Properties
Highlight an input action on the Script pane to view further details about that input
here. The properties displayed vary slightly depending on the type of input selected.
Edit Input
The following display is accessed by right clicking an input action on the Script panel
and selecting the Edit option, then clicking on the Input Properties tab. Please note
that existing input can be changed or removed but new input cannot be added.
Variable Data Field Optionally select the name of a variable data field. The data within
this field in the Variable Data Set being used by the Script will be used for comparison
purposes on playback instead of the fixed input that was actually recorded. See the
later section on variable data for more information.
Tracked Field Optionally select the name of a tracked field. The data contained within
this field at playback time will be used for comparison purposes instead of the fixed
input that was actually recorded. See the later section on tracked fields for more
information.
Summary
The Script or Screen Summary panel has several sections, each of which is described
in more detail below. The title of each section for which information exists is highlighted;
headers for sections where no data exists are not.

Tracked Fields
All Tracked Fields in use in this script are listed on this screen. Beneath each field are
listed the screen elements where the tracked field is being used. See the later Tracked
Fields section for more information. Click on any field in the list to highlight the details
for that field on the other Content panels. Click on the ‘click here’ link to define new or
modify existing tracked fields for the Project.
Variable Data
All variable data fields in use in this script are listed on this screen. Beneath each
field are listed either the input values or the start values of the fields for which the
variable data will be used. Click on any field in the list to highlight the details for that
field on the other Content panels. Click on the ‘click here’ link to manage the Local
and Public variable data sets.

Validation Rules
When the script name is in focus, this panel lists all of the Validation Rules
defined for the entire script. There are two levels of Rules:

• Tracked Field. These are effectively defined for the whole script, but will
only be applied to screens where at least one of the tracked fields utilised
by the rule is populated on that screen. If none of the tracked fields are
populated then the Rule is bypassed for the screen. Therefore the Rules at
the Tracked Field level are only applied when a relevant tracked field
changes.

• Screen Rules. Rules defined at the screen level for individual screens in
the script are always applied, regardless of whether or not any tracked
fields are updated on the screen.
Blocks
There are occasions when the key goal is to match a ‘block’ of similar data elements
within a web page, not necessarily the individual items themselves. For example, when
viewing a long list of products where sometimes a new product is added, it is necessary
to ensure that on playback all of the details of the same product on the expected and
actual pages are matched, even if when the script was recorded the product appeared at
the top of the list but on playback it was half way down. To facilitate this matching
process, ‘blocks’ of data can be identified on the screen and these are used on playback
when matching actual and expected elements. Click on the ‘click here’ link on the Blocks
section of the summary panel to launch the Block Wizard.
In the following example a radio button which appears in each of the ‘blocks’ within the
same web page has been chosen as the anchor; this is the item or items that will be
found in every block and can be used to identify the repeating pattern. To define the
anchors, click on one of the repeating items, then hold down the shift key and click on a
second item. These two elements will appear in bright red, and all other similar items
which have been identified as anchors will be given a pale red border.
When the anchors have been defined, click Continue to view the actual blocks that
have been identified as shown below. What this actually means is that during
playback, a match of the entire block will be searched for. This means there is no
danger that the radio button from one block will be incorrectly matched with that
from another block, which could result in a mouse click selecting entirely the wrong
company.
If the number of blocks that have been identified does not match the number of
anchors that were found on the previous screen, a red icon instead of a green one
will appear next to the text above the screen picture and the Continue button will be
disabled. If this occurs, click the Back button to modify your anchor selection. Only
when matching numbers of anchors and blocks are found will the wizard allow you
to move to the next stage.
When the blocks have been correctly identified, click on Continue to select
one or more Identity Elements. These are items that are also found in
each block but that help to identify a block and separate it from the others;
therefore, wherever possible, they should be unique. Sometimes one single
item is not enough to uniquely identify a block; in this situation a combination
of more than one element can be used, for example Product Code and
Package Quantity. Click on an item to select it as an identity element and the
identity elements found in the other blocks will all be highlighted. To select
more than one identity element, hold down the shift key.
If identity elements were not found in every block, a red icon instead of a
green one will appear next to the text above the screen picture and the
Continue button will be disabled. If this occurs, click the Back button to
modify your selection. Only when matching numbers of blocks and identity
elements are found will the wizard allow you to move to the next stage.
Click Continue to display the final screen in the Block Wizard. This enables error
checking to be switched off for all of the elements within the blocks although the rest
of the screen will still be checked for differences on playback. This is a sensible option
to take if the list of items for which blocks have been defined is likely to change, or
if variable data will be used to enable input to be played over alternative blocks.
Once a Block Set has been defined, it is listed in the Summary panel as shown below.
The Block Set header is followed by any input for elements within the block, which is
in turn followed by the identity elements for the input.
Right click options are available depending on which item in the block definition was
highlighted at the time. To view the blocks that have been defined in the wizard,
click on the Block Set header in the Blocks panel.
Block Set
Edit Display the first screen of the Block Wizard as described above to modify the
block selection.
Delete Remove the Block Set.
Block Errors Toggle the flag to either include or ignore differences in block
elements during playback.
This option is also present on the final wizard screen as described above.
Input
Edit Open the Edit Input form to modify details about the input.
Delete Remove the input.
Identity Element
Edit Open the Edit Element form to modify details about the element.
Checking Specify whether or not this element should be checked for differences
on playback.
Create Validation Rule Automatically store one of the core properties of a screen
field within a tracked field and then define a Validation Rule which uses this or any
other tracked fields or functions. See the previous ‘Picture’ section for more
information.
Script Options
Script Options can be accessed in several ways:
1. Shown when the ‘Playback’ button is pressed for a selected Script.
2. Double clicking the Script Header in a selected script or right clicking and
selecting the ‘Edit’ option.
3. Clicking on the Options button when saving the Script.

Tracked Fields This button will allow any Tracked Fields that are associated with
this Project to be edited. See the separate section on Tracked Fields.
Start Mode Determines whether the specified application under test will be
launched by TestDrive-Gold at playback time or whether it will be already running
and TestDrive-Gold can simply attach to it. If the ‘Launch’ option is selected and
the application requires any parameters these can be specified as part of the
command.
Application For browser scripts this will be the initial URL for this script, for all
other scripts it is the name and location of the application under test. If the full
application path is not known, the browse button to the right of the field allows the
application to be located.
Num. Runs The number of times this script should be executed during the current
run.
Use Variable Data If the script has been set up to use variable data this field will
be automatically checked. Un-checking this field will cause the script to be
replayed with its original fixed values.
Use Tracked Fields If the script has been set up to use tracked fields, this option
will be automatically checked. Un-checking this option will cause the script to be
replayed with its original fixed values.
Activate Test_IT Choose whether TestBench facilities are to be enabled when this
Script is executed. Using this option will instruct TestBench to start the Test Case
before replaying the script.
If using JWalk, Test_IT gives the capability to ensure a consistent initial environment,
together with database, data area and program parameter verification. In fact all of
the TestBench Test Case functionality can be used.
If using another application which makes database changes on the iSeries or Oracle,
you will be prompted for the User ID which will actually be making the changes, or for
Oracle the Computer Name can also be used. If the

Exceptions
These Script level exclusions are an important addition or alternative to that
achieved by excluding individual fields. For example, the time of day may appear
on every screen in a Script. One option would be to tune each screen to ignore
any differences in the time, but Script level exclusions allow this to be achieved via
a single entry.
To add an exception right click anywhere in the data entry grid on the exceptions
tab and then click the add option. To edit or delete an existing exception right click
on the exception and then click ‘Edit’ or ‘Delete’ from the floating menu.
Options
The values listed here are those values that were set up using the Options toolbar
at the time the script was recorded. They are used to determine how the script
should be recorded and played back. They are split into six sections, the first three
of which are used for recording purposes and therefore cannot be changed here.
These are listed for information only. The latter three sections can be changed for
this script and these changes will be applied to subsequent executions of the
script. See the earlier Options section for a full explanation of all of the values.

Playback
Any changes to the Test Items are stored back on the server if there is an active
connection when the Keep button is clicked at the end of the run to save the
results. Once the playback of the script has been completed a similar window is
displayed which enables statuses and comments to be entered (see later Scoring
Results section).
Results
Once the execution of a Script is complete or the execution has been interrupted by the
user, a panel in the following format is displayed.
Each screen which was presented during the execution of the Script along with its
associated input is listed together with a signal light indicating:
Green For screens, all items that were actually presented matched your expectations as
defined in the Script or expanded through variable data. For input, the action was
successfully played back.
Red At least one item did not match your expectations or the input was not played back.
Orange The screen was not verified as you had instructed it to be omitted (not checked).
The panes on the display can be moved to different positions and also overlaid. To move
a pane, click on its title and drag it to the new location. To overlay a pane, click on its title
and position it over another pane; a tab for each one will appear at the bottom of the
pane. The panes can be hidden by clicking on the Pin icon; when this is done a tab for
the pane will appear on the left hand side of the screen. To restore the pane to the main
view, click on the tab and then on the Pin icon.

TestDrive-Gold can store two possible formats for this window, which can be selected by
right-clicking on any of the orange window labels.
• Standard – The Picture, Properties and Errors panes are a permanent part of the
display.
• Preferred – This is only available if the ‘Set Preferred’ option has previously been
selected. It enables an alternative to the above format to be created and stored; this will
be specific to the PC being used. Simply adapt the display to suit your requirements and
then select ‘Set Preferred’ to create or overwrite the preferred layout.
• Set Preferred – This selects the current view as a preferred layout which can then be
loaded at any time via the ‘Preferred’ layout option explained above.
Markups
You may wish to annotate specific areas of any of the screens in the results, for example
to highlight information which is incorrect. To do this, click on the outline icon above the
screen picture. Then use the mouse to drag a box around the area of the screen that
you wish to add a comment about. When the mouse is released, a comments box is
displayed as shown below.

Description Add some text to describe the issue or instruction.
Flag As Error If this option is checked, the screen will be marked with a red cross and
a warning created in results.
Pass/Fail Results
If the option has been selected in the TestDrive-Gold Options window, after clicking on
the Keep button to save screen results, a window will be displayed so that statuses can
be recorded for each Test Item selected at the start of the run. The list of selected Test
Items can also be changed here.

Amending Scripts
In the event that changes occur to the system under test for which Scripts already exist, it
is possible to easily change these Scripts.
Changes that can occur are:
• Extra screens can be recorded and inserted into existing test Scripts (see below).
• Incorrect or surplus screens can be deleted from existing test Scripts (see below).
• Both the screen content and input events can be changed to cater for changes (see
earlier sections relating to Script Structure).
• Any variable data that has been set up for a Script can be altered (see later section).
• Validation Rules can be modified.
• The Scripts that will be run as part of an Action Map can be changed.
• Scripts which have been ‘healed’ as part of the Verification method will have been
changed (see later chapter for more details).
Deleting a Screen
To delete a screen from an existing Script:
1 Open the Script that requires changing and click on the relevant Screen name.
2 Right click and select the ‘Delete’ option. You will be asked to confirm that deletion is
required.
3 On confirmation the screen will be deleted.
4 Save the Script.
Care must be taken in the event that Start & End Loops have been specified within the
Script, especially if the screen being deleted is marked as either one (see Variable Data
section).

Inserting a Screen
To insert a screen into an existing Script:

1 Open the Script that requires changing and optionally click on the Screen name
before the point at which you want to insert a new screen.

2 Ensure that the system application under test is on the correct screen for recording.

3 Press the Record button. The following screen will be displayed.


4 Click the Replace option if you wish to delete the existing script and create a new one.
Click the Insert option if you wish to add some new screens to the current script and
specify the desired location of the new screens. Click Next to continue or Cancel to
return to the original script.

5 The Record Wizard is displayed from where the application containing the new
screens to be recorded can be chosen. At this point TestDrive-Gold will go directly into
‘Record’ mode with the focus placed on the selected application. Commence the
required testing and when finished press the ‘Stop’ button.

6 Play back the script to verify that the screens have been captured correctly and then
save the changes.
Script Verification
This function can be used when more significant changes have been made to the
application rendering existing scripts out of date. When the script is played back in verify
mode, the old screens will be replaced with the new ones and TestDrive-Gold will
attempt to transfer all field information from the old to the new elements. In order to do
this it attempts to match elements on the old screens to those on the new ones; once a
match has been found, it attaches existing input, variable data, tracked fields and
checking flags to the new element.
Verification Options
A verification run is begun by right clicking the Script name and selecting the ‘Verify’
option. The Verification Options screen is then displayed which enables the options that
control the verification run to be modified.
Notification
These options control the conditions under which playback will halt to request manual
intervention when elements cannot be matched. If matching does not successfully occur
any input, variable fields, tracked fields or checking flags defined for an existing element
will not be retained by the corresponding element on the new screen.
Any field If any fields cannot be matched the verification screen will be displayed so
that these fields can be matched manually by the User.
Core fields only The verification screen will be displayed for manual matching only
when existing core fields cannot be matched. The second Notification Option as
described below determines which elements should be treated as core fields.
Core Fields
Use these check boxes to determine which elements should be regarded as core fields,
which is important when determining the level of user intervention required in the
matching process as described above. The settings on the screen below specify that a
core field is an element which has either input, variable data or tracked fields associated
with it. If the ‘Core fields’ notification option is selected, the user will be given the option
to manually match any such fields where automatic matching fails.
Click on the Next button to select the application under test from the Verification Wizard.
When the verification run is complete, choose whether or not to overwrite the existing
script. The changes will not, however, be saved until the Save option is taken.
Notification
When elements cannot be matched, the following variation of the main TestDrive-Gold
panel is displayed to enable manual intervention in the matching process. It is initially
displayed in compact mode but can be expanded to full screen mode by clicking on
the expand icon at the top right of the display. The notification level (which is explained
in more detail above) determines whether this screen is displayed when any
unmatched fields are encountered or for core fields only.
Core Fields
The following display is split into two parts, Missing Elements and Matched Elements.
For all elements listed, the four icons in the header bar determine whether the object has
any associated input, variable data, tracked fields or if the element is not being checked
for differences. These four properties are used to determine whether or not an element
is a core field, depending on the options that were selected at the start of the verification
run.

Any core fields that were present on the expected screen for which a match cannot be
found on the actual screen are listed here. It is possible to proceed with the replacement
of the expected screen without matching these elements; however any core information
will be lost. If an element is present on the actual screen to which this element should
be matched, then the two elements can be matched manually and in doing so all core
information will be transferred to the new field.
Manual Matching
To match a field manually, left click to highlight the element on the Core Fields display
and then drag the element to either the Actual Picture or Actual Contents list until the
target element is also highlighted. Then release the mouse button to complete the
manual matching process. If you matched a missing element, the element will be
removed from this list and added to the list of matched elements. If you selected an
element from the list of matched elements because the matching was incorrect and have
now matched it to a different element on the actual page, it is possible that you will see
an element added to the list of missing elements. This will be the field that was
previously matched to the actual field you have just selected.
Verifying Screen
This panel is used to determine the next action to be taken in the verification run. It can
be expanded or minimized using the double arrow icon at the top right of the panel.
Matching Options
The options on this panel control the level of accuracy of the matching process.
Verification Checkpoints
Right click any screen in the Script panel and select the ‘Verification – Always Stop Here’
option from the floating menu to create a verification checkpoint. This ensures that when
the script is verified it will always stop at the selected screen. This is important, for
example, when a new mandatory field has been added to the application. The
verification window is only displayed when existing core fields cannot be matched but
verification does not stop when new fields are found. Therefore there is no opportunity to
add input for this new field. Using a verification checkpoint, the run can be automatically
stopped at this screen and input manually added to the new field.
Variable Data
Variable Data allows you to record shorter, more flexible Scripts and is thus a key
feature of TestDrive-Gold.
For example, let’s imagine that you want to stress test the ‘Add a Customer’ function.
One option would be to record a Script that adds a single customer and then set the
Script Option, ‘Number of Runs’ to 100. However, there is a problem with this approach.
The Script would successfully add the first customer, but all the following 99 attempts
might be rejected with a ‘Customer already on file’ error message as the Script uses a
fixed value for the Customer Number.
Without Variable Data the only alternative is to record a Script that is one hundred times
longer and contains the details of one hundred different customers. This would take
much longer to record and be almost impossible to maintain if the screen layouts for the
‘Add a Customer’ function were changed at any time in the future.
Variable Data allows you to feed selected Data and Constant fields with external
information, rather than using the fixed value that was recorded. You can therefore
record a Script that adds a single customer but then select the Customer Number and
perhaps the Customer Name to be fed not from the recorded constant values, but from
variable data fields.
Once this has been done simply define the Customer Numbers and Names that are to
be keyed in and they will be automatically merged with the Script when it is run.
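As a worked illustration (the values below are invented for the example), a Variable Data Set for this ‘Add a Customer’ Script might contain one row per transaction, with one column for each variable field:

Customer Number    Customer Name
100001             Brown Engineering
100002             Smith Supplies
100003             Jones Retail

Each row is merged into one iteration of the Script when it is run.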
There are five basic steps involved in creating a Variable Data Script:-
1 Record a simple Script to process a single transaction.
2 Access the Script Properties panel by right clicking the Script name and selecting the
‘Edit’ option. Either create a Public Variable Data Set or add data to the Local set. For
controls such as Radio Buttons and Select Boxes the transaction values must be equal
to one of the possible values for this field. One transaction is equal to one iteration
of the Script. Variable data can also be defined for the full contents list of controls such
as list boxes. In these cases the transaction data consists of all possible values for the
control separated by the | symbol.
If you have data in another windows application, such as a spreadsheet, you can use
copy and paste to populate the variable data values for a field. Use the Insert Clipboard
Rows option to paste into the Edit Variable Data window.
3 Check the Use Variable Data box on the Playback Wizard to ensure that the keyed
variable data will be used during playback.
4 Link the fields that you have just created on the previous Edit Variable Data panel to
the actual controls on the screens so that TestDrive-Gold knows where to key in the
variable data. Access the Input for the screen and specify that the keystrokes should
come from a variable field. If the data being displayed on any screen can vary then
access the contents and specify that the current value and/or content will come from a
variable field. If information keyed into any field is displayed on another screen, the
current value on that screen can also be made variable to prevent differences being
reported during playback.
5 Specify which screens will be the loop points for this Script by right clicking the screen.
The loop points are indicated on the Script using red up or down arrows.
Access Variable Data
There are two types of Variable Data Sets that can be used in TestDrive-Gold.
Local Variable Data is stored with the individual Script and can only be used by the one
Script.
Public Variable Data Sets are stored at the Project level and can be utilised by any
Script within the Project.
Variable Block Selection
In addition to using blocks to improve the matching process during script playback, their
selection can also be made variable so that the input that was originally recorded against
one block can actually be associated with a different block during playback. To achieve
this, it is the Identity Element for the block that must be made variable, not the input
itself. The input remains the same but it must be keyed against an alternative block.
Tracked Fields
Tracked Fields enable system generated values to be stored and re-used later in the
Script. This has several benefits. Differences during playback as a result of these
unpredictable values can be avoided. Also, these values can then be used as input on
subsequent screens, whereas without Tracked Fields testing would need to be paused
while the correct value was added to the screen in question. They can also be used
within Validation Rules, please see earlier section for more information.
For example, when a new customer is created the system generates a customer number
which is one greater than the last customer used. If data protection is not being used
then the customer number will be different every time the Script is re-played. As the
customer number appears on all of the customer details screens this would cause
differences on playback. One option would be to exclude the customer number from the
comparison, but this means that if the customer number were displayed incorrectly this
would be missed by TestDrive-Gold. Therefore the customer number can be placed into
a Tracked Field and this field can be used for comparison purposes for all occurrences
of the customer number on subsequent screens.
TestBench Integration
TestDrive-Gold is integrated with TestBench on the iSeries or Oracle, providing several
benefits:-
• Central, secure storage for Scripts and results.
• Option to protect the initial test database on the server so that testing can be repeated
without the need to re-create the test environment.
• Full reporting capabilities for all stored test results.
Individual Scripts can optionally be integrated further with TestBench, providing the
following additional benefits:-
• Tracking of transactions after their arrival on the server, providing database effects and
the ability to define expected results using Data Rules.
• Automatic creation of a consistent testing environment on the server.

This full Script integration is requested by checking the ‘Activate Test_IT’ box on the
Script Properties panel. The User ID or Computer Name which actually performs the
updates should then be specified. Further details on reviewing results from tests can
be found in the TestBench-PC section of the user guide.
Troubleshooting
If the expected results are not being achieved while either recording or playing back
using TestDrive-Gold, this can often be resolved by modifying the Options that are being
used. TestDrive-Gold is shipped with some standard Options sets for known application
types. The following list describes how to resolve some of the most common scripting
issues.
Recording Issues
A picture is not being taken when clicking on a tab control within a web page.
Most web tab controls use some form of DHTML to simulate the known Windows tab
controls. This can be monitored for using the option ‘Take picture on content changes’.
Too many mouse moves are being recorded.
Try turning the option ‘Generate Mouse Move inputs for content changes’ off. This option
should only be used as a last resort if the option ‘Generate Mouse Move inputs for
elements with events’ does not record the required input.
Extra pictures are being taken when changing the focus between the target
application and TestDrive-Gold or another application.
Turn the option ‘Take picture on Active window changes’ off.
Playback Issues
A picture is being taken too early (or additional pictures are being taken) so the
playback of input fails.
Try setting the ‘Wait for replacement screen similarity to be’ option to 50% (experiment
with this value). This will make TestDrive-Gold wait for a replacement screen that is
more similar to the expected screen than the current actual screen. If you find you need
this option to enable reliable playback, you might want to configure your ‘During
playback, set activity timeout to’ setting to something more appropriate.
If there are no replacement screens, it might be because Internet Explorer is doing some
post-document-complete processing. Try setting the ‘Delay analysis for’ option to
something like 50ms so we wait for this post-processing to occur.
Elements within a web page are not being matched correctly. This results in input
not playing back.
Some web sites are designed around a single page where the multi-page feel is
simulated by URL parameters. By default we ignore these parameters in our matching.
This results in a link to index.htm?home being treated the same as index.htm?contactus.
To resolve this problem, disable the option ‘For web pages, exclude parameters in
comparisons’.
Playback of some input fails in Lotus Notes, Microsoft Access or another similar
application.
These applications do not have a 100% implementation of Microsoft Active Accessibility.
Try turning on the option ‘Playback dynamic input using recorded offset’. It is also worth
noting that any input that requires this option might not play back correctly if the target
element has changed location.
Saving Scripts Locally
If a script has been created which needs to be sent to your support contact for help in
diagnosing an issue or for any other purpose, this script can be saved to your local PC
from where it can then be attached to an email. When this script is open in
TestDrive-Gold, hold down the Ctrl key and Right Click the script name to gain access to an
additional menu item called ‘Save Script As File’. If this option is selected, the location
and name of the new script must be chosen. Click ‘Save’ and then click ‘OK’ on the
subsequent ‘Save script to file’ window (the options on this screen are only required
when creating JWalk Integration scripts).
To load a script into TestDrive-Gold which has been saved locally, first of all ensure that
no script is currently open. Then hold down the Ctrl key and Right Click to obtain a list of
local scripts in the default location.
