Introduction
TestDrive-Gold is designed to deal with standard Windows controls and any
application which has implemented the Microsoft Active Accessibility Standard,
including languages such as .NET, Visual Basic, Delphi, Visual Lansa, Coolplex,
JWalk™ or typical ‘screen-scrape’ products. The major elements - record and
playback, maintaining & editing scripts, storage, analysis and presentation of
results - have all been designed to offer maximum efficiency and functionality with
minimum complexity.
TestDrive-Gold also integrates with our market-leading testing suite, TestBench.
In this configuration, transactions are followed from arrival on the server, through
all database effects, program calls and other interactions to final results – true end
to end testing of complete client server and e-commerce transactions.
In summary, TestDrive-Gold provides:-
• Recording of windows, inputs and navigation to form Scripts.
• Re-play of those Scripts with automatic verification of the actual results against
those expected or against pre-defined rules.
• Optionally, when integrated with TestBench, TestDrive-Gold ensures that a
consistent environment is set at the start of each test execution.
• Optionally, when integrated with TestBench, TestDrive-Gold triggers Test_IT to
ensure that the database and reports are verified along with the screens.
Platforms
The TestDrive-Gold repository can run on the following platforms:-
• IBM iSeries (includes integration with TestBench)
• Oracle (includes integration with TestBench)
• SQL server
• Access database (for evaluation purposes only)
Browser Requirements
TestDrive-Gold can be used with Microsoft Internet Explorer version 5.5 or
later with a recent service pack. No other web browser products are
supported.
Getting Started
TestDrive-Gold is installed when TestBench-PC is installed. TestBench-PC
provides a GUI interface into some of the features that can also be accessed by
TestBench on the iSeries. Once a validation code has been registered,
TestDrive-Gold is launched by clicking on the TestDrive-Gold button at the
bottom of TestBench-PC. See the TestBench-PC User Guide for more
information.
Options
The Options control the settings that affect the recording and playing back of
scripts using TestDrive-Gold. Access this panel from the Tools menu, by clicking
the Options button on the toolbar or by right clicking the name of the currently
opened script.
Web and GUI Tab
All options with no symbol to their left are applicable to both Web and GUI
applications. Those options with a world symbol apply only to Web
applications, those with a document symbol are relevant to GUI applications
only.
Screen Analysis
TestDrive-Gold has many options which enable it to record and playback scripts and
analyse the content of screens for applications written in a wide range of languages. It
uses Microsoft Active Accessibility (MSAA) and Class Rules to define how to interact
with the elements on screen, and Messages to control when screen pictures will
be taken.
If you are using TestDrive-Gold for the first time over a new application you are likely to
achieve a good result because by default it uses Microsoft Active Accessibility to analyse
screen contents. This means that TestDrive-Gold will enumerate the contents of every
screen using Microsoft Active Accessibility and every control which implements this
standard will be analysed without the need for any additional setup. If there are then any
specific window components which do not support this standard, Class Rules can be
created to retrieve the data by sending and receiving Microsoft Standard and Common
Control messages (MSCC). Any controls which do not support either of these techniques
will be recorded as a panel and their contents will not be retrieved. In summary:
1 Use MSAA for any control which supports it (the default behaviour)
2 Use Class Rules messages for anything which does not support MSAA
When using Class Rules, TestDrive-Gold will enumerate all of the window controls on
every screen and then use the Class Rule definitions to retrieve their content. This does
mean that controls without a window handle will not be included in the contents list.
Class Rules
When TestDrive-Gold is recording and playing back scripts and the MSAA standard
cannot be used, it looks at the names of the window components on each screen
and must convert these to an internal type so that it knows how to interact with the
component and obtain the information that is listed on the Content screen. It does
this by obeying the Class Rules listed below. For example, a class name of
“Thunder*Option Button” will be treated as a Radio Button.
Pattern Matching
The following characters are allowed for Class Rules that use pattern matching:
Character Matches
? Any single character
* Zero or more characters
# Any single digit (0-9)
[Charlist] Any single character in Charlist
[!Charlist] Any single character not in Charlist
If the class name that you want to match includes any of the special characters
above, you must enclose that character in brackets. For example, dialogs have the
class name #32770, so the class rule for dialogs would be [#]32770.
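The wildcard rules in the table above can be sketched as a translation to regular expressions. This translation is illustrative only (TestDrive-Gold's internal matcher is not exposed), but it shows how each pattern character behaves, including the bracket escape used for [#]32770:

```python
import re

# Translate a Class Rule pattern (using the wildcard characters listed in the
# table above) into a regular expression. Illustrative sketch only.
def rule_to_regex(pattern):
    out = []
    i = 0
    while i < len(pattern):
        ch = pattern[i]
        if ch == '?':
            out.append('.')          # ? -> any single character
        elif ch == '*':
            out.append('.*')         # * -> zero or more characters
        elif ch == '#':
            out.append('[0-9]')      # # -> any single digit
        elif ch == '[':
            j = pattern.index(']', i)
            body = pattern[i + 1:j]
            if body.startswith('!'):
                out.append('[^' + body[1:] + ']')  # [!list] -> not in list
            else:
                out.append('[' + body + ']')       # [list]  -> in list
            i = j
        else:
            out.append(re.escape(ch))              # literal character
        i += 1
    return '^' + ''.join(out) + '$'

def matches(pattern, class_name):
    return re.match(rule_to_regex(pattern), class_name) is not None
```

For example, `matches("[#]32770", "#32770")` is true because the brackets make the `#` a literal character rather than a digit wildcard.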
Validation Rules
By default when a script is re-played in TestDrive-Gold, every screen which
actually appears is compared against the expected screen stored in the script
and any differences are highlighted. Validation Rules provide an alternative to
this method in situations where either there is no single correct answer, or the
correct answer might be different dependent on other variables. They also
enable more complex checks to be performed on the information that is
displayed on screen.
Using the following display, a library of Validation Rules can be created at the
Project level. These rules can then be used by individual scripts; see the later
section for more information on creating and using Validation Rules.
Validation Functions
Validation Functions are used to perform more complex processing that cannot be
easily achieved by the Validation Rule wizard. They enable small programs to be
created, the results of which can be used by a Validation Rule to compare against a
screen field in order to validate it. For example, a Function could be created to
calculate today’s date or to retrieve the correct salesperson code for a specific order
number. There are two types of Validation Function. A Custom Function must be
entirely created by the user and keyed into TestDrive-Gold via a VB script.
Alternatively an SQL Function can be created using a wizard which guides the user
through checking values in a database file.
All of these Functions are created at the Project level and displayed on the screen
below. They can then be used by any scripts within that Project. See the later section
for more information on using Validation Functions within Validation Rules.
Custom Functions
Custom Functions enable the user to write a VB script to perform more complex
processing that is not possible with a simple Validation Rule. When a Custom
Function is created or edited the following screen is displayed.
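Inside TestDrive-Gold a Custom Function is keyed in as a VB script; the Python sketch below is only an illustration of the kind of logic such a function performs, using the "today's date" example mentioned earlier. The DD/MM/YYYY format is an assumption; the function should return whatever format the application under test displays:

```python
from datetime import date

# Illustration only: a Custom Function computes a value at playback time that
# a Validation Rule can compare against a screen field.
def todays_date(fmt="%d/%m/%Y"):
    # The DD/MM/YYYY default here is an assumption; match the format shown
    # on screen by the application under test.
    return date.today().strftime(fmt)
```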
SQL Functions
SQL Functions enlist a wizard to assist the user in performing a check against a
database table. When an SQL Function is created or edited the following is the first
screen to be displayed.
Export to Custom Function
Converts an existing SQL Function into a Custom Function so that the
underlying VB script can be modified.
There are five possible values for the option which determines how the return value
should be treated.
Just return the value The contents of the selected record and field will be returned. In
the above example this would be the First Name of the salesperson being retrieved. If
more than one record matches the selection, the contents of the first one are returned.
Return the number of records A numeric value containing the total number of records
which match the selection criteria is returned.
Return the average of this field A numeric value containing the average of all of the
field values on records that match the selection criteria is returned.
Return the minimum value of this field A numeric value containing the lowest value of
all of the field values on records that match the selection criteria is returned.
Return the maximum value of this field A numeric value containing the highest value
of all of the field values on records that match the selection criteria is returned.
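The five return-value treatments correspond to returning the first matching value or one of four SQL aggregates. The sketch below illustrates this with an in-memory SQLite table; the table and field names are hypothetical, and the real SQL Function wizard builds the query for you:

```python
import sqlite3

# Hypothetical illustration of the five return-value treatments above.
def sql_function(conn, mode, field, where):
    if mode == "value":  # Just return the value (first matching record)
        row = conn.execute(f"SELECT {field} FROM sales WHERE {where}").fetchone()
        return row[0] if row else None
    agg = {"count": "COUNT(*)",       # Return the number of records
           "avg": f"AVG({field})",    # Return the average of this field
           "min": f"MIN({field})",    # Return the minimum value of this field
           "max": f"MAX({field})"}[mode]  # Return the maximum value
    return conn.execute(f"SELECT {agg} FROM sales WHERE {where}").fetchone()[0]

# Sample data: table and column names are assumptions for this sketch.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (first_name TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("Alice", 120.0), ("Bob", 80.0), ("Alice", 40.0)])
```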
The Function definition has now been completed but can be modified at any time using
the Back and Next buttons. Clicking Next on the above screen will display the final
screen shown below, from where it is possible to run a test on the SQL Function to
ensure that it is working as expected prior to plugging it into a Validation Rule.
All of the parameters which have been defined as ‘Runtime’ are listed and a test value
must be keyed for each one. At this point click the ‘Test’ button. The file will be
interrogated and the results returned. If there is a problem returning the desired results,
a red cross and an error code and reason are displayed.
Click Finish to store the Function in the database at the Project level.
Recording
There are two ways to select the application to be tested. All applications which are
currently running on the PC are listed and one of these can be selected by clicking
the ‘Attach to running application’ radio button and then highlighting the correct
application in the list. Alternatively click the ‘Open new application’ option which will
cause the specified application to be launched prior to recording. You must specify
the location of the application to be launched or the initial URL for browser
applications. You can select from a list of all previously launched applications or
browse for new ones. Your previous selection is remembered to make repeated
testing over the same application easier.
During the recording process, depending on your product Options, TestDrive-Gold
will either remain in full screen mode or reduce to a compact view as shown below.
The expand/collapse icon in the top right corner of the form can be used to toggle
between these two views. As each new screen appears a picture is taken and every
input action is recorded. The ‘Take Picture’ button can be clicked on if significant
changes have been made on a screen and you would like to record this
intermediate stage for comparison during playback.
Notes
• Browser windows are displayed without the menu options and toolbars. However,
the Back option can be selected from the menu that appears after right clicking
within the browser window.
• During the recording process, five icons in the status panel of TestDrive-Gold
indicate the type of activity which is being monitored for. Typically it is when all
activity stops and the application goes quiet that a new picture is taken. Therefore, if
pictures are not being taken at the correct times, noting down which icons are active
and conveying this information back to your support contact will help to set the
Options correctly. In order from left to right, these icons represent:
- CPU Activity
- Messages
- Windows APIs
- Ajax
- Web Navigation
• If a control has a scroll bar, the action of clicking and holding the up or down arrows to
scroll the contents list will be re-played in real time as the time delay between the down
and up click is recorded. However this may not cause exactly the same result on
playback due to differences in application response times. Therefore whenever possible
it is preferable to use an alternative method to scroll the list, for example drag the rocker
button, use single clicks on the up and down arrows or click within the scroll bar to move
one page at a time.
• No additional load is placed on the server while a Script is being recorded.
• If Alt/Tab is used while recording to access another application, only the Alt is recorded
because after this point focus is no longer on the application being recorded. This will not
replay correctly and therefore this action should be avoided while in record mode.
• If a browser window is maximized or minimized while in record mode, this action is not
repeated on playback and therefore if a specific browser size is required it should be set
prior to initiating the record process.
• When an Adobe PDF document is launched from a browser, there is a setting inside
Adobe Reader which determines whether the document is hosted inside the browser or
simply opened in Adobe Reader itself. Only PDF documents hosted inside a browser can
be recorded and analysed correctly. Also, only version 7 or later of Adobe Reader is
supported.
• Please note that when recording on Windows Vista, there is a specific animation
function which must be switched off to enable TestDrive-Gold to work correctly. If this
option is switched on, a message is displayed. The following steps can be followed to
disable the option.
Script Structure
While a Script is open it will be shown in the format illustrated below.
Picture
An image of the recorded screen is displayed. If an input action or element in
the Contents list is highlighted, the target element is surrounded by an orange
rectangle on the display. Conversely, clicking an item on the picture will also
highlight that element in the Contents list.
There are 3 buttons at the top left of the panel that determine how the picture is
displayed.
• Best Fit - Resizes the picture so that the whole screen is visible within the
panel
Right clicking an item on the Expected Picture displays a floating menu with three options.
Edit Display the Edit Element screen, see later section for more information.
Checking Enables the element to either be checked or not checked for differences on
playback.
Create Validation Rule Automatically store one of the core properties of a screen field within
a tracked field and then define a Validation Rule which uses this or any other tracked fields or
functions. The following screen is displayed.
Property Use the drop down list to select the core property of the element that will be stored
within the specified tracked field and can then be utilized in the subsequent Validation Rule.
Tracked Field Either select an existing tracked field from the list or key in the name of a new
one that will be used to store the value of the element property.
Element Checking Define whether or not the element on the actual screen will still be
validated against the expected value on playback. It is quite likely that this kind of checking
will no longer be required if a Validation Rule is being created to check the contents of this
element. When the OK button is clicked, the Validation Rule definition screen is displayed.
Please see the later section for more information.
Two additional icons above the screen image control the mode that the panel operates in.
• Mouse icon – Default view whereby clicking on an element in the screen image will
display relevant information about that element in all other panels.
• Outline icon – Markup mode which enables sections of the screen to be annotated.
Markups
You may wish to annotate specific areas of any of the recorded screens, for
example to highlight information which is incorrect and needs to be changed, or for
training or documentation purposes. To do this, click on the outline icon above the
screen picture. Then use the mouse to drag a box around the area of the screen
that you wish to add a comment about. When the mouse is released, a comments
box is displayed as shown below.
Contents
The elements included on the highlighted screen are presented in list form. Click on
an element to highlight it on the screen picture with an orange rectangle and to
populate the Expected Element Properties panel. The Filter button on the main
toolbar can be used to modify the appearance and contents of the element list. See
the Main Panel section near the beginning of this document for more information.
Edit Element
Right click an element either on the screen picture or within the Expected
Contents list and select the ‘Edit’ option in order to display the following
screen.
Track Value Tab
Use this tab to store the contents of this element for use later in this script or in another
script in an Action Map. See the section on Tracked Fields for more information.
Store Start Value The initial contents of this field when the screen was first displayed
will be captured and stored in the specified tracked field.
Store End Value The final contents of this field after all keystrokes have been replayed
will be captured and stored in the specified tracked field.
Track Field The name of the tracked field which has been defined at the Project level
and which will be used to store the contents of this screen element.
Store Whole Field The entire contents of the screen element will be stored.
Store Part Field Only the specified subset of the screen element value will be stored.
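The whole-field versus part-field choice can be sketched as a simple substring operation. The start-position and length controls below are assumptions used only to illustrate the idea:

```python
# Illustration of Store Whole Field vs Store Part Field. The start/length
# parameters are assumed controls for the part-field case (1-based start,
# as screen positions are usually counted from 1).
def track_value(element_value, start=None, length=None):
    if start is None:
        return element_value                         # Store Whole Field
    return element_value[start - 1:start - 1 + length]  # Store Part Field

# e.g. keep only the numeric part of a hypothetical order reference:
order_ref = track_value("ORD-000123", start=5, length=6)
```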
Expected Input Properties
Highlight an input action on the Script pane to view further details about that input
here. The properties displayed vary slightly depending on the type of input selected.
Edit Input
The following display is accessed by right clicking an input action on the Script panel
and selecting the Edit option, then clicking on the Input Properties tab. Please note
that existing input can be changed or removed but new input cannot be added.
Variable Data Field Optionally select the name of a variable data field. The data within
this field in the Variable Data Set being used by the Script will be used for comparison
purposes on playback instead of the fixed input that was actually recorded. See the
later section on variable data for more information.
Tracked Field Optionally select the name of a tracked field. The data contained within
this field at playback time will be used for comparison purposes instead of the fixed
input that was actually recorded. See the later section on tracked fields for more
information.
Summary
The Script or Screen Summary panel has several sections within it, each of which is
described in more detail below. The title of each section for which information exists
is highlighted; headers for sections where no data exists are not.
Tracked Fields
All Tracked Fields in use in this script are listed on this screen. Beneath each field are
listed the screen elements where the tracked field is being used. See the later Tracked
Fields section for more information. Click on any field in the list to highlight the details
for that field on the other Content panels. Click on the ‘click here’ link to define new or
modify existing tracked fields for the Project.
Variable Data
All variable data fields in use in this script are listed on this screen. Beneath each
field are listed either the input values or the start values of the fields for which the
variable data will be used. Click on any field in the list to highlight the details for that
field on the other Content panels. Click on the ‘click here’ link to manage the Local
and Public variable data sets.
Validation Rules
When the script name is in focus, this panel lists all of the Validation Rules
defined for the entire script. There are two levels of Rules:
• Tracked Field. These are effectively defined for the whole script, but will
only be applied to screens where at least one of the tracked fields utilised
by the rule is populated on that screen. If none of the tracked fields are
populated then the Rule is bypassed for the screen. Therefore the Rules at
the Tracked Field level are only applied when a relevant tracked field
changes.
• Screen Rules. Rules defined at the screen level for individual screens in
the script are always applied, regardless of whether or not any tracked
fields are updated on the screen.
Blocks
There are occasions when the key goal is to match a ‘block’ of similar data elements
within a web page, not necessarily the individual items themselves. For example, when
viewing a long list of products where sometimes a new product is added, it is necessary
to ensure that on playback all of the details of the same product on the expected and
actual pages are matched, even if when the script was recorded the product appeared at
the top of the list but on playback it was half way down. To facilitate this matching
process, ‘blocks’ of data can be identified on the screen and these are used on playback
when matching actual and expected elements. Click on the ‘click here’ link on the Blocks
section of the summary panel to launch the Block Wizard.
In the following example a radio button which appears in each of the ‘blocks’ within the
same web page has been chosen as the anchor; this is the item or items that will be
found in every block and can be used to identify the repeating pattern. To define the
anchors, click on one of the repeating items, then hold down the shift key and click on a
second item. These two elements will appear in bright red, all other similar items which
have been identified as anchors will be given a pale red border.
When the anchors have been defined, click Continue to view the actual blocks that
have been identified as shown below. What this actually means is that during
playback, a match of the entire block will be searched for. This means there is no
danger that the radio button from one block will be incorrectly matched with that
from another block, which could result in a mouse click selecting entirely the wrong
company.
If the number of blocks that have been identified does not match the number of
anchors that were found on the previous screen, a red icon instead of a green one
will appear next to the text above the screen picture and the Continue button will be
disabled. If this occurs, click the Back button to modify your anchor selection. Only
when matching numbers of anchors and blocks are found will the wizard allow you
to move to the next stage.
When the blocks have been correctly identified, click on Continue to select
one or more Identity Elements. These are items that are also found in
each block but that will help to identify a block and separate it from the others;
therefore wherever possible these should be unique. Sometimes one single
item is not enough to uniquely identify a block, in this situation a combination
of more than one element can be used, for example Product Code and
Package Quantity. Click on an item to select it as an identity element, the
identity elements found in other blocks will all be highlighted. To select more
than one identity element, hold down the shift key.
If identity elements were not found in every block, a red icon instead of a
green one will appear next to the text above the screen picture and the
Continue button will be disabled. If this occurs, click the Back button to
modify your selection. Only when matching numbers of blocks and identity
elements are found will the wizard allow you to move to the next stage.
Click Continue to display the final screen in the Block Wizard. This enables error
checking to be switched off for all of the elements within the blocks although the rest
of the screen will still be checked for differences on playback. This is a sensible option
to take if the list of items for which blocks have been defined is likely to change, or
if variable data will be used to enable input to be played over alternative blocks.
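The matching behaviour the wizard sets up can be sketched as pairing blocks by their identity elements rather than by position, so a block that moves down the list on playback is still matched with the right expected block. The element names below are hypothetical:

```python
# Sketch of block matching on playback: each block is a dict of element
# values, and blocks are paired by their identity elements, not by position.
def match_blocks(expected, actual, identity):
    key = lambda block: tuple(block[e] for e in identity)
    actual_by_key = {key(b): b for b in actual}
    # Each expected block is paired with the actual block sharing its
    # identity values (None if no match was found on the actual page).
    return {key(b): actual_by_key.get(key(b)) for b in expected}

expected = [{"product_code": "A100", "qty": "6",  "row": 1},
            {"product_code": "B200", "qty": "12", "row": 2}]
actual   = [{"product_code": "NEW1", "qty": "1",  "row": 1},  # new product
            {"product_code": "A100", "qty": "6",  "row": 2},
            {"product_code": "B200", "qty": "12", "row": 3}]

pairs = match_blocks(expected, actual, identity=["product_code", "qty"])
```

Even though a new product pushed every block down one position, each expected block still finds its counterpart by Product Code and Package Quantity.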
Once a Block Set has been defined, it is listed in the Summary panel as shown below.
The Block Set header is followed by any input for elements within the block, which is in
turn followed by the identity elements for the input.
Different right click options are available depending on which item in the block definition
was highlighted at the time. To view the blocks that have been defined in the wizard,
click on the Block Set header in the Blocks panel.
Block Set
Edit Display the first screen of the Block Wizard as described above to modify the
block selection.
Delete Remove the Block Set.
Block Errors Toggle the flag to either include or ignore differences in block
elements during playback.
This option is also present on the final wizard screen as described above.
Input
Edit Open the Edit Input form to modify details about the input.
Delete Remove the input.
Identity Element
Edit Open the Edit Element form to modify details about the element.
Checking Specify whether or not this element should be checked for differences
on playback.
Create Validation Rule Automatically store one of the core properties of a screen
field within a tracked field and then define a Validation Rule which uses this or any
other tracked fields or functions. See the previous ‘Picture’ section for more
information.
Script Options
Script Options can be accessed in several ways:
1. Shown when the ‘Playback’ button is pressed for a selected Script.
2. Double clicking the Script Header in a selected script or right clicking and
selecting the ‘Edit’ option.
3. Clicking on the Options button when saving the Script.
Tracked Fields This button will allow any Tracked Fields that are associated with
this Project to be edited. See the separate section on Tracked Fields.
Start Mode Determines whether the specified application under test will be
launched by TestDrive-Gold at playback time or whether it will be already running
and TestDrive-Gold can simply attach to it. If the ‘Launch’ option is selected and
the application requires any parameters these can be specified as part of the
command.
Application For browser scripts this will be the initial URL for this script, for all
other scripts it is the name and location of the application under test. If the full
application path is not known, the browse button to the right of the field allows the
application to be located.
Num. Runs The number of times this script should be executed during the current
run.
Use Variable Data If the script has been set up to use variable data this field will
be automatically checked. Un-checking this field will cause the script to be
replayed with its original fixed values.
Use Tracked Fields If the script has been set up to use tracked fields, this option
will be automatically checked. Un-checking this option will cause the script to be
replayed with its original fixed values.
Activate Test_IT Choose whether TestBench facilities are to be enabled when this
Script is executed. Using this option will instruct TestBench to start the Test Case
before replaying the script.
If using JWalk, Test_IT gives the capability to ensure a consistent initial environment,
together with database, data area and program parameter verification. In fact all of
the TestBench Test Case functionality can be used.
If using another application which makes database changes on the iSeries or Oracle,
you will be prompted for the User ID which will actually be making the changes; for
Oracle the Computer Name can also be used.
Playback
Any changes to the Test Items are stored back on the server if there is an active
connection when the Keep button is clicked at the end of the run to save the
results. Once the playback of the script has been completed a similar window is
displayed which enables statuses and comments to be entered (see later Scoring
Results section).
Results
Once the execution of a Script is complete or the execution has been interrupted by the
user, a panel in the following format is displayed.
Each screen which was presented during the execution of the Script along with its
associated input is listed together with a signal light indicating:
Green For screens, all items that were actually presented matched your expectations as
defined in the Script or expanded through variable data. For input, the action was
successfully played back.
Red At least one item did not match your expectations or the input was not played back.
Orange The screen was not verified as you had instructed it to be omitted (not checked).
The panes on the display can be moved to different positions and also overlaid. To move
a pane, click on its title and drag it to the new location. To overlay a pane, click on its title
and position it over another pane; a tab for each one will appear at the bottom of the
pane. The panes can be hidden by clicking on the Pin icon, when this is done a tab for
the pane will appear on the left hand side of the screen. To restore the pane to the main
view, click on the tab and then on the Pin icon.
TestDrive-Gold can store two possible formats for this window, which can be selected by
right-clicking on any of the orange window labels.
• Standard – The Picture, Properties and Errors panes are a permanent part of the
display.
• Preferred – This is only available if the ‘Set Preferred’ option has previously been
selected. It enables an alternative to the above format to be created and stored; this will
be specific to the PC being used. Simply adapt the display to suit your requirements and
then select ‘Set Preferred’ to create or overwrite the preferred layout.
• Set Preferred – This selects the current view as a preferred layout which can then be
loaded at any time via the ‘Preferred’ layout option explained above.
Markups
You may wish to annotate specific areas of any of the screens in the results, for example
to highlight information which is incorrect. To do this, click on the outline icon above the
screen picture. Then use the mouse to drag a box around the area of the screen that
you wish to add a comment about. When the mouse is released, a comments box is
displayed as shown below.
Amending Scripts
In the event that changes occur to the system under test for which Scripts already exist, it
is possible to easily change these Scripts.
Changes that can occur are:
• Extra screens can be recorded and inserted into existing test Scripts (see below).
• Incorrect or surplus screens can be deleted from existing test Scripts (see below).
• Both the screen content and input events can be changed to cater for changes (see
earlier sections relating to Script Structure).
• Any variable data that has been set up for a Script can be altered (see later section).
• Validation Rules can be modified.
• The Scripts that will be run as part of an Action Map can be changed.
• Scripts which have been ‘healed’ as part of the Verification method will have been
changed (see later chapter for more details).
Deleting a Screen
To delete a screen from an existing Script:
1 Open the Script that requires changing and click on the relevant Screen name.
2 Right click and select the ‘Delete’ option. You will be asked to confirm that deletion is
required.
3 On confirmation the screen will be deleted.
4 Save the Script.
Care must be taken in the event that Start & End Loops have been specified within the
Script, especially if the screen being deleted is marked as either one (see Variable Data
section).
Inserting a Screen
To insert a screen into an existing Script:
1 Open the Script that requires changing and optionally click on the Screen name
before the point at which you want to insert a new screen.
2 Ensure that the system application under test is on the correct screen for recording.
3 The Record Wizard is displayed from where the application containing the new
screens to be recorded can be chosen. At this point TestDrive-Gold will go directly into
‘Record’ mode with the focus placed on the selected application. Commence the
required testing and when finished press the ‘Stop’ button.
This full Script integration is requested by checking the ‘Activate Test_IT’ box on the
Script Properties panel. The User ID or Computer Name which actually performs the
updates should then be specified. Further details on reviewing results from tests can
be found in the TestBench-PC section of the user guide.
Troubleshooting
If the expected results are not being achieved while either recording or playing back
using TestDrive-Gold, this can often be resolved by modifying the Options that are being
used. TestDrive-Gold is shipped with some standard Options sets for known application
types. The following list describes how to resolve some of the most common scripting
issues.
Recording Issues
A picture is not being taken when clicking on a tab control within a web page.
Most web tab controls use some form of DHTML to simulate the known Windows tab
controls. This can be monitored for using the option ‘Take picture on content changes’.
Too many mouse moves are being recorded.
Try turning the option ‘Generate Mouse Move inputs for content changes’ off. This option
should only be used as a last resort if the option ‘Generate Mouse Move inputs for
elements with events’ does not record the required input.
Extra pictures are being taken when changing the focus between the target
application and TestDrive-Gold or another application.
Turn the option ‘Take picture on Active window changes’ off.
Playback Issues
A picture is being taken too early (or additional pictures are being taken) so the
playback of input fails.
Try setting the ‘Wait for replacement screen similarity to be’ option to 50% (experiment
with this value). This will make TestDrive-Gold wait for a replacement screen that is
more similar to the expected screen than the current actual screen. If you find you need
this option to enable reliable playback, you might want to configure your ‘During
playback, set activity timeout to’ setting to something more appropriate.
If there are no replacement screens, it might be because Internet Explorer is doing some
post-document-complete processing. Try setting the ‘Delay analysis for’ option to
something like 50ms so we wait for this post-processing to occur.
Elements within a web page are not being matched correctly. This results in input
not playing back.
Some web sites are designed around a single page where the multi-page feel is
simulated by URL parameters. By default we ignore these parameters in our matching.
This results in a link to index.htm?home being treated the same as index.htm?contactus.
To resolve this problem, disable the option ‘For web pages, exclude parameters in
comparisons’.
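The effect of the parameter-exclusion option can be sketched as follows; the URL below is hypothetical, and this only illustrates the behaviour described above:

```python
from urllib.parse import urlsplit

# With exclude_params=True (the default behaviour described above), query
# parameters are ignored, so index.htm?home and index.htm?contactus are
# treated as the same page.
def page_identity(url, exclude_params=True):
    parts = urlsplit(url)
    if exclude_params:
        return (parts.netloc, parts.path)
    return (parts.netloc, parts.path, parts.query)
```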
Playback of some input fails in Lotus Notes, Microsoft Access or another similar
application.
These applications do not have a 100% implementation of Microsoft Active Accessibility.
Try turning on the option ‘Playback dynamic input using recorded offset’. It is also worth
noting that any input that requires this option might not play back correctly if the target
element changed location.
Saving Scripts Locally
If a script has been created which needs to be sent to your support contact for help in
diagnosing an issue or for any other purpose, this script can be saved to your local PC
from where it can then be attached to an email. When this script is open in TestDrive-
Gold, hold down the Ctrl key and Right Click the script name to gain access to an
additional menu item called ‘Save Script As File’. If this option is selected, the location
and name of the new script must be chosen. Click ‘Save’ and then click ‘OK’ on the
subsequent ‘Save script to file’ window (the options on this screen are only required
when creating JWalk Integration scripts).
To load a script into TestDrive-Gold which has been saved locally, first of all ensure that
no script is currently open. Then hold down the Ctrl key and Right Click to obtain a list of
local scripts in the default location.