
AMS Design and Model Validation User Guide
Product Version 6.1.5
April 2011

© 2010-2011 Cadence Design Systems, Inc. All rights reserved.


Printed in the United States of America.
Cadence Design Systems, Inc. (Cadence), 2655 Seely Ave., San Jose, CA 95134, USA.
Open SystemC, Open SystemC Initiative, OSCI, SystemC, and SystemC Initiative are trademarks or
registered trademarks of Open SystemC Initiative, Inc. in the United States and other countries and are
used with permission.
Trademarks: Trademarks and service marks of Cadence Design Systems, Inc. contained in this document
are attributed to Cadence with the appropriate symbol. For queries regarding Cadence's trademarks,
contact the corporate legal department at the address shown above or call 800.862.4522. All other
trademarks are the property of their respective holders.
Restricted Permission: This publication is protected by copyright law and international treaties and
contains trade secrets and proprietary information owned by Cadence. Unauthorized reproduction or
distribution of this publication, or any portion of it, may result in civil and criminal penalties. Except as
specified in this permission statement, this publication may not be copied, reproduced, modified, published,
uploaded, posted, transmitted, or distributed in any way, without prior written permission from Cadence.
Unless otherwise agreed to by Cadence in writing, this statement grants Cadence customers permission to
print one (1) hard copy of this publication subject to the following conditions:
1. The publication may be used only in accordance with a written agreement between Cadence and its
customer.
2. The publication may not be modified in any way.
3. Any authorized copy of the publication or portion thereof must include all original copyright,
trademark, and other proprietary notices and this permission statement.
4. The information contained in this document cannot be used in the development of like products or
software, whether for internal or external use, and shall not be used for the benefit of any other party,
whether or not for consideration.
Disclaimer: Information in this publication is subject to change without notice and does not represent a
commitment on the part of Cadence. Except as may be explicitly set forth in such agreement, Cadence does
not make, and expressly disclaims, any representations or warranties as to the completeness, accuracy or
usefulness of the information contained in this document. Cadence does not warrant that use of such
information will not infringe any third party rights, nor does Cadence assume any liability for damages or
costs of any kind that may result from use of such information.
Restricted Rights: Use, duplication, or disclosure by the Government is subject to restrictions as set forth
in FAR 52.227-14 and DFAR 252.227-7013 et seq. or its successor.

Contents

Preface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
    Licensing in amsDmv . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
    Related Documents for amsDmv . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
    Installation, Environment, and Infrastructure . . . . . . . . . . . . . . . . . . . . . . 8
    Virtuoso Tools . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
    Third-Party Tools . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
    Typographic and Syntax Conventions . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
    SKILL Syntax Examples . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
    Form Examples . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11

1 Getting Started with amsDmv . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
    Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
    The amsDmv Model Validation Flow . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
    Simulators Supported By amsDmv . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
    Starting the Virtuoso Design Environment . . . . . . . . . . . . . . . . . . . . . . . . 18
    Starting amsDmv . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
    Starting amsDmv from the Virtuoso Environment . . . . . . . . . . . . . . . . . . 19
    Starting amsDmv from the Command Line . . . . . . . . . . . . . . . . . . . . . . . 21
    Setting Up the Environment for Using amsDmv from an IC 5.1.41 Session . . . 23
    The amsDmv User Interface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
    Menu Bar . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
    Toolbar . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29
    Tabs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
    Entering or Modifying Values in Fields . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
    Specifying Regular Expressions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
    Sorting Data in Columns . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
    Viewing Tooltips for Help Information . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
    Specifying amsDmv Options . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
    amsDmv Examples . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39

2 Setting Up and Running Simulations . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
    Setting Up Your Design for Validation . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
    Setting Up and Running Simulations Using ADE . . . . . . . . . . . . . . . . . . 44
    Setting Up and Running Simulations Using ADE XL . . . . . . . . . . . . . . . . 52
    Setting Up and Running Simulations Using SKILL or OCEAN Commands . . . 58
    Setting Up and Running Simulations Using System Commands . . . . . . 63
    Running a Simulation to Compare with Existing Measured Results and Waveform Signal Data . . . 68
    Comparing Existing Measured Results and Waveform Signal Data for Reference and Compared Data Sources . . . 72
    Specifying a Different Run Type for the Compared Data Source . . . . . . 75
    Running System Commands Before and After Simulation Runs . . . . . . 78
    Starting a Simulation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 79
    Stopping a Simulation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 81
    Using the amsDmv Log . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 81
    Showing and Hiding the Log . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 82
    Clearing the Log Contents . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 83
    Saving the Log Contents to a File . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 83
    Viewing the Simulation and Validation Status Summary . . . . . . . . . . . . . 83
    Saving and Opening the amsDmv Setup . . . . . . . . . . . . . . . . . . . . . . . . . 85
    Saving the amsDmv Setup . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 85
    Saving the amsDmv Setup as an Executable Script and a SKILL File . . . 87
    Running an amsDmv Script File . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 88
    Running an amsDmv SKILL File . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 88
    Running amsDmv from the Command Line . . . . . . . . . . . . . . . . . . . . . . 89

3 Validating Measured Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 91
    Validating Measured Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 92
    Specifying Global Options for Validating Results . . . . . . . . . . . . . . . . . . 98
    Specifying Maximum Acceptable Tolerance Values for Results . . . . . . . 98
    Validating Only Results With Specific Names . . . . . . . . . . . . . . . . . . . . 100
    Disabling Validation of Results With Specific Names . . . . . . . . . . . . . . 100
    Disabling Validation of Specific Results . . . . . . . . . . . . . . . . . . . . . . . . . 101
    Validating Only Results for Specific ADE XL Tests . . . . . . . . . . . . . . . . 101
    Excluding Specific Swept or Corner Points of ADE XL Measured Results from Validation . . . 102
    Including Only Specific Swept or Corner Points of ADE XL Measured Results for Validation . . . 102
    Working with Local Options for Validating Results . . . . . . . . . . . . . . . . 103
    Specifying Local Options for Validating Results . . . . . . . . . . . . . . . . . . 103
    Copying Local Options For a Result to Other Results . . . . . . . . . . . . . . 105
    Deleting Local Options Specified for Results . . . . . . . . . . . . . . . . . . . . 106
    Viewing the Point Details for ADE XL Results . . . . . . . . . . . . . . . . . . . . 107
    Viewing the Failing Points for ADE XL Results . . . . . . . . . . . . . . . . . . . 107
    Excluding Failing Points from Validation . . . . . . . . . . . . . . . . . . . . . . . . 108
    Selecting and Deselecting Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . 111
    Selecting Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 111
    Deselecting Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 111
    Inverting Result Selections . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 111
    Hiding and Showing Global Results Options . . . . . . . . . . . . . . . . . . . . . 112

4 Validating Waveforms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 113
    Validating Waveforms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 114
    Specifying Global Options for Validating Signals . . . . . . . . . . . . . . . . . 121
    Specifying Maximum Acceptable Tolerance Values for Analog Signals . . . 121
    Specifying Logic Time Tolerance Values for Digital Signals . . . . . . . . . 123
    Filtering Glitches in Digital Signals . . . . . . . . . . . . . . . . . . . . . . . . . . . . 124
    Validating Only Signals With Specific Names . . . . . . . . . . . . . . . . . . . . 126
    Disabling Validation of Signals With Specific Names . . . . . . . . . . . . . . 127
    Disabling Validation of Specific Signals . . . . . . . . . . . . . . . . . . . . . . . . . 128
    Validating Only Waveform Signals for Specific Analyses . . . . . . . . . . . 128
    Validating Only Signals for Specific ADE XL Tests . . . . . . . . . . . . . . . . 129
    Excluding Specific Time Ranges of Waveform Signals from Validation . . . 129
    Validating Only Specific Time Ranges of Waveform Signals . . . . . . . . 130
    Working with Local Options for Validating Signals . . . . . . . . . . . . . . . . 130
    Specifying Local Options for Validating Analog Signals . . . . . . . . . . . . 130
    Specifying Local Options for Validating Digital Signals . . . . . . . . . . . . . 132
    Copying Local Options For a Signal to Other Signals . . . . . . . . . . . . . . 135
    Deleting Local Options Specified for Signals . . . . . . . . . . . . . . . . . . . . 136
    Viewing the Failing Areas for a Signal . . . . . . . . . . . . . . . . . . . . . . . . . . 136
    Viewing the Point Details for a Signal . . . . . . . . . . . . . . . . . . . . . . . . . . 139
    Viewing the Failing Areas at a Point . . . . . . . . . . . . . . . . . . . . . . . . . . . 140
    Excluding Failing Areas of Signals from Validation . . . . . . . . . . . . . . . . 141
    Plotting Signals . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 143
    Selecting and Deselecting Signals . . . . . . . . . . . . . . . . . . . . . . . . . . . . 145
    Selecting Signals . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 146
    Deselecting Signals . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 146
    Inverting Signal Selections . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 146
    Hiding and Showing Global Signal Options . . . . . . . . . . . . . . . . . . . . . . 146
    Disabling Automatic Validation of Waveforms . . . . . . . . . . . . . . . . . . . . 147

5 Running Pin Checks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 149

6 amsDmv Command Reference . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 155
    amsDmv Command Options . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 156
    amsDmv Command Line Options . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 156
    Command Line Flags . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 167
    amsDmv Command Examples . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 175


Preface
AMS Design and Model Validation (amsDmv) is an integrated model validation solution that
enables you to validate differences in measured and simulated behavior and interfaces of
reference (for example, design) and compared (for example, model) blocks.
This manual describes how you can set up and run amsDmv. The information presented in
this manual is intended for integrated circuit designers and assumes that you are familiar with
analog, digital and mixed-signal design and simulation.
This preface consists of the following sections:

Licensing in amsDmv on page 8

Related Documents for amsDmv on page 8

Third-Party Tools on page 9

Typographic and Syntax Conventions on page 9


Licensing in amsDmv
amsDmv requires one of the following licenses. The first available license will be checked out
in the order given below:
1. 95200 Virtuoso(R) Analog Design Environment L
2. 95210 Virtuoso(R) Analog Design Environment XL
3. 95220 Virtuoso(R) Analog Design Environment - GXL
Important
If you want to use amsDmv from an IC 5.1.41 session, ensure that you have a
95200 Virtuoso(R) Analog Design Environment L license, or upgrade
your 34510 Virtuoso(R) Analog Design Environment license to the 95200
license. The 95200 license is compatible with the IC 5.1.41 release. Hence, you can
continue using Analog Design Environment in IC 5.1.41 using the 95200 license.
For more information on licensing, see Virtuoso Software Licensing and Configuration
Guide.

Related Documents for amsDmv


The following documents provide more information about the topics discussed in this guide.

Installation, Environment, and Infrastructure

For information on installing Cadence products, see the Cadence Installation Guide.

For information on the Virtuoso design environment, see the Virtuoso Design
Environment User Guide.

For information on database SKILL functions, including data access functions, see the
Virtuoso Design Environment SKILL Reference.

For information on library structure, the library definitions file, and name mapping for data
shared by multiple Cadence tools, see the Cadence Application Infrastructure User
Guide.


Virtuoso Tools

Virtuoso Schematic Editor L User Guide and Virtuoso Schematic Editor XL User
Guide describe Cadence's schematic editor.

Virtuoso Analog Design Environment L User Guide describes the ADE L environment.

Virtuoso Analog Design Environment XL User Guide describes the ADE XL environment.

Virtuoso Analog Design Environment GXL User Guide describes the ADE GXL
environment.

Virtuoso Spectre Circuit Simulator User Guide and Virtuoso Spectre Circuit
Simulator Reference describe Cadence's Spectre analog circuit simulator.

Virtuoso UltraSim Simulator User Guide describes Cadence's multi-purpose, single-engine, hierarchical simulator, designed for the verification of analog, mixed-signal, memory, and digital circuits.

Virtuoso AMS Designer Simulator User Guide describes Cadence's AMS mixed-signal circuit simulator.

Virtuoso Visualization and Analysis Tool User Guide contains information for
viewing waveforms and post-processing simulation results.

Third-Party Tools
To view any .swf multimedia files, you need:

A Cadence Online Support Login.

Flash-enabled web browser, for example, Internet Explorer 5.0 or later, Netscape 6.0 or
later, or Mozilla Firefox 1.6 or later. Alternatively, you can download Flash Player (version
6.0 or later) directly from the Adobe website.

Speakers and a sound card installed on your computer for videos with audio.

Typographic and Syntax Conventions


This list describes the syntax conventions used in this manual.

literal
    Nonitalic words indicate keywords that you must enter literally. These keywords represent command (function, routine) or option names.

argument (z_argument)
    Words in italics indicate user-defined arguments for which you must substitute a name or a value. (The characters before the underscore (_) in the word indicate the data types that this argument can take. Names are case sensitive. Do not type the underscore (z_) before your arguments.)

[ ]
    Brackets denote optional arguments.

...
    Three dots (...) indicate that you can repeat the previous argument. If you use them with brackets, you can specify zero or more arguments. If they are used without brackets, you must specify at least one argument, but you can specify more.

    argument...     Specify at least one, but more are possible.
    [argument]...   Specify zero or more.

,...
    A comma and three dots together indicate that if you specify more than one argument, you must separate those arguments by commas.

If a command line or SKILL expression is too long to fit inside the paragraph margins of this
document, the remainder of the expression is put on the next line and indented.
When writing the code, put a backslash (\) at the end of any line that continues on to the next
line.

SKILL Syntax Examples


The following examples show typical syntax characters used in SKILL. For more information,
see the Cadence SKILL Language User Guide.
Example 1
list( g_arg1 [g_arg2] ...) => l_result

Example 1 illustrates the following syntax characters.

list
    Plain type indicates words that you must enter literally.

g_arg1
    Words in italics indicate arguments for which you must substitute a name or a value.

( )
    Parentheses separate names of functions from their arguments.

_
    An underscore separates an argument type (left) from an argument name (right).

[ ]
    Brackets indicate that the enclosed argument is optional.

=>
    A right arrow points to the return values of the function. Also used in code examples in SKILL manuals.

...
    Three dots indicate that the preceding item can appear any number of times.
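To make the notation concrete, here is an arbitrary call that follows the Example 1 syntax; the argument values are chosen only for illustration.

list( 1 "two" 3.0 ) => (1 "two" 3.0)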

Example 2
needNCells(
    s_cellType | st_userType
    x_cellCount
)
=> t/nil

Example 2 illustrates two additional syntax characters.


|
    Vertical bars separate a choice of required options.

/
    Slashes separate possible return values.

Form Examples
Each form shows you the system defaults:

Filled-in buttons are the default selections.

Filled-in values are the default values.



1 Getting Started with amsDmv
This chapter describes the following topics:

Introduction on page 14

The amsDmv Model Validation Flow on page 16

Simulators Supported By amsDmv on page 17

Starting the Virtuoso Design Environment on page 18

Starting amsDmv on page 19

The amsDmv User Interface on page 24

Specifying amsDmv Options on page 35

amsDmv Examples on page 39


Tip
Cadence recommends that you work through the amsDmv examples (see amsDmv
Examples on page 39) while referring to this user guide.


Introduction
Behavioral models may initially be validated by the designer against the original transistor
level design. However, as designs evolve and change over time, designers may not have the
time to continuously validate models against the modified designs. Using the original (out of
sync) model can result in errors such as pin list mismatches between the design and the
model, or worse, incorrect model behavior that hides design flaws. Because of this,
continuous model validation is mandatory in the design creation process.
AMS Design and Model Validation (amsDmv) is an integrated model validation solution that
allows you to validate differences in measured and simulated behavior and interfaces of
reference (for example, design) and compared (for example, model) blocks.
Note: amsDmv is supported in IC 5.1.41, IC 6.1.4 ISR 3, and later releases. However, before
you use amsDmv from the IC 5.1.41 release, you must set up your system as described in
Setting Up the Environment for Using amsDmv from an IC 5.1.41 Session on page 23.
amsDmv supports the following:

Running simulations in various environments (ADE L, ADE XL, SKILL or OCEAN commands) and from the command line (using system commands, including the irun or runams commands) to get measured results and waveform signals to be validated.

Validation of measured results: gain, power, delay, noise, and so on.

Validation of analog and digital waveform signals saved from simulations.
Note: amsDmv supports validation of only waveform data that is in the SignalScan Turbo 2 (SST2) format.

Validation of pin and module interfaces of the design and model.

Ability to use models created using any modeling language, such as Verilog, Verilog-AMS, wreal, VHDL-AMS, schematic, SPICE, SystemC, and so on. This is because amsDmv compares only measured results and waveforms, and is independent of the modeling language used.

Ability to reuse existing testbenches and simulation setup in ADE states and ADE XL views.

Reporting and debugging capabilities.

Ability to save the amsDmv setup as an executable script and a SKILL file. These files can be used to run the setup from the UNIX command line (using the script file) or from the CIW (using the SKILL file) as part of a scheduled regression run, as shown in the sketch after this list.
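For example, if the setup was saved to a SKILL file named amsDmvSetup.il (a hypothetical name; use whatever name you chose when saving), it can be replayed from the CIW with a plain load call:

load("./amsDmvSetup.il")   ; runs the saved amsDmv setup in the current Virtuoso session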

amsDmv doesn't differentiate between behavioral models and transistor level simulation
because it compares only measured results and waveforms. Thus, you can use amsDmv for
model validation as well as comparing model vs. design (design validation using the model
as reference), design vs. design and model vs. model.
The following table describes the use models supported in amsDmv:
Table 1-1 amsDmv Use Models

Reference Design vs. Model
    Validating that the model conforms to the behavior of the design it will replace within a higher level block.

Reference Design vs. Design
    Comparing differences between two implementations of the same design. The differences can be in the processes, topologies, settings, and so on.

Reference Model vs. Design
    In a top-down flow, determining that the implemented design conforms to the original model (the model is a simulatable specification).

Reference Model vs. Model
    Comparing two implementations of the same model. The models can be in different languages, or analog (Verilog-A) vs. AMS, AMS vs. digital, and so on.

Note: Automatic model validation is not a replacement for accurate modeling or detailed
manual model validation. Model validation complements the modeling process. Given that a
complete verification of the model was done when the model was created, it might be
sufficient to run only a sanity check as the regression setup.


The amsDmv Model Validation Flow


Figure 1-1 amsDmv Model Validation Flow

The model validation flow in amsDmv involves the following steps:


1. Specifying the design and the model along with a testbench that is used to verify or
characterize the design.
2. Simulating the design and the model with the testbench using any of the following:

Settings in an ADE L state or ADE XL view

System commands (including irun or runams command)

SKILL or OCEAN commands

amsDmv automatically compares the simulation results (measured results and


waveform data) based on the specified settings, and displays a pass/fail status report for
each measured result and waveform signal.
3. Performing pin checking to verify whether there are any differences between the pin/module interfaces of the reference design and the compared model.
4. Manually correcting the design or the model based on the following:

The analysis of the measured results and waveform signals that have a fail status.

The errors reported during pin checking.

5. Running model validation again on the corrected design and model to verify that all
measured results and waveform signals have a pass status and that there are no
differences between the pin/module interfaces of the reference design and the compared
model.

Simulators Supported By amsDmv


The following simulators are supported by amsDmv:

Virtuoso Spectre circuit simulator

Virtuoso UltraSim simulator

Virtuoso Accelerated Parallel Simulator

Virtuoso AMS Designer

SpectreVerilog

UltraSimVerilog


Starting the Virtuoso Design Environment


To start the Virtuoso Design Environment, do the following:
1. If you want to use amsDmv from an IC 5.1.41 session, set up your system as described
in Setting Up the Environment for Using amsDmv from an IC 5.1.41 Session on page 23.
2. At the UNIX command prompt, do one of the following:

Type the following command if you are using an IC 6.1 or later installation:
virtuoso &

Type the following command if you are using an IC 5.1.41 installation:


icms &

The CIW appears. You can start amsDmv from the CIW. See Starting amsDmv on
page 19 for more information.


Starting amsDmv
You can start amsDmv from the Virtuoso environment or the command line.
Note the following:

amsDmv is supported in IC 5.1.41, IC 6.1.4 ISR 3, and later releases.

Before you start amsDmv, ensure that the simulators you want to use with amsDmv are
set up in your path. For more information about the simulators supported by amsDmv, see
Simulators Supported By amsDmv on page 17.
Important
Before you start amsDmv from an IC 5.1.41 session, you must set up your system
as described in Setting Up the Environment for Using amsDmv from an IC 5.1.41
Session on page 23.

Starting amsDmv from the Virtuoso Environment


To start amsDmv from the Virtuoso environment, do the following:
1. Start the Virtuoso environment.
For more information, see Starting the Virtuoso Design Environment on page 18.
2. Choose Tools – AMS Model Validation.
Note: To start amsDmv from an IC 5.1.41 session, type the following command in the
CIW:
_amsDmv()

The AMS Design and Model Validate form appears. For more information about this form, see
The amsDmv User Interface on page 24.
Note: Figures 1-2 and 1-3 show the AMS Design and Model Validate form that appears in the
IC 6.1 and IC 5.1.41 releases. The ADE XL run type is not displayed in the form for the IC
5.1.41 release because ADE XL is not available in the IC 5.1.41 release.

Figure 1-2 AMS Design and Model Validate form in IC 6.1 Release

Figure 1-3 AMS Design and Model Validate form in IC 5.1.41 Release

Starting amsDmv from the Command Line


To start amsDmv from the command line, do the following:

Type the following command in a UNIX terminal:


amsDmv

The AMS Design and Model Validate form appears. For more information about this
form, see The amsDmv User Interface on page 24.

Note the following when you start amsDmv from the command line:

The ADE L, ADE XL and SKILL run types are not displayed in the AMS Design and
Model Validate form because these run types are not supported in command line mode.


Pin checking is not supported as this feature requires the Virtuoso environment.

Setting Up the Environment for Using amsDmv from an IC 5.1.41 Session


You can use amsDmv from an IC 5.1.41 session. However, you must do the following before
you use amsDmv from an IC 5.1.41 session:
1. Download and install IC 6.1.4 ISR 3 or a later version.
2. Set up the license that is required by amsDmv.
For more information, see Licensing in amsDmv on page 8.
3. Set the AMS_DMV_PATH environment variable to point to the /tools/dfII/bin folder
in your IC 6.1.4 ISR 3 (or later) installation.
setenv AMS_DMV_PATH <IC614ISRInstallDir>/tools/dfII/bin

4. Add the following entry to your .cdsinit file:

load(strcat(getShellEnvVar("AMS_DMV_PATH") "/../etc/dcm/amsDmv.ile"))

The amsDmv.ile file is an encrypted SKILL file containing the procedures for using
amsDmv in Virtuoso.
Caution
If you use amsDmv from an IC 5.1.41 session, do not set up your
environment to use the IC 6.1.4 ISR 3 or later releases because this can
result in significant issues with the IC 5.1.41 software. You only need to
set the AMS_DMV_PATH environment variable and load the amsDmv.ile
file in your .cdsinit file as described in steps 3 and 4 above.
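For reference, a minimal .cdsinit fragment that combines steps 3 and 4 might look like the following sketch. It assumes AMS_DMV_PATH has already been set in the shell (step 3) and simply skips the load when the variable is missing; only the standard SKILL functions when, getShellEnvVar, strcat, and load are used.

; Load the amsDmv procedures only when AMS_DMV_PATH is set in the shell
when( getShellEnvVar("AMS_DMV_PATH")
    load( strcat( getShellEnvVar("AMS_DMV_PATH") "/../etc/dcm/amsDmv.ile" ) )
)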


The amsDmv User Interface

The following sections describe the parts and features of the amsDmv user interface:

Menu Bar on page 25

Toolbar on page 29


Tabs on page 31

Entering or Modifying Values in Fields on page 32

Sorting Data in Columns on page 33

Viewing Tooltips for Help Information on page 33

Menu Bar
The menu bar in amsDmv has the following menus.

File

Edit

Run

View

File
The options in the File menu are described below:

New
    Clears the current amsDmv setup.

Open View
    Opens an existing amsDmv setup cellview. For more information, see Saving and Opening the amsDmv Setup on page 85.

Save View
    Saves the current amsDmv setup to a cellview so that you can load it later. For more information, see Saving and Opening the amsDmv Setup on page 85.

Save as View
    Saves the current amsDmv setup to a new cellview. For more information, see Saving and Opening the amsDmv Setup on page 85.

Open File
    Opens an existing amsDmv setup file. For more information, see To open an amsDmv setup from a file, do the following: on page 86.

Save File
    Saves the current amsDmv setup to a file so that you can load it later. For more information, see Saving the amsDmv Setup on page 85.

Save As File
    Saves the current amsDmv setup to a new file. For more information, see Saving the amsDmv Setup on page 85.

Save Command File
    Saves the current amsDmv setup as an executable script and a SKILL file. These files can be used to run the amsDmv setup from the UNIX command line (using the script file) or from the CIW (using the SKILL file) as part of a scheduled regression run. For more information, see Saving the amsDmv Setup as an Executable Script and a SKILL File on page 87, Running an amsDmv Script File on page 88, and Running an amsDmv SKILL File on page 88.

Save Log
    Saves the contents of the Log group box to a text file. For more information, see Using the amsDmv Log on page 81.

Exit
    Closes the AMS Design and Model Validate form.

Edit
The options in the Edit menu are described below:

Preferences
    Opens the Preferences for AMS Design and Model Validation form where you can specify amsDmv options. For more information, see Specifying amsDmv Options on page 35.

Pre/Post Run
    Specifies the pre-run and post-run commands that need to be run before or after the reference and compared data sources are simulated. For more information, see Running System Commands Before and After Simulation Runs on page 78.

Clear Log
    Clears the contents of the amsDmv log. For more information, see Using the amsDmv Log on page 81.

Run
The options in the Run menu are described below:

Run Simulations
    Runs simulations on the reference and compared data sources specified in the Source tab. For more information, see Chapter 2, Setting Up and Running Simulations.

Load and Validate
    Loads and validates results and waveform signals from the specified ADE XL result database and waveform directory. For more information, see Comparing Existing Measured Results and Waveform Signal Data for Reference and Compared Data Sources on page 72.

Validate
    Validates currently loaded measured results and waveforms. See also Disabling Automatic Validation of Waveforms on page 147.

View
The options in the View menu are described below:

Show/Hide Options
    Shows or hides the Global Results Options group box if you are using the Measured Results tab (for more information, see Hiding and Showing Global Results Options on page 112), or the Global Signal Options group box if you are using the Waveform Signals tab (for more information, see Hiding and Showing Global Signal Options on page 146).

Show/Hide Log
    Shows or hides the Log group box. For more information, see Showing and Hiding the Log on page 82.

Show Last Simulation
    Displays the ADE L user interface for the last simulation run. This allows you to debug the last simulation run if it failed for some reason, or if some waveforms or measured results are not as expected.


Toolbar
The buttons in the amsDmv toolbar are described below:

New Setup
    Clears the current amsDmv setup.

Open Setup
    Opens an existing amsDmv setup. For more information, see To open an amsDmv setup from a file, do the following: on page 86.

Save Setup
    Saves the current amsDmv setup. For more information, see Saving the amsDmv Setup on page 85.

Save Setup As
    Saves the current amsDmv setup to a new file or view. For more information, see Saving the amsDmv Setup on page 85.

Save Setup As Executable Script
    Saves the current amsDmv setup as an executable script and a SKILL file. These files can be used to run the amsDmv setup from the UNIX command line (using the script file) or from the CIW (using the SKILL file) as part of a scheduled regression run. For more information, see Saving the amsDmv Setup as an Executable Script and a SKILL File on page 87, Running an amsDmv Script File on page 88, and Running an amsDmv SKILL File on page 88.

Save Log To File
    Saves the contents of the Log group box to a text file. For more information, see Using the amsDmv Log on page 81.


Run Simulation and Validation
    Runs simulations on the reference and compared data sources specified in the Source tab and displays the validation results in the Measured Results and Waveform Signals tabs. For more information, see Chapter 2, Setting Up and Running Simulations.

Stop Simulation
    Stops the simulations that are currently running. For more information, see Stopping a Simulation on page 81.

Load and Validate Results and Waveforms
    Loads and validates results and waveform signals from the specified ADE XL result database and waveform directory. For more information, see Comparing Existing Measured Results and Waveform Signal Data for Reference and Compared Data Sources on page 72.

Validate Currently Loaded Results and Waveform Signals
    Validates currently loaded measured results and waveforms. See also Disabling Automatic Validation of Waveforms on page 147.

Show/Hide Options
    Shows or hides the Global Results Options group box if you are using the Measured Results tab (for more information, see Hiding and Showing Global Results Options on page 112), or the Global Signal Options group box if you are using the Waveform Signals tab (for more information, see Hiding and Showing Global Signal Options on page 146).

Edit Preferences
    Opens the Preferences for AMS Design and Model Validation form where you can specify amsDmv options. For more information, see Specifying amsDmv Options on page 35.

Show/Hide Log
    Shows or hides the Log group box. For more information, see Showing and Hiding the Log on page 82.

Tabs
The tabs in the AMS Design and Model Validate form are described below:
Tip
You can select the Advanced Mode check box in the Preferences for AMS Design
and Model Validation form to enable and use advanced amsDmv features in the
tabs. For more information about the Advanced Mode check box, see Specifying
amsDmv Options on page 35.

Source
    Allows you to set up and run simulations on the reference and compared data sources so that the resulting waveform signals and measured results can be compared and validated using amsDmv. You can also specify an existing ADE XL result database and waveform directory from which measured results and waveforms need to be loaded and validated. For more information about using this tab, see Chapter 2, Setting Up and Running Simulations.

Measured Results
    Allows you to compare and validate measured results. For more information about using this tab, see Chapter 3, Validating Measured Results.

Waveform Signals
    Allows you to compare and validate waveform signals. For more information about using this tab, see Chapter 4, Validating Waveforms.

Pin Check
    Allows you to verify whether there are any differences between the pin/module interfaces of the reference design and the compared model. For more information about using this tab, see Chapter 5, Running Pin Checks.

Status
    Displays a summary report of the validation status. For more information about using this tab, see Viewing the Simulation and Validation Status Summary on page 83.

Entering or Modifying Values in Fields


To enter or modify values in a field, do the following:
1. Double-click on the field.
2. Enter or modify the values in the field.
3. Press the Enter or the Tab key.
Tip
You can also right-click on a field and use the shortcut menu (if available) to specify
or modify values in fields.

Specifying Regular Expressions


On the Measured Results and Waveform Signals tabs, you can specify regular expressions to include or exclude specific measured results and waveform signals, or all measured results and waveform signals for specific ADE XL tests, from validation.
If the names of ADE XL tests, measured results, or waveform signals contain characters that are also used in regular expressions, ensure that they are escaped using the \ (backslash) character so that they are not interpreted when the regular expression is evaluated. For example, a signal named a[3] should be specified in the regular expression as a\[3\] because [ and ] have a meaning to the regular expression.
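The following minimal SKILL sketch, which is not part of amsDmv, illustrates the effect of the escaping; it uses SKILL's built-in rexMatchp matcher and a hypothetical bus-bit signal name. The doubled backslashes are only SKILL string-literal escaping; in the amsDmv field you type a\[3\] directly.

rexMatchp("a\\[3\\]" "a[3]") => t    ; escaped brackets match the literal name a[3]
rexMatchp("a[3]" "a[3]") => nil      ; unescaped [3] is a character class, so no match
rexMatchp("a[3]" "a3") => t          ; ...and that character class matches "a3" instead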


Sorting Data in Columns


You can sort the data in the columns in the Measured Results and Waveform Signals tabs.
To sort data in columns, do the following:

Click the name of the column by which you want to sort the data.

Note: Sorting may be inaccurate if the Use Engineering Notation check box is selected in
the Preferences for AMS Design and Model Validation form.

Viewing Tooltips for Help Information


Besides the user documentation for amsDmv, you can refer to the tooltips that are available
for all the fields in the amsDmv user interface for help information.


Place the mouse pointer on a field to view the tooltip for the field.



Specifying amsDmv Options


To specify amsDmv options, do the following:
1. Do one of the following:

Choose Edit – Preferences.

Click the Edit Preferences toolbar button.

The Preferences for AMS Design and Model Validation form appears.

The options in the form are described below:

Advanced Mode
    Enables the following advanced features:
    Specifying a different run type for the compared data source. For more information, see Specifying a Different Run Type for the Compared Data Source on page 75.
    Setting local options for validating measured results and signals. For more information, see Working with Local Options for Validating Results on page 103 and Working with Local Options for Validating Signals on page 130.
    Disabling validation of multiple results (see Disabling Validation of Specific Results on page 101) or signals (see Disabling Validation of Specific Signals on page 128).
    Plotting multiple signals (see Plotting Signals on page 143).
    Excluding specific failing areas of a signal from validation (see Excluding Failing Areas of Signals from Validation on page 141).
    Detailed debugging and plotting options.


Statistics
    Displays the minimum and maximum values for each measured result and waveform signal in the reference and compared data sources. These values are displayed in the following columns in the Measured Results and Waveform Signals tabs: Max Ref Value, Min Ref Value, Max Comp Value, and Min Comp Value. For more information about these columns, see Chapter 3, Validating Measured Results, and Chapter 4, Validating Waveforms.

Verbose
    Displays the following: additional popups when you perform operations such as saving the amsDmv setup (see Saving the amsDmv Setup on page 85), and more detailed information in the log area (see Showing and Hiding the Log on page 82) during a simulation run.

Run Update
    When the simulation has successfully completed, automatically opens the Measured Results or Waveform Signals tab.

Keep All Plots
    Retains existing plot windows and creates a new plot window for each new plot action. By default, the same plot window is updated with new plots.

SimVision
    By default, the Virtuoso Visualization and Analysis tool is used to plot waveform signals. Select this check box to use SimVision as the default plotting tool. For more information about plotting waveform signals, see Plotting Signals on page 143.

Using icxx
    Indicates whether you are using the icms or icfb executable from the IC 5.1.41 release. This check box is automatically selected if you are running amsDmv from the IC 5.1.41 release.

Split Buses
    By default, bus signals are plotted as a single waveform in SimVision. Select this check box to plot each bit of bus signals as separate waveforms.

Keep Last Simulation Session
    Allows you to debug the last simulation run using View – Show Last Simulation. Currently, this option is supported only for ADE L simulation runs.

DCM
    The ADE L states created by the Design Characterization and Modeling (DCM) tool may be located outside the .artist_states directory. Select this check box to enable selecting ADE L states located outside the .artist_states directory using a file browser.

Use Engineering Notation
    Uses engineering notation to display values in the following columns in the Measured Results and Waveform Signals tabs: Worst Absolute Diff, Worst Relative Diff, Max Ref Value, Min Ref Value, Max Comp Value, and Min Comp Value. Deselect this check box to use scientific notation to display values in these columns. Note: Sorting of data in columns may be inaccurate if this check box is selected. For more information, see Sorting Data in Columns on page 33.

2. Select the check box next to an option to enable the option. Deselect the check box to disable the option.

amsDmv Examples
Examples for amsDmv are available in the /tools/dfII/samples/dcm/amsDmv/
directory of your IC 6.1 installation.
Note: These examples are not available in your IC 5.1.41 installation.
Each example contains:

The required input files and libraries.

A README file that describes the objective of the example and the steps to work through
it.

A files directory that contains the files saved or created by working through the
example.

The following examples can be run using amsDmv from an IC 6.1.4 ISR 3 or later installation:

amsDmvAdeXlRun.tar.Z
    Example of amsDmv ADE XL run with measured results and waveform signal validation.

amsDmvAdeXlWreal.tar.Z
    Example of amsDmv ADE XL run with measured results validation for a design vs. Wreal model.

amsDmvAmsIrun.tar.Z
    Example of amsDmv irun run with waveform signal validation.

amsDmvAmsSst2.tar
    Example of amsDmv AMS/SST2 waveform validation.

amsDmvPinCheck.tar.Z
    Example of amsDmv design vs. model pin interface checks.

amsDmvPsfData.tar
    Example of amsDmv psf waveform validation.

amsDmvRdbData.tar.Z
    Example of amsDmv ADE XL measured results and waveform validation.

The following examples can be run using amsDmv from an IC 5.1.41 installation:

amsDmvAdeIC5141.tar.Z
    Example of amsDmv ADE run from an IC 5.1.41 installation with measured results and waveform signal validation.

amsDmvIC5141PinCheck.tar.Z
    Example of amsDmv design vs. model pin interface checks from an IC 5.1.41 installation.


2 Setting Up and Running Simulations
The Source tab in the AMS Design and Model Validate form allows you to set up and run
simulations on the reference (for example, design) and compared (for example, model)
blocks so that the resulting waveform signals and measured results can be compared and
validated using amsDmv. You can also specify an existing ADE XL result database and
waveform directory from which measured results and waveforms need to be loaded and
validated.
This chapter describes the following topics:

Setting Up Your Design for Validation on page 43

Setting Up and Running Simulations Using ADE on page 44

Setting Up and Running Simulations Using ADE XL on page 52

Setting Up and Running Simulations Using SKILL or OCEAN Commands on page 58

Setting Up and Running Simulations Using System Commands on page 63

Running a Simulation to Compare with Existing Measured Results and Waveform Signal
Data on page 68

Comparing Existing Measured Results and Waveform Signal Data for Reference and
Compared Data Sources on page 72

Specifying a Different Run Type for the Compared Data Source on page 75

Running System Commands Before and After Simulation Runs on page 78

Starting a Simulation on page 79

Stopping a Simulation on page 81

Using the amsDmv Log on page 81

Viewing the Simulation and Validation Status Summary on page 83

Saving and Opening the amsDmv Setup on page 85


Saving the amsDmv Setup as an Executable Script and a SKILL File on page 87

Running an amsDmv Script File on page 88

Running an amsDmv SKILL File on page 88

Running amsDmv from the Command Line on page 89


Setting Up Your Design for Validation


For validating waveforms, amsDmv requires that the waveform signals created from the
simulation of the reference and compared data sources are of the same type. The signal
types can be either real numbers (created from SPICE nets or analog disciplines such as
electrical, real, or wreal signals) or logic or bus signals (created from digital nets).
You can ensure that the signal types are the same by using the same testbench for the
reference and compared data sources. If different testbenches are used for the reference and
compared data sources, ensure that the resulting waveform signals are of the same type.
The AMS Designer simulator automatically resolves different signal types by inserting the
required connect modules between signals of different disciplines. For example, if an analog
design is simulated with a digital testbench, or a digital design is simulated with an analog
testbench, the AMS Designer simulator inserts the required connect modules to convert
digital to analog and analog to digital. However, for nets for which a discipline is either not
specified or cannot be determined through discipline resolution, the AMS Designer simulator
needs to be forced to make the required domain conversion.
For example, if the output of a digital design inside an analog testbench is not connected to
anything in the analog testbench, it will not be automatically converted to the analog domain
because its discipline cannot be determined through discipline resolution. In such cases, you
can do one of the following to force the AMS Designer simulator to make the conversion from
digital to analog:

Connect the floating digital output to some analog primitive, for example, to a 0 Amp current source or a high-ohm resistor.

Set the net to have the electrical discipline by doing one of the following:

Adding the netDiscipline=electrical property on the net in the schematic.

Using the -setdiscipline option for the elaborator.


For more information about the -setdiscipline option, see the Virtuoso AMS
Designer Simulator User Guide.

Similarly, if the output of an analog design inside a digital testbench is connected to a net in
the testbench that is not used in the digital context, the wire will remain in the analog domain.
In such cases, you can do one of the following to force the AMS Designer simulator to make
the conversion from analog to digital:

Connect the analog output to a dummy digital element, such as a buffer.

Set the net to have the logic discipline by doing one of the following:


Adding the netDiscipline=logic property on the net in the schematic.

Using the -setdiscipline option for the elaborator.


For more information about the -setdiscipline option, see the Virtuoso AMS
Designer Simulator User Guide.

For more information about disciplines, discipline resolution and connect modules, see
"Mixed-Signal Aspects of Verilog-AMS" in the Cadence Verilog-AMS Language
Reference.
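As noted above, one way to force an undriven or floating net into the required domain is to attach the netDiscipline property to that net in the testbench schematic, which is normally done through the schematic editor's property form. The following minimal SKILL sketch shows the same idea done programmatically; the library, cell, and net names are hypothetical, and you should confirm the resulting property in the schematic before elaborating.

; Attach an explicit discipline to a net in a testbench schematic (hypothetical names)
cv = dbOpenCellViewByType("myLib" "myTestbench" "schematic" "" "a")
net = dbFindNetByName(cv "dig_out")
when( net
    net~>netDiscipline = "electrical"   ; use "logic" to force the digital domain instead
)
dbSave(cv)
dbClose(cv)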

Setting Up and Running Simulations Using ADE


You can use the setup in an existing ADE state to run simulations on the reference and
compared data sources so that the resulting waveform signals and measured results can be
compared and validated using amsDmv.
To set up and run an ADE simulation, do the following:
1. Click the Source tab.
2. Select the run type as ADE L.
Note: The run type is ADE L if you are using amsDmv from an IC 6.1 installation and

ADE if you are using amsDmv from an IC 5.1.41 installation.

3. Right-click on the Value column in the Reference Data Source group box and choose
Browse to existing ADE setup.

The Type of ADE Setup form appears.

4. Do one of the following to select the ADE state:

Click Directory to:
    Select an ADE state from a directory.
    Note: If the ADE state you want to use to simulate the reference data source is not located under the .artist_states directory, ensure that the DCM check box is selected in the Preferences for AMS Design and Model Validation form. For more information, see Specifying amsDmv Options on page 35.
    The Select design/config for state form appears. Do the following:
    1. Select the library, cell, and view for the reference data source, then click OK. The Simulator form appears.
    2. From the Select simulator drop-down list, select the simulator you want to use, then click OK. The Select optional top state directory form appears if the DCM check box is selected in the Preferences for AMS Design and Model Validation form. Select the directory containing the ADE state you want to use and click Choose. The Select state form appears.
    3. Select the ADE state you want to use to run simulations on the reference data source and click Choose.

Click Library/Cell/View to:
    Select an ADE cellview state. The Select existing Library/Cell/View State form appears. Select the cellview state and click OK.

The names of the selected library, cell, view, simulator and state are displayed in the
Reference Data Source group box.

5. Do one of the following in the Compared Data Source group box:

Select              To

New View/Config     Specify that the compared data source will use the setup in the ADE
                    state specified for the reference data source for simulation purposes,
                    but will use a different design or config view for simulation purposes.
                    To specify the config view for the compared data source, do the
                    following:
                    1. Right-click on the Value column in the Compared Data Source
                       group box and choose Browse Override view/config.
                       The Select override view/config form appears.
                    2. Select the config view and click OK.
                       The names of the selected library, cell and view are displayed in
                       the Compared Data Source group box.
                    Tip: You can also double-click on any row in the Value column in the
                    Compared Data Source group box to specify the library, cell and view
                    names. For more information, see Entering or Modifying Values in
                    Fields on page 32.
                    Tip: You can right-click on the Value column in the Compared Data
                    Source group box and choose Open ADE L to view the simulation setup
                    for the compared data source in ADE L.

New Settings        Specify a different ADE state for the compared data source.
                    To specify the ADE state for the compared data source, do the
                    following:
                    1. Right-click on the Value column in the Compared Data Source
                       group box and choose Browse to existing ADE setup.
                       The Type of ADE Setup form appears.
                    2. Follow the procedure described in step 4 to select the ADE state
                       for the compared data source.

Tip
You can double-click on any row in the Value column in the Reference Data Source
or the Compared Data Source group box to specify the library, cell, view,
simulator, alternate state directory and state names. For more information, see
Entering or Modifying Values in Fields on page 32.


Tip
You can right-click on the Value column in the Reference Data Source or the
Compared Data Source group box and choose Open ADE L to view the
simulation setup for the data source in ADE L.
6. Run the simulation by doing one of the following:

   Choose Run – Run Simulations.

   Click the corresponding toolbar button.

For more information about running simulations, see Starting a Simulation on page 79.
7. The following message box appears after the simulation runs for the reference and
compared data sources are completed.

8. Click OK.
9. Use the Measured Results and Waveform Signals tabs to validate the differences
reported in the measured results and waveform signals for the reference and compared
data sources.

For more information about validating differences in measured results, see


Chapter 3, Validating Measured Results.

For more information about validating differences in waveform signals, see


Chapter 4, Validating Waveforms.

Note the following:



The measured results from ADE simulation runs are not saved to a persistent result
database. Because of this, you can use amsDmv to validate the measured results only
while the ADE simulation session is active. Therefore, if you run an ADE simulation using
amsDmv in regression mode, you cannot debug failures reported for measured results
because, after the regression run is complete, the ADE session is closed and the
measured results data is lost. In such cases, you can do one of the following:

Open the amsDmv user interface and re-run the ADE simulation to debug the
failures reported for measured results.

Run an ADE XL simulation using amsDmv in regression mode. The measured
results from ADE XL simulation runs are saved in a persistent results database
(.rdb) file. So, you can use amsDmv to validate measured results from ADE XL
simulation runs even after the ADE XL simulation session is closed.

For more information about running amsDmv in regression mode, see Running an
amsDmv Script File on page 88 and Running an amsDmv SKILL File on page 88.

The waveform data created from ADE simulation runs are stored in a SignalScan Turbo 2
(SST2) database. So, you can use amsDmv to validate waveform signals from ADE
simulation runs even after the ADE simulation session is closed.

Setting Up and Running Simulations Using ADE XL


You can use the setup in an existing ADE XL view to run simulations on the reference and
compared data sources so that the resulting waveform signals and measured results can be
compared and validated using amsDmv.
Note: Running simulations using ADE XL is supported only in IC 6.1.4 ISR 3 and later
releases. ADE XL simulation is not supported in IC 5.1.41 releases.
To set up and run an ADE XL simulation, do the following:
1. Click the Source tab.

2. Select the run type as ADE XL.

3. Right-click on the Value column in the Reference Data Source group box and choose
Browse to existing ADE XL view.
The Select ADE XL View form appears.


4. Select the ADE XL view and click OK.


The ADE XL History Item form appears.

5. From the Select ADE XL history to load and run drop-down list, select the history
item you want to use and click OK.
The names of the selected library, cell, view and history item are displayed in the
Reference Data Source group box. The settings in the history item are used for running
the simulation.
For more information about ADE XL history items, see the Virtuoso Analog Design
Environment XL User Guide.


Tip
Rename the reference history item in ADE XL to, say,
amsDmvReferenceHistory, so that it is easy to remember the reference history
item. Also, lock the history item in ADE XL so that it does not get deleted accidently.
6. In the Compared Data Source group box, do one of the following:

Select              To

New View/Config     Specify that the compared data source will use the setup in the
                    ADE XL view specified for the reference data source for simulation
                    purposes, but will use a different design or config view for
                    simulation purposes.
                    To specify the config view for the compared data source, do the
                    following:
                    1. Right-click on the Value column in the Compared Data Source
                       group box and choose Browse Override view/config.
                       The Select override view/config form appears.
                    2. Select the config view and click OK.
                       The names of the selected library, cell and view are displayed in
                       the Compared Data Source group box.
                    Tip: You can also double-click on any row in the Value column in the
                    Compared Data Source group box to specify the library, cell and view
                    names. For more information, see Entering or Modifying Values in
                    Fields on page 32.

New Settings        Specify a different ADE XL view for the compared data source.
                    To specify the ADE XL view for the compared data source, do the
                    following:
                    1. Right-click on the Value column in the Compared Data Source
                       group box and choose Browse to existing ADE XL view.
                       The Select ADE XL View form appears.
                    2. Select the ADE XL view and click OK.
                       The ADE XL History Item form appears.
                    3. From the Select ADE XL history to load and run drop-down list,
                       select the history item you want to use and click OK.
                       The names of the selected library, cell, view and history item are
                       displayed in the Compared Data Source group box. The settings in
                       the history item are used for running the simulation.
                    Tip: Rename the compared history item in ADE XL to, say,
                    amsDmvComparedHistory, so that it is easy to remember the compared
                    history item. Also, lock the history item in ADE XL so that it does
                    not get deleted accidentally.

Tip
You can double-click on any row in the Value column in the Reference Data
Source or Compared Data Source group box to specify the library, cell, view, and
history item names. For more information, see Entering or Modifying Values in
Fields on page 32.
Tip
You can right-click on the Value column in the Reference Data Source or
Compared Data Source group box and choose Open ADE XL to open the ADE
XL view for the data source in ADE XL.
7. Run the simulation by doing one of the following:

   Choose Run – Run Simulations.

   Click the corresponding toolbar button.

For more information about running simulations, see Starting a Simulation on page 79.

8. The following message box appears after the simulation runs for the reference and
compared data sources are completed.

9. Click OK.
10. Use the Measured Results and Waveform Signals tabs to validate the differences
reported in the measured results and waveform signals for the reference and compared
data sources.

For more information about validating differences in measured results, see


Chapter 3, Validating Measured Results.

For more information about validating differences in waveform signals, see


Chapter 4, Validating Waveforms.

Setting Up and Running Simulations Using SKILL or OCEAN Commands
You can use SKILL or OCEAN commands to run simulations on the reference and compared
data sources so that the resulting waveform signals and measured results can be compared
and validated using amsDmv.
Note the following:

Running simulations using SKILL or OCEAN commands is supported only when you run
amsDmv from the Virtuoso environment (see Starting amsDmv from the Virtuoso
Environment on page 19). This is not supported when you run amsDmv from the UNIX
command line (see Starting amsDmv from the Command Line on page 21).

The SKILL or OCEAN commands can be directly specified in the Source tab, or you can
specify the path to a file containing the SKILL or OCEAN commands.

To set up and run a simulation using SKILL or OCEAN commands, do the following:

1. Click the Source tab.
2. Select the run type as SKILL.

3. In the Reference Data Source group box, do the following to specify the options for
simulating the reference data source:

a. In the SKILL command field, type the SKILL or OCEAN commands required to
simulate the reference data source.
If the SKILL or OCEAN commands exist in a file, specify the path to the file.
Note: You can use the text AMSDMVDIR in the SKILL or OCEAN commands instead
of specifying the path to the current working directory (the directory in which you
started Virtuoso or amsDmv).
b. (Optional) To validate waveforms, right-click on the Value column next to the
Waveform directory field and choose Browse to waveform directory to select
the directory in which waveforms for the reference data source exist.
c. (Optional) To validate measured results from an ADE XL simulation run, right-click
on the Value column next to the ADE XL result database field and choose
Browse to ADE XL result database to specify the location of the ADE XL result
database for the reference data source.
The Browser form appears:

Do one of the following:

Click ADE XL view to specify the ADE XL view in which the result database
exists.
The Select Reference ADE XL view form appears. Select the ADE XL view and
click OK.

Click File to specify an ADE XL result database file.
The Open ADE XL reference result database form appears. Select the ADE XL
result database file and click Open.

4. In the Compared Data Source group box, do the following to specify the options for
simulating the compared data source:
a. In the SKILL command field, type the SKILL or OCEAN commands required to
simulate the compared data source.
If the SKILL or OCEAN commands exist in a file, specify the path to the file.
Note: You can use the text AMSDMVDIR in the SKILL or OCEAN commands instead
of specifying the path to the current working directory (the directory in which you
started Virtuoso or amsDmv).
b. (Optional) To validate waveforms, right-click on the Value column next to the
Waveform directory field and choose Browse to waveform directory to select
the directory in which waveforms for the compared data source exist.
c. (Optional) To validate measured results from an ADE XL simulation run, right-click
on the Value column next to the ADE XL result database field and choose
Browse to ADE XL result database to specify the location of the ADE XL result
database for the compared data source.

The Browser form appears:

Do one of the following:

Click ADE XL view to specify the ADE XL view in which the result database
exists.
The Select Compared ADE XL view form appears. Select the ADE XL view and
click OK.

Click File to specify an ADE XL result database file.


The Open ADE XL compared result database form appears. Select the ADE XL
result database file and click Open.

5. Run the simulation by doing one of the following:

   Choose Run – Run Simulations.

   Click the corresponding toolbar button.

For more information about running simulations, see Starting a Simulation on page 79.
The following message box appears after the simulation runs are complete.

6. Click OK.
7. Use the Measured Results and Waveform Signals tabs to validate the differences
reported in the measured results and waveform signals for the reference and compared
data sources.

For more information about validating differences in measured results, see


Chapter 3, Validating Measured Results.

For more information about validating differences in waveform signals, see


Chapter 4, Validating Waveforms.

Setting Up and Running Simulations Using System Commands
You can use system commands to run simulations on the reference and compared data
sources so that the resulting waveform signals and measured results can be compared and
validated using amsDmv. For example, you can use irun simulation commands to run
simulations on the reference and compared data sources.
Note: The system commands can be directly specified in the Source tab, or you can specify
the path to a file containing the system commands.
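
For example, a System command entry for the reference data source might look roughly like
the following sketch. The file names and options are placeholders, and AMSDMVDIR is the
token described in the note above; a similar command pointing at the model files would then
be entered for the compared data source.

# Illustrative sketch only: simulate the reference data source with irun.
# The argument file and source file names are placeholders for your own design.
irun -f AMSDMVDIR/ref_irun.args \
    AMSDMVDIR/testbench.vams AMSDMVDIR/reference_design.v \
    -access +r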
To set up and run a simulation using system commands, do the following:
1. Click the Source tab.

2. Select the run type as System.

3. In the Reference Data Source group box, do the following to specify the options for
simulating the reference data source:

a. In the System command field, type the system commands required to simulate the
reference data source.
If the system commands exist in a file, specify the path to the file. Alternatively,
right-click on the Value column next to the System command field and choose Browse
to script file to select the script file.
Note: You can use the text AMSDMVDIR in the system commands instead of
specifying the path to the current working directory (the directory in which you
started Virtuoso or amsDmv).
b. (Optional) To validate waveforms, right-click on the Value column next to the
Waveform directory field and choose Browse to waveform directory to select
the directory in which waveforms for the reference data source exist.
c. (Optional) To validate measured results from an ADE XL simulation run, right-click
on the Value column next to the ADE XL result database field and choose
Browse to ADE XL result database to specify the location of the ADE XL result
database for the reference data source.
The Browser form appears.

Note: If you started amsDmv from the UNIX command line (see Starting amsDmv
from the Command Line on page 21), the Open ADE XL reference result database
form appears instead of the Browser form. Select the ADE XL result database file
and click Open.
Do one of the following:

Click ADE XL view to specify the ADE XL view in which the result database
exists.
The Select Reference ADE XL view form appears. Select the ADE XL view and
click OK.

Click File to specify an ADE XL result database file.
The Open ADE XL reference result database form appears. Select the ADE XL
result database file and click Open.

4. In the Compared Data Source group box, do the following to specify the options for
simulating the compared data source:
a. In the System command field, type the system commands required to simulate the
compared data source.
If the system commands exist in a file, specify the path to the file. Alternatively,
right-click on the Value column next to the System command field and choose Browse
to script file to select the file.
Note: You can use the text AMSDMVDIR in the system commands instead of
specifying the path to the current working directory (the directory in which you
started Virtuoso or amsDmv).
b. (Optional) To validate waveforms, right-click on the Value column next to the
Waveform directory field and choose Browse to waveform directory to select
the directory in which waveforms for the compared data source exist.
c. (Optional) To validate measured results from an ADE XL simulation run, right-click
on the Value column next to the ADE XL result database field and choose
Browse to ADE XL result database to specify the location of the ADE XL result
database for the compared data source.
The Browser form appears:

Do one of the following:

Click ADE XL view to specify the ADE XL view in which the result database
exists.
The Select Compared ADE XL view form appears. Select the ADE XL view and
click OK.

Click File to specify an ADE XL result database file.


The Open ADE XL compared result database form appears. Select the ADE XL
result database file and click Open.

5. Run the simulation by doing one of the following:

   Choose Run – Run Simulations.

   Click the corresponding toolbar button.

For more information about running simulations, see Starting a Simulation on page 79.
The following message box appears after the simulation runs are complete.

6. Click OK.
7. Use the Measured Results and Waveform Signals tabs to validate the differences
reported in the measured results and waveform signals for the reference and compared
data sources.

For more information about validating differences in measured results, see


Chapter 3, Validating Measured Results.

For more information about validating differences in waveform signals, see


Chapter 4, Validating Waveforms.

Running a Simulation to Compare with Existing Measured Results and Waveform Signal Data
If the waveform signal data or ADE XL results database exists for the reference or compared
data source, you can run a simulation to compare and validate the existing data with the
waveform signal data or ADE XL results database created from the new simulation run. This
allows you to save time because you need not resimulate both the reference and compared
data sources again for validation purposes.
For example, if the waveform signal data or ADE XL results database exists for your reference
data source, you can run simulation only on your compared data source. The waveform signal
data or ADE XL results database created for the compared data source from the new
simulation run will be automatically compared with the existing data for the reference data
source for validation purposes.

To run a simulation to compare with existing results and waveform signal data, do the
following:
1. Click the Source tab.
2. If the waveform signal data or ADE XL results database exists for the reference data
source, select the run type as None in the Reference Data Source group box.

3. In the Reference Data Source group box, do the following to specify the paths to the
existing waveform signal data and ADE XL results database for the reference data
source:
a. (Optional) Right-click on the Value column next to the Waveform directory field
and choose Browse to waveform directory.
The Reference Waveform Directory form appears. Select the waveform directory for
the reference data source and click Choose.
b. (Optional) Right-click on the Value column next to the ADE XL result database
field and choose Browse to ADE XL result database.
The Open ADE XL reference result database form appears. Select the ADE XL
result database file and click Open.
4. In the Compared Data Source group box, do one of the following to specify the options
for simulating the compared data source:

Select      To

ADE L       Simulate the compared data source using ADE L.
            For more information about simulating the compared data source using
            ADE L, see Setting Up and Running Simulations Using ADE on page 44.

ADE XL      Simulate the compared data source using ADE XL.
            For more information about simulating the compared data source using
            ADE XL, see Setting Up and Running Simulations Using ADE XL on page 52.

SKILL       Simulate the compared data source using SKILL or OCEAN commands.
            For more information about simulating the compared data source using
            SKILL or OCEAN commands, see Setting Up and Running Simulations Using
            SKILL or OCEAN Commands on page 58.

System      Simulate the compared data source using system commands.
            For more information about simulating the compared data source using
            system commands, see Setting Up and Running Simulations Using System
            Commands on page 63.

For example, in the following figure, the path to an existing ADE XL result database file
is specified for the reference data source, and an ADE XL simulation is set up for the
compared data source so that the ADE XL results database created for the compared
data source from the new simulation run will be automatically compared with the existing
data for the reference data source for validation purposes.

5. Run the simulation by doing one of the following:

   Choose Run – Run Simulations.

   Click the corresponding toolbar button.

For more information about running simulations, see Starting a Simulation on page 79.
The following message box appears after the simulation run for the compared data
source is complete.

6. Click OK.
7. Use the Measured Results and Waveform Signals tabs to validate the differences
reported in the measured results and waveform signals for the reference and compared
data sources.

For more information about validating differences in measured results, see


Chapter 3, Validating Measured Results.

For more information about validating differences in waveform signals, see


Chapter 4, Validating Waveforms.

Comparing Existing Measured Results and Waveform Signal Data for Reference and Compared Data Sources
If the waveform signal data or ADE XL results database exists for both the reference and
compared data sources, you can compare and validate the data without simulating the
reference and compared data sources.
To compare existing waveform signal data or ADE XL results, do the following:
1. Click the Source tab.

2. Select the run type as None.

3. In the Reference Data Source group box, do the following to specify the paths to the
existing waveform signal data and ADE XL results database for the reference data
source:

a. Right-click on the Value column next to the Waveform directory field and choose
Browse to waveform directory.
The Reference Waveform Directory form appears. Select the waveform directory for
the reference data source and click Choose.
b. Right-click on the Value column next to the ADE XL result database field and
choose Browse to ADE XL result database.
The Open ADE XL reference result database form appears. Select the ADE XL
result database file and click Open.
4. In the Compared Data Source group box, do the following to specify the paths to the
existing waveform signal data and ADE XL results database for the compared data
source:
a. Right-click on the Value column next to the Waveform directory field and choose
Browse to waveform directory.
The Compared Waveform Directory form appears. Select the waveform directory for
the compared data source and click Choose.
b. Right-click on the Value column next to the ADE XL result database field and
choose Browse to ADE XL result database.
The Open ADE XL compared result database form appears. Select the ADE XL
result database file and click Open.
5. By default, the measured results and waveform signals are not automatically loaded and
validated. Do one of the following to load and validate the measured results and waveform
signals:

Choose Run – Load and Validate to load and validate the measured results and
waveform signals.

Click the corresponding toolbar button to load and validate the measured results
and waveform signals.

Click the Measured Results tab to load and validate measured results.

Click the Waveform Signals tab to load and validate waveform signals.

6. Use the Measured Results and Waveform Signals tabs to validate the differences
reported in the measured results and waveform signals for the reference and compared
data sources.

For more information about validating differences in measured results, see
Chapter 3, Validating Measured Results.

For more information about validating differences in waveform signals, see
Chapter 4, Validating Waveforms.

Specifying a Different Run Type for the Compared Data Source
By default, the same run type that is selected for the reference data source is used for
simulating the compared data source. For example, if you select ADE L as the run type for
the reference data source in the Source tab, ADE L will be used for simulating both the
reference and the compared data sources.
amsDmv also allows you to specify a different run type for the compared data source. For
example, you can select ADE L as the run type for the reference data source, and System
as the run type for the compared data source.
To specify a different run type for the compared data source, do the following:
1. Select the Advanced Mode check box in the Preferences for AMS Design and Model
Validation form. For more information about the Advanced Mode check box, see
Specifying amsDmv Options on page 35.

The New Run Type option is displayed in the Compared Data Source group box on
the Source tab.

2. Select the New Run Type option.

3. In the Compared Data Source group box, select a different run type for the compared
data source by doing one of the following:
Note: Ensure that the run types for the reference and the compared data sources create
the same type of measured result and waveform data. For example, if the run type for the
reference data source is ADE L, you can select ADE L as the run type for the compared
data source but cannot select ADE XL as the run type for the compared data source.

Select      To

ADE L       Simulate the compared data source using ADE L.
            For more information about simulating the compared data source using
            ADE L, see Setting Up and Running Simulations Using ADE on page 44.

ADE XL      Simulate the compared data source using ADE XL.
            For more information about simulating the compared data source using
            ADE XL, see Setting Up and Running Simulations Using ADE XL on page 52.

SKILL       Simulate the compared data source using SKILL or OCEAN commands.
            For more information about simulating the compared data source using
            SKILL or OCEAN commands, see Setting Up and Running Simulations Using
            SKILL or OCEAN Commands on page 58.

System      Simulate the compared data source using system commands.
            For more information about simulating the compared data source using
            system commands, see Setting Up and Running Simulations Using System
            Commands on page 63.


Running System Commands Before and After Simulation Runs
You can specify the system commands that need to be run before or after the reference and
compared data sources are simulated. For example, you can specify the system commands
required for checking out data from a design management system or for ensuring that
up-to-date data is available to run the required simulations.
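
As a hypothetical illustration, a pre-run command for the reference data source could be a
single shell line such as the following (the repository path and target directory are
placeholders):

# Hypothetical pre-run command: copy the golden reference netlists from a
# design-management area into the run directory before the simulation starts.
cp -r /projects/golden/ref_netlists ./ref_netlists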
When simulations are run, amsDmv performs the following actions, in the following order:
1. Runs the pre-run commands, if any, specified for the reference data source.
2. Simulates the reference data source.
3. Runs the post-run commands, if any, specified for the reference data source.
4. Runs the pre-run commands, if any, specified for the compared data source.
5. Simulates the compared data source.
6. Runs the post-run commands, if any, specified for the compared data source.
To specify the system commands that need to be run before or after simulation runs, do the
following:
1. Choose Edit – Pre/Post Run.
The Set Pre/Post Run for Reference and Compared Data Source form appears.

Note the following:


The system commands can be directly specified in this form. If the system
commands exist in a file, click the browse button to select the file.

The environment variables set by system commands are applicable only when the
commands are run. They will not be used by other system commands. For example,
the environment variables specified in the Prerun command field for the reference
data source will not be used when the commands specified in the Postrun
command field for the reference data source are run. This allows you to specify
different values for the same environment variable in different system commands.

2. (Optional) In the Reference Data Source group box, do the following:

In the Prerun command field, specify the commands that need to be run before the
simulation of the reference data source.

In the Postrun command field, specify the commands that need to be run after the
simulation of the reference data source.

3. (Optional) In the Compared Data Source group box, do the following:

In the Prerun command field, specify the commands that need to be run before the
simulation of the compared data source.

In the Postrun command field, specify the commands that need to be run after the
simulation of the compared data source.

4. Click OK.

Starting a Simulation
To start a simulation, do one of the following:

Choose Run – Run Simulations.

Click the corresponding toolbar button.

The program does the following:

Runs simulations for the reference and compared data sources. When simulations are
run, amsDmv performs the following actions, in the following order:
a. Runs the pre-run commands, if any, specified for the reference data source.
b. Simulates the reference data source.
c. Runs the post-run commands, if any, specified for the reference data source.

d. Runs the pre-run commands, if any, specified for the compared data source.
e. Simulates the compared data source.
f. Runs the post-run commands, if any, specified for the compared data source.
For more information about specifying the pre-run and post-run commands that need to be
run before or after the reference and compared data sources are simulated, see Running
System Commands Before and After Simulation Runs on page 78.

Displays the progress and status of the simulation run in the Source tab and in the
amsDmv log.
For example, in the following figure, the progress and status of the simulation run is
displayed in the Source tab and in the amsDmv log.


For more information about the amsDmv log, see Using the amsDmv Log on page 81.

If you started amsDmv from the CIW (see Starting amsDmv from the Virtuoso
Environment on page 19), displays the progress and status of the simulation run, and
simulation errors, if any, in the CIW.
If you started amsDmv from the UNIX command line (see Starting amsDmv from the
Command Line on page 21), simulation errors, if any, are displayed in the terminal from
which amsDmv was started.

Displays a Simulations Finished message box after the simulation runs are complete.

Stopping a Simulation
To stop a simulation that is running, do the following:

Click the toolbar button for stopping the simulation.

Note: A stopped simulation can be restarted but cannot be resumed from where it was
stopped.

Using the amsDmv Log


The amsDmv log displays a log of all actions and their status. For more information about
using the amsDmv log, see the following topics:

Showing and Hiding the Log on page 82

Clearing the Log Contents on page 83

Saving the Log Contents to a File on page 83


Showing and Hiding the Log


To show the amsDmv log, do one of the following:

Choose View – Show/Hide Log.

Click the corresponding toolbar button.

The amsDmv log is displayed in the Log group box.


Note: The amsDmv log is not stored on the disk. If required, you can save the log contents
to a file as described in Saving the Log Contents to a File on page 83.


To hide the amsDmv log, do one of the following:

Choose View – Show/Hide Log.

Click the corresponding toolbar button.

Clearing the Log Contents


To clear the contents of the amsDmv log, do the following:

Choose Edit – Clear Log.

Saving the Log Contents to a File


To save the log contents to a file, do the following:
1. Do one of the following:

   Choose File – Save Log.

   Click the corresponding toolbar button.

   The File in which to save current log information form appears.


2. Specify the path and name of the file, then click Save.

Viewing the Simulation and Validation Status Summary


To view a summary report of the simulation and validation status, do the following:

Click the Status tab.

The simulation and validation status is displayed in the Report group box. The progress
bar on the top of the Status tab displays a summary of the validation status.

Tip
You can also view the summary of the simulation and validation status in the report
(.rep) file that is located in the directory in which you started Virtuoso or amsDmv.


Saving and Opening the amsDmv Setup


You can save the setup in the AMS Design and Model Validate form to either a file or a cell
view so that you can open it later.

Saving the amsDmv Setup

To save the amsDmv setup to a cellview, do the following:

1. Click File – Save View.

   The form for saving the amsDmv view appears.
2. Specify the library, cell, category and view name, then click OK.

   The setup is saved to a cellview.

Note: There are no references to views when amsDmv is launched from the UNIX command
line.

To save the amsDmv setup to a file, do the following:

1. Click File – Save File.


The File to save current setup form appears.
2. Specify the path and name of the file, then click Save.
The setup is saved in a file with the .dmv extension.

Note: The local options specified for validating results and waveform signals are saved in an
options (.opt) file that has the same name as the setup file. For more information about
specifying local options for validating results and waveform signals, see Working with Local
Options for Validating Results on page 103 and Working with Local Options for Validating
Signals on page 130.

Opening the amsDmv Setup

To open an amsDmv setup from a cellview, do the following:

1. Click File – Open View.


The Open amsDmv View form appears.
2. Specify the library, category, cell and view name, then click OK.

Note: Select the Filter check box to display only the libraries that contain the particular
view type.

To open an amsDmv setup from a file, do the following:

1. Click File – Open File.


The File to load current setup form appears.
2. Specify the path and name of the setup file, then click Open.


Saving the amsDmv Setup as an Executable Script and a SKILL File
You can save the amsDmv setup as an executable script and a SKILL file. This allows you to
run the amsDmv setup from the UNIX command line (using the script file) or from the CIW
(using the SKILL file) in the future without opening the amsDmv user interface. For example,
the script file can be used to run the amsDmv setup from the UNIX command line as part of
a scheduled regression run.
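
As a hypothetical illustration, such a script could be launched from a nightly cron job; the
schedule, run directory, and script name below are placeholders:

# Hypothetical crontab entry: run the saved amsDmv script every night at 2:00 a.m.
# from the project run directory and append its output to a log file.
0 2 * * * cd /projects/amsdmv_runs && ./myamsDmvSetup >> nightly_amsdmv.log 2>&1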
To save the amsDmv setup to a script, do the following:
1. Do one of the following:

   Choose File – Save Command File.

   Click the corresponding toolbar button.

   The Save command line filename (to execute current setup) form appears.
2. Specify the path and name of the script file, then click Save.
A message box appears indicating that the current setup is saved to an executable script
file and to a SKILL file.

3. Click OK to close the message box.


Note: The local options specified for validating results and waveform signals are saved in an
options (.opt) file that has the same name as the setup file. For more information about
specifying local options for validating results and waveform signals, see Working with Local
Options for Validating Results on page 103 and Working with Local Options for Validating
Signals on page 130.

Running an amsDmv Script File


To run an amsDmv script file, type the following command in a UNIX terminal and press
Enter:
./scriptFileName

where scriptFileName is the name of the amsDmv script file.


For more information about what happens when you run an amsDmv script file, see
Running amsDmv from the Command Line on page 89.

Running an amsDmv SKILL File


To run an amsDmv SKILL file, type the following command in the CIW and press Enter:
load("skillFileName.il")

where skillFileName.il is the name of the amsDmv SKILL file.


When you run an amsDmv SKILL file, the amsDmv user interface is opened and the
simulation and validation processes are automatically run. After the run is complete, the
amsDmv user interface and the CIW are closed. You can use the following command in the
UNIX terminal to view the return code for the SKILL file:
echo $?

The return codes are described below:

Return Code    Description

0              Indicates that the run (simulation, validation and pin checking) is
               successfully completed.

Non-zero       Indicates that simulation, validation or pin checking failed during the
               run. For more information about the run details, see the report (.rep)
               file for the run.


Running amsDmv from the Command Line


You can use the amsDmv command line options to run amsDmv from the UNIX command line.
For more information about amsDmv command line options, see Chapter 6, amsDmv
Command Reference.
For example, you can create a script file that contains the required amsDmv command line
options for validating the reference and compared data sources and automatically run the
script file as part of a scheduled regression run.
Tip
You can also use the procedure described in Saving the amsDmv Setup as an
Executable Script and a SKILL File on page 87 to automatically create an
executable script file that can be used to run amsDmv from the command line.
Note the following when you run amsDmv from the command line:

Virtuoso and amsDmv will be run in nograph mode. This allows you to run the script on
servers that don't have any displays.

The progress and status of the simulation run, and simulation errors, if any, are displayed
in the terminal.

You can view the progress of an ADE XL simulation run by viewing the latest Job*.log
file in the logs_<userName> directory that is created in the directory in which you run
amsDmv. For example, you can use the following command to view the progress of the
ADE XL simulation run:
tail -f logs_guest/Job0.log

A summary report of the validation status is displayed in the terminal after the simulation
run for the reference and compared data sources are complete.
The detailed simulation and validation status is written to a report (.rep) file that has the
same name as the amsDmv script file. For example, if you run an amsDmv script named
myamsDmvSetup, the report file will be named myamsDmvSetup.rep. You can view the
detailed validation status by opening the report file in a text editor.
For example, the following message that is displayed in the terminal after the simulation
is complete indicates that all simulations were completed successfully, all validation
checks have passed and that the detailed validation status is written to a report file
named myamsDmvSetup.rep.
amsDmv: PASS: Simulation completed successfully. 57 checks passed.
Created report file 'myamsDmvSetup.rep'


The Virtuoso log file is written to a log (.log) file that has the same name as the
amsDmv script file. For example, if you run an amsDmv script named myamsDmvSetup,
the log file will be named myamsDmvSetup.log.

After the run is complete, you can use the following command in the UNIX terminal to
view the return code for the script:
echo $?

The return codes are described below:

Return Code    Description

0              Indicates that the run (simulation, validation and pin checking) is
               successfully completed.

Non-zero       Indicates that simulation, validation or pin checking failed during the
               run. For more information about the run details, see the report (.rep)
               file for the run.
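
For example, a minimal regression wrapper might check the return code as follows; the script
and report names are placeholders for whatever you saved from the amsDmv user interface:

#!/bin/sh
# Run the saved amsDmv setup in nograph mode and fail the regression job if
# simulation, validation, or pin checking reported an error.
./myamsDmvSetup
if [ $? -ne 0 ]; then
    echo "amsDmv run failed; see myamsDmvSetup.rep for details"
    exit 1
fi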


3
Validating Measured Results
This chapter describes the following topics:

Validating Measured Results on page 92

Specifying Global Options for Validating Results on page 98

Working with Local Options for Validating Results on page 103

Viewing the Point Details for ADE XL Results on page 107

Viewing the Failing Points for ADE XL Results on page 107

Selecting and Deselecting Results on page 111

Hiding and Showing Global Results Options on page 112


Validating Measured Results


The Measured Results tab in the AMS Design and Model Validate form allows you to compare
and validate the measured results for the reference (for example, design) and compared (for
example, model) data sources specified in the Source tab.
To validate measured results, do the following:
1. On the Measured Results tab, select the Validate Measured Results check box. This tab
displays the measured results from the simulations run on the reference and compared
data sources, and the worst-case absolute and relative differences between the
measured results for the reference and compared data sources.

The fields in the Measured Results tab are described below:
Column                Description

Result                Name of the measured result created from a simulation run using:
                      - ADE (in IC 5.1.41)
                      - ADE L or ADE XL (in IC 6.1)

Test                  Name of the ADE XL test for which the result was saved during the
                      simulation run.
                      Note: The test names are displayed only for measured results from
                      ADE XL simulation runs. For more information, see Setting Up and
                      Running Simulations Using ADE XL on page 52.

Status                Indicates the validation status of the result.
                      - Not Checked indicates that no tolerance values were specified
                        for results.
                      - Pass indicates that differences between the reference and the
                        compared result are within the specified tolerance values.
                      - Fail indicates that differences between the reference and the
                        compared result are not within the specified tolerance values.
                      For more information about specifying tolerance values for results,
                      see Specifying Maximum Acceptable Tolerance Values for Results on
                      page 98.

Worst Absolute Diff   Worst-case absolute deviation between the reference and compared
                      result.

Worst Relative Diff   Worst-case relative (%) deviation between the reference and
                      compared result.

Max Ref Value         Maximum value for the reference result.
                      Note: This column is displayed only if the Statistics check box is
                      selected in the Preferences for AMS Design and Model Validation
                      form. For more information, see Specifying amsDmv Options on
                      page 35.

Min Ref Value         Minimum value for the reference result.
                      Note: This column is displayed only if the Statistics check box is
                      selected in the Preferences for AMS Design and Model Validation
                      form. For more information, see Specifying amsDmv Options on
                      page 35.

Max Comp Value        Maximum value for the compared result.
                      Note: This column is displayed only if the Statistics check box is
                      selected in the Preferences for AMS Design and Model Validation
                      form. For more information, see Specifying amsDmv Options on
                      page 35.

Min Comp Value        Minimum value for the compared result.
                      Note: This column is displayed only if the Statistics check box is
                      selected in the Preferences for AMS Design and Model Validation
                      form. For more information, see Specifying amsDmv Options on
                      page 35.

Options               Displays the global or local options specified for validating
                      results. For more information, see the following topics:
                      - Specifying Global Options for Validating Results on page 98
                      - Working with Local Options for Validating Results on page 103
                      Note: This column is displayed only if the Advanced check box is
                      selected in the Preferences for AMS Design and Model Validation
                      form. For more information, see Specifying amsDmv Options on
                      page 35.

Selected              Allows disabling multiple selected results, or setting, copying
                      and deleting local options for multiple selected results.
                      For more information, see Selecting and Deselecting Results on
                      page 111.
                      Note: This column is displayed only if the Advanced check box is
                      selected in the Preferences for AMS Design and Model Validation
                      form. For more information, see Specifying amsDmv Options on
                      page 35.

2. Use the options in the Global Results Options group box to control how results are
validated. For more information, see Specifying Global Options for Validating Results on
page 98.
For example, if you specify an acceptable tolerance value in the Relative Tolerance %
field, the Status column displays a Pass status for the results that fall within the specified
tolerance value, and a Fail status for the results that fall outside the specified tolerance
value.
Note: You can also specify local options for a specific result to override the options
specified in the Global Results Options group box for that result. For more information,
see Working with Local Options for Validating Results on page 103.
The options in the Global Results Options group box are described below:

Option                  Description

Relative Tolerance (%)  Specifies the maximum acceptable % tolerance value for the
                        worst-case relative deviation reported for results.
                        For more information, see Specifying Maximum Acceptable
                        Tolerance Values for Results on page 98.

Absolute Tolerance      Specifies the maximum acceptable tolerance value for the
                        worst-case absolute deviation reported for results.
                        For more information, see Specifying Maximum Acceptable
                        Tolerance Values for Results on page 98.

Enabled Results         Specifies regular expressions defining the results to be
                        validated. Only the results with matching names will be
                        validated and displayed.
                        For more information, see Validating Only Results With Specific
                        Names on page 100.

Disabled Results        Specifies regular expressions defining results that should not
                        be validated. The results with matching names will not be
                        validated or displayed.
                        For more information, see the following topics:
                        - Disabling Validation of Specific Results on page 101
                        - Disabling Validation of Results With Specific Names on
                          page 100

Tests                   Specifies regular expressions defining the ADE XL tests for
                        which results must be validated. Only the results from the tests
                        with matching names will be validated and displayed.
                        For more information, see Validating Only Results for Specific
                        ADE XL Tests on page 101.

Result Exclude          Specifies the range of swept or corner points for which ADE XL
                        results must be excluded from validation.
                        For more information, see Excluding Specific Swept or Corner
                        Points of ADE XL Measured Results from Validation on page 102.

Result Window           Specifies that only the ADE XL results that fall within the
                        specified range of swept or corner points should be included for
                        validation.
                        For more information, see Including Only Specific Swept or
                        Corner Points of ADE XL Measured Results for Validation on
                        page 102.

3. Correct the design or the model based on the analysis of the results that have a Fail
status.
4. Run model validation again on the corrected design and model to verify that all measured
results have a Pass status.


Specifying Global Options for Validating Results


You can use the options in the Global Results Options group box to control how results are
validated. These options apply to all the results.
Note: You can also specify local options for a specific result to override the options specified
in the Global Results Options group box for that result. For more information, see Working
with Local Options for Validating Results on page 103.
The following topics describe how you can control validation of results using the options in the
Global Results Options group box:

Specifying Maximum Acceptable Tolerance Values for Results on page 98

Validating Only Results With Specific Names on page 100

Disabling Validation of Results With Specific Names on page 100

Disabling Validation of Specific Results on page 101

Validating Only Results for Specific ADE XL Tests on page 101

Excluding Specific Swept or Corner Points of ADE XL Measured Results from Validation
on page 102

Including Only Specific Swept or Corner Points of ADE XL Measured Results for
Validation on page 102

Specifying Maximum Acceptable Tolerance Values for Results


You can specify the maximum acceptable tolerance values, that is, the maximum acceptable
differences between the reference and compared results.
To specify the maximum acceptable tolerance values, do either or both of the following:

In the Relative Tolerance (%) field, specify the % maximum acceptable tolerance value
for the worst-case relative deviation reported for results.

In the Absolute Tolerance field, specify the maximum acceptable tolerance value for
the worst-case absolute deviation reported for results.

Note: If a reference result value is zero or very close to zero, the relative tolerance will not be
applied unless the compared result is identical to the reference result. In such cases, you
must specify some small absolute tolerance value, say, 1p, to ensure that the results are
compared.

The Status column displays a Pass status for the results that fall within the specified
tolerance value, and a Fail status for the results that fall outside the specified tolerance value.
This allows you to focus on validating only the results with the Fail status.
For example, in the following figure, results that fall within the specified relative tolerance
value of 0.1% have a Pass status. Results that fall outside the relative tolerance value have
a Fail status.


Validating Only Results With Specific Names


You can validate only results with specific names in the Measured Results tab. This allows
you to focus on only the results that need to be validated.
To validate only results with specific names, do the following:

Specify the names of the results in the Enabled Results field.


Use regular expressions to specify the names of the results to be validated. For example,
specify:

(a|b|c)* to validate only the results whose names start with the letters a, b and c.

amp*out to validate only the results whose names start with the letters amp and end
with the letters out.

in[^d]* to validate only the results whose names start with the letters in but not
the results starting with letters ind.

Note: Only the results with matching names will be validated and displayed in the Measured
Results tab.
See also:

Specifying Regular Expressions on page 32.

Disabling Validation of Results With Specific Names


You can disable the validation of results with specific names in the Measured Results tab. This
allows you to ignore the results that need not be validated.
To disable validation of results with specific names, do the following:

Specify the names of the results in the Disabled Results field.


Use regular expressions to specify the names of the results that should not be validated.
For example, specify:

(a|b|c)* to disable the validation of results whose names start with the letters a,
b and c.

amp*out to disable the validation of results whose names start with the letters amp
and end with the letters out.

in[^d]* to disable the validation of results whose names start with the letters in
but not the results starting with letters ind.

Note: The results with matching names will not be validated or displayed in the Measured
Results tab.
See also:

Specifying Regular Expressions on page 32.

Disabling Validation of Specific Results


You can disable the validation of specific results. The specified results are not validated or
displayed in the Measured Results tab.
To disable validation of a result, do the following:

Right-click the result and choose Disable – Disable this result.

To disable validation of multiple results, do the following:


1. Select the results for which you want to disable validation. For more information about
selecting results, see Selecting Results on page 111.
2. Right-click on any of the selected results and choose Disable – Disable selected
results.
A regular expression containing the names of the selected results is automatically added in
the Disabled Results field. For example, if you disable validation of two results named
up_ref_tran:slope and dn_clk_tran:slope, the following regular expression is
automatically added in the Disabled Results field:
(up_ref_tran:slope|dn_clk_tran:slope)

For more information about the Disabled Results field, see Disabling Validation of Results
With Specific Names on page 100.
Tip
To enable validation of a result that is disabled, delete the name of the result from
the regular expression that is displayed in the Disabled Results field. The enabled
result is then validated and displayed in the Measured Results tab.

Validating Only Results for Specific ADE XL Tests


For ADE XL simulation runs, you can specify that only the results for specific ADE XL tests
must be validated. This allows you to focus on only the results for the tests that need to be
validated.
To validate only the results for specific ADE XL tests, do the following:

Specify the names of the tests in the Tests field.


Use regular expressions to specify the names of the tests. For example, specify:

timing to validate only the results for the test named timing.

(t1|t2) to validate only the results for the tests named t1 and t2.

*power* to validate only the results for the tests whose names contain the text
power.

Note: Only the results from the tests with matching names will be validated and displayed in
the Measured Results tab.
See also:

Specifying Regular Expressions on page 32.

Excluding Specific Swept or Corner Points of ADE XL Measured Results from Validation
You can specify the range of swept or corner points for which measured results for ADE XL
simulation runs must be excluded from validation.
To specify the range of swept or corner points for which ADE XL results must be excluded
from validation, do the following:

Specify the range of swept or corner points in the Result Exclude field.
Use the following syntax to specify a comma separated list of the range of points:
from:to[,from:to]

For example, specify 4:9,12:18 to exclude the results for the points that fall within the ranges
4 to 9 and 12 to 18. The excluded points are not validated, so the Pass or Fail status shown
in the Status column is determined only by the points outside the specified ranges.

Including Only Specific Swept or Corner Points of ADE XL Measured Results for Validation
You can specify that only the ADE XL results that fall within a specified range of swept or
corner points must be included for validation.

To specify the range of swept or corner points for which ADE XL results must be included for
validation, do the following:

Specify the range of swept or corner points in the Result Window field.
Use the following syntax to specify a comma separated list of the range of points:
from:to[,from:to]

For example, specify 1:43,55:126 to include only the results for the points that fall within the
ranges 1 to 43 and 55 to 126. Only the points inside the specified ranges are validated, so the
Pass or Fail status shown in the Status column is determined by those points alone; points
outside the ranges are ignored.
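
Both the Result Exclude and Result Window fields use the same from:to[,from:to] syntax. The following sketch shows one way such a specification can be interpreted (illustrative Python only, not amsDmv code):

def parse_point_ranges(spec):
    # Parse a comma-separated list of from:to ranges, for example "4:9,12:18".
    ranges = []
    for part in spec.split(","):
        lo, hi = part.split(":")
        ranges.append((int(lo), int(hi)))
    return ranges

def point_is_validated(index, spec, mode):
    # mode "exclude": the point is validated only if it lies outside every range.
    # mode "window":  the point is validated only if it lies inside some range.
    inside = any(lo <= index <= hi for lo, hi in parse_point_ranges(spec))
    return not inside if mode == "exclude" else inside

For example, point_is_validated(7, "4:9,12:18", "exclude") returns False, so point 7 would be skipped.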

Working with Local Options for Validating Results


By default, the options specified in the Global Results Options group box are applied to all
the results displayed in the Measured Results tab. You can override the global options for
specific results by specifying local options for them.
For more information about working with local options for results, see the following topics:

Specifying Local Options for Validating Results on page 103

Copying Local Options For a Result to Other Results on page 105

Deleting Local Options Specified for Results on page 106

Specifying Local Options for Validating Results


To specify local options for validating results, do the following:
1. Ensure that the Advanced Mode check box in the Preferences for AMS Design and
Model Validation form is selected. For more information about the Advanced Mode
check box, see Specifying amsDmv Options on page 35.
2. Right-click on a result and choose Result Options – Set options for this result.
Tip
To specify local options for more than one result at a time, select the results, then
choose Result Options – Set options for selected results. For more information
about selecting results, see Selecting Results on page 111.

The Result Options form appears.

Note: By default, this form displays the values specified in the Global Results Options
group box or the local options that had been previously saved for this result. You can
modify the values as required.
3. (Optional) Specify the maximum acceptable tolerance values for the result in the
Relative Tolerance (%) and Absolute Tolerance fields.
For more information about these fields, see Specifying Maximum Acceptable Tolerance
Values for Results on page 98.
4. (Optional) To exclude a specific range of ADE XL swept or corner points in the result from
validation, specify the range of points in the Result Exclude field.
For more information about the Result Exclude field, see Excluding Specific Swept or
Corner Points of ADE XL Measured Results from Validation on page 102
5. (Optional) To include only a specific range of ADE XL swept or corner points in the result
for validation, specify the range of points in the Result Window field.
For more information about the Result Window field, see Including Only Specific Swept
or Corner Points of ADE XL Measured Results for Validation on page 102
6. Click OK.
The Options column in the Measured Results tab indicates that local options are
specified for the result. For example, the text Local: reltol=2, abstol=0.5 in the

Options column in the following figure indicates that local options are specified for the
result.

Copying Local Options For a Result to Other Results


You can copy the local options specified for a result to other selected results.

To copy the local options specified for a result to other results, do the following:
1. Select the results to which you want to copy the local options. For more information about
selecting results, see Selecting Results on page 111.
2. Right-click the result whose local options you want to copy and choose Result Options –
Copy this result's options to other selected results.

Deleting Local Options Specified for Results


When you delete the local options specified for a result, the global options specified in the
Global Results Options group box are applied to the result.
To delete the local options for a specific result, do the following:

Right-click the result and choose Result Options – Revert options for this result to
use global values.

To delete the local options for multiple results:


1. Select the results whose local options you want to delete. For more information about
selecting results, see Selecting Results on page 111.
2. Right-click on any of the selected results and choose Result Options – Revert
selected results' options to use global values.


Viewing the Point Details for ADE XL Results


To view the measured values at each of the swept points for an ADE XL result, do the
following:

Right-click the result and choose Show point details for this result.
The <resultName> measured result form appears displaying the point details for the
result.

Note: The absolute or relative difference of a point that falls within the specified absolute or
relative tolerance value is displayed in green color indicating that the point is passing for the
result. The absolute or relative difference of a point that falls outside the specified absolute or
relative tolerance value is displayed in red color indicating that the point is failing for the result.

Viewing the Failing Points for ADE XL Results


To view the list of points that are failing for an ADE XL result, do the following:

Right-click the result that has the Fail status and choose Show failing points for this
result.

The <resultName> failing points form appears displaying the start and end range of
points that are failing for the reference and compared results that exceeded the specified
absolute and relative tolerances.
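
The start and end ranges shown in the form correspond to runs of consecutive failing points, which can be pictured with a small sketch (illustrative only, not amsDmv code):

def failing_ranges(failing_points):
    # Group consecutive failing point indexes into (start, end) ranges,
    # for example [2, 3, 7, 8, 9] -> [(2, 3), (7, 9)].
    ranges = []
    for p in sorted(failing_points):
        if ranges and p == ranges[-1][1] + 1:
            ranges[-1] = (ranges[-1][0], p)
        else:
            ranges.append((p, p))
    return ranges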

Excluding Failing Points from Validation


The Exclude column in the <resultName> failing points form allows you to exclude
specific failing points of a result from validation.
Note: The Exclude column is displayed in the <resultName> failing points form if the Advanced
Mode check box in the Preferences for AMS Design and Model Validation form is selected.
For more information about the Advanced Mode check box, see Specifying amsDmv
Options on page 35.
To exclude failing points from validation, do the following:
1. Select the Exclude check box next to the points.
2. Click the Exclude button.

The Options column in the Measured Results tab indicates that the points are excluded from
validation. For example, the text Local: reltol=5, exclude=2:3 in the Options column
in the following figure indicates that the failing point range 2 to 3 is excluded from validation.

After excluding a failing point for a result from validation, you can include it for validation by
doing the following:

1. Right-click the result in the Measured Results tab and choose Result Options – Set
options for this result.
The Result <resultName> Options form appears.

2. In the Result Exclude field, delete the range for the point you want to include for
validation.
The point is displayed as a failing point for validation. For information about viewing the
failing points for a result, see Viewing the Failing Points for ADE XL Results on page 107.


Selecting and Deselecting Results


If you want to perform operations such as disabling multiple results (see Disabling Validation
of Specific Results on page 101), or setting, copying and deleting local options for multiple
results (see Working with Local Options for Validating Results on page 103), you can select
the results before performing these operations.

Selecting Results
Note: Before you select results, ensure that the Advanced Mode check box in the
Preferences for AMS Design and Model Validation form is selected. For more information
about the Advanced Mode check box, see Specifying amsDmv Options on page 35.
To select a specific result, do the following:

Select the check box next to the result in the Selected column.

To select all results, do the following:

Right-click and choose Result Selection – Select all results.


The check boxes next to all the results are selected in the Selected column.

Deselecting Results
To deselect a specific result, do the following:

Deselect the check box next to the result in the Selected column.

To deselect all results, do the following:

Right-click and choose Result Selection – Deselect all results.


The check boxes next to all the results are deselected in the Selected column.

Inverting Result Selections


When you invert result selections, the results that are currently selected are deselected and
the results that are currently deselected are selected.
To invert result selections, do the following:

Right-click and choose Result Selection – Invert result selections.


Hiding and Showing Global Results Options


To hide the Global Results Options group box, do one of the following:

Click the Show/Hide Options toolbar button.

Choose View – Show/Hide Options.

To show the Global Results Options group box, do one of the following:

Click the Show/Hide Options toolbar button.

Choose View – Show/Hide Options.


4
Validating Waveforms
This chapter describes the following topics:

Validating Waveforms on page 114

Specifying Global Options for Validating Signals on page 121

Working with Local Options for Validating Signals on page 130

Viewing the Failing Areas for a Signal on page 136

Viewing the Point Details for a Signal on page 139

Viewing the Failing Areas at a Point on page 140

Plotting Signals on page 143

Selecting and Deselecting Signals on page 145

Hiding and Showing Global Signal Options on page 146

Disabling Automatic Validation of Waveforms on page 147


Validating Waveforms
The Waveform Signals tab in the AMS Design and Model Validate form allows you to compare
and validate the analog and digital waveform signals for the reference (for example, design)
and compared (for example, model) data sources specified in the Source tab.amsDmv
supports validation of only waveform data that is in the SignalScan Turbo 2 (SST2) format.
To validate waveform signals, do the following:
1. On the Waveform Signals tab, select the Validate Waveform Signals check box.

The Analog group box in the Waveform Signals tab displays the following information for
analog signals:
Note: The Analog group box is displayed only if analog signals are loaded from the
waveforms.

Signal – Name of the analog signal.

Test – Name of the ADE XL test for which the analog signal was saved during the simulation run.
Note: The test names are displayed only for measured results from ADE XL simulation runs. For more information, see Setting Up and Running Simulations Using ADE XL on page 52.

Type – Type of the analog signal. V indicates a voltage signal. A indicates a current signal.

Status – Indicates the validation status of the signal.
Not Checked indicates that no tolerance values were specified for analog signals.
Pass indicates that differences between the reference and the compared signal are within the tolerances specified for analog signals.
Fail indicates that differences between the reference and the compared signal are not within the tolerances specified for analog signals.
For more information about specifying tolerance values for analog signals, see Specifying Maximum Acceptable Tolerance Values for Analog Signals on page 121.

Worst Abs Diff – Worst-case absolute deviation between the reference and compared signal.

Worst Rel Diff – Worst-case relative (%) deviation between the reference and compared signal.

Max Ref Value – Maximum value for the reference signal.
Note: This column is displayed only if the Statistics check box is selected in the Preferences for AMS Design and Model Validation form. For more information, see Specifying amsDmv Options on page 35.

Min Ref Value – Minimum value for the reference signal.
Note: This column is displayed only if the Statistics check box is selected in the Preferences for AMS Design and Model Validation form. For more information, see Specifying amsDmv Options on page 35.

Max Comp Value – Maximum value for the compared signal.
Note: This column is displayed only if the Statistics check box is selected in the Preferences for AMS Design and Model Validation form. For more information, see Specifying amsDmv Options on page 35.

Min Comp Value – Minimum value for the compared signal.
Note: This column is displayed only if the Statistics check box is selected in the Preferences for AMS Design and Model Validation form. For more information, see Specifying amsDmv Options on page 35.

Options – Displays the global or local options specified for validating signals. For more information, see Specifying Global Options for Validating Signals on page 121 and Working with Local Options for Validating Signals on page 130.
Note: This column is displayed only if the Advanced check box is selected in the Preferences for AMS Design and Model Validation form. For more information, see Specifying amsDmv Options on page 35.

Selected – Allows plotting, disabling, or setting, copying and deleting local options for multiple selected signals. For more information, see Selecting and Deselecting Signals on page 145.
Note: This column is displayed only if the Advanced check box is selected in the Preferences for AMS Design and Model Validation form. For more information, see Specifying amsDmv Options on page 35.

The Logic group box in the Waveform Signals tab displays the following information for
digital signals:
Note: The Logic group box is displayed only if digital signals are loaded from the
waveforms.

Signal – Name of the digital signal.

Test – Name of the ADE XL test for which the digital signal was saved during the simulation run.
Note: The test names are displayed only for measured results from ADE XL simulation runs. For more information, see Setting Up and Running Simulations Using ADE XL on page 52.

Type – Type of digital signal. Logic indicates a scalar signal. Logic Bus indicates a vectored signal.

Status – Validation status of the digital signal.
Pass indicates that validation passed because the reference and compared signal are functionally the same.
Fail indicates that validation failed because there are functional differences between the reference and compared signal.

Function – Indicates whether there are functional differences between the reference and compared digital signal.
Same indicates that the reference and compared signal are functionally the same.
Different indicates that there are functional differences between the reference and compared signal.

Timing Difference – Displays the maximum timing difference between the reference and compared digital signal.
Note the following:
The timing difference is displayed in green color (indicating a pass status) if it is less than the specified logic time tolerance value.
The timing difference is displayed in red color (indicating a fail status) if it is greater than the specified logic time tolerance value.
The status Not Checked indicates that the logic time tolerance value was not specified or that there are functional differences between the reference and compared signal.
For more information about specifying the logic time tolerance value, see Specifying Logic Time Tolerance Values for Digital Signals on page 123.
2. Use the options in the Global Signal Options group box to control how signals are
validated. For more information, see Specifying Global Options for Validating Signals on
page 121.
For example, if you specify an acceptable relative tolerance value in the Analog
Relative Tolerance % field, the Status column displays a Pass status for the analog
signals that fall within the specified tolerance value, and a Fail status for the analog
signals that fall outside the specified tolerance value.
Note: You can also specify local options for a specific signal to override the options
specified in the Global Signal Options group box for that signal. For more information,
see Working with Local Options for Validating Signals on page 130.
The options in the Global Signal Options group box are described below:

Analog Options

Analog Relative Tolerance (%) – Specifies the maximum acceptable % tolerance value for the worst-case relative deviation reported for analog signals. For more information, see Specifying Maximum Acceptable Tolerance Values for Analog Signals on page 121.

Analog Absolute Tolerance – Specifies the maximum acceptable tolerance value for the worst-case absolute deviation reported for analog signals. For more information, see Specifying Maximum Acceptable Tolerance Values for Analog Signals on page 121.

Logic Options

Logic Time Tolerance – Specifies the maximum acceptable tolerance value for the time difference (skew) between transition times reported for digital signals. The default unit of time for this field is seconds. For more information, see Specifying Logic Time Tolerance Values for Digital Signals on page 123.

Logic Glitch Filter – Filters glitches that are equal to or smaller in width than the specified time value from the reference and compared digital signals before comparing them. The default unit of time for this field is seconds. For more information, see Filtering Glitches in Digital Signals on page 124.

Common Options

Enabled Signals – Specifies regular expressions defining the signals to be validated. Only the signals with matching names will be validated and displayed. For more information, see Validating Only Signals With Specific Names on page 126.

Disabled Signals – Specifies regular expressions defining signals that should not be validated. The signals with matching names will not be validated or displayed. For more information, see Disabling Validation of Signals With Specific Names on page 127 and Disabling Validation of Specific Signals on page 128.

Signal Dataset – Specifies regular expressions defining the analyses for which signals must be validated. Only the signals from the analyses with matching names will be validated. For more information, see Validating Only Waveform Signals for Specific Analyses on page 128.

Tests – Specifies regular expressions defining the ADE XL tests for which signals must be validated. Only the signals from the tests with matching names will be validated and displayed. For more information, see Validating Only Signals for Specific ADE XL Tests on page 129.

Signal Time Exclude – Specifies the time ranges that should be excluded when validating each signal. Only the time ranges that fall outside the specified time ranges will be validated for each signal. For more information, see Excluding Specific Time Ranges of Waveform Signals from Validation on page 129.

Signal Time Window – Specifies the time ranges that should be validated for each signal. The time ranges that fall outside the specified time ranges will not be validated. For more information, see Validating Only Specific Time Ranges of Waveform Signals on page 130.

3. Correct the design or the model based on the analysis of the signals that have a Fail
status.
4. Run model validation again on the corrected design and model to verify that all waveform
signals have a Pass status.


Specifying Global Options for Validating Signals


You can use the options in the Global Signal Options group box to control how signals are
validated. These options apply to all the signals.
Note: You can also specify local options for a specific signal to override the options specified
in the Global Signal Options group box for that signal. For more information, see Working
with Local Options for Validating Signals on page 130.
The following topics describe how you can control validation of waveform signals using the
options in the Global Signal Options group box:

Specifying Maximum Acceptable Tolerance Values for Analog Signals on page 121

Specifying Logic Time Tolerance Values for Digital Signals on page 123

Filtering Glitches in Digital Signals on page 124

Validating Only Signals With Specific Names on page 126

Disabling Validation of Signals With Specific Names on page 127

Disabling Validation of Specific Signals on page 128

Validating Only Waveform Signals for Specific Analyses on page 128

Validating Only Signals for Specific ADE XL Tests on page 129

Excluding Specific Time Ranges of Waveform Signals from Validation on page 129

Validating Only Specific Time Ranges of Waveform Signals on page 130

Specifying Maximum Acceptable Tolerance Values for Analog Signals


You can specify the maximum acceptable tolerance values, that is, the maximum acceptable
differences between the reference and compared analog signals. The compared signal is
resampled and interpolated at the reference waveform timepoints, and the values at these
timepoints are compared using the specified tolerances. The tolerances define a window around
the reference waveform within which the compared waveform values should sit; if they do not,
they fail.
To specify the maximum acceptable tolerance values, do either or both of the following:

In the Analog Relative Tolerance (%) field, specify the maximum acceptable tolerance
value, as a percentage, for the worst-case relative deviation reported for analog signals.


In the Analog Absolute Tolerance field, specify the maximum acceptable tolerance
value for the worst-case absolute deviation reported for analog signals.

Note: If a reference signal value is zero or very close to zero, the relative tolerance will not
be applied unless the compared signal is identical to the reference signal. In such cases, you
must specify some small absolute tolerance value, say, 1p, to ensure that the signals are
compared.
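
For illustration, the resample-and-compare scheme described above can be sketched as follows. Combining the two tolerances into a single window of abstol plus reltol percent of the reference magnitude is an assumption made for this sketch, not a statement of amsDmv's exact rule.

import numpy as np

def compare_analog(t_ref, v_ref, t_cmp, v_cmp, reltol_pct=0.0, abstol=0.0):
    # Interpolate the compared waveform at the reference timepoints and check
    # each sample against a tolerance window around the reference value.
    t_ref, v_ref = np.asarray(t_ref), np.asarray(v_ref)
    v_cmp_at_ref = np.interp(t_ref, np.asarray(t_cmp), np.asarray(v_cmp))
    window = abstol + (reltol_pct / 100.0) * np.abs(v_ref)
    failing_times = t_ref[np.abs(v_cmp_at_ref - v_ref) > window]
    return failing_times.size == 0, failing_times  # (passes, failing timepoints)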
The Status column displays a Pass status for the analog signals that fall within the specified
tolerance value, and a Fail status for the analog signals that fall outside the specified
tolerance value. This allows you to validate only the analog signals with the Fail status.
For example, in the following figure, analog signals that fall within the specified relative
tolerance value of 5% have a Pass status. Analog signals that fall outside the relative
tolerance value have a Fail status.


Specifying Logic Time Tolerance Values for Digital Signals


You can specify the maximum acceptable time difference (skew) between transition times on
the reference and compared digital signals in the Logic Time Tolerance field.
Note the following:

The default unit of time for the Logic Time Tolerance field is seconds.

The logic time tolerance value is applied only if there are no functional differences
between a reference and compared signal.

For example, in the following figure, the Timing Difference column indicates a pass status
(indicated by the green color text) for the top.clk signal because the maximum time
difference of 900p at a simulation time of 45n is less than the specified 1n logic time
tolerance value. However, the top.invOut signal has a timing failure (indicated by the red

color text) due to a maximum time difference of 1.2n at a simulation time of 45.2n that is
greater than the specified 1n logic time tolerance value.

Note: The Timing Difference column in the above figure displays the Not Checked status
for the top.d[13:0] and top.pwlOut signals because the reference and compared
signals are functionally different (indicated by the text Different in the Function column).
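
Conceptually, the timing check can be pictured as taking the worst-case skew between corresponding transitions of two functionally identical signals, as in this rough sketch (the one-to-one pairing of edges is my simplification, not a description of amsDmv's implementation):

def max_transition_skew(ref_edges, cmp_edges):
    # Worst-case time difference between corresponding transitions of two
    # functionally identical digital signals (edge lists sorted in time and
    # assumed to have the same length).
    return max(abs(r - c) for r, c in zip(ref_edges, cmp_edges))

# The timing check passes when max_transition_skew(...) does not exceed the
# specified logic time tolerance, for example 900p against a 1n tolerance.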

Filtering Glitches in Digital Signals


If there are small acceptable glitches in digital signals because of spurious noise or behavior,
you can specify a time value in the Logic Glitch Filter field to filter glitches that are equal to
or smaller in width than the specified time value. The glitches are ignored when reference and
compared digital signals are compared.
Note: The default unit of time for the Logic Glitch Filter field is seconds.

For example, in the following figure, the reference and compared top.pwlOut signals differ by
an additional glitch starting at 33ns and ending at 35ns (a width of 2ns) that appears on one of
the signals and not on the other.

To filter this glitch, specify 2n in the Logic Glitch Filter field, so that the additional glitch in
the top.pwlOut signal is ignored, and a pass status is displayed for the signal as shown in
the following figure.
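
One way to picture the glitch filter is as removing every pulse whose width is at or below the specified value before the two signals are compared. A rough sketch of that idea (assumed behavior for illustration, not amsDmv code):

def filter_glitches(edges, glitch_time):
    # edges: sorted transition times of a two-level digital signal.
    # Drop both edges of any pulse that is glitch_time wide or narrower, so
    # the short pulse disappears before the signals are compared.
    kept = []
    i = 0
    while i < len(edges):
        if i + 1 < len(edges) and edges[i + 1] - edges[i] <= glitch_time:
            i += 2  # skip both edges of the short pulse
        else:
            kept.append(edges[i])
            i += 1
    return kept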

Validating Only Signals With Specific Names


You can validate only signals with specific names in the Waveform Signals tab. This allows
you to focus on only the signals that need to be validated.
To validate only signals with specific names, do the following:

Specify the names of the signals in the Enabled Signals field.


Use regular expressions to define the names of the signals to be validated. For example,
specify:


(a|b|c)* to validate only the signals whose names start with the letters a, b and c.

amp*out to validate only the signals whose names start with the letters amp and
end with the letters out.

in[^d]* to validate only the signals whose names start with the letters in but not
the signals starting with letters ind.

Note: Only the signals with matching names will be validated and displayed in the Waveform
Signals tab.
See also:

Specifying Regular Expressions on page 32.

Disabling Validation of Signals With Specific Names


You can disable the validation of signals with specific names in the Waveform Signals tab.
This allows you to ignore the signals that need not be validated.
To disable validation of signals with specific names, do the following:

Specify the names of the signals in the Disabled Signals field.


Use regular expressions to specify the names of the signals that should not be validated.
For example, specify:

(a|b|c)* to disable the validation of signals whose names start with the letters a,
b and c.

amp*out to disable the validation of signals whose names start with the letters amp
and end with the letters out.

in[^d]* to disable the validation of signals whose names start with the letters in
but not the signals starting with letters ind.

Note: The signals with matching names will not be validated or displayed in the Waveform
Signals tab.
See also:

Specifying Regular Expressions on page 32.


Disabling Validation of Specific Signals


You can disable the validation of specific analog or digital signals. The specified signals are
not validated or displayed in the Waveform Signals tab.
To disable validation of a signal, do the following:

Right-click the signal and choose Disable – Disable this signal.

To disable validation of multiple signals, do the following:


1. Select the signals for which you want to disable validation. For more information about
selecting signals, see Selecting Signals on page 146.
2. Right-click any of the selected signals and choose Disable – Disable selected signals.
A regular expression containing the names of the selected signals is automatically added in
the Disabled Signals field. For example, if you disable validation of two signals named
top.rin and top.supply, the following regular expression is automatically added in the
Disabled Signals field:
(top\.rin|top\.supply)

For more information about the Disabled Signals field, see Disabling Validation of Signals
With Specific Names on page 127.
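
For illustration, such an alternation of escaped signal names can be reproduced in a few lines of Python (a sketch of the idea only, not amsDmv code):

import re

names = ["top.rin", "top.supply"]
pattern = "(" + "|".join(re.escape(n) for n in names) + ")"
# pattern is now "(top\.rin|top\.supply)", matching the expression shown above.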
Tip
To enable validation of a signal that is disabled, delete the name of the signal from
the Disabled Signals field. The enabled signal is then validated and displayed in
the Waveform Signals tab.

Validating Only Waveform Signals for Specific Analyses


You can specify that only the waveform signals for specific analyses must be validated.
To specify the analyses for which waveform signals must be validated, do the following:

Specify the names of the analyses in the Signal Dataset field.


Use regular expressions to specify the names of the analyses. For example, specify
(tran|dc) to validate only the waveform signals from transient and DC analysis runs.
Only the signals from the analyses with matching names are validated and displayed;
their Pass or Fail status is then determined by the specified tolerances.


Validating Only Signals for Specific ADE XL Tests


For ADE XL simulation runs, you can specify that only the signals for specific ADE XL tests
must be validated. This allows you to focus on only the signals for the tests that need to be
validated.
To validate only the signals for specific ADE XL tests, do the following:

Specify the names of the tests in the Tests field.


Use regular expressions to specify the names of the tests. For example, specify:

timing to validate only the signals for the test named timing.

(t1|t2) to validate only the signals for the tests named t1 and t2.

*power* to validate only the signals for the tests whose names contain the text
power.

Note: Only the signals from the tests with matching names will be validated and displayed in
the Waveform Signals tab.
See also:

Specifying Regular Expressions on page 32.

Excluding Specific Time Ranges of Waveform Signals from Validation


You can specify the time ranges that should be excluded when validating each waveform
signal. Only the time ranges that fall outside the specified time ranges will be validated for
each signal.
To specify the time ranges that should be excluded when validating each signal, do the
following:

Specify the time ranges in the Signal Time Exclude field.


Use the following syntax to specify a comma separated list of time ranges:
from:to[,from:to]

For example, specify 70n:72n,110n:112n to exclude the time range starting from 70n to
72n and 110n to 112n when validating each signal.


Validating Only Specific Time Ranges of Waveform Signals


You can specify the time ranges that should be validated for each waveform signal. The time
ranges that fall outside the specified time ranges will not be validated.
To specify the time ranges that should be validated for each signal, do the following:

Specify the time ranges in the Signal Time Window field.


Use the following syntax to specify a comma separated list of time ranges:
from:to[,from:to]

For example, specify 70n:72n,110n:112n to validate only the time range starting from 70n
to 72n and 110n to 112n for each signal.
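
The Signal Time Exclude and Signal Time Window fields use the same comma-separated from:to syntax, with the time values carrying SI-style suffixes such as 70n. A small parsing sketch (illustrative only; the suffix table shown is an assumption, not a complete list of what amsDmv accepts):

_SI_SCALE = {"f": 1e-15, "p": 1e-12, "n": 1e-9, "u": 1e-6, "m": 1e-3}

def parse_time(token):
    # Convert a value with an optional SI suffix, for example "70n" -> 7e-08.
    if token and token[-1].isalpha():
        return float(token[:-1]) * _SI_SCALE[token[-1]]
    return float(token)

def parse_time_ranges(spec):
    # "70n:72n,110n:112n" -> [(7e-08, 7.2e-08), (1.1e-07, 1.12e-07)]
    return [tuple(parse_time(t) for t in part.split(":"))
            for part in spec.split(",")]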

Working with Local Options for Validating Signals


By default, the options specified in the Global Signal Options group box are applied to all
the signals displayed in the Waveform Signals tab. You can override the global options for
specific signals by specifying local options for each signal.
For more information about working with local options for validating signals, see the following
topics:

Specifying Local Options for Validating Analog Signals on page 130

Specifying Local Options for Validating Digital Signals on page 132

Copying Local Options For a Signal to Other Signals on page 135

Deleting Local Options Specified for Signals on page 136

Specifying Local Options for Validating Analog Signals


To specify local options for validating analog signals, do the following:
1. Ensure that the Advanced Mode check box in the Preferences for AMS Design and
Model Validation form is selected. For more information about the Advanced Mode
check box, see Specifying amsDmv Options on page 35.
2. Right-click the signal in the Analog group box and choose Signal Options – Set
options for this signal.


Tip
To specify local options for more than one signal at a time, select the signals, then
right-click on any of the selected signals and choose Signal Options – Set options
for selected signals. For more information about selecting signals, see Selecting
Signals on page 146.
The Signal Options form appears.

Note: By default, this form displays the values specified in the Global Signal Options
group box in the Waveform Signals tab or the local options that had been previously
saved for this signal. You can modify the values as required.
3. (Optional) Specify the maximum acceptable tolerance values for the analog signal in the
Analog Relative Tolerance (%) and Analog Absolute Tolerance fields.
For more information about these fields, see Specifying Maximum Acceptable Tolerance
Values for Analog Signals on page 121.
4. (Optional) In the Signal Time Exclude field, specify the time ranges that should be
excluded when validating the signal. Only the time ranges that fall outside the specified
time ranges will be validated for the signal.
For more information about the Signal Time Exclude field, see Excluding Specific Time
Ranges of Waveform Signals from Validation on page 129.
5. (Optional) In the Signal Time Window field, specify the time ranges that should be
validated for the signal. The time ranges that fall outside the specified time ranges will not
be validated for the signal.
For more information about the Signal Time Window field, see Validating Only Specific
Time Ranges of Waveform Signals on page 130.
6. Click OK.
The Options column in the Analog group box in the Waveform Signals tab indicates that
local options are specified for the analog signal. For example, the text Local:
reltol=5, abstol=500n in the Options column in the following figure indicates that
local options are specified for the top.aout analog signal.

Specifying Local Options for Validating Digital Signals


To specify local options for validating digital signals, do the following:

1. Ensure that the Advanced Mode check box in the Preferences for AMS Design and
Model Validation form is selected. For more information about the Advanced Mode
check box, see Specifying amsDmv Options on page 35.
2. Right-click the signal in the Logic group box and choose Signal Options – Set options
for this signal.
Tip
To specify local options for more than one signal at a time, select the signals, then
right-click on any of the selected signals and choose Signal Options – Set options
for selected signals. For more information about selecting signals, see Selecting
Signals on page 146.
The Signal Options form appears.

Note: By default, this form displays the values specified in the Global Signal Options
group box in the Waveform Signals tab or the local options that had been previously
saved for this signal. You can modify the values as required.
3. (Optional) Specify the maximum acceptable time difference (skew) between transition
times on the reference and compared digital signal in the Logic Time Tolerance field.
For more information about the Logic Time Tolerance field, see Specifying Logic Time
Tolerance Values for Digital Signals on page 123.
4. (Optional) Specify a time value in the Logic Glitch Filter field to filter any glitch that is
equal to or smaller in width than the specified time value from digital signals.
For more information about the Logic Glitch Filter field, see Filtering Glitches in Digital
Signals on page 124.

5. (Optional) In the Signal Time Exclude field, specify the time ranges that should be
excluded when validating the signal. Only the time ranges that fall outside the specified
time ranges will be validated for the signal.
For more information about the Signal Time Exclude field, see Excluding Specific Time
Ranges of Waveform Signals from Validation on page 129.
6. (Optional) In the Signal Time Window field, specify the time ranges that should be
validated for the signal. The time ranges that fall outside the specified time ranges will not
be validated for the signal.
For more information about the Signal Time Window field, see Validating Only Specific
Time Ranges of Waveform Signals on page 130.
7. Click OK.
The Options column in the Logic group box in the Waveform Signals tab indicates that
local options are specified for the top.invOut1 digital signal. For example, the text

Local: timetol=2n, glitchtime=2n in the Options column in the following figure
indicates that local options are specified for the digital signal.

Copying Local Options For a Signal to Other Signals


You can copy the local options specified for a signal to other selected signals.
To copy the local options specified for a signal to other signals, do the following:

1. Select the signals to which you want to copy the local options. For more information
about selecting signals, see Selecting Signals on page 146.
2. Right-click the signal whose local options you want to copy and choose Signal Options –
Copy this signal's options to other selected signals.

Deleting Local Options Specified for Signals


When you delete the local options specified for a signal, the global options specified in the
Global Signal Options group box are applied to the signal.
To delete the local options for a specific signal, do the following:

Right-click the signal and choose Signal Options – Revert options for this signal to
use global values.

To delete the local options for multiple signals:


1. Select the signals whose local options you want to delete. For more information about
selecting signals, see Selecting Signals on page 146.
2. Right-click on any of the selected signals and choose Signal Options – Revert
selected signals' options to use global values.

Viewing the Failing Areas for a Signal


To view the failing areas for a signal, do the following:

Right-click the signal and choose Show failing areas for this signal.
The Signal Failing Areas form appears displaying the failing areas for the signal.

You can do the following in the Signal Failing Areas form:

Right-click a failing area and choose Plot Signal Failing Area to plot the failing area.

For example, the waveform window in the following figure highlights a failing area from
71.09ns to 71.46ns for signal dn. The red trace with the suffix (cmp) is for the signal
in the compared data source and the blue trace with the suffix (ref) is for the signal in
the reference data source.


For more information about plotting signals, see Plotting Signals on page 143.

Right-click a failing area and choose Plot Signal Failing Area with other selected
signals to plot the failing area along with other signals that are selected in the Waveform
Signals tab. For more information about selecting signals in the Waveform Signals tab,
see Selecting Signals on page 146.
For example, the waveform window in the following figure highlights a failing area with
the time range from 71.09ns to 71.46ns for signal dn and also displays the plots for
the same time range for signals dnb and up that are selected in the Waveform Signals

tab. The traces with the suffix (cmp) are for the signals in the compared data source and
the traces with the suffix (ref) are for the signals in the reference data source.

Exclude failing areas of signals from validation. For more information, see Excluding
Failing Areas of Signals from Validation on page 141.


Viewing the Point Details for a Signal


To view the values at each of the swept points for a waveform signal created during an ADE
XL simulation run, do the following:

Right-click the signal and choose Show point details for this signal.
The <signalName> waveform signal form appears displaying the point details for the
signal.

Note: The absolute or relative difference of a point that falls within the specified absolute or
relative tolerance value is displayed in green color indicating that the point is passing for the
signal. The absolute or relative difference of a point that falls outside the specified absolute
or relative tolerance value is displayed in red color indicating that the point is failing for the
signal.
You can do the following in the <signalName> waveform signal form:

Right click the column for a failing point and choose Show failing areas for this signal
to view the failing areas for the signal at that point. For more information, see Viewing the
Failing Areas at a Point on page 140.

Right click the column for a failing point and choose Plot this signal to plot the signal.
For more information about plotting signals, see Plotting Signals on page 143.


Viewing the Failing Areas at a Point


To view the failing areas at a point for a waveform signal created during an ADE XL simulation
run, do the following:
1. View the point details for the signal. For more information, see Viewing the Point Details
for a Signal on page 139.
2. Right-click the column for the point in the <signalName> waveform signal form and
choose Show failing points for this signal.
For example, to view the failing areas at point 1 for the dn signal, right-click the column
1 in the dn waveform signal form and choose Show failing points for this signal.

The <signalName> Signal Failing Areas form appears displaying the start and end
range of differences between the reference and compared signals that exceeded the

specified absolute and relative tolerances. For more information about the Signal Failing
Areas form, see Viewing the Failing Areas for a Signal on page 136.

Excluding Failing Areas of Signals from Validation


The Exclude column in the Signal Failing Areas form allows you to exclude specific failing
areas of a signal from validation.
Note: The Exclude column is displayed in the Signal Failing Areas form if the Advanced
Mode check box in the Preferences for AMS Design and Model Validation form is selected.
For more information about the Advanced Mode check box, see Specifying amsDmv
Options on page 35.
To exclude failing areas from validation, do the following:
1. Select the Exclude check box next to a failing area.
2. Click the Exclude button.
The Options column in the Waveform Signals tab indicates that the failing areas are excluded
from validation. For example, the text Local: reltol=5, exclude=1.89867e-08:1.95042e-08
in the Options column in the following figure indicates that the failing area
1.89867e-08 to 1.95042e-08 is excluded from validation.

After excluding a failing area for a signal from validation, you can include it for validation by
doing the following:
1. Right-click the signal in the Waveform Signals tab and choose Signal Options – Set
options for this signal.

The Signal <signalName> Options form appears.

2. In the Signal Time Exclude field, delete the range for the failing area you want to
include for validation.
The area is displayed as a failing area for validation. For information about viewing the failing
areas at a point for a signal, see Viewing the Failing Areas at a Point on page 140.

Plotting Signals
You can plot signals for debugging purposes.
Note the following:

If you are running amsDmv from an IC 6.1 installation, you can plot signals using
SimVision or the Virtuoso Visualization and Analysis tool. If you are running amsDmv
from an IC 5.1.41 installation, you can plot signals using SimVision or WaveScan.

The Virtuoso Visualization and Analysis tool is used as the default plotting tool if you are
running amsDmv from an IC 6.1 installation, and WaveScan is used as the default
plotting tool if you are running amsDmv from an IC 5.1.41 installation.
To use SimVision as the default plotting tool, select the SimVision check box in the
Preferences for AMS Design and Model Validation form. For more information, see
Specifying amsDmv Options on page 35.
SimVision is used as the default plotting tool if digital signals exist in the waveform data
because plotting logic failing areas with SimVision is easier than with the Virtuoso
Visualization and Analysis tool or WaveScan (in IC 5.1.41). To use the Virtuoso
Visualization and Analysis tool or WaveScan for plotting waveform data that contains
digital signals, deselect the SimVision check box in the Preferences for AMS Design and
Model Validation form.


By default, bus signals are plotted as a single waveform in SimVision. To plot each bit of
bus signals as separate waveforms in SimVision, select the Split Buses check box in
the Preferences for AMS Design and Model Validation form. For more information, see
Specifying amsDmv Options on page 35.

In the waveform window, the suffix (ref) is used to indicate signals in the reference data
source and the suffix (cmp) is used to indicate signals in the compared data source.
For example, in the following plot for signal dn, the red trace with the suffix (cmp) is for
the signal in the compared data source and the blue trace with the suffix (ref) is for the
signal in the reference data source.


To plot a signal, do the following:

Right-click the signal and choose Plot – Plot this signal.

To plot all signals, do the following:



Right-click a signal and choose Plot – Plot all signals.

To plot multiple signals, do the following:


1. Select the signals you want to plot. For more information about selecting signals, see
Selecting Signals on page 146.
2. Right-click on any of the selected signals and choose Plot – Plot selected signals.
To plot all failing signals (signals with a Fail status), do the following:

Right-click a failing signal and choose Plot – Plot failing signals.

To plot a failing area for a signal, do the following:


1. View the failing areas for the signal. For more information, see Viewing the Failing Areas
for a Signal on page 136.
2. Right-click a failing area in the Signal Failing Areas form and choose Plot Signal Failing
Area.
To plot a failing area for a signal along with other signals selected in the Waveform Signals
tab, do the following:
1. In the Waveform Signals tab, select the signals you want to plot with the failing area. For
more information about selecting signals, see Selecting Signals on page 146.
2. View the failing areas for the signal. For more information, see Viewing the Failing Areas
for a Signal on page 136.
3. Right-click a failing area in the Signal Failing Areas form and choose Plot Signal Failing
Area with other selected signals.

Selecting and Deselecting Signals


If you want to perform operations such as disabling multiple signals (see Disabling Validation
of Specific Signals on page 128), plotting multiple signals (see Plotting Signals on page 143),
setting, copying or deleting local options for multiple signals (see Working with Local Options
for Validating Signals on page 130), you can select the signals before performing these
operations.


Selecting Signals
Note: Before you select signals, ensure that the Advanced Mode check box in the
Preferences for AMS Design and Model Validation form is selected. For more information
about the Advanced Mode check box, see Specifying amsDmv Options on page 35.
To select a specific signal, do the following:

Select the check box next to the signal in the Selected column.

To select all signals, do the following:

Right-click a signal and choose Signal Selection – Select all signals.


The check boxes next to all the signals are selected in the Selected column.

Deselecting Signals
To deselect a specific signal, do the following:

Deselect the check box next to the signal in the Selected column.

To deselect all signals, do the following:

Right-click a signal and choose Signal Selection – Deselect all signals.


The check boxes next to all the signals are deselected in the Selected column.

Inverting Signal Selections


When you invert signal selections, the signals that are currently selected are deselected and
the signals that are currently deselected are selected.
To invert signal selections, do the following:

Right-click a signal and choose Signal Selection – Invert signal selections.

Hiding and Showing Global Signal Options


To hide the Global Signal Options group box, do one of the following:

Click the corresponding toolbar button.

Choose View > Show/Hide Options.

To show the Global Signal Options group box, do one of the following:

Click the corresponding toolbar button.

Choose View > Show/Hide Options.

Disabling Automatic Validation of Waveforms


By default, whenever any option for validating waveforms is changed, the currently loaded
waveforms for the reference and compared data sources are automatically compared and the
updated validation results are displayed in the Waveform Signals tabs. For more information
about specifying options for validating waveforms, see Specifying Global Options for
Validating Signals on page 121 and Working with Local Options for Validating Signals on
page 130.
However, automatic validation can be slow if the waveform data is very large. In
such cases, you can disable automatic validation by setting the following environment
variable before you start amsDmv:
setenv AMS_DMV_MANUAL_REVALIDATE
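
For example, in a csh-style shell (matching the setenv syntax above), you might run the following before starting amsDmv from the command line; the setup file name here is only a placeholder:

setenv AMS_DMV_MANUAL_REVALIDATE
amsDmv -load my_setup.dmv

In sh-compatible shells, export AMS_DMV_MANUAL_REVALIDATE=1 has the same effect, assuming amsDmv checks only whether the variable is set.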

If the AMS_DMV_MANUAL_REVALIDATE environment variable is set, the following message box
appears when you change any option for validating waveforms.

Click OK to close the message box and do one of the following to view the updated
validation results in the Waveform Signals tab:

Choose Run > Validate.

Click the corresponding toolbar button.

Tip
Choose Run > Validate or click the corresponding toolbar button only after you make all the
required changes in the options. This saves time because validation is run only once.

5
Running Pin Checks
Pin checking allows you to verify whether there are any differences between the pin and
module interfaces of the reference design and the compared model. You can verify whether:

Pin names, pin order and pin direction are the same between the reference design and
the compared model.

Module names are the same between the reference design and the compared model.

To perform pin checking, do the following:


1. Start amsDmv from the Virtuoso environment.
For more information, see Starting amsDmv from the Virtuoso Environment on page 19.
Note: To perform pin checking, you must start amsDmv from the Virtuoso environment.
Pin checking is not supported when you run amsDmv from the command line.
The AMS Design and Model Validate form appears.

2. Click the Pin Check tab.

3. Select the Check Reference to Compared Pins check box.


Important
The reference and compared views that should be specified for pin checking in the
Library, Cell and Views group box are different from the views specified for the
reference and compared data sources (for the ADE L and ADE XL run type) in the
Source tab. The views specified in the Source tab refer to an ADE L state or ADE
XL view. However, the views specified in the Pin Check tab must refer to a device
under test and contain pin information.
4. Right-click the Reference column in the Library, Cell and Views group box and
choose Browse.

The Select reference lib/cell/view form appears.

5. Select the library, cell, and view for the reference design and click OK.
The reference library, cell, and view names are displayed in the Reference column.
Tip
You can right-click the Reference column in the Library, Cell and Views group
box and choose Open to open the reference view for editing. If the reference view
is a schematic view, it is opened in Virtuoso Schematic Editor. If the reference view
is a text view, such as a verilog view, it is opened in a text editor.
6. Right-click the Compared column in the Library, Cell and Views group box and
choose Browse.

The Select compared lib/cell/view form appears.

7. Select the library, cell, and view for the compared model and click OK.
The library, cell, and view names for the compared model are displayed in the
Compared column.
Tip
You can right-click the Compared column in the Library, Cell and Views group
box and choose Open to open the compared view for editing. If the compared view
is a schematic view, it is opened in Virtuoso Schematic Editor. If the compared view
is a text view, such as a verilog view, it is opened in a text editor.
8. In the Validate group box, select the Views Exist check box to verify that the reference
and compared library, cell, and views:

Exist and can be read by amsDmv

Contain pin information

The Status column displays the Passed status if the checks are successful.
Note: The other pin checks in the Validate group box are enabled only if the Views
Exist check is successful.

9. Select the check boxes to enable the pin checks you want to perform.

Pin Names
    Checks whether pin names are the same in the reference and compared views.

Pin Order
    Checks whether pin names and pin order are the same in the reference and compared views.

Pin Direction
    Checks whether the pin direction (input, output, or inout) is the same in the reference and compared views.

Module name
    Checks whether the module names in the reference and compared views are the same.
    Note: If a module exists in only the reference or the compared view, and not in both, amsDmv checks whether the cell name of the view that does not contain the module matches the module name.

The Status column in the Validate group box displays the status for each check as
Passed or Failed. The Report group box displays the errors identified for each check
that fails, and the progress indicator at the top of the Pin Check tab displays a summary
of the pin check status, as shown in the figure.

10. Correct the pin check errors and perform a pin check again to verify that all the errors are
corrected.
Tip
To open the compared or reference view for editing from amsDmv, right-click the
Compared or Reference column in the Library, Cell and Views group box and
choose Open.


6
amsDmv Command Reference
The amsDmv command allows you to run amsDmv from the UNIX command line. The
amsDmv command syntax is given below:
amsDmv [-help] [optional_arguments]

Use the following command to view information about all the options available in the amsDmv
command:
amsDmv -help

Note: Option values must be enclosed within single quotes.
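
For example, in the following command (a sketch; the waveform directory names are reused from the examples at the end of this chapter), each option value is enclosed in single quotes:

amsDmv -wavref './waves_ref.shm' -wavcmp './waves_mod.shm' -wavabstol '0.5' -wavreltol '5'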


See the following topics for more information about the amsDmv command:

amsDmv Command Options on page 156

amsDmv Command Examples on page 175


amsDmv Command Options


See the following topics for more information about amsDmv command options:

amsDmv Command Line Options on page 156

Command Line Flags on page 167

amsDmv Command Line Options


The following table describes the amsDmv command line options and their values.

-cmppostrun 'value'
    Specify the system commands, or the path to a file containing the system commands, that need to be run after the simulation of the compared data source is complete.
    Related option: -cmppostruntype

-cmppostruntype 'value'
    Specify the type of the -cmppostrun option.
    Valid value: 'syscmd'

-cmpprerun 'value'
    Specify the system commands, or the path to a file containing the system commands, that need to be run before the simulation of the compared data source is run.
    Related option: -cmppreruntype

-cmppreruntype 'value'
    Specify the type of the -cmpprerun option.
    Valid value: 'syscmd'

-load 'value'
    Specify the path to the amsDmv setup (.dmv) file to be loaded.
    For more information about amsDmv setup files, see Saving and Opening the amsDmv Setup on page 85.

-localoptionsfile 'value'
    Specify the path to the .opt file containing the local options for validating results and waveform signals.
    The .opt file is created when you save the amsDmv setup to a file (see Saving the amsDmv Setup on page 85) or as an executable script (see Saving the amsDmv Setup as an Executable Script and a SKILL File on page 87).
    For more information about specifying local options for validating results and waveform signals, see Working with Local Options for Validating Results on page 103 and Working with Local Options for Validating Signals on page 130.

-pinscmpcell 'value'
    Specify the name of the cell containing the compared model for which pin checking should be run.
    Related option: -checkpins

-pinscmplib 'value'
    Specify the name of the library containing the compared model for which pin checking should be run.
    Related option: -checkpins

-pinscmpview 'value'
    Specify the name of the cellview containing the compared model for which pin checking should be run.
    Related option: -checkpins

-pinsrefcell 'value'
    Specify the name of the cell containing the reference design for which pin checking should be run.
    Related option: -checkpins

-pinsreflib 'value'
    Specify the name of the library containing the reference design for which pin checking should be run.
    Related option: -checkpins

-pinsrefview 'value'
    Specify the name of the cellview containing the reference design for which pin checking should be run.
    Related option: -checkpins

-refpostrun 'value'
    Specify the system commands, or the path to a file containing the system commands, that need to be run after the simulation of the reference data source is complete.
    For more information, see Running System Commands Before and After Simulation Runs on page 78.

-refpostruntype 'value'
    Specify the type of the -refpostrun option.
    Valid value: 'syscmd'

-refprerun 'value'
    Specify the system commands, or the path to a file containing the system commands, that need to be run before the simulation of the reference data source is run.
    For more information, see Running System Commands Before and After Simulation Runs on page 78.

-refpreruntype 'value'
    Specify the type of the -refprerun option.
    Valid value: 'syscmd'

-repfile 'value'
    Specify the name of the report file that is created after the amsDmv command is run. The report file contains the simulation run and validation status.

-resabstol 'value'
    Specify the maximum acceptable tolerance value for the worst-case absolute deviation reported for results.
    For more information, see Specifying Maximum Acceptable Tolerance Values for Results on page 98.

-rescmp 'value'
    Specify the path to the compared ADE XL result database from which results should be loaded for validation.
    Related option: -simcmpnone

-resexclude 'value'
    Specify the range of swept or corner points for which ADE XL results should be excluded from validation. By default, all swept and corner points are included for validation.
    For more information, see Excluding Specific Swept or Corner Points of ADE XL Measured Results from Validation on page 102.

-resref 'value'
    Specify the path to the reference ADE XL result database from which results should be loaded for validation.
    Related option: -simrefnone

-resreltol 'value'
    Specify the maximum acceptable % tolerance value for the worst-case relative deviation reported for results.
    For more information, see Specifying Maximum Acceptable Tolerance Values for Results on page 98.

-restest 'value'
    Specify regular expressions defining the ADE XL tests for which results should be validated. Only the results from the tests with matching names are validated.
    The default value '*' indicates that the measured results of all tests are validated.
    For more information, see Validating Only Results for Specific ADE XL Tests on page 101.

-results 'value'
    Specify regular expressions defining the results to be validated. Only the results with matching names are validated.
    The default value '*' indicates that all measured results are validated.
    For more information, see Validating Only Results With Specific Names on page 100.

-resultsdisable 'value'
    Specify regular expressions defining the results that should not be validated. The results with matching names are not validated.
    For more information, see Disabling Validation of Results With Specific Names on page 100.

-reswindow 'value'
    Specify that only the ADE XL results that fall within a specified range of swept or corner points should be validated.
    For more information, see Including Only Specific Swept or Corner Points of ADE XL Measured Results for Validation on page 102.

-simaxlcmpcell 'value'
    Specify the name of the cell containing the ADE XL view that should be used for simulating the compared data source using ADE XL.

-simaxlcmphistory 'value'
    Specify the name of the ADE XL history item whose settings should be used for simulating the compared data source using ADE XL.

-simaxlcmplib 'value'
    Specify the name of the library containing the ADE XL view that should be used for simulating the compared data source using ADE XL.

-simaxlcmpview 'value'
    Specify the name of the ADE XL view that should be used for simulating the compared data source using ADE XL.

-simaxlconfigcmpcell 'value'
    Specify the name of the cell containing the alternative design or config view that should be used for simulating the compared data source using ADE XL.
    Related options: -simaxlconfigcmplib, -simaxlconfigcmpview

-simaxlconfigcmplib 'value'
    Specify the name of the library containing the alternative design or config view that should be used for simulating the compared data source using ADE XL.
    Related options: -simaxlconfigcmpcell, -simaxlconfigcmpview

-simaxlconfigcmpview 'value'
    Specify the name of the alternative design or config view that should be used for simulating the compared data source using ADE XL.
    The compared data source uses the simulation setup in the ADE XL view specified for the reference data source, but simulates the specified design or config view.
    Related options: -simaxlconfigcmplib, -simaxlconfigcmpcell

-simaxlrefcell 'value'
    Specify the name of the cell containing the ADE XL view that should be used for simulating the reference data source using ADE XL.

-simaxlrefhistory 'value'
    Specify the name of the ADE XL history item whose settings should be used for simulating the reference data source using ADE XL.

-simaxlreflib 'value'
    Specify the name of the library containing the ADE XL view that should be used for simulating the reference data source using ADE XL.

-simaxlrefview 'value'
    Specify the name of the ADE XL view that should be used for simulating the reference data source using ADE XL.

-simskillcmp 'value'
    Specify the SKILL or OCEAN commands, or the path to a file containing the SKILL or OCEAN commands, that should be used for simulating the compared data source when the -simcmpskill option is specified.

-simskillref 'value'
    Specify the SKILL or OCEAN commands, or the path to a file containing the SKILL or OCEAN commands, that should be used for simulating the reference data source when the -simrefskill option is specified.

-simstatecmpcell 'value'
    Specify the name of the cell for the compared data source that should be simulated using ADE L.

-simstatecmpdir 'value'
    Specify the path to the directory containing the ADE L state whose settings should be used when simulating the compared data source using ADE L.

-simstatecmplib 'value'
    Specify the name of the library for the compared data source that should be simulated using ADE L.

-simstatecmpname 'value'
    Specify the name of the ADE L state whose settings should be used when simulating the compared data source using ADE L.

-simstatecmpsim 'value'
    Specify the name of the simulator that should be used for simulating the compared data source using ADE L.

-simstatecmpview 'value'
    Specify the name of the cellview for the compared data source that should be simulated using ADE L.

-simstateconfigcmpcell 'value'
    Specify the name of the cell containing the alternative design or config view that should be used for simulating the compared data source using ADE L.
    Related options: -simstateconfigcmplib, -simstateconfigcmpview

-simstateconfigcmplib 'value'
    Specify the name of the library containing the alternative design or config view that should be used for simulating the compared data source using ADE L.
    Related options: -simstateconfigcmpcell, -simstateconfigcmpview

-simstateconfigcmpview 'value'
    Specify the name of the alternative design or config view that should be used for simulating the compared data source using ADE L.
    The compared data source uses the simulation setup in the ADE L state specified for the reference data source, but simulates the specified design or config view.
    Related options: -simstateconfigcmplib, -simstateconfigcmpcell

-simstaterefcell 'value'
    Specify the name of the cell for the reference data source that should be simulated using ADE L.

-simstaterefdir 'value'
    Specify the path to the directory containing the ADE L state whose settings should be used when simulating the reference data source using ADE L.

-simstatereflib 'value'
    Specify the name of the library for the reference data source that should be simulated using ADE L.

-simstaterefname 'value'
    Specify the name of the ADE L state whose settings should be used when simulating the reference data source using ADE L.

-simstaterefsim 'value'
    Specify the name of the simulator that should be used for simulating the reference data source using ADE L.

-simstaterefview 'value'
    Specify the name of the cellview for the reference data source that should be simulated using ADE L.

-simsyscmp 'value'
    Specify the system commands, or the path to a file containing the system commands, that should be used for simulating the compared data source.
    Related option: -simcmpsys

-simsysref 'value'
    Specify the system commands, or the path to a file containing the system commands, that should be used for simulating the reference data source.
    Related option: -simrefsys

-wavabstol 'value'
    Specify the maximum acceptable tolerance value for the worst-case absolute deviation reported for analog signals.
    For more information, see Specifying Maximum Acceptable Tolerance Values for Analog Signals on page 121.

-wavcmp 'value'
    Specify the path to the directory containing the waveform signal data for the compared data source.
    Related option: -simcmpnone

-wavdataset 'value'
    Specify regular expressions defining the analyses for which waveform signals should be validated. Only the signals from the analyses with matching names are validated.
    The default value '*' indicates that signals from all analyses are validated.
    For more information, see Validating Only Waveform Signals for Specific Analyses on page 128.

-wavexclude 'value'
    Excludes from validation the waveform signals that fall within the specified time range.
    For example, specify -wavexclude '70n:72n,110n:112n' to exclude from validation the waveform signals that fall within the time ranges 70n to 72n and 110n to 112n.
    For more information, see Excluding Specific Time Ranges of Waveform Signals from Validation on page 129.

-wavglitchtime 'value'
    Ignores glitches in digital signals that are equal to or smaller in width than the specified time value. The default unit of time is seconds.
    For more information, see Filtering Glitches in Digital Signals on page 124.

-wavref 'value'
    Specify the path to the directory containing the waveform signal data for the reference data source.
    Related option: -simrefnone

-wavreltol 'value'
    Specify the maximum acceptable % tolerance value for the worst-case relative deviation reported for analog signals.
    For more information, see Specifying Maximum Acceptable Tolerance Values for Analog Signals on page 121.

-wavsignals 'value'
    Specify regular expressions defining the signals to be validated. Only the signals with matching names are validated and displayed.
    The default value '*' indicates that all signals are validated.
    For more information, see Validating Only Signals With Specific Names on page 126.

-wavsignalsdisable 'value'
    Specify regular expressions defining the signals that should not be validated. The signals with matching names are not validated.
    For more information, see Disabling Validation of Signals With Specific Names on page 127.

-wavtest 'value'
    Specify regular expressions defining the ADE XL tests for which signals should be validated. Only the signals from the tests with matching names are validated and displayed.
    The default value '*' indicates that the signals of all tests are validated.
    For more information, see Validating Only Signals for Specific ADE XL Tests on page 129.
    Note: This option is used only if the -wavfromrdb option is specified.

-wavtimetol 'value'
    Specifies the maximum acceptable tolerance value for the time difference (skew) between transition times reported for digital signals.

-wavwindow 'value'
    Validates only the waveform signals that fall within the specified time range.
    For example, specify -wavwindow '70n:72n,110n:112n' to validate only the waveform signals that fall within the time ranges 70n to 72n and 110n to 112n.
    For more information, see Validating Only Specific Time Ranges of Waveform Signals on page 130.
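
For example, the following command (a sketch: the waveform directories are reused from the examples at the end of this chapter, and the signal-name pattern and tolerance are placeholders) validates only the signals whose names match *out*, and only within the time ranges 70n to 72n and 110n to 112n:

amsDmv -wavref './waves_ref.shm' -wavcmp './waves_mod.shm' -wavsignals '*out*' -wavwindow '70n:72n,110n:112n' -wavreltol '2'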


Command Line Flags


The following table describes the amsDmv command line flags.

-advanced
    Enables advanced amsDmv features such as additional compared Virtuoso simulation job running options and the following features for measured results and waveform signals:
    Setting local options for validating measured results and signals. For more information, see Working with Local Options for Validating Results on page 103 and Working with Local Options for Validating Signals on page 130.
    Disabling multiple results (see Disabling Validation of Specific Results on page 101) or signals (see Disabling Validation of Specific Signals on page 128).
    Plotting multiple signals (see Plotting Signals on page 143).
    Excluding specific failing areas of a signal from validation (see Excluding Failing Areas of Signals from Validation on page 141).
    Detailed debugging and plotting options.

-allplots
    Retains existing plot windows and creates a new plot window for each new plot action.
    By default, the same plot window is updated with new plots.

-batch
    Executes the command line arguments, exits the tool with a status, and creates a report file.

-checkpins
    Enables checking of reference and model pin and module information. For more information, see Chapter 5, Running Pin Checks.
    Related options: -pinscmpcell, -pinscmplib, -pinscmpview, -pinsrefcell, -pinsreflib, -pinsrefview, -pinsdirection, -pinsexist, -pinsmodulename, -pinsnames, -pinsorder

-dcm
    Enables selecting ADE L states located outside the .artist_states directory.

-exit
    If amsDmv is run from Virtuoso, Virtuoso exits when amsDmv exits.
    Note: You can use this option with the -batch option.

-help
    Displays the description of the amsDmv command and its options.

-icxx
    Indicates that you are using the icms or icfb executable from the IC 5.1.41 release instead of the virtuoso executable from the IC 6.1 release. Specify this option when you are running amsDmv from the IC 5.1.41 release.

-interactive
    Overrides batch command line operation. This option disables the -batch option.

-keepsims
    Retains simulation session information so that the session can be used for debugging purposes if the simulation fails.

-pinsdirection
    Checks whether the pin direction (input, output, or inout) is the same in the reference and compared views.

-pinsexist
    Checks whether the reference and compared views exist and can be read by amsDmv.
    The following pin checking options are used only if this check succeeds: -pinsdirection, -pinsmodulename, -pinsnames, -pinsorder

-pinsmodulename
    Checks whether the module names in the reference and compared views are the same.
    Note: If a module exists in only the reference or the compared view, and not in both, amsDmv checks whether the cell name of the view that does not contain the module matches the module name.

-pinsnames
    Checks whether the pin names are the same in the reference and compared views.

-pinsorder
    Checks whether the pin names and pin order are the same in the reference and compared views.

-run
    Runs simulations on the reference and compared data sources.

-runupdate
    Automatically displays the Measured Results or Waveform Signals tab after the simulation has successfully completed.

-simcmpade
    Specifies that ADE L should be used to simulate the compared data source.
    For more information, see Setting Up and Running Simulations Using ADE on page 44.
    Related options: -simstatecmpcell, -simstatecmpdir, -simstatecmplib, -simstatecmpname, -simstatecmpsim, -simstatecmpview

-simcmpadexl
    Specifies that ADE XL should be used to simulate the compared data source.
    For more information, see Setting Up and Running Simulations Using ADE XL on page 52.
    Related options: -simaxlcmpcell, -simaxlcmphistory, -simaxlcmplib, -simaxlcmpview, -simaxlconfigcmpcell, -simaxlconfigcmplib, -simaxlconfigcmpview

-simcmpnone
    Specifies that the validation of:
    Waveform signals for the compared data source should be done using the waveform data in the directory specified using the -wavcmp option.
    Measured results for the compared data source should be done using the measured results in the ADE XL result database specified using the -rescmp option.
    Note: If this option is specified, the compared data source is not simulated.

-simcmpskill
    Uses the SKILL or OCEAN commands specified using the -simskillcmp option to simulate the compared data source.
    For more information, see Setting Up and Running Simulations Using SKILL or OCEAN Commands on page 58.

-simcmpsys
    Uses the system commands specified using the -simsyscmp option to simulate the compared data source.
    For more information, see Setting Up and Running Simulations Using System Commands on page 63.

-simconfig
    Enables specifying an alternative design or config view that should be used for simulating the compared data source using ADE L or ADE XL.
    The compared data source uses the simulation setup in the ADE L state or ADE XL view specified for the reference data source, but simulates the specified design or config view.

-simdiff
    Specifies that the compared data source can have a different simulation run type.
    If this option is specified, you must also specify the -advanced option.

-simrefade
    Specifies that ADE L should be used to simulate the reference data source.
    For more information, see Setting Up and Running Simulations Using ADE on page 44.
    Related options: -simstaterefcell, -simstaterefdir, -simstatereflib, -simstaterefname, -simstaterefsim, -simstaterefview

-simrefadexl
    Specifies that ADE XL should be used to simulate the reference data source.
    For more information, see Setting Up and Running Simulations Using ADE XL on page 52.
    Related options: -simaxlrefcell, -simaxlrefhistory, -simaxlreflib, -simaxlrefview

-simrefnone
    Specifies that the validation of:
    Waveform signals for the reference data source should be done using the waveform data in the directory specified using the -wavref option.
    Measured results for the reference data source should be done using the measured results in the ADE XL result database specified using the -resref option.
    Note: If this option is specified, the reference data source is not simulated.

-simrefskill
    Uses the SKILL or OCEAN commands specified using the -simskillref option to simulate the reference data source.
    For more information, see Setting Up and Running Simulations Using SKILL or OCEAN Commands on page 58.

-simrefsys
    Uses the system commands specified using the -simsysref option to simulate the reference data source.
    For more information, see Setting Up and Running Simulations Using System Commands on page 63.

-simsame
    Indicates that the compared data source uses the same simulation run type that is specified for the reference data source.

-simvision
    Uses SimVision as the default tool for plotting waveform signals.
    By default, the Virtuoso Visualization and Analysis tool is used to plot signals.
    For more information, see Plotting Signals on page 143.

-splitbuses
    Plots each bit of bus signals as separate waveforms.
    By default, bus signals are plotted as a single waveform in SimVision.

-statistics
    Displays the minimum and maximum values for each measured result and waveform signal in the reference and compared data sources.
    These values are displayed in the following columns on the Measured Results and Waveform Signals tabs: Max Ref Value, Min Ref Value, Max Comp Value, Min Comp Value.
    For more information about these columns, see Chapter 3, Validating Measured Results and Chapter 4, Validating Waveforms.

-useengnotation
    Uses engineering notation to display values in the following columns in the Measured Results and Waveform Signals tabs: Worst Absolute Diff, Worst Relative Diff, Max Ref Value, Min Ref Value, Max Comp Value, Min Comp Value.
    If this option is not specified, scientific notation is used to display values in these columns.

-verbose
    Displays the following:
    Additional pop-up messages when you perform operations such as saving the amsDmv setup (see Saving the amsDmv Setup on page 85).
    More detailed information in the log area (see Showing and Hiding the Log on page 82) during a simulation run.

-virtuoso
    Indicates that amsDmv is launched from Virtuoso. amsDmv attempts to connect using an MPS session.

-wavfromrdb
    Loads waveform signals from the ADE XL result database.
    This option overrides the -wavref and -wavcmp options.
    Related option: -wavtest

amsDmv Command Examples


amsDmv -load wav1.dmv

amsDmv -wavref design_psf -wavcmp model_psf -wavabstol 0.5 -wavreltol 5

amsDmv -resref design_sim.rdb -rescmp model_sim.rdb -resabstol 1f -results '*period' -resexclude 7:8 -wavfromrdb

amsDmv -wavref waves_ref_inv.shm -wavcmp waves_mod_inv.shm -wavglitchtime 3n -wavtimetol 100p

amsDmv -repfile 'dmv.rep' -simcmpsys -simrefsys -run -simsame \
    -simsyscmp 'rsh icspvlnx78 "cd AMSDMVDIR; mod_irun_cmd"' \
    -simsysref 'rsh icspvlnx78 "cd AMSDMVDIR; ref_irun_cmd"' \
    -wavcmp './waves_mod.shm' -wavref './waves_ref.shm'
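
The following additional sketches assume a previously saved setup; the setup and report file names are hypothetical. The first command runs the setup in batch mode and writes a report file, and the second loads the same setup interactively with advanced features, statistics columns, and engineering notation enabled:

amsDmv -batch -load 'my_setup.dmv' -repfile 'dmv_report.txt'

amsDmv -load 'my_setup.dmv' -advanced -statistics -useengnotation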
