
Informatica PowerCenter Hands-On Workshop

Agenda
9:00  Introduction to Informatica
9:30  Introduction to Informatica Data Integration Platform
10:00 Introduction to PowerCenter
10:30 Tutorial Lesson 1
11:30 Tutorial Lesson 2
12:30 Lunch
1:30  Tutorial Lesson 3
2:00  Tutorial Lesson 4
2:30  Using the Debugger
3:00  Putting It All Together
4:00  Tutorial Lesson 6
4:30  Review and Q/A

Workshop Objectives
By the end of the day you will:

Understand the broad set of data integration challenges facing organizations today and how the Informatica Platform can be used to address them
Access data from different data sources and targets
Profile a data set and understand how to look for basic problems that need to be solved
Integrate data from multiple sources through Extraction, Transformation and Load (ETL)
Debug data integration processes (mappings)
Expose integration logic as Web Services for use in an SOA architecture
3

Informatica
The #1 Independent Leader in Data Integration
Founded: 1993
2012 Revenue: $811.6 million
7-year Annual CAGR: 17% per year
Employees: 2,810+
Partners: 450+ (major SI, ISV, OEM and on-demand leaders)
Customers: over 5,000, in 82 countries
Direct presence in 28 countries
#1 in Customer Loyalty Rankings (7 years in a row)

[Chart: annual revenue, 2005-2012]

Global Presence & Global Perspective


Employees in 26 countries… and growing!

Product Development

Customer Support

Professional Services

Why Informatica?

Proven Technology Leadership

Product Leadership
Proven Technology Leadership

Enterprise Data Integration

Data Quality

Master Data Management

Cloud Data Integration


Application ILM

B2B Data Exchange


Informatica supports the requirements of cross-organizational data exchange, so users apply familiar & trusted data integration tools and techniques to the growing practice of B2B data integration.

Complex Event Processing


Informatica received high praise for its services from customers. For deployments involving systems monitoring use cases, Informatica offers a five-day standup of RulePoint.

Ultra Messaging
In spite of the new entrants, Informatica remains the market leader in this highly demanding part of the messaging market.

Why Informatica?
A Track Record of Continuous Innovation
Q4 2010: Cloud. Cloud Express; Trust.InformaticaCloud.com; B2B & AddressDoctor plug-ins
Q1 2011: CEP 5.2; MDM. DQ templates; DR 9.1 for Sybase; CEP Proactive PC Monitoring
Q2 2011: Informatica 9.1. Big Data Integration; Self-Service; Adaptive Data Services; Authoritative & Trustworthy Data
Q3 2011: Ultra Messaging; Cloud MDM. UM PowerCenter integration; Broad Cloud Connectivity; DQ Dashboards and Reports; MDM Counterparty Master; MDM Social Networking
Q4 2011: ILM 9.1; MDM 9.2. Dynamic Data Masking; Unified Registry & Hub MDM
Q1 2012: Informatica 9.1 Data Quality. Proactive Monitoring Options for DQ and PC; MDM for DB2; MDM Securities Master
Q2 2012: Informatica 9.5. Data Services 9.5; PowerCenter 9.5; PowerExchange 9.5; Data Quality 9.5; Data Explorer 9.5
Q3 2012: ILM DDM 9.5.1; IIR 9.5; MDM 9.5. DVO 9.5; ILM Dynamic Data Masking 9.5.1; Informatica Identity Resolution 9.5; Informatica MDM Registry Edition 9.5

Why Informatica?
Empowering the Data-Centric Enterprise
BUSINESS IMPERATIVES
Improve Decisions
Modernize Business
Improve Efficiency & Reduce Costs
Mergers, Acquisitions & Divestitures
Acquire & Retain Customers
Outsource Non-core Functions
Governance, Risk & Compliance
Increase Partner Network Efficiency
Increase Business Agility

IT INITIATIVES
Business & Operational Intelligence
Legacy Retirement
Application ILM
Application Consolidation
Customer, Supplier, Product Hubs
BPO
SaaS
Risk Mitigation & Regulatory Reporting
B2B Integration
Zero Latency Operations

DATA INTEGRATION PROJECTS

Data Warehouse

Data Migration

Test Data Management & Archiving

Data Consolidation

Master Data Management

Data Synchronization

Complex Event Processing

B2B Data Exchange

Ultra Messaging

Why Informatica?
The Neutral, Trusted and Preferred Partner
[Partner ecosystem diagram: BI partners, OEM partners, cloud partners, global SI partners, SI partners, database and infrastructure partners, operating systems, and platforms & technologies, all around Informatica.]

10

Secular Technology Megatrends


IT is Changing

[Diagram: IT is shifting from on-premise to cloud, from transactions to interactions, and from desktop to mobile.]

11

Data Integration
Data is Changing

[Diagram: data is shifting the same way, from on-premise to cloud, from transactions to interactions, and from desktop to mobile.]

12

Maximize Return on Data

13

Our Singular Mission


Maximize Return on Data

[Graphic: return on data = the value of data relative to its cost]
We empower organizations to maximize return on data to drive their top business imperatives

14

Informatica Value Proposition


Increase Value Of Data by Enabling Top Business Imperatives And Reduce IT Costs

[Diagram: spanning transactions and interactions, desktops and mobile, on-premise and cloud.]

15

The Informatica Platform

16

17

Informatica Value Proposition


Comprehensive, Unified & Open

[Diagram: spanning transactions and interactions, desktops and mobile, on-premise and cloud.]

18

The Growing Challenge: BIG DATA


Expanding the Frontiers of Data Integration

[Chart: the digital universe expanding from relational and transactional data to interactional and social data.]

Source: an IDC white paper sponsored by EMC, "As the Economy Contracts, the Digital Universe Expands."

19

Big Data
Confluence of Big Transaction, Big Interaction and Big Data Processing

Big Transaction Data

Online Transaction Processing (OLTP): Oracle, DB2, Ingres, Informix, Sybase, SQL Server
Online Analytical Processing (OLAP) & DW Appliances: Teradata, Red Brick, Essbase, Sybase IQ, Netezza, Exadata, HANA, Greenplum, DATAllegro, Aster Data, Vertica, ParAccel

Big Interaction Data

Social Media Data: Facebook, Twitter, LinkedIn, YouTube
Other Interaction Data: cloud (Salesforce.com, Concur, Google App Engine, Amazon); clickstream; image/text; scientific (genomic/pharma); medical (medical devices); sensors/meters; RFID tags; CDR/mobile

Big Data Processing

20

Barriers to Becoming Data Driven


Data is not timely

[Diagram: CUSTOMER, PRODUCT, ORDER, and INVOICE data fragmented across the BI Application, the Customer Service Portal, and the Sales Automation Application.]

21

Barriers to Becoming Data Driven


Data is not timely
Data is not trustworthy

[Diagram: the same fragmented CUSTOMER, PRODUCT, ORDER, and INVOICE data across the BI Application, Customer Service Portal, and Sales Automation Application.]

22

Barriers to Becoming Data Driven


Data is not timely
Data is not trustworthy
Data is not relevant

[Diagram: the same fragmented data across the three applications, with Business and IT disconnected.]

23

The Informatica Approach


Comprehensive, Unified, Open and Economical platform

Data Warehouse

Data Migration

Test Data Management & Archiving

Data Consolidation

Master Data Management

Data Synchronization

B2B Data Exchange

SWIFT

NACHA

HIPAA

Cloud Computing

Application

Database

Unstructured

Partner Data

24

What's New with Informatica

25

Informatica Delivers Greater Value to Customers


Acquisition of Best-in-Class Products from Adjacent Markets
Recent Acquisitions Since January 2009
Applimation (Jan. 2009): Leader in application information lifecycle management, helping customers manage their data assets from cradle to grave:
Data Archive, for storing inactive and rarely accessed data on a lower-cost storage medium.
Data Privacy, for masking identifying and identity data to remain in compliance with today's stringent personal information and privacy laws.
Data Subset, for creating purpose-built data sets for environment-based (DEV/QA/SYSTEM) or functional (CRM/ERP/FINANCE/HR) testing.

AddressDoctor (mid-2009): Leader in worldwide name and address validation for over 200 countries, providing customers with the highest-quality name and address content for standardization and validation.

Agent Logic (Oct. 2009): Leader in complex event processing and operational intelligence, helping customers institutionalize event detection patterns and human analysis patterns into automated opportunistic action and response alerts.

Siperian (Jan. 2010): Leader in master data management, helping customers achieve mastery of a single view of X across all mission-critical data domains.

26

Introduction to PowerCenter
Enterprise Data Integration and ETL

27

28

29

30

31

Informatica Platform
Single unified architecture
[Architecture diagram: the PowerCenter client tools (Designer for design, Workflow Manager for managing, Workflow Monitor for monitoring, plus the Administrator) sit on top of the services framework, which comprises the Repository Service, the Repository, the Integration Service, and Web Services. The Integration Service moves data from providers (XML, messaging, and Web Services; packaged applications; relational and flat files; mainframe and midrange) to consumers of the same types, ultimately feeding portals, dashboards, and reports.]

32

Proven Scalability
Threaded Parallel Processing
[Diagram: the same architecture, annotated for threaded parallel processing. Within the Integration Service, a partition point divides a session into a provider thread, transformation threads, and a consumer thread, connected by an in-memory pipeline.]

33

Proven Scalability
Pipeline Parallel Processing
[Diagram: pipeline parallel processing. Multiple sets of provider, transformation, and consumer threads run concurrently, each set connected by its own in-memory pipeline, so several stages of the pipeline process data simultaneously.]

34

What will we learn in this chapter?


How to:
1. Launch PowerCenter Designer to start your project
2. Connect to the PowerCenter Repository
3. Import source and target structures from relational tables and flat files
4. Create target structures: define tables and create them in the database

35

PowerCenter Client Tools


Designer - Used to create mappings that logically define what is to be done and how. Mappings define the sources, the targets, and the transformations that you want to perform on the data, all through a graphical drag-and-drop environment.

Workflow Manager - Used to create, schedule, and run workflows. A workflow is a set of instructions that describes how and when to run tasks related to extracting, transforming, and loading data.

Workflow Monitor - Used to graphically monitor the status of scheduled and running workflows for each PowerCenter server. You can view which tasks succeeded or failed and drill into the execution logs for each task to get run-time details.

Repository Manager - Used to create and administer the metadata repository. You can create users and groups, assign privileges and permissions to them, and create folders to contain the metadata.

36

Informatica Platform
Single unified architecture
[Architecture diagram repeated from "Informatica Platform: Single unified architecture" above: client tools, services framework, and the providers and consumers they connect.]

37

Sub-Tools within Designer


Source Analyzer - Used to import or create source definitions.

Warehouse Designer - Used to import or create target definitions.


Transformation Designer - Used to create reusable transformations

Mapplet Designer - Used to create reusable groups of transformations

Mapping Designer - Used to create mappings to extract, transform and load data.

38

Source Analyzer
Integrated. A key component of PowerCenter Designer, the Source Analyzer offers universal data access in a single unified platform.
Consistent. A single consistent method to access and manage any data source, regardless of type or location.
Visual. A simple graphical interface for importing and creating source definitions for any of the data sources supported by PowerCenter.

39

Target Designer
Integrated. A key component of PowerCenter Designer, the Target Designer offers universal data access in a single unified platform.
Consistent. A single consistent method to access and manage any data target, regardless of type or location.
Visual. A simple graphical interface for importing target definitions for any of the data types supported by PowerCenter.
Extensible. Can create target definitions, generate executable DDL, and even create new tables in the warehouse.

40

Tutorial Lesson 1

15 min (5 min for lab and 10 min break)

Creating Users and Groups
Creating a Folder in the PowerCenter Repository
Creating Source Tables (prerequisite: create the demo source tables)

41

Lesson 2

Creating Source Definitions
Creating Target Definitions and Target Tables

42

Lab 2: Step-by-step Overview


1. Launch the Designer
2. Log into the repository
3. Import relational source structure
4. Import flat file source structure
5. Create a relational target structure and build it in the relational instance

43

Using Designer

Double-Click to Launch Designer

44

Using Designer

1. Right-click the Workshop repository
2. Select Connect to open it

45

Using Designer

1. Enter Username: Administrator 2. Enter Password: Administrator

46

Using Designer

1. Right-click the MappingLab folder
2. Select Open


This is where most of our work will be done

47

Using Designer

We should now be in the Source Analyzer

1. Make sure you see Source Analyzer at the top left hand part of the gray work area

48

Using Designer

Import a relational source
1. From the menu bar, select Sources > Import from Database

49

Using Designer

In the Import Tables dialog, choose the ODBC connection for the data source where the source tables reside.
1. Click the ODBC data source drop-down box
2. Select the data source called source

Note: Informatica only uses ODBC to import the metadata structures into PowerCenter.

50

Using Designer

1. Enter Username: pc_user
2. The Owner name will self-populate
3. Enter Password: pc_user
4. Press Connect
51

Using Designer

1. Open up the directory tree under Select tables
2. Select the table CUSTOMERS
3. Press OK

52

Using Designer

1. Verify the source metadata structure for the CUSTOMERS and GOOD_CUST_STG tables

Next, we will import our flat file source structure.

53

Using Designer

1. From the menu bar, select Sources > Import from File

54

Using Designer

1. Navigate to the C:\PowerCenter Workshop directory 2. Select the TRANSACTIONS.dat file

55

Using Designer
1. Select the "Import field names from first line" check box; this tells PowerCenter to start importing data from the second line (note that Start Import at Row has changed to 2)
2. Keep the remaining defaults (the flat file source is Delimited, not Fixed Width)
3. Press Next

The flat file wizard is now displayed, which allows us to parse through our flat file source.
56

Using Designer

1. Keep the defaults (the flat file is comma-delimited)
2. Press Next
Look around this page. Notice you can account for multiple delimiters, consecutive delimiters and quotes around data.

57

Using Designer
Earlier we told PowerCenter to use the first line of the original flat file for the column names. Note that the columns are now named for us. Review the other options on this page.

1. Press Finish

58
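For reference, a comma-delimited file with a header row has this general shape; the field names below are the ones used in this lab, while the data values are purely illustrative (your TRANSACTIONS.dat will differ):

    TRANSACTION_ID,CUST_ID,PRODUCT_ID,DATEOFTRANSACTION,...
    10001,325,501,01152012,...

The wizard read the first line as column names and began importing data at row 2.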

Using Designer

Congratulations!
You just successfully imported one flat file and two relational source structures.

59

Using Designer

Select the Target Designer to bring in our target structures
1. Select the second icon on the shortcut line

60

Using Designer

Notice that when you select the Target Designer, the menu options change: one now says Targets.
1. Select Targets and choose the Import from Database option

61

Using Designer

The target structures are in the target instance of the database


1. Select the target ODBC data source named target

62

Using Designer

1. Enter Username: target
2. The Owner name will self-populate
3. Enter Password: target
4. Press Connect
63

Using Designer

CUSTOMER_NONAME will capture all of our records that do not have an associated customer name. GOOD_CUSTOMERS will capture all clean records to be loaded into our Data Warehouse.

1. Expand the directory tree
2. Select CUSTOMER_NONAME and GOOD_CUSTOMERS
3. Select OK

64

Using Designer

Build a date table
1. Select Targets > Create


One of the objectives of our Data Warehouse is to allow end users to drill into a customer name and determine the date that the customer has purchased their items.

65

Using Designer

Select the database type 1. Click the drop-down box and choose Oracle for the database type

66

Using Designer

1. Enter CUSTOMER_DATES as the name for the target table 2. Press Create

67

Using Designer

A new table should appear in the workspace behind the pop-up dialog.
1. Select Done to close the Create Target Table dialog

68

Using Designer
Edit the table - CUSTOMER_DATES

1. Double-click the CUSTOMER_DATES table

69

Using Designer

The properties for the CUSTOMER_DATES table are displayed

70

Using Designer

1. Select the Columns tab

71

Using Designer

Add columns to the table
1. Press the Add icon three times to add three new columns

72

Using Designer

1. Click NEWFIELD 2. Rename the column CUST_ID

73

Using Designer

1. Click in the Datatype column drop-down for CUST_ID 2. Select number

74

Using Designer

1. Click in the Key Type drop-down for CUST_ID to make this field a Primary Key

75

Using Designer

1. Change the second Column Name to TRANSACTION_ID
2. Change the Datatype of TRANSACTION_ID to number
3. Change the third Column Name to Date_of_Purchase
4. Change the Datatype of the Date_of_Purchase column to date
5. Press OK
76

Using Designer

We now have a metadata target structure in the PowerCenter Metadata Repository. We will now build the table in the Oracle target instance.

77

Using Designer

Build the table in the Oracle target instance
1. Select Targets > Generate/Execute SQL

78

Using Designer

Connect to the Oracle instance


1. Press Connect

79

Using Designer

1. Press the ODBC data source drop-down menu
2. Select the target database

80

Using Designer

1. Enter the Username target
2. Enter the Password target
3. Press Connect

81

Using Designer

We are now connected to target

Build the CUSTOMER_DATES table


1. Select Selected tables on the radio menu (we only want to build the CUSTOMER_DATES table)
2. Choose the options above (we know the table doesn't exist, but let's drop the table before we build the new one, just in case)
3. Press Generate and execute

82
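Given the options we just chose (drop and re-create, Oracle as the database type, and the three columns defined earlier), the generated SQL should look roughly like this sketch; the exact DDL PowerCenter emits may differ slightly:

    DROP TABLE CUSTOMER_DATES;
    CREATE TABLE CUSTOMER_DATES (
        CUST_ID          NUMBER NOT NULL,
        TRANSACTION_ID   NUMBER,
        DATE_OF_PURCHASE DATE,
        PRIMARY KEY (CUST_ID)
    );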

Using Designer

The table has been successfully built


1. Close the Database Object Generation box
83

Using Designer

The table GOOD_CUST_STG is for staging good customer records prior to loading them into the data warehouse. It will be used as both a target (when we clean the data) and a source (when we load the clean data into the warehouse). We can reuse the source definition to create the target.

With the Target Designer selected:
1. Expand the Sources folder so that GOOD_CUST_STG is visible
2. Drag the GOOD_CUST_STG object from the Sources directory tree in the Navigation Pane to the Target Designer canvas
84

Using Designer
GOOD_CUST_STG is now set up to be used as both a source and a target in PowerCenter. However, while the table exists in PowerCenter, it does not yet exist in our target Oracle database. Let's build this table in our target Oracle database.

85

Using Designer

Build the table in the Oracle target instance
1. Select Targets > Generate/Execute SQL

86

Using Designer

Connect to the Oracle instance


1. Press Connect

87

Using Designer

1. Press the ODBC data source drop-down menu
2. Select the target database

88

Using Designer

1. Enter the Username target
2. Enter the Password target
3. Press Connect

89

Using Designer

We are now connected to target

Build the GOOD_CUST_STG table


1. Select Selected tables on the radio menu (we only want to build the GOOD_CUST_STG table)
2. Choose the options above (we know the table doesn't exist, but let's drop the table before we build the new one, just in case)
3. Press Generate and execute

90

Using Designer

The table has been successfully built


1. Close the Database Object Generation box
91

Using Designer
If we look back at the directory tree in the Navigation Pane, we will see that we now have three sources:
TRANSACTIONS (flat file)
CUSTOMERS (relational)
GOOD_CUST_STG (relational)
and four targets (all relational):
CUSTOMER_DATES
CUSTOMER_NONAME
GOOD_CUSTOMERS
GOOD_CUST_STG

92

Lab 2: Tutorial Lesson 2


15 min

Creating Source Definitions
Creating Target Definitions and Target Tables

93

Lesson 3: Building a Mapping

Creating a Pass-Through Mapping
Creating Sessions and Workflows
Running and Monitoring Workflows

94

What will we learn in this chapter?

What is a mapping?
What are Transformation Objects? How do we build a mapping? How do we Join sources together? How do we separate out records with missing data?

95

PowerCenter Mapping Designer


A mapping is a logical definition of your data integration process: it represents a set of source and target definitions that are linked by transformation objects.
The Mapping Designer is a graphical drag-and-drop environment that lets you define the sources, the targets, and the transformations that you want to perform on the data. It is an easy-to-use GUI environment for creating, organizing, and maintaining a series of mappings.
96

PowerCenter Transformations
Some examples
Transaction Control, Router, Normalizer, Custom Transformation, Stored Procedure, Lookup, XML Parser, Update Strategy, Source Qualifier, Sorter, Rank, Sequence Generator, Aggregator, Mapplet, Filter, Java, Target Definition, Union, XML Generator, Expression, Joiner, Mapplet Input, Mapplet Output

Transformations used in this mapping: for a detailed description of these transformations and their function, see the tables in Appendix A.

97

PowerCenter Functions

Some examples from the summary view of all available functions; a more complete reference can be found in Appendix B at the end of this guide.

Character manipulation (CONCAT, LTRIM, UPPER, …)
Datatype conversion (TO_CHAR, TO_DECIMAL, …)
Data matching and parsing (REG_MATCH, SOUNDEX, …)
Date manipulation (DATE_COMPARE, GET_DATE_PART, …)
Encryption/encoding (AES_ENCRYPT, COMPRESS, MD5, …)
Financial functions (PV, FV, PMT, RATE, …)
Mathematical operations (LOG, POWER, SQRT, ABS, …)
Trigonometric functions (SIN, SINH, COS, TAN, …)
Flow control and conditional (IIF, DECODE, ERROR, …)
Test and validation (ISNULL, IS_DATE, IS_NUMBER, …)
Library of reusable user-created functions
Variable updates (SETVARIABLE, SETMINVARIABLE, …)
98
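To give a feel for the syntax, here are a few illustrative expressions in the PowerCenter expression language; the functions are standard, but the field names are hypothetical:

    -- Conditional logic with a NULL test
    IIF(ISNULL(CUST_NAME), 'UNKNOWN', UPPER(CUST_NAME))

    -- Datatype conversion: render a date as text
    TO_CHAR(DATE_OF_PURCHASE, 'YYYY-MM-DD')

    -- Value translation with a default result
    DECODE(STATUS, 'A', 'Active', 'I', 'Inactive', 'Unknown')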

In this Scenario
We show how to build mappings with Designer. Mappings are logical processes that define the structure of data and how it is changed as it flows from one or more data sources to target locations. Mappings are the core of the Informatica data integration tool set. With Informatica, transformations and mappings are reusable and can be used in multiple different scenarios.
For our first mapping we need to combine two sets of data for our data warehouse. We also need to separate good records from bad ones that are missing the customer name.

99

Sample Step-by-step Overview Scenario


1. Create a new mapping
2. Join data from two sources: CUSTOMERS and TRANSACTIONS.dat
3. Check whether the Customer Name is missing from any of the records
4. Store those records in the CUSTOMER_NONAME table
5. Write all good records to the staging table GOOD_CUST_STG for loading into the warehouse

100
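Putting the steps together, the finished mapping will have roughly this shape (the transformation names are the ones we will assign later in this lesson):

    TRANSACTIONS.dat (flat file) --+
                                   +--> jnr_many_to_one --> rtr_check_customer_name --+--> GOOD_CUST_STG
    CUSTOMERS (relational) --------+                                                  +--> CUSTOMER_NONAME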

How to build a Mapping

1. Select the Mapping Designer icon

101

How to build a Mapping

1. Validate that the Mapping Designer is active

102

How to build a Mapping

1. Select Mappings > Create to start building a new mapping

103

How to build a Mapping

1. Rename the mapping m_remove_missing_customers 2. Press OK

104

How to build a Mapping

Add the source TRANSACTIONS to the mapping
1. Expand the + next to Sources and Flat File so that TRANSACTIONS and CUSTOMERS are visible (as above)
2. Drag the TRANSACTIONS source into the work area

105

How to build a Mapping

The TRANSACTIONS source is added to the mapping.

106

How to build a Mapping

Add the source CUSTOMERS table to the mapping 1. Click and drag the CUSTOMERS source into the workspace
107

How to build a Mapping

Both TRANSACTIONS and CUSTOMERS are now added to the mapping.

108

How to build a Mapping

Add the target tables to the mapping.
1. Expand the Targets folder
2. While holding CTRL, select the CUSTOMER_NONAME and GOOD_CUST_STG tables
3. Still holding CTRL, drag them onto the workspace
109

How to build a Mapping

110

How to build a Mapping

1. Collapse the Navigation Pane for now to give us more work space (single-click the left icon)
2. Collapse the Output Window at the bottom of our screen (single-click the right icon)

111

How to build a Mapping

Add a Joiner transformation to join the two sources together
1. Single-click on the Joiner transform
2. Single-click in the workspace; the Joiner transformation should appear
112

How to build a Mapping

1. Highlight all of the fields in the TRANSACTIONS Source Qualifier transformation (holding SHIFT, click the first field, then click the last)
2. Still holding SHIFT, drag the selection to the Joiner transformation
113

How to build a Mapping

To join the two sources, we need to add the Customer fields to the Joiner.
1. Highlight the fields in the CUSTOMERS Source Qualifier transformation
2. Drag them to the Joiner
Next, we need to edit the Joiner properties.

1. Double-click on the Joiner

114

How to build a Mapping

1. Click Rename
Remember, all of this metadata will be captured in the PowerCenter Metadata Repository. Since we have the ability to report on the PowerCenter Metadata Repository, we want the names of our transformation objects to be meaningful.

115

How to build a Mapping

1. Rename the joiner jnr_many_to_one 2. Click OK

116

How to build a Mapping

Notice that because each source has a field named CUST_ID, Designer renamed the second instance of CUST_ID to CUST_ID1.

1. Click on the Ports tab

117

How to build a Mapping

Add a join condition
1. Click on the Condition tab
2. Click on the Add condition icon

118

How to build a Mapping

A default condition will be displayed. Since we have two fields with similar names, the default condition will use these two field names.

1. Press OK

119
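The resulting join condition simply equates the two customer ID ports, one from each source (CUST_ID1 being the renamed second instance of CUST_ID noted earlier):

    CUST_ID1 = CUST_ID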

How to build a Mapping

With the data joined, we need to separate good records from those with missing customer names.
1. Click on the Router transformation
2. Click on the workspace to add a Router to the mapping

120

How to build a Mapping

We want to keep all of the fields from the Joiner except CUST_ID1, which is the same as CUST_ID.
1. Hold CTRL and select all fields except for CUST_ID1
2. Drag the selected fields to the Router

We need to tell the Router what conditions to check for.
1. Double-click the Router to edit it
121

How to build a Mapping

Rename the Router
1. Click Rename
2. Type the Transformation Name rtr_check_customer_name
3. Click OK

122

How to build a Mapping

1. Select the Groups tab.

123

How to build a Mapping

The Router groups data based on user defined conditions. All records that meet the Group Filter Condition are included in the output for that group.
We need to create two groups: one for records with a customer name and one for records where the name is missing.
1. Click the Add button twice

124

How to build a Mapping

Rename the Groups
1. Click on the first Group Name
2. Rename the group GOOD_CUSTOMER
3. Click on the second Group Name
4. Rename the group CUSTOMER_NONAME

Next we need to edit the Group Filter Condition
1. Click the arrow on the first condition to open the editor

125

How to build a Mapping

Bad records have a NULL value for the customer name; if the record is not NULL, then it is good.
1. Enter the expression: NOT ISNULL(CUST_NAME)
2. Click Validate to test your expression
3. Click OK to close the message window
4. Click OK to close the Expression Editor

126

How to build a Mapping

1. Open the Expression Editor for CUSTOMER_NONAME

127

How to build a Mapping

1. Enter the expression: ISNULL(CUST_NAME) 2. Click OK

128
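To recap, the Router now has two user-defined groups plus the built-in default group, and each input row is tested against every group filter condition independently:

    GOOD_CUSTOMER:    NOT ISNULL(CUST_NAME)   -- rows that have a customer name
    CUSTOMER_NONAME:  ISNULL(CUST_NAME)       -- rows missing the customer name

Rows matching no group fall into the DEFAULT group, which we leave unconnected here, so such rows are dropped. Because our two conditions are mutually exclusive, every row lands in exactly one group.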

How to build a Mapping

1. Press OK

129

How to build a Mapping

The Router appears. Expand the transformation and scroll down to see the two groups we created.

130

How to build a Mapping


We need to connect the output groups to the appropriate table.
1. Expand the Router transform and scroll until the GOOD_CUSTOMER group is visible
2. Select all of the fields (or ports) under GOOD_CUSTOMER
3. Drag the selected fields to the CUST_ID field on the GOOD_CUST_STG target

Note: When you drag and release, Designer connects the first field in the set being dragged to the field under the cursor when the mouse is released, the second with the second and so on. If your fields are not in matching order you may need to connect them one at a time.
131

How to build a Mapping

1. Connect the CUSTOMER_NONAME group to the CUSTOMER_NONAME target table

Both tables should now be connected

132

How to build a Mapping

We're almost done. A few final details before we finish.

1. Click the disk icon to Save the mapping
2. Click the Toggle Output Window icon to view save status and other messages
3. Verify the mapping is VALID; if it is not, check for error messages
4. Finally, clean up the workspace: right-click and select Arrange All Iconic

133

How to build a Mapping

Congratulations!

You just built your first mapping.

134

Lab 3: 20 min (take 5 min break)

Creating a Pass-Through Mapping

135

Lesson 3: Workflow
Using Workflow Manager and Monitor

136

What will we learn in this section?


1. What is Workflow Manager?
2. How do we build a session task?
3. How do we sequence sessions?
4. How do we execute our mapping?
5. How do we monitor execution with the Workflow Monitor?

137

Informatica Platform
Workflow Manager and Workflow Monitor
[Architecture diagram repeated, highlighting the Workflow Manager (manage) and Workflow Monitor (monitor) client tools within the platform.]

138

Workflow Tasks
Assignment: Assigns a value to a workflow variable.
Command: Specifies a shell command to run during the workflow.
Control: Stops or aborts the workflow.
Decision: Specifies a condition to evaluate.
Email: Sends email during the workflow.
Event-Raise: Notifies the Event-Wait task that an event has occurred.
Event-Wait: Waits for an event to occur before executing the next task.
Session: Runs a mapping you create in the Designer.
Timer: Waits for a timed event to trigger.

139

In this Scenario

We will use the Workflow Manager and Workflow Monitor to build a workflow to execute the mappings we just built. We will configure our workflow and then monitor the workflow in the Workflow Monitor. Along the way, we will investigate the various options in both tool sets.

140

Step-by-step Overview
1. Open the Workflow Manager through Designer
2. Create a session task
3. Configure the session task to run the mappings we just built
4. Investigate the options in the Workflow Manager
5. Monitor the execution of the session in the Workflow Monitor
6. View the run properties and session log in the Workflow Monitor

141

Working with Data

Launch the Workflow Manager 1. Press the Orange W on the tool bar above to launch the Workflow Manager

142

Workflow Manager Tools

Create reusable tasks

Create worklets

Create workflows

143

Using the Workflow Manager

1. Select the Workflow Designer

144

Using the Workflow Manager

1. Select the Session task and drop it in by clicking on the Workflow Designer workspace

145

Using the Workflow Manager

We need to join the sources and remove records with no customer name before we can load them into the Data Warehouse.
1. Select the mapping m_remove_missing_customers
2. Click OK

146

Using the Workflow Manager

Keep all the defaults 1. Press OK

147

Using the Workflow Manager

We need to configure the Session to connect to the source and target structures 1. Double-click the Session task to open and edit it

148

Using the Workflow Manager

1. Select the Mapping tab


2. Select SQ_TRANSACTIONS (under the Sources folder on the left)

149

Using the Workflow Manager

1. Scroll down under Properties until you see Source file directory
2. Enter the location C:\PowerCenter Workshop as the Source file directory
3. Enter TRANSACTIONS.dat as the Source filename

150

Using the Workflow Manager

1. Select SQ_CUSTOMERS under Sources on the left
2. Click the drop-down to select the correct Oracle instance that houses this source table

151

Using the Workflow Manager

The Relational Connection Browser opens

1. Select the Source connection under Objects (this is the Oracle instance where the CUSTOMERS table resides)
2. Press OK

152

Using the Workflow Manager

Configure the target structures
1. Select CUSTOMER_NONAME from the Targets folder on the left
2. Click the drop-down box under Value to open the Relational Connection Browser

153

Using the Workflow Manager

1. Select Target 2. Press OK

154

Using the Workflow Manager


Since all of our target tables exist in the same place, let's take a shortcut.

1. Right-click in the Value box
2. Select Apply Connection Value To All Instances to assign this connection value to all target tables

155

Using the Workflow Manager

1. Review the information for GOOD_CUST_STG
2. Notice Target is already filled in
3. Press OK to close the Session Editor

156

Using the Workflow Manager


The GOOD_CUST_STG staging table may have data in it from previous runs. We want to make sure the only data in the table is for the current run.

1. Under Properties, scroll down and select the Truncate target table option 2. Select OK to close the Session Editor

157

Using the Workflow Manager


Loading the Data Warehouse is a two-part process. The first mapping/session joins the source data, removes bad records, and loads good data into a staging table. The data still needs to be refined and loaded into the Data Warehouse; the second mapping we built does this.

1. Add another Session task to the workspace

158

Using the Workflow Manager

1. Select the mapping m_build_customer_DW 2. Click OK

159

Using the Workflow Manager

The new Session is added. We need to sequence the sessions so they execute in the proper order.
1. Select Link Tasks
2. Click on the left session and drag to the session on the right, so the sessions are connected
3. Double-click the new Session task (on the right) to edit it
160

Using the Workflow Manager

1. Select the Mapping tab

161

Using the Workflow Manager

1. Select SQ_GOOD_CUST_STG under Sources on the left 2. Click the arrow under Value to select the correct Oracle instance for this source table

162

Using the Workflow Manager

In the Relational Connection Browser
1. Select the Target connection under Objects (why? Because the GOOD_CUST_STG table was built in the target instance)
2. Press OK

163

Using the Workflow Manager

Configure the target structures
1. Select CUSTOMER_DATES from the Targets folder on the left
2. Click the arrow under Value to open the Relational Connection Browser

164

Using the Workflow Manager

1. Select Target 2. Press OK

165

Using the Workflow Manager


Since all of our target tables exist in the same place, let's take a shortcut.

1. Right-click in the Value box
2. Select Apply Connection Value To All Instances to assign this connection value to all target tables

166

Using the Workflow Manager

1. Verify GOOD_CUSTOMERS now uses the Target Connection

167

Using the Workflow Manager


The Lookup Transformation also requires a Connection

Under the Transformations folder in the left navigation pane 1. Verify the lkp_product_description Connection Value uses Source, if not click the arrow and update 2. Click OK

168

Using the Workflow Manager


Take a moment to review the configuration options under the other session tabs. 1. Review Properties

Properties allow you to specify log options, recovery strategy, commit intervals for this session in the workflow and so forth. Note in this case the workflow will continue even if the mapping fails.

169

Using the Workflow Manager


1. Review Config Object

The Config Object allows you to specify a variety of Advanced, Logging, Error Handling and Grid related options. Scroll down to view the range of options available.

170

Using the Workflow Manager


1. Review Components
2. Select OK to close the Session Editor

In the Components tab, you can configure pre-session shell commands, post-session commands, email messages for session success or failure, and variable assignments.

171

Using the Workflow Manager

Save the workflow 1. Click Save under the Repository menu

172

Using the Workflow Manager

1. Verify the workflow is VALID; if not, scroll up to check for errors
2. Select Workflows > Start Workflow
173

Using the Workflow Monitor

Workflow Monitor provides a variety of views for monitoring workflows and sessions. This view shows the status of running jobs.

1. Notice that the Workflow Monitor is displayed when you start a workflow 2. Let the task run to completion
174

Using the Workflow Monitor

1. Select the Task View tab

This view allows users to view the tasks associated with a specific workflow. Note that in this case our workflow has two sessions and has been successfully run several times. Your view may vary depending on when and how many times you have run your mapping.

175

Using the Workflow Monitor

1. Select the Gantt Chart view
2. Right-click on the first session in the workflow we just ran
3. Select Get Run Properties
176

Using the Workflow Monitor


1. Review the Task Details
2. Note the session Status
3. Note the number of Source and Target rows

Do the results make sense? Two tables were joined, so we would expect fewer total rows written than read, correct?

1. Click and expand Source/Target Statistics

177

Using the Workflow Monitor


1. Review the Source/Target Statistics
2. Note that 8 rows were written to the CUSTOMER_NONAME table, but 11 rows were rejected
3. Scroll over and check the Last Error Message for that target

It looks like the writer execution failed for some reason with error 8425. Let's take a look at the session log and find out what the 8425 error is.

1. Select the Get Session Log tab (hyperlink at top right)


178

Using the Workflow Monitor

1. Select Find…
2. Enter the error number 8425
3. Select the radio button for All fields
4. Select Find Next

179

Using the Workflow Monitor


It seems we have a unique constraint violation. Most likely there are duplicate primary keys in the data. Let's debug.

1. Close out the session


180

Using the Workflow Manager

In order to debug, let's override writing our data to the CUSTOMER_NONAME table.

1. Open up our Session Task again

181

Using the Workflow Manager

1. Select CUSTOMER_NONAME from the Mapping tab
2. Override the Relational Writer
3. Select the drop-down box

182

Using the Workflow Manager

1. Select File Writer


Notice we can override our target because PowerCenter separates logical design from physical bindings. Specifically, mappings do not include connection information while workflows do; the actual binding occurs at runtime.

183

Using the Workflow Manager

1. Under Properties, scroll down to Header Options
2. Click the drop-down and select Output Field Names
3. Select Set File Properties

184

Using the Workflow Manager

1. Switch the radio button to Delimited
2. Press OK
3. Press OK to exit the Session Editor

185

Using the Workflow Manager

1. Save the changes we made
2. Verify the workflow is VALID
3. Run the workflow again

186

Using the Workflow Monitor

We were now able to load all 19 rows. No rows were rejected.

1. Review the output file

187

Using the Workflow Monitor

1. In Windows Explorer navigate to C:\Informatica\9.0.1\server\infa_shared\TgtFiles 2. Double-click on the customer_noname1.out file

188

Using the Workflow Monitor

As we suspected, we have duplicate Customer IDs and will have to deal with that in our mapping, but we'll save that for another day!
189

Using the Workflow Monitor


Before we finish, we want to verify that our data loaded into the Data Warehouse.

1. Go to Designer and open the Target Designer
2. Right-click on the GOOD_CUSTOMERS target
3. Select Preview Data…

190

Using the Workflow Monitor

1. Verify the ODBC data source is target
2. Enter Username: target
3. Enter Password: target
4. Press Connect

191

Using the Workflow Monitor


Congratulations!!
The Data was Successfully Loaded! You just completed building and loading the Data Warehouse!

192

Lab 3: 30 min
10 min break

Creating Sessions and Workflows
Running and Monitoring Workflows

193

Lesson 4

194

What will we learn in this section?

How do we use a lookup to enrich records with data from another source?
What is a reusable transformation?
How do we use expressions to format data?
How do we use aggregate functions to generate results from a data set?

195

In this Scenario
We will use Designer to build another mapping. Where the last lab focused on joining raw data and removing bad records, this lab focuses on using transformations to convert, enrich, and reformat the data and, finally, load it into the data warehouse.
Specifically, we will be working with the good records that the first mapping loaded into the staging table.

196

PowerCenter Transformations
Some examples
Transaction Control, Router, Normalizer, Custom Transformation, Stored Procedure, Lookup, XML Parser, Update Strategy, Source Qualifier, Sorter, Rank, Sequence Generator, Aggregator, Mapplet, Filter, Java, Target Definition, Union, XML Generator, Expression, Joiner, Mapplet Input, Mapplet Output

Transformations we will use in this lab: for a detailed description, see the tables in Appendix A.

197

Step-by-step Overview
1. Create a new mapping called m_build_customer_DW
2. Get a product description from the PRODUCT table
3. Format customer names and product descriptions so the first letter is upper case
4. For the good data, perform a simple calculation to determine total revenue
5. Collapse any duplicates
6. Load transaction dates into a table so a reporting tool can get the date of a specific transaction
198
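As a preview, the finished mapping will flow roughly as sketched below; the transformation names are the ones we will assign during this lesson:

    GOOD_CUST_STG --> SQ_GOOD_CUST_STG
        PRODUCT_ID        --> lkp_product_description --> PRODUCT_DESC --> exp_format_data
        other ports       --> exp_format_data
        DATEOFTRANSACTION --> exp_formatted_date
    exp_format_data + exp_formatted_date --> agg_revenue --> GOOD_CUSTOMERS and CUSTOMER_DATES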

How to build a Mapping

Starting from the Mapping Designer 1. Select Mappings Create to build a new mapping
199

How to build a Mapping

1. Rename the mapping m_build_customer_DW 2. Press OK

200

Working with Data


Note: This is the source view of the same staging table we wrote the good customer data to in the last lab.

Add the source GOOD_CUST_STG to the mapping 1. Drag the GOOD_CUST_STG source into the work area

201

Working with Data

The GOOD_CUST_STG source is added to the mapping

202

How to build a Mapping

Add the target tables to the mapping
1. Expand the Targets folder
2. Select and drag the GOOD_CUSTOMERS and CUSTOMER_DATES tables onto the workspace
203

How to build a Mapping


All sources and targets are now imported
1. Collapse the Navigation Pane for now to give us more work space
2. Collapse the Output Window at the bottom of our screen

204

Working with Data

The Lookup Transformation will allow us to pull back the product description names from our PRODUCT table. This is required by the end user so they can see exactly what products were purchased by our customers.

Add a Lookup Transformation
1. Click the Lookup Transformation icon once, then single-click in the workspace
205

Working with Data

Select the Lookup Table 1. Click on the Import tab 2. Select From Relational Table

206

Working with Data

We have to connect to the database instance that holds our lookup table. Note that PowerCenter will NEVER override database level security.

1. Click on the ODBC data source drop-down box
2. Select the source ODBC connection

207

Working with Data

1. Enter the Username source
2. Enter the Password source
3. Press Connect

208

Working with Data

1. Expand the TABLES folder
2. Select the PRODUCT table
3. Press OK

209

Working with Data

The Lookup Transformation appears in the workspace. We will use the PRODUCT_ID from the source to look up the data we need.
1. Highlight the PRODUCT_ID field in the Source Qualifier and drag it onto the white space at the bottom of the Lookup

210

Working with Data

Open and configure the Lookup Transformation 1. Double-click on the Lookup

211

Working with Data

1. Press Rename
2. Rename the transformation lkp_product_description
3. Press OK

212

Working with Data

Much like the joiner, the lookup transformation requires a condition to be true for it to pass values. In this case, we want the product ID from the TRANSACTIONS file to match the product ID in the PRODUCTS table. Once there is a match, the lookup will return the proper product description value.

1. Select the Condition tab 2. Click the Add condition button

213
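The lookup condition takes the same shape as the joiner condition: the lookup-table column on one side, the input port we dragged in on the other. Assuming Designer renames the dragged-in port PRODUCT_ID1 (its usual behavior when names collide), the condition reads:

    PRODUCT_ID = PRODUCT_ID1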

Working with Data

We want to return the Product Description that matches the PRODUCT_ID we passed in. Designer automatically identified the correct ports to compare for the lookup; no change is required.
1. Click on the Ports tab

214

Working with Data

Select the Return value

1. Check the box in the R column for PRODUCT_DESC 2. Press OK

215

Working with Data

We would like to do some formatting on our source data. We want the initial character of our customer names and product descriptions to be upper case.

Add an Expression for formatting data
1. Select the Expression Transformation
2. Click on the workspace
216

Working with Data


1. Click and drag all the required fields from the Source Qualifier to the Expression Transformation (skip the PRODUCT_ID field since we only need it for the lookup)

217

Working with Data


We need to add the Lookup output in with the other ports 1. Drag the PRODUCT_DESC port from Lookup to a blank line in the Expression

218

Working with Data

Minimize completed transformations
1. Click on the minimize icon for each completed transformation
2. Next, double-click on the Expression Transformation to edit it

219

Working with Data

1. Press Rename
2. Enter exp_format_data
3. Press OK

220

Working with Data

1. Select the Ports tab
2. Click the Add field button twice to add two output ports
3. Select the first field and rename it CUST_NAME_OUT
4. Select the second field and rename it PRODUCT_DESC_OUT
221

Working with Data

1. Change the precision to 50 for each new port
2. De-select the O (output) flag for the CUST_NAME and PRODUCT_DESC ports (they will be replaced by the new fields)
3. De-select the I (input) flag for the new fields (they originate here and have no input)

When the O is selected, the Expression Editor box on the right will become active.
222

Working with Data

1. Select the Expression box next to the first field, CUST_NAME_OUT (an arrow will appear)
2. Click the arrow to open the Expression Editor

223

Working with Data

Notice the help! Press F1 for more.

1. Edit the Expression
2. Expand the Character folder
3. Select the Initcap function
4. Double-click the function to add it to the formula
224

Working with Data

This is a simple expression telling PowerCenter to capitalize the first letter of the customer's first and last names.

1. Edit the formula so that it matches the one above. Remember, CUST_NAME is the input being modified; CUST_NAME_OUT is the result
2. Press OK to close the editor
225
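Based on the steps above, the two output-port expressions are simply the following (the product description on the next slides follows the same pattern):

    CUST_NAME_OUT:     INITCAP(CUST_NAME)
    PRODUCT_DESC_OUT:  INITCAP(PRODUCT_DESC)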

Working with Data

Repeat for PRODUCT_DESC_OUT 1. Press the down arrow to open the Expression Editor
226

Working with Data

1. Select the Initcap function
2. Edit the formula so it matches the one above
3. Press OK

227

Working with Data


This is how the fields should look now. For housekeeping purposes, move the fields directly below the fields with which they correspond.

1. Click the row number at left and use the black arrows to move the row up or down in the list

228

Working with Data

1. Validate that the Ports are in the proper order 2. Press OK

229

Working with Data

Next we need to format our date. In our flat file, the date is an 8-character string. We need to convert that string to a date format so that it matches the format the target database (Oracle) is expecting.

1. Validate the mapping; it should look like this
2. Open the Navigation Pane

230

Working with Data

exp_formatted_date is a shareable, reusable transformation

1. Select the Transformation Developer
2. Expand the Transformations folder in the left Navigation Pane
3. Drag the exp_formatted_date transformation onto the workspace
4. Double-click the transformation to edit it
231

Working with Data

1. Select the Ports tab 2. Open the Expression Editor for the formatted_date port

232

Working with Data

1. Review the expression formula
2. Press OK
3. Press OK on the next screen to close the Edit Transformations window
233
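The exact formula lives in the screenshot, but converting an 8-character date string in the PowerCenter expression language uses TO_DATE with a format mask. Assuming the file stores dates as MMDDYYYY (an assumption; use whatever mask matches your data), the expression would be:

    TO_DATE(DATE_IN, 'MMDDYYYY')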

Working with Data


Because this is a reusable transformation, any changes we make will propagate to every mapping that uses the transformation. We may want to know what mappings include this transformation before we change it. To do this we run a dependency report.

1. Right click on the transformation 2. Select Dependencies

234

Working with Data

1. Select any Object Types that should be included in the report 2. Press OK

235

Working with Data

1. Review the report content. In this case there are no dependencies. 2. Close the report

236

Working with Data

1. Click and drag the exp_formatted_date expression to the workspace

237

Working with Data

1. Link the DATEOFTRANSACTION port to the DATE_IN port on the new Expression Transformation 2. Add an Aggregator to our mapping

238

Working with Data

We need to calculate the total revenue for each customer. The Aggregator transformation performs these types of functions on groups of data. It can also help collapse records based on a grouping criterion (CUST_ID in this case), eliminating duplicate sets of results.
239

Working with Data


1. Map the output ports from the two expressions to the Aggregator transformation
2. Minimize the Expressions now that we are done with them
3. Double-click the Aggregator to edit the transformation properties

240

Working with Data

Rename the Aggregator Transformation

1. Click Rename 2. Name the transformation agg_revenue

241

Working with Data

Update port names and build the aggregate calculation
1. Select the Ports tab
2. Remove the _OUT suffix from the CUST_NAME and PRODUCT_DESC ports
3. Click the Add new port button once

A new port is added to the Aggregator


242

Working with Data

1. Rename NEWFIELD to TOTAL_REVENUE
2. Change the Datatype to Double
3. De-select the I so the Expression Editor becomes available
4. Click the arrow to open the Expression Editor
243

Working with Data

1. Build the expression shown above 2. Press OK

244
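The actual formula is in the screenshot; a typical revenue calculation over the grouped rows would look like the sketch below, where PRICE and QUANTITY are hypothetical stand-ins for whatever amount fields your staging table carries:

    TOTAL_REVENUE:  SUM(PRICE * QUANTITY)

Because SUM is an aggregate function, it is evaluated once per group, according to the GroupBy port we set next.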

Working with Data

Our calculation computes the total revenue by customer. To accomplish this, data needs to be grouped by Customer ID.
1. Check the GroupBy box for the CUST_ID port
2. Press OK

245

Working with Data

We are ready to map the fields from the aggregator to the GOOD_CUSTOMERS table

1. Select the relevant ports from the Aggregator 2. Map the selected fields to the matching ports on the GOOD_CUSTOMERS target table

246

Working with Data

We want to map three of the fields in the Aggregator to our second target, CUSTOMER_DATES. The CUST_ID field will go to both targets.

1. Select the relevant ports from the Aggregator to map to CUSTOMER_DATES
2. Connect the selected fields to the matching ports on the target table

247

Working with Data

Almost done. Let's apply the finishing touches.
1. Save the mapping
2. Verify the mapping is VALID
3. Clean up: right-click anywhere in the workspace and select Arrange All Iconic
248

Working with Data

Congratulations!
You are now ready to load your data into the Data Warehouse.

249

Lab 4: 1 hr

250

Using the Debugger

251

What will we learn in this chapter?


What is the Debugger?
How do we use the Debugger?
What are the advantages of using the Debugger?
What are some of the options in the Debugger?

252

In this Scenario
As a developer, you want to test the mapping you built prior to running the data, to ensure that the logic in the mapping will work. For this lab we will use a pre-built mapping to review the features of the Debugger.

253

Step-by-step Overview
1. Open the Debugger lab folder
2. Run the Debugger
3. Configure the session with the Debugger Wizard
4. Edit breakpoints
5. Step through the mapping

6. Monitor results

254

Using the Debugger

Open the DebuggerLab


1. Right-click on DebuggerLab folder and select Open
255

Using the Debugger

1. Open the Mapping Designer
2. Expand the Mappings folder
3. Drag M_DebuggerLab to the Mapping Designer workspace

256

Using the Debugger

Debugger Toolbar:
Start Debugger
Stop the Debugger
Next Instance
Step to Instance
Show Current Instance
Continue
Break Now
Edit Breakpoints

257

Using the Debugger

1. Start the Debugger

258

Using the Debugger

1. Review the pre-requisites 2. Press Next

259

Using the Debugger

1. Select Int_Workshop_Service as the Integration Service on which to run the debug session
2. Leave the defaults
3. Click Next
260

Using the Debugger

1. Choose the Source and Target database connections
2. Leave the default values of Source and Target
3. Click Next

261

Using the Debugger

Configure Session parameters. No change.


1. Click Next

262

Using the Debugger

Configure Session parameters. No change. 1. Click Next

263

Using the Debugger


Do we want to physically load the table or roll back the data before commit in the debug session? In this case we just want to test the mapping, not actually load the data.

Configure Session parameters
1. Check Discard target data
2. Click Finish to start the session

264

Using the Debugger

Let's adjust the toolbars so it is easier to work with the Debugger.
1. Right-click on the toolbar and deselect the Advanced Transformations toolbar
2. Repeat and select Debugger so the toolbar is visible

265

Using the Debugger

1. Select Edit Breakpoints to establish breakpoints to stop the debug session at specific transformations

266

Using the Debugger

We want to establish a breakpoint at the Expression Transformation


1. Select the EXPTRANS object

267

Using the Debugger

1. Select the Add button to create a breakpoint at the Expression transformation
2. Under Condition, click the Add box to set the breakpoint rules
3. Edit the rule so that it will stop when CUST_ID = 325
4. Click OK
268
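The completed data breakpoint can be read as: pause the debug session at EXPTRANS whenever an incoming row satisfies the condition:

    EXPTRANS (data breakpoint):  CUST_ID = 325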

Using the Debugger

1. Notice the Stop Sign Breakpoint set at EXPTRANS Transformation

269

Using the Debugger

Debugger layout: the Debugger menu, breakpoint markers, the Output Window (Debugger or Session Log), the Target Instance Window, and the Transformation Instance Data Window.

270

Using the Debugger

From the Debugger Toolbar:
1. Click Next Instance to step into the mapping
2. Review values and outputs in the debug panes (examine values, see the output)
3. Continue to step through and monitor changes

271

Using the Debugger

1. Click Next Instance until 9 records have been processed
2. Monitor the Output window below
3. Click Stop the Debugger

272

Using the Debugger

When the Debugger closes, Designer returns to the normal design view

1. Note the shutdown complete message in the output window

273

Additional Informatica Resources

274

Informatica Community
my.informatica.com

275

My.informatica.com Assets
Searchable knowledge base
Online support and service request management
Product documentation and demos
Comprehensive partner sales, support and training tools
Velocity, Informatica's implementation methodology
Educational services offerings

Mapping Templates
Link to DevNet
Many more

276

Developer Network
devnet.informatica.com

277

Informatica Partner Enablement


Presales and Implementation Roles

278

Welcome to
What is beINFORMed?
Informatica's Partner Home
A variety of online tools and resources to help you sell and deliver Informatica solutions

Answers your Informatica questions


Available 24 hours a day, 7 days a week

Where is beINFORMed?
URL: http://partners.informatica.com/

279

How to Register ...


http://partners.informatica.com

Once you log in, you can change your password.

Log in and register today as a "NEW USER". The system will lead you through the application process.

280

What does beINFORMed Offer?


Enablement
Become an expert on selling Informatica, positioning Informatica solutions and implementing Informatica technology. Follow well-defined learning paths for your role (sales, presales, implementation) to become enabled.

Software
To ensure that you are successful deploying the Informatica platform, we offer you internal training and demonstration software.

Resource Center
A one-stop shop for technical, marketing, and sales information around Informatica's products, solutions and programs.

Manage Sales Leads and Register for Referral Fees


You can always use a little extra help when it comes time to meet with your customer...we're here to support you.

Marketing Center
Review and participate in joint programs to drive pipeline

281

beINFORMed
What It Looks Like
Increase your Informatica skills
Request software
Log your opportunities
Find resources
Do joint marketing

282

beINFORMed
Submit and Manage Software Requests

Submit Requests

Track approval through fulfillment

283

beINFORMed
Enablement Paths: Your Steps to Success
Increase your selling skills

Understand solutions

Hone your technical, hands-on Skills

Fast, comprehensive solution information

284

beINFORMed
Current Pre-Sales Enablement Paths

Three path areas (Solution Basics, Demos and Positioning, Support & Awareness) are offered at two levels:

INFORMATICA
Solution Basics: Chalktalk eLearnings, download software, solution positioning presentations, topical INFOCenters
Demos and Positioning: Chalktalk eLearnings, demos/recordings, Solution Starter kits
Support & Awareness: POC requirements docs, 24x7 support, Solution Webinar series

DELIVERY
Solution Basics: beINFORMed
Demos and Positioning: beINFORMed SC workshops, VMWare image
Support & Awareness: Phone/email support, SC pairing/mentoring

285

beINFORMed
Presales Accreditation 2010-2011: Next Steps

Presales accreditation is offered on Platform, DI, DQ, MDM and other Informatica solution areas, at three levels:

Silver Accreditation (Presentation and Positioning Expert)
Delivery: beINFORMed, Alliances Webinar Series, Solution InfoCenters, eLearnings

Gold Accreditation (Demo and Presentation Expert)
Delivery: beINFORMed, Solution InfoCenters, online SC webinars, eLearnings, demo recordings/scripts, modular web-based consumption

Platinum Accreditation (POC, Demo and Presentation Expert)
Delivery: beINFORMed, Solution InfoCenters, SC bootcamps, VMWare-based POC scenarios, POC reviews and validation, POC shadowing

Success measures for every level: manual review process in 2010; automated review process per solution area in 2011

286

beINFORMed (From Enablement Page)


Solutions at your fingertips

287

beINFORMed
Comprehensive Resource Center

Find hot information, collateral, demos

Search by category or solution area

288

beINFORMed
Implementer Enablement Paths: Data Quality
Follow the Initial Steps

Stay in touch, resolve issues

Identify proper Education

289

Step 1: QuickStart

6 eLearnings: QuickStart, Metadata Manager, New Features, Unified Security, Data Analyzer, Real-Time
Software: download software for training or evaluation purposes
Guides: Installation Guide, Getting Started
Documentation: Install Guide, Getting Started Guide, User Guide
Demos: Real-Time Edition, Metadata Manager, Data Masking

290

Step 2: Education
Global Education Services

PowerCenter 8.x Level 1 Developer: 4-day course (virtual or classroom based) - More Details >>
PowerCenter 8.x Level 2 Developer: 4-day course (virtual or classroom based) - More Details >>
PowerCenter 8.x+ Administration: 4-day course (virtual or classroom based) - More Details >>
Metadata Manager 8.6: 3-day course (virtual or classroom based) - More Details >>

291

Step 3: Services
During projects you can use the following services:

Global Customer Support (24x7 support)
Raise a service request via email or the web
Search our knowledge base via http://my.informatica.com
Phone (North America: +1 866 563 6332)

Professional Services
For initial engagements, DI experts can be contracted to complement your team

Velocity Methodology (available for partners; Informatica best practices)
Search for "Velocity" on beINFORMed
PowerCenter Baseline Architecture
PowerExchange CDC Deployment for Mainframe Systems
Data Migration Jumpstart

292

Step 4: Certification
Global Education Services

Informatica Certified Developer
PowerCenter QuickStart eLearning
PowerCenter 8.x+ Administrator course
PowerCenter Developer 8.x Level I course
PowerCenter Developer 8.x Level II course
Three exams

Informatica Certified Consultant
PowerCenter QuickStart eLearning
PowerCenter 8.x+ Administrator course
PowerCenter Developer 8.x Level I course
PowerCenter Developer 8.x Level II course
Five exams

293

beINFORMed
Lead Management: Opportunity to Close

Register leads
Obtain sales support
Collaborate with Alliances

Log your opportunities

Report on joint pipeline
Receive referral fees

294

beINFORMed
Joint Marketing: Leverage Existing Programs and Content
Find Marketing Info & Opportunities

Do joint PR

Download Programs In a Box

295

Putting it all together

296

Suggested Naming Conventions


Targets: T_TargetName
Mappings: m_MappingName
Sessions: s_MappingName
Workflows: wf_WorkflowName
Transformations (in a mapping): a two- or three-letter transformation abbreviation followed by a name, e.g. Expression = exp_ExpressionName

These are some examples; a more complete set of transformation naming conventions can be found in Appendix C at the end of this guide. Hypothetical names for the lab that follows are sketched below.
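Applied to the lab in the next section, one consistent set of names would be the following. Only T_Employee_Summary is mandated by the lab; the rest are hypothetical illustrations of the convention:

    T_Employee_Summary      (target)
    m_Employee_Summary      (mapping)
    s_m_Employee_Summary    (session)
    wf_Employee_Summary     (workflow)
    exp_Format_Names        (Expression transformation)
    agg_Net_Revenue         (Aggregator transformation)
    lkp_Employee_Name       (Lookup transformation)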
297

In this Scenario
You are the regional manager for a series of car dealerships. Management has asked you to track the progress of your employees. Specifically, you need to capture:
Employee name
Name of the dealership they work at
What they have sold
How much they have sold (net revenue)

298

Step-by-step Overview
1. Create a new target definition to use in the mapping, and create a target table based on the new target definition.
2. Create a mapping using the new target definition. You will add the following transformations to the mapping:
Lookup transformation: finds the names of the employees, the dealerships they work at, and the products they have sold.
Aggregator transformation: calculates the net revenue each employee has generated.
Expression transformation: formats all employee names and product descriptions.
3. Create a workflow to run the mapping in the Workflow Manager.
4. Monitor the workflow in the Workflow Monitor.

299

Step 1: Sources and Targets


1. Open up the MappingLab folder
2. Use the mm_data user (user id/password: mm_data/mm_data) to bring in the mm_transactions table
3. Create your own Oracle target in the Target Designer, in the Oracle target instance (an equivalent DDL sketch follows this list):
Target should be named T_Employee_Summary
Columns should include:
EMP_NAME (varchar2, length 50, primary key)
PRODUCT_SOLD (varchar2, length 50)
DEALERSHIP (varchar2, length 50)
NET_REVENUE (number) - written with an underscore, since Oracle column names cannot contain unquoted spaces
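For reference, a minimal sketch of the Oracle DDL equivalent to this target definition is shown below. The constraint name is an assumption, and the script the Target Designer actually generates may differ in details:

    CREATE TABLE T_EMPLOYEE_SUMMARY (
        EMP_NAME      VARCHAR2(50) NOT NULL,   -- employee name, primary key
        PRODUCT_SOLD  VARCHAR2(50),            -- product description
        DEALERSHIP    VARCHAR2(50),            -- dealership name
        NET_REVENUE   NUMBER,                  -- revenue minus cost
        CONSTRAINT PK_T_EMPLOYEE_SUMMARY PRIMARY KEY (EMP_NAME)
    );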

300

Step 2: Mapping
1. Open up the Mapping Designer
2. Create a new mapping; call it whatever you like
3. Bring in the mm_transactions source and the T_Employee_Summary target
4. Find the dealership name (hint: use the mm_data user, as all dealership names are kept in the mm_dealership table)
5. Find the product description (hint: use the mm_data user, as all product descriptions are kept in the mm_product table)
6. Find the employee name (hint: use the mm_data user, as all employee names are kept in the mm_employees table)
7. Format the employee name and make sure the name is capitalized
8. Format the product description and make sure the initial letters are capitalized
9. Calculate net revenue (hint: keep it simple, net revenue is revenue minus cost)
10. Group by Employee_ID to collapse all unique employees
11. Map to the target table
A SQL sketch of the equivalent logic follows this list.
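As a cross-check of the mapping logic, here is a minimal SQL sketch of the same result set. All column names (EMPLOYEE_ID, DEALERSHIP_ID, PRODUCT_ID, REVENUE, COST, EMP_NAME, DEALERSHIP_NAME, DESCRIPTION) are assumptions; inspect the actual tables for the real names:

    -- Sketch of the set logic the mapping implements; column names are assumed.
    SELECT INITCAP(e.EMP_NAME)       AS EMP_NAME,       -- step 7: Expression, capitalize name
           INITCAP(p.DESCRIPTION)    AS PRODUCT_SOLD,   -- step 8: Expression, capitalize description
           d.DEALERSHIP_NAME         AS DEALERSHIP,     -- step 4: Lookup on mm_dealership
           SUM(t.REVENUE - t.COST)   AS NET_REVENUE     -- steps 9-10: Aggregator
    FROM   mm_transactions t
    JOIN   mm_employees  e ON e.EMPLOYEE_ID   = t.EMPLOYEE_ID    -- step 6: Lookup on mm_employees
    JOIN   mm_dealership d ON d.DEALERSHIP_ID = t.DEALERSHIP_ID  -- step 4: Lookup on mm_dealership
    JOIN   mm_product    p ON p.PRODUCT_ID    = t.PRODUCT_ID     -- step 5: Lookup on mm_product
    GROUP  BY t.EMPLOYEE_ID, INITCAP(e.EMP_NAME),
              INITCAP(p.DESCRIPTION), d.DEALERSHIP_NAME;

Note one difference: the Aggregator groups only by Employee_ID and returns the last row's value for the non-grouped ports, whereas standard SQL requires every non-aggregated column to appear in the GROUP BY, which is why the sketch groups by more columns.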
301

Step 3: Build the workflow


1. Open Workflow Manager
2. Create a new workflow
3. Use a Session task to execute the mapping
4. Configure the Session task
5. Execute the mapping

302

Step 4: Monitor workflow


1. Open Workflow Monitor
2. Monitor the workflow
3. Debug using the session log as necessary
4. Preview the data to be sure it worked properly
Hint: To preview data, go to the Designer, open the Target Designer, right-click on T_Employee_Summary, select Preview Data, and enter the username and password (target/target). A SQL alternative is sketched below.
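If you have a SQL client handy, an equivalent sanity check is to query the target schema directly (a sketch, run as the target user; column names as created in Step 1):

    -- Hypothetical verification query against the loaded target table
    SELECT EMP_NAME, DEALERSHIP, PRODUCT_SOLD, NET_REVENUE
    FROM   T_EMPLOYEE_SUMMARY
    ORDER  BY NET_REVENUE DESC;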

303

Thank You!

304

Appendix A: PowerCenter Transformations

305

Transformation Objects
Aggregator: Performs aggregate calculations.
Application Source Qualifier: Represents the rows that the PowerCenter Server reads from an application, such as an ERP source, when it runs a session.
Custom: Calls a procedure in a shared library or DLL.
Data Masking: Replaces sensitive production data with realistic test data for non-production environments.
Expression: Calculates a value.
External Procedure: Calls a procedure in a shared library or in the COM layer of Windows.
Filter: Filters data.
HTTP: Connects to an HTTP server to read or update data.
Input: Defines mapplet input rows. Available in the Mapplet Designer.
Java: Executes user logic coded in Java. The byte code for the user logic is stored in the repository.
Joiner: Joins data from different databases or flat file systems.
Lookup: Looks up values.
Normalizer: Source qualifier for COBOL sources. Can also be used in the pipeline to normalize data from relational or flat file sources.
Output: Defines mapplet output rows. Available in the Mapplet Designer.
Rank: Limits records to a top or bottom range.

306

Transformation Objects
Router: Routes data into multiple transformations based on group conditions.
Sequence Generator: Generates primary keys.
Sorter: Sorts data based on a sort key.
Source Qualifier: Represents the rows that the PowerCenter Server reads from a relational or flat file source when it runs a session.
SQL: Executes SQL queries against a database.
Stored Procedure: Calls a stored procedure.
Transaction Control: Defines commit and rollback transactions.
Union: Merges data from different databases or flat file systems.
Unstructured Data: Transforms data in unstructured and semi-structured formats.
Update Strategy: Determines whether to insert, delete, update, or reject rows.
XML Generator: Reads data from one or more input ports and outputs XML through a single output port.
XML Parser: Reads XML from one input port and outputs data to one or more output ports.
XML Source Qualifier: Represents the rows that the Integration Service reads from an XML source when it runs a session.

307

Appendix B: PowerCenter Functions

308

Aggregate Functions
AVG: Returns the average of all values in a group.
COUNT: Returns the number of records with non-null values in a group.
FIRST: Returns the first record in a group.
LAST: Returns the last record in a group.
MAX: Returns the maximum value, or latest date, found in a group.
MEDIAN: Returns the median of all values in a selected port.
MIN: Returns the minimum value, or earliest date, found in a group.
PERCENTILE: Calculates the value that falls at a given percentile in a group of numbers.
STDDEV: Returns the standard deviation for a group.
SUM: Returns the sum of all records in a group.
VARIANCE: Returns the variance of all records in a group.

309

Character Functions
ASCII: In ASCII mode, returns the numeric ASCII value of the first character of the string passed to the function. In Unicode mode, returns the numeric Unicode value of the first character.
CHR: Returns the ASCII or Unicode character corresponding to the specified numeric value.
CHRCODE: In ASCII mode, returns the numeric ASCII value of the first character of the string passed to the function. In Unicode mode, returns the numeric Unicode value of the first character.
CONCAT: Concatenates two strings.
INITCAP: Capitalizes the first letter in each word of a string and converts all other letters to lowercase.
INSTR: Returns the position of a character set in a string, counting from left to right.

310

Character Functions (cont'd)


LENGTH: Returns the number of characters in a string, including trailing blanks.
LOWER: Converts uppercase string characters to lowercase.
LPAD: Adds a set of blanks or characters to the beginning of a string, to set the string to a specified length.
LTRIM: Removes blanks or characters from the beginning of a string.
METAPHONE: Encodes characters of the English language alphabet (A-Z). It encodes both uppercase and lowercase letters in uppercase.
REPLACECHR: Replaces characters in a string with a single character or no character.
REPLACESTR: Replaces characters in a string with a single character, multiple characters, or no character.
RPAD: Converts a string to a specified length by adding blanks or characters to the end of the string.
RTRIM: Removes blanks or characters from the end of a string.
SOUNDEX: Works for characters in the English alphabet (A-Z). It uses the first character of the input string as the first character in the return value and encodes the remaining three unique consonants as numbers.
SUBSTR: Returns a portion of a string.
UPPER: Converts lowercase string characters to uppercase.
311

Conversion Functions
TO_BIGINT: Converts a string or numeric value to a bigint value.
TO_CHAR: Converts numeric values and dates to text strings.
TO_DATE: Converts a character string to a date datatype in the same format as the character string.
TO_DECIMAL: Converts any value (except binary) to a decimal.
TO_FLOAT: Converts any value (except binary) to a double-precision floating point number (the Double datatype).
TO_INTEGER: Converts any value (except binary) to an integer by rounding the decimal portion of a value.

312

Data Cleansing Functions


GREATEST: Returns the greatest value from a list of input values.
IN: Matches input data to a list of values.
INSTR: Returns the position of a character set in a string, counting from left to right.
IS_DATE: Returns whether a value is a valid date.
IS_NUMBER: Returns whether a string is a valid number.
IS_SPACES: Returns whether a value consists entirely of spaces.
ISNULL: Returns whether a value is NULL.
LEAST: Returns the smallest value from a list of input values.
LTRIM: Removes blanks or characters from the beginning of a string.
METAPHONE: Encodes characters of the English language alphabet (A-Z). It encodes both uppercase and lowercase letters in uppercase.

313

Data Cleansing Functions (cont'd)


REG_EXTRACT: Extracts subpatterns of a regular expression within an input value.
REG_MATCH: Returns whether a value matches a regular expression pattern.
REG_REPLACE: Replaces characters in a string with another character pattern.
REPLACECHR: Replaces characters in a string with a single character or no character.
REPLACESTR: Replaces characters in a string with a single character, multiple characters, or no character.
RTRIM: Removes blanks or characters from the end of a string.
SOUNDEX: Encodes a string value into a four-character string.
SUBSTR: Returns a portion of a string.
TO_BIGINT: Converts a string or numeric value to a bigint value.
TO_CHAR: Converts numeric values and dates to text strings.
TO_DATE: Converts a character string to a date datatype in the same format as the character string.
TO_DECIMAL: Converts any value (except binary) to a decimal.
TO_FLOAT: Converts any value (except binary) to a double-precision floating point number (the Double datatype).
TO_INTEGER: Converts any value (except binary) to an integer by rounding the decimal portion of a value.

314

Date Functions
ADD_TO_DATE: Adds a specified amount to one part of a date/time value, and returns a date in the same format as the specified date.
DATE_COMPARE: Returns a value indicating the earlier of two dates.
DATE_DIFF: Returns the length of time between two dates, measured in the specified increment (years, months, days, hours, minutes, or seconds).
GET_DATE_PART: Returns the specified part of a date as an integer value, based on the default date format of MM/DD/YYYY HH24:MI:SS.
IS_DATE: Returns whether a string value is a valid date.
LAST_DAY: Returns the date of the last day of the month for each date in a port.
MAKE_DATE_TIME: Returns the date and time based on the input values.
ROUND: Rounds one part of a date.
SET_DATE_PART: Sets one part of a date/time value to a specified value.
TO_CHAR (DATE): Converts Date/Time values to character strings.
TRUNC: Truncates dates to a specific year, month, day, hour, or minute.

315

Encoding Functions
AES_DECRYPT: Returns decrypted data to string format.
AES_ENCRYPT: Returns data in encrypted format.
COMPRESS: Compresses data using the zlib compression algorithm.
CRC32: Returns a 32-bit Cyclic Redundancy Check (CRC32) value.
DEC_BASE64: Decodes the value and returns a string with the binary data representation of the data.
DECOMPRESS: Decompresses data using the zlib compression algorithm.
ENC_BASE64: Encodes data by converting binary data to string data using Multipurpose Internet Mail Extensions (MIME) encoding.
MD5: Calculates the checksum of the input value. The function uses Message-Digest algorithm 5 (MD5).

316

Financial Functions
FV: Returns the future value of an investment, where you make periodic, constant payments and the investment earns a constant interest rate.
NPER: Returns the number of periods for an investment based on a constant interest rate and periodic, constant payments.
PMT: Returns the payment for a loan based on constant payments and a constant interest rate.
PV: Returns the present value of an investment.
RATE: Returns the interest rate earned per period by a security.

317

Numeric Functions
ABS: Returns the absolute value of a numeric value.
CEIL: Returns the smallest integer greater than or equal to the specified numeric value.
CONVERT_BASE: Converts a number from one base value to another base value.
CUME: Returns a running total of all numeric values.
EXP: Returns e raised to the specified power (exponent), where e = 2.71828183.
FLOOR: Returns the largest integer less than or equal to the specified numeric value.
LN: Returns the natural logarithm of a numeric value.
LOG: Returns the logarithm of a numeric value.
MOD: Returns the remainder of a division calculation.
MOVINGAVG: Returns the average (record-by-record) of a specified set of records.
MOVINGSUM: Returns the sum (record-by-record) of a specified set of records.
POWER: Returns a value raised to the specified exponent.
RAND: Returns a random number between 0 and 1.
ROUND: Rounds numbers to a specified digit.
SIGN: Notes whether a numeric value is positive, negative, or 0.
SQRT: Returns the square root of a positive numeric value.
TRUNC: Truncates numbers to a specific digit.

318

Scientific Functions
COS: Returns the cosine of a numeric value (expressed in radians).
COSH: Returns the hyperbolic cosine of a numeric value (expressed in radians).
SIN: Returns the sine of a numeric value (expressed in radians).
SINH: Returns the hyperbolic sine of a numeric value (expressed in radians).
TAN: Returns the tangent of a numeric value (expressed in radians).
TANH: Returns the hyperbolic tangent of a numeric value (expressed in radians).

319

Special Functions
ABORT: Stops the session and issues a specified error message.
DECODE: Searches a port for the specified value.
ERROR: Causes the PowerCenter Server to skip a record and issue the specified error message.
IIF: Returns one of two values you specify, based on the results of a condition.
LOOKUP: Searches for a value in a lookup source column. Informatica recommends using the Lookup transformation.

320

String Functions
CHOOSE: Chooses a string from a list of strings based on a given position.
INDEXOF: Finds the index of a value among a list of values.
REVERSE: Reverses the input string.

321

Test Functions
IS_DATE: Returns whether a value is a valid date.
IS_NUMBER: Returns whether a string is a valid number.
IS_SPACES: Returns whether a value consists entirely of spaces.
ISNULL: Returns whether a value is NULL.

322

Variable Functions
SETCOUNTVARIABLE: Counts the rows evaluated by the function and increments the current value of a mapping variable based on the count.
SETMAXVARIABLE: Sets the current value of a mapping variable to the higher of two values: the current value of the variable or the value specified. Returns the new current value.
SETMINVARIABLE: Sets the current value of a mapping variable to the lower of two values: the current value of the variable or the value specified. Returns the new current value.
SETVARIABLE: Sets the current value of a mapping variable to a value you specify. Returns the specified value.
SYSTIMESTAMP: Returns the current date and time of the node hosting the Integration Service, with precision to the nanosecond.

323

Appendix C: Transformation Naming Conventions

324

Transformation Naming
Each object in a PowerCenter repository is identified by a unique name. This allows PowerCenter to efficiently manage and track statistics all the way down to the object level. When an object is created, PowerCenter automatically generates a unique name. These names, however, do not reflect project/repository specific context. As a best practice Informatica recommends the following convention for naming PowerCenter objects:

<Object Type Abbreviation>_<Descriptive Name>


Abbreviations are usually 1-3 letters, the minimum needed to easily identify the object type. For example:
agg_quarterly_total (an Aggregator that computes quarterly totals)
m_load_dw (a mapping that loads the data warehouse)
exp_string_to_date (an Expression that converts a string to a date)

325

Suggested Naming Conventions


Aggregator: AGG_TransformationName
Application Source Qualifier: ASQ_TransformationName
Custom: CT_TransformationName
Expression: EXP_TransformationName
External Procedure: EXT_TransformationName
Filter: FIL_TransformationName
Joiner: JNR_TransformationName
Lookup: LKP_TransformationName
MQ Source Qualifier: SQ_MQ_TransformationName
Normalizer: NRM_TransformationName
Rank: RNK_TransformationName
326

Suggested Naming Conventions


Router: RTR_TransformationName
Sequence Generator: SEQ_TransformationName
Sorter: SRT_TransformationName
Stored Procedure: SP_TransformationName
Source Qualifier: SQ_TransformationName
Transaction Control: TC_TransformationName
Union: UN_TransformationName
Update Strategy: UPD_TransformationName
XML Generator: XG_TransformationName
XML Parser: XP_TransformationName
XML Source Qualifier: XSQ_TransformationName
327

Lesson 6: Lab - Homework

328
