
Introduction to Informatica

Agenda
Overview of Informatica PowerCenter
PowerCenter: Terminology and Architecture
Overview of Informatica Client components
Overview of Informatica Connectivity
Overview of Informatica Flow
Source Qualifier Transformation
Expression Transformation
Aggregator Transformation
Sorter Transformation
Rank Transformation
Joiner Transformation

Course Prerequisites
To gain full benefit from this session, you should have a strong working knowledge of:
Relational database management systems (RDBMS)
Structured Query Language (SQL)
Windows GUI
Open Database Connectivity (ODBC)

3

About Informatica
Company founded in 1993
A recognized leader in enterprise solution products
Headquarters in Redwood City, CA
Informatica has more than 4,600 customers
Worldwide distributorship

Informatica Products

The platform includes:
Informatica PowerCenter
Informatica PowerExchange
Informatica Data Explorer
Informatica Data Quality
Informatica B2B Data Transformation
Informatica B2B Data Exchange
Informatica Identity Resolution

Informatica Versions
Informatica PowerMart 4.x
Informatica PowerCenter 5.x
Informatica PowerCenter 6.x
Informatica PowerCenter 7.x
Informatica PowerCenter 8.1.x, 8.5.x, 8.6.x, 9.x

6

Informatica Resources
www.informatica.com provides product and service information:
Professional Services
Education Services
my.informatica.com, sign up to access:
Technical Support
Product Documentation
Velocity (development/process methodology)
Knowledgebase
Webzine
Mapping templates
devnet.informatica.com, sign up for the Informatica Developers Network:
Discussion Forums
Web Seminars
Technical Papers

Informatica PowerCenter Architecture

[Architecture diagram: heterogeneous sources feed the PowerCenter Server through native or ODBC connections, and the Server loads heterogeneous targets the same way. The client tools (Designer, Repository Manager, Workflow Manager, Workflow Monitor) and the PowerCenter Server communicate with the Repository Server over TCP/IP; the Repository Agent accesses the repository database over a native connection.]

10

PowerCenter 8 Architecture
[PowerCenter 8 architecture diagram: within a domain, the Integration Service reads sources and loads targets via native drivers or ODBC; the Repository Service and its service process manage the repository over native drivers; the PowerCenter Client connects over TCP/IP and the Administration Console over HTTP.]

11

Definitions
Service: a logical representation of a server or process, e.g. the Repository Service or the Integration Service (IS)
Node: a logical representation of hardware; started when you start Informatica Services
Domain: a logical collection of nodes and services, which you can group in folder-like deployments
Service Process: the physical instance of a service on a node, i.e. a process or set of processes

Connectivity Overview
[Connectivity diagram: the client tools (Repository Manager, Designer, Server Manager) reach sources and targets via native drivers or ODBC, the repository via ODBC, and the Server over the network protocol; the Server connects to sources and targets via native/ODBC and to the repository natively.]
12

Power Center: Terminology


Folders: Folders provide a way to organize and store all metadata in the repository, including mappings and sessions. They are used to store sources, transformations, cubes, dimensions, mapplets, business components, targets, mappings, sessions, and batches.
Mapping: Mappings represent the data flow between sources and targets.
Transformations: A transformation is a repository object that generates, modifies, or passes data. The Designer provides a set of transformations that perform specific functions. For example, an Aggregator transformation performs calculations on groups of data.
Session: A session is a set of instructions that tells the Informatica server how to move data from sources to targets.
Workflow: A workflow is a set of instructions on how to execute tasks such as sessions, emails, and shell commands.

13

Informatica Architecture Components

Repository
Stores the metadata created using the Informatica Client tools
Repository Manager creates the metadata tables in the database. Tasks in the Informatica Client application, such as creating users, analyzing sources, developing mappings or mapplets, or creating sessions, create metadata. The Informatica Server reads metadata created in the Client application when a session runs.

14

Overview: Informatica Server


The Informatica Server reads mapping and session information from the Repository. It extracts data from the mapping sources and stores the data in memory while it applies the transformation rules in the mapping. The Informatica Server then loads the transformed data into the mapping targets.
Platforms: Windows NT/2000, UNIX/Linux, Solaris
15

Informatica Minimum Requirements

16

Questions???

17

PowerCenter Client Tools

Repository Manager: manage repository connections and folders
Designer: build ETL mappings
Workflow Manager: build and start workflows to run mappings
Workflow Monitor: monitor and start workflows
Administration Console: administer repositories on a Repository Server (create/upgrade/delete, configuration, start/stop, backup/restore, users and groups)

18

Repository Manager

19

Repository Manager

Navigator Window

Analysis Window

Dependency Window

Output Window

20

Security
[Diagram: security is layered as drill-down access. A user ID and password control access to the repository; repository privileges govern what a user can do in the repository; folder permissions govern access to each folder; object locking protects individual objects.]

21

Repository Manager Tasks

Implement repository security: Assign and revoke repository privileges and folder permissions
Perform folder functions: Create, edit, and delete folders Copy a folder within the repository or to another repository Compare folders within a repository or in different repositories

Dependency Window

The Dependency window can display the following types of dependencies:

Source-target dependencies: lists all sources or targets related to the selected object, with relevant information
Mapping dependencies: lists all mappings containing the selected object, with relevant information

23

Questions???

24

Power Center Designer

25

Designer Window

Subject Area

Work Folders

Repository Navigator

Output Window

26

Designer Tools
Source Analyzer
Warehouse Designer
Transformation Developer
Mapplet Designer
Mapping Designer

27

Designer Appendix

Informatica's Designer is the client application used to create and manage sources, targets, and the associated mappings between them. The Designer allows you to work with multiple tools, in multiple folders, and in multiple repositories at a time. The client application provides five tools with which to create mappings:
Source Analyzer: import or create source definitions for flat file, ERP, and relational sources.
Warehouse Designer: import or create target definitions.
Transformation Developer: create reusable transformations that generate or modify data.
Mapplet Designer: create a reusable object that represents a set of transformations.
Mapping Designer: create mappings.

28

Design Process

1. Create source definition(s)
2. Create target definition(s)
3. Create a mapping
4. Create a session task
5. Create a workflow from task components
6. Run the workflow
7. Monitor the workflow and verify the results

Designer

Source Analyzer

30

Source Analyzer

[Screenshot: the Source Analyzer tool in the Designer, showing the navigation window and the analyzer window.]

Methods of Analyzing Sources

Import from database
Import from file
Import from COBOL file
Import from XML file
Create manually

[Diagram: relational tables, XML files, flat files, and COBOL files are analyzed by the Source Analyzer, which stores the resulting definitions in the repository.]

32

Source Analyzer

The following types of source definitions can be imported or created or modified in the Source Analyzer:
Relational sources: tables, views, synonyms
Files: fixed-width or delimited flat files, COBOL files
Microsoft Excel sources
XML sources: XML files, DTD files, XML Schema files
SAP R/3, SAP BW, Siebel, IBM MQ Series (using PowerConnect)

33

Analyzing Relational Sources


[Diagram: the Source Analyzer imports a table, view, or synonym definition over ODBC; the definition travels to the Repository Server over TCP/IP and is stored in the repository by the Repository Agent over a native connection.]

34

Importing Relational Source Definitions

After importing a relational source definition, business names for the table and columns can be entered.
35

Analyzing Relational Sources


Editing Source Definition Properties

36

Analyzing Flat File Sources


[Diagram: a fixed-width or delimited flat file is read from a mapped drive, NFS mount, or local directory; the Source Analyzer sends the definition to the Repository Server over TCP/IP, and the Repository Agent stores it in the repository over a native connection.]

37

Import from Source Flat File


Fixed-Width Files

38

Import from Source Flat File


Delimited Files

39

Questions???

40

Designer

Warehouse Designer

Creating Target Definitions


Methods of creating target definitions:
Import from database
Import from an XML file
Manual creation
Automatic creation

42

Overview: Targets
Relational: Oracle, Sybase, Sybase IQ, Informix, IBM DB2, Microsoft SQL Server, and Teradata
File: fixed-width and delimited flat files, and XML
Extended: the Integration Service can load data into SAP BW; PowerConnect for IBM MQSeries loads data into IBM MQSeries message queues
Other: Microsoft Access; loading via ODBC or native drivers, FTP, or external loaders

43

Import from Target


Relational Table
Add User DSN / ODBC

44

Automatic Target Creation


Drag-and-drop a source definition into the Warehouse Designer workspace

45

Manual Target Creation


1. Create an empty definition
2. Add the desired columns
3. Finished target definition

ALT-F can also be used to create a new column


46

Target Definition Properties

47

Creating Physical Tables

[Diagram: LOGICAL target table definitions in the repository become PHYSICAL target database tables when the generated SQL is executed via the Designer.]

48

Creating Physical Tables


Create tables that do not already exist in target database
Connect: connect to the target database
Generate SQL file: create DDL in a script file
Edit SQL file: modify the DDL script as needed
Execute SQL file: create the physical tables in the target database

Use Preview Data to verify the results (right-click on the object)
49

Questions???

50

Designer

Mapping Designer

51

Mapping Designer
Mapping

Create mapping

52

Mapping

Mappings represent the data flow between sources and targets. When the Informatica Server runs a session, it uses the instructions configured in the mapping to read, transform, and write data. Every mapping must contain the following components:
Source and target definitions
One or more transformations
Connectors (links)

53

Mapping
[Sample mapping diagram: a source flows through a Source Qualifier and a transformation, linked by connectors, into the target.]

54

Transformations used in Informatica

55

Transformations

A transformation is a repository object that generates, modifies, or passes data.
The Designer provides a set of transformations that perform specific functions.
Transformations in a mapping represent the operations the Informatica Server performs on data.
Data passes into and out of transformations through ports that you connect in a mapping or mapplet.
Transformations can be active or passive.
56

Transformation - Types

An active transformation can change the number of rows that pass through it. A passive transformation does not change the number of rows that pass through it. Transformations can be connected to the data flow, or they can be unconnected.
Connected transformation: connected to other transformations or directly to a target in the mapping.
Unconnected transformation: not connected to other transformations in the mapping; it is called from within another transformation.

57

Aggregator: Active/Connected. Performs aggregate calculations.
Expression: Passive/Connected. Calculates a value.
Filter: Active/Connected. Filters data.
Joiner: Active/Connected. Joins data from different databases or flat file systems.
Lookup: Passive/Connected or Unconnected. Looks up values.
Normalizer: Active/Connected. Source qualifier for COBOL sources; can also be used in the pipeline to normalize data from relational or flat file sources.
Rank: Active/Connected. Limits records to a top or bottom range.
Router: Active/Connected. Routes data into multiple transformations based on group conditions.
Sequence Generator: Passive/Connected. Generates primary keys.
Sorter: Active/Connected. Sorts data based on a sort key.
Source Qualifier: Active/Connected. Represents the rows that the Integration Service reads from a relational or flat file source when it runs a session.
Stored Procedure: Passive/Connected or Unconnected. Calls a stored procedure.
Transaction Control: Active/Connected. Defines commit and rollback transactions.
Union: Active/Connected. Merges data from different databases or flat file systems.
Update Strategy: Active/Connected. Determines whether to insert, delete, update, or reject rows.

59

Transformations Overview
A transformation has three views:

Iconized view: shows the transformation in relation to the mapping
Normal view: shows data flow through the transformation
Edit view: shows transformation properties and allows editing

60

Character & Number Functions


Character functions (used to manipulate character data):
CONCAT, SUBSTR, INSTR, LENGTH, INITCAP, UPPER, LOWER, LPAD, LTRIM, RPAD, RTRIM, ASCII / CHR

Numeric functions (used to perform mathematical operations on numeric data):
ABS, CEIL, CUME, EXP, FLOOR, LN / LOG, MOD, POWER, ROUND, SIGN, SQRT, TRUNC

61

Conversion, Date & Aggregate


Conversion functions:
TO_CHAR, TO_DATE, TO_DECIMAL, TO_FLOAT, TO_INTEGER, TO_NUMBER

Date functions:
ADD_TO_DATE, DATE_COMPARE, DATE_DIFF, GET_DATE_PART, LAST_DAY, ROUND (date), SET_DATE_PART, TO_CHAR (date), TRUNC (date)

Aggregate functions:
SUM, AVG, COUNT, FIRST, LAST, MAX, MIN, STDDEV, VARIANCE, MEDIAN

62

Test & Special Functions


Test functions (used to validate data, e.g. to test whether a lookup result is null):
ISNULL, IS_DATE, IS_NUMBER, IS_SPACES

Special functions (used to handle specific conditions within a session, search for certain values, and test conditional statements):
ABORT, DECODE, ERROR, IIF, LOOKUP

63

Built-in Variables and Constants


Built-in variables:
SESSSTARTTIME, SYSDATE

Built-in constants:
TRUE, FALSE, NULL, DD_INSERT, DD_DELETE, DD_REJECT, DD_UPDATE

64

Transformation Ports
Input ports - Receive data. Output ports - Pass data. Input/output ports - Receive data and pass it unchanged.

65

Transformations : Source Qualifier

Reads and represents the source record set
Mandatory after relational or flat file sources
Active transformation
Join from different sources
Filter rows
Sorted ports
Distinct rows
Custom query

66

Transformations : Sorter
Sorts the data passed to it
Ascending/descending
Distinct
Active transformation

67

Transformations : Expression
Calculations
Input, output ports
Passive transformation

68

Transformations : Filter
Filters rows
Active transformation
Can filter rows from non-relational sources

69

Transformations : Router
Multiple filter conditions
Input groups / output groups / user-defined groups
Active transformation

70

Transformations : Aggregator
Input, output, variable ports
Active transformation
Functions: AVG, COUNT, FIRST, LAST, MAX, MEDIAN, MIN, PERCENTILE, STDDEV, SUM, VARIANCE
Expressions (non-aggregate, aggregate, nested, conditional clauses), e.g. SUM(sal, bonus > 1000) or IIF(MAX(sal) > 5000, MAX(sal), 0)
Group-by port
Sorted input
Aggregate cache

71

Transformations : Sequence Generator


Generates numeric values
Passive transformation
Creates keys
Replaces missing values, e.g. IIF(ISNULL(ORDER_NO), NEXTVAL, ORDER_NO)
Cycles through a sequential range of numbers

72

Joiner Transformation

Joins two related heterogeneous sources residing in different locations or file systems. Can be used to join:
Two relational tables existing in separate databases
Two flat files in potentially different file systems
Two different ODBC sources
Two instances of the same XML source
A relational table and a flat file source
A relational table and an XML source
73

Transformations : Joiner
Active transformation
Joins heterogeneous sources
Join types: Normal, Master Outer, Detail Outer, Full Outer

74

Aggregator Transformation
The Aggregator transformation performs aggregate functions such as average, sum, and count on multiple rows or groups. The Integration Service performs these calculations as it reads, storing group and row data in an aggregate cache. It is an active and connected transformation.
Difference between the Aggregator and Expression transformations: an Expression transformation permits calculations on a row-by-row basis only, while an Aggregator permits calculations on groups. For example, an Aggregator transformation might have ports such as State, State_Count, Previous_State, and State_Counter.
Components: aggregate cache, aggregate expression, group-by port, sorted input.
Aggregate expressions are allowed only in Aggregator transformations; they can include conditional clauses and non-aggregate functions, and can nest one aggregate function inside another.
Aggregate functions: AVG, COUNT, FIRST, LAST, MAX, MEDIAN, MIN, PERCENTILE, STDDEV, SUM, VARIANCE
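The grouped calculation described above can be sketched in plain Python (outside Informatica) to show what an aggregate with a conditional clause does; the row data, port names, and the `aggregate` helper are hypothetical, mimicking the expression SUM(sal, bonus > 1000):

```python
# A minimal, hypothetical sketch of Aggregator semantics: group rows by a
# port and sum a value, counting only rows where a conditional clause holds.
from collections import defaultdict

rows = [
    {"dept": "HR",  "sal": 3000, "bonus": 1500},
    {"dept": "HR",  "sal": 2000, "bonus": 500},
    {"dept": "ENG", "sal": 5000, "bonus": 2000},
]

def aggregate(rows, group_by, value, condition):
    """Sum `value` per `group_by` key, including only rows passing `condition`."""
    cache = defaultdict(int)          # stands in for the aggregate cache
    for row in rows:
        if condition(row):
            cache[row[group_by]] += row[value]
    return dict(cache)

result = aggregate(rows, "dept", "sal", lambda r: r["bonus"] > 1000)
# the HR row with bonus 500 is excluded from the HR sum
```

The group key plays the role of the group-by port; the cache holds one running value per group, which is why sorted input lets the real Aggregator flush groups early.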

75

Application Source Qualifier Transformation
Represents the rows that the Integration Service reads from an application source, such as an ERP source, when it runs a session. It is an active and connected transformation.

76

Custom Transformation
It works with procedures you create outside the Designer interface to extend PowerCenter functionality; it calls a procedure from a shared library or DLL. It is an active or passive, connected transformation. You can use a Custom transformation to create transformations that require multiple input groups and multiple output groups. A Custom transformation allows you to develop the transformation logic in a procedure. Some of the PowerCenter transformations are built using the Custom transformation, and rules that apply to Custom transformations, such as blocking rules, also apply to transformations built on them. PowerCenter provides two sets of functions, called generated and API functions. The Integration Service uses generated functions to interface with the procedure; when you create a Custom transformation and generate the source code files, the Designer includes the generated functions in the files. Use the API functions in the procedure code to develop the transformation logic.
Difference between the Custom and External Procedure transformations: in a Custom transformation, input and output functions occur separately. The Integration Service passes input data to the procedure using an input function; the output function is a separate function you must call in the procedure code to pass output data back to the Integration Service. In contrast, in the External Procedure transformation a single external procedure function does both input and output, and its parameters consist of all the ports of the transformation.
77

Data Masking Transformation

Passive & Connected. It is used to change sensitive production data to realistic test data for non-production environments. It creates masked data for development, testing, training, and data mining. Data relationships and referential integrity are maintained in the masked data.

For example: it returns masked values that have a realistic format for SSNs, credit card numbers, birth dates, phone numbers, etc., but are not valid values. Masking types: key masking, random masking, expression masking, special mask format. The default is no masking.

78

Expression Transformation
Passive & Connected. Used to perform non-aggregate functions, i.e. to calculate values on a single row; for example, to calculate the discount for each product, to concatenate first and last names, or to convert a date to a string field. You can create an Expression transformation in the Transformation Developer or the Mapping Designer. Components: transformation, ports, properties, metadata extensions.
External Procedure Transformation
Passive & Connected or Unconnected. It works with procedures you create outside of the Designer interface to extend PowerCenter functionality. You can create complex functions within a DLL or in the COM layer of Windows and bind them to an External Procedure transformation. To get this kind of extensibility, use the Transformation Exchange (TX) dynamic invocation interface built into PowerCenter. You must be an experienced programmer to use TX and must use multithreaded code in external procedures.
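The row-by-row calculations described for the Expression transformation can be sketched in Python; the port names, sample rows, and the `apply_expression` helper are hypothetical:

```python
# A minimal, hypothetical sketch of Expression semantics: one output row per
# input row (passive), each computed from that row alone, like concatenating
# names and applying a per-product discount.
def apply_expression(rows):
    out = []
    for row in rows:
        out.append({
            "full_name": row["first"] + " " + row["last"],   # like CONCAT
            "discounted": round(row["price"] * 0.9, 2),      # per-row calculation
        })
    return out

rows = [{"first": "Ann", "last": "Lee", "price": 10.0}]
result = apply_expression(rows)
```

Note the contrast with the Aggregator: no grouping, no cache, and the row count never changes.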
79

Filter Transformation
Active & Connected. It passes rows that meet the specified filter condition and removes rows that do not; for example, to find all the employees working in New York, or all the faculty members teaching chemistry in a state. The input ports for the filter must come from a single transformation; you cannot concatenate ports from more than one transformation into the Filter transformation. Components: transformation, ports, properties, metadata extensions.
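The filter behavior above amounts to keeping only the rows for which a condition is true; here is a small Python sketch with hypothetical row data:

```python
# A minimal, hypothetical sketch of Filter semantics: rows meeting the
# condition pass through, the rest are dropped (hence "active": the row
# count can shrink).
employees = [
    {"name": "Ann", "city": "New York"},
    {"name": "Bob", "city": "Boston"},
    {"name": "Eve", "city": "New York"},
]

def filter_rows(rows, condition):
    return [row for row in rows if condition(row)]

new_yorkers = filter_rows(employees, lambda r: r["city"] == "New York")
```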

80

HTTP Transformation Passive & Connected. It allows you to connect to an HTTP server to use its services and applications. With an HTTP transformation, the Integration Service connects to the HTTP server, and issues a request to retrieves data or posts data to the target or downstream transformation in the mapping.

Authentication types: Basic, Digest and NTLM. Examples: GET, POST and SIMPLE POST

81

Java Transformation Active or Passive & Connected. It provides a simple native programming interface to define transformation functionality with the Java programming language. You can use the Java transformation to quickly define simple or moderately complex transformation functionality without advanced knowledge of the Java programming language or an external Java development environment.

82

Joiner Transformation
Active & Connected. It is used to join data from two related heterogeneous sources residing in different locations, or to join data from the same source. To join two sources there must be at least one pair of matching columns between them, and you must specify one source as the master and the other as the detail. For example: to join a flat file and a relational source, to join two flat files, or to join a relational source and an XML source.
The Joiner transformation supports the following types of joins:
Normal: discards all rows of data from the master and detail sources that do not match the join condition.
Master outer: discards the unmatched rows from the master source; keeps all rows from the detail source plus the matching rows from the master source.
Detail outer: keeps all rows of data from the master source plus the matching rows from the detail source; discards the unmatched rows from the detail source.
Full outer: keeps all rows of data from both the master and detail sources.
Limitations on the pipelines you connect to the Joiner transformation:
* You cannot use a Joiner transformation when either input pipeline contains an Update Strategy transformation.
* You cannot use a Joiner transformation if a Sequence Generator transformation is connected directly before it.

83
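The four join types can be illustrated on in-memory rows; the master/detail data and the `join` helper below are hypothetical, following the definitions above (master outer keeps all detail rows, detail outer keeps all master rows):

```python
# A minimal, hypothetical sketch of Joiner join types.
master = {1: "Sales", 2: "Engineering"}          # dept_id -> dept_name (master)
detail = [(1, "Ann"), (1, "Bob"), (3, "Eve")]    # (dept_id, employee) (detail)

def join(master, detail, join_type="normal"):
    out = []
    matched_master = set()
    for dept_id, emp in detail:
        if dept_id in master:
            out.append((emp, master[dept_id]))   # matching rows always kept
            matched_master.add(dept_id)
        elif join_type in ("master_outer", "full_outer"):
            out.append((emp, None))              # unmatched detail row kept
    if join_type in ("detail_outer", "full_outer"):
        for dept_id, name in master.items():
            if dept_id not in matched_master:
                out.append((None, name))         # unmatched master row kept
    return out
```

Note the asymmetry: since the real Joiner caches the master source, keeping the smaller source as master is the usual performance choice.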

Lookup Transformation
Passive & Connected or Unconnected. It is used to look up data in a flat file, relational table, view, or synonym. It compares Lookup transformation ports (input ports) to source column values based on the lookup condition; the returned values can then be passed to other transformations. You can create a lookup definition from a source qualifier, and you can use multiple Lookup transformations in a mapping. You can perform the following tasks with a Lookup transformation:
* Get a related value: retrieve a value from the lookup table based on a value in the source. For example, the source has an employee ID; retrieve the employee name from the lookup table.
* Perform a calculation: retrieve a value from a lookup table and use it in a calculation. For example, retrieve a sales tax percentage, calculate the tax, and return it to a target.
* Update slowly changing dimension tables: determine whether rows exist in a target.
Lookup components: lookup source, ports, properties, condition.
Types of lookup: 1) relational or flat file lookup; 2) pipeline lookup; 3) cached or uncached lookup; 4) connected or unconnected lookup.

84
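The "get a related value" task above behaves like a keyed dictionary probe; this Python sketch uses hypothetical employee data, with a missing key returning None the way an unmatched lookup returns NULL:

```python
# A minimal, hypothetical sketch of a connected, cached Lookup: for each
# source row, fetch a related value by key from an in-memory lookup table.
lookup_table = {101: "Ann", 102: "Bob"}   # employee_id -> name (the lookup cache)

def lookup(rows, table, key):
    # unmatched keys yield None, analogous to the lookup returning NULL
    return [{**row, "name": table.get(row[key])} for row in rows]

source = [{"employee_id": 101}, {"employee_id": 999}]
enriched = lookup(source, lookup_table, "employee_id")
```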

Normalizer Transformation
Active & Connected. The Normalizer transformation processes multiple-occurring columns or multiple-occurring groups of columns in each source row and returns a row for each instance of the multiple-occurring data. It is used mainly with COBOL sources, where data is most often stored in denormalized format. You can create the following Normalizer transformations:
* VSAM Normalizer transformation: a non-reusable transformation that is a Source Qualifier transformation for a COBOL source. VSAM stands for Virtual Storage Access Method, a file access method for IBM mainframes.
* Pipeline Normalizer transformation: a transformation that processes multiple-occurring data from relational tables or flat files. This is the default when you create a Normalizer transformation.
Components: transformation, ports, properties, normalizer, metadata extensions.

85

Rank Transformation
Active & Connected. It is used to select the top or bottom rank of data: the largest or smallest numeric values in a port or group, or the strings at the top or bottom of the session sort order. For example, to select the top 10 regions by sales volume, or the 10 lowest-priced products. As an active transformation, it can change the number of rows passed through it: if you pass 100 rows to the Rank transformation but rank only the top 10, only 10 rows pass on to the next transformation. You can connect ports from only one transformation to the Rank transformation. You can also create local variables and write non-aggregate expressions.
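The top-N selection above can be sketched with a heap; the region/sales rows are hypothetical:

```python
# A minimal, hypothetical sketch of Rank semantics: keep only the top N rows
# by a rank port, so the output row count can shrink (an active transformation).
import heapq

sales = [("North", 120), ("South", 95), ("East", 200), ("West", 150)]

def top_n(rows, n, key):
    # heapq.nlargest returns the n largest rows, ordered largest first
    return heapq.nlargest(n, rows, key=key)

top2 = top_n(sales, 2, key=lambda r: r[1])
```

Grouped ranking (top N per group) would apply the same selection within each group-by key.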
86

Router Transformation

Active & Connected. It is similar to the Filter transformation because both apply a condition to test data. The difference is that the Filter transformation drops the rows that do not meet the condition, whereas the Router can capture those rows and route them to a default output group. If you need to test the same input data against multiple conditions, use a single Router transformation in a mapping instead of creating multiple Filter transformations to perform the same task; the Router transformation is more efficient.
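The Router-versus-Filter contrast can be sketched in Python; the group names, conditions, and order rows are hypothetical, and a row is copied to every group whose condition it satisfies, with non-matching rows going to the default group rather than being dropped:

```python
# A minimal, hypothetical sketch of Router semantics: evaluate every
# user-defined group condition per row; rows matching nothing land in the
# default group instead of being discarded as a Filter would.
def route(rows, groups):
    out = {name: [] for name in groups}
    out["DEFAULT"] = []
    for row in rows:
        matched = False
        for name, condition in groups.items():
            if condition(row):
                out[name].append(row)   # a row may reach several groups
                matched = True
        if not matched:
            out["DEFAULT"].append(row)
    return out

orders = [{"amount": 50}, {"amount": 500}, {"amount": 5000}]
routed = route(orders, {
    "SMALL": lambda r: r["amount"] < 100,
    "LARGE": lambda r: r["amount"] > 1000,
})
```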

87

Sequence Generator Transformation
Passive & Connected. It is used to create unique primary key values, cycle through a sequential range of numbers, or replace missing primary keys. It has two output ports, NEXTVAL and CURRVAL; you cannot edit or delete these ports, nor add ports to the transformation. The NEXTVAL port generates a sequence of numbers when connected to a transformation or target. CURRVAL is NEXTVAL plus the Increment By value (NEXTVAL plus one by default). You can make a Sequence Generator reusable and use it in multiple mappings; you might reuse one when you perform multiple loads to a single target. For non-reusable Sequence Generator transformations, Number of Cached Values is set to zero by default and the Integration Service does not cache values during the session; setting it greater than zero can increase the number of times the Integration Service accesses the repository during the session, and produces gaps in the sequence, since unused cached values are discarded at the end of each session. For reusable Sequence Generator transformations, you can reduce Number of Cached Values to minimize discarded values, but it must be greater than one; reducing it may increase the number of times the Integration Service accesses the repository to cache values during the session.
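NEXTVAL-style generation, including the cycle-through-a-range option, maps naturally onto a Python generator; the parameter names below are hypothetical stand-ins for the transformation's Start Value, Increment By, End Value, and Cycle properties:

```python
# A minimal, hypothetical sketch of Sequence Generator semantics: each call
# to next() plays the role of reading the NEXTVAL port; with cycle=True the
# sequence wraps back to the start after the end value.
def sequence(start=1, increment=1, end=None, cycle=False):
    value = start
    while True:
        yield value
        value += increment
        if end is not None and value > end:
            if not cycle:
                return      # sequence exhausted
            value = start   # wrap around, like the Cycle option

nextval = sequence(start=1, end=3, cycle=True)
keys = [next(nextval) for _ in range(5)]
```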
88

Sorter Transformation

Active & Connected. It is used to sort data in ascending or descending order according to a specified sort key. You can also configure the Sorter transformation for case-sensitive sorting and specify whether the output rows should be distinct. When you create a Sorter transformation in a mapping, you specify one or more ports as the sort key and configure each sort key port to sort in ascending or descending order.
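The sort-key, direction, distinct, and case-sensitivity options described above can be sketched as follows; the sample values and the `sort_rows` helper are hypothetical:

```python
# A minimal, hypothetical sketch of Sorter semantics: sort by a key with a
# chosen direction, optionally dropping duplicates (which is what makes the
# real Sorter active when Distinct is enabled).
names = ["beta", "Alpha", "beta", "gamma"]

def sort_rows(rows, descending=False, distinct=False, case_sensitive=True):
    if distinct:
        rows = list(dict.fromkeys(rows))     # remove duplicates, keep first seen
    keyfn = None if case_sensitive else str.lower
    return sorted(rows, key=keyfn, reverse=descending)

result = sort_rows(names, distinct=True, case_sensitive=False)
```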

89

Source Qualifier Transformation
Active & Connected. When adding a relational or flat file source definition to a mapping, you need to connect it to a Source Qualifier transformation. The Source Qualifier is used to join data originating from the same source database, filter rows as the Integration Service reads source data, specify an outer join rather than the default inner join, and specify sorted ports. It is also used to select only distinct values from the source and to create a custom query that issues a special SELECT statement for the Integration Service to read source data.

90

SQL Transformation Active/Passive & Connected transformation. The SQL transformation processes SQL queries midstream in a pipeline. You can insert, delete, update, and retrieve rows from a database. You can pass the database connection information to the SQL transformation as input data at run time. The transformation processes external SQL scripts or SQL queries that you create in an SQL editor. The SQL transformation processes the query and returns rows and database errors.

91

Stored Procedure Transformation
Passive & Connected or Unconnected. It is useful for automating time-consuming tasks, and is also used in error handling, to drop and recreate indexes, to determine available space in a database, for specialized calculations, etc. The stored procedure must exist in the database before you create a Stored Procedure transformation, and it can exist in a source, target, or any database with a valid connection to the Informatica Server. A stored procedure is an executable script with SQL statements, control statements, user-defined variables, and conditional statements.

92

Transaction Control Transformation Active & Connected. You can control commit and roll back of transactions based on a set of rows that pass through a Transaction Control transformation. Transaction control can be defined within a mapping or within a session. Components: Transformation, Ports, Properties, Metadata Extensions.

93

Union Transformation Active & Connected. The Union transformation is a multiple input group transformation that you use to merge data from multiple pipelines or pipeline branches into one pipeline branch. It merges data from multiple sources similar to the UNION ALL SQL statement to combine the results from two or more SQL statements. Similar to the UNION ALL statement, the Union transformation does not remove duplicate rows. Rules 1) You can create multiple input groups, but only one output group. 2) All input groups and the output group must have matching ports. The precision, datatype, and scale must be identical across all groups. 3) The Union transformation does not remove duplicate rows. To remove duplicate rows, you must add another transformation such as a Router or Filter transformation. 4) You cannot use a Sequence Generator or Update Strategy transformation upstream from a Union transformation. 5) The Union transformation does not generate transactions. Components: Transformation tab, Properties tab, Groups tab, Group Ports tab.
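The UNION ALL behavior described above (merge everything, keep duplicates) can be sketched in a few lines; the pipeline data is hypothetical:

```python
# A minimal, hypothetical sketch of Union semantics: merge rows from multiple
# input pipelines into one output, like SQL UNION ALL; duplicates are kept,
# so a downstream Router or Filter would be needed to remove them.
pipeline_a = [("Ann", 30), ("Bob", 25)]
pipeline_b = [("Bob", 25), ("Eve", 40)]

def union_all(*pipelines):
    merged = []
    for pipeline in pipelines:
        merged.extend(pipeline)          # no duplicate removal
    return merged

merged = union_all(pipeline_a, pipeline_b)
```

As in the real transformation, this assumes all inputs share the same "ports" (here, tuple shape); mismatched schemas would have to be reconciled upstream.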

94

Unstructured Data Transformation Active/Passive and connected. The Unstructured Data transformation is a transformation that processes unstructured and semi-structured file formats, such as messaging formats, HTML pages and PDF documents. It also transforms structured formats such as ACORD, HIPAA, HL7, EDI-X12, EDIFACT, AFP, and SWIFT. Components: Transformation, Properties, UDT Settings, UDT Ports, Relational Hierarchy.

95

Update Strategy Transformation
Active & Connected. It is used to update data in a target table, either to maintain a history of the data or only the most recent changes. It flags rows for insert, update, delete, or reject within a mapping.
XML Generator Transformation
Active & Connected. It lets you create XML inside a pipeline. The XML Generator transformation accepts data from multiple ports and writes XML through a single output port.
XML Parser Transformation
Active & Connected. The XML Parser transformation lets you extract XML data from messaging systems, such as TIBCO or MQ Series, and from other sources, such as files or databases. Its functionality is similar to the XML source functionality, except that it parses the XML in the pipeline.

96

XML Source Qualifier Transformation
Active & Connected. The XML Source Qualifier is used only with an XML source definition. It represents the data elements that the Informatica Server reads when it executes a session with XML sources, and it has one input or output port for every column in the XML source.
External Procedure Transformation
Active & Connected or Unconnected. Sometimes the standard transformations, such as the Expression transformation, may not provide the functionality you want. In such cases an External Procedure is useful for developing complex functions within a dynamic link library (DLL) or UNIX shared library, instead of creating the necessary Expression transformations in a mapping.

Advanced External Procedure Transformation
Active & Connected. It operates in conjunction with procedures created outside of the Designer interface to extend PowerCenter/PowerMart functionality. It is useful for creating external transformation applications, such as sorting and aggregation, which require all input rows to be processed before emitting any output rows.

97

Questions???

98
