
Source Qualifier
----------------
Active
Connected
It is a default transformation.
Used only for 2 source types: relational databases and flat files.
Purpose:
It converts native source datatypes into Informatica datatypes:
varchar - string
number - decimal (default), integer
date - date/time
Important Properties:
1. Source Filter
2. Number of Sorted Ports
3. User Defined Join
4. SQL Query
Mapping Steps:
--------------
Src: EMP
Tgt: EMP_TGT
Step 1: drag and drop from source to target.
Src ------> TGT (this will populate all records from source to target).
If you want to apply some condition, go for a transformation; here, that is the SQ.
The SQ depends upon what type of database is used: the generated SQL query will
differ according to the source database, e.g. DB2 will have a different query than
Teradata.

Source Filter <----> replacement for the Filter transformation
--------------------------------------------------------------
1. SQ -> Transformation -> Source Filter -> SAL>2500
2. ename like '%s%' ----> shows names containing 's'
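In SQL terms, the Source Filter value is appended to the WHERE clause of the query the SQ generates. A sketch, assuming the standard EMP columns:

```sql
-- Default query generated by the Source Qualifier for EMP
SELECT EMP.EMPNO, EMP.ENAME, EMP.JOB, EMP.SAL, EMP.DEPTNO
FROM EMP;

-- With Source Filter SAL>2500:
SELECT EMP.EMPNO, EMP.ENAME, EMP.JOB, EMP.SAL, EMP.DEPTNO
FROM EMP
WHERE EMP.SAL > 2500;

-- With Source Filter ENAME LIKE '%s%':
SELECT EMP.EMPNO, EMP.ENAME, EMP.JOB, EMP.SAL, EMP.DEPTNO
FROM EMP
WHERE EMP.ENAME LIKE '%s%';
```

Only the connected ports appear in the SELECT list; the column list here is illustrative.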
Difference b/w Filter and SQ filter:
------------------------------------
1. Filter works with multiple source types like XML, flat files, etc.; the SQ filter
works only with relational sources.
2. The SQ filter is efficient because it applies the condition while extracting the
data itself, but the Filter transformation applies the condition only after
extraction, while loading.
3. The SQ coding will differ based on the database, but the Filter transformation
will not differ.
Number of Sorted Ports:
-----------------------
Give 1 ---> it will sort according to the 1st column.
If you give 6 it will not sort according to the 6th column alone; instead it will
sort based on all 6 columns.
If you want to sort according to the 6th column only, go to the SQL editor,
generate the query, and edit the ORDER BY value to the desired column.
Select Distinct - removes duplicate values. (Again, you have to regenerate the SQL
if you have already used the SQL editor.)
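As a sketch of the generated SQL (column names assumed from EMP): with Number of Sorted Ports = 2, the SQ sorts on the first two connected ports; editing the query changes the ORDER BY, and Select Distinct adds DISTINCT:

```sql
-- Number of Sorted Ports = 2: ORDER BY uses the first two connected ports
SELECT EMP.EMPNO, EMP.ENAME, EMP.SAL
FROM EMP
ORDER BY EMP.EMPNO, EMP.ENAME;

-- Edited in the SQL editor to sort by a single later column instead:
SELECT EMP.EMPNO, EMP.ENAME, EMP.SAL
FROM EMP
ORDER BY EMP.SAL;

-- Select Distinct adds DISTINCT to the generated query:
SELECT DISTINCT EMP.DEPTNO
FROM EMP;
```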
SQ Join:
--------
2 sources with common columns: EMP and DEPT tables.
TGT: EMP_JOIN
If you use 2 sources, 2 SQs will come. There is no need for multiple SQs, so delete
one SQ and drag and drop the columns from the 2nd table into the same SQ.
If the join condition is not present, use User Defined Join to specify the common
columns: dept.deptno = emp.deptno, and then generate the SQL.
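With that User Defined Join, the single SQ would generate a query roughly like this (SELECT list assumed from the connected ports):

```sql
SELECT EMP.EMPNO, EMP.ENAME, EMP.SAL, DEPT.DEPTNO, DEPT.DNAME
FROM EMP, DEPT
WHERE DEPT.DEPTNO = EMP.DEPTNO;
```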

------------------------------------------------------------------------
Normalizer:
-----------
Src: Store (flat file), Student (flat file)
Tgt: STORE_TGT, STUDENT_TGT
Multiple columns ----> single column (repeating columns become multiple rows).
Normalizer is mainly for COBOL sources. If it is a COBOL source you can drag and
drop directly from the source, but not for a relational database source.
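The multiple-columns-to-single-column behavior is equivalent to a SQL unpivot. A sketch, assuming a hypothetical STORE layout with one sales column per quarter (the actual columns are not given in the notes):

```sql
-- Hypothetical layout: STORE(STORE_ID, Q1_SALES, Q2_SALES, Q3_SALES, Q4_SALES)
SELECT STORE_ID, 1 AS QUARTER, Q1_SALES AS SALES FROM STORE
UNION ALL
SELECT STORE_ID, 2, Q2_SALES FROM STORE
UNION ALL
SELECT STORE_ID, 3, Q3_SALES FROM STORE
UNION ALL
SELECT STORE_ID, 4, Q4_SALES FROM STORE;
```

Each input row produces four output rows, which is what the Normalizer does with an occurs/repeating column.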
------------------------------------------------------------------------
Router:
-------
Active
Connected
Similar to Filter.
Single condition means Filter.
Multiple conditions means Router.
Output Ports
|----> User Defined
|----> Default
User Defined ----> created to achieve a custom condition.
Default -----> doesn't allow edit or delete; rejected or unsatisfied rows are
available in the default group.
Scenario:
Single source, multiple targets.
Src: EMP
Targets: EMP_T1, EMP_T2, EMP_T3
Router means adding groups.
Group 1: SAL>3000
Group 2: DEPTNO=20
Group 3: default (SAL<=3000 AND DEPTNO<>20)
A row that satisfies more than one group condition is routed to every matching group.
Limitation:
group1 and group2 -----> target1 (not possible)
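The three groups correspond to these filter conditions, sketched here as SQL for clarity (the Router itself evaluates all groups in a single pass; the default group receives rows that match no user-defined group):

```sql
-- Group 1 -> EMP_T1
SELECT * FROM EMP WHERE SAL > 3000;

-- Group 2 -> EMP_T2
SELECT * FROM EMP WHERE DEPTNO = 20;

-- Default group -> EMP_T3
SELECT * FROM EMP WHERE NOT (SAL > 3000) AND NOT (DEPTNO = 20);
```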
------------------------------------------------------------------------
UNION Transformation:
---------------------
Active
Connected
Multiple input group transformation.
UNION transformation = UNION ALL in Oracle.
All input groups and the output group should have matching ports, i.e. the table
structures should be the same.
SRC: DEPT1 (flat file), DEPT (Oracle) (same structure with different data)
Target: DEPT_TGT
By default the Union transformation has an output group and a NewGroup (input).
It does not remove any duplicate records.
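As an analogy in Oracle SQL (DEPT1 is a flat file, so no SQL is actually generated for it; standard DEPT columns assumed):

```sql
SELECT DEPTNO, DNAME, LOC FROM DEPT
UNION ALL
SELECT DEPTNO, DNAME, LOC FROM DEPT1;
```

Like UNION ALL, the transformation keeps duplicates; you would need a downstream Select Distinct or Sorter to remove them.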
------------------------------------------------------------------------
Aggregator:
-----------
For aggregate calculations.
These are group (multi-row) functions.
Ports:
Input
Output
Variable
Group By
Properties:
Sorted Input
Functions:
AVG
COUNT
MAX
MIN
SRC: EMP
TGT: EMP_AGG
If Sorted Input is enabled, the Aggregator assumes that the input is already
sorted; otherwise it will sort the data itself.
Connect only the required columns from the source into the Aggregator.
Aggregator means adding additional columns (tip), i.e. pull the new columns from
the target into the Aggregator, then enable output and disable input on them.
SUM_SAL ---> SUM(SAL)
MAX_SAL ---> MAX(SAL)
AVG_COMM ---> AVG(COMM)
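Assuming DEPTNO is chosen as the Group By port (the notes do not specify one), the Aggregator output corresponds to:

```sql
SELECT DEPTNO,
       SUM(SAL)  AS SUM_SAL,
       MAX(SAL)  AS MAX_SAL,
       AVG(COMM) AS AVG_COMM
FROM EMP
GROUP BY DEPTNO;
```

With no Group By port selected, the Aggregator returns a single row for the whole input instead.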
------------------------------------------------------------------------
Mapping Parameters:
-------------------
A mapping parameter starts with $$.
A single $ means a session parameter.
If the parameter file contains a logic expression, e.g. $$condition=IN(DEPTNO,10,20),
then in the mapping parameter you have to set IsExprVar to TRUE.
Mapping parameter:
It represents a constant value that we can define before running a session.
It retains the same value throughout the entire session.
Steps:
1. Create a parameter file (.prm) in C:
2. Define a parameter name in the Mapping Designer.
3. Apply the parameter in transformations (Filter).
4. Set the parameter file path in the session properties.
Syntax:
[folder_name.wf:workflow_name.ST:Session_name]
mapping_parameter_name=value
Example:
[DEMO.WF:wf_para.ST:s_m_para]
$$Deptno=20
$$condition=IN(DEPTNO,10,20)
[Global]
[folder name.WF:workflow name]
[folder name.WF:workflow name.WT:worklet name]
[folder name.WF:workflow name.ST:session name]
[folder name.WF:workflow name.WT:worklet name.ST:session name]
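Putting the scopes together, a parameter file might look like this (folder, workflow, and parameter names taken from the example above; the [Global] and workflow-level values are hypothetical illustrations):

```ini
[Global]
$$Deptno=10

[DEMO.WF:wf_para]
$$Deptno=15

[DEMO.WF:wf_para.ST:s_m_para]
$$Deptno=20
$$condition=IN(DEPTNO,10,20)
```

When the same parameter appears in several sections, the more specific heading should take precedence, so session s_m_para would run with $$Deptno=20.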

Mapping Variable:
It represents a value that can change during the session.
The Integration Service saves the mapping variable value to the repository
at the end of each successful session run and uses that value the next time you
run the session.
Steps:
1. Define a mapping variable in the Mapping Designer.
2. Apply the mapping variable in transformations (Filter).
3. Increment the mapping variable using an expression variable port (V):
SETVARIABLE($$dno,$$dno+10)

Session Parameters:
[Demo.WF:wf_para.ST:s_m_para]
$DBConnection_src=ORA_SAI_CONN
$DBConnection_tgt=ORA_Target_CONN
Source file: $InputFile[Name]
Lookup file: $LookupFile[Name]
------------------------------------------------------------------------
Reusable Objects:
-----------------
Mappings can contain reusable and/or non-reusable objects.
Non-reusable transformations exist within a single mapping.
Reusable transformations can be used in multiple mappings.
Creating a single reusable transformation:
Design it in the Transformation Developer (any transformation developed in the
Transformation Developer is reusable by default), or
promote a non-reusable transformation from the Mapping Designer (i.e. check the
Make Reusable check box).
In the Transformation Developer you cannot copy links because it lets you create
only a single transformation,
so for a collection of reusable transformations we go for mapplets.
A mapplet is a reusable object that you can create in the Mapplet Designer.
It contains a set of transformations and lets you reuse them.
Mapplet = reusable mapping.
Limitations - a mapplet cannot contain:
Normalizer transformations
COBOL sources
XML Source Qualifier transformations
Target definitions
Pre- and post-session stored procedures
Other mapplets

Steps for mapplet creation:
1. Mapplet ---> Create.
2. Instead of source and target: Mapplet Input and Mapplet Output.
3. Drag and drop the source --> copy all columns to the Mapplet Input and delete
the source.
4. Add transformations as usual.
Later, in mappings, instead of putting multiple transformations you can put a
single mapplet.
A mapplet displays input and output ports only.
If you want to view the transformations inside a mapplet, click on Expand and
Unexpand.
e.g.:
Src: ORDER_ID
CUSTOMER_ID
Target: POSTAL_CODE
SUM_SALES
LOAD_DATE
Src tables: CUSTOMERS and ORDERS
------------------------------------------------------------------------
Worklets:
---------
Worklet Designer:
Sessions can be reusable or non-reusable.
Sessions created using the Task Developer are reusable by default.
If you are creating a session in the Workflow Designer it is non-reusable by
default, and by checking the Make Reusable check box you can make it reusable.
In the Task Developer we cannot link or execute.
If we want a collection of reusable sessions which we can link together, then we
go for the Worklet Designer.
In the Worklet Designer we can create and link, but we cannot execute.
You have to drag and drop the worklet into a workflow to execute it.
If you want the overall workflow to be shown as failed even when a single task
fails, double-click on the task and check the "Fail parent if this task fails"
check box.
If a session depends upon another session and you want to restrict it based on
the parent session's success, click on the link and give Status = SUCCEEDED.
------------------------------------------------------------------------
Versioning:
-----------
By default all objects are in check-out mode.
Check Out
---------
You must check out an object each time you want to change it and save it.
An object is in read-only mode until you or another user checks it out.
Check In
--------
You must check in the object to allow other users to make changes to it.
View History
------------
Checking in an object adds a new version to the object history.
------------------------------------------------------------------------
Migration:
----------
To make a previous version the current version:
Go to the mapping -> right click -> View History -> select the previous
version -> right click -> Export to XML File.
Repository -> Import Objects -> unresolved issues -> check the Replace check box.
------------------------------------------------------------------------
Command task: del C:\ind.txt
ksh trigger.sh --> a shell script can be called to execute a set of commands,
using a shell script together with the Command task.
Email task: available inside a session as well as a separate Email task.
Session -> Edit -> Components -> On Success Email / On Failure Email.
At the session level you can attach the session log to the mail (make the email
task non-reusable first).
In a standalone Email task you cannot attach logs.
------------------------------------------------------------------------
