
316852746.doc 5/4/2016

Informatica Tutorial
Informatica Exercises

MOORTHY E S
MATHIVANAN S

TCS Internal


1. Terminologies
2. Source Analyzer
Ex_1 - Import RELATIONAL SOURCE definition
Ex_2 - Import FLAT FILE SOURCE definition
Ex_3 - Import XML Source definition
3. Warehouse Designer
Ex_1 Import RELATIONAL target definition
Ex_2 Create a SQL target definition in Warehouse Designer & generate the table structure in the database
Ex_3 Create a FLAT FILE target definition manually.
4. MAPPING DESIGNER
CREATING MAPPING
5. WORKFLOW MANAGER:
BEFORE CREATING SESSION:
STEPS FOR REGISTERING A POWERCENTER SERVER
Starting the informatica services
SETTING UP A RELATIONAL DATABASE CONNECTION
CREATING SESSION
CONFIGURING CONNECTION STRINGS OF SESSION
CREATING WORKFLOW
RUNNING A WORKFLOW
EVALUATING THE RESULTS
TRANSFORMATIONS
6. Source Qualifier Transformation
Ex_1 Pass-through mapping (direct mapping, relational source to relational target)
Ex_2 Extracting Data from a FLAT FILE (Flat File Source)
Ex_3 Loading Data to a Flat File (Indirect Sources - Optional Practices)
Ex_4 Using Source Qualifier Transformation Properties (Filter, Sorter)
Ex_5 Using SQL Override in Source Qualifier Transformation

7. Expression Transformation
Ex_1 Using Expression Transformation
Ex_2 Using Inbuilt Functions in the expression editor
Ex_3 Using nested functions (Optional Practices)
Ex_4 Handling Null values
Ex_5 Using Logical chr for newline (Optional Practices)

8. Filter Transformation
Ex_1 Using Filter Transformation
Ex_2 (Optional Practices)
Ex_3 (Optional Practices)
Ex_4 (Optional Practices)
Ex_5 (Optional Practices)

9. Router Transformation
Ex_1 Using Router Transformation
Ex_2 Practice (Optional Practices)
Ex_3 Practice (Null value Handling)
10. Aggregator Transformation
Ex_1 Using Aggregator Transformation
Ex_2 (Optional Practices)
Ex_3 (Other Frequently Used Aggregation Functions)
Ex_4 (Optional Practices)
Ex_5 (Optional Practices)

11. Joiner Transformation
Ex_1 Using Joiner Transformation
Ex_2 Joining More than two tables
Ex_3 Using Transformation Variables
Ex_4 (Optional Practices)
Ex_5 Pivoting
Ex_6 (Optional Practices)

12. LookUp Transformation
Ex_1 Using LookUp Transformation
Ex_2 Using LookUp Override & Unconnected LookUp
Ex_3 Using Self Join
13. Sorter Transformation
Ex_1 Using Sorter Transformation
Ex_2 (Optional Practices)
Ex_3 (Optional Practices)
14. Sequence Generator Transformation
Ex_1 Using Sequence Generator Transformation
15. Rank Transformation
Ex_1 Using Rank Transformation
16. Update Strategy Transformation
Ex_1 Using Update Strategy Transformation (Slowly Growing Targets)
Ex_2 Supporting Updates Only
Ex_3 Slowly Changing Dimensions
17. Stored Procedure Transformation
Ex_1 Using Stored Procedure Transformation (Procedures)
Ex_2 Multiple Input/Output Parameters
Ex_3 (Optional Practice)
Ex_4 Using Functions, Using Unconnected Stored Procedure Transformation
18. Union Transformation
Ex_1 Using Union Transformation
Ex_2 More than two input data sets (Optional Practices)
19. Normalizer Transformation
Ex_1 Using Normalizer Transformation
Ex_2 (Optional Practices)

WORKFLOW MANAGER TASKS


1. Event Wait task (using Event Raise and using file watch)
2. Event Raise task
3. Decision task
4. Assignment task
5. Command task
6. Timer task
7. Control task
8. Scheduling feature task
9. Email task

INFORMATICA PRACTICAL LAB EXERCISE


Notes:
Create a folder named BasicTraining in the Repository Manager. Use this folder to practice the exercises.


Create a separate database named BasicTraining for the target table. Use this database for practicing the exercises.
The folder and database will be reviewed later.
Prerequisites:
1. Install the Informatica server and client tools.
2. Install MS SQL Server.
Before doing the exercises, complete the following steps:
1. Create a repository.
2. Create ODBC connectivity for the source and target databases.
3. Register the PowerCenter Server in the Workflow Manager.
4. Create connection strings for the source and target databases.

TERMINOLOGIES:
SOURCES:
To extract data from a table or file, you must first define sources in the repository. You can import or create the following types of source definitions in the Source Analyzer:


Relational tables (any database), views, and synonyms
Flat files (fixed-width and delimited)
COBOL files
XML files
TARGETS:
Similar to sources, you must define targets in the repository before loading data into a table or flat file. Use the Warehouse Designer to import and design target definitions. You can create the following target types:
Relational. You can create a relational target for a particular database platform.
Flat File. You can create fixed-width and delimited flat file target definitions.
XML File. You can create an XML target definition to output data to an XML file.
MAPPINGS :
A mapping is a set of source and target definitions linked by transformation objects that define the rules for data transformation, i.e. the functions and operations applied to the data. Mappings represent the logical data flow between sources and targets.
Every mapping must contain the following components:
Source definition.
Transformation. Performs different functions or operations on data.
Target definition.
Links. Connect sources, targets, and transformations to represent data movement.
SESSIONS:
A session is a type of task that specifies where to move data from and to (the source and target locations: a database for tables or a folder for files) and when to move it (through the scheduling feature), using the logic of a mapping. Session features include scheduling, connection strings (locations), running scripts before or after the session, and sending mail on success or failure.
WORKFLOWS:
A workflow is an object that defines how to execute tasks such as sessions, email tasks, and commands, with conditional links between the tasks.

NorthWind (Source) Schema Diagram (Take a print out for quick reference)


Source Analyzer


1. Import RELATIONAL SOURCE definitions for all tables in the Northwind database of SQL Server.
Aim: Learn to import table definition using Source Analyzer.
To import a source definition follow the steps below
1. Open the Informatica Designer, connect to the repository and open your folder.
2. Choose Source Analyzer. Click on the marked icon in the figure below.

Figure: 1
3. Choose Sources from the menu bar:
Sources -> Import from Database. You will see a new window as in Figure 2:
1. Select the ODBC data source used to connect to the source database.
2. Enter an appropriate database user name and password with valid permissions to connect to the database and view the objects.
3. You may need to specify the owner name for the database objects you want to use as sources. Click Connect. If no table names appear, click All in the Show Owner tab. You can see the list of tables in the Select Tables window.

Figure: 2


4. Select the relational objects you want to import. You can hold down the Ctrl key to select multiple objects. Click OK. The source definitions will appear in the Source Analyzer window. Save the repository by pressing Ctrl+S.
Note: You can manually edit the columns in a definition by double-clicking on it.
2. Import SOURCE definition of a flat file and name it as src_ft_sales_office.
Aim: Learn to import a flat file structure using the Source Analyzer.

C:\Program Files\Informatica PowerCenter 7.1\Server\SrcFiles\src_ft_sales_office.txt

Double click on the icon above and Save the flat file attached in the location
C:\Program Files\Informatica PowerCenter 7.1\Server\SrcFiles on your machine and
import the source definition of the same file to Source Analyzer.
STEPS:
Choose Source Analyzer:
1. Sources (menu bar) -> Import from Files. You will get the window as in Figure 3.

Figure: 3
2. Browse to the folder (where you have saved the file) and select the file
3. Click OK. You will see the window for editing flat file structure as in figure 4.


Figure: 4
4. Choose the file type (Delimited or Fixed Width). The exercise file given is of CSV (Comma-Separated Values) type.
5. If the file contains the column names, enable Import field names from first line; otherwise ignore it. The exercise file includes field names, so enable it.
6. Click Next. You can see the imported source structure in the Source Analyzer.

7. Save the repository.


3. Import the source definition of the XML file attached. Follow the same steps as in the previous exercise.
Aim: Learn to import XML Structure using Source Analyzer.

cd_catalog.xml

STEPS:
1. Click Sources (menu) -> Import XML Definition.
2. Browse to the folder (where you have saved the file) and select the file.
3. Click Open. You will see the window for editing the XML file structure as below.
4. When it prompts to override infinite length, select No.
5. Click Next and then Finish.
6. You can see the imported XML file in the Source Analyzer as below.

For a detailed explanation, see the Informatica help index.

Warehouse Designer


1. Create the table named Dim_Region in the target database.
Aim: Learn to import a table structure using the Warehouse Designer.
Run the DDL script below in your target database using Query Analyzer.
CREATE TABLE [Dim_Region] (
[RegionID] [int] NOT NULL,
[RegionDescription] [nchar] (50) NOT NULL
)
GO
Import the structure of the table Dim_Region from Warehouse Designer.
STEPS:

1. Click warehouse designer icon next to the Source Analyzer icon.

2. Choose Targets (menu bar) -> Import from Database.
3. Choose the target ODBC data source and import the required objects. As you import, you can see the imported structure as below.

2. Create a SQL target definition similar to the source flat file definition sales_office. Generate the table in the target database using the Warehouse Designer. Check for the existence of the table in the target database.


Note: Use the Generate/Execute SQL option of the Targets menu in the Warehouse Designer.
Aim: To generate a table in the database using the Warehouse Designer.
Step 1: Create a target structure in the Warehouse Designer using the Target icon.

Click on the icon and then click in the Warehouse Designer panel. You can see an empty structure in the panel, as below, which is of Microsoft SQL Server database type.

Step 2: Create the required fields. Double-click on the structure and choose the Columns tab. You can see the empty field list. Create columns using the icon shown below.


You can see that a new column has been created. Create two more columns. Rename the columns and the table, and set the required datatypes, precision, and scale for the columns. Click Apply and then OK.

You can see the column names renamed and the datatypes and precision set.
STEP 3: Generate the structure created in the designer in the database using
Generate/Execute SQL option of Targets menu bar.
Select the required structure in the Warehouse designer panel, and click
Generate/Execute SQL option of Targets in menu bar.


Click Connect option to choose the database.

Choose the Database DSN we have created and give User name and password of the
database, and click connect.

Enable the Create Table option under Generate Options and click the Generate and Execute button. If the table already exists in the database, you can drop it before creating by enabling the Drop Table option.
You can see the notification below in the Generate tab of the output window. You can check the table in the database.

Notification from server.


5. Manually create a flat file target definition similar to the flat file source src_ft_sales_office, and name it trg_ft_sales_office.
Aim: To create flat file structures, or structures of another database type.
Steps:
Create the required structure.
1. Change the database type to Flat File. If you need to create structures for some other database, you can choose it here. Set the file properties to Fixed Width or Delimited. Rename the structure.

2. Click Apply and then OK. You can see the target structure created in the Warehouse Designer as shown below.

Note: If we need a target structure similar to a source structure, we need not create it manually. We can just drag the source structure and drop it in the Warehouse Designer; we will get the same structure.


Mapping Designer
Note: 1. The mapping names you create should follow the pattern m_<Abbreviated_transformation_name>_ex_<no>_<description>, e.g. m_SQ_ex_1_direct_mapping for the first exercise.
2. Sessions should be named s_<mapping_name>.
Source Qualifier Transformation:
ex_1: Load the table Dim_Region with the data of the Region table.
Aim: To load data from one relational table to another.
Step 1: Create a simple pass-through mapping to load data from the Region table of the source database to the Dim_Region table of the target database.
Step 2: Create a session in Workflow manager.
Step 3: Create a Workflow in Workflow manager. Add the session and run the
workflow. View the status in Workflow monitor.
Creating Mapping:

1. Click the Mapping Designer icon and choose the Create option of the Mappings menu bar.

2. You can see a prompt asking for the mapping name. Type the mapping name and click OK.

3. The mapping will be created with its name being displayed as below.

4. Drag the required source from the navigator to the workspace. When you
drag the source you can see source qualifier connected to source by
default.


5. Drag the target to the workspace. You can see the view as below.

6. Create links between the source qualifier and the target.
i) You can do this by Autolink (right-click on the panel and click Autolink).
ii) Or link manually: click on a field in the source qualifier, drag without releasing the button, place the pointer on the target field, and release the button. You can see that the new link has been created. The figure below shows the complete mapping.


Figure showing a mapping after creating links.


7. Click Validate in the Mappings menu bar to validate the mapping. You can see the notification in the Validate tab of the output window. Save the repository.

Note:
When the mapping is valid you can continue with session creation. The mapping we have created is a simple pass-through mapping (direct mapping); we do not perform any operations or functions on the data flow. We are just loading the target with source data.
Still, when we bring the source into the mapping we get a Source Qualifier transformation by default, because whatever the source type may be (Oracle, SQL Server, DB2, flat file), Informatica converts the source datatypes to (Informatica) transformation datatypes.
CREATING SESSIONS:


Before creating sessions:
i) We need to register the server in the repository.
ii) We need to create connection strings.

STEPS FOR REGISTERING A POWERCENTER SERVER


Aim: To register a new server.
Step 1: Open the Informatica Workflow Manager and connect to repository and the
folder.
Step 2: In the menu bar choose Server -> Server Configuration. The Server Object Browser dialog box appears as below. Click New... to add a new server. The Server Editor dialog box appears as shown in Figure 5.

Step 3: Configure the server settings as given below.


Server Name: incdxpp025 (give the name as you gave in the Repository Console).
Host Name or IP address: incdxpp025 (give the machine which will be acting as the PowerCenter Server) and click Resolve Server. You can see the IP appear below Resolved IP Address. Note: Whenever you restart your machine, you need to resolve the IP address again and then start the services.
Advanced -> Server Variables -> $PMRootDir: use the PowerCenter Server installation directory as the root directory, C:\Program Files\Informatica PowerCenter 7.1\Server. You can see the filled Server Editor dialog box in the figure below. Click OK; you can see the newly registered server in the Server Object Browser.


Figure 5
Starting the Informatica services:
Run -> services.msc -> Informatica.
Right-click on the Informatica service (listed above the Informatica Repository service) and click Start. Refresh to check.
Note: After you start the PowerCenter Server, the Services dialog box displays the status as Started. Due to errors, the status may automatically change to Stopped.
If so, check the following:
i) Check that the Repository Server has started. If your repository database is on your local system, check that your MS SQL Server services have started.
ii) Go to the Informatica server configuration in the Workflow Manager and click Resolve Server. As we do not have a static IP, we need to resolve the IP each time we restart the machine.
iii) You can set the services to Automatic mode to start them automatically.


SETTING UP A RELATIONAL DATABASE CONNECTION

AIM: To create a relational connection string for the source database Northwind of MS SQL Server.
Step 1: Connect to repository in the workflow manager. In Menu Choose
Connections -> Relational

Step 2: Click New; you can see the Subtype window. Choose Microsoft SQL Server and click OK. The Connection Object Definition dialog box appears.


Step 3: Fill in the box with the following details.


Name: Northwind (name used to refer to the database in the Workflow Manager).
User name: bit (user name for the database).
Password: bit (password used to connect to the database).
ATTRIBUTES:
Database name: Northwind (name of the database).
Server name: incdxpp025 (server on which the database is located).
Domain name: Systechin (name of the domain).

After filling in the box with the required details, click OK. You can see the newly created connection string in the objects list of the Relational Connection Browser.
Create one more connection string for the target database BasicTraining as above.
You have to create connection strings for every database you use as a source or target. To create connection strings for other database types, refer to the help index.

CREATING SESSION:


AIM: To create SESSION task.


STEP 1: Choose Tasks(Menu) -> Create as shown below.

STEP 2: You can see the Create Task dialog box prompting for the task type and a name for the task. Select the task type as Session, give a name for the task, and click Create.

STEP 3: It will show the list of mappings as shown below. Currently it will show only one mapping. Choose one and click OK. You can see the session in the Task Developer panel. Click Done.

Shortcut for CREATING SESSION:


You can also create a session as follows: click the Session icon in the toolbar and click in the Task Developer panel. It will prompt you to select a mapping. Select the mapping and click OK.

You can see the Session in the Task Developer panel.


CONFIGURING CONNECTION STRINGS OF SESSION:
Aim: To specify connection strings for sources and targets.

Double-click on the SESSION and choose the Mapping tab. In the navigator choose Sources -> SQ_Region and select the Northwind connection. Similarly choose Targets -> Dim_Region and select BasicTraining as the connection. Enable the Truncate option to truncate the target table before loading. Click Apply. Save the repository.
Note:
Sessions created in the Task Developer are reusable tasks. We can use the same session in multiple workflows. We can even create a session in the Workflow Designer or


Worklet Designer directly; such a session is non-reusable, i.e. it can be used only by that workflow or worklet.
CREATING THE WORKFLOW:
AIM: To create a workflow.
STEP 1: (Creating Workflow). Click Workflows(menu) -> Create

WORKFLOW General tab: Give the name of the workflow, choose the server to use, and enable the Tasks Must Run on Server option. Click OK.
You can see that the workflow has been created with a Start task. The figure below shows the workflow with its name.


STEP 2: Adding Session to workflow.

Click the required sessions in the Sessions folder of Repository Navigator and drag
the object to the Workflow Designer panel and release the mouse button.
STEP 3: Create a link between the Start task and the session. Use Link Task in the Tasks menu, or use the link icon in the toolbar.

i) Select the Link Task tool.


ii) Click the first task (the Start task) you want to connect and drag to the second task, then release the mouse button. You can see the link between the two tasks as shown below.

WORKFLOW in Workflow Designer.


Shortcut for creating a workflow:
Drag the session from the navigator and drop it in the empty Workflow Designer. It will prompt for the workflow name. Give the workflow name, set the server, and enable the Tasks Must Run on Server option. Click Apply and save.
RUNNING A WORKFLOW
1. Connect to Repository in workflow manager and connect to folder.
2. Right-click on the workflow in the navigator and click Start, or drag the workflow to the Workflow Designer and click the Start Workflow button in the menu bar.

3. Open the Workflow monitor and see the status of the running workflow with
Task view, as shown below.

Running status of the task in Workflow Monitor.


4. After the task completes, the status will turn to Succeeded, as shown below.


5. You can double-click on a succeeded session to see the detailed status of the task, including source success rows and target success rows. You can switch to Transformation Statistics to view more statistical details.

Note: You can look into the session log for more details. To see the session log, click the session in the panel and click the Session Log icon shown below. You can see the session log open in WordPad.

Open the SQL Query Analyzer and query the target table. Check for the availability of data in the target table.


Evaluating the Results:

ex_2: Load the data from the flat file src_ft_sales_office to the table Dim_sales_office in the target database.
Note: Create the table Dim_sales_office in the target database using the Generate/Execute SQL option of the Targets menu in the Warehouse Designer.


STEP 1: Import the source file structure in the Source Analyzer (we have done this already).
STEP 2: Create the target structure in the Warehouse Designer and generate the structure in the target database using the Generate/Execute SQL option of the Targets menu (we have done this already).
STEP 3: Create a direct mapping with the flat file as source and the relational table as target.
STEP 4: Create a non-reusable session in the workflow wf_Source_Qualifier and configure the target with the required relational connection string (BasicTraining).
STEP 5: Configure the flat file source with the source file location and name.

Properties:
Source file directory: $PMSourceFileDir\ (the server variable directory we set while registering the server in the Workflow Manager. Note: we can instead specify a source file folder path, e.g. C:\Program_files\Source, if the source file is stored in that folder).
Source filename: src_ft_sales_office (name of the source file with extension, .csv or .txt and so on).
Source filetype: Direct.

Click Set File Properties to set the file properties. You can see the Flat Files dialog box as below.

If you open the file you can see the data below, in which the columns are delimited (separated) by commas.
SalesOffice,Region,Country
0001,NW,USA
0002,NE,USA
0031,SW,Finland
Choose Fixed Width if the file is of fixed-width format. An example of the same file is shown below:
SalesOffice Region Country
0001        NW     USA
0002        NE     USA
0031        SW     Finland

Click Advanced:
You can see the Delimited File Properties dialog box appear as shown below. If the delimiter is a comma, choose ","; otherwise choose the appropriate delimiter.
If the column values are enclosed in double or single quotes, choose Single or Double under Optional Quotes; if not, choose None. Click OK and Apply. Save the repository.
An example of a file with values enclosed in double quotes:
"SalesOffice","Region","Country"
"0001","NW","USA"
"0002","NE","USA"
"0031","SW","Finland"


Set Number of Initial Rows to Skip to 0. You can skip the first n rows using this option.
If an escape character is specified here, the PowerCenter Server reads a delimiter character that follows the escape character as a regular character.
STEP 6: Create a link between the Start task and the newly created session.
STEP 7: Right-click on the session and click Start Task to start the current session alone.

STEP 8: Monitor the task status in workflow monitor and validate the results.
Evaluation Process:

Note: We have created a non-reusable session. We can make it reusable by enabling the Make Reusable option in the session properties.


When you enable the option it prompts as below. Click Yes to make it reusable.

If we make a session reusable we cannot revert it. Based on the requirement, we need to decide whether to create reusable or non-reusable tasks.

ex_3: Load the data from the flat files src_ft_sales_office.txt and src_ft_sales_office_2.txt to another flat file trg_ft_sales_office, using the INDIRECT source type to extract the source data.
Note: Create a folder Info_exe_files in your C drive and set this location as the target file location. Use the target definition you created before.
STEP 1: Create the pass-through mapping with src_ft_sales_office as source.
STEP 2: Create the session and configure it with the target file location and name.
STEP 3: Create a file Sales_Office.txt and save the locations and names of the source files in it.
STEP 4: Configure the session.
INDIRECT source type: a file that lists the locations and names of source files that share the same structure. In this exercise the source file is src_ft_sales_office.txt. Duplicate the file with the name src_ft_sales_office_2.txt. Create a file named Sales_Office.txt containing the source files' locations and names as below.
C:\Program Files\Informatica PowerCenter 7.1\Server\SrcFiles\src_ft_sales_office.txt
C:\Program Files\Informatica PowerCenter 7.1\Server\SrcFiles\src_ft_sales_office_2.txt

In the session, give the source filename as Sales_Office.txt, its location in Source File Directory, and the Source Filetype as Indirect.


The PowerCenter Server reads data from the source files src_ft_sales_office.txt and src_ft_sales_office_2.txt at the locations specified in the Sales_Office.txt file. Similarly, we can list any number of files and their locations in the Sales_Office.txt file. Set the file properties. Save.
STEP 5: Create the link between the Start task and the newly created task. Save. Run the new session alone. Check the status and check for the file in the target. Validate the results.
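The indirect-source mechanism can be mimicked outside Informatica: the list file holds one source-file path per line, and the reader concatenates the rows of every listed file. A minimal Python sketch under that assumption (temporary files stand in for the SrcFiles folder; this illustrates the idea, not PowerCenter's implementation):

```python
import tempfile
from pathlib import Path

# Build two small source files with the same structure, plus a list file.
tmp = Path(tempfile.mkdtemp())
(tmp / "src_ft_sales_office.txt").write_text("0001,NW,USA\n")
(tmp / "src_ft_sales_office_2.txt").write_text("0031,SW,Finland\n")

list_file = tmp / "Sales_Office.txt"          # the "indirect" file
list_file.write_text(
    f"{tmp / 'src_ft_sales_office.txt'}\n"
    f"{tmp / 'src_ft_sales_office_2.txt'}\n"
)

# Indirect read: each line of the list file is a path to a real source file;
# the rows of all listed files are concatenated into one data set.
rows = []
for line in list_file.read_text().splitlines():
    rows.extend(Path(line).read_text().splitlines())

print(rows)  # ['0001,NW,USA', '0031,SW,Finland']
```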
ex_4: Load the data from Employees to Dim_Sales_Rep (the table structure should contain EmployeeID, Last_name, First_name, and Title, with datatypes similar to the Employees table). Extract only employees with the title "Sales Representative" and sort the output by first name.
STEP 1: Create the mapping using the Employees table as source and Dim_Sales_Rep as target (create it manually and generate it in the database using Generate/Execute SQL).
STEP 2: Double-click on the Source Qualifier transformation and select the Properties tab. We need to set two properties:
i) the where condition Employees.Title='Sales Representative'
ii) sorting by first name.


STEP 3: Setting the where clause. Click Source Filter; you can see a SQL editor. Give the required where condition. You can insert columns from the Ports tab (by double-clicking on them). After giving the required condition, click OK.

Setting the where clause.


STEP 4: Sorting. Give the number of columns to sort in the Number of Sorted Ports option. Set the order of the columns in the Ports tab, as shown below.

Before ORDERING the PORTS


We need to sort by FirstName, so bring FirstName to the top using the arrow in the right corner.
Note: If we need to order the result set by FirstName, LastName, TitleOfCourtesy, then we need to bring FirstName to the top, followed by LastName and TitleOfCourtesy.
Click Apply and OK. Save.


After ORDERING The PORTS.


STEP 5: Create the non-reusable session in the workflow and run the workflow. Check the status in the Workflow Monitor.
STEP 6: Evaluate the results. Create source and target queries and compare the results using an Excel sheet.
Note: Go to the SQL Query option of the Properties tab and click Generate SQL after making the required modifications. You can see the query, shown below, that will be applied in the database to retrieve data. We cannot apply filtering, ORDER BY, or any other functions to a flat file source.

We can also do the inverse, known as SQL Override: link the corresponding ports, paste the custom query, and validate it against the database using the Validate option. If the query is valid, click OK.
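The Source Filter and Number of Sorted Ports properties end up as a WHERE clause and an ORDER BY in the query the Source Qualifier generates. A rough sqlite3 sketch of the equivalent SQL for this exercise (the sample rows are invented for illustration; this is not the exact query PowerCenter emits):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE Employees (
    EmployeeID INTEGER, LastName TEXT, FirstName TEXT, Title TEXT)""")
conn.executemany(
    "INSERT INTO Employees VALUES (?, ?, ?, ?)",
    [(1, "Davolio", "Nancy", "Sales Representative"),
     (2, "Fuller", "Andrew", "Vice President, Sales"),
     (3, "Leverling", "Janet", "Sales Representative")])

# Source Filter -> WHERE clause; Number of Sorted Ports = 1 with FirstName
# on top -> ORDER BY FirstName.
rows = conn.execute("""
    SELECT EmployeeID, LastName, FirstName, Title
    FROM Employees
    WHERE Title = 'Sales Representative'
    ORDER BY FirstName
""").fetchall()

for r in rows:
    print(r)
# Janet Leverling sorts before Nancy Davolio; Andrew Fuller is filtered out.
```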


ex_5: Load the data from Employees to Dim_Emp_Loc (the table structure should contain EmployeeID, Last_name, First_name, Title, City, and Address, with datatypes similar to the Employees table). Extract only employees residing in London. Use SQL Override.
STEP 1: Create a target structure in Warehouse designer with required fields &
generate in target database.
STEP 2: Create a Mapping with source as Employees and Target as Dim_Emp_Loc.
STEP 3: Create a Custom Query to retrieve data from source table.
STEP 4: Go to the SQL Query option of the Properties tab, paste the query you created, and validate it against the source database using the corresponding source connection, user name, and password.

If the query is valid against the source database, a prompt saying "No errors detected" will appear, as below. Click OK and save.

STEP 5: Create Session for the mapping in the workflow w_Source_Qualifier. Run the
session and monitor the status. Validate the results.
Note: The order of the fields in the select list of the query must match the order of the output links in the Source Qualifier.


We have seen enough about creating mappings and working with sources, targets, sessions, and workflows. The following exercises will cover only working with the transformations.
Expression Transformation:
ex_1: Extract records from Employees table as shown in sample below.
Create the table with required data types.
Table name: Dim_Emp_Detail
(
Full_name    sample record: Mr. Robert King (TitleOfCourtesy + First_name + Last_name)
Designation  sample record: Sales Representative (Title)
Location     sample record: London, UK (City, Country)
)
Note: Use Expression Transformation to get the results.
STEP 1: Create a target structure in the warehouse designer with fields as below,
and generate the target in target database.

The datatypes should match the requirement. Full_name, for example, is formed from three columns and a space, as shown in the sample record; its precision is derived by adding the precisions of the three fields plus one for the space. Follow the same approach in the following exercises.
STEP 2: Create a mapping with source and target.
STEP 3: Create an Expression transformation.

Click on the icon shown above and click in the Mapping Designer; you can see an Expression transformation as shown below. We can create ports manually, or we can bring them in from the prior transformation (just drag a field and drop it in the Expression transformation).

STEP 4: Bring the required columns to Expression transformation from source


qualifier. As shown below. These are the required columns to be formatted.


STEP 5: Create the output ports that will hold the results of the required operations. For these ports only the O (output) option is enabled; input connections cannot be linked to them.

STEP 6: Configure the ports with functions and operators.


Click the required port under the Expression column to open the Expression Editor. Fill it with the required functions and field names, just as in the SELECT clause of a SQL query.

Click OK and Configure the other two ports.
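The logic configured in the three output ports can be sketched in Python (illustrative only — in PowerCenter these are expressions in the Expression Editor; the column names come from the Northwind Employees source):

```python
def emp_detail(title_of_courtesy, first_name, last_name, title, city, country):
    """Mimics the three Expression output ports of Dim_Emp_Detail."""
    full_name = f"{title_of_courtesy} {first_name} {last_name}"  # e.g. 'Mr. Robert King'
    designation = title                                          # e.g. 'Sales Representative'
    location = f"{city}, {country}"                              # e.g. 'London, UK'
    return full_name, designation, location
```

Calling `emp_detail("Mr.", "Robert", "King", "Sales Representative", "London", "UK")` reproduces the sample record from the exercise.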


Output ports after configuring.


Apply the changes. Above figure shows the configured output ports.
STEP 7: Link the required output ports of expression transformation to the target.

STEP 8: Save the repository, create a non reusable session, a new workflow named
wf_Expression_Transformation, load the target table and validate the results.
Follow the same methodology for the following exercises.
ex_2: Extract records from Order_Details table to Dim_Sales with
following structure.
Create the table with required data types.
Table name: dim_sales
(
Order id
Sales     Calculate using formula (Unit Price*Quantity)-(Unit Price*Quantity*Discount)
Discount  Populate Y if Discount != 0 and N if Discount = 0
)
Note: Use an Expression transformation to get the results. While configuring the Discount port, use the built-in functions in the Functions tab of the transformation, as shown in the figure.
Search the Informatica help index for the syntax of all the functions listed in the tab.
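The two port expressions amount to the following logic, sketched here in Python (the PowerCenter equivalent of the flag is IIF(DISCOUNT != 0, 'Y', 'N')):

```python
def sales_row(unit_price, quantity, discount):
    # Sales = (UnitPrice*Quantity) - (UnitPrice*Quantity*Discount)
    sales = (unit_price * quantity) - (unit_price * quantity * discount)
    # Discount flag: 'Y' if any discount was given, 'N' otherwise
    flag = 'Y' if discount != 0 else 'N'
    return sales, flag
```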


Configuring the port using inbuilt functions.


To learn more about the built-in functions, search for the phrase "Function Categories" in the Informatica help.

ex_3: Extract Product list from Products table. Create the table with
required data types.
Table name: Dim_Prod_Class
(
ProductName
Class  (if UnitPrice > 100 populate A; if UnitPrice > 50 and UnitPrice < 100 populate B; if UnitPrice < 50 populate C)
QuantityPerUnit
)
Allow only the rows having the Discontinued column of the Products table equal to zero.
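The Class port is a nested IIF, e.g. IIF(UNITPRICE > 100, 'A', IIF(UNITPRICE > 50, 'B', 'C')). A Python sketch of the same logic (note the spec leaves UnitPrice exactly 50 or 100 undefined; this sketch puts 100 in 'B' and 50 in 'C'):

```python
def price_class(unit_price):
    # Nested-IIF classification by price band
    if unit_price > 100:
        return 'A'
    if unit_price > 50:
        return 'B'
    return 'C'
```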
ex_4: Extract a supplier list having the supplier's company name, contact name and fax.
If a supplier's Fax is null, populate NO FAX. Create the target structure named tgt_ft_Sup_Contacts.
Note: Create the list in a flat file named supplier_fax, in the folder location mentioned before.

In the Default value option we can provide values for handling null values. Validate the expression using the validate mark next to it.
We can also handle null values by creating another port and using the built-in functions IIF and ISNULL, e.g. IIF(ISNULL(Fax), 'NO FAX', Fax).
Try both ways.
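The IIF/ISNULL approach maps directly to the following Python sketch (None stands in for a database NULL):

```python
def fax_or_default(fax):
    # PowerCenter expression: IIF(ISNULL(FAX), 'NO FAX', FAX)
    return 'NO FAX' if fax is None else fax
```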


ex_5: Extract the following record format from customer table in a flat file
named customer_contact and target structure named
tgt_ft_cust_contacts. Extract only for Germany.
The record should look like below format:
Maria Anders
Alfreds Futterkiste
Obere Str. 57
Berlin
Germany- 12209
Column details:
ContactName
CompanyName
Address
city
country-postalCode
Get all the required ports in the Expression transformation and create a new output port. Create an expression using the CHR() function to get the above formatted output.
Note: Create the flat file target with a single port and use the logical expression to get the above format.
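The single-port record is built by concatenating the columns with CHR(10) (linefeed). A Python sketch of the same concatenation, using the sample's "Germany- 12209" format for the last line:

```python
def contact_record(contact_name, company_name, address, city, country, postal_code):
    # chr(10) is the linefeed character, analogous to CHR(10) in the
    # PowerCenter expression language
    nl = chr(10)
    return (contact_name + nl + company_name + nl + address + nl +
            city + nl + country + '- ' + postal_code)
```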

Filter Transformation:


ex_1: Load the data from Employees to Dim_Sales_Rep (use the structure and table created before). Extract only employees with Title 'Sales Representative' and sort the output by first name.
Note: Use a Filter transformation to extract the data. The table is already loaded with the data of ex_4 of the Source Qualifier transformation. Use the truncate option to truncate the table before loading the data.
STEP 1: Open the mapping m_SQ_ex_4, using Copy As.. Option of Mappings tab
create mapping named m_FLT_ex_1.

Click OK.
STEP 2: Alter the mapping as below
i)
ii)

Remove the filter in the source qualifier


Add a filter transformation and configure.

Adding a Filter transformation:


Use the icon shown below to create the Filter transformation.

Newly created transformation will appear as below. Drag the required ports from the
source qualifier transformation to filter transformation.


Open the Properties tab of the Filter transformation, give the filter condition as Title='Sales Representative' in the Filter Condition editor and click OK.

STEP 3: Link the Filter transformation to the target. Validate the mapping and create a session in the new workflow named wf_FLT_ex_1. Run and validate the results.
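A Filter transformation simply drops every row that fails its condition; the effect can be sketched in Python as:

```python
def filter_sales_reps(rows):
    # Rows failing the condition are dropped, like a Filter transformation
    # with the condition Title = 'Sales Representative'
    return [r for r in rows if r["Title"] == "Sales Representative"]

employees = [
    {"FirstName": "Nancy",  "Title": "Sales Representative"},
    {"FirstName": "Andrew", "Title": "Vice President, Sales"},
]
```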
ex_2: Extract records from Order Details table to Dim_Sales with the
table structure and logic used in ex_2 of expression transformation.
Use filter to load sales data with discount, neglect the rest.
Copy the mapping m_EXP_ex_2 as m_FLT_ex_2 and make necessary changes and
proceed.
ex_3: Load customer data from the Customers table to dim_cust_clean without any null values in any column, and data with null values in any column into dim_cust_bad in the target database.
Create dim_cust_clean and dim_cust_bad with structures similar to the Customers table.
Note: Use Filter transformation to extract the data.
STEP 1: Creating a mapping, drag the source and targets to the mapping.
Add a Filter transformation, drag all the columns from the source qualifier to the filter, and name the filter FLT_Cust_Clean. Add the condition required to pass only rows with no null values.


STEP 2: Add another filter and rename it as FLT_Cust_bad and get all the columns
from source qualifier to filter, and add condition to filter records with null values.
STEP 3: Link FLT_Cust_Clean with target dim_cust_clean and FLT_Cust_bad with
target dim_cust_bad. Validate the mapping save, create session, run and validate the
results.
ex_4: Extract the stock required to fulfil the orders from the Products table to the Prod_delivery file in the target folder mentioned earlier.
Note: Check column UnitsOnOrder, to find order status.
Target File_structure:
(
ProductName
QuantityPerUnit
SupplierID
CategoryID
RequiredUnits  Calculate (UnitsInStock-UnitsOnOrder); neglect records if orders can be satisfied
Amount         RequiredUnits*UnitPrice
)
ex_5: Achieve the query result.
SELECT
OrderID, CustomerID, EmployeeID, OrderDate, RequiredDate, ShipVia
FROM
Orders
WHERE
(ShippedDate IS NULL)
AND (ShipVia = 3)
Load the data in a file named Convey_Via3.txt


Router Transformation:
ex_1: Load customer data from the Customers table to dim_cust_clean without any null values, and data with any null values into dim_cust_bad in the target database.
Create dim_cust_clean and dim_cust_bad with structures similar to the Customers table.
Note: Use a Router transformation to route the data. Use the truncate option to truncate the tables.
Whenever a requirement would otherwise call for more than one Filter transformation, we can use a single Router transformation instead.
STEP 1: Create the mapping with the source, the two targets and a Router transformation. Get all fields from the source qualifier to the Router transformation.
STEP 2: Click Groups tab of router transformation and create a group and name it as
CleanRecords, using Add group option shown below. In the Group Filter Condition add
the condition to filter records with no null values.

STEP 3:
Link the fields of group CleanRecords of the Router with the target structure
Dim_Cust_Clean, and the group Default with the target structure Dim_Cust_bad as
shown in the figure below.
Validate the mapping, create the session in the workflow wf_Router_Transformation,
run and evaluate the results.
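The Router's one-input/many-groups behaviour can be sketched in Python — one pass over the rows, a CleanRecords group plus the built-in Default group:

```python
def route_customers(rows):
    # One input pipeline, two output groups, like a Router with a
    # 'CleanRecords' group condition plus the Default group
    clean, bad = [], []
    for row in rows:
        if all(v is not None for v in row.values()):  # group filter condition
            clean.append(row)                         # -> dim_cust_clean
        else:
            bad.append(row)                           # Default -> dim_cust_bad
    return clean, bad
```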


Figure: Showing the view of router connected with targets.


ex_2: Achieve the following result sets using Router Transformation.
SELECT
OrderID, CustomerID, EmployeeID, OrderDate, RequiredDate, ShipVia
FROM
Orders
WHERE
(ShippedDate IS NULL)
AND (ShipVia = 3)
Note-Load the data in a file named Convey_Via3
SELECT
OrderID, CustomerID, EmployeeID, OrderDate, RequiredDate, ShipVia
FROM
Orders
WHERE
(ShippedDate IS NULL)
AND (ShipVia = 2)
Note-Load the data in a file named Convey_Via2
SELECT
OrderID, CustomerID, EmployeeID, OrderDate, RequiredDate, ShipVia
FROM
Orders
WHERE
(ShippedDate IS NULL)
AND (ShipVia = 1)
Note-Load the data in a file named Convey_Via1
Use Router transformation to extract data and write in to files.


ex_3: Create dim_emp_uk and dim_emp_usa tables with structure similar to Employees.
Load the corresponding data from the Employees table to the target tables.
Note: Handle all the nulls in the tables with the corresponding replacement data.
Data Type    Replacement
CHARACTER    X
NUMERIC      -1
DATE         01-01-2500

Hint: Use an Expression transformation to handle the null values, and then route the values based on the conditions specified.
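The null-replacement stage of that hint can be sketched in Python, keyed on the column's data type per the table above (the datetime default stands in for 01-01-2500):

```python
from datetime import datetime

# Replacement defaults per data type, as agreed in the exercise
REPLACEMENTS = {str: 'X', int: -1, float: -1, datetime: datetime(2500, 1, 1)}

def fill_null(value, col_type):
    # Expression-transformation stage: substitute the default for NULL
    # before routing the rows by country
    return REPLACEMENTS[col_type] if value is None else value
```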

Aggregator Transformation:


ex_1: Extract records from the Order Details table to Dim_Sales_Order with the following structure.
Create the table with required data types.
Table name: Dim_Sales_Order
(
Order id
Sales  Calculate using formula SUM((Unit Price*Quantity)-(Unit Price*Quantity*Discount))
)
Note: Get the result aggregated to Order id. Use Aggregator Transformation.
STEP 1: Create the mapping with required source and targets.
STEP 2: Create the Aggregator transformation. Get the required ports and create an output port for calculating the Sales value, as shown below.

Port structures and Group by options.


STEP 3: Enable the Group By option for OrderID, as the requirement is to aggregate by OrderID only. Open the Expression Editor for the Sales column and use the built-in aggregate functions and operators to build the expression; validate it as shown below.

Expression Editor.
Link the port of Aggregator and target structure. Validate the mapping, create the
session in a new workflow wf_Aggregator_Transformation. Run and validate the
results.
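The group-by-and-sum performed by the Aggregator can be sketched in Python (column names follow the Northwind Order Details source):

```python
from collections import defaultdict

def sales_by_order(order_details):
    # Group by OrderID and SUM((UnitPrice*Quantity)-(UnitPrice*Quantity*Discount)),
    # like an Aggregator with Group By enabled on OrderID only
    totals = defaultdict(float)
    for d in order_details:
        line = d["UnitPrice"] * d["Quantity"]
        totals[d["OrderID"]] += line - line * d["Discount"]
    return dict(totals)
```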

ex_2: Find the number of units ordered and their worth, based on the category of the product, from the Products table. Populate the results in dim_Order_cat.


Table Structure: Dim_Order_Cat
(
CategoryID
Units_Ordered  SUM(UnitsOnOrder)
Amount         SUM(UnitPrice*UnitsOnOrder)
)
Group by CategoryID.
ex_3: Based on Supplier ID load the max (unit_price), min (unit_price) and
avg (unit_price) of products.
Table_name: dim_price_analyse
(
SupplierID
MAX_PRICE  MAX(UnitPrice)
MIN_PRICE  MIN(UnitPrice)
AVG_PRICE  AVG(UnitPrice)
)

Note: Use the built-in functions in the Aggregator folder of the Functions tab for MAX, MIN, etc.
ex_4: Extract records from Order Details table to dim_prod_sales with
following structure.
Create the table with required data types.
Table name: Dim_Prod_Sales
(
ProductID
Sales
(Calculate using formula SUM ((Unit Price*Quantity)-(Unit Price*Quantity*Discount))
)

ex_5: Load the freight expense given for each ship from orders table.
Table_name: Dim_Freight
(
ShipName
Freight_Expense
SUM (Freight)
)

Joiner Transformation:
Note: Use fixed width format for all flat file targets. Write column names to target
files.


ex_1: Prepare a stock list of products which are continued (refer to the Discontinued flag), with the following details.
Target Table: dim_stock_list
(
CategoryName
ProductName
QuantityPerUnit
UnitsInStock
Discontinued
)
Using Categories & Products tables.
STEP 1: Create the mapping with the required sources and target. Create the Joiner transformation and get the required columns from the two source qualifiers to the joiner (a Joiner can accept fields from only two sources, not more).

STEP 2: Go to the Conditions tab of the Joiner transformation and create the join condition using the "Add a new condition" icon. Select the ports for the condition as shown below.


STEP 3: Go to the Properties tab. Check the Join Type option and set it based on the requirement.

You can check the Ports tab to see the master and detail groups.

A Normal join returns rows only when the join condition is met. A Full Outer join retrieves matched as well as unmatched rows from both the master and detail data sets. A Detail Outer join keeps all records from the master source plus the matching rows from the detail source; a Master Outer join keeps all records from the detail source plus the matching rows from the master source.
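A toy Python model of the Join Type property may make the naming easier to remember (Informatica naming: a MASTER outer join keeps all DETAIL rows, a DETAIL outer join keeps all MASTER rows):

```python
def join(master, detail, key, join_type="normal"):
    # Index master rows by join key
    index = {}
    for m in master:
        index.setdefault(m[key], []).append(m)
    out, matched = [], set()
    for d in detail:
        hits = index.get(d[key], [])
        for m in hits:
            out.append({**m, **d})          # matched pair
            matched.add(id(m))
        if not hits and join_type in ("master", "full"):
            out.append(d)                   # unmatched detail row kept (master columns would be NULL)
    if join_type in ("detail", "full"):     # unmatched master rows kept
        out += [m for ms in index.values() for m in ms if id(m) not in matched]
    return out
```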
STEP 4: Continue with the process, and evaluate the results.


Testing Query:
SELECT C.CategoryName, P.ProductName, P.QuantityPerUnit, P.UnitsInStock,
P.Discontinued
FROM
Categories C JOIN
Products P
ON C.CategoryID = P.CategoryID
WHERE
(P.Discontinued <> 1)
ex_2: Find the sales of each product in the year 1997 (shipped date year should be 1997). Extract the data to Dim_Sales_97.
Table name: Dim_Sales_97
(
CategoryName
ProductName
ProductSales
SUM ((D.UnitPrice * D.Quantity) * (1 - D.Discount))
)
The target table's sample data:

CategoryName   ProductName     ProductSales
Meat/Poultry   Alice Mutton    16580.850044250488
Condiments     Aniseed Syrup   1724.0

STEP 1: Create the mapping with required source and targets.


STEP 2: Create a joiner and join Category and Products table using the join condition
specified in the testing query.
STEP 3: Create another joiner and join the output of the first joiner with the Orders
table.
STEP 4: Get the output of the second joiner to a Filter transformation, to retrieve only rows whose ShippedDate year is 1997.
Note: Adding the filter transformation immediately after the Orders table and then linking to the joiner will improve performance. You can even add the filter condition in the source qualifier.
STEP 5: Add the aggregator and apply group by conditions. Link the output of
aggregator to the target. Validate the mapping and continue with the process
Evaluate the result using the testing query.


Testing Query:
SELECT C.CategoryName, P.ProductName,
SUM ((D.UnitPrice * D.Quantity) * (1 - D.Discount)) AS ProductSales
FROM Categories C JOIN Products P
ON C.CategoryID = P.CategoryID
JOIN Orders O JOIN [Order_Details] D
ON O.OrderID = D.OrderID
ON P.ProductID = D.ProductID
WHERE (O.ShippedDate BETWEEN '19970101' AND '19971231')
GROUP BY C.CategoryName, P.ProductName
ex_3: Prepare an order list to suppliers to fulfil the stock requirement from the Products table based on the condition given below.
Note: Check column UnitsOnOrder, to find order status. Neglect records if orders can
be satisfied.
Route Supplier with Fax to Dim_Order_with_Fax & supplier without fax to
Dim_Order_with_Phone
Target File_structure: Dim_Order_with_Fax
(
ProductName
QuantityPerUnit
CompanyName
ContactName
Fax
CategoryID
RequiredUnits  Calculate (UnitsInStock-UnitsOnOrder); neglect records if orders can be satisfied
Amount         RequiredUnits*UnitPrice
)
Target File_structure: Dim_Order_with_Phone
(
ProductName
QuantityPerUnit
CompanyName
ContactName
Phone
CategoryID
RequiredUnits  Calculate (UnitsInStock-UnitsOnOrder); neglect records if orders can be satisfied
Amount         RequiredUnits*UnitPrice
)

STEP 1: Create the mapping with the required sources and targets. Add the required filter in the source qualifier transformation if possible.
STEP 2: Create a Joiner transformation to join the two tables. Get the output of the joiner transformation to an Expression transformation.


We are calculating RequiredUnits and Amount using the formulas mentioned in the target structures. The expression UnitsInStock-UnitsOnOrder appears in both calculations; instead of computing it twice we can create a variable port and use the variable in both expressions.

To create the variable, create a new port and enable the V (variable port) check box. Open the Expression Editor and give the required expression. This variable can be used only within this transformation.

Using the transformation variable: in the Ports tab you can see the newly created variable, as shown above. You can use it like a port in the expressions.
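The variable-port idea can be sketched in Python — compute the shared subexpression once, then reference it in both output expressions (the ABS matches the testing queries further below):

```python
def order_line(units_in_stock, units_on_order, unit_price):
    # v_required plays the role of the variable port: evaluated once,
    # then referenced by both output-port expressions
    v_required = units_in_stock - units_on_order
    required_units = abs(v_required)          # RequiredUnits output port
    amount = required_units * unit_price      # Amount output port
    return required_units, amount
```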

STEP 3: Use Router transformation to split up the records for targets Fax and Phone.
Create the required groups & group conditions in the transformation.


Iconic view of the mapping.


Testing Query:
Dim_Order_with_Fax
SELECT P.ProductName, P.QuantityPerUnit, CompanyName, ContactName,
Fax, P.CategoryID, ABS ((P.UnitsInStock-P.UnitsOnOrder)) RequiredUnits,
(ABS ((P.UnitsInStock-P.UnitsOnOrder))*P.UnitPrice) Amount
FROM Products P JOIN Suppliers S
ON P.SupplierID=S.SupplierID
WHERE (P.UnitsInStock-P.UnitsOnOrder) <0
AND
S.Fax IS NOT NULL
ORDER BY P.ProductName
Dim_Order_with_Phone
SELECT P.ProductName, P.QuantityPerUnit, CompanyName, ContactName,
Phone, P.CategoryID, ABS ((P.UnitsInStock-P.UnitsOnOrder)) RequiredUnits,
(ABS ((P.UnitsInStock-P.UnitsOnOrder))*P.UnitPrice) Amount
FROM Products P JOIN Suppliers S
ON P.SupplierID=S.SupplierID
WHERE (P.UnitsInStock-P.UnitsOnOrder) <0
AND
S.Fax IS NULL
ORDER BY P.ProductName

ex_4: Calculate sales by employee, for employees whose Title is 'Sales Representative'. Populate the results in a file named SalesByEmployees.csv.
File Structure:


(
FULL_NAME FIRST_NAME+LAST_NAME
TOTAL_SALES SUM ((D.UnitPrice * D.Quantity) * (1 - D.Discount))
)
From Employees, Orders, Order Details.
The target table's sample data:

FULL_NAME          TOTAL_SALES
Margaret Peacock   225763.69595336914
Janet Leverling    202812.84279346466

Testing Query:
SELECT E.FirstName+' '+E.LastName FULL_NAME,
SUM ((D.UnitPrice * D.Quantity) * (1 - D.Discount)) TOTAL_SALES
FROM
Employees E JOIN Orders O
ON
E.EmployeeID=O.EmployeeID
JOIN [Order_Details] D
ON
O.OrderID=D.OrderID
WHERE O.ShippedDate IS NOT NULL
AND
E.Title='Sales Representative'
GROUP BY E.FirstName+' '+E.LastName
ORDER BY TOTAL_SALES DESC
ex_5: Build logic to extract sales for each category by year.
Table name: Dim_Cat_Year
(
CategoryName
Sales_96
Sales_97
Sales_98
)
Use ShippedDate of Order table to extract sales year.

The target table's sample data:

CategoryName     Sales_96            Sales_97            Sales_98
Confections      27257.499817848206  80894.151594161987  56520.400411605835
Dairy Products   36711.369998931885  114749.77032089233  79490.02490234375

Testing Query:
SELECT C.CategoryName,
SUM (CASE
WHEN O.ShippedDate BETWEEN '19960101' AND '19961231' THEN D.UnitPrice
ELSE 0
END* D.Quantity*(1 - D.Discount)) AS SALES_96,
SUM (CASE
WHEN O.ShippedDate BETWEEN '19970101' AND '19971231' THEN D.UnitPrice
ELSE 0
END* D.Quantity*(1 - D.Discount)) AS SALES_97,
SUM (CASE
WHEN O.ShippedDate BETWEEN '19980101' AND '19981231' THEN D.UnitPrice
ELSE 0
END* D.Quantity*(1 - D.Discount)) AS SALES_98
FROM
Categories C JOIN Products P
ON
C.CategoryID=P.CategoryID
JOIN [Order_Details] D
ON
D.ProductID=P.ProductID
JOIN Orders O
ON
O.OrderID=D.OrderID
GROUP BY C.CategoryName
ORDER BY C.CategoryName
Total sales query: Just for comparison (to validate test script).
SELECT C.CategoryName, SUM ((D.UnitPrice * D.Quantity) * (1 - D.Discount))
FROM
Categories C JOIN Products P
ON
C.CategoryID=P.CategoryID
JOIN [Order_Details] D
ON
D.ProductID=P.ProductID
JOIN Orders O
ON
O.OrderID=D.OrderID
WHERE O.ShippedDate IS NOT NULL
GROUP BY C.CategoryName
ORDER BY C.CategoryName
ex_6: Build logic to extract sales for each category by year and the percentage growth between adjacent years.


Note: Copy the mapping of the previous exercise and modify it to get the following results.
File name: Category_Sales_Growth; the target file should be in fixed-width format.
(
CategoryName
Sales_96
Sales_97
96-97 Growth%  (SALES_97-SALES_96)*100/SALES_96
Sales_98
97-98 Growth%  (SALES_98-SALES_97)*100/SALES_97
)
The target table's sample data:

CategoryName     Sales_96            Sales_97            96-97 Growth%   Sales_98            97-98 Growth%
Confections      27257.499817848206  80894.151594161987  196 %           56520.400411605835  -30 %
Dairy Products   36711.369998931885  114749.77032089233  212 %           79490.02490234375   -30 %

Testing Query:
SELECT CategoryName,
SALES_96, SALES_97, CAST (CAST ((SALES_97-SALES_96)*100/SALES_96 AS INT) AS VARCHAR (5)) +' %' AS "96-97 Growth%",
SALES_98, CAST (CAST ((SALES_98-SALES_97)*100/SALES_97 AS INT) AS VARCHAR (5)) +' %' AS "97-98 Growth%"
FROM
(SELECT C.CategoryName AS CategoryName,
SUM (CASE
WHEN O.ShippedDate BETWEEN '19960101' AND '19961231' THEN D.UnitPrice
ELSE 0
END* D.Quantity*(1 - D.Discount)) AS SALES_96,
SUM (CASE
WHEN O.ShippedDate BETWEEN '19970101' AND '19971231' THEN D.UnitPrice
ELSE 0
END* D.Quantity*(1 - D.Discount)) AS SALES_97,
SUM (CASE
WHEN O.ShippedDate BETWEEN '19980101' AND '19981231' THEN D.UnitPrice
ELSE 0
END* D.Quantity*(1 - D.Discount)) AS SALES_98
FROM
Categories C JOIN Products P
ON
C.CategoryID=P.CategoryID
JOIN [Order_Details] D
ON
D.ProductID=P.ProductID
JOIN Orders O
ON
O.OrderID=D.OrderID
GROUP BY C.CategoryName
) TEMP
Lookup Transformation:
ex_1: Prepare a list with CategoryName, CompanyName & ProductName
from Products, Categories, and Suppliers.
Note: Use Lookup transformation to get CompanyName & CategoryName.
The target table's sample data:

CategoryName   CompanyName                  ProductName
Beverages      Aux joyeux ecclésiastiques   Côte de Blaye
Beverages      Aux joyeux ecclésiastiques   Chartreuse verte

STEP 1: Create the mapping with Products as source and Dim_Prod_Company as target.
STEP 2: Create a Lookup transformation with the lookup table as Categories.
CREATING A LOOKUP TRANSFORMATION
STEP 1: Click the Lookup transformation icon in the toolbar and click in the mapping panel.
STEP 2: A prompt for selecting the lookup table will be displayed. You can select the lookup table from the existing source or target definitions, or you can import a new structure using the Import option. Select the required structure and click OK.
We can look up data in a flat file or a relational table, view, or synonym. The lookup performs a left outer join with the source on the left, so all records from the source table are retrieved.
STEP 3: Get the required columns from the source to the Lookup transformation. Create a join condition in the Conditions tab. If multiple conditions are created, the logical operator AND is applied by default. We can use the comparison operators =, >, <, >=, <=, != in lookup conditions, which is not possible in a Joiner transformation. Validate, and take the required fields to the next transformation.


Prompt for selecting Lookup data source.


A Lookup is similar to a Joiner, but differs in the following ways.
TABLE: ORDERS (SOURCE)

OrderID   CustomerID    EmployeeID
10248     VINET         5
10249     TOMSP         6
10250     HANAR         4
10295     VINET         2
10541     HANAR         2
10542     KOENE         1
10737     VINET         2
10738     NewCustomer   1

TABLE: CUSTOMERS (LOOKUP)

CustomerKey   CustomerID   CustomerName
1             VINET        Paul Henriot
2             TOMSP        Karin Josephs
3             VINET        Peter Franklin
4             HANAR        Mario Pontes
5             KOENE        Philip Cramer
6             VINET        Ruther Martyn
7             CACTU        Patricio Simpson
8             CENTC        Francisco Chang

With the above Orders & Customers tables we will see the difference between the Lookup transformation and the Joiner transformation. When we use a joiner with the join condition Orders.CustomerID=Customers.CustomerID(+) to retrieve all rows from the Orders table, the resultant data set is as below.
OrderID   CustomerID    EmployeeID   CustomerKey   CustomerID   CustomerName
10248     VINET         5            1             VINET        Paul Henriot
10248     VINET         5            3             VINET        Peter Franklin
10248     VINET         5            6             VINET        Ruther Martyn
10249     TOMSP         6            2             TOMSP        Karin Josephs
10250     HANAR         4            4             HANAR        Mario Pontes
10295     VINET         2            1             VINET        Paul Henriot
10295     VINET         2            3             VINET        Peter Franklin
10295     VINET         2            6             VINET        Ruther Martyn
10541     HANAR         2            4             HANAR        Mario Pontes
10542     KOENE         1            5             KOENE        Philip Cramer
10737     VINET         2            1             VINET        Paul Henriot
10737     VINET         2            3             VINET        Peter Franklin
10737     VINET         2            6             VINET        Ruther Martyn
10738     NewCustomer   1            (null)        (null)       (null)

In the above case, CustomerID VINET of the Orders table has multiple matches in the Customers table, and the joiner retrieves all of them, as shown above.
A Lookup, in contrast, will not return all the multiple-match rows. For handling the multiple-match case we are provided with the option "Lookup policy on multiple match" in the Properties tab.

Taking the case of OrderID 10248 as an example:
if we select "Use First Value", only the customer record with CustomerKey=1 is selected;
if we select "Use Last Value", only the customer record with CustomerKey=6 is selected;
if we select "Report Error", the corresponding output rows are written to the session log as an error report.
If the "Use First Value" option is chosen, the resulting data will be as below.
OrderID   CustomerID    EmployeeID   CustomerKey   CustomerID   CustomerName
10248     VINET         5            1             VINET        Paul Henriot
10249     TOMSP         6            2             TOMSP        Karin Josephs
10250     HANAR         4            4             HANAR        Mario Pontes
10295     VINET         2            1             VINET        Paul Henriot
10541     HANAR         2            4             HANAR        Mario Pontes
10542     KOENE         1            5             KOENE        Philip Cramer
10737     VINET         2            1             VINET        Paul Henriot
10738     NewCustomer   1            (null)        (null)       (null)

The PowerCenter Server determines which row is first and which is last by generating an ORDER BY clause for each column in the lookup source; it sorts each lookup-condition column in ascending order.

In the properties tab we can see another option Connection information, set the
required connection string.


Create another lookup transformation for Supplier and link to target. Continue with
the process.

Iconic view of the mapping


Testing Query:
SELECT CategoryName, CompanyName, P.ProductName
FROM Products P LEFT OUTER JOIN Categories C
ON P.CategoryID=C.CategoryID
LEFT OUTER JOIN Suppliers S
ON P.SupplierID=S.SupplierID
ORDER BY 1, 2
ex_2: Prepare a list of undelivered order (refer to ShippedDate) with
following details
TableName: Dim_Due_Deliverables
(
OrderID
FullName
Employees.FirstName+' '+Employees.LastName
RequiredDate
)
From Orders & Employees.
Note: Use an unconnected Lookup transformation with a lookup override; the lookup table is used to get the employee's FullName.
The target table's sample data:

OrderID   FullName          RequiredDate
10248     Steven Buchanan   1996-08-01 00:00:00.000
10249     Michael Suyama    1996-08-16 00:00:00.000

LookUp Override:


STEP 1: Create a mapping with the Orders table as source and Dim_Due_Deliverables as target. Add the required filter condition in the source qualifier.
STEP 2: Create a Lookup transformation with the required columns. From the lookup table we need only EmployeeID (for joining with the Orders table) and the employee's FullName, plus another column to accept the input column from the Orders table for the join condition.

STEP 3: Create the join condition. EmployeeID=EmployeeID1.


STEP 4: Open the Lookup SQL Override SQL editor and click the Generate Sql option. You can see the query generated as per the ports created:
SELECT Employees.FullName as FullName, Employees.EmployeeID as EmployeeID
FROM Employees
STEP 5: Modify the query as per our requirement and validate. Set the connection string as Northwind.
SELECT
Employees.FirstName+' '+Employees.LastName as FullName,
Employees.EmployeeID as EmployeeID
FROM Employees
The Lookup Override: in general we can take a flat file, table, view or synonym as the lookup source. With a lookup override, we can instead take the result of the query in the Lookup Override as the lookup source. Using the above steps we can create a lookup override.
Unconnected:
An unconnected Lookup transformation is not connected in the mapping pipelines.
Input to the unconnected Lookup transformation is given through the built-in function :LKP() in Expression, Aggregator and other transformations.
An unconnected Lookup transformation returns only one column.

STEP 6: Enable the Return port option on the Lookup transformation's required field.
STEP 7: Create an Expression transformation and get the required fields from the source. Create another port to receive the return value from the Lookup transformation. Open the Expression Editor and use the :LKP() function as shown below.


Using the :LKP() expression to pass input to the unconnected Lookup transformation and get the output.
STEP 8: Connect the expression to the target and continue with session creation and validation of the results.

Iconic view of the mapping.


Testing Query:
SELECT O.OrderID, E.FirstName+' '+E.LastName AS FullName, O.RequiredDate
FROM Orders O LEFT OUTER JOIN Employees E
ON
O.EmployeeID=E.EmployeeID
WHERE O.ShippedDate IS NULL
ex_3: Extract employees and their reporting authority.
Dim_hierarchy
(
EmployeeID
FullName  Employees.FirstName+' '+Employees.LastName
Title
Manager   Manager.FirstName+' '+Manager.LastName
)
From Employees.
Note: Use the self-join concept. If Manager is NULL, handle it with 'No Manager'.

Sample Data:

EmployeeID   FullName        Title                   Manager
1            Davolio Nancy   Sales Representative    Fuller Andrew
2            Fuller Andrew   Vice President, Sales   No Manager

Take the source as Employees and create a lookup transformation with Employees
table and continue with the process
Testing Query:
SELECT E.EmployeeID, E.LastName+' '+E.FirstName AS FullName, E.Title,
ISNULL (M.LastName+' '+M.FirstName, 'No Manager') AS Manager
FROM Employees E LEFT OUTER JOIN Employees M
ON
E.ReportsTo=M.EmployeeID
SORTER Transformation:
ex_1: Populate the ContactName, ContactTitle, CompanyName & Phone in a target file named customer_contact, ordered by ContactName. The target file should be a text file in fixed-width format.
STEP 1: Create a mapping with Customers as source and the flat file customer_contact as target.
STEP 2: Create a Sorter transformation and get the required columns from the source qualifier to the Sorter transformation.
STEP 3: Open the properties tab and check the columns to be sorted in the Key
column and specify the type of ordering (Ascending or descending) in Direction
column.
Note: If you want distinct data out of the Sorter transformation, enable the Distinct option in the Properties tab.

STEP 4: Link the output of the sorter transformation to the target. Continue with the
session & validation or results.
Testing Query:
SELECT ContactName, ContactTitle, CompanyName, Phone
FROM Customers
ORDER BY ContactName ASC


ex_2: Extract Orders taken by employees and sort the result by EmployeeID
From Orders & Employees.
Target_file: Orders_by_Employees
(
EmployeeID
FullName
OrderID
OrderDate
)
Testing Query:
SELECT DISTINCT E.EmployeeID, E.LastName+' '+E.FirstName AS FullName,
O.OrderID
FROM Employees E JOIN Orders O
ON
O.EmployeeID=E.EmployeeID
ORDER BY E.EmployeeID
ex_3: Find the age of employees on 5/3/1999(mm/dd/yyyy) and Sort the
result by age in ascending order.
Dim_emp_age
(
FullName
BirthDate
Age
)
Testing Query:
SELECT E.LastName+' '+E.FirstName AS FullName,
DATEDIFF (day, E.BirthDate, CAST ('19990503' AS DATETIME))/365 AS Age,
E.BirthDate
FROM Employees E
ORDER BY BirthDate desc

SEQUENCE GENERATOR Transformation:


It is similar to SEQUENCE in Oracle and IDENTITY in MS SQL Server.


ex_1: Create dim_customers table with columns from Customers table and
Customer_Key column.
Note: Use Sequence Generator Transformation to generate Customer_Key.
Enable Reset option in Sequence Generator Transformation.
Target table: dim_customers.
(
Customer_Key
+
All columns from Customers table
)
STEP 1: Create a mapping with required source and target structures.
STEP 2: Create a Sequence Generator transformation. You can see the two columns in the transformation: NEXTVAL (the generated sequence number) and CURRVAL (NEXTVAL plus the Increment By value). Link the NEXTVAL column to the target column Customer_Key.

Properties Tab
Start Value: the first value of the generated sequence.
Increment By: the difference between two consecutive generated numbers.
End Value: the maximum value of the generated sequence. If it is reached while the
session is running, the session fails (unless Cycle is enabled).
Cycle: when sequence generation reaches the End Value and the requirement is to
restart the sequence from the Start Value, enable the Cycle option.
Reset: the generated sequence restarts from the Start Value on each session run.
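A simplified model of these settings, sketched in Python (this is not Informatica code; the function name and defaults are invented for illustration):

```python
# Sketch of Sequence Generator behavior: Start Value, Increment By,
# End Value, and the Cycle option.
def sequence(start=1, increment=1, end=10, cycle=False, count=5):
    values, current = [], start
    for _ in range(count):
        if current > end:
            if not cycle:
                raise RuntimeError("End Value reached: session fails")
            current = start          # Cycle: restart from Start Value
        values.append(current)
        current += increment
    return values

print(sequence(start=1, increment=1, end=3, cycle=True, count=5))  # [1, 2, 3, 1, 2]
```

With Cycle disabled, hitting the End Value mid-run raises an error, mirroring the session failure described above.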

Iconic view of the mapping.


STEP 3: Create the session and continue with the validation process.
RANK Transformation:
ex_1: Extract Top 10 Customers by sales to Dim_Top_10_Cust.
Dim_Top_10_Cust
(
CompanyName
TotalSales
Rank
)
Sample Data:

CustomerName  TotalSales          Rank
QUICK-Stop    110277.30497741699  1
Ernst Handel  104874.97871398926  2

STEP 1: Create a mapping with the required sources & target. Create two Joiners to join
the three source tables as in the testing query.
STEP 2: Add an Aggregator to aggregate the result of the second Joiner to customer
level. Use the calculation logic within the Aggregator.
STEP 3: Add a Rank transformation; a column named RANKINDEX already exists in
the transformation. It generates the sequence number for the Rank column. Get the
required columns from the Aggregator into the Rank transformation.

Enable the R (rank) port for the field on which the ranking has to be calculated.

There is another option named Group By. You can rank within groups using this
option.
Example:

Source

Employee  Customer  Sales
100       100A      1000
101       101A      1050
102       102A      1200
100       100B      1600
100       100C      1810
101       101B      1100
102       102B      1675
102       102C      1675

Rank By Sales

RANK  Employee  Customer  Sales
8     100       100A      1000
7     101       101A      1050
6     101       101B      1100
5     102       102A      1200
4     100       100B      1600
3     102       102B      1675
2     102       102C      1675
1     100       100C      1810

Rank By Sales, Group By Employee

RANK  Employee  Customer  Sales
3     100       100A      1000
2     100       100B      1600
1     100       100C      1810
2     101       101A      1050
1     101       101B      1100
3     102       102A      1200
1     102       102B      1675
1     102       102C      1675

If we rank the source using Sales, we arrive at the Rank By Sales data set. If we
rank the source using Sales while grouping by Employee, we get the second data set:
the ranking is computed within each Employee group.
Properties:
1) Number of Ranks: fill with a number n; these n rows will be retrieved.
2) Top/Bottom: fetch the top n or the bottom n rows (n from Number of
Ranks).
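The grouped ranking above can be sketched in Python. Note that this sketch breaks ties by row position, whereas RANKINDEX may assign equal values the same rank; the data is taken from the example above:

```python
# Sketch of a Rank transformation: top-n rows by Sales, optionally
# ranked within each Employee group.
rows = [
    (100, "100A", 1000), (101, "101A", 1050), (102, "102A", 1200),
    (100, "100B", 1600), (100, "100C", 1810), (101, "101B", 1100),
    (102, "102B", 1675), (102, "102C", 1675),
]

def rank_top(rows, n, group_by=None):
    """Return (rank_index, row) pairs: top n rows by Sales per group."""
    groups = {}
    for row in rows:
        key = row[group_by] if group_by is not None else None
        groups.setdefault(key, []).append(row)
    result = []
    for _, members in sorted(groups.items()):
        members.sort(key=lambda r: r[2], reverse=True)   # rank port: Sales
        result += [(i + 1, row) for i, row in enumerate(members[:n])]
    return result

print(rank_top(rows, n=1, group_by=0))
# [(1, (100, '100C', 1810)), (1, (101, '101B', 1100)), (1, (102, '102B', 1675))]
```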

Iconic View of the Mapping


Testing Query:
SELECT TOP 10 C.CompanyName AS CustomerName,
SUM(D.UnitPrice*D.Quantity*(1-D.Discount)) AS TotalSales
FROM Orders O JOIN [Order_Details] D
ON O.OrderID=D.OrderID
JOIN Customers C
ON C.CustomerID=O.CustomerID
GROUP BY C.CompanyName
ORDER BY TotalSales DESC

UPDATE STRATEGY Transformation:


Duplicate the Employees table in the Northwind database as UPD_SRC_Employees using
the Data Transformation Services (DTS) of MS SQL Server. Use the UPD_SRC_Employees
table as the source table to work with the UPDATE STRATEGY transformation.
ex_1: Create a table dim_employees with the same structure as the Employees
table. Insert rows from UPD_SRC_Employees to dim_employees using an UPDATE
STRATEGY transformation.
Build the logic in the mapping to insert a row only if it doesn't already exist in
the target table.
Note: This logic can also be built using the Slowly Growing Targets wizard. It
does not support updates.
Use the insert script below for testing.
INSERT INTO UPD_SRC_Employees
VALUES(10,'McKenna','Patricia','Sales Associate','Mr.',
'1962-04-24 00:00:00.000','1995-11-15 00:00:00.000',
'8 Johnstown Road',
'Cork','Co. Cork',null,'Ireland','2967 542','428',
null,'Patricia McKenna served in the Hungry Owl All-Night Grocers and, he joined the
company for the sake of Sales Association commitee. After completing his project he
will be transferred Hungry Owl All-Night Grocers',2,null);
Using DTS Import/Export objects
STEP 1: Connect to the Northwind (source) database. Right-click on the database and
select All Tasks -> Export Data...

STEP 2: The DTS Import/Export Wizard appears. Click Next >. The wizard asks for the
source details. Fill it with the required data (Server, Database Name, UN/PW)
and click Next. It then prompts for the target details (fill with the required data) and
click Next.

STEP 3: It prompts for the following options. Choose the Copy table(s) option and
continue.

STEP 4: Select the required source tables as shown below and give the required
target names in the Destination column. Give the target name as UPD_SRC_Employees.
Click Next and complete the process. Using this Export/Import utility we can export or
import tables not only within SQL Server; it can be used with any database listed in
the data source drop-down on the wizard's first screen.
We can create packages similar to Informatica mappings; DTS is a free ETL tool bundled
with SQL Server.

Logic In Building Slowly Growing Targets:


The logic for a Slowly Growing Target is built by comparing the source and target
tables:
Get a record from the source.
Check whether the same record is available in the target using the unique columns
(primary key). A Lookup transformation can be used for this step.
If the source record does not exist in the target, insert the record into the target
table; otherwise ignore it.
Use an Expression transformation to generate a flag for insertion. Filter out the
records flagged for insertion and pass them to an Update Strategy transformation.
Use the DD_INSERT command in the Update Strategy Expression option of the
Properties tab. Link the columns of the Update Strategy transformation to the target.
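The insert-only comparison logic above can be sketched in Python (the sample rows are hypothetical; the Lookup is modeled as a dictionary keyed on EmployeeID):

```python
# Sketch of the Slowly Growing Target logic: lookup on the primary key,
# insert only rows whose key is not already in the target (no updates).
source = [(10, "McKenna"), (1, "Davolio"), (11, "Crowther")]
target = {1: "Davolio"}   # existing target rows keyed by EmployeeID

def slowly_growing_insert(source, target):
    for emp_id, name in source:
        if emp_id not in target:   # Lookup returned NULL -> flag for insert
            target[emp_id] = name  # DD_INSERT
    return target

print(sorted(slowly_growing_insert(source, target).items()))
# [(1, 'Davolio'), (10, 'McKenna'), (11, 'Crowther')]
```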
ex_2: Run the Update scripts given below in Northwind database.
UPDATE UPD_SRC_Employees
SET Region ='*UNKNOWN'
WHERE Region is null;
Use the UPDATE STRATEGY transformation's DD_UPDATE option in the Update Strategy
Expression option of the Properties tab to UPDATE the dim_employees table
with respect to the changes made in UPD_SRC_Employees.
Note: Create the mapping to update only if changes have been made to the column Region.
It should not support new inserts.
Compare the Region column of the source and target tables. If there is any change,
update the column.

ex_3: SCD TYPE 1: Slowly Changing Dimension

Create a mapping for a slowly changing dimension and test it with the scripts
below. Use the source & target tables from the previous exercise. This mapping
should support both new inserts and updates.
Run the script below in Northwind database.
INSERT INTO UPD_SRC_Employees
VALUES(11,'Crowther','Simon','Sales Associate','Mr.',
'1962-04-24 00:00:00.000','1995-11-15 00:00:00.000',
'345 Queensbridge',
'London',null,'SW7 1RZ','UK','(171) 555-7733','428',null,'Simon Crowther served in the
North/South and, he joined the company for the sake of Sales Association commitee.
After completing his project he will be transferred North/South to continue',2,null);
UPDATE UPD_SRC_Employees
SET ReportsTo =-1
WHERE ReportsTo is null ;
Test for the changes.
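An SCD Type 1 mapping's insert-or-overwrite behavior can be sketched in Python (dictionaries stand in for the Lookup and the target table; only the Region attribute is shown, and the sample values mirror the scripts above):

```python
# Sketch of SCD Type 1: insert new keys, overwrite changed attributes
# for existing keys. No history is kept.
def scd_type1(source, target):
    for key, attrs in source.items():
        if key not in target:
            target[key] = attrs          # DD_INSERT
        elif target[key] != attrs:
            target[key] = attrs          # DD_UPDATE: overwrite in place
    return target

target = {1: {"Region": None}}
source = {1: {"Region": "*UNKNOWN"}, 11: {"Region": "SW7 1RZ"}}
print(scd_type1(source, target))
# {1: {'Region': '*UNKNOWN'}, 11: {'Region': 'SW7 1RZ'}}
```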

STORED PROCEDURE Transformation:

ex_1: CREATE the procedure given below in the source database. Create a
STORED PROCEDURE transformation & use it to retrieve the FullName of an
employee.
Complete Ex_2 of the Lookup transformation using a Stored Procedure transformation
to retrieve the employee's full name.
Create the procedure given below in the source database using Query Analyzer.
CREATE PROCEDURE SP_FullName @@EmployeeID int, @@FullName varchar(30)
OUTPUT
AS
SELECT @@FullName = FirstName+LastName
FROM Employees
WHERE EmployeeID=@@EmployeeID
GO
Executing the procedure for testing (these EXECUTE commands are only for
single-record parameters; for multiple records use cursors):
DECLARE @@FullName varchar(30)
EXECUTE SP_FullName 1, @@FullName OUTPUT
BEGIN PRINT @@FullName END
STEP 1: Copy the mapping of Ex_2 of the Lookup transformation. Change the logic to
get the FullName using a Stored Procedure transformation instead of a Lookup.

STEP 2: Create a Stored Procedure transformation using the icon in the toolbar. It
prompts for selecting procedures in the database, as shown above. Select the
appropriate database, UN/PW and the procedure/function you need to import and click
OK.
STEP 3: Note that a new Stored Procedure transformation has been created in the
mapping, with input and output ports. Give the required input links and take the
required output port.

Imported SP TRT

Iconic view of the mapping.


STEP 4: Create and configure the session, then run it and validate the results.
ex_2: Multiple input/output parameters
Pass EmployeeID and Country to the procedure below to generate FullName,
Annual Income and Taxation. Load the result into a file named Taxation.txt
containing FullName, AnnualIncome and TaxPayable. The target file should
be of fixed width.
-- DROP PROCEDURE SP_Emp_Taxation
CREATE PROCEDURE SP_Emp_Taxation
@@EmployeeID int,
@@Country varchar (15),
@@FullName varchar(30) OUTPUT,
@@AnnualIncome decimal OUTPUT,

@@Taxation decimal OUTPUT


AS
SELECT
@@FullName = FirstName+LastName,
@@AnnualIncome = 15000*12,
@@Taxation = CASE WHEN @@Country='UK' THEN 15000*12*.4 ELSE 15000*12*.2
END
FROM Employees
WHERE EmployeeID=@@EmployeeID
GO
Executing the Procedure: For testing
DECLARE
@@FullName varchar(30),
@@AnnualIncome decimal,
@@Taxation decimal
EXECUTE SP_Emp_Taxation 1, 'UK', @@FullName OUTPUT, @@AnnualIncome OUTPUT,
@@Taxation OUTPUT
BEGIN
PRINT @@FullName + ' ,' +CAST(@@AnnualIncome AS CHAR)+' ,'+
CAST(@@Taxation AS CHAR)
END
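The CASE logic inside SP_Emp_Taxation can be sanity-checked by mirroring it in Python (the 15000 monthly salary is hardcoded in the procedure above):

```python
# Mirror of the SP_Emp_Taxation calculation: annual income is a flat
# 15000 * 12, taxed at 40% for UK and 20% for every other country.
def taxation(country, monthly_salary=15000):
    annual_income = monthly_salary * 12
    rate = 0.4 if country == "UK" else 0.2
    return annual_income, annual_income * rate

print(taxation("UK"))       # (180000, 72000.0)
print(taxation("Ireland"))  # (180000, 36000.0)
```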
ex_3: Practice
Use the procedure below to extract the employee FullName and Total Bonus into
a flat file.
-- DROP PROCEDURE SP_Emp_Bonus
CREATE PROCEDURE SP_Emp_Bonus
@@EmployeeID int,
@@Country varchar (15),
@@FullName varchar(30) OUTPUT,
@@TotalBonus decimal(23,4) OUTPUT
AS
SELECT @@FullName = E.FirstName+' '+E.LastName ,
@@TotalBonus = SUM (CASE WHEN @@Country='UK' THEN (D.UnitPrice *
D.Quantity) * (1 - D.Discount)*0.05
ELSE (D.UnitPrice * D.Quantity) * (1 - D.Discount)*0.08 END)
FROM
Employees E JOIN Orders O
ON
E.EmployeeID=O.EmployeeID JOIN [Order_Details] D
ON

O.OrderID=D.OrderID
WHERE O.ShippedDate IS NOT NULL AND
E.EmployeeID=@@EmployeeID
GROUP BY E.FirstName+' '+E.LastName
GO
Executing the Procedure: For testing
DECLARE
@@FullName varchar(30),
@@TotalBonus decimal(23,4)
EXECUTE SP_Emp_Bonus 1, 'UK', @@FullName OUTPUT, @@TotalBonus OUTPUT
BEGIN
PRINT @@FullName + ' ,' +CAST(@@TotalBonus AS CHAR)
END
ex_4: Use the function below to extract Customer & Sales-by-Customer
information. Load the data to a flat file in the same way as the previous exercises.
-- DROP FUNCTION SF_Cust_Sales
CREATE FUNCTION SF_Cust_Sales
(@CustomerID varchar(5) )
RETURNS decimal(12,3) -- Customer Total Sales.
AS
BEGIN
RETURN ( SELECT SUM (D.UnitPrice*D.Quantity*(1-D.Discount))
FROM Orders O JOIN [Order_Details] D
ON
O.OrderID=D.OrderID
JOIN
Customers C
ON
C.CustomerID=O.CustomerID
WHERE C.CustomerID=@CustomerID )
END
Testing Query:
SELECT CustomerID, dbo.SF_Cust_Sales(CustomerID) AS Sales
FROM Customers;
UNION Transformation
ex_1: Create a contact list of Suppliers and Customers with the fields
listed below.

City, CompanyName, ContactName, Relationship (if the contact is for a
supplier, hardcode Relationship as Suppliers; if for a customer, hardcode it as
Customers)
Sample Data

City         CompanyName                 ContactName    Relationship
Aachen       Drachenblut Delikatessen    Sven Ottlieb   Customers
Albuquerque  Rattlesnake Canyon Grocery  Paula Wilson   Customers
Anchorage    Old World Delicatessen      Rene Phillips  Customers
Ann Arbor    Grandma Kelly's Homestead   Regina Murphy  Suppliers
Annecy       Gai pâturage                Eliane Noz     Suppliers

STEP 1: Create a new mapping. Add the Suppliers & Customers source definitions.
STEP 2: Create a new target definition with the required structure and use the
Generate & Execute options to create the table in the target database.
STEP 3: Create two Expression transformations for hardcoding the Relationship field:
if the contact is for a supplier hardcode Suppliers, and if for a customer hardcode
Customers.
STEP 4: Add a Union transformation to perform a UNION operation on the two data sets
from Suppliers and Customers. Link the output of the Union transformation to the
target structure. Creating a Union transformation is explained below.
STEP 5: Create a session, validate it, add it to a new workflow, run the workflow and
monitor the status. Use the testing query to validate the results.
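The Expression + Union steps can be sketched in Python (one sample row per branch; note that the Union transformation behaves like UNION ALL and does not remove duplicates):

```python
# Sketch of the contact-list mapping: two Expression branches append a
# hardcoded Relationship field, then the Union merges both row sets.
customers = [("Aachen", "Drachenblut Delikatessen", "Sven Ottlieb")]
suppliers = [("Ann Arbor", "Grandma Kelly's Homestead", "Regina Murphy")]

def contact_list(customers, suppliers):
    # Expression transformations: append the hardcoded Relationship value
    tagged = [row + ("Customers",) for row in customers]
    tagged += [row + ("Suppliers",) for row in suppliers]
    return tagged   # Union transformation: UNION ALL of both input groups

for row in contact_list(customers, suppliers):
    print(row)
```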
Creating UNION Transformation:
STEP 1: Click the Union transformation icon in the toolbar and then click in the
mapping designer workspace.

STEP 2: You can see the new transformation in the designer palette as below.

STEP 3: Select the required fields from the source transformation (in this exercise,
select the fields from the Expression transformation), drag them onto the Union
transformation and release the mouse button.

You can see two different sets of fields appearing: one for the OUTPUT ports and
the other for the input group named NEWGROUP.

STEP 4: Double click on Union transformation and open the Groups tab. You can see
the properties as below.

You can rename the existing groups and create any number of groups. For this
exercise, rename the existing group for convenience.

As we create a new group, a corresponding new set of input fields is created in the
Ports tab. After creating the two groups you can see the transformation as in the figure below.

Link the different data source sets to UNION transformation and link the output to the
target. The Iconic view of the mapping is shown below.

Testing Query:
SELECT City, CompanyName, ContactName, 'Customers' AS Relationship
FROM Customers
UNION
SELECT City, CompanyName, ContactName, 'Suppliers'
FROM Suppliers

Ex_2: Create a new target table Dim_Locations with the fields City, Country,
PostalCode, Region from Customers, Suppliers and Employees, just like
conformed dimensions.
STEP 1: Create a mapping with the source and target structures.
STEP 2: Add a Union transformation, create three input groups and connect the
sources to the input groups.
STEP 3: Link the output of the Union transformation to a Sorter to extract the
distinct values and load them to the target.
STEP 4: Create the session and workflow. Run the workflow and monitor the status.
STEP 5: Check the result against the query below.
Testing Query:
SELECT City, Country, PostalCode, Region
FROM Customers
UNION
SELECT City, Country, PostalCode, Region
FROM Suppliers
UNION
SELECT City, Country, PostalCode, Region
FROM Employees;

Normalizer Transformation

Ex_1: Use the files embedded below as the source and load the data to the
target table with the structure specified in DDL statement below.

D16220w1.cbl

cust_acct_own.out

DROP TABLE Dim_D16220W1;


CREATE TABLE Dim_D16220W1
(
WS_CUST_ACCT_NBR varchar(9),
WS_OWN_ID_NBR varchar(9),
WS_OWN_ID_NAM varchar(30),
WS_NAT_ID varchar(9),
WS_NAT_ID_NAM varchar(30),
WS_CSHOTLY_CD varchar(2),
WS_CSHOTLY_NAM varchar(30)
);
Before Starting:

Double-click the embedded files and save them to the source file location. The
.cbl file represents the structure of the file and the .out file is the data file.
Execute the DDL statement in the target database using SQL Query Analyzer.

STEP 1: Import the Cobol structure to Source Analyzer.

Use the Import from COBOL File option of the Sources menu to import the COBOL
structure. Point to the .cbl file which contains the structure of the file and click OK.

You will get the COBOL file (VSAM) structure as below.

STEP 2: Create a new mapping and add the COBOL source to the mapping. You can see
a Normalizer transformation appearing next to the source structure instead of a
Source Qualifier transformation.

STEP 3: Add the target structure to the mapping and link the output ports of the
Normalizer transformation to target.

STEP 4: While configuring the session, set the source filename to the .out filename,
not the .cbl filename (the .out file is the data file). Set the target filename as
cust_acct_own.txt.

Open Set File Properties to set the source file properties to Fixed Width.
Click Advanced and set Number of initial rows to skip to 0.
Set Code Page to IBM EBCDIC US English as below.

Run the session and monitor the status and check the results.

Ex_2: Normalize the Employee_Skills table to Dim_Employee_Skills using a
Normalizer transformation.
Run the DDL script to create the table named Employee_Skills in your
source database:
DROP TABLE Employee_Skills;
CREATE TABLE Employee_Skills
(
EmployeeID int,
Skill_1 nvarchar(25),
Skill_2 nvarchar(25),
Skill_3 nvarchar(25)
);
Run the insert script below in the source database after creating the table:
INSERT INTO Employee_Skills VALUES (1, 'Informatica', 'Microstrategy', 'Business Objects');
INSERT INTO Employee_Skills VALUES (2, 'Data Integrator', 'Oracle Warehouse Builder', 'Business Objects');
INSERT INTO Employee_Skills VALUES (3, 'Informatica', 'Microstrategy', 'Data Stage');
INSERT INTO Employee_Skills VALUES (4, 'Data Integrator', NULL, NULL);
INSERT INTO Employee_Skills VALUES (5, 'Informatica', 'Microstrategy', NULL);
INSERT INTO Employee_Skills VALUES (6, 'Informatica', 'Oracle Warehouse Builder', 'Business Objects');
INSERT INTO Employee_Skills VALUES (7, 'Data Integrator', 'Microstrategy', 'Data Stage');
INSERT INTO Employee_Skills VALUES (8, 'Informatica', 'Microstrategy', NULL);
INSERT INTO Employee_Skills VALUES (9, 'Informatica', 'Microstrategy', 'Data Stage');

TARGET STRUCTURE:
CREATE TABLE Dim_Employee_Skills
(
EmployeeKey int,
EmployeeID int,
SkillNumber int,
Skill_Set nvarchar(25)
);
Run the DDL in the target database. Import the source and target structures from the
source and target databases into the Informatica Designer.
STEP 1: Create a new mapping, add the source and target structures.
STEP 2: Create the Normalizer transformation. Open the Normalizer tab and create
ports similar to the target structure. Set the Occurs field of the pivoting column to
the number of source columns to be pivoted.
Note: In this exercise, the target field Skill_Set has three occurrences (Skill_1,
Skill_2, Skill_3) in the source table. Hence set the Occurs field of Skill_Set to 3.

As you set Occurs to 3, view the Ports tab. You can see that three input ports have
been created for the single output port Skill_Set.
Two more new fields, GK_Skill_Set and GCID_Skill_Set, are also created:

The GK_ field corresponds to sequence number generation (similar to a surrogate
key).
The GCID_ field generates the occurrence number (the nth occurrence of the
pivoted column) for each output row.
Port structures
STEP 3: Link the output of the Normalizer transformation to the target. Link the
GK_ field to the target primary key and the GCID_ field to the occurrence field
(SkillNumber). Map the other ports directly. Validate the mapping. Create the
session, run it and test the results.
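What the Normalizer does with Occurs = 3 can be sketched in Python (GK_ is modeled as a running counter and GCID_ as the occurrence index; in this sketch NULL skills pass through, and a Filter transformation would typically drop them downstream):

```python
# Sketch of the Normalizer pivot: each wide source row
# (EmployeeID, Skill_1..Skill_3) becomes up to three narrow rows of
# (GK_Skill_Set, EmployeeID, GCID_Skill_Set, Skill_Set).
def normalize(rows):
    out, gk = [], 0
    for emp_id, *skills in rows:
        for gcid, skill in enumerate(skills, start=1):
            gk += 1                       # GK_: running surrogate key
            out.append((gk, emp_id, gcid, skill))   # GCID_: occurrence no.
    return out

rows = [(4, "Data Integrator", None, None)]
print(normalize(rows))
# [(1, 4, 1, 'Data Integrator'), (2, 4, 2, None), (3, 4, 3, None)]
```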

Mapping view
