Data Services XI 3.0/3.1
Basic - Case Study
2010
Revision History

Revision Date | Revision | Description | Page No. | Prev Page No. | Action Taken (Addenda/New Page) | New Release Notice Reference
              | 1.0      |             | All      |               |                                 | V1.0
Table of Contents
1. Introduction
1.1 Purpose
1.2 Course audience
1.3 Prerequisites
1.4 Scope
1.5 Structure of the document
1.6 Solution Roadmap
2. Creating Source and Target Metadata
1. Introduction
SAP BusinessObjects Data Services is the one platform for data delivery and data quality to move,
integrate, and improve any type of data anywhere at any frequency.
1.1 Purpose
The purpose of this document is to understand the Platform Transforms and Basic functions
available in Data Services XI 3.0/3.1. These Transforms are vital to the development of data flows,
which carry the information about how to extract, transform, and load data. This document covers the
following Transforms through a running hands-on case study covering the use of all of them. The
case study is divided into parts, each based on an underlying concept and covering one or more
transformations at a time. The target definitions developed in one mapping may be used as source
definitions in another mapping to continue the concept and complete the whole case study.
The Platforms Transforms and Basic functions covered through this document are:
Transforms
Query
Map Operation
Validation
Merge
Case
SQL
Functions
Lookup_ext()
Decode()
Search_replace()
1.3 Prerequisites
To attend this course, learners should have experience with the following:
Advanced dimension table topics such as surrogate keys and SCDs (Slowly Changing
Dimensions)
1.4 Scope
Lab 8 is to understand and apply troubleshooting guidelines, like debugging and auditing
the Jobs
Lab 9 is to understand the basic activities that can be performed through the Management
Console
Define data flow objects like sources, targets, functions, and transformations
2.2.1 Scenario
You have been hired as a Data Services designer for Alpha Acquisition. Alpha has recently acquired
Beta Business, an organization that develops and sells software products and related services.
In an effort to consolidate and organize the data, and simplify the reporting process for the growing
company, the Omega data warehouse is being constructed to merge the data for both organizations,
and a separate data mart is being developed for reporting on Human Resources data. You have also
been given access to a database for staging purposes, called Delta. To start the development process,
you must create Datastores and import the metadata for all of these data sources
2.2.2 Objective
Create Datastores and import metadata for the Alpha Acquisition, Beta Businesses, Delta, HR Data
Mart, and Omega databases
2.2.3 Instructions
1. In your Local Object Library, create a new source datastore for the Alpha database
2. Following are the steps to create the Alpha datastore:
1. In the Datastores tab of the Local Object Library, right-click and select New
2. Enter the name of the new datastore in the Datastore name field as Alpha
The name can contain any alphabetical or numeric characters or underscores (_). It
cannot contain spaces
3. Select the Datastore type as Database
4. Select the Database type as MySQL
5. Enter the appropriate information for the selected database type: the Data source name is
Alpha, the User name is Alpha, and the Password is Alpha
3. Create more datastores for the Beta, Delta, HR Data Mart, and Omega databases, following
the same steps used to create the Alpha datastore
4. Following are the details to create datastores for the Beta, Delta, HR Data Mart, and Omega
databases
Datastore name | Datastore type | Database type | Database version | Data source name (DSN name) | User name | Password
Alpha          | Database       | MySQL         | MySQL 5.0        | Alpha                       | Alpha     | alpha
Delta          | Database       | MySQL         | MySQL 5.0        | Delta                       | delta     | delta
Omega          | Database       | MySQL         | MySQL 5.0        | Omega                       | omega     | omega
HR_Datamart    | Database       | MySQL         | MySQL 5.0        | hruser                      | hruser    | hruser
5. Follow the same steps to import the tables for Beta, Omega and HR_Datamart
Tables

Alpha (Source 1):
Alpha.category
Alpha.city
Alpha.country
Alpha.customer
Alpha.department
Alpha.employee
Alpha.hr_comp_update
Alpha.last_run
Alpha.orders
Alpha.order_details
Alpha.product
Alpha.region

Beta (Source 2):
Beta.addrcodes
Beta.categories
Beta.city
Beta.country
Beta.customers
Beta.employees
Beta.orderdetails
Beta.orders
Beta.products
Beta.region
Beta.shippers
Beta.suppliers
Beta.usa_customers

Omega (Target):
Omega.emp_dim
Omega.product_dim
Omega.product_target
Omega.time_dim

Delta (Staging):

HR_Datamart (Target):
Hr_datamart.emp_dept
Hr_datamart.employee
Hr_datamart.hr_comp_update
Hr_datamart.recovery_status
2. Click the plus sign (+) next to the Alpha datastore to view the object types in the Alpha
datastore
For example, database datastores have functions, tables, and template tables. These are
datastore objects
3. Click the plus sign (+) next to an object type, like Tables, to view the objects of that type
imported from the datastore
4. Select the table name in the list of imported tables to see its metadata
5. Right-click and select Open
6. Click the Indexes tab. The left portion of the window displays the Index list.
7. Select the table name in the list of imported tables to view the associated data
8. Right-click and select View Data
3.1.1 Scenario
After analyzing the source data, we have determined that the structure of the customer data for Beta
Business is the appropriate structure for the customer data in the Omega data warehouse, and we must
therefore change the structure of Alpha Acquisition's customer data to use the same structure in
preparation for merging customer data from both Datastores
3.1.2 Objective
Use the Query transform to change the schema of Alpha Acquisition's Customer table and move
the data into the Delta staging database
3.1.3 Instructions
1. Create a new project called Basic_Training
There are two ways to create a project: one is through the start page of the Designer, and the
other is using the Local Object Library
a) Go to the Project tab in the Local Object Library and click New >> Project
b) Enter the project name as Basic_Training
c) Click Create
d) Notice that the new project appears in the project area
b) Right-click the new job and click Rename. Alternatively, left-click the job twice
(slowly) to make the name editable.
c) Type Alpha_customers_JOB
d) Left-click or press Enter
e) To add the job to the open project, drag it into the project area
f) Notice that the job appears in the project hierarchy under Basic_Training and in
the project tab of the object library
g) Double-click the data flow to open the data flow work space
The data flow appears in the workspace
6. Create a new template table called alpha_customers in the Delta datastore as the target
object
a) Open the data flow in the workspace
b) In the Tool palette, click the Template Table icon and click the workspace to
add a new template table to the data flow
c) The Create Template dialog box displays
d) In the Table name field, enter the name for the template table
e) In the In datastore drop-down list, select the datastore for the template table
f) Click OK
7. Add the Query transform to the workspace between the source and target
a) In the Tool palette, click the Transform icon and click the workspace to add a
Query to the data flow
9. Click the name of the query in the project area or in the workspace
10. In the transform editor for the Query transform, create the following output columns:
a) To create a column in the Schema Out pane, right-click the Query and select New
Output Column
d) Repeat the same steps for the columns mentioned in the table below
Name        | Data type   | Content type
CustomerID  | Int         |
Firm        | Varchar(50) | Firm
ContactName | Varchar(50) | Name
Title       | Varchar(30) | Title
Address1    | Varchar(50) | Address
City        | Varchar(50) | Locality
Region      | Varchar(25) | Region
PostalCode  | Varchar(25) | Postcode
Country     | Varchar(50) | Country
Phone       | Varchar(25) | Phone
Fax         | Varchar(25) | Phone
Schema In    | Schema Out
CUSTOMERID   | CustomerID
COMPANYNAME  | Firm
CONTACTNAME  | ContactName
CONTACTTITLE | Title
ADDRESS      | Address1
CITY         | City
REGIONID     | Region
POSTALCODE   | PostalCode
COUNTRYID    | Country
PHONE        | Phone
FAX          | Fax
13. Validate the data flow by clicking the Validate Current button on the toolbar.
14. The Warning tab indicates that Data Services will convert the data type for the columns
you have modified
17. Execute Alpha_Customers_JOB with the default execution properties and save all objects
that have been created
a) First verify the job server is running by looking at the job server icon at the bottom
right of the Designer window. Move the pointer over the icon to see the Job Server
name, machine name, and port number in the status area. If the job server is not
running, the icon will have a red X on it.
b) Select the job name in the project area, in this case Alpha_Customers_JOB.
c) Right-click and click Execute
d) If you have changed any objects and not yet saved them, Data Services prompts you
to save your work. Click OK
e) Data Services validates the job again before running it. If there are syntax errors,
review the error messages, return to the design, and validate the affected steps of
the job. Also review any warning messages before dismissing them
f) If the job validates properly, the Execution Properties dialog box appears
g) Click OK
18. Return to the data flow workspace and click the magnifier to view the data in the target table
to confirm that 25 records were loaded
4. Using Functions
4.1
4.1.1 Scenario
When evaluating the customer data for Alpha Acquisition, we discover a data entry error where the
contact title of Account Manager has been entered as Accounting Manager. We want to clean up this
data before it is moved to the data warehouse.
4.1.2 Objective
Use the search_replace function in an expression to change the contact title from Accounting
Manager to Account Manager.
4.1.3 Instructions
1. In the Alpha_Customers_DF workspace, open the transform editor for the Query Transform
2. On the Mapping tab, delete the existing expression for the Title column
3. Using the Function wizard, create a new expression for the Title column using the
search_replace function (under String functions) to replace the full string of Accounting
Manager with Account Manager
7. Click Finish
8. Note: Be aware that the search_replace function can react unpredictably if you use the
external table option
9. Notice that the new expression appears on the Mapping tab.
10. Execute Alpha_Customers_JOB with the default execution properties and save all objects
you have created
11. Return to the data flow workspace and view data for the target table
12. Notice that Account Manager and not Accounting Manager appears in the Title column
for CustomerID 12358, CustomerID 12359, and CustomerID 12360. The data has been
cleaned up as required.
4.2
4.2.1 Scenario
In the Alpha Acquisition database, the country for a customer is stored in a separate table and
referenced with a foreign key. To speed up access to information in the data warehouse, this lookup
should be eliminated.
4.2.2 Objective
Use the Lookup_ext function to swap the ID for the country in the Customers table for Alpha
Acquisition with the actual value from the Countries table.
4.2.3 Instructions
1. In the Alpha_Customers_DF workspace, open the transform editor for the Query Transform
2. On the Mapping tab, to delete the current expression for the Country column, click Country
in the Schema Out pane of the Query transform
13. Using the Function wizard, create a new lookup expression for the Country column using
the lookup_ext() function with the following parameters:

Field/Option                    | Value
Translate table                 | Alpha.alpha.country
Condition: Table column         | COUNTRYID
Condition: Op                   | =
Condition: Expression           | Customer.COUNTRYID
Output Parameters: Table column | COUNTRYNAME
lookup_ext([Alpha.alpha.country,'PRE_LOAD_CACHE','MAX'],
[COUNTRYNAME],[NULL],[COUNTRYID,'=',customer.COUNTRYID]) SET
("run_as_separate_process"='no')
36. Execute Alpha_Customers_JOB with the default execution properties and save all objects
you have created
37. Return to the data flow workspace and view data for the target table after the Lookup
expression is added
38. Notice that the COUNTRYID column with numerical values has been replaced with a
country column displaying the country names
4.3
4.3.1 Scenario
We now need to calculate the total value of all orders, including their discounts, for reporting purposes.
4.3.2 Objective
Use the Sum and Decode functions to calculate the total value of orders in the Order_Details table.
4.3.3 Instructions
1. In the Basic_Training project, create a new batch job called Alpha_Order_Sum_JOB with a
data flow called Alpha_Order_Sum_DF
2. In the Alpha_Order_Sum_DF workspace, add the Order_Details and Product tables from
the Alpha Datastore as the source objects.
3. Add a new template table to the Delta datastore called order_sum as the target object
4. Add a Query transform and connect all the objects
5. In the transform editor for the Query transform, on the WHERE tab, propose a join between
the two source tables
a) Double-click the Query transform
b) Click WHERE
c) Click Propose Join
6. To map the ORDERID column from the input schema to the output schema, drag the
ORDERID input column to the Schema Out pane
7. Create a new output column called TOTAL_VALUE with a data type of decimal(10,2)
8. On the Mapping tab of the new output column, use the Function wizard or the Smart Editor
to construct an expression to calculate the total value of the orders using the decode and sum
functions
The discount and order total can be multiplied to determine the total after discount. The
decode function allows you to avoid multiplying orders with zero discount by zero
Consider the following:
The expression must specify that if the value in the DISCOUNT column is not
zero (conditional expression), then the total value of the order is calculated by multiplying the
QUANTITY from the order_details table by the COST from the product table, and then
multiplying that value by the DISCOUNT (Case expression).
Otherwise, the total value of the order is calculated by simply multiplying the QUANTITY
from the order_details table by the COST from the product table (Default expression)
Once these values are calculated for each order, a sum must be calculated for the entire
collection of orders
The expression should be:
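Following the description above literally, one plausible sketch of the mapping expression is shown below. The column names order_details.QUANTITY, order_details.DISCOUNT, and product.COST are taken from this case study, but the exact expression the course expects is not reproduced in this document, so treat this as an illustration rather than the definitive answer:

```
sum(decode((order_details.DISCOUNT <> 0),
           (order_details.QUANTITY * product.COST) * order_details.DISCOUNT,
           (order_details.QUANTITY * product.COST)))
```

Here decode(condition, value, default) returns value for rows where the condition is true and default otherwise, and sum() then aggregates the per-row results for each ORDERID group.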
10. Execute the Alpha_Order_Sum_JOB with the default execution properties and save all
objects
11. Return to the data flow workspace and view data for the target table after the decode
expression is added to confirm that order 11146 has a total value of $204,000
5.1.1 Scenario
In addition to the main databases for source information, records for some of the orders for Alpha
Acquisition are stored in flat files
5.1.2 Objective
Create a file format for the orders flat files so you can use them as source objects
5.1.3 Instructions
1. In the Local Object Library, create a new delimited file format called Orders_Format for the
orders_12_21_06.txt flat file in the CS0_Source folder
To adjust the format so that it reflects the source file, consider the following:
The column delimiter is a semicolon (;)
2. In the Column Attributes pane, adjust the data type for the columns based on their content
Column      | Data type
ORDERID     | Int
EMPLOYEEID  | Varchar(15)
ORDERDATE   | Date
CUSTOMERID  | Int
COMPANYNAME | Varchar(50)
CITY        | Varchar(50)
COUNTRY     | Varchar(50)
3. Save your changes and view the data to confirm that order 11196 was placed on December
12, 2006
6.1.1 Scenario
End users of employee reports have requested that employee records in the data mart contain only
current employees.
6.1.2 Objective
Use the Map Operation transform to remove any employee records that have a value in the
discharge_date column
6.1.3 Instructions
1. In the Basic_Training project, create a new batch job called Alpha_Employees_Current_JOB
with a data flow called Alpha_Employees_Current_DF.
2. In the data flow workspace, add the Employee table from the Alpha datastore as the source
object
3. Add the Employee table from the HR_datamart datastore as the target object
4. Add the Query transform to the workspace and connect all objects.
5. In the transform editor for the Query transform, map all columns from the input schema to the
same column in the output schema
6. On the WHERE tab, create an expression to select only those rows where discharge_date is
not empty
The expression should be :
Employee.discharge_date is not null
7. In the data flow workspace, disconnect the Query transform from the target table
8. Add a Map Operation transform between the Query transform and target table and connect it
to both
9. In the transform editor for the Map Operation, change the settings so that rows with an input
operation code of NORMAL have an output operation code of DELETE.
a) Double-click the Map Operation transform
b) Click in the NORMAL output type field
c) Map the input type of NORMAL to an output type of delete
d) Click Delete
10. Execute Alpha_Employees_Current_JOB with the default execution properties and save all
objects
11. Return to the data flow workspace and view data for both the source and target tables.
12. Notice that two rows were filtered and there are only 46 rows in the target table
Validation Transform
The Validation transform enables you to create validation rules and move data into target objects
based on whether the data passes or fails validation
7.1.1 Scenario
Order data is stored in multiple formats with different structures and different information. We will use
the Validation transform to validate order data from the flat file sources and the Alpha Orders table
before merging them.
7.1.2 Objective
Join the data in the Orders flat files with that in the Order_Shippers flat files.
Create a new column named order_assigned_to on the target table for the employee information,
so that orders taken by employees who are no longer with the company are assigned to a default
current employee using the Validation transform.
Create a column to hold the employeeID of the employee who originally made the sale.
Replace null values in the shipper fax column with a value of No Fax and send those rows to a
separate table for follow-up
7.1.3 Instructions
1. Create a file format called Order_Shippers_Format for the flat file
Order_Shippers_04_20_07.txt (as you created before). Use the structure of the text file to
determine the appropriate settings.
2. In the Column Attributes pane, adjust the data types for the columns based on their content:

Column            | Data type
ORDERID           | Int
SHIPPERNAME       | Varchar(50)
SHIPPERADDRESS    | Varchar(50)
SHIPPERCITY       | Varchar(50)
SHIPPERCOUNTRY    | Int
SHIPPERPHONE      | Varchar(20)
SHIPPERFAX        | Varchar(20)
SHIPPERREGION     | Int
SHIPPERPOSTALCODE | Varchar(15)
4. Add the file formats Orders_Format and Order_Shippers_Format as source objects to the
Alpha_Orders_File_DF data flow workspace
5. Edit the source objects so that Orders_Format is using all three related orders flat files
and Order_Shippers_Format is using all three order shippers files
6. Add the Query transform to the workspace and connect it to the source objects.
7. In the transform editor for the Query transform, create a WHERE clause to join the data on
the OrderID values
The expression should be as follows:
Orders_Shippers_Format.ORDERID = Orders_Format.ORDERID
8. Add the following mapping in the Query transform
Schema Out        | Mapping
ORDERID           | Orders_Format.ORDERID
CUSTOMERID        | Orders_Format.CUSTOMERID
ORDERDATE         | Orders_Format.ORDERDATE
SHIPPERNAME       | Orders_Shippers_Format.SHIPPERNAME
SHIPPERADDRESS    | Orders_Shippers_Format.SHIPPERADDRESS
SHIPPERCITY       | Orders_Shippers_Format.SHIPPERCITY
SHIPPERCOUNTRY    | Orders_Shippers_Format.SHIPPERCOUNTRY
SHIPPERPHONE      | Orders_Shippers_Format.SHIPPERPHONE
SHIPPERFAX        | Orders_Shippers_Format.SHIPPERFAX
SHIPPERREGION     | Orders_Shippers_Format.SHIPPERREGION
SHIPPERPOSTALCODE | Orders_Shippers_Format.SHIPPERPOSTALCODE
9. Insert a new output column above ORDERDATE called ORDER_TAKEN_BY with data type
of varchar(15) and map it to Orders_Format.EMPLOYEEID
10. Insert a new output column above ORDERDATE called ORDER_ASSIGNED_TO column with
data type of varchar(15) and map it to Orders_Format.EMPLOYEEID
11. Add a Validation transform to the right of the Query transform and connect the transforms.
12. In the transform editor for the Validation transform, enable validation for the
ORDER_ASSIGNED_TO column to verify the value in the column exists in the
EMPLOYEEID column of the Employee table in the HR_datamart datastore
The steps are as follows:
a) Double-click the validation transform
b) Click ORDER_ASSIGNED_TO
c) Select the Enable Validation check box
d) Click Exists in table
e) Click the Exists in table drop-down arrow
f) Click HR_Datamart
g) Click OK
h) Click employee(HR_Datamart.hr_datamart)
i) Click OK
j) Click EMPLOYEEID
k) Click OK
The resulting expression is:
HR_datamart.hr_datamart.employee.EMPLOYEEID
14. Set the action on failure for the Order_Assigned_To column to send to both pass and fail
For pass, substitute 3Cla5 to assign it to the default employee
15. Enable validation for the SHIPPERFAX column to send NULL values to both pass and fail,
substituting No Fax for pass
a) Click SHIPPERFAX
b) Select the Enable Validation check box
c) Click the condition drop-down list
d) Click IS NULL
e) Click Send to both
16. Add two target tables in the Delta datastore, one called Orders_Files_Work and
one called Orders_Files_No_Fax
17. Connect the pass output from the validation to Orders_Files_Work and fail output to
Orders_Files_No_Fax
18. Now go back to the second data flow, Alpha_Orders_DB_DF; in its workspace, add the
Orders table from the Alpha datastore as the source object
19. Add a Query transform to the workspace and connect it to source
20. In the transform editor for the Query transform, define the following mappings:
Schema Out        | Mapping
ORDERID           | Orders.ORDERID
CUSTOMERID        | Orders.CUSTOMERID
ORDERDATE         | Orders.ORDERDATE
SHIPPERNAME       | Orders.SHIPPERNAME
SHIPPERADDRESS    | Orders.SHIPPERADDRESS
SHIPPERCITY       | Orders.SHIPPERCITYID
SHIPPERCOUNTRY    | Orders.SHIPPERCOUNTRY
SHIPPERPHONE      | Orders.SHIPPERPHONE
SHIPPERFAX        | Orders.SHIPPERFAX
SHIPPERREGION     | Orders.SHIPPERREGION
SHIPPERPOSTALCODE | Orders.SHIPPERPOSTALCODE
21. Insert a new output column above ORDERDATE called ORDER_TAKEN_BY with a data type
of varchar(10) and map it to Orders.EMPLOYEEID
22. Insert a new output column above ORDERDATE called ORDER_ASSIGNED_TO with a data
type of varchar(10) and map it to Orders.EMPLOYEEID
23. Add a Validation transform to the right of the Query transform and connect the transforms
24. Enable validation for Order_Assigned_To to verify the column value exists in the
EMPLOYEEID column of the Employee table in the HR_datamart datastore
25. Set the action on failure for the Order_assigned_To column to send to both pass and fail
For Pass, substitute 3Cla5 to assign it to the default employee
26. Enable validation for the ShipperFax column to send NULL values to both pass and fail,
substituting No Fax for pass
27. Add two target tables in the Delta datastore as targets, one named Orders_DB_Work and
one named Orders_DB_No_Fax
28. Connect the pass output from the Validation transform to Orders_DB_Work and fail to
Orders_DB_No_Fax
29. Execute Alpha_Orders_Validated_JOB with the default execution properties and save all
objects
30. View the data in the target tables to view the differences between passing and failing
records.
Merge Transform
8.1.1 Scenario
The Orders data has now been validated, but the output is from two different sources: flat files and
database tables. The next step in the process is to modify the structure of those data sets so they
match, and then merge them into a single data set.
8.1.2 Objective
Use the Query transform to modify any column names and data types and to perform lookups for
any columns that reference other tables.
8.1.3 Instructions
1. In the Basic_Training project, create a new batch job called Alpha_Orders_Merged_JOB
with a data flow called Alpha_Orders_Merged_DF
2. In the data flow workspace, add the orders_file_work and orders_db_work tables from the
Delta datastore as the source objects.
3. Add two Query transforms to the data flow, connecting each source object to its own Query
transform
4. In the transform editor for the Query transform connected to the orders_files_work table, map
all the columns from input to output
5. Change the data type for the following columns as specified:
Column            | Data type
ORDERDATE         | Datetime
SHIPPERADDRESS    | Varchar(100)
SHIPPERCOUNTRY    | Varchar(50)
SHIPPERREGION     | Varchar(50)
SHIPPERPOSTALCODE | Varchar(50)
g) Click Alpha
h) Click OK
i) Click Country(Alpha.alpha)
j) Click OK
k) Click COUNTRYNAME
s) Click COUNTRYNAME
t) Click Finish
7. For the SHIPPERREGION column, change the mapping to perform a lookup of RegionName
from the Region table in the Alpha datastore
8. In the transform editor for the Query transform connected to the orders_db_work table, map
all columns from the input to output.
9. Change the data type for the following columns as specified:
Column         | Data type
SHIPPERCOUNTRY | Varchar(50)
SHIPPERREGION  | Varchar(50)
10. For the SHIPPERCITY column, change the mapping to perform a lookup of CityName from
the City table in the Alpha datastore
11. For the SHIPPERCOUNTRY column, change the mapping to perform a lookup of
CountryName from the Country table in the Alpha datastore.
12. For the SHIPPERREGION column, change the mapping to perform a lookup of
RegionName from the Region table in the Alpha datastore.
13. Add a Merge transform to the data flow and connect both Query transforms to the Merge
transform.
14. Add a template table called Orders_Merged in the Delta datastore as the target table and
connect it to the Merge transform.
15. Execute Alpha_Orders_Merged_JOB with the default execution properties and save all
objects.
16. View the data in the target table
17. Notice that the SHIPPERCITY, SHIPPERCOUNTRY, and SHIPPERREGION columns for the
363 records in the template table consistently contain names rather than ID values
Case Transform
The Case transform supports separating data from a source into multiple targets based on branch
logic and does not offer any other options
9.1.1 Scenario
Once the orders have been validated and merged, the resulting data set must be split out by quarter
for reporting purposes.
9.1.2 Objective
Use the Case transform to create separate tables for orders occurring in each fiscal quarter
from Q4 2006 through Q4 2007.
9.1.3 Instructions
1. In the Basic_Training project, create a new batch job called Alpha_Orders_By_Quarter_JOB
with a data flow called Alpha_Orders_By_Quarter_DF
2. In the data flow workspace, add the Orders_Merged table from the Delta datastore as the
source object.
3. Add a Query transform to the data flow and connect it to the source object.
4. In the transform editor for the Query transform, map all columns from input to output.
5. Add the following two output columns:
Column       | Data type  | Mapping
ORDERQUARTER | Int        | QUARTER(orders_merged.ORDERDATE)
ORDERYEAR    | Varchar(4) | TO_CHAR(orders_merged.ORDERDATE,'YYYY')
6. Add a Case transform to the data flow and connect it to the Query transform
7. In the transform editor for the Case transform, create the following labels and associated
expressions:
Label  | Expression
Q42006 | Query.ORDERYEAR = '2006' and Query.ORDERQUARTER = 4
Q12007 | Query.ORDERYEAR = '2007' and Query.ORDERQUARTER = 1
Q22007 | Query.ORDERYEAR = '2007' and Query.ORDERQUARTER = 2
Q32007 | Query.ORDERYEAR = '2007' and Query.ORDERQUARTER = 3
Q42007 | Query.ORDERYEAR = '2007' and Query.ORDERQUARTER = 4
8. Choose the settings to not produce a default output set for the Case transform and to specify
that rows can be true for one case only.
9. Add five template tables in the Delta datastore called Orders_Q4_2006, Orders_Q1_2007,
Orders_Q2_2007, Orders_Q3_2007, and Orders_Q4_2007.
10. Connect the output from the Case transform to the target tables selecting the corresponding
labels.
11. Execute Alpha_Orders_By_Quarter_Job with the default execution properties and save all
objects.
12. View the data in the target tables and confirm that there are 103 orders that were placed in
Q1 of 2007
13. Notice that orders_Q1_2007 only contains data with an ORDERQUARTER value of 1 and
an ORDERYEAR value of 2007.
The source table has been correctly sorted into the target tables according to
ORDERQUARTER and ORDERYEAR values.
10.1.2 Objective
Use the SQL transform to select employee and department data
10.1.3 Instructions
1. In the Basic_Training project, create a new batch job called Alpha_Employees_Dept_JOB
with a data flow called Alpha_Employees_Dept_DF
2. In the data flow workspace, add the SQL transform as the source object
3. Add the Emp_Dept table from the HR_datamart as the target object, and connect the
transform to it.
4. In the transform editor for the SQL transform, specify the appropriate datastore name and
database type for the Alpha datastore.
5. Create a SQL statement to select the last name and first name for the employee from the
Employee table and the department in which the employee belongs by looking up the value in
the Department table based on the Department ID
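A minimal sketch of such a statement is shown below. The column names LASTNAME, FIRSTNAME, DEPARTMENTID, and DEPARTMENTNAME are illustrative assumptions, not confirmed by this document, so adjust them to the actual Alpha schema:

```sql
-- Hypothetical column names; adjust to the actual Alpha schema
SELECT e.LASTNAME,
       e.FIRSTNAME,
       d.DEPARTMENTNAME
FROM employee e
INNER JOIN department d
        ON e.DEPARTMENTID = d.DEPARTMENTID
```

Note that an inner join of this shape drops employees whose department ID has no match in the Department table, which is consistent with the row count observed in the final step.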
8. Execute Alpha_Employees_Dept_JOB with the default execution properties and save all objects.
9. Return to the data flow workspace and view data for the target table.
10. Notice that you should have 40 rows in your target table, because there were 8 employees in
the employee table with department IDs that were not defined in the department table.