
Informatica Developer's Handbook

Informatica Architecture CommonLE Integration Design

Table of Contents
1 Background
2 Detailed ETL Procedures
3 Informatica Standards
4 Build and Unit Test Activities
Appendix A: Step-by-Step Application of Code Template to Core Processes
Appendix B: Accessing CommonLE Logs
Appendix C: Implementing Record-Level Exception Logging into Core Processes
Appendix D: Implementing Record-Level Audit Logging into Core Processes


1 Background
1.1 Purpose
This document has been created to provide a more detailed understanding of the ETL patterns and the usage of Informatica as it relates to Project OneUP. It should be leveraged during the technical design and build phases of the development effort. This document is NOT static: as architecture patterns evolve and new best practices are introduced and implemented, the pages that follow will be updated to reflect these changes.

1.2 Intended Audience


This documentation is geared toward Integration Solution Architects (ISAs), Technical Designers, and Informatica conversion and interface developers. ISAs will gain a deeper knowledge of the technology used to extract and load data from one system to the next; with this knowledge, they will be prepared to ask better questions of the business process teams, improving both the quality of data transfer and the quality of the SID documentation. Technical Designers will use this documentation to understand when to utilize various extract and load strategies, what types of data conversion database objects need to be created, and how conversions and interfaces differ both as business processes and as units of code. Developers will use this as a guideline for standards, conventions, and best practices, and as a first resource for answering questions relevant to development.


2 Detailed ETL Procedures


2.1 Informatica ETL Interface Strategies
Within each of the patterns, a typical code design approach is outlined. In addition to this brief outline, the section on Workflow Development will also delineate the constructs of the process flow and workflow details within Informatica.

2.1.1 Interface Patterns


Interfaces that are developed using Informatica as the middleware technology will typically be point-to-point batch routines that are scheduled for source and target. The AI interface pattern document outlines each pattern identified for Project OneUP.

2.1.1.1 Detailed Logical Architecture

1. Audit log is triggered to denote middleware will be receiving data.
2. Source data is extracted via the specific source extract strategy defined for the interface.
   a. Source data is pulled directly from the source.
   b. Data is staged within the middleware database to support multiple requirements for the source data.
3. Data is transformed via the ETL tool into the target-specific format(s).



4. Cross reference lookups are performed during the source-to-target mapping.
5. Data is marked for insert/update/delete to the target application.
6. Data is loaded to the target application based upon the format specified.
7. Audit log is triggered to denote middleware has processed the data.
8. Error handling will be triggered based upon the status of preceding steps.
   a. All-or-nothing error handler
   b. Record-by-record error handler

This interface pattern does not require use of the middleware database. The middleware database (labeled Batch Data Store) in Step 2 is utilized to accomplish any one of the following requirements of the business process:

- Multiple passes through each received data set (for example, if source data is sent only once and multiple mappings require this information, it is best to store the data within a database so that one process receives the data and multiple processes load it)
- Audit trail for logging purposes
- SOX compliance requirements
- Error handling

2.2 Informatica Error Logging and Exception Handling

2.2.1 Informatica Standard Task-Level Error Logging
When logging audit and exception data to the CommonLE, either task-level or row-level error logging can be utilized. Task-level logging is required by all interfaces to track the failure or success of all interface sessions within a workflow. The standard implementation is outlined in the Appendix for Audit Log and Error Messaging (CommonLE).

2.2.2 Informatica Row Level Error Logging


Row-level error logging is specified by business requirements and is implemented either through one of the exception patterns described in the Informatica Error Handling Design document or by using Informatica's row-level error logging functionality (verbose logging). As an alternative to the exception patterns, verbose logging within Informatica can be utilized; keep in mind that verbose logging within the Informatica session can greatly reduce the performance of the session run.

When configuring sessions, a developer has multiple options for error handling, error logging, and traceability levels. When an error occurs at the transformation level (per row/record), the PowerCenter Server logs error information that allows a support team to determine the cause and source of the error. Row error logging may be captured in a database format or in flat file structures. For Project OneUP, a decision has been made to use the database format option for row error logging purposes. The relational database structure will allow the Application Integration team to standardize the format and content of the error logs and manage this portion of the application in one central location.

In addition to capturing error data based upon the row being processed within transformations, the PowerCenter Server may also be able to capture the source data associated with the row in a transformation. However, Informatica will be unable to create a link between the row-level error in a transformation and the source record within the source qualifier if the error occurs after an active source. An active source within Informatica is defined as an active transformation used to generate rows. The following transformations are classified as active:

- Aggregator
- Application Source Qualifier
- Custom, configured as an active transformation (it has been assumed that SAP custom transformations fall into this category as well)
- Joiner
- MQ Source Qualifier
- Normalizer (VSAM or pipeline)
- Rank
- Sorter
- Source Qualifier
- XML Source Qualifier
- Mapplet, if it contains any of the above active transformations

By default, the PowerCenter Server logs all transformation errors in the session log file and all rejected target records in the reject (bad) file. When row error logging has been enabled, all such information is instead filtered to the error log database/flat file structures. If the architecture landscape determines that all errors should reside in both the error logging structures and the standard session log and reject/bad file, then the configuration should include enabling Verbose Data Tracing. All of this additional logging may negatively impact the performance of sessions and workflows executed on the PowerCenter server, as data is processed on a row-by-row basis instead of a block of records at once.


3 Informatica Standards
3.1 Workflow Development
For each business object, it is possible that multiple workflows exist to perform the full spectrum of interface activities from legacy to SAP. A workflow is defined as a set of sessions and other tasks (commands calling shell scripts, decision and control points, e-mail notifications, etc.) organized in sequential and/or parallel processing streams. Each workflow will execute a mapping or series of mappings that extract source data and load it into target systems. Working with the AI team, each Solution Integration Design will need to be modularized into workflows that perform the required predefined business functions. As a result, the interface programs built for a particular business object within the Solution Integration Design documentation could span multiple workflows and thus multiple technical design documents (as each technical design is at the workflow level).

3.2 Code Naming Standards


The following tables reflect the naming standards that have been outlined in PepsiCo's ETL-Informatica-Design-BestPractices document.

3.2.1 Code Comments


Within the Informatica code base, mappings, sessions, and workflows each have a high-level description or comment field that is displayed when editing any of these units of code. Within this description field, be sure to add text that defines the author, date of comment, description of the mapping/session/workflow, and a version control section. Below is a sample of the mapping description that should be inserted into each mapping built for QTG1.

Author: Developer Name
Date: 01/01/2005
Description: This mapping performs the core functionality for the XYZ interface.
================
Revision History:
================
1.0 01/01/2005 - Initial development
In addition to this comment, each of the transformations within a mapping should also have a brief explanation defining its functionality within the mapping.

3.2.2 Transformation Naming Standards


Source Definition: [table_name] or [flat_file_name]
The source definition should carry the same name as the Flat File or Relational Table that it was imported from. If the source was created from a shortcut, that should be indicated in the name.

Target Definition: [table_name] or [flat_file_name]_ACTION
The target definition should carry the same name as the Relational Table it was imported from. If the target was created from a shortcut, that should be indicated in the name. Flat File targets should have _FF at the end of the name. The ACTION will correspond to the DML being performed on the target: INS, UPD, DEL.

Source Qualifier: sq_[source_name] or sqo_[source_name]
sq_<name of source>, or sqo_<name of source> if a SQL override is used.



Expression: exp_[RelevantDescriptor]
exp_RelevantDescriptionOfTheProcessBeingDone

Update Strategy: upd_[target_name]_ACTION
An update strategy should have a suffix appended to it corresponding with the particular action (INS, UPD, DEL).

Router: rtr_[RelevantDescriptor]
rtr_RelevantDescriptionOfTheProcessBeingDone

Filter: fltr_[RelevantDescriptor]
fltr_RelevantDescriptionOfTheProcessBeingDone

Aggregator: agg_[RelevantDescriptor]
agg_RelevantDescriptionOfTheProcessBeingDone

Lookup: lkp_[source_name], lkp_[RelevantDescriptor], or lkpo_[RelevantDescriptor]
If one table: lkp_LookupTableName. If multiple tables are joined to bring back a result: lkp_RelevantDescriptionOfTheProcessBeingDone. If a SQL override is used: lkpo_...

Sequence Generator: seq_[RelevantDescriptor]
Typically the description is based upon the target table and the primary key column that the sequence will be populating.

Stored Procedure: sp_StoredProcedureName
Used when executing stored procedures from the database.

External Procedure: ext_ProcedureName
Used for external procedures.

Advanced External Procedure: aep_ProcedureName
Used for advanced external procedures.

Joiner: jnr_SourceTable/FileName1_SourceTable/FileName2
Used to join disparate source types: Oracle to Flat File, for example.

Normalizer: nrm_[RelevantDescriptor]
Used to create multiple records from the one record being processed. For example: nrm_Create_Error_Messages.

Rank: rnk_[RelevantDescriptor]
rnk_RelevantDescriptionOfTheProcessBeingDone

Mapplet: mplt_[RelevantDescriptor]
mplt_RelevantDescriptionOfTheProcessBeingDone

Sorter: srt_[RelevantDescriptor]
srt_RelevantDescriptionOfTheProcessBeingDone

Transaction Control: tc_[RelevantDescriptor]
tc_RelevantDescriptionOfControl

Union: un_[RelevantDescriptor]
un_RelevantDescriptionOfUnion

XML Parser: xmp_[RelevantDescriptor]
xmp_RelevantDescriptionOfXMLParser

XML Generator: xmg_[RelevantDescriptor]
xmg_RelevantDescriptionOfGenerator

Custom Transformation: ct_[RelevantDescriptor]
ct_RelevantDescriptionOfCustomTransformation

IDoc Interpreter: int_[RelevantDescriptor]
int_idoc_RelevantDescriptionOfCustomTransformation

* Wherever possible, transformations should use the $PMRootDir/<release>/Temp and $PMRootDir/<release>/Cache directories. Such transformations include, but are not limited to:

Sorter: $PMRootDir/<release>/Temp
Joiner: $PMRootDir/<release>/Cache
Aggregator: $PMRootDir/<release>/Cache
Lookup: $PMRootDir/<release>/Cache
Rank: $PMRootDir/<release>/Cache


3.2.3 Informatica Code Object Naming Standards


Mapping: m_<RICEF_TYPE>_<PROCESS_AREA>_<SOURCE>_<TARGET>_<Optional Information>
The mapping is the main unit of code for Informatica. It is important to include the RICEF type; typically it will be CONV for Conversions. The target is required, and the source is typically used when trying to differentiate among multiple mappings that affect the same target. Version numbers will not be used for this implementation.

Session: s_m_<RICEF_TYPE>_<PROCESS_AREA>_<SOURCE>_<TARGET>_<Optional Information>
s_m_MappingName without the version number attached. The session is the wrapper for the mapping containing all connection information necessary to extract and load data.

Workflow: wf_<RICEF_TYPE>_<PROCESS_AREA>_<SPECIFIC_DESCRIPTOR or BUSINESS_OBJECT>_<SRC>_<TGT>_<Optional Information> (e.g., wf_INTFC_ISCP_INVENT_INFO_BW_I2)
The workflow is a job stream that strings all necessary tasks together to create a data flow from source to target systems.

Worklet: wklt_description
Worklets are objects that represent a set of workflow tasks, allowing you to reuse a set of workflow logic in several workflows.

Reusable Session: rs_description
A session that may be shared among several workflows and may execute while another instance of the same session is running.

Control Task: cntrl_description
You can use the Control task to stop, abort, or fail the top-level workflow or the parent workflow based on an input link condition.

Event Task: evnt_description
The Event-Raise task represents a user-defined event. When the Informatica Server executes the Event-Raise task, it triggers the event. Use the Event-Raise task with the Event-Wait task to define events. The Event-Wait task waits for an event to occur; once the event triggers, the Informatica Server continues executing the rest of the workflow.

Decision Task: dcsn_description
The Decision task allows you to enter a condition that determines the execution of the workflow, similar to a link condition.

Command Task: cmd_description
The Command task allows you to specify one or more shell commands to run during the workflow. For example, you can specify shell commands in the Command task to delete reject files, copy a file, or archive target files.

Email Task: eml_description
The Workflow Manager provides an Email task that allows you to send email during a workflow. You can create reusable Email tasks in the Task Developer for any type of email, or non-reusable Email tasks in the Workflow and Worklet Designer.

Assignment Task: asmt_description
The Assignment task allows you to assign a value to a user-defined workflow variable.

Timer Task: tm_description
The Timer task allows you to specify the period of time to wait before the Informatica Server executes the next task in the workflow. You can choose to start the next task in the workflow at an exact time and date, or wait a period of time after the start time of another task, workflow, or worklet before starting the next task.

3.2.4 Port Variable Naming Standards


Variable: v_RelevantName
Used in expression transformations.

Output: o_RelevantName or out_RelevantName (only set this for new output ports created in an expression transformation)
Used in expression transformations to define the outgoing port for use in subsequent transformations.

Input: i_RelevantName or in_RelevantName (only set this for input ports into a lookup)
Used in lookup and expression transformations to denote ports that are used within the transformation and do not carry forward.

Lookup: lk_RelevantName (only set this in transformations for ports that originated in a lookup transformation)
Used in expression transformations for unconnected lookups.

Return: r_RelevantName
Return values are found in lookup transformations and are typically the column from the source object being referenced in the lookup code.

3.3 Connection Configuration Standards


Each session within the workflow is associated to a mapping. The mapping consists of source, target, and transformation objects. Within each of the source and target objects are connection parameters, which are configured at the session level in Workflow Manager. The connection strings are documented in the QTG2 Informatica Connections List.xls spreadsheet. This document can be found under the following StarTeam directory: 1UP - Informatica\QTG2\Supplement.

3.4 General Best Practices

3.4.1 Log File Names


Validate that all file names for logs match the unit of code. When a workflow name is changed from wf_INT_LOAD to wf_INTFC_LOAD, for example, the log file will remain wf_INT_LOAD.log until the developer changes the log file name. This is true of sessions as well. Validate that all workflow and session log names match the name of the corresponding unit of code.

3.4.2 Session Development Standards


All session parameters need to be set in the Task Developer at the session level and not overridden in the workflow (in Workflow Manager).

3.4.3 Lookup Transformations


Lookups should be created to return a default value of -1 in case of a lookup failure.
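As an illustrative sketch, a downstream expression port can test for that sentinel and set the error code used by the exception-handling components in Appendix C (the lookup port name lkp_PAYMENT_TERMS_ID below is hypothetical):

-- Expression behind a variable/output port (e.g., v_ERR_CODE):
-- flags the record when the lookup returned its -1 default.
IIF(lkp_PAYMENT_TERMS_ID = -1, 'LOOKUP_PROCEDURE_ERROR', NULL)

A router group condition such as NOT ISNULL(v_ERR_CODE) can then direct flagged rows to the exception path.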


3.5 Informatica Middleware Environment Standards

3.5.1 Informatica Directory Structures


QTG1
INF Dev - phgp0233:
/etlapps/dev/71/qtg1/SrcFiles/
/etlapps/dev/71/qtg1/TgtFiles/

INF QA - phgp0232:
/etlapps/fit/71/qtg1/SrcFiles/
/etlapps/fit/71/qtg1/TgtFiles/

QTG2
INF Dev - phgp0233:
/etlapps/dev/81/qtg2/SrcFiles/
/etlapps/dev/81/qtg2/TgtFiles/

INF QA - phgp0232:
/etlapps/fit/81/qtg2/SrcFiles/
/etlapps/fit/81/qtg2/TgtFiles/

3.5.2 FMS Directory Structure on Informatica Server


INF Dev - phgp0233:
/etlapps/dev/81/p1up_shared/fms/

INF QA - phgp0232:
/etlapps/fit/81/p1up_shared/fms/

3.5.3 FMS Control File Names


(By default, Informatica does not use control files to send files via FMS.) All FMS Control Files should use the following naming standard:

FMS_<Process Area>_<Src\Tgt>_<Business_Object>_<File_Description>.xml

* For mainframe systems, use "." in place of "_".
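Applying this standard to the ItemSiteMaster example used in the following sections (process area ISCP, target I2RP, business object Transportation Lanes), the control file name would be:

FMS_ISCP_I2RP_TRNLANES_ITEMSITEMASTER.xml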

3.5.4 Informatica Flat File Naming Standard


All files brought into or sent from the middleware layer should adhere to the standard below. (Note: this assumes that FMS will be able to rename files from Source and to Target.)

<Process Area>_<Src\Tgt>_<Business_Object>_<File_Description>.yyyyMMddHHmmss.RDY

(The timestamp is an optional field, to be used when multiple files may appear before being processed.)

Example: the ItemSiteMaster file for the ISCP process area, business object Transportation Lanes, for I2RP would be as follows:

ISCP_I2RP_TRNLANES_ITEMSITEMASTER.yyyyMMddHHmmss.RDY
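A minimal shell sketch for stamping an outbound file per this standard (the staging file name is hypothetical; the target directory is the QTG2 dev path from Section 3.5.1):

#!/bin/sh
# Generate the optional timestamp component of the flat file name.
TS=$(date +%Y%m%d%H%M%S)
FILE_NAME="ISCP_I2RP_TRNLANES_ITEMSITEMASTER.${TS}.RDY"
# Move the generated extract into the middleware target file directory.
mv itemsitemaster.out "/etlapps/dev/81/qtg2/TgtFiles/${FILE_NAME}"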

3.5.5 Informatica Middleware Staging Table Naming Standards


All source and target staging tables will consist of a common set of columns, not including the data columns required for each specific interface:



- Transaction ID: unique sequence number for each record per interface run.
- Timestamp: date/time stamp when the record was inserted into the staging table.
- Status: flag to indicate whether the record has been processed, completed, failed, etc.
- Transaction Name: name of the interface.

The STATUS field can consist of the following values (depending on the interface, not all STATUS codes will be used):

- N (New): flag indicating that the record has been successfully inserted into the staging DB.
- P (Processing): flag indicating that the middleware application is processing the record.
- C (Complete): flag indicating that the middleware application has successfully processed the record.
- F (Failed): flag indicating that the middleware application has failed to process the record. (Assumption: depending on interface business rules, failed records will remain in the staging table until successfully processed.)

Table Design:

Name               Type      Null
TRANSACTION_ID     VARCHAR2  No
CREATE_DTM         DATE      No
STATUS             VARCHAR2  No
TRANSACTION_NAME   VARCHAR2  No

Table naming standards for a source system loading data into middleware staging are:

<Process Area>_SRC_<Src\Tgt>_<Business_Object>_<Table_Name>

Example: the ItemSiteMaster table for the ISCP process area, business object Transportation Lanes, from BW would be as follows:

ISCP_SRC_BW_TRNLANES_ITEMSITEMASTER

The same applies when the middleware application needs to load data into middleware staging before sending to the target system:

<Process Area>_TGT_<Src\Tgt>_<Business_Object>_<Table_Name>

Example: the ItemSiteMaster table for the ISCP process area, business object Transportation Lanes, to I2RP would be as follows:

ISCP_TGT_I2RP_TRNLANES_ITEMSITEMASTER
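As a sketch, the source staging example above could be created with the four standard columns as follows (the column lengths are assumptions; the interface-specific data columns are omitted):

CREATE TABLE ISCP_SRC_BW_TRNLANES_ITEMSITEMASTER (
    TRANSACTION_ID    VARCHAR2(20)  NOT NULL,  -- unique sequence number per record per run
    CREATE_DTM        DATE          NOT NULL,  -- insert date/time stamp
    STATUS            VARCHAR2(1)   NOT NULL,  -- N, P, C, or F
    TRANSACTION_NAME  VARCHAR2(100) NOT NULL   -- name of the interface
    -- interface-specific data columns follow
);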

3.6 Control-M Execution of Workflows


Most, if not all, of the interfaces built within Informatica will be executed using PepsiCo's global scheduling tool, Control-M. In most cases, Control-M will not only trigger Informatica workflows but also SAP and Legacy specific jobs. Each Control-M job will be linked to other jobs within the group pertaining to a particular interface. These dependencies are driven by the return codes of each of the individual jobs within the job group. To manage the execution of the workflows and return codes to Control-M, each interface built within Informatica will be executed via a Unix shell script. Below is the basic structure of the shell script:
#!/bin/sh
###############################################################################
## Variables used for commencement of the Project OneUP IDoc Listener Workflow
###############################################################################

###########################################
## Creating Variables for Execution      ##
## USERNAME, PASSWORD, and INFORMAT_PORT ##
###########################################
. //schedapps/p1up/env_p1up_batch.sh
## For QTG2 and PCNA1 interfaces, source this environment script instead:
## . //schedapps/p1up/env_p1up_batch_qtg2.sh

###############################################################################
##
## Used to start Project OneUP Informatica Workflow
##
###############################################################################
//schedapps/p1up/start_workflow.sh US_CORP_1UP_QTG1_INTFC wf_INTFC_QTG1_SHARED_IDOC_LISTENER -wait

The environment-setup section at the top of the script provides the proper initialization of the environment variables for the start_workflow.sh script; the user name, password, and Informatica port number are set within the env_p1up_batch.sh script. The start_workflow.sh call provides the core functionality of the script. There are two versions of this line: start_workflow.sh and stop_workflow.sh. In nearly all situations, start_workflow.sh is used with a wait condition; the only Informatica component that uses stop_workflow.sh is the IDoc Listener, which is started without a wait condition. Three parameters are supplied to the start_workflow.sh and stop_workflow.sh scripts: the folder name (US_CORP_1UP_QTG1_INTFC above), the workflow name (wf_INTFC_QTG1_SHARED_IDOC_LISTENER above), and the wait condition (-wait). The wait condition should be used by most interfaces, as this allows the workflow to complete prior to sending a return code to Control-M. This is important because the return code is responsible for communicating success or failure to Control-M, and Control-M uses this return code to dictate execution of subsequent jobs in the group.

There will be a script implemented for each interface. The script name should conform to the following standard: p1up_qtg2_<interface name>. The parameter values for each script will be interface specific. To manually start the Informatica workflow without Control-M, run the start_workflow.sh for that particular interface from the /schedapps/p1up directory.
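A minimal sketch of such a per-interface wrapper, assuming a QTG2 interface (the interface, folder, and workflow names below are placeholders; actual values come from the interface's technical design):

#!/bin/sh
## p1up_qtg2_sample_interface
## Initialize USERNAME, PASSWORD, and INFORMAT_PORT for QTG2/PCNA1 interfaces.
. //schedapps/p1up/env_p1up_batch_qtg2.sh
## Start the workflow and wait, so the return code passed to Control-M
## reflects the success or failure of the entire workflow.
//schedapps/p1up/start_workflow.sh US_CORP_1UP_QTG2_INTFC wf_INTFC_QTG2_SAMPLE -wait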


4 Build and Unit Test Activities


During the development cycle, each developer should focus build and unit test activities on the sessions that perform the extract and load procedures for the interface. All unit test scripts should be completed for these main components. Upon successful completion of these unit test activities, a developer should work with the development lead to incorporate the CommonLE components into an existing workflow. After walking through the following procedures with the development leads, any developer working on multiple interfaces will have the basic understanding of the constructs and organization of the standard interface wrapper to develop and test the wrapper for subsequent interfaces.

4.1 PowerCenter Designer Tasks


Each developer will need to create shortcuts to the following three SHARED mappings from the SHARED_US_CORP_1UP folder:

- m_P1UP_SHARED_AUDIT_LOG_BEGIN
- m_P1UP_SHARED_AUDIT_LOG_END
- m_P1UP_SHARED_ERROR_MESSAGING

DO NOT DIRECTLY COPY THESE MAPPINGS INTO YOUR DEVELOPMENT FOLDER. Shortcuts are required so that each developer is referencing the latest version of the code. If the mapping changes within the Shared folder, those changes will be propagated into the developer's folder as well. Changes may impact the developer's session and its ability to execute, but this type of error should not be difficult to resolve with either a validation of the session or a slight configuration change.

Screenshot 7.1.1.a: This demonstrates the creation of a SHORTCUT into a developer folder. Notice the shortcut icon on each mapping that was added.

4.2 PowerCenter Workflow Manager Tasks


After the mapping shortcuts have been created in the developer's folder, the associated sessions can be copied as well. The following four sessions will be copied:

- s_m_P1UP_SHARED_AUDIT_LOG_BEGIN_SAMPLE
- s_m_P1UP_SHARED_AUDIT_LOG_END_SUCCESS_SAMPLE
- s_m_P1UP_SHARED_AUDIT_LOG_END_FAILURE_SAMPLE
- s_m_P1UP_SHARED_ERROR_MESSAGING_SAMPLE

To copy these sessions, follow these instructions:



1.) Connect to and open the desired developer folder.
2.) Connect to, but do NOT open, SHARED_US_CORP_1UP.
3.) Highlight all four sessions related to audit and error logging within this folder. Use the Edit menu to select Copy. (Screenshot 7.1.2.a)

4.) Navigate to the developer's folder that is currently open and Paste using the Edit menu.

Screenshot 7.1.2.b


5.) Step 4 will cause a new window, the Copy Wizard, to appear. The Copy Wizard is designed to help eliminate any conflicts Workflow Manager detects when copying sessions or workflows from one folder to the next. This wizard should determine that there is a conflict with regard to the session/mapping associations. For each mapping/session combination, you will need to go through and select the mapping shortcut you previously created. Screenshot 7.1.2.d demonstrates the resolution of the conflict.

Screenshot 7.1.2.c Copy Wizard

Screenshot 7.1.2.d Resolution


6.) Click Next>> and Finish to complete the wizard.
7.) You should now have copies of those sessions. Rename each of the sessions you copied to align with the interface you are building. The following is the naming convention to follow for each reusable session:
- s_m_INTFC_[interface acronym]_AUDIT_LOG_BEGIN
- s_m_INTFC_[interface acronym]_AUDIT_LOG_END_SUCCESS
- s_m_INTFC_[interface acronym]_AUDIT_LOG_END_FAILURE
- s_m_INTFC_[interface acronym]_ERROR_MESSAGING
8.) Lastly, each of these sessions will require parameter file entries within the following text files on the Unix servers:
//etlapps/[phase]/71/qtg1/Scripts/US_CORP_1UP_QTG1_INTFC_begin_audit_parms.txt
//etlapps/[phase]/71/qtg1/Scripts/US_CORP_1UP_QTG1_INTFC_end_audit_parms.txt
//etlapps/[phase]/71/qtg1/Scripts/US_CORP_1UP_QTG1_INTFC_error_parms.txt
9.) Refer to Section 4.3 for sample entries into the parameter files.

4.3 Mapping Parameters for Sessions


This table represents all of the parameters used for the CommonLE audit and error logging mappings and sessions, and which units of code (error messaging, audit begin, audit end) utilize each parameter. It is the developer's responsibility to determine the values for their work units and communicate that information to the development leads and the Informatica architect so that all documentation and code can be kept up-to-date.

$$INTERFACE_NAME (default: DEFAULT_INTERFACE_NAME; used by: error messaging, audit begin, audit end)
This value will correspond with the value used to insert into the INFA_INTERFACE_LOG table.

$$APPLICATION_ID (default: DEFAULT_APPLICATION_ID; used by: error messaging, audit begin, audit end)
This parameter identifies the Application from a CommonLE perspective.

$$SERVICE_NAME (used by: error messaging, audit begin, audit end)
This parameter will correspond with the numeric value of the Caliber ID for the interface object.

$$TRANSACTION_DOMAIN (default: DEFAULT_BUSINESS_OBJECT; used by: error messaging, audit begin, audit end)
This parameter will correspond with the name of the interface object and is directly related to the SERVICE_NAME numeric value.

$$APPLICATION_DOMAIN (default: DEFAULT_TARGET_SYSTEM; used by: error messaging, audit begin, audit end)
This parameter will correspond to the acronym for the target system or application.

$$SEVERITY_CODE (default: 0; used by: error messaging)
The severity code will be managed for the interface. Any error will be assigned the severity code for the entire interface.

$$WORKFLOW_NAME (default: DEFAULT_WORKFLOW_NAME; used by: error messaging, audit begin, audit end)
This matches the workflow name for all sessions in the interface.

$$NEXT_SESSION (default: DEFAULT_NEXT_SESSION; used by: audit begin)
This parameter will be the name of the subsequent session in the workflow.

$$AUDIT_STATUS (default: DEFAULT_AUDIT_STATUS; used by: audit end)
This parameter will be different for sessions that end the workflow successfully versus a session that ends the workflow with a failure. Usually two sessions, one for success and one for failure, exist after a decision task in the workflow, which analyzes the status of the workflow based upon its tasks.

$$PREVIOUS_SESSION (default: DEFAULT_PREVIOUS_SESSION; used by: audit end)
This parameter will be the name of the previously executed session in the workflow.

Below are samples from each of the parameter files.

Screenshot 7.1.3.a US_CORP_1UP_QTG1_INTFC_begin_audit_parms.txt
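A representative begin-audit entry looks like the following (session and parameter values are placeholders; see Appendix A for the exact procedure):

[US_CORP_1UP_QTG1_INTFC.s_m_INTFC_XYZ_AUDIT_LOG_BEGIN]
$$INTERFACE_NAME=SAMPLE_INTERFACE_NAME
$$APPLICATION_ID=1UP_QTG1_INF_DEV
$$SERVICE_NAME=12345
$$TRANSACTION_DOMAIN=BUSINESS_OBJECT_NAME
$$APPLICATION_DOMAIN=TARGET_APPLICATION
$$NEXT_SESSION=s_m_INTFC_NEXT_SESSION
$$WORKFLOW_NAME=wf_INTFC_XYZ_SAMPLE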


Screenshot 7.1.3.b US_CORP_1UP_QTG1_INTFC_end_audit_parms.txt

Screenshot 7.1.3.c US_CORP_1UP_QTG1_INTFC_error_parms.txt


4.4 Build Completion and Next Steps

4.4.1 String / Assembly Testing
For string and assembly testing, all code will need to be moved into the project-specific string/assembly test folder (QTG1_INTFC). Shortcuts for the shared mappings already exist in these folders; therefore, the development lead will only be responsible for migrating the sessions and workflows into the project folder. The development lead will need to re-point each session to use the mapping shortcuts already created within the project folder. In addition, the parameter files must be changed to reflect the new folder in which all code resides. These modifications should complete the migration into the project folders.

4.4.2 Migration to QA, UAT, and PROD


The parameter files and any scripts related to the interface workflows should be migrated from PHGP0233 to PHGP0232 and PHGP0234 accordingly. Unless environment-specific details are referenced in scripts, no additional modifications would be necessary.


Appendix A: Step-by-Step Application of Code Template to Core Processes


This appendix provides developers with a common architecture and code template for building interfaces that publish messages for posting into the CommonLE. It also provides the development leads with a checklist to walk through each interface and determine whether the code has been modified according to the necessary steps.

1) Create a copy of the following mappings from the SHARED_US_CORP_1UP folder into your current folder:
   i) m_P1UP_SHARED_AUDIT_LOG_BEGIN
   ii) m_P1UP_SHARED_AUDIT_LOG_END
   iii) m_P1UP_SHARED_SUMMARY_ERROR_MESSAGING
   iv) m_P1UP_SHARED_INTFC_ERR_LOG_MESSAGING
   v) m_P1UP_SHARED_INTFC_AUDIT_LOG_MESSAGING
2) Create a session using the mapping m_P1UP_SHARED_AUDIT_LOG_BEGIN. To save time, create a copy of the session s_m_P1UP_SHARED_AUDIT_LOG_BEGIN_SAMPLE from folder SHARED_US_CORP_1UP.
3) Rename the session to comply with the following standard for interfaces:
   i) s_m_INTFC_[interface_abbreviation]_AUDIT_LOG_BEGIN
4) Double-click the session and click on the Properties tab. Change the session log file name to your_session_name.log.
5) On the Properties tab of your session, use the following value for the parameter file setting: $PMRootDir/ai/Scripts/US_CORP_1UP_AI_INTFC_begin_audit_parms.txt.
6) Click on the Mapping tab. For the target entitled shortcut_to_INFA_INTERFACE_LOG, change the reject file name to your_session_name.bad.
7) Log into the Unix command line for the Informatica server. Modify the parameter file for begin audit logs located in the //etlapps/dev/71/qtg1/Scripts directory. The file name will be US_CORP_1UP_AI_INTFC_begin_audit_parms.txt. To add the applicable data, copy and paste the following 8 lines into the parameter file and replace the parameter values with the values that pertain to your session:

[US_CORP_1UP_QTG1_INTFC.s_m_P1UP_SHARED_AUDIT_LOG_BEGIN_SAMPLE]
$$INTERFACE_NAME=SAMPLE_INTERFACE_NAME
$$APPLICATION_ID=1UP_QTG1_INF_DEV
$$SERVICE_NAME=12345 (Note: This is actually the Caliber ID)
$$TRANSACTION_DOMAIN=BUSINESS_OBJECT_NAME
$$APPLICATION_DOMAIN=TARGET_APPLICATION
$$NEXT_SESSION=s_m_INTFC_NEXT_SESSION
$$WORKFLOW_NAME=wf_P1UP_SHARED_INTERFACE_SAMPLE

Please refer to Section 4.3 for mapping parameters and parameter files.

8) Create a session using the mapping m_P1UP_SHARED_AUDIT_LOG_END_FAILURE. To save time, you can copy session s_m_P1UP_SHARED_AUDIT_LOG_END_FAILURE_SAMPLE from folder SHARED_US_CORP_1UP.
9) Rename the session to comply with the following standard for interfaces:
   i) s_m_INTFC_[interface_abbreviation]_AUDIT_LOG_END_FAILURE
10) Double-click the session and click on the Properties tab. Change the session log file name to your_session_name.log.
11) On the Properties tab of your session, use the following value for the parameter file setting: $PMRootDir/ai/Scripts/US_CORP_1UP_AI_INTFC_end_audit_parms.txt.
12) Create a session using the mapping m_P1UP_SHARED_AUDIT_LOG_END_SUCCESS. To save time, you can copy session s_m_P1UP_SHARED_AUDIT_LOG_END_SUCCESS_SAMPLE from folder SHARED_US_CORP_1UP.
13) Rename the session to comply with the following standard for interfaces:
   i) s_m_INTFC_[interface_abbreviation]_AUDIT_LOG_END_SUCCESS

14) Double-click the session and click on the Properties tab. Change the session log file name to your_session_name.log.
15) On the Properties tab of your session, use the following value for the parameter file setting: $PMRootDir/ai/Scripts/US_CORP_1UP_AI_INTFC_end_audit_parms.txt.
16) Log into the Unix command line for the Informatica server. Modify the parameter file for end audit logs located in the //etlapps/dev/71/qtg1/Scripts directory. The file name will be US_CORP_1UP_AI_INTFC_end_audit_parms.txt. To add the applicable data, copy and paste the following 18 lines into the parameter file and replace the parameter values with the values that pertain to your session:

[US_CORP_1UP_QTG1_INTFC.s_m_P1UP_SHARED_AUDIT_LOG_END_SUCCESS_SAMPLE]
$$INTERFACE_NAME=TECH_ARCH_TEAM
$$APPLICATION_ID=1UP_QTG1_INF_DEV
$$SERVICE_NAME=99999 (Note: This is actually the Caliber ID)
$$TRANSACTION_DOMAIN=TECH_ARCH_DOMAIN
$$APPLICATION_DOMAIN=TGT_TECH_ARCH
$$PREVIOUS_SESSION=s_m_P1UP_TECH_ARCH_SAMPLE
$$WORKFLOW_NAME=wf_P1UP_SHARED_INTERFACE_SAMPLE
$$AUDIT_STATUS=PROCESSED

[US_CORP_1UP_QTG1_INTFC.s_m_P1UP_SHARED_AUDIT_LOG_END_FAILURE_SAMPLE]
$$INTERFACE_NAME=TECH_ARCH_TEAM
$$APPLICATION_ID=1UP_QTG1_INF_DEV
$$SERVICE_NAME=99999 (Note: This is actually the Caliber ID)
$$TRANSACTION_DOMAIN=TECH_ARCH_DOMAIN
$$APPLICATION_DOMAIN=TGT_TECH_ARCH
$$PREVIOUS_SESSION=s_m_P1UP_TECH_ARCH_SAMPLE
$$WORKFLOW_NAME=wf_P1UP_SHARED_INTERFACE_SAMPLE
$$AUDIT_STATUS=FAILED

Please refer to Section 4.3 for mapping parameters and parameter files.
17) Create a session using the mapping m_P1UP_SHARED_SUMMARY_ERROR_MESSAGING. To save time, create a copy of the session s_m_P1UP_SHARED_SUMMARY_ERROR_MESSAGING_SAMPLE from folder SHARED_US_CORP_1UP.
18) Rename the session to comply with the following standard for interfaces:
   i) s_m_INTFC_[interface_abbreviation]_SUMMARY_ERROR_MESSAGING

19) Double-click the session and click on the Properties tab. Change the session log file name to your_session_name.log.
20) On the Properties tab of your session, use the following value for the parameter file setting: $PMRootDir/ai/Scripts/US_CORP_1UP_AI_INTFC_error_parms.txt.
21) Log into the Unix command line for the Informatica server. Modify the parameter file for exception logs located in the //etlapps/dev/71/qtg1/Scripts directory. The file name will be US_CORP_1UP_AI_INTFC_error_parms.txt. To add the applicable data, copy and paste the following 8 lines into the parameter file and replace the parameter values with the values that pertain to your session:

[US_CORP_1UP_QTG1_INTFC.s_m_P1UP_SHARED_ERROR_MESSAGING_SAMPLE]
$$INTERFACE_NAME=SAMPLE_INTERFACE_NAME
$$APPLICATION_ID=1UP_QTG1_INF_DEV
$$SERVICE_NAME=12345 (Note: This is actually the Caliber ID)
$$TRANSACTION_DOMAIN=BUSINESS_OBJECT_NAME
$$APPLICATION_DOMAIN=TARGET_APPLICATION
$$SEVERITY_CODE=3 (Note: This will be dependent upon the SID definition for the interface)
$$WORKFLOW_NAME=wf_P1UP_SHARED_INTERFACE_SAMPLE

Please refer to Section 4.3 for mapping parameters and parameter files.
22) Create a session using the mapping m_P1UP_SHARED_INTFC_ERR_LOG_MESSAGING. To save time, create a copy of the session s_m_P1UP_SHARED_INTFC_ERR_LOG_MESSAGING_SAMPLE from folder SHARED_US_CORP_1UP.
23) Rename the session to comply with the following standard for interfaces:
   i) s_m_INTFC_[interface_abbreviation]_INTFC_ERR_LOG_MESSAGING
24) Double-click the session and click on the Properties tab. Change the session log file name to your_session_name.log.
25) Click on the Mapping tab. For the target entitled INFA_INTERFACE_ERR_LOG1, change the reject file name to your_session_name.bad.
26) On the Properties tab of your session, use the following value for the parameter file setting: $PMRootDir/ai/Scripts/US_CORP_1UP_AI_INTFC_error_parms.txt. The same error parameter file will be leveraged throughout the record-level exception handling components. Copy the lines used for the summary exception messaging session and reference this new session. Keep these entries close together in case a change is required.
27) Create a session using the mapping m_P1UP_SHARED_INTFC_AUDIT_LOG_MESSAGING. To save time, create a copy of the session s_m_P1UP_SHARED_INTFC_AUDIT_LOG_MESSAGING_SAMPLE from folder SHARED_US_CORP_1UP.
28) Rename the session to comply with the following standard for interfaces:
   i) s_m_INTFC_[interface_abbreviation]_INTFC_AUDIT_LOG_MESSAGING
29) Double-click the session and click on the Properties tab. Change the session log file name to your_session_name.log.
30) Click on the Mapping tab. For the target entitled INFA_INTERFACE_AUDIT_LOG, change the reject file name to your_session_name.bad.
31) On the Properties tab of your session, use the following value for the parameter file setting: $PMRootDir/ai/Scripts/US_CORP_1UP_AI_INTFC_begin_audit_parms.txt. The same audit begin parameters will be leveraged throughout the record-level audit logging components for this session. Copy the lines used for the begin audit messaging session and reference this new session. Keep these entries close together in case a change is required.
32) For the core processing sessions, add the following entries to the workflow parameter file located at $PMRootDir/ai/Scripts/US_CORP_1UP_AI_INTFC_workflow_parms.txt:

[US_CORP_1UP_AI_INTFC.s_m_P1UP_SHARED_INTFC_AUDIT_LOG_MESSAGING_SAMPLE]
$$INTERFACE_NAME=SAMPLE_INTERFACE_NAME
Shortcut_to_mplt_Process_Audit_Logs.$$AUDIT_LOGGING_SWITCH=ON


Appendix B: Accessing CommonLE Logs


The following steps outline the process to log in to the CommonLE front-end application to view Informatica log entries.

1) The logs and errors are viewed via a web browser. Use the following link for the development environment: http://wlsite4.corp.pep.pvt:7229/1UPPepsiCSD/gologin.do

2) Enter the User Name and Password and click Submit.

3) The Welcome screen appears. Click Logs & Errors


4) Click View Logs and choose Application. You may use the other fields to narrow the search. Click the Submit button.


5) To view the details of a specific log, click the Application link.


6) The details of that specific log will be displayed at the bottom of the page.



Appendix C: Implementing Record-Level Exception Logging into Core Processes


Within the SID documentation associated with a given interface, an exception or error pattern will be selected by the Integration Solution Architect. These patterns identify how the business is required to track the data through an interface. Each pattern tracks exceptions at differing levels and completion alerts will also vary across the patterns. For those patterns that require record-level exception logging (Transmit with Workflow Success & Report Exceptions, Transmit with Workflow Failure & Report Exceptions, and Restrict with Workflow Failure & Report Exceptions), each developer will need to implement this mapplet into the core process of the interface workflow. For a design of the mapplet, please refer to the Informatica Error Handling Design documentation located in the same StarTeam directory.

Identifying Possible Error Locations


One of the outputs from the SID process is the identification of the error pattern for the interface and all potential exceptions within the business logic of the code. Throughout the core mappings for an interface, there will be several instances where errors can be captured and logged. Most frequently these locations will be directly after lookup procedure transformations or just prior to a target instance. Checks placed prior to target instances enforce target-specific load requirements proactively. For example, if field 4 is NOT NULLABLE in the target application, an expression or router must prevent records with no value in field 4 from being sent to the target and instead send this information as an alert to the CommonLE.

Add Evaluation Transformations


Within a mapping, routers will be the most frequent tool for evaluating record sets and choosing success or exception paths. Routers will contain the necessary groups to send appropriate data to the successful target instances and other groups to direct data to the mapplet for logging to the exception table in middleware.
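As a sketch tying this to the NOT NULLABLE example above, a router group condition can route incomplete records to the exception mapplet (FIELD_4 is the placeholder field name from that example):

-- Group "EXCEPTIONS": required target field is missing or blank
ISNULL(FIELD_4) OR LENGTH(LTRIM(RTRIM(FIELD_4))) = 0

-- The default group carries all remaining records forward to the target.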

Implementing the Mapplet


When an exception is encountered within the code, the mapplet will be utilized to insert that data into the exception table (INFA_INTERFACE_ERR_LOG) in a standard format.

The mapplet exposes the following input ports:

in_INTERFACE_NAME
- Port description: the name of the interface currently being executed. This parameter should be consistent across all of the parameter files for a given interface.
- Input value from mapping: $$INTERFACE_NAME from the workflow parameter file.

in_MAPPING_NAME
- Port description: local to the mapping itself; holds the full name of the mapping being executed.
- Input value from mapping: $$MAPPING_NAME from the workflow parameter file.

in_TRANSFORMATION_NAME
- Port description: defined as a constant within a transformation in the mapping; corresponds to the name of the transformation where the exception occurred.
- Input value from mapping: constant defined within the mapplet-calling mapping.

in_TRANSFORMATION_TYPE
- Port description: the transformation type for the location of the exception.
- Input value from mapping: constant defined within the mapplet-calling mapping.

in_TRANS_INPUT_DATA
- Port description: an expression should be used to concatenate the input values for a given failed transformation. This is most useful/vital for lookup procedures.
- Input value from mapping: concatenated value defined within the mapplet-calling mapping.

in_TRANS_OUTPUT_DATA
- Port description: all output values from a failed transformation should be concatenated and mapped to this port of the mapplet.
- Input value from mapping: concatenated value defined within the mapplet-calling mapping.

in_ERR_CODE
- Port description: provides a standard exception code for a given error in an interface. The error message is derived from this value.
- Input value from mapping: constant value matching the exact type and case from the table below.

in_ERR_BUSINESS_ID
- Port description: identifies each source record as a unique occurrence. It is very possible that more than one field is required for a unique business ID. Each SID document should articulate in detail the exact business ID for a given interface.
- Input value from mapping: concatenated value defined within the mapplet-calling mapping.

in_ERR_TIMESTAMP
- Port description: the time of occurrence for an exception.
- Input value from mapping: SYSDATE defined within the mapplet-calling mapping.
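As an illustrative sketch, the concatenated ports can be assembled in the mapplet-calling expression transformation as follows (the port names and the '~' delimiter are assumptions):

-- in_TRANS_INPUT_DATA: name/value pairs of the failed transformation's inputs
'VENDOR_ID=' || VENDOR_ID || '~COMPANY_CODE=' || COMPANY_CODE

-- in_ERR_BUSINESS_ID: multi-field business ID per the SID documentation
ACCT_ID || '~' || TO_CHAR(STRT_DT, 'YYYYMMDD')

-- in_ERR_TIMESTAMP: time of occurrence
SYSDATE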

Error Codes

This section contains a table of all of the acceptable error messages to be logged into the INFA_INTERFACE_ERR_LOG table. Emphasis must be placed upon using the proper messages when logging to this table.

LOOKUP_PROCEDURE_ERROR
- Description: whenever a cross-reference lookup procedure returns a default value due to a mismatch of incoming values, this error should be logged into the exception log table within the middleware SAPEAI database schema.
- Example usage: when lkp_PAYMENT_TERMS returns -1, log this error along with the incoming data values for the transformation.

DATA_VALIDATION_ERROR
- Description: SID documentation may outline business data validation procedures. These validations must be checked within mappings and errors logged, processes suspended, etc.
- Example usage: when exp_CHECK_DEBIT_CREDIT_MATCH detects a difference between AMT_DEBIT and AMT_CREDIT, route this information to the exception mapplet.

COMPUTATION_ERROR
- Description: used when computation errors are detected within expressions, aggregators, etc.
- Example usage: when in_denominator = 0, route the record to the exception mapplet with a divide-by-zero error using this message value.

DATA_CONVERSION_ERROR
- Description: used when conversions or substitutions are used within expressions and no possible matches are found.
- Example usage: when in_oldValue is not in (1, 2, 3, 4, 5), mark this as an error.

RECORD_COUNT_ERROR
- Description: reserved for source/target record count analysis.
- Example usage: when the number of source records does not equal the number of target records, log this value.

TARGET_LOAD_ERROR
- Description: reserved for target load errors.
- Example usage: when target load conditions are not met, this error should be sent to the CommonLE to identify the record as not being loaded into the target. Where applicable, this error can be used in conjunction with another error code.


Appendix D: Implementing Record-Level Audit Logging into Core Processes


Within the Solution Integration Design for a given interface, a Business ID, or unique identifier for a record in the source system, is documented. This Business ID subsequently becomes the unique identifier for each record transmitted through the interface code. This unique identifier will be logged to the Audit Logging portion of the CommonLE as a requirement of all QTG2 interfaces using Informatica. The components that perform audit logging may be turned off by production support personnel. This switching functionality is controlled at the interface/workflow level; therefore, production support can remove a large interface's logging volume when it is no longer needed.

Creating the Business ID


Within the Solution Integration Design documentation, there is a section for the creation of a Business ID for the interface. This identifier will be either one field or the concatenation of multiple fields that combine to create the natural key for the incoming data record. This Business ID should be created within the first one or two transformations downstream of the source qualifier transformation. If a SQL Override is used within the mapping, it may be advisable to create the Business ID within the SQL statement. For example, if the source table is DM_ACCOUNT and the Business ID is ACCT_ID, your SQL statement could read:

select ACCT_ID as INTFC_BUSINESS_ID,
       ACCT_ID as ACCT_ID,
       STRT_DT as START_DATE
from DM_ACCOUNT

Because there are multiple ways of processing data through a mapping, the developer may choose to have only one ACCT_ID value returned by the SQL statement and then connect it to multiple transformation paths. When using SQL Overrides to meet other requirements, the architecture team recommends this strategy for implementation.
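If the SID defines a multi-field Business ID, the same SQL Override approach can concatenate the fields (the SITE_ID column and the '~' delimiter below are hypothetical):

select ACCT_ID || '~' || SITE_ID as INTFC_BUSINESS_ID,
       ACCT_ID as ACCT_ID,
       SITE_ID as SITE_ID,
       STRT_DT as START_DATE
from DM_ACCOUNT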

Inserting the Audit Logging Mapplet & Target


The majority of the functionality for inserting into the interface audit log table within the middleware database resides within a reusable mapplet transformation in the shared Informatica folder. This mapplet, mplt_Process_Audit_Logs, contains a router transformation that controls the usage of audit logging within an interface. The router's main grouping evaluates the value of the AUDIT_LOGGING_SWITCH parameter within the core workflow parameter file on the Unix server. Each developer will need to provide the following inputs to the mapplet transformation:

- INTERFACE_NAME
- MAPPING_NAME
- AUDIT_BUSINESS_ID
- AUDIT_TIMESTAMP

The outputs of this transformation will link directly to the target table, INFA_INTERFACE_AUDIT_LOG. Using the AutoLink feature of Informatica, the outputs from the mapplet transformation will automatically link to the target table's columns. During session creation, assign SAPEAI as the connection for this target table.

Setting Up Mapping Parameters and Parameter File


For interface core processes, there is one parameter file used across the various interfaces for a given release. The parameter file, US_CORP_1UP_AI_INTFC_workflow_parms.txt, will contain the specific parameters used by core process sessions. The most frequently used parameter, $$INTERFACE_NAME, should be present in this parameter file as it is present in all other parameter files. As a developer, please make certain that the INTERFACE_NAME value matches across all of these separate files; the common architecture components require this synchronization.

Within each mapping, there are two main parameters that need to be defined: $$INTERFACE_NAME, set via the parameter file, and $$MAPPING_NAME, which can be defaulted within the mapping itself (there is no need to maintain this value within the parameter file). In addition, the mapplet contains a parameter that must be fed from the parameter file for core processes. This parameter, $$AUDIT_LOGGING_SWITCH, controls audit logging to the Common Services reporting components. When it is not set to the value ON, the interface will not log Business IDs to the INFA_INTERFACE_AUDIT_LOG table, and subsequently no messages will be delivered to the AUDIT queue for Common Services.

For assembly testing purposes, audit logging should always be enabled. The general rule for system test cycles should be that the AUDIT_LOGGING_SWITCH is set to ON unless performance becomes a major issue for successful completion of the testing phases. Performance of the common components should be thoroughly investigated during these test intervals. Application Integration architects will assist with any performance issues that emerge from these common mapplets, mappings, and sessions. Data volumes within the INFA_INTERFACE_AUDIT_LOG table may become the single largest contributor of performance issues for this reusable component.
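For example, production support could disable the component for one interface by flipping the switch entry in the core workflow parameter file (mirroring the entry shown in Appendix A, step 32):

Shortcut_to_mplt_Process_Audit_Logs.$$AUDIT_LOGGING_SWITCH=OFF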
