
Production Support Issues - Informatica

If your source is a flat file FTPed from another system and the FTP transfer fails, your jobs fail as well.
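A minimal pre-check sketch for this case. pmcmd is Informatica's command-line client, but the landing path, service, domain, folder, and workflow names below are all hypothetical placeholders:

#!/bin/sh
# hypothetical landing path for the FTPed flat file
SRC=/apps/Infa/staging/SrcFiles/customer_feed.dat
if [ ! -s "$SRC" ]; then
  echo "Source file missing or empty: $SRC" >&2
  exit 1   # fail fast instead of letting the session fail mid-load
fi
# placeholder integration service, domain, folder, and workflow names
pmcmd startworkflow -sv INT_SVC -d Domain_Prod -u "$INFA_USER" -p "$INFA_PWD" \
  -f STAGING -wait wf_customer_load

A non-empty file can still be a partial transfer, so a common refinement is to have the sending system FTP a small .done marker after the data file and test for that instead.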

Source data is incorrect, meaning it is not in the expected format.

Database deadlock (two or more processes trying to access the same data at the same time)

Database server down


Incorrect or missing mapping parameter and mapping variable file names

While importing from the test environment into production, some of the reusable mappings become invalid.

Most issues come back to the quality of the design, the quality of the code, and the complexity of the solution.

How good were your test cases? Did you do peer code reviews?

How many disparate systems are you pulling from? And so on.

Errors due to dependencies between jobs running simultaneously.

Develop new or maintain existing Informatica mappings and workflows based on specifications.

Familiarity with administrative tasks within Informatica platform such as stopping/starting Informatica
Repository and Server processes, performing code migration, backup and restore of Informatica
Repository, and security administration.

Participate in cross-functional efforts to support other teams, such as ETL and database tuning to
support Business Objects reporting performance.

Sharing knowledge by effectively documenting work.

Session Recovery Procedures.

Ability to take ownership of projects and critical issues and to drive those to successful resolution

Provide daily production support of batches and system processes.

Produce weekly status reports and participate in team meetings and conference calls with other regions.

Informatica and Loading Issues

Table 61 provides information about problems and solutions related to issues with Informatica and loading.
To view the Informatica log file details, double-click the workflow.

Table 61. Informatica and Loading Issues

Symptom/Error Message: Double-clicking the workflow yields a Workflow Manager error message: "The system cannot find the file specified."
Probable Cause/Solution: The session log files are not set up properly. You may also need to change the text editor.

Symptom/Error Message: Using Oracle, some mappings hang while running when performance statistics are switched on.
Probable Cause/Solution: When running some Informatica mappings for loading the Siebel Data Warehouse, turning on the Performance Statistics can cause the mapping to hang. The only workaround is to increase the values of the LMSharedMemory and MaxSessions variables in Informatica. The risk of increasing LMSharedMemory too much is that it may start to have a serious effect on the overall performance of the machine the Informatica server is running on.

Symptom/Error Message: When you execute a workflow in the Informatica Workflow Manager, Informatica returns the following error message: "Request to start workflow (workflow name) on server (server name) not successful."
Probable Cause/Solution: This can happen due to a server time-out property that is usually set to 20 or 40 seconds. When you try to run a large workflow, every session in that workflow is fetched into the server's memory; if this takes longer than the server time-out, the server returns a message that the workflow was unable to run. However, the workflow is running; the server just needs time to complete fetching the sessions into memory. Double-click the workflow to view the log file details.

Symptom/Error Message: When running Full_Extract_Siebel_DW or Refresh_Extract_Siebel_DW, Informatica returns errors similar to:
TE_7007 Transformation Evaluation Error; current row skipped...
TE_7007 [<<Transformation Error>> [to_date]: Date function error to_date('19010101', 'DD-MON-YYYY')
Probable Cause/Solution: Incorrect date entry in the SME Date Format field in the System Preferences. The format is YYYYMMDD.

Symptom/Error Message: When running Full_Load_Siebel_DW, Informatica returns errors similar to:
CMN_1014 Error creating semaphore...
TM_6006 Error initializing DTM for session...
TM_6006 [s_CR18a1. Load W_PROG_DM_TMP - Program Records]
Probable Cause/Solution: Insufficient semaphores allocated. Allocate more semaphores on the Data Warehouse Server. The change becomes effective when you reboot.

Symptom/Error Message: Informatica (RDBMS is DB2) gives the following error message:
Error occurred unlocking [SDE_ServiceRequestDimension1].
An error occurred while accessing the repository [[IBM][CLI Driver][DB2/6000] SQL0955C Sort memory cannot be allocated to process the statement. Reason code = "". SQLSTATE=57011]
DB2 Fatal Error [FnName: ExecuteDirect -- SQLSTATE=57011 [IBM][CLI Driver][DB2/6000]]
Probable Cause/Solution: The DB2 parameter "SHEAPTHRES" is too small.

Symptom/Error Message: Informatica produces the error "Unable to connect to the server" when running a full load of the Siebel Data Warehouse (Full_Load_Siebel_DW_Dimensions).
Probable Cause/Solution: The last Designer session was not validated. Part of the development process of working with Designer is to always validate any changes to Informatica mapping definitions and sessions after the change is saved in the repository.

Symptom/Error Message: When loading the data warehouse, Informatica reports a lock problem.
Probable Cause/Solution: Either someone has a session open or there is a dead session. Make sure no one has any open sessions. If no sessions are open, follow the Informatica documentation on removing locks caused by dead sessions.

Symptom/Error Message: After changing an Informatica mapping, you may get the error message "Unable to connect to the server" when trying to execute Full_Load_Siebel_DW_Facts.
Probable Cause/Solution: Modifying Informatica mapping objects does not automatically validate the session objects that use them. You must validate all changes to any existing mappings in the Informatica repository.

Symptom/Error Message: Session SDE_RecordLoadStart fails due to a unique constraint error while executing Full_Load_Siebel_DW_Dimensions or Full_Load_Siebel_DW_Facts.
Probable Cause/Solution: This could be because the previous full load did not complete successfully. Fix the problem that caused the previous load session to fail. Make sure you start the process from the last Load_RestartNextWorkflow # entry before the failed session, and restart the workflow process from that point. If you have to re-extract the data from the Siebel transactional database because something had to be fixed in the source database to resolve the load error, you must restart the ETL process: truncate S_ETL_INC_STAT in the Siebel transactional database, then enable the Extract and Load workflows and rerun them.

Symptom/Error Message: Session SDEINC_RecordExtractStart fails due to a unique constraint error while executing a Refresh workflow.
Probable Cause/Solution: This could be because the previous load or refresh did not complete successfully. Fix the problem that caused the previous refresh session to fail. Make sure you start the process from the last %RestartNextWorkflow # entry before the failed session, and restart the workflow process from that point.

Symptom/Error Message: The session fails and you receive the following error code:
TE_7042 Aggregate Error: File Operation Error
Probable Cause/Solution: This is due to a disk space limitation. Check the /Informatica/PowerMart/Cache/Check directory for available disk space; also check the limits (ulimit) of the account used to start PowerMart. (A quick check sketch follows this table.)

Symptom/Error Message: Informatica sessions get deadlocked and eventually fail when they try to do a "select" from the repository table OPB_OBJECT_LOCKS. This problem sometimes occurs on MSSQL server databases.
Probable Cause/Solution: This is possibly caused by a limited number of resources on the MSSQL Database Server. The workaround is to execute the following MSSQL-specific SQL commands on the Siebel Data Warehouse:
DROP INDEX OPB_OBJECT_LOCKS.OPB_OBJ_LOCKS_IDX
DROP INDEX OPB_OBJECT_LOCKS.OPB_OBJ_LOCKS_IDX2
DROP INDEX OPB_OBJECT_LOCKS.OPB_OBJ_LOCKS_IDX3
Upon completion of executing these commands, continue executing the workflow processes to load the Siebel Data Warehouse.

Symptom/Error Message: An error may occur when trying to send a post-session email notification using MS Outlook 2000. Refer to the Informatica release notes for further information.
Probable Cause/Solution: After installing Informatica Server on Windows, copy the file mapi32.dll from winnt\system32 to the bin folder where the Informatica Server is installed, overwriting the existing mapi32.dll in that directory. Start the Informatica Server so that it can use the new mapi32.dll.
The "Extended MAPI Error. MAPILogonEx failed[2147746065]" error indicates that the logon is not configured correctly. Check the following:
1. Under Services > Informatica > Logon, make sure the login (domain\username) and password are correct.
2. Under Control Panel > Mail (it may also be called Mail and Fax, or Exchange) > Services > Show Profiles, make sure the mail profile is correct.
3. Under Programs > Informatica Server > Informatica Server Setup > Miscellaneous, make sure the MS Exchange profile is correct.

Symptom/Error Message: While creating a custom session, bulk load mode does not work properly with SQL Server.
Probable Cause/Solution: Change the mode to "normal" in the Informatica repository for the session. The "normal" mode must be used for SQL Server in all of your custom sessions.

Symptom/Error Message: When running IMR, you may receive an error message box which states "The evaluation period for this Oracle ODBC driver has expired. Please call Merant to obtain a production version of this Oracle ODBC driver."
Probable Cause/Solution: This is caused by a down-level ODBC driver license key. Rename or move ivodbc.lic, lvodbc.lic (if it exists), and lvdw.lic (if it exists). Make sure you have only one license file, named ivdw.lic, in winnt\system32. This eliminates the problem.

Symptom/Error Message: Outlook closes when sending out a notification that the ETL process has finished.
Probable Cause/Solution: Informatica is closing Outlook. This issue is known to Informatica and is scheduled to be resolved in an upcoming release. Until then, create a second profile in Outlook and add that profile name to the Informatica server setup.

Symptom/Error Message: Oracle 8i sessions running in bulk mode fail and return an error message similar to:
WRITER_1_1_1> CMN_1022 Database driver error...
CMN_1022 [ORA-00600: internal error code, arguments: [kpodpmop_01], [2], [], [], [], [], [], []
Or:
MAPPING> TE_7022 TShmWriter: Initialized
MAPPING> Sat Jan 26 13:54:45 2002 TE_7001 Internal error: Failed to allocate a target slot.
MAPPING> Sat Jan 26 13:54:45 2002 TE_7017 Failed to Initialize Server Transformation BLK_ALL_DATATYPES1
MAPPING> Sat Jan 26 13:54:45 2002 TM_6006 Error initializing DTM for session...
MAPPING> TM_6020 Session [s_BULK_LONG] completed at [Sat Jan 26 13:54:45 2002]
Probable Cause/Solution: This problem is an Oracle 8i defect. It has been resolved in Oracle 9i. The workaround is to run the session in Normal mode. To do so, in Workflow Manager navigate to the Targets window and change the Target Load type to Normal.

Symptom/Error Message: During an ETL execution, when the Informatica and DAC servers use DB2 Connect version 7 to talk to DB2/390 version 7 OLTP and data warehouse databases, you receive an error message similar to the following:
SEVERE: [IBM][CLI Driver][DB2] SQL0191N Error occurred because of a fragmented MBCS character. SQLSTATE=22504
103 SEVERE Tue May 11 21:37:29 CDT 2004 [IBM][CLI Driver][DB2] SQL0191N Error occurred because of a fragmented MBCS character. SQLSTATE=22504
Probable Cause/Solution: This problem is a DB2 Connect version 7 (IBM) defect related to code page conversion. The problem has been resolved in DB2 Connect version 8. To correct the problem, do the following:
1. Download the file IBM01140.ucs from ftp://ftp.software.ibm.com/ps/products/db2/fixes/english/siebel/siebel7/Conversion_Files to the /sqllib/conv directory.
2. Make a copy of the file and rename it to IMB05348.ucs.

Symptom/Error Message: When an ETL process is running and tasks fail, Informatica returns an error similar to the following:
Error while running Workflow Description: ERROR : TM_6292 : (3040|4868) Session task instance REP_12400 [Repository Error ([REP_51055] Repository agent connection failed. [System Error (errno = 121): The semaphore timeout period has expired. (Cannot read message. Read 5824 bytes.)])]
Probable Cause/Solution: This issue is related to the network. The workaround is to increase the Timeout parameter values in the Informatica Repository Server Administration Console:
1. In the left pane of the Repository Server Administration Console window, right-click your repository and select Edit Connection.
2. In the Network tab, enter 9 as the value for the parameters MessageReceiveTimeout and MessageSendTimeout.
3. Stop and start the Informatica Repository Server.
4. Start the Informatica Server.
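For the TE_7042 disk-space row above, a quick check sketch; the cache path is the one named in the table, and "pmuser" is a hypothetical account that starts PowerMart:

df -k /Informatica/PowerMart/Cache   # free space where session caches are written
su - pmuser -c 'ulimit -a'           # file-size and open-file limits of the account that starts PowerMart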

1. Connection issues
2. Database issues
3. Unix permission issues (rare in production, common in test/dev)
4. Storage issues
5. ETL failures due to bad data (e.g., a NOT NULL column being populated with null; see the sketch below)
6. CnB failures
- Unique constraint violation errors
- Dirty reads
- Integration connection loss errors
- Jobs running with very low throughput
- Jobs failing because of locking issues
- Jobs failing because the lookup cache is full
- Jobs failing due to permission issues (at the database, Unix, or Informatica level)
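For item 5 above, a hedged pre-load check; the table (src_customer), column (cust_no), and connect string are hypothetical:

sqlplus -s etl_user/etl_pwd@SRCDB <<'EOF'
-- count rows that would violate the target's NOT NULL constraint
SELECT COUNT(*) AS bad_rows
FROM   src_customer
WHERE  cust_no IS NULL;
EOF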
Version control - Thank God we do not have version control in our prod environment; we back up the Informatica objects as XML before every migration. Version control makes migration even more difficult: if you migrate a mapping and its source/target is not checked in, those sources/targets won't make it to the destination machine. Make sure to check in all the objects that need to be migrated, and make sure those objects are checked in on the destination machine too.

Not creating proper source or target shortcuts in the target environment - If you copy a mapping that has source or target shortcuts, and those shortcuts are not created in the destination prior to copying the mapping, the mapping gets copied with the shortcuts removed. Whenever you are working with mappings that have shortcuts, make sure to create those shortcuts in the target environment before migrating the mappings.

Impacting other mappings - It is always necessary to do a thorough dependency check on all the sources/targets that are being updated by a migration. Changes to a source/target may cause other mappings that use the same source/target to fail after the update is done. Validate all the mappings that are impacted by the source/target definition changes.

Not validating impacted sessions - After a mapping change has been migrated, make sure to refresh the sessions that were impacted. Otherwise the session will bomb when it runs in production.

Not having proper connections set up in production - If a matching database connection does not exist in the production/target environment, then when you migrate the session it will most likely report that the connection from the source environment does not exist. You will have to create this connection in the target environment or change the connection to match one that already exists there. Also, some sessions are sensitive to the type of database connection.
Proper permissions have to be set up for the connection objects as well; otherwise the session will fail with no permission to use the connection object. Have the right database drivers set up on the destination machines, with all the correct tnsnames.ora or .odbc.ini entries (a quick connectivity check is sketched below).
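A hedged way to verify from the target machine that a tnsnames.ora entry behind a connection object actually resolves and accepts logins; the alias and credentials are placeholders:

tnsping DWH_PROD        # does the alias resolve from this host?
sqlplus -s etl_user/etl_pwd@DWH_PROD <<'EOF'
-- proves that both the listener and the login work
SELECT 'connected' FROM dual;
EOF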

Object permissions - The folders in the destination should be set up with proper permissions so that the admin or the users of the destination machine can run the workflows and make changes if necessary.

Setting up the parameters/variables - Make sure to check that all the mapping/workflow variables were migrated properly. Also make sure that the parameter file was migrated and is accessible to the Informatica workflows (a quick check is sketched below).
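A small sketch, reusing the parameter file path from Issue 8 below as a stand-in, to confirm the file landed and contains the sections the workflows expect:

PARFILE=/apps/Infa/staging/SrcFiles/ABCD01/INFA/ABCD.PAR
ls -l "$PARFILE"            # does it exist, and who owns it?
grep -n '^\[' "$PARFILE"    # list the [Folder.WF:Workflow] section headers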

Scripts - Make sure all the Unix/Perl (or whatever) scripts that go along with your Informatica workflows are migrated, and that Informatica can access and execute/read/write them (see the sketch below).
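A hedged check that a migrated script is actually runnable by the Integration Service's OS account; the script path and the account name "infa" are hypothetical:

SCRIPT=/apps/Infa/scripts/post_load_cleanup.sh
ls -l "$SCRIPT"                                  # did owner/group/mode survive the migration?
su - infa -c "test -x $SCRIPT && echo runnable"  # run the test as the Informatica OS account
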
Not having the database changes done correctly - If you migrate the mappings/sessions but the database changes are not done to reflect the new Informatica code, the sessions will fail. This can also include recreating views whose base tables were modified.

Copying reusable sessions that have overrides in the workflows - If a reusable session has overrides, we have observed that these overrides are sometimes lost when the reusable session is copied again from the dev/non-prod environment. So it is better to check that all the overrides, such as database connections and other session properties, have not changed after the sessions are copied.

In a recent migration, I copied the mapping and just refreshed the session that was using this mapping in the destination. However, this mapping had a new lookup, and the session in the destination did not have the lookup connection configured. I should have migrated the session too; in any case, I had to set up this connection in the destination session.

In another migration, I copied a reusable session to the destination and hoped that the non-reusable session on the destination machine would automatically pick up this reusable session. It did not. I had to remove the previously existing session and add the reusable session. A very naive error.

Scheduling conflicts - You will get everything correct, and then you realize that a backup job is running and your database is not available. WTF??? Or some other Informatica workflow is writing to the same table or using up all the database resources. Check for scheduling conflicts with other jobs.

Impact to downstream processes - Let's say you are generating a file for some downstream process to consume, but you made changes that the downstream process can no longer handle; that will be a major issue. Check that all the downstream and upstream processes are working as expected.

Useful Informatica Load Level Production Issues


Issue 1:

The sessions "s_m_ABCD01_STAGING_SESSION" and "s_m_ABCD01_ARCH_LOAD" in the workflow
wkflw_ABCD01_STAGING_LOAD failed with the error below.

Message: SQL Error [
ORA-01017: invalid username/password; logon denied
ORA-02063: preceding line from ABCD010LINK
Database driver error...
Function Name : Execute
SQL Stmt : SELECT ABCD01_MNO.CUST_NO, ABCD01_MNO.CUST_SUBJ,
ABCD01_MNO.CUST_USER, ABCD01_MNO.CUST_ADMIN FROM ABCD01_TABLE
Oracle Fatal Error
Database driver error...
Function Name : Execute

Solution :

The query fires on the source database; the password needs to be updated for the DB link defined on the source database instance (a sketch follows).
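A hedged sqlplus sketch of the fix, run as the schema that owns the link; the user, password, and TNS alias are placeholders (on Oracle 12c and later, ALTER DATABASE LINK may avoid the drop/recreate):

sqlplus -s abcd01_owner/owner_pwd@SRCDB <<'EOF'
DROP DATABASE LINK ABCD010LINK;
CREATE DATABASE LINK ABCD010LINK
  CONNECT TO remote_user IDENTIFIED BY NewPwd123
  USING 'REMOTE_DB_TNS';
EOF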
----------------------------------------------------------------------------------------------------------
----------------------------------------------------------------------------------------------------------
----------------------------------
Issue 2 :

Sessions and workflows get stuck in a hung status on an Informatica infrastructure built on
Solaris 10.

Solution :

This is a known issue with the rmail utility used by the Mail task on the Solaris 10 operating system (kernel version: SunOS 5.10 Generic_147440-27). As a long-term solution, remove the Mail tasks you are using and use a Command task with the mailx utility instead (see the sketch below). This is a bug, confirmed by both Informatica and Oracle.
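A minimal Command task body using mailx; the recipient and message text are placeholders:

echo "Workflow wkflw_ABCD01_STAGING_LOAD finished at $(date)" | \
  mailx -s "ETL load status" ops-team@example.com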
----------------------------------------------------------------------------------------------------------
----------------------------------------------------------------------------------------------------------
-----------------------------------
Issue 3:
The Debugger gives the error: Error connecting to the DTM at [%s : %d]: [%s]
Solution :
Where do we run the Debugger?
In Designer, open the mapping you want to debug, then go to Mappings > Debugger > Start Debugger.
This opens a series of windows in which you select the Integration Service, the Debugger
session type, the session related to the mapping, and the related source/target/lookup
connections.

How we solved it:

In the second window, it asks for the Integration Service and the Debugger session type
(there are three Debugger session types).
The user facing this issue was using "Use an existing session instance".
We suggested selecting "Create a debug session instance" so that we could check
where it was going wrong.
After creating the new session instance (with the right parameters: session and
source/target/lookup connections), the debugger started running fine (as per the
update from the user).
Other resolutions:
Cause:
The Debugger crashes because the port used by the Debugger is being used by some other
application, like Yahoo Messenger, etc. Close those applications and try again; it will work.
Resolution:
In Designer, go to Tools > Options > Debug tab and CHECK the box for "TCP/IP: Pick a
port between Min and Max automatically".
This should stop the Debugger from crashing.
Hope it helps.
----------------------------------------------------------------------------------------------------------
----------------------------------------------------------------------------------------------------------
----------------------------------
Issue 4:
The Debugger gives the error: Error connecting to the DTM at [%s : %d]: [%s]
This happens if you are trying to run the Debugger over NAT or a VPN.
Resolution:
Add a custom property "Debugger DTMHostName" (without quotes) with the value of the
full name of the server where the Integration Service is hosted. Disable and then re-enable
the Integration Service. Make sure the default port (or the custom port) is open on
your firewall and on the modem/server where you set the NAT entries.
----------------------------------------------------------------------------------------------------------
----------------------------------------------------------------------------------------------------------
----------------------------------
Issue 5 :
Java Transformation issue: users were unable to compile the Java transformation and
were getting the errors below.
1. Unable to find any class file
2. Unable to create the java source file
Analysis :
While analyzing the mapping design, we found a data type mismatch for the field
METRIC_ID, which was used as a String in all the transformations but declared as an
Integer inside the Java code.
Resolution :
We advised the project team to make it an Integer in all transformations and
validated the mapping, which resolved the issue.

----------------------------------------------------------------------------------------------------------
----------------------------------------------------------------------------------------------------------
-----------------------------------

Issue 6 :
An Informatica load got stuck with the message "TIMEOUT BASED COMMIT POINT".
Solution :
Below are the solutions I have seen work for TIMEOUT BASED COMMIT POINT issues.
For session s_m_XYZ_DBG_REPORT in our heavy ODS load, which had this issue, we
created a missing index, which resolved it.
For one of the fact sessions (s_PQ_AR_FACT_LOAD), we involved the DB team, and their
solution was to increase undo retention from 1800 seconds to 2400 seconds (see the
sketch below).
Check how much the source data count has increased; in one case the issue was
resolved by decreasing the commit interval.
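The undo change, as a hedged sqlplus sketch; it requires DBA privileges and assumes the instance uses an spfile:

sqlplus -s / as sysdba <<'EOF'
ALTER SYSTEM SET UNDO_RETENTION = 2400 SCOPE=BOTH;
SHOW PARAMETER undo_retention
EOF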
----------------------------------------------------------------------------------------------------------
----------------------------------------------------------------------------------------------------------
----------------------------------

Issue 7 :
When the session recovery option is turned on at the session level, the Informatica session
throws the error below.
NODE_PTDEV_SERD02190 : TM_6371 : The Integration Service will create recovery
table [PM_TGT_RUN_ID] in database [serd02190.ciscrim.com] because it does not
exist.
Database driver error...
Function Name : execute Direct
SQL Stmt : CREATE TABLE PM_TGT_RUN_ID (LAST_TGT_RUN_ID NUMBER NOT NULL)
Oracle Fatal Error
Database driver error...
Function Name : Execute Direct
Oracle Fatal Error] setting up [PM_TGT_RUN_ID] table.
Solution : Use the base schema account in the Informatica connection
(global object), or else grant full privileges to the user used in the Informatica
connection (a sketch follows).
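If full privileges are too broad, a hedged sketch of what is typically sufficient for the Integration Service to create PM_TGT_RUN_ID; the connection user and tablespace names are placeholders, and a DBA can instead pre-create the table using the exact statement from the log above:

sqlplus -s / as sysdba <<'EOF'
GRANT CREATE TABLE TO infa_conn_user;
ALTER USER infa_conn_user QUOTA UNLIMITED ON users;
-- or pre-create it in the connection user's schema:
-- CREATE TABLE infa_conn_user.PM_TGT_RUN_ID (LAST_TGT_RUN_ID NUMBER NOT NULL);
EOF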
----------------------------------------------------------------------------------------------------------
----------------------------------------------------------------------------------------------------------
-------------------
Issue 8:
One of the workflows in the folder ABCD01, migrated to a new server, was going to
Unknown status. Running it manually again, it still went to Unknown status. When it was
run from the back end, the Workflow Manager gave the following error:
(INFFORMIS) Start workflow: Request acknowledged
(INFFORMIS) Start workflow: ERROR: Workflow [ABCD01:wfklw_ABCD_RUN[version 1]]:
Parameter file [/apps/Infa/staging/SrcFiles/ABCD01/INFA/ABCD.PAR] not
found. Please check the Integration Service log for more information.
Solution:
The parameter file ABCD.PAR was migrated from the old server to the new server, and the
paths within the parameter file were updated. On the next scheduled run, the workflow
again went to Unknown status with the error below.
Start workflow: ERROR: Workflow [ABCD01:wfklw_ABCD_RUN[version 1]]: Parameter file
[/apps/Infa/staging/SrcFiles/ABCD01/INFA/ABCD.PAR] not found. Please check the
Integration Service log for more information.
When we checked the workflow, the parameter file attribute was wrongly set to
/apps/Infa/staging/SrcFiles/ABCD01/ABCD.PAR; after changing it to
$PMSourceFileDir/ABCD01/INFA/ABCD.PAR, the workflow ran fine.
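For reference, a sketch of what such a parameter file can look like; the [Folder.WF:Workflow] header format is standard Informatica parameter file syntax, but the parameter names and values here are placeholders (written to a scratch path so nothing real is overwritten):

cat > /tmp/ABCD.PAR.example <<'EOF'
[ABCD01.WF:wfklw_ABCD_RUN]
$$LOAD_DATE=2014-01-01
$DBConnection_SRC=ABCD_SRC_CONN
$PMSessionLogCount=5
EOF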
----------------------------------------------------------------------------------------------------------
----------------------------------------------------------------------------------------------------------
-------------------

Issue 9:
Problem Statement :
One of the sessions, s_mL_AB_RUN_DFACT, in our environment runs for three days and
has around 250K records to load.
We tried to load this table three times; on all three occasions it failed with "snapshot
too old" after running for three days.
It loads 110K records, then throughput drops to 1 row/sec, and the session log
shows TIMEOUT BASED COMMIT POINT.
Analysis:
The session log showed only:
Message: TIMEOUT BASED COMMIT POINT
Message: TIMEOUT BASED COMMIT POINT
Solution :
1) It could be an issue of long caching time.
2) Try a smaller commit interval (decrease the commit interval from 10080 to 5000).
3) Check the DTM buffer settings and the resource usage and capacity of the host.
4) Set the Maximum Memory Allowed for Auto Memory attributes to 1024 MB or 2048 MB.
5) Most of the time, the issue is with the undo retention, which has to be increased.
Example :
Undo retention was increased from 1800 seconds to 2400 seconds. Now the undo
tablespace has 2 GB of free space.
NAME TYPE VALUE
-------------------- ------------ -----------
undo_management string AUTO
undo_retention integer 2400
undo_tablespace string UNDOTBS1
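A hedged query for confirming the free-space figure quoted above; dba_free_space is a standard Oracle dictionary view (run with DBA privileges), and the tablespace name comes from the listing above:

sqlplus -s / as sysdba <<'EOF'
SELECT tablespace_name,
       ROUND(SUM(bytes)/1024/1024/1024, 1) AS free_gb
FROM   dba_free_space
WHERE  tablespace_name = 'UNDOTBS1'
GROUP  BY tablespace_name;
EOF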
----------------------------------------------------------------------------------------------------------
----------------------------------------------------------------------------------------------------------
-----------------------------------

Issue 10 :

A user came up with the following issue on LinkedIn:
"We are facing another issue. We migrated the code a month back; when we run the code
manually from Informatica, the job succeeds. Last week we scheduled the job in a
scheduling tool; when the job runs, it succeeds in less than a minute, but it does not load
the data, and we are not able to fetch the session logs or workflow logs. The error is:
unable to fetch the logs [LGS_10051] the registered log file does not exist."
Solution :
Do the following:
1) Edit Session > Properties > Write Backward Compatible Session Log File: enable this
check box.
2) Edit Workflow > Properties > Write Backward Compatible Workflow Log File: enable
this check box.
3) Set the correct session log path in Properties > Session Log File Directory.
4) Edit Workflow > Configure Concurrent Execution: enable this check box.
----------------------------------------------------------------------------------------------------------
----------------------------------------------------------------------------------------------------------
------------------------------
Issue 11 :

WRT_8324 : Warning: Target Connection Group's connection doesn't support transactions.
Targets [TARGET] may not be loaded according to specified transaction boundary rules!

Justification :

This warning is triggered when a user-defined commit is in effect, which is the default
when the mapping has effective Transaction Control transformations: the session cannot
determine the transaction boundaries itself, since they are handled in the Transaction
Control transformation.
When you run a source-based or user-defined commit session, the named targets may
not recognize transaction boundaries. This can happen with flat file targets or with bulk
loading.
The Integration Service uses a source-based, target-based, or user-defined commit. You
can choose a source- or target-based commit if the mapping has no Transaction Control
transformation or only ineffective Transaction Control transformations. By default, the
Integration Service performs a target-based commit.
