
SUMMARY:

. More than 8 years of total IT experience developing Business Intelligence solutions, including building Data Warehouses, Data Marts and ETL for clients in major industry sectors such as Telecom, Pharmacy, Finance and Insurance
. More than 6 years of ETL tool experience using IBM Information Server
DataStage and QualityStage 8.x, Ascential DataStage 7.x/6.0 in
designing, developing, testing and maintaining jobs using Designer,
Manager, Director, Administrator and Debugger
. Experienced in troubleshooting DataStage jobs and addressing production issues such as performance tuning and fixing data issues
. Excellent knowledge of studying the data dependencies using Metadata
of DataStage and preparing job sequences for the existing jobs to
facilitate scheduling of multiple jobs
. Strong understanding of DW principles using Fact Tables, Dimension Tables, star schema modeling, and the Ralph Kimball and Bill Inmon approaches
. Experienced in writing system specifications, translating user
requirements to technical specifications, ETL Source-Target mapping
document and testing documents
. Experience in integrating various data sources with multiple relational database (RDBMS) systems: Oracle, Teradata, Sybase, SQL Server, MS Access and DB2
. Worked on integrating data from flat files, COBOL files and XML files.
. Extensively worked on extracting data from SAP using ABAP Extract
Stage.
. Experience in writing, testing and implementing Triggers, Procedures and Functions at the database and form level using PL/SQL
. Sound knowledge in UNIX shell scripting
. Knowledge of full life cycle development for building a Data Warehouse
. Good working knowledge of Client-Server Architecture
. Articulate with excellent communication and interpersonal skills with
the ability to work in a team as well as individually.
. Certified in IBM WebSphere DataStage v8.5.

TECHNICAL SKILLS:

Data Warehouse: IBM Information Server DataStage and QualityStage 8.1.1, Ascential DataStage EE 7.x/6.5 (Designer, Director, Manager, Administrator), Parallel Extender 7.5.1/6.0, MetaStage 6.0, Business Objects 6.x
Databases: Oracle 9i/10g, IBM UDB DB2 9.1/9.7, Teradata V2R6/V2R12
Dimensional Modeling: Data Modeling, Star Schema Modeling, Snowflake Modeling, Fact and Dimension Tables, Physical and Logical Data Modeling, Erwin 3.5.2/3.x
Reporting Tools: OBIEE 10g, Crystal Reports 6.x/5.x, Cognos, Business
Objects 6.5
UNIX Tools: C Shell, K Shell, Bourne Shell, Perl, AWK, VI, SED
Databases: Teradata V2R12/V2R13, Oracle 10g/9i/8i, PL/SQL, UDB DB2 9.1/9.7,
Sybase SQL Server 11.0, MS SQL Server 6.5/7.0, TSQL, MS Access 7.0/97/2000,
Excel
Languages: PL/SQL, SQL*PLUS, C, VB, JDBC, XML

Professional Experience

Boston College, Boston, MA
Sr. DataStage Developer
Feb 2012 - Present

Boston College (BC) is a private Jesuit research university located in the village of Chestnut Hill, Massachusetts, USA. The main campus is bisected by the border between the cities of Boston and Newton. It has 9,200 full-time undergraduates and 4,000 graduate students. It is a member of the 568 Group and the Association of Jesuit Colleges and Universities. Boston College offers bachelor's degrees, master's degrees and doctoral degrees through its nine schools and colleges. Boston College is currently ranked 31st among National Universities by U.S. News & World Report.
Responsibilities:
. Involved in all phases of SDLC.
. Responsible for creating detailed design and source to target
mappings.
. Responsible to communicate with business users and project management
to get business requirements and translate to ETL specifications.
. Used DataStage/QualityStage Designer to import/export jobs, table
definitions, Custom Routines and Custom Transformations.
. Created Extract Transform and Load (ETL) interfaces and gateways for
backend database.
. Designed mappings from sources to operational staging targets using a Star Schema and implemented logic for Slowly Changing Dimensions (SCD); a simplified SQL sketch of an SCD Type 2 pattern follows this list.
. Extensive hands-on experience designing and developing Parallel and Server jobs.
. Extensively worked on building DataStage jobs using various stages such as Oracle Connector, Funnel, Transformer, Sequential File, Lookup, Join and Peek.
. Extensively used Sort, Merge, Aggregator, Peek, DataSet and Remove
Duplicates stages.
. Involved in the migration of DataStage jobs from development to QA and
then to production environment.
. Created shared containers to use in multiple jobs.
. Hands-on experience upgrading DataStage from v7.5 to Information Server 8.1.1.
. Imported and exported repositories across DataStage projects using DataStage Designer.
. Extensively worked on the DataStage Job Sequencer to schedule jobs and run them in sequence.
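Illustrative only: a minimal Oracle SQL sketch of the Slowly Changing Dimension Type 2 pattern referenced above. The actual logic was built inside DataStage jobs; the table names (stg_customer, dim_customer) and the sequence customer_sk_seq are assumptions made for the example, and NULL-safe comparisons and delete handling are omitted for brevity.

    -- Step 1: expire the current dimension rows whose tracked attributes changed in the latest load.
    UPDATE dim_customer
       SET current_flag = 'N',
           effective_end_date = TRUNC(SYSDATE) - 1
     WHERE current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.customer_id = dim_customer.customer_id
                      AND (s.customer_name <> dim_customer.customer_name
                           OR s.customer_segment <> dim_customer.customer_segment));

    -- Step 2: insert a new current version for every changed or brand-new business key.
    INSERT INTO dim_customer
           (customer_sk, customer_id, customer_name, customer_segment,
            effective_start_date, effective_end_date, current_flag)
    SELECT customer_sk_seq.NEXTVAL, s.customer_id, s.customer_name, s.customer_segment,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_customer d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');
    COMMIT;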
Environment: IBM Information Server 8.1.1 DataStage and QualityStage (Designer, Director and Administrator), Oracle 10g, Cognos v10, PL/SQL, HP-UX 11, Toad for Oracle.

..............................................................................
Freddie Mac, McLean, VA
Jun 2011 - Jan 2012
Sr. DataStage Developer
The Federal Home Loan Mortgage Corporation (FHLMC), known as Freddie Mac,
is a public Government Sponsored Enterprise (GSE). The FHLMC was created in
1970 to expand the secondary market for mortgages in the US. Along with
other GSEs, Freddie Mac buys mortgages on the secondary market, pools them,
and sells them as a mortgage-backed security to investors on the open
market. This secondary mortgage market increases the supply of money
available for mortgage lending and increases the money available for new
home purchases.
Responsibilities:
. Designed and developed jobs for extracting data from different data feeds into an IBM DB2 database.
. Coded many shell scripts for efficient job scheduling.
. Worked on preparing the test cases and testing ETL jobs and data
validation.
. Developed parallel jobs using various Development/debug stages
and processing stages (Aggregator, Change Capture, Change Apply,
SAP ABAP Extract Stage, IDoc Stage, BAPI Stage, Filter, Sort &
Merge, Funnel, and Remove Duplicate Stage).
. Worked on change management system on code migrations from Dev
to QA to Prod environments.
. Performed debugging on these jobs using Peek stage by outputting
the data to Job Log or a stage.
. Extensively worked on building ETL interfaces to read and write data from the DB2 database using the DB2 Enterprise Stage and DB2 API Stage.
. Involved in functional and technical meetings and responsible for creating ETL source-to-target maps.
. Modified existing jobs and Hash files according to changing business rules.
. Loaded historical data into the Warehouse.
. Developed jobs for transforming the data using stages like Join, Merge, Lookup, Funnel, Transformer, Pivot and Aggregator.
. Experience with Scheduling tool Autosys for automating the ETL
process.
. Involved in developing a Control Module for the complete process
using PERL and UNIX scripting.
. Worked on documenting technical design documents and source to
target (STT) documents.
. Involved in Unit Testing, Integration Testing, UAT and Performance Testing.
. Worked with Embarcadero to interact with DB2.
Environment: IBM Information Server 8.5 (Designer, Director and
Administrator), QualityStage, Test Director, ClearCase, AutoSys, K-Shell
Scripts, SAP ECC R3, SAP BW, DS Extract PACK for SAP 5.1, IBM DB2 9.1, AIX
5.3, Embarcadero for DB2
..............................................................................
First Tennessee Bank, Memphis, TN
Sr. DataStage Developer
Jan 2010 - May 2011
Project: TSYS to FDR conversion
First Tennessee Bank is now the largest Tennessee-based bank with any
genuine influence beyond the borders of the state. Chartered in 1864 as
First National Bank, First Horizon National Corporation (FHN) has grown to
be one of the largest bank holding companies in the United States in terms
of asset size. FHN's approximately 6,000 employees provide financial
services through about 180 bank locations in and around Tennessee and 21
FTN Financial Group offices in the U.S. and abroad. AARP and Working Mother
magazine have recognized FHN as one of the nation's best employers.
Responsibilities:
. Interacted with End user community (FDR) to understand the
business requirements and in identifying data sources.
. Analyzed the existing informational sources and methods to identify problem areas and make recommendations for improvement. This required a detailed understanding of the data sources and researching possible solutions.
. Used the Classic Federation Server to get files from the Mainframe to UNIX and vice versa.
. Used DataStage stages, namely the z/OS File stage with COBOL copybooks to extract data from the Mainframe through the Classic Federation server, plus Column Export, Column Import, Sequential File, Transformer, Aggregator, Sort, Data Set, Join, Lookup, Change Capture, Funnel, Peek and Row Generator stages, in accomplishing the ETL coding.
. Developed job sequencer with proper job dependencies, job
control stages, triggers.
. Used Zeke job scheduler for automating the monthly regular run
of DW cycle in both production and UAT environments.
. Reviewed reports on Mainframe TSO ISPF environment and allocated
datasets on the mainframe using Classic Federation jobs.
. Created shared containers to simplify job design.
. Performed performance tuning of the jobs by interpreting
performance statistics of the jobs developed.
. Documented ETL test plans, test cases, test scripts, and
validations based on design specifications for unit testing,
system testing, functional testing, regression testing, prepared
test data for testing, error handling and analysis.
. Worked on change management system on code migrations from Dev
to QA to Prod environments.
. Extensively worked on building ETL interfaces to read and write data from the DB2 database using the DB2 Enterprise Stage and DB2 API Stage.
. Involved in functional and technical meetings and responsible for creating ETL source-to-target maps.
. Modified existing jobs and Hash files according to changing business rules.
. Loaded historical data into the Warehouse.
. Developed jobs for transforming the data using stages like Join, Merge, Lookup, Funnel, Transformer, Pivot and Aggregator.
. Experience with Scheduling tool Zeke for automating the ETL
process.
Environment: IBM Information Server 8.5 (Designer, Director and
Administrator), QualityStage, Test Director, ClearCase, Zeke, K-Shell
Scripts, Mainframe TSO ISPF, IBM DB2 9.1, AIX 5.3, WinSQL for DB2, PeopleSoft 9.2.
..............................................................................
Pfizer Pharmaceutical, Bridgewater, NJ
DataStage Developer
Apr 2009 - Dec 2010
Pfizer is dedicated to discovering and developing new and better ways to prevent and treat disease and improve health and well-being for people around the world. The Data Repository resides on various platforms and is sourced from various Pfizer legacy systems in order to provide patient-centric care. The main goal of Pfizer is to build an integrated Enterprise Data Warehouse for all of their applications while reducing operational costs.
Responsibilities:
. Responsible for detailed design and development of the Pfizer data warehouse.
. Used DataStage Manager to define Table definitions, Custom Routines and
Custom Transformations.
. Communicated with business users and management to get business
requirements and translate to ETL specifications.
. Designed mappings from sources to operational staging targets using a Star Schema and implemented logic for Slowly Changing Dimensions (SCD).
. Experience with Parallel Extender stages - Funnel/Remove duplicates
. Used Built-in, Plug-in and Custom Stages for extraction, transformation
and loading of the data, provided derivations over DS Links.
. Extensively wrote Custom Routines and Transformations as per the
business requirements.
. Developed various jobs using CFF, ODBC, Lookup, Aggregator, Sequential
file stages.
. Extensively used Sort, Merge, Aggregator, Peek, DataSet, DB2 and Remove
Duplicates stages.
. Involved in the migration of DataStage jobs from development to
production environment.
. Created shared containers to use in multiple jobs.
. Imported and exported Repositories across DataStage projects.
. Extensively worked on DataStage Job Sequencer to Schedule Jobs to run
jobs in Sequence.
. Used shared containers to reuse specific business logic in various jobs, eliminating 30% of redevelopment.
Environment: Ascential DataStage 7.5, (Manager, Designer, Director), DB2,
Oracle 8i, PL/SQL, AIX, Toad 7.
..............................................................................
CITIZENS BANK, Providence, RI
DataStage Developer
Jan 2007 - Feb 2009
. Developed the source-to-target process and mapping documentation.
. Designed and developed jobs for extracting data from different data feeds into an IBM DB2 database.
. Developed jobs for handling different data transformations as per specified requirements using stages like Join, Merge, Lookup, Transformer and Aggregator.
. Used the Change Capture Stage, Sort, Merge and Funnel for developing the Delta process (an illustrative SQL sketch of delta detection follows this list).
. Designed and developed Shared Containers that can be reused by other
parallel jobs.
. Developed jobs for loading Data into DB2 target database and used
stages like DB2 Bulk loader, DB2 API stages.
. Designed the Unit testing and integrated testing process and necessary
documentation.
. Involved in performance tuning to reduce the time consumption.
. Used DataStage Manager for Importing and exporting jobs into different
projects.
. Used UNIX and PERL scripts to execute jobs and also used the DataStage
Director for scheduling, executing and monitoring jobs.
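The delta process above was implemented with DataStage's Change Capture stage; purely as an illustration of the same idea in plain SQL (curr_extract and prev_extract are hypothetical snapshot tables, not objects from the project), a set-difference query isolates the changed records:

    -- New or changed rows in today's extract relative to yesterday's (the insert/update delta).
    SELECT account_id, account_status, balance_amt
      FROM curr_extract
    MINUS
    SELECT account_id, account_status, balance_amt
      FROM prev_extract;

    -- Keys that disappeared since yesterday (the delete delta).
    SELECT account_id FROM prev_extract
    MINUS
    SELECT account_id FROM curr_extract;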
Environment: Ascential DataStage 7.1 Enterprise Edition (Designer, Manager,
Director and Administrator), Teradata V2R6/V2R12, Oracle 9i, Aqua Data
Studio, Toad, Shell Scripts, AIX 5.1.
==============================================================================
Professional Summary:
Over 7 years of experience in Data Modeling, Data Warehouse Design, Development and Testing using ETL and the Data Migration life cycle with IBM WebSphere DataStage 8.x/7.x.
Expertise in building Operational Data Stores (ODS), Data Marts and Decision Support Systems (DSS) using Multidimensional Models (Kimball and Inmon) and Star and Snowflake schema design.
Experience in analyzing the data generated by the business process, defining granularity, source-to-target mapping of the data elements, and creating Indexes and Aggregate tables for data warehouse design and development.
Data processing experience in designing and implementing Data Mart applications, mainly transformation processes using the ETL tool DataStage (8.0/7), designing and developing jobs using DataStage Designer, DataStage Manager, DataStage Director and DataStage Debugger.
Efficient in all phases of the development lifecycle, coherent with Data Cleansing, Data Conversion, Performance Tuning and System Testing.
Excellent in using highly scalable parallel processing infrastructure using DataStage Parallel Extender.
Efficient in incorporating various data sources such as Oracle, MS SQL Server, DB2, Sybase, XML and Flat files into the staging area.
Experience in mapping Server/Parallel jobs in DataStage to populate tables in Data Warehouses and Data Marts.
Proven track record in addressing production issues like performance tuning and enhancement.
Excellent knowledge in creating and managing Conceptual, Logical and Physical Data Models.
Experience in dimensional and relational database design.
Strong in Data Warehousing concepts and dimensional Star Schema and Snowflake Schema methodologies.
Expert in unit testing, system integration testing, implementation, maintenance and performance tuning.
Experience in different scheduling tools like AutoSys for automating and scheduling job runs.
Excellent with PL/SQL, T-SQL, Stored Procedures, Database Triggers and SQL*Loader.
Experience in UNIX Shell Scripting.
Excellent knowledge of operating systems (Windows, UNIX, Macintosh) and databases including Oracle, SQL Server and DB2.
Experience in implementing Quality Processes like ISO 9001:2000/Audits.
Detail oriented with good problem solving, organizational and analytical skills; highly motivated and adaptive with the ability to grasp things quickly.
Ability to work effectively and efficiently in a team and individually, with excellent interpersonal, technical and communication skills.
Education Qualifications:
Masters in Electrical and Computer Engineering.
Skill Sets:
IBM Information Server V8.1 (DataStage, QualityStage, Information Analyzer), Ascential DataStage V7.5 (Designer, Director, Manager, Parallel Extender), Oracle 8i/9i/10g, MS SQL Server 2005/2008, DB2 UDB, MS Access, Sybase, SQL, PL/SQL, SQL*Plus, Flat files, Sequential files, TOAD 9.6, Erwin, Microsoft Visio, Oracle Developer 2000, SQL*Loader, IBM Cognos 8.0, IBM AIX UNIX, Red Hat Enterprise Linux 4, UNIX Shell Scripting, Windows NT/XP, Macintosh, C, C++, VB scripting.
Project Summary:
Prudential Financial, Newark, NJ    1/2009 - Present
DataStage Developer
Prudential Financial, Inc. is a Fortune Global 500 company that provides insurance, investment management, and other financial products and services to both retail and institutional customers throughout the United States and in over 30 other countries. The project objective was to collect, organize and store data from different operational data sources to provide a single source of integrated and historical data for the purpose of reporting, analysis and decision support to improve client services.
Hardware/Software:
IBM DataStage 8.0 (Designer, Director, Manager, Parallel Extender), Oracle 10g, SQL Server 2008, DB2 UDB, Flat files, Sequential files, Autosys, TOAD 9.6, SQL*Plus, AIX UNIX, IBM Cognos 8.0
Responsibilities:
Interacted with the End user community to understand the business requirements and to identify data sources.
Analyzed the existing informational sources and methods to identify problem areas and make recommendations for improvement. This required a detailed understanding of the data sources and researching possible solutions.
Implemented the dimensional model (logical and physical) in the existing architecture using Erwin.
Studied the PL/SQL code developed to relate the source and target mappings.
Helped in preparing the source-to-target mapping document.
Worked with DataStage Manager for importing metadata from the repository, new job categories and creating new data elements.
Designed and developed ETL processes using DataStage Designer to load data from Oracle, MS SQL, Flat Files (Fixed Width) and XML files to the staging database and from staging to the target Data Warehouse database.
Used DataStage stages, namely Hashed File, Sequential File, Transformer, Aggregator, Sort, Data Set, Join, Lookup, Change Capture, Funnel, Peek and Row Generator, in accomplishing the ETL coding.
Developed job sequencers with proper job dependencies, job control stages and triggers.
Used QualityStage to ensure consistency, removing data anomalies and spelling errors from the source information before it was delivered for further processing.
Extensively used DataStage Director for monitoring job logs to resolve issues.
Involved in performance tuning and optimization of DataStage mappings using features like Pipeline and Partition Parallelism and data/index cache to manage very large volumes of data.
Documented ETL test plans, test cases, test scripts and validations based on design specifications for unit testing, system testing and functional testing; prepared test data for testing, error handling and analysis.
Used the Autosys job scheduler for automating the monthly regular run of the DW cycle in both production and UAT environments.
Verified the Cognos reports by extracting data from the Staging Database using PL/SQL queries.
Wrote configuration files for performance in the production environment.
Participated in weekly status meetings.
Kaiser Permanente, Pleasanton, CA    05/2007 - 12/2008
ETL Designer / DataStage Developer
Kaiser Permanente, an integrated managed care organization, is the largest health care organization in the United States. The Health Plan and Hospitals operate under state and federal non-profit tax status, while the Medical Groups operate as for-profit partnerships or professional corporations in their respective regions.

The project was to design, develop and maintain a data warehouse for their vendor's data and internal reference data, and to work with their DBA to ensure that the physical build adheres to the model blueprint.
Hardware/Software:
DataStage 7.5.1 Enterprise Edition, QualityStage, Flat files, Oracle 10g, SQL Server 2005/2008, Erwin 4.2, PL/SQL, UNIX, Windows NT/XP
Responsibilities:
Involved in understanding business processes and coordinated with business analysts to get specific user requirements.
Studied the existing data sources with a view to knowing whether they support the required reporting and generated change data capture requests.
Used QualityStage to check the data quality of the source system prior to the ETL process.
Worked closely with DBAs to develop the dimensional model using Erwin and created the physical model using Forward Engineering.
Worked with the DataStage Administrator for creating projects and defining the hierarchy of users and their access.
Defined granularity, aggregation and partitioning required at the target database.
Involved in creating specifications for ETL processes, finalized requirements and prepared the specification document.
Used DataStage as an ETL tool to extract data from source systems and loaded the data into the SQL Server database.
Imported table/file definitions into the DataStage repository.
Performed ETL coding using Hashed File, Sequential File, Transformer, Sort, Merge and Aggregator stages; compiled, debugged and tested. Extensively used the available stages to redesign DataStage jobs for performing the required integration.
Extensively used DataStage tools like InfoSphere DataStage Designer and InfoSphere DataStage Director for developing jobs and to view log files for execution errors.
Controlled job execution using sequencers and used notification activities to send email alerts.
Ensured that the data integration design aligns with the established information standards.
Used Aggregator stages to sum the key performance indicators used in decision support systems.
Scheduled job runs using DataStage Director, and used DataStage Director for debugging and testing.
Created shared containers to simplify job design.
Performed performance tuning of the jobs by interpreting performance statistics of the jobs developed.
Documented ETL test plans, test cases, test scripts and validations based on design specifications for unit testing, system testing, functional testing and regression testing; prepared test data for testing, error handling and analysis.
Macy's, Atlanta, GA    12/2005 - 04/2007
ETL Developer
Macy's (NYSE: M) is a chain of mid-to-high range American department stores delivering fashion and affordable luxury to customers coast to coast. Online shopping is offered through macys.com. Its selection of merchandise can vary significantly from location to location, resulting in the exclusive availability of certain brands in only higher-end stores.
The aim of the project was to build a data warehouse which would keep historical data according to a designed strategy. Flat files and Oracle tables were part of the source data, which came in on a daily, weekly and monthly basis.
Hardware/Software:
IBM Information Server DataStage 7.5, Oracle 10g, SQL, PL/SQL, UNIX, SQL*Loader, Autosys, Business Objects 6.1, Windows 2003, IBM AIX 5.2/5.1, HP Mercury Quality Center 9.0

Responsibilities:
Involved in understanding business processes and coordinated with business analysts to get specific user requirements.
Used Information Analyzer for Column Analysis, Primary Key Analysis and Foreign Key Analysis.
Extensively worked on DataStage jobs for splitting bulk data into subsets and dynamically distributing it to all available processors to achieve the best job performance.
Developed ETL jobs as per business rules using the ETL design document.
Converted complex job designs into different job segments and executed them through a job sequencer for better performance and easy maintenance.
Used DataStage maps to load data from source to target.
Enhanced the reusability of the jobs by making and deploying shared containers and multiple instances of the jobs.
Imported the data residing in the host systems into the data mart developed in Oracle 10g.
Extensively used Autosys for automation of scheduling jobs on a daily, bi-weekly, weekly and monthly basis with proper dependencies.
Wrote complex SQL queries using joins, subqueries and correlated subqueries (an illustrative example follows this list).
Performed Unit testing and System Integration testing by developing and documenting test cases in Quality Center.
Validated the reports generated in Business Objects using PL/SQL queries.
Worked on troubleshooting, performance tuning and performance monitoring for enhancement of DataStage jobs and builds across the Development, QA and PROD environments.
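A representative correlated subquery of the kind mentioned above, written against a hypothetical retail table (sales_fact) rather than the actual Macy's schema: for each store it returns the transactions larger than that store's own average sale.

    SELECT s.store_id,
           s.transaction_id,
           s.sale_amount
      FROM sales_fact s
     WHERE s.sale_amount > (SELECT AVG(s2.sale_amount)
                              FROM sales_fact s2
                             WHERE s2.store_id = s.store_id)   -- correlated on the outer row's store
     ORDER BY s.store_id, s.sale_amount DESC;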
Citibank Inc., New York, NY    12/2004 - 11/2005
DataStage Developer
Citibank, the leading global banking company, has some 200 million customer accounts and does business in more than 100 countries, providing services to consumers, corporations, governments and institutions.
The project was to transform the data coming from various sources through multiple stages before it was loaded into the data warehouse, and to maintain it.
Hardware/Software:
Ascential DataStage 7.0 (Designer, Manager, Director), Oracle 9i, MS Access, SQL Server 2000/2005, SQL, PL/SQL, Toad, UNIX
Responsibilities:
Involved in understanding business processes to learn business requirements.
Extracted data from different systems into the source area; mainly involved in ETL development.
Defined and implemented approaches to load and extract data from the database using DataStage.
Worked closely with the data warehouse architect and business intelligence analyst in developing solutions.
Used Erwin for data modeling (i.e., modifying the staging and SQL scripts in the Oracle and MS Access environments).
Involved in design and source-to-target mappings between sources and operational staging targets, using DataStage Designer.
Performed ETL coding using Hashed File, Sequential File, Transformer, Sort, Merge and Aggregator stages; compiled, debugged and tested. Extensively used the available stages to redesign DataStage jobs for performing the required integration.
Executed jobs through a sequencer for better performance and easy maintenance.
Involved in unit, performance and integration testing of DataStage jobs.
Used DataStage Director to run and monitor the jobs for performance statistics.
Involved in performance tuning of the jobs.
Used T-SQL for validating the data generated at the OLAP server.
Wipro Technologies, Bangalore    08/2003 - 11/2004
Quality Assurance Engineer
Wipro Technologies is an Indian multinational and a leading provider of integrated business, technology and process solutions on a global delivery platform.
Worked as a QA tester on a web-based E-billing application. The application has two modules, accounts payable (AP) and accounts receivable (AR), to keep track of transactions.
Hardware/Software:
Windows XP, ASP.NET, Test Director 7.2, WinRunner 7.0, SQL Server 2000
Responsibilities:
Active participation in decision making and QA meetings; regularly interacted with the Business Analysts and development team to gain a better understanding of the Business Process, Requirements and Design.
Analyzed business requirements from a Black Box Testing perspective and against the system architecture/design, and converted them into functional requirements/test cases.
Used Test Director to document the requirements and created traceability matrices for the requirements.
Developed a Test Plan that included the scope of the release, entrance and exit criteria and the overall test strategy. Created detailed Test Cases and Test Sets and executed them manually.
Performed cross-browser testing to verify that the application provides accurate information in different browsers (IE, Netscape, Firefox, Safari).
Extensively used Output and Checkpoints for verifying UI properties and values using VB scripting.
Performed back-end database verification manually and used WinRunner to automatically verify the database against the values entered during automated testing.
Performed back-end integration testing to ensure data consistency on the front end by writing and executing SQL queries. Provided management with metrics, reports and schedules, and was responsible for entering and tracking bugs.
Ensured that each defect was always written up with a great level of detail.
================================================================================
Over 6 years of dynamic career reflecting pioneering experience and high performance in system analysis, design, development and implementation of Relational Database and Data Warehousing Systems using IBM DataStage 8.0.1/7.x/6.x/5.x (InfoSphere Information Server, WebSphere, Ascential DataStage).
Excellent experience in designing, developing, documenting and testing ETL jobs and mappings in Server and Parallel jobs using DataStage to populate tables in Data Warehouses and Data Marts.
Proficient in developing strategies for Extraction, Transformation and Loading (ETL) mechanisms.
Expert in designing Parallel jobs using various stages like Join, Merge, Lookup, Remove Duplicates, Filter, Dataset, Lookup File Set, Complex Flat File, Modify, Aggregator and XML.
Expert in designing Server jobs using various types of stages like Sequential File, ODBC, Hashed File, Aggregator, Transformer, Sort, Link Partitioner and Link Collector.
Experienced in integration of various data sources (DB2-UDB, SQL Server, PL/SQL, Oracle, Teradata, XML and MS Access) into the data staging area.
Expert in working with DataStage Manager, Designer, Administrator and Director.
Experience in analyzing the data generated by the business process, defining granularity, source-to-target mapping of the data elements, and creating Indexes and Aggregate tables for data warehouse design and development.
Excellent knowledge of studying data dependencies using metadata stored in the repository and preparing batches for the existing sessions to facilitate scheduling of multiple sessions.
Proven track record in troubleshooting DataStage jobs and addressing production issues like performance tuning and enhancement.
Expert in working on various operating systems like UNIX AIX 5.2/5.1, Sun Solaris V8.0 and Windows 2000/NT.
Proficient in writing, implementing and testing triggers, procedures and functions in PL/SQL and Oracle.
Experienced in database programming for Data Warehouses (Schemas); proficient in dimensional modeling (Star Schema modeling and Snowflake modeling).
Expertise in UNIX shell scripts using K-shell for the automation of processes and scheduling DataStage jobs using wrappers.
Experience in using software configuration management tools like Rational ClearCase/ClearQuest for version control.
Experienced in Data Modeling as well as reverse engineering using tools such as Erwin, Oracle Designer, MS Visio, SQL Server Management Studio, SSIS and SSRS, and stored procedures.
Expert in unit testing, system integration testing, implementation and maintenance of database jobs.
Effective in cross-functional and global environments, managing multiple tasks and assignments concurrently with effective communication skills.
EDUCATIONAL QUALIFICATION:
Bachelors in Electronics and Communication
TECHNICAL SKILLS:
ETL Tools: IBM WebSphere DataStage and QualityStage 8.0, Ascential DataStage 7.5.2/5.1/6.0, Profile Stage 7.0, SSIS (SQL Server 2005), Data Integrator
Business Intelligence Tools: Business Objects, Brio, SSRS (SQL Server 2005), IBM Cognos 8 BI
Development Tools and Languages: SQL, C, C++, UNIX Shell Scripting, Perl, PL/SQL, Oracle
Testing Tools: Auto Tester, Test Director, Lotus Notes
Data Modeling Tools: Erwin 4.0, Sybase Power Developer, SSIS, SSRS
Operating Systems: HP-UX, IBM AIX 5.3, Windows 95/98/2000/NT, Sun Solaris, Red Hat Linux, MS SQL Server 2000/2005/2008, MS Access
WORK EXPERIENCE:
Confidential, CA    Nov 2010 - Present
ETL Developer
NetApp Inc is a leading network appliance manufacturer and data storage company, providing network appliances such as hard disks and shelves for small and large business owners. NetApp also provides an efficient data storage facility. The main aim is to provide a variety of services like Data Storage, Data Analysis, Data Warehousing, Data Marts etc., which can adopt consistent, tailored processes in order to strive for and fulfill the promise of commitment and reliability to customers.
Involved as the primary on-site ETL Developer during the analysis, planning, design, development and implementation stages of projects using IBM WebSphere software (QualityStage v8.1, Web Service, Information Analyzer, Profile Stage, WISD of IIS 8.0.1).
Prepared Data Mapping Documents and designed the ETL jobs based on the DMD with the required tables in the Dev environment.
Active participation in decision making and QA meetings; regularly interacted with the Business Analysts and development team to gain a better understanding of the Business Process, Requirements and Design.
Used DataStage as an ETL tool to extract data from source systems and loaded the data into the Oracle database.
Designed and developed DataStage jobs to extract data from heterogeneous sources, applied transformation logic to the extracted data and loaded it into the Data Warehouse databases.
Created DataStage jobs using different stages like Transformer, Aggregator, Sort, Join, Merge, Lookup, Data Set, Funnel, Remove Duplicates, Copy, Modify, Filter, Change Data Capture, Change Apply, Sample, Surrogate Key, Column Generator and Row Generator.
Extensively worked with Join, Lookup (Normal and Sparse) and Merge stages.
Extensively worked with Sequential File, Data Set, File Set and Lookup File Set stages.
Extensively used Parallel stages like Row Generator, Column Generator, Head and Peek for development and debugging purposes.
Used the DataStage Director and its run-time engine to schedule running the solution, testing and debugging its components, and monitoring the resulting executable versions on an ad hoc or scheduled basis.
Developed complex stored procedures using input/output parameters, cursors, views and triggers, and complex queries using temp tables and joins.
Converted complex job designs into different job segments and executed them through a job sequencer for better performance and easy maintenance.
Created job sequences.
Maintained the Data Warehouse by loading dimensions and facts as part of the project. Also worked on different enhancements in FACT tables.
Created a shell script to run DataStage jobs from UNIX and then scheduled this script to run the DataStage jobs through the scheduling tool.
Coordinated with team members and administered all onsite and offshore work packages.
Analyzed performance and monitored work with capacity planning.
Performed performance tuning of the jobs by interpreting performance statistics of the jobs developed.
Documented ETL test plans, test cases, test scripts and validations based on design specifications for unit testing, system testing and functional testing; prepared test data for testing, error handling and analysis.
Participated in weekly status meetings.
Developed a Test Plan that included the scope of the release, entrance and exit criteria and the overall test strategy. Created detailed Test Cases and Test Sets and executed them manually.
Environment: IBM WebSphere DataStage 8.1 Parallel Extender, Web Services, QualityStage 8.1 (Designer, Director, Manager), Microsoft Visio, IBM AIX 4.2/4.1, IBM DB2 Database, SQL Server, Teradata, Oracle 11g, Queryman, UNIX, Windows.
Confidential, NJ    Jan 2010 - Oct 2010
Lead Sr. DataStage Developer
The project was to design and develop an enterprise data warehouse: extract data from heterogeneous source systems, transform it using business logic and load it into the data warehouse.
Used the DataStage Designer to develop processes for extracting, cleansing, transforming, integrating and loading data into staging tables.
Extensively used ETL to load data from an IBM DB2 database, XML and flat file sources to an Informix Database Server.
Involved in the analysis, planning, design, development and implementation phases of projects using IBM WebSphere software (QualityStage v8.0.1, Web Service, Information Analyzer, Profile Stage, WISD of IIS 8.0.1).
Developed complex jobs using various stages like Lookup, Join, Transformer, Dataset, Row Generator, Column Generator, Sequential File, Aggregator and Modify.
Created queries using joins and CASE statements to validate data in different databases.
Created queries to compare data between two databases to make sure the data matched (a simplified example follows this list).
Used the DataStage Director and its run-time engine to schedule running the solution, testing and debugging its components, and monitoring the resulting executable versions on an ad hoc or scheduled basis.
Created shared containers to incorporate complex business logic in jobs.
Monitored the DataStage jobs on a daily basis by running the UNIX shell script and made a force start whenever a job failed.
Created and modified batch scripts to FTP files from different servers to the DataStage server.
Extensively used the Slowly Changing Dimension Type 2 approach to maintain history in the database.
Created Job Sequencers to automate the jobs.
Modified the UNIX shell script to run the job sequencer from the mainframe job.
Created parameter sets to assign values to jobs at run time.
Standardized the nomenclature used to define the same data by users from different business units.
Created multiple-layer reports providing a comprehensive and detailed report with drill-through facility.
Used Parallel Extender for parallel processing to improve performance when extracting data from the sources.
Worked with metadata definitions, and import and export of DataStage jobs using DataStage Manager.
Provided the logical data model design, generated the database, resolved technical issues, and loaded data into multiple instances.
Implemented PL/SQL scripts in accordance with the necessary business rules and procedures.
Developed PL/SQL procedures and functions to support the reports by retrieving data from the data warehousing application.
Used PL/SQL programming to develop Stored Procedures/Functions and Database Triggers.
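A simplified version of the kind of cross-database comparison query described above, assuming an Oracle database link named src_link and a policy_dim table present on both sides; the real validation queries in the project were likely more detailed.

    -- Row-count reconciliation between the warehouse table and its source over a DB link.
    SELECT 'POLICY_DIM' AS table_name,
           (SELECT COUNT(*) FROM policy_dim)          AS target_rows,
           (SELECT COUNT(*) FROM policy_dim@src_link) AS source_rows
      FROM dual;

    -- Business keys present in the source but missing from the target.
    SELECT policy_id FROM policy_dim@src_link
    MINUS
    SELECT policy_id FROM policy_dim;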
Environment: IBM WebSphere DataStage 8.0.1 Parallel Extender, Web Services, QualityStage 8.0 (Designer, Director, Manager), Microsoft Visio, IBM AIX 4.2/4.1, IBM DB2 Database, SQL Server 2000, Teradata, Oracle 11g, Queryman, BMQ, UNIX, Windows.
Confidential, VA    Oct 2008 - Dec 2009
Lead DataStage ETL Developer
The project involved the design and development of a group insurance system, which processes claims for group insurance. It covers benefits with subsystems covering Term Life Insurance, Medical Indemnity and Managed Health Care.
Data Modeling:
Gathered and analyzed the requirements of the in-house business users for the data warehouse from JAD sessions.
Collected information about the different entities and attributes by studying the existing ODS and reverse engineering into Erwin.
Defined the primary keys and foreign keys for the entities.
Defined the query views, index options and relationships.
Created the logical schema using Erwin 4.0 and also created the dimensional model for building the cubes.
Designed staging and error handling tables keeping in view the overall ETL strategy.
Assisted in creating the physical database by forward engineering.
ETL Process:
Extracted data from source systems, transformed it and loaded it into the Oracle database according to the required provision.
Primary on-site technical lead during the analysis, planning, design, development and implementation stages of data quality projects using Integrity (now known as QualityStage).
Involved in system analysis, design, development, support and documentation.
Created objects like tables, views, materialized views, procedures and packages using Oracle tools like PL/SQL, SQL*Plus and SQL*Loader, and handled exceptions (a short PL/SQL sketch follows this list).
Involved in database development by creating Oracle PL/SQL Functions, Procedures, Triggers, Packages, Records and Collections.
Created views for hiding actual tables and to eliminate the complexity of the large queries.
Created various indexes on tables to improve performance by eliminating full table scans.
Used the DataStage Designer to develop processes for extracting, cleansing, transforming, integrating and loading data into Data Marts.
Created source table definitions in the DataStage Repository.
Identified source systems, their connectivity, related tables and fields, and ensured data suitability for mapping.
Generated surrogate IDs for the dimensions in the fact table for indexed and faster access to data.
Created hash tables with referential integrity for faster table look-ups and for transforming the data representing valid information.
Used built-in as well as complex transformations.
Used DataStage Manager to manage the Metadata repository and for import/export of jobs.
Implemented parallel extender jobs for better performance using stages like Join, Merge, Sort, Lookup and Transformer, with different source files, complex flat files and XML files.
Optimized job performance by carrying out performance tuning.
Created stored procedures to conform to the business rules.
Used Aggregator stages to sum the key performance indicators in decision support systems and for the granularity required in the DW.
Tuned DataStage transformations and jobs to enhance their performance.
Used the DataStage Director and its run-time engine to schedule running the solution, testing and debugging its components, and monitoring the resulting executable versions on an ad hoc or scheduled basis.
Scheduled DataStage jobs using the Autosys scheduling tool.
Prepared the documentation of the Data Acquisition and Interface System Design.
Assigned tasks and provided technical support to the development team.
Monitored the development activities of the team and provided updates to management.
Created complicated reports using the reporting tool Cognos.
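A small, self-contained sketch of a PL/SQL procedure with exception handling, in the spirit of the objects listed above; the claim_stg/claim_error tables and the rule being enforced are invented for illustration, not taken from the project.

    CREATE OR REPLACE PROCEDURE load_claim_error AS
    BEGIN
      -- Move staging rows that violate a simple business rule into an error table.
      INSERT INTO claim_error (claim_id, error_desc, load_date)
      SELECT c.claim_id, 'Claim amount is negative', SYSDATE
        FROM claim_stg c
       WHERE c.claim_amount < 0;

      DELETE FROM claim_stg WHERE claim_amount < 0;
      COMMIT;
    EXCEPTION
      WHEN OTHERS THEN
        -- Roll back the partial load, report the error, and re-raise it to the caller.
        ROLLBACK;
        DBMS_OUTPUT.PUT_LINE('load_claim_error failed: ' || SQLERRM);
        RAISE;
    END load_claim_error;
    /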
Environment: IBM / Ascential DataStage EE/7.5 (Manager, Designer, Director, Parallel Extender), QualityStage 7.5, DataStage BASIC language expressions, Autosys, Erwin 4.0, Windows NT, UNIX, Oracle 9i, SQL Server, Cognos, Sequential files, .csv files.
Confidential, PA    Jan 2007 - Sep 2008
Sr. DataStage Developer
As a DW developer, designed, developed and deployed DataStage jobs and associated functionality. The warehouse employed highly complex data transformations including Slowly Changing Dimensions and a series of Stored Procedures, which made performance tuning and efficient mapping highly critical. Along with designing jobs from scratch, re-wrote existing code to enhance performance and troubleshoot errors in both DataStage and Oracle 10g.
Responsibilities:
Used IBM DataStage Designer to develop jobs for extracting, cleaning, transforming and loading data into data marts/data warehouse.
Developed several jobs to improve performance by reducing runtime using different partitioning techniques.
Used different stages of DataStage Designer like Lookup, Join, Merge, Funnel, Filter, Copy, Aggregator and Sort.
Read complex flat files from the mainframe machine by using the Complex Flat File Stage.
Sequential File, Aggregator, ODBC, Transformer, Hashed File, Oracle OCI, XML, Folder and FTP Plug-in stages were extensively used to develop the server jobs.
Used the EXPLAIN PLAN statement to determine the execution plan in the Oracle Database (an example of the pattern follows this list).
Worked on complex data coming from mainframes (EBCDIC files) and knowledge of Job Control Language (JCL).
Used COBOL copybooks to import the metadata information from mainframes.
Designed DataStage jobs using QualityStage stages in 7.5 for the data cleansing and data standardization process. Implemented the Survive stage and Match stage for data patterns and data definitions.
Staged the data coming from various environments in a staging area before loading into Data Marts.
Involved in writing Test Plans, Test Scenarios, Test Cases and Test Scripts, and performed Unit, Integration, System and User Acceptance Testing.
Used stage variables for source validations, to capture rejects, and used job parameters for automation of jobs.
Strong knowledge in creating procedures, functions, sequences and triggers.
Expertise in PL/SQL and SQL.
Performed debugging, unit testing and system integrated testing of the jobs.
Wrote UNIX shell scripts according to the business requirements.
Wrote customized server/parallel routines according to the complexity of the business requirements.
Designed strategies for archiving of legacy data.
Created shell scripts to perform validations and run jobs on different instances (DEV, TEST and PROD).
Created and deployed SSIS (SQL Server Integration Services) projects and schemas, and configured Report Server to generate reports through SSRS (SQL Server 2005).
Created ad-hoc reports with MS SQL Server Reporting Services for the business users.
Used SQL Profiler to monitor the server performance and debug T-SQL and slow-running queries.
Expertise in developing and debugging indexes, stored procedures, functions, triggers and cursors using T-SQL.
Wrote mapping documents for all the ETL jobs (interfaces, Data Warehouse and Data Conversion activities).
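The EXPLAIN PLAN usage mentioned above follows the standard Oracle pattern shown below; the query itself is only a placeholder (order_fact is an assumed table name, not one from the project).

    -- Capture the optimizer's plan for a candidate query, then display it.
    EXPLAIN PLAN FOR
    SELECT o.order_id, o.order_total
      FROM order_fact o
     WHERE o.order_date >= DATE '2008-01-01';

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);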
Environment: IBM WebSphere DataStage and QualityStage 7.5, Ascential DataStage 7.5/EE (Parallel Extender), SQL Server 2005/2008, Linux, Teradata 12, Oracle 10g, Sybase, PL/SQL, Toad, UNIX (HP-UX), Cognos 8 BI
Confidential, NJ    Jan 2006 - Dec 2006
Jr. DataStage Developer
Merrill Lynch was a global financial services firm providing capital markets services, investment banking and advisory services, wealth management, asset management, insurance, banking and related financial services worldwide.
Responsibilities:
Worked on the logical and physical design of the Data Warehouse. Identified sources/targets and analyzed source data for dimensional modeling.
Good knowledge of Voluntary Insurance plans offered to employers as total insurance packages.
Worked on the design of the Voluntary Disability, Voluntary Dental and Voluntary Life data marts.
Good knowledge of policy and claims processing.
Worked on integration of the Health Claims ODS from legacy systems.
Designed and developed jobs for extracting, transforming, integrating and loading data into the data mart using DataStage Designer; used DataStage Manager for importing metadata from the repository, new job categories and creating new data elements.
Worked with EBCDIC files to extract data in the required format.
DataStage jobs were scheduled and monitored, the performance of individual stages was analyzed, and multiple instances of a job were run using DataStage Director.
Used Parallel Extender for splitting the data into subsets and utilized Lookup, Sort, Merge and other stages to achieve job performance.
Used the DS Erwin MetaBroker to import Erwin 4.x metadata into the DataStage Repository.
Developed user-defined Routines and Transformations for implementing complex business logic.
Extensively used Shared Containers and the Job Sequencer to make complex jobs simple and to run the jobs in sequence.
Involved in the preparation of ETL documentation following the business rules, procedures and naming conventions.
Created reports for various portfolios using the Universes as the main data providers.
Created the reports using Business Objects functionalities like Queries, Slice and Dice, Drill Down, Cross Tab, Master Detail etc.
As part of report development, created the reports using universes as the main data provider and using the powerful Business Objects functionalities and formulae. Involved in troubleshooting various reporting errors.
Created Business Objects reports and queries with constant interaction with the end users. Trained end users in understanding the reports. Functionalities such as Slice and Dice, Drill mode and Ranking were used for multidimensional formatting.
Web Intelligence was used to generate reports on the internet/intranet.
Exported the reports to the Broadcast Agent and used the Broadcast Agent to schedule, monitor and refresh the reports.
Developed Test Plans, Test Scenarios and Test Cases for code testing.
Trained team members.
Provided 24/7 production support.
Environment: IBM WebSphere DataStage 7.5, MetaStage 7.0, Business Objects 6.5, Oracle 9i, PL/SQL, SQL*Plus, UNIX Shell Scripts, Windows 2000/NT 4.0, Erwin 4.1.
Confidential    June 2004 - Dec 2005
Jr. DataStage Developer
Description: ICICI Prudential Insurance provides a wide range of insurance policies such as Life Insurance, Health Insurance, Motor Vehicle Insurance and General Insurance. This project was developed as a process of automation for insurance policy management by using a centralized data warehouse and Data Mart. The application provides the provision to take in various related information regarding the region and generates premiums and the desired data in the form of reports.
Responsibilities:
Designed and developed mappings between sources and operational staging targets, using Star and Snowflake Schemas.
Provided data models and data maps (extract, transform and load analysis) of the data marts for systems in the aggregation effort.
Involved in extracting, cleansing, transforming, integrating and loading data into the data warehouse using DataStage Designer.
Developed various transformations based on customer last name and zip code for internal business analytical purposes, and loaded the warehouse based on customer credit card number with dynamic data re-partitioning.
Developed user-defined Routines and Transformations using Universe BASIC.
Used DataStage Manager for importing metadata from the repository, new job categories and creating new data elements.
Used the DataStage Director and the runtime engine to schedule running the solution, testing and debugging its components, and monitoring the resulting executable versions (on an ad hoc or scheduled basis).
Developed and maintained programs for scheduling data loading and transformations using DataStage and Oracle 8i.
Developed shell scripts to automate file manipulation and data loading procedures.
Environment: DataStage 5.2/6.0, Oracle 8i, SQL, TOAD, UNIX, Windows NT 4.0.
================================================================================
Currently I am an associate with Accenture with 3.3 years of work experience in data analysis, design, development and implementation of Data Warehousing applications using IBM DataStage (ETL) in DW/BI technologies. My main area of experience has been project delivery of various sizes. I have worked primarily in the domains of Banking and Manufacturing.
Having 3.3 years of technical experience in the design and development of ETL jobs and processes to support the business requirements for an enterprise data warehouse.
Good experience with DataStage Parallel Extender (EE) using various stages with the Designer component.
Exposed to various data warehouse domains such as Banking and Manufacturing.
Good knowledge of Data Warehousing concepts.
Has the ability to develop and maintain good, long-lasting client relationships.
Enjoy challenging and thought-provoking work and have a strong desire to learn and progress (motivated enough to self-learn); ability to pick up new technology independently.
Technology
Application/Functional areas/Packages worked on: Banking, Manufacturing
Types of projects worked on: Development, Enhancement
Environments worked: Data Warehousing (IBM DataStage), Oracle 9i
Languages/Tools worked on: DataStage Designer (Parallel jobs), SQL
Operating System: Windows 2003 Server/XP/UNIX
RDBMS: Oracle 9i
Career Profile
Dates | Organization | Role
Jun 2010 to date | Accenture, Bangalore | ETL Developer
Jan 2009 to May 2010 | Patni Computers, Bangalore | ETL Developer
Qualifications
Degree and Dates: M.C.A. from St. Martin's Engineering College, Hyderabad, affiliated to JNTU (2009)
Work Experience

Project #2
Client   : MASHREQ BANK
Domain   : BANKING
Role     : ETL Developer
Location : Bangalore
Duration : Jun 2010 - Till Date

Description:
Mashreq Bank is the leading private bank in the United Arab Emirates (UAE) with a growing retail presence in the region including Egypt, Qatar and Bahrain. It has provided banking and financial services to millions of customers and businesses since 1967.
As per current practice, each line of business manages data and risk measurements itself, and as a result it is difficult to standardize the methodology across different lines of business. Business Unit risk managers currently present analysis on an as-requested basis, hence it is difficult to standardize and audit the methodology used, and it is impossible to do on-demand analysis.
Responsibilities:
Understood the business functionality and analyzed the business requirements.
Designed and developed ETL jobs using various active and passive stages in parallel jobs using the Designer.
Extracted data from sources like Oracle and flat files, transformed it using business logic and loaded the data into the business tables.
Extensively used processing stages like Sort, Aggregator and Transformer in developing jobs.
Performed unit testing on the jobs developed.
Prepared the Test Case Document and captured the test results.
Monitored the DataStage jobs using crontab.
Environment: IBM DataStage 8.1 (Parallel Jobs), Oracle 9i, Windows XP/UNIX.
Project #1
Client   : FORD MOTORS
Domain   : MANUFACTURING
Role     : ETL Developer
Location : Bangalore
Duration : Jan 2009 - May 2010

Description:
QIS2 is a global quality application that is used to perform analysis of diagnostic data from Powertrain and Vehicle Control Modules to reduce the detection-to-correction time for Ford Motors. FM would like to further enhance QIS2 to report on propulsion battery measurement data.
The purpose of this project is to create additional reports to analyze the battery measurement data captured in the propulsion battery assembly plants. This will allow FM Battery and Volt Engineers to identify issues using measurement data including volt and temperature readings.
Responsibilities:
Understood the business functionality and analyzed the business requirements.
Designed and developed ETL jobs using various active and passive stages in parallel jobs using the Designer.
Extracted data from sources like Oracle and flat files, transformed it using business logic and loaded the data into the Data Warehouse.
Extensively used processing stages like Sort, Aggregator, Transformer and Join in developing jobs.
Performed unit testing on the jobs developed.
Prepared the Test Case Document and captured the test results.
Monitored the DataStage jobs using the Autosys scheduler.
Environment: IBM DataStage 7.5.x2 (Parallel Jobs), Oracle 9i, Windows XP/2003.
================================================================================
Datastage Consultant
EXPERIENCE SUMMARY:
9+ years of overall IT experience in data warehousing application design, development, testing and project management, with 7+ years of experience in ETL development and design using IBM WebSphere Datastage 8.1 Enterprise and earlier Ascential Datastage versions, and 2+ years in Oracle ETL development.
Expertise in Ascential Datastage 7.5.2 Server Edition as well.
Expertise in Datastage issue resolution, debugging and application performance tuning.
Expertise in performance tuning of Datastage jobs.
Expertise in extracting data from various versions of Oracle and Teradata databases using Datastage.
Expertise in Oracle SQL and PL/SQL coding.
Proficient in UNIX Korn shell scripting.
Familiar with Perl scripts.
Familiar with BASIC batch job development in Datastage.
Familiar with the Pentaho ETL tool (have not implemented any projects with it).
Familiar with BTEQ, FastLoad and FastExport concepts in Teradata.
Expertise in performance tuning using Explain Plan, creating appropriate indexes, optimizing queries, and utilizing tablespaces and partitioning schemes in Oracle (a short sketch follows this summary).
Experience in creating project design documents: High Level Design documents, Software Requirement Specifications, Mid Level Platform Application Design documents (MLPD) and Detailed Platform Application Design documents (DPAD).
Expertise in developing and maintaining an overall test methodology and strategy, documenting test plans and test cases, and editing and executing test cases and test scripts.
Resolving data issues, completing unit testing and completing system documentation for ETL processes.
Involved in the development of test cases and execution of Unit/UAT test cases.
Involved extensively in unit testing, system testing and regression testing.
Experience with defect tracking tools such as HP Quality Center.
Involved in daily interactions with the business to understand their requirements, acting in a business liaison role.
Setting up defect calls for tracking and closing defects.
Experience in producing project estimations, timeline scheduling, resource forecasting, communicating key milestones to IT management and clients, task scheduling and status tracking.
Excellent communication skills, problem solving skills, leadership qualities and an attitude to learn new cutting-edge technologies.
Experience in leading and mentoring team members on both functional and technical aspects.
Flexible, enthusiastic and project-oriented team player with solid communication and leadership skills to develop creative solutions for challenging client needs.
Able to work independently and to collaborate proactively and cross-functionally within a team.
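As an illustration of the Explain Plan based tuning mentioned in this summary, the sketch below captures and displays an execution plan from a shell script. The connect string and table names are hypothetical, and a PLAN_TABLE is assumed to already exist.

#!/bin/ksh
# explain_plan.ksh -- sketch: generate and display the execution plan for a candidate query.
sqlplus -s etl_user/secret@DWHDB <<'EOF'
SET LINESIZE 150 PAGESIZE 200

EXPLAIN PLAN FOR
  SELECT c.customer_id, SUM(o.order_amt)
  FROM   stg_orders o
  JOIN   dim_customer c ON c.customer_id = o.customer_id
  GROUP  BY c.customer_id;

-- Full table scans or unused indexes in the plan drive the tuning decisions.
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
EOF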
Technical Skills:
Operating Systems: Windows (all versions), IBM AIX UNIX
Programming Languages: SQL, PL/SQL, UNIX Shell Scripting
Databases: Oracle 9i/10g/11g, Teradata 13.10
ETL Tools: Datastage 8.1/7.5.2, Pentaho 4.4.0
Tools: PL/SQL Developer, TOAD
Testing Tools: HP Quality Centre 11.0
Education:
Degree: Bachelor of Electronics and Communication Engineering 2004
Period: 2000 - 2004
University: Bharathidasan University, India
PROFESSIONAL EXPERIENCE:
Client: FedEx Office, Dallas, Texas July'12 to Present
Role: Datastage Consultant.
Technology & Tools Used: IBM WebSphere Datastage 8.1 Designer, Director, Administrator, Ascential Datastage 7.5.2 Server, Oracle 9i/10g/11g, Teradata 13.10, Solaris UNIX, TOAD, PL/SQL Developer
Responsibilities:
Designed and developed ETL jobs using Datastage 8.1 Enterprise Edition and Server Edition.
Prepared Software Requirement Specification (SRS), code review, Test Case Specification and deployment plan documents.
Delivered the full project design for the FedEx Tax Services system, which included preparing the Software Requirement Specification document, interacting with source teams and preparing mapping documents, Datastage coding, unit testing, support during system testing, preparing the deployment plan, supporting the Business Objects reporting team during their testing and supporting the business during UAT.
Delivered development enhancements for work requests received for the FedEx ECOM, Mobile Print, Online Signs and Graphics, and Packaging and Shipping projects, which included extracting data using Datastage from the Oracle database and the Teradata Enterprise Data Warehouse, transforming it and loading it into an Oracle database used by the Business Objects team for reporting. The work on these projects included preparing the SRS, identifying source systems and preparing mapping documents, documenting and presenting reject handling techniques to the business, and preparing the code review document and deployment plan.
Created a reject handling shared container for the FedEx ECOM project, which was used for reporting issues in the various source system files received from upstream systems.
Created common jobs in Datastage Server Edition using CRC for Change Data Capture, producing insert, update (load-ready) and delete files for loading into the target tables.
Created SFTP UNIX script Datastage jobs to support the HR source system, which was upgraded to PeopleSoft 9.1 (a minimal SFTP sketch follows this list).
Produced impact analysis documents, followed by SRS, code review, Datastage coding changes and other project documents to support decommissioning of the Central Repository (CR) database and its replacement with the Enterprise Database (EDB).
Supported business enhancements for Business Objects requirements that required code changes in Datastage for work requests such as Coupon Code Applied, Customer Base Foundation (CBF), etc.
Performed BASIC code development and job changes to implement batch jobs for the Order Analysis project and an ETL Modernization decommissioning work request.
Created documentation for a large ETL application that previously had no documentation at all, was facing various issues in production and kept receiving change requests; the documentation has been widely appreciated for its usefulness, and the precise information helped both the business and other downstream systems.
Other responsibilities included fixing production defects and performing system testing and system integration testing for Datastage coding done by other developers.
Provided various inputs for creating the data warehouse standards document.
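The SFTP script jobs mentioned above (see the item on the PeopleSoft 9.1 HR source system) generally reduce to a batch-mode sftp call driven from a shell script executed by the Datastage job. The host, account and paths below are hypothetical.

#!/bin/ksh
# pull_hr_extract.ksh -- sketch: fetch a PeopleSoft HR extract over SFTP in batch mode.
HOST=hr-sftp.example.com           # hypothetical SFTP host
USER=etl_xfer                      # hypothetical transfer account (key-based authentication assumed)
REMOTE_FILE=/outbound/hr_extract.dat
LOCAL_DIR=/data/landing/hr

# Build a one-off batch file of sftp commands and run it non-interactively.
BATCH=/tmp/hr_sftp.$$
print "get $REMOTE_FILE $LOCAL_DIR/" > "$BATCH"
sftp -b "$BATCH" "$USER@$HOST"
RC=$?
rm -f "$BATCH"
exit $RC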
Client: Lloyds Banking Group (LBG), Manchester, United Kingdom April'07 to May'12
Role: Datastage Designer / Lead Developer.
Technology & Tools Used: IBM WebSphere Datastage 8.1 Designer, Director, Administrator, Oracle 10g, Teradata 13.10, IBM AIX UNIX 5.1, TOAD, PL/SQL Developer, Connect Direct File Transfer Tool, IBM Mainframe
Responsibilities:
During my tenure at Lloyds as Datastage Designer and Lead Developer, I predominantly worked on the Datastage modules of projects related to Financial Data Processing, Business Performance Management, Sales and Marketing Analysis and Reporting, and e-Statements, all of which made extensive use of Datastage 8.1.
Participated in all phases of the project cycle, including requirement analysis, client interaction, design, coding, testing, production support and documentation.
Involved in gathering business requirements and providing development and testing estimates, timelines and resource forecasts.
Responsible for preparing Mid Level Platform Application Design (MLPD) and Detailed Platform Application Design (DPAD) documents based on the Business Requirement Specification document, giving walkthroughs to document review panel members and clients, and obtaining sign-offs from the CIO and other subject matter experts from the client organization before proceeding with the build activity.
Involved in preparing the coding standards documentation for Datastage development.
Involved in designing complex Parallel Datastage jobs, sequencers.
Involved in working with SAP Ledger downstream systems by sending SAP reference
files and loading processed data to the SAP tables.
Extensively used job stages like Datasets, Sequential File, Lookup, Aggregator, Join, Transformer, Sort, Funnel, Remove Duplicates, Merge, Filter, SCD, Change Capture, Copy, External Filter, External Source, Pivot, Complex Flat File, etc.
Used the Datastage XML Input stage for processing the bank's hierarchical files.
Coordinated with the UNIX admin and service delivery team in procuring the UNIX environment and creating the Datastage UNIX environment file system and directories.
Involved in extracting the data from various sources like Flat files, Oracle and
Teradata databases.
Familiar with BTEQ, Fast load, Fast export concepts in Teradata.
Involved in developing key data validation and splitting modules using Datastage in the Financial Data Processing project for various source files coming from different source systems.
Used the multiple invocation technique to run jobs as multiple instances, reducing run time and expediting the processes.
Used Datastage Designer to develop jobs for extracting, cleansing, transforming,
integrating, and loading data into data warehouse database.
Utilized Slowly Changing Dimensions for tracking changes to dimensions over time.
Used Datastage Administrator for defining environment variables and project level settings.
Utilized the stages of Job Sequence such as User Variable Activity, Notification Activity, Routine Activity, Terminator Activity, etc.
Involved in developing error logging shared containers and extensively used them in various Datastage jobs to gather the error logs in a business defined format, which facilitated the business in analyzing the erroneous files and fixing them quickly.
Responsible for validating the Datastage jobs and sequencers against the pre-defined ETL design standards.
Involved in developing Connect Direct jobs for transferring files to the downstream systems and for pulling files from the source systems as well.
Involved in Datastage jobs bug fixing and supporting testing team during various
stages of testing.
Involved in tuning source extract SQL scripts used in ETL jobs to meet the business SLAs.
Analyzed the requirements to identify the necessary tables that need to be populated into the staging database.
Prepared the DDLs for the staging/work tables and coordinated with the DBA for creating the development environment.
Developed Ksh shell scripts for automating Datastage jobs and housekeeping activities.
Created shell scripts to automatically notify business exceptions and rejects during the loads and file processing stages.
Created UNIX shell scripts to read the parameter files and pass these values to the job during runtime (a short sketch follows this list).
Experience in preparing and reviewing Functional Test Plans and Master Test Plans for the various stages of the testing activities.
Responsible for preparing various test scenarios and validating the data as per the business rules.
Involved in preparing and executing System Testing and System Integration test cases.
Provided support during User Acceptance Testing by running jobs and providing fixes.
Coordinated the offshore development, testing and implementations using implementation plans.
Responsible for peer reviews, planning and estimating the project requirements, and reporting status to business managers.
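The parameter file scripts noted above typically read simple name=value pairs and turn them into -param arguments for dsjob. The sketch below illustrates the idea; the file locations, project and job names are hypothetical, and values are assumed to contain no spaces.

#!/bin/ksh
# run_with_params.ksh -- sketch: read name=value pairs from a file and pass them to a Datastage job.
PARAM_FILE=/etl/config/daily_load.params     # hypothetical parameter file (one name=value per line)
PROJECT=FIN_DWH                              # hypothetical project
JOB=px_load_gl_balances                      # hypothetical parallel job

# Source the engine environment (path assumes a default Information Server install).
. /opt/IBM/InformationServer/Server/DSEngine/dsenv

# Build "-param name=value" arguments, skipping comments and blank lines.
PARAMS=""
while read line; do
    case "$line" in
        \#*|"") continue ;;
    esac
    PARAMS="$PARAMS -param $line"
done < "$PARAM_FILE"

# PARAMS is deliberately left unquoted so each -param pair becomes a separate argument.
dsjob -run -wait -jobstatus $PARAMS "$PROJECT" "$JOB"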
Client: Halifax Bank of Scotland (HBoS), India, August'06 to February'07
Role: Datastage Developer/Designer.
Technology & Tools Used: IBM WebSphere Datastage 8.1 Designer, Director, Administrator, Datastage 7.5.2, Oracle 10g, Teradata V2R6, IBM AIX UNIX 5.1, TOAD, PL/SQL Developer.
Responsibilities:
Involved in gathering business requirements and producing inputs to MLPD and DPAD.
Involved in all phases of the project cycle including Requirement Analysis, Client Interaction, Design, Coding, Testing, Support and Documentation.
Responsible for preparing ETL Documentation for the developed processes.
Investigation of possible terrorist/money laundering transactions happening in the system for the tickets raised by the business.
Responsible for handling production support tickets.
Automated SQL queries using UNIX scripts for loading large volumes of data during data migration activities in the AML project.
Involved in tuning many Oracle scripts and other ETL processes used in this project.
Responsible for peer reviews, planning and estimating project requirements, and reporting status to business managers.
Created documentation for the developed jobs where appropriate documents were not available.
Provided production support and bug fixing during various stages of testing.
Involved in designing and developing Datastage parallel jobs, sequencers and shared containers.
Performance tuning of Datastage jobs and Oracle SQL queries.
Extensively used job stages like Datasets, Sequential File, Lookup, Aggregator, Join, Transformer, Sort, Funnel, Remove Duplicates, Merge, Filter, SCD, Change Capture, Copy, External Filter, External Source, Pivot, etc.
Used the multiple invocation technique for running jobs in multiple instances to speed up the process.
Involved in writing Oracle PL/SQL stored procedures and SQL scripts.
Developed Ksh shell scripts for automating Datastage jobs and housekeeping activities.
Created shell scripts to automatically notify business exceptions and rejects during the loads and file processing stages (a short sketch follows this list).
Created UNIX shell scripts to read the parameter files and pass these values to the job during runtime.
Responsible for preparing unit test cases and validating the data as per the business rules.
Provided support to the testing team during System Testing, System Integration Testing and User Acceptance Testing.
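The exception and reject notification scripts mentioned above usually just test the reject file and mail a short summary. A minimal sketch follows, with a hypothetical reject file path and distribution list.

#!/bin/ksh
# notify_rejects.ksh -- sketch: mail a summary when a load produces reject records.
REJECT_FILE=/etl/rejects/cust_load_rejects.dat   # hypothetical reject file written by the load
DIST_LIST="etl-support@example.com"              # hypothetical distribution list

# Only send a notification when the reject file exists and is non-empty.
if [[ -s "$REJECT_FILE" ]]; then
    COUNT=$(wc -l < "$REJECT_FILE")
    {
        print "Load completed with $COUNT reject record(s)."
        print "First few rejects:"
        head -10 "$REJECT_FILE"
    } | mailx -s "ETL ALERT: rejects in customer load" "$DIST_LIST"
fi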
Client: AstraZeneca, India March'04 to August'06
Role: ETL Oracle Developer.
Technology & Tools Used: Datastage 7.5.2, Oracle 10g, Teradata, IBM AIX UNIX 5.1, TOAD, PL/SQL Developer.
Responsibilities:
Understanding business requirements.
Developing Datastage jobs using the MLPD, DPAD and Business Requirement Specification documents.
Extensively used job stages like Datasets, Sequential File, Lookup, Aggregator, Join, Transformer, Sort, Funnel, Remove Duplicates, Merge, Filter, Change Capture, Copy, External Filter, External Source, Pivot, etc.
Developing Oracle SQL and PL/SQL queries and DDLs for database creation.
Performance tuning of SQL queries.
Developing UNIX scripts for automating Datastage jobs.
Extensively used Datastage Designer & Director.
Writing and executing test cases for unit testing.
Providing support to the QA team during System Testing and System Integration Testing.
Coordinating with UNIX Admin and DBAs for Datastage UNIX environment creation and database creation.
Engaged in Production Support tasks.
Documenting the changes for future reference.