
AnkalaRao Chinthapalli

Mobile : +91-9949642442
Email: aravind7_ch@yahoo.co.in
Objective:
Seeking a challenging role in a highly motivated team where I can contribute my expertise in the design and
implementation of data warehouse solutions using Informatica, Oracle, Teradata, Microsoft SQL
Server and MySQL.
Professional Summary:
Over 8 years of experience designing and developing large-scale data integration solutions for data
warehouses, application migrations and similar projects across a variety of platforms and databases.
Good knowledge of data warehousing and business intelligence concepts, with emphasis on
extraction, transformation and loading of data. Extensive experience in analysis, design,
development, testing and implementation across the complete project life cycle, and
contributed to the development of system architecture.
Extensive hands-on experience gathering requirements, analyzing source data and developing source-to-target
mapping documents.
Extensive experience in ETL methodology for developing data extraction, loading and transformation
processes using Informatica Power Center 9.x/8.x/7.x (Repository Manager, Designer and
Workflow Manager).
Strong Informatica development experience with databases such as Oracle 11g, MySQL,
Teradata 12 and MS SQL Server.
Good understanding of dimensional modeling, including star and snowflake schemas.
Good understanding of Informatica architecture and extensive experience using the Informatica
components Designer, Workflow Manager and Workflow Monitor.
Extensive experience implementing Slowly Changing Dimension (SCD) Type I, Type II and Type III logic.
Good exposure to partition points, cache requirements, commit points, pushdown optimization and
session recovery.
Good exposure to Teradata architecture and writing BTEQ scripts, with hands-on experience using
Teradata utilities such as FastExport, MultiLoad, TPump and FastLoad.
Good exposure to creating macros, views and procedures, query writing, and performance tuning
as requirements demand.
Loaded data into the Teradata production and development warehouses using Teradata Parallel
Transporter (TPT) with Informatica.
Hands-on knowledge of UNIX operating systems (Linux and Solaris) and Windows 98/2000/2003.
Good exposure to shell scripting and Perl scripting.
Good experience with and knowledge of Business Objects and MicroStrategy.
Built OLAP reports and dashboards in MicroStrategy 8.1.1; created logical schemas by defining
attributes, facts, metrics, filters, custom groups, etc.
Using Narrowcast Administration 8.1.1, created services, subscribers and subscriptions with
personalization characteristics for each subscription, and scheduled and monitored the services.
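The SCD logic listed above can be sketched outside Informatica as a small Python routine. This is a minimal illustration only; the `customers` dimension, its columns and the change-tracked `city` attribute are all hypothetical, and in practice this logic lives in an Informatica mapping with a dynamic lookup cache and an Update Strategy transformation.

```python
from datetime import date

def apply_scd2(dimension, incoming, load_date):
    """Merge incoming source rows into the dimension, SCD Type II style."""
    # Index the currently-active version of each natural key.
    current = {r["cust_id"]: r for r in dimension if r["is_current"]}
    for row in incoming:
        existing = current.get(row["cust_id"])
        if existing is None:
            # New natural key: insert the first version.
            dimension.append({**row, "eff_date": load_date,
                              "end_date": None, "is_current": True})
        elif existing["city"] != row["city"]:
            # Tracked attribute changed: expire the old row, insert a new version.
            existing["end_date"] = load_date
            existing["is_current"] = False
            dimension.append({**row, "eff_date": load_date,
                              "end_date": None, "is_current": True})
        # Unchanged rows are left as-is.
    return dimension

dim = [{"cust_id": 1, "city": "Hyderabad", "eff_date": date(2013, 1, 1),
        "end_date": None, "is_current": True}]
src = [{"cust_id": 1, "city": "Pune"}, {"cust_id": 2, "city": "Chennai"}]
dim = apply_scd2(dim, src, date(2014, 6, 1))
```

Type I logic would simply overwrite `city` in place instead of versioning, and Type III would carry the previous value in an extra column.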
Experience Details:
Working as a Team Leader at Accenture, Hyderabad, from July 2010 till date.
Worked as a Senior Software Engineer at Surgical Information Systems, Hyderabad, from May 2006 to
June 2010.
Worked as a Software Engineer at ICSA, Hyderabad, from Nov 2004 to April 2006.
Certifications:
1. Informatica Certified Developer 8.x
2. Informatica Certified Administrator 8.x
3. Teradata Certified Professional 12.0
Education Details:

1. M.C.A. from Nagarjuna University, passed out in 2003.


Technical Skills:
ETL               : Informatica Power Center 9.1.0/8.6.1/7.x
OLAP Tools        : Business Objects 6.5, MicroStrategy 9.0, QlikView
Databases         : SQL Server 2005, Oracle 9i, Teradata 12.0/13.0
Data Modeling     : SQL Data Modeler 3.3
Operating Systems : Windows NT, Windows 2003, UNIX
Project #1:
Title       : Novartis AMAC CI to Informatica Migration
Client      : Novartis
Role        : ETL Lead
Duration    : Oct 2014 till date
Environment : Informatica Power Center 9.5.0, Informatica Cloud Service, Cast Iron, Flat Files, Oracle 11g, SQL, Salesforce.com, Windows, UNIX
Project Description:
The main intent of this project is to migrate from the current platform to a new platform while
scaling up the current system. Flat-file data is fetched from Salesforce.com (SFDC), and database
tables are loaded from the SFDC system into an Oracle base using the Informatica Cloud service.
Data from multiple source systems (flat files and Veeva-related data in Salesforce.com) is loaded
from the cloud into Oracle tables.
Responsibilities:
Involved in requirement gathering and analyzing BRDs, and worked with system analysts to
create source-to-target documents.
Analyzed data in source databases before loading into data warehouse and created technical
documents according to BRD.
Created complex mappings using technical documents and loaded data into various databases.
Used dynamic cache in lookups and created slowly changing dimensions according to the
requirements.
Developed mappings involving complex business logic using mapping parameters, mapping
variables, unconnected lookups, SQL overrides, Normalizer, Union, etc.
Wrote pre- and post-session SQL scripts while loading data into the Oracle database.
Identified bottlenecks at mapping level using debugger and resolved them to increase
performance.
Optimized performance of mappings by using appropriate transformations such as Source Qualifier,
Aggregator, connected lookups, etc.
Worked with data architects to analyze loaded data in the database and modified transformation
logic where necessary.
Created pre-session and post-session shell scripts and e-mail notifications.
Optimized query performance and tuned DTM buffer size and buffer block size to improve session
performance.
Used Parameter files to initialize workflow variables, Mapping parameters and mapping
variables and used system variables in mappings for filtering records in mappings.
Performed data validation, reconciliation and error handling in the load process.
Created, optimized, reviewed, and executed SQL test queries to validate transformation rules
used in source to target mappings/source views, and to verify data in target tables.
Developed and executed test cases for Integration and system testing and documented them along
with technical documents.
Involved in migrating Informatica code from version 8.6 to 9.0.1.
Worked with QA team to resolve and analyze defects in different tiers.

Optimized the performance of queries by removing unnecessary columns, eliminating redundant
and inconsistent data, normalizing tables, establishing joins and creating indexes wherever
necessary.
Worked with back-end database administrators to provide requirements for necessary back-end
changes.
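The source-to-target validation queries described above can be illustrated with a self-contained sqlite3 sketch; the `src_orders` and `tgt_orders` tables and their columns are invented for the example, and a real reconciliation would run against the actual staging and warehouse schemas.

```python
import sqlite3

# Sketch of an ETL reconciliation check: compare row counts and a summed
# measure between a source and a target table after a load.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_orders (order_id INTEGER, amount REAL)")
cur.execute("CREATE TABLE tgt_orders (order_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO src_orders VALUES (?, ?)",
                [(1, 10.0), (2, 20.0), (3, 30.0)])
cur.executemany("INSERT INTO tgt_orders VALUES (?, ?)",
                [(1, 10.0), (2, 20.0), (3, 30.0)])

# Count and sum must both come out to zero difference; a non-zero value
# flags dropped or duplicated rows in the load.
cnt_diff, amt_diff = cur.execute("""
    SELECT (SELECT COUNT(*) FROM src_orders) -
           (SELECT COUNT(*) FROM tgt_orders),
           (SELECT IFNULL(SUM(amount), 0) FROM src_orders) -
           (SELECT IFNULL(SUM(amount), 0) FROM tgt_orders)
""").fetchone()
```

The same count/sum pattern extends naturally to per-key `GROUP BY` comparisons when a load defect needs to be narrowed down to specific rows.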
Project #2:
Title       : Star Aggregator (Analytics)
Client      : Star TV
Role        : ETL Lead
Duration    : July 2014 to Sep 2014
Environment : Flat Files, MySQL, SQL, Windows, QlikView, UNIX, Erwin 4.1
Description:
The AVS BE source systems make data available in the form of interfaces/extracts to the Star Aggregator
system as input for the staging area. Source data consists of flat CSV files produced by the AVS BE
source system.
The Star data mart solution contains an aggregated level whose purpose is to create pre-aggregated data
tables from existing target tables as input for the QlikView dashboarding system. The QlikView reporting
tool takes as input the four tables provided by the MySQL database after they are populated in the
reporting area.
A reload of newly available data is scheduled every night so that the customer can see each KPI updated
to the last useful date.
Roles and Responsibilities:
Performed Customer Requirements Gathering, Requirements Analysis, Design, Development, Testing,
End User Acceptance Presentations, Implementation and Post production support of BI projects.
Designed source-to-target mappings, primarily from flat files and MySQL tables, using UNIX jobs.
Developed transformation logic to cleanse source data of inconsistencies before loading it into
the staging area, which is the source for stage loading via procedures.
Responsible for performance tuning of several ETL processes.
Created UNIX shell scripts to read and archive files from source directory.
Designed and developed error handling strategies to re-route bad data.
Involved in conceptual, logical, physical data modeling and designed star schema for data warehouse
Used Crontab to schedule and run sessions, as well as to check logs for all activities.
Worked with migration team, testing team to fix defects in various environments like DEV and QA.
Involved in the optimization of SQL queries which resulted in substantial performance improvement
for the conversion processes.
Used mapping variables and mapping parameters and created parameter files.
Participated in code reviews and modified procedures according to the feedback from client.
Developed various ad hoc reports and created documents using Report Services Documents.
Deployed procedure code to the Integration, QA and production environments and supported
post-production issues.
Documented technical mapping specifications and reviewed them with architects.
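A minimal sketch of the kind of file-archiving shell script mentioned above might look like the following; the directory names and sample files are hypothetical, and a production job would add logging, error handling and a config-driven path list.

```shell
#!/bin/sh
# Sketch: move processed source files into a datestamped archive directory.
SRC_DIR=./incoming
ARC_DIR=./archive/$(date +%Y%m%d)

mkdir -p "$SRC_DIR" "$ARC_DIR"
touch "$SRC_DIR/sales_01.csv" "$SRC_DIR/sales_02.csv"   # sample input files

for f in "$SRC_DIR"/*.csv; do
    [ -e "$f" ] || continue          # nothing to archive
    mv "$f" "$ARC_DIR/"
done
```

A crontab entry such as `0 2 * * * /opt/etl/archive_files.sh` (path hypothetical) would schedule it nightly, matching the crontab-based scheduling noted above.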
Project #3:
Title       : BWIN-PARTY-EDWH
Client      : BWIN
Role        : Team Lead (ETL Developer)
Duration    : July 2012 to June 2014
Environment : Informatica Power Center 9.1.0, ODI, Flat Files, Oracle 9i, SQL Server 2005, Teradata 12.0, Windows, UNIX
Project & Role Description:
Project & Role Description
Bwin.Party Digital Entertainment is an online gambling company formed by the March 2011 merger of
PartyGaming plc and bwin Interactive Entertainment AG, and is the world's largest publicly traded online
gambling firm, headquartered in Gibraltar. Bwin.Party has four key products/verticals: sports betting,
poker, casino and bingo. Sports betting is a core business, with three main brands: bwin, Gamebookers
and PartyBets. Poker is another core business; its main brand is PartyPoker.

Responsibilities:
Wrote technical requirements and specifications for the modifications after interacting with
customers/end users to obtain the requirements
Worked with Business Analysts and Data Architects in gathering the requirements and designed
the Mapping Specification Documents.
Prepared technical requirements documents which include both macro-level and micro-level
design documents
Used Erwin data modeler to design the data marts and also generate the necessary DDL scripts
for objects creation for DBA review
Involved in preparing & documenting Test cases and Test procedures. Involved in developing
these for Unit test, Integration Test and System Tests
Used various Informatica transformations such as Union, Joiner, Expression, Lookup, Aggregator,
Filter, Router, Normalizer, Update Strategy, etc.
Involved in performance tuning of Informatica mappings and reusable transformations,
and analyzed the target-based commit interval for optimum session performance.
Wrote Pre-session and Post-session shell scripts for dropping, creating indexes for tables, Email
tasks and various other applications
Used Sequence Generator to create Dimension Keys and Update Strategy to insert records into
the target table in staging and Data Mart.
Used unconnected lookups where different expressions shared the same lookup across multiple
targets, so the same logic was executed once and returned a single value.
Used the Debugger in debugging some critical mappings to check the data flow from instance to
instance.
Implemented Slowly Changing Dimension Type I and Type II mappings as per the requirements.
Created an error log table to capture error messages and session load times.
Performance tuned the workflow by identifying the bottlenecks in sources, targets, mappings
and sessions
Identifying read/write errors using Workflow and Session logs
Used Parameter files to initialize workflow variables, Mapping parameters and mapping
variables and used system variables in mappings for filtering records in mappings
Developed all the mappings according to the design document and mapping specs provided and
performed unit testing
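The parameter files referred to above generally follow Informatica's one-section-per-session layout. A hedged example, with the folder, workflow, session, connection and parameter names all invented for illustration, is:

```ini
[MyFolder.WF:wf_load_bets.ST:s_m_load_bets]
$$LOAD_DATE=2014-06-30
$$SOURCE_SYSTEM=PARTY
$DBConnection_Src=CONN_ORA_STG
$DBConnection_Tgt=CONN_TD_EDW
$PMSessionLogFile=s_m_load_bets.log
```

The `$$` names are mapping parameters/variables resolved inside the mapping, while the single-`$` names are session-level parameters such as connections and log files; the file is attached to the workflow or session so the same code runs against different dates and environments.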
Project #4:
Title       : BWIN Integration
Client      : BWIN
Role        : ETL Developer
Duration    : July 2011 to June 2012
Environment : Informatica Power Center 9.1.0, ODI, Flat Files, Oracle 9i, SQL Server 2005, Teradata 12.0, Windows, UNIX
Project Description:
The Integration Committee was established to oversee the efficient and effective integration
of the previously existing PartyGaming and BWIN businesses, identifying the parts to be integrated
when combining the two. The main modules are Poker, Casino, Sports Betting and CSM. I handled a
single module and developed mappings based on the mappings originally built in ODI, converted
PL/SQL procedures to macros in Teradata, and monitored the loads on a daily basis.

Given the importance of Teradata in the project, data was loaded from source to target using Teradata
utilities such as FastExport, MultiLoad, TPump, FastLoad and TPT (Teradata Parallel Transporter).
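A skeletal FastLoad control script of the kind these utilities use could look like the following sketch; the TDPID, credentials, database, table, column and file names are all hypothetical.

```
LOGON tdprod/etl_user,password;
DATABASE edw_stg;

BEGIN LOADING stg_bets
    ERRORFILES stg_bets_err1, stg_bets_err2
    CHECKPOINT 100000;

SET RECORD VARTEXT ",";
DEFINE bet_id   (VARCHAR(18)),
       cust_id  (VARCHAR(18)),
       amount   (VARCHAR(18))
FILE = /data/in/bets.csv;

INSERT INTO stg_bets (bet_id, cust_id, amount)
VALUES (:bet_id, :cust_id, :amount);

END LOADING;
LOGOFF;
```

FastLoad targets empty tables with no fallback data in the error tables; for incremental loads the same shape of job would typically be expressed as a MultiLoad or TPT script instead.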

Responsibilities:

Responsible for delivering defect-free code to the client production environment.


Involved in requirements study and understanding the functionalities
Prepared functional requirement specifications.
Prepared the technical design document and high-level design document.
Prepared the ETL mapping design document.
Created the data model per the functionality of the application using a data modeling tool (Erwin).
Responsible for analyzing the impact on the system whenever source data changes.
Created mappings using transformations such as Source Qualifier, Aggregator, Expression,
Router, Filter, Rank, Sequence Generator and Update Strategy.
Identified and created different source definitions to extract data from input sources and load into
relational tables using Informatica Power Center.
Developed complex Informatica mappings using SQL overrides to implement Type 2 logic.
Used heterogeneous data sources (SQL Server 2005, Oracle 10g, Teradata, MySQL and flat files).
Involved in Extensive Design and Code reviews of ETL Informatica Objects
Created and Monitored Informatica sessions and created the Macros for some sessions.
Analyzed Mappings and Sessions for better performance.
Used pass-through, key-range and round-robin partitioning along the data transformation
pipeline to increase performance.


Performed the role of a Team Lead which included mentoring people.
Responsible for reviewing unit test cases and unit test results.
Developed source-to-target mapping documents for changes occurring in the source and
proposed optimal designs.
Taking overall responsibility and ownership for team deliverables.
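The PL/SQL-to-macro conversion mentioned in the project description can be sketched as a hypothetical Teradata macro; the schema, table and column names are invented for the example.

```sql
-- Hypothetical Teradata macro standing in for a converted PL/SQL procedure:
-- summarize one day's bets into a daily summary table.
CREATE MACRO edw.load_daily_bet_summary (run_date DATE) AS (
    INSERT INTO edw.bet_summary_daily (bet_date, cust_id, total_amount)
    SELECT bet_date, cust_id, SUM(amount)
    FROM   edw.bets
    WHERE  bet_date = :run_date
    GROUP  BY bet_date, cust_id;
);

-- Invoked as: EXEC edw.load_daily_bet_summary (DATE '2013-07-01');
```

Unlike a PL/SQL procedure, a macro runs its statements as a single multi-statement request, so the insert is implicitly atomic; procedural branching, where needed, has to move into the calling ETL layer.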
Project #5:
Title       : Sanofi US AgSpend
Client      : SANOFI-AVENTIS, USA
Role        : ETL Developer
Duration    : July 2010 to June 2011
Environment : Windows, UNIX, Informatica Power Center 8.6.1, Flat Files, Teradata, Oracle 9i, MicroStrategy 8.1.1
Description:
The US Transparency Initiative is in part a response to the Patient Protection and Affordable Care Act
(PPACA) Federal Statute and is designed to create transparency for all Health Care Provider (HCP) and
Health Care Organization (HCO) expenditures.
The Transparency System consists of two main modules: Data Capture and Reporting. For Data Capture,
several integration layers will be incorporated into Sanofi-Aventis architecture to drive a comprehensive
data warehouse that contains all Covered Recipient spend transactions and master data. The data will be
sourced from multiple Source Systems.
AgSpend has four main modules: Accounts, Prescribers, Affiliations and Products. The process flow
extracts data from different data sources such as SAP, CLUBNET, IST and Pasteur and transfers it from
flat files to staging, then from staging to landing and from landing to final. Data is extracted from FTP
using UNIX scripts, and mappings developed in Informatica Power Center Designer load data from the
staging environment to the warehouse.

Responsibilities:
Involved in requirements study and understanding the functionalities
Preparation of Functional Requirement Specifications.
Responsible for analyzing the impact on the system whenever source data changes.
Created UNIX scripts to load data from CSV files to stage, then loaded data from stage
to target using procedures.
Implemented SCD Type 1 logic based on the functional design.
Analyzed aggregate procedures, then tuned them for better performance.
Performed the role of a Team Lead which included mentoring people.
Responsible for reviewing unit test cases and unit test results.
Project #6:
Title       : HealthCare Management System
Client      : UCONN, US
Role        : ETL Developer
Duration    : July 2008 to June 2010
Environment : Informatica Power Center 8.6, Windows, Flat Files, Oracle 9i, UNIX

Description:
It is a total healthcare management system that manages complete patient information
across three modules: Patient Information, Accounts Information and Monitoring. The
Patient Information module contains patient details and up-to-date health information. In the
Accounts Information module, day-to-day bill settlements are entered into the online system. The
Monitoring module tracks the patient from day one to the discharge date, and a confidential
case sheet is maintained.
Responsibilities:
Responsible for Business Analysis and Requirements Collection.
Translated requirements into business rules and made recommendations for innovative IT solutions.
Involved in analyzing scope of application, defining relationship between data, star schema, etc.
Analyzed the star schema in dimensional modeling to find suitable dimensions and facts for the schema.
Involved in the Design and development of Data Mart and fetching data from different data sources.
Documented data conversion, integration, load and verification specifications.
Parsed high-level design specs into simple ETL coding and mapping standards.
Built ETL to load data from a wide range of sources, such as flat files and Oracle, into XML documents.
Used transformations such as Joiner, Expression, connected and unconnected Lookups, Filter,
Aggregator, Stored Procedure, Rank, Update Strategy, Java Transformation, Router and Sequence
Generator.
Involved in Informatica administrative work such as creating Informatica folders, repositories and
managing folder permissions.
Collected performance data for sessions and performance tuned by adjusting Informatica session
parameters.
Created pre-session and post-session shell scripts and mail-notifications.
Extensively worked on the Informatica Designer, Repository Manager, Repository Server, Workflow
Manager/Server Manager and Workflow Monitor.
Project #7:
Title       : SIS TRAX
Client      : SIS
Role        : ETL Developer
Duration    : May 2006 to June 2008
Environment : Windows NT, Informatica Power Center 7.1, Oracle 9i, UNIX
Description:

A complete tissue tracking and management system, fully integrated into the perioperative
documentation workflow, that enables a facility to standardize processes, accurately track tissues,
reduce waste and manage tissue inventory. The SIS Trax system brings rich clinical functionality
and expert tissue management to the hospital. From receiving to post-implantation, the SIS web-based
tissue tracking system can be used from receipt of tissue through final disposition. Complete integration
with the SIS Perioperative Solution advances the typical tissue tracking system by including tissue
documentation in the normal documentation workflow, giving improved accuracy, reduced tissue
documentation time and a more standardized process.
Responsibilities:
Extraction, Transformation and Loading (ETL) of data by using Informatica Power Center.
Extracted source data from flat files and Oracle and loaded it into an Oracle database.
Developed mappings in Informatica Power Center Designer to load data from staging
environment to warehouse.
Created mappings using transformations such as Source Qualifier, Aggregator, Expression,
Router, Filter, Sequence Generator and Update Strategy.
Created and Monitored Informatica sessions.
Checked and tuned the performance of Informatica Mappings.
Involved in Creating Sessions, Workflows Using Workflow Manager
Involved in testing the mappings (UNIT testing).
Project #8:
Title       : SIS Analytics
Client      : SIS
Role        : Report Developer
Duration    : May 2006 to June 2007
Environment : Windows NT, Informatica Power Center 7.1, Oracle 9i, UNIX
Description:
SIS Analytics provides nurses, surgeons, anesthesia providers and hospital executives with actionable
intelligence to view data, develop action plans, improve processes and obtain a clearer picture of OR
performance. Using QlikView reports and trend analysis, they can improve efficiency and productivity
as well as financial performance.
Responsibilities:
1. As per user requirements, created different types of reports, such as Master/Detail, Cross
Tab and Chart.
2. Created reports that enable users to explore and view data from different perspectives;
used slice-and-dice and drill-down in reports.
3. Distributed documents to various user groups and user types through the repository.
4. Analyzed the database and provided detailed information on operations and forecasts for
future years.
5. This QlikView Project is for Finance Department, which includes Bookings, Revenue, Cost and
Gross Margin.
6. Used functionality such as @functions, user response and formulas.
Project #9:
Title       : The Enterprise-Wide Accounting Software (Product)
Client      : Unisankyo Pvt Ltd, Hyderabad
Role        : .Net Developer
Duration    : Nov 2004 to April 2006
Environment : VC++, VB, Windows XP, SQL Server 2000
Description:
This is much more than just accounting software. It can be easily programmed to meet your specific
requirements. It provides for your data-entry screens to be customized to suit your specific
requirements with ease and flexibility. Conditions also can be imposed on each column using
mathematical formulae. Additional Screens can be added should the need arise. Application has a
provision that helps you design your own reports using the powerful built-in reports designer, if your
information needs exceed the standard reports provided in the menu. It supports transactions in
multiple currencies with user-definable exchange rates. Reports can be had in all currencies.
Reports on forex gains and losses can also be obtained. It provides a rich set of standard reports.
Besides, these reports can be printed in figures as well as in graphs.

I was involved in the PAYROLL module, covering leaves, shifts, vacation, attendance and advances,
calculating the payroll amount for each employee and generating customized reports.
Responsibilities:
1. Developed Cascading Style Sheets (CSS) for User Interface uniformity throughout the
application.
2. Developed and consumed Web Services for Speech Analysis and Integration with Keefe
Commissary.
3. Used HTML, JavaScript for developing Controls and web forms in Debit accounting Software.
4. Extensively used GridViews with sorting and paging
5. Implemented Template Columns for Custom Nested GridViews.
6. Developed XSL, XSD files for Media Metadata XML files.
