
B Ali

Sr. Integration Consultant


Email: programmer67109@gmail.com
Mobile: 734-367-4333

PROFESSIONAL SUMMARY

 Eight plus (8+) years of IT experience in the Analysis, Design, Development, Testing, and Implementation of business
application systems for the Healthcare, Financial, Telecom, Energy, and Oilfield Services sectors.
 Experienced in interacting with Business users in analyzing the Business process requirements and transforming them
into documents, designing, and rolling out the deliverables.
 Experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data
mapping, build, unit testing, systems integration and UAT.
 Strong experience in Extraction, Transformation, Loading (ETL) data from various sources into Data Warehouses and
Data Marts using Informatica Power Center (Repository Manager, Designer, Workflow Manager, Workflow Monitor).
 Hands-on experience building mappings with varied transformation logic such as Unconnected and Connected Lookups,
Router, Aggregator, Joiner, Update Strategy, Expression, Sorter, Rank, Sequence Generator, Normalizer, and reusable
transformations.
 Extensively worked on Informatica Designer Components - Source Analyzer, Warehousing Designer, Transformations
Developer, Mapplet and Mapping Designer.
 Strong Experience on Workflow Manager Tools - Task Developer, Workflow & Worklet Designer.
 Expertise in working with relational databases such as Oracle 11g/10g, SQL Server 2008, MySQL, DB2 8.0/7.0, MS
Access, PostgreSQL, and Greenplum.
 Strong experience in coding using SQL and PL/SQL procedures/functions, triggers, and packages.
 Hands on experience in tuning mappings, identifying and resolving performance bottlenecks in various levels like
sources, targets, mappings and sessions
 Solid experience in the installation, administration and monitoring of Informatica server, Domain, Repositories and
Informatica monitoring tools
 Highly proficient in processing tasks, scheduling sessions, import/export repositories, manage users, groups and
deployment activities in Dev, QA and Prod environment.
 Experience in Developing Repository (RPD) using Admin tool (i.e.) Physical layer, BMM layer and Presentation layer.
 Well versed in UNIX shell scripting.
 Experience in using automation scheduling tools like Control-M, CAWA, and the Informatica Scheduler.
 Practical understanding of the database schemas like Star Schema and Snow Flake Schema used in relational and
dimensional modeling.
 Exceptional analytical and problem-solving skills. Team Player with the ability to communicate effectively at all levels
of the development process.
TECHNICAL EXPERTISE
 ETL Tools: Informatica PowerCenter 9.x/8.x/7.x, Oracle Data Integrator 11g/12c
 Databases: Oracle 11g/9i/8.x, SQL Server, DB2, MySQL, MS Access, Greenplum
 Operating Systems: Unix/Linux, Windows (7,8,10)
 Packages: SQL *Plus, Toad 7.x, SQL Developer
 Data Modeling: Erwin
 Scheduling Tools: Control-M, CAWA.

PROFESSIONAL EXPERIENCE
Tektronix Communications March 2017 to Present
Client: Vodacom, SA
Informatica/ETL Developer

Project: Network Analysis Data Mart (NAD):

The NAD solution is a standard ETL environment that takes Voice, SMS, and Bearer & GN IMSI FACT data from touchpoint. It
then transforms that data using Informatica, enriching it with dimensional data also sourced from touchpoint. Finally, the
transformed data is loaded into a target warehouse data mart.
In addition, the solution performs aggregation and aging, and maintains a few slowly changing dimensions that provide a
rich historical view of its datasets. Reporting and data access is via a COGNOS framework.

Extract: 15-minute data (the level of granularity) is extracted from the touchpoint database using customized database views.
This data is loaded into tables in the staging schema (Staging).
Transform: The 15-minute data is aggregated to Hourly, Daily, Weekly, and Monthly
periods. In addition, the data is structured along specific dimensions such as IMSI, Cell, and Device.
Load: The data is loaded into the Data Warehouse from Staging. The primary mechanism for moving data from staging to the
Data Warehouse is via Oracle Partition Exchange. The benefit of this mechanism is that once the data is processed it is
incorporated into the Data Warehouse in a single step and thereby ensuring the integrity of the data in the Data Warehouse.
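As an illustrative sketch (the table, partition, and schema names below are placeholders, not the actual NAD object names), the partition-exchange load described above amounts to generating and running DDL like the following:

```shell
#!/bin/sh
# Build the Oracle partition-exchange DDL for one reporting period.
# Object names are hypothetical examples for illustration only.
STG_TABLE="STG_IMSI_FACT"   # staging table loaded by Informatica
DW_TABLE="DW_IMSI_FACT"     # partitioned warehouse fact table
PARTITION="P_20170301"      # partition for the period being loaded

# EXCHANGE PARTITION swaps the staging segment into the warehouse
# table as a metadata-only operation, so the load is atomic.
SQL="ALTER TABLE ${DW_TABLE} EXCHANGE PARTITION ${PARTITION} WITH TABLE ${STG_TABLE} INCLUDING INDEXES WITHOUT VALIDATION;"

echo "$SQL"
# In a real run this statement would be piped to sqlplus, e.g.:
#   echo "$SQL" | sqlplus -s user/pass@dwdb
```

Because the exchange is a dictionary-level swap rather than a row-by-row copy, the warehouse exposes the new period's data in a single atomic step, which is the integrity benefit noted above.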
Environment: Informatica Power Center 9.5.1, UNIX, Oracle 11g, SQL Developer

Role and Responsibilities:


 Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure,
and lookup to develop robust mappings in the Informatica Designer.
 Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data
warehouse.
 Modified existing mappings for enhancements of new business requirements.
 Extensively used workflow variables, mapping parameters and mapping variables.
 Enhanced performance for Informatica session using large data files by using partitions, increasing block size, data
cache size and target based commit interval.
 Analyzed Session Log files to correct errors in mappings and sessions.
 Wrote UNIX bash scripts for pre-session and post-session commands in mappings, including pmcmd/pmrep calls.
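A minimal sketch of such a post-session script, with placeholder service, domain, folder, and workflow names (not the project's actual values):

```shell
#!/bin/sh
# Sketch of a post-session shell command that kicks off a downstream
# workflow with pmcmd. All names below are illustrative placeholders.
INFA_SERVICE="INT_SVC"    # Integration Service name
INFA_DOMAIN="DOM_DEV"     # Informatica domain
FOLDER="NAD_ETL"          # repository folder
WORKFLOW="wf_load_agg"    # downstream aggregation workflow

# Compose the pmcmd call; in production this would be executed
# directly (with credentials) instead of echoed for inspection.
CMD="pmcmd startworkflow -sv ${INFA_SERVICE} -d ${INFA_DOMAIN} -f ${FOLDER} ${WORKFLOW}"
echo "$CMD"
```

In practice the same pattern covers pre-session steps as well, e.g. staging a source file or pruning old logs before `pmcmd` hands control to the next workflow.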

Capgemini Canada, CA October 2016-Feb 2017


OCDM/ETL Consultant
Client: SaskTel, Regina, Saskatchewan
 Oracle Communications Data Model (OCDM) installation, configuration, and customization as per client
requirements.
 Performed implementation and customization of the Oracle BRM Adapter to OCDM.
 Extensive use of Oracle Data Integrator as the ELT tool for loading tables from the BRM source into the OCDM staging layer, foundation
layer (base, reference, and lookup tables), and analytics layer (derived tables, aggregate tables, and OLAP cubes).
 Extensive use of PL/SQL (Procedures, packages and functions) and SQL tuning of inbuilt packages.
 Conducted GAP analysis to analyze the variance between system capabilities and business requirements.
 Development of packages and Scenarios using interfaces and variables.
 Imported and exported scenarios across development, test, and production environments.
 Performed reverse engineering on data sources and targets and created new models.
 Set up the Topology, including physical architecture, logical architecture, and context.

Environment: Oracle Data Integrator 11g/12c, Oracle Communications Data Model 11.2.5, BRM Adapter, Oracle 11g, Toad,
Unix

Tata Consultancy Services, Toronto, ON October 2015-Sep 2016


Sr. Developer
Client: Rogers

Project: BI Legacy Migration Stream:


This is a BI migration project. The purpose is to migrate the old legacy systems, consisting of Marquee (cable), ODIN (wireless), and
GCDM (customer), to the new environment. The new environment consists of the Operational Data Hub (ODH), which has four
layers: the Enterprise Landing Area (ELA), Staging, the Operational Data Store (ODS), and the business application layer. Next is the Analytical
Data Store (ADS), which is designed to contain integrated, enterprise-wide detailed and summarized data supporting both
strategic and executive management reporting and decision-making processes, consisting of OCDM (Oracle Communications
Data Model), OBIA (Oracle Business Intelligence Applications), and LATE (Legacy Application Transition Environment). There is also a
Hadoop/Big Data environment that complements the ODH/ADS environments because it is efficient at handling unstructured
and semi-structured data. All data will land in the Hadoop Data Lake as well as the ODH. This platform will eventually serve
as the landing zone for all data, and it will also be the destination for quick access to archived data.
The data can be accessed using standard tools (Tableau, Microsoft BI, SAS) or big data tools (Hive, Pig, etc.).
Responsibilities:
 Performed initial analysis to understand the legacy environment, reviewing the shell scripts, packages, procedures,
and Control-M jobs that needed to be migrated.
 Gathered business requirements and prepared mapping specification documents and source-to-target
mapping documents.
 Conducted daily meetings with the offshore team to track development progress and hand off
new requirements.
 Developed complex mappings in Informatica to load the data from various sources.
 Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP connections and
relational connections.
 Worked effectively in an onsite/offshore model, leading a team of four offshore and two onsite developers.
 Involved in peer code review.
 Worked on converting some of the PL/SQL scripts into Informatica mapping.
 Testing of scripts after migration to the new UNIX server.
 Coordinated closely with DBAs, administrators, and other team members on activities such as creating tables and new
schemas, creating project and shared folders, getting access to new repositories, and importing all sources, targets,
and parameter files to the new Informatica server.
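The parameter files mentioned above follow Informatica's `[Folder.WF:workflow]` section format. As a sketch, a script can generate one; the folder, workflow, date, and connection names here are hypothetical, not the project's actual values:

```shell
#!/bin/sh
# Generate a minimal Informatica parameter file of the kind used to
# externalize mapping variables, workflow variables, and connections.
# All names and values below are illustrative placeholders.
PARAM_FILE="wf_load_customers.par"

cat > "$PARAM_FILE" <<'EOF'
[MIGRATION.WF:wf_load_customers]
$$LOAD_DATE=2016-01-31
$DBConnection_SRC=ORA_LEGACY
$DBConnection_TGT=ORA_ODS
$FTPConnection_IN=FTP_CONN_LEGACY
EOF

# Show the externalized load date picked up by the workflow.
grep '^\$\$LOAD_DATE' "$PARAM_FILE"   # prints: $$LOAD_DATE=2016-01-31
```

Keeping `$$` mapping parameters and `$DBConnection...` variables in such a file lets the same mappings be promoted across Dev, QA, and Prod with only the parameter file changing.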
Environment: Informatica Power Center 9.5.1, UNIX, Oracle 11g, Teradata, SQL Developer for Oracle, Control-M for
scheduling jobs.

Groundswell Group, Vancouver, BC Nov 2014 to August 2015


Client: BC Hydro
Senior ETL Consultant

Project: Energy Analytics Solutions (EAS)


The primary objective of the EAS Project (Release 6) is to create an effective deterrent to electricity theft by consistently
demonstrating an ability to both:
• Positively identify theft with a high degree of certainty, quantify volume, and identify the area.
• Facilitate Revenue Assurance in rapidly shutting down instances of theft.
These objectives are achieved through the implementation of an Energy Analytics Solution (EAS) with an initial
focus on theft detection that leverages Smart Metering and emerging Smart Grid technologies. The SMI R6 program
equips the back office, RA Analysts, and field investigation teams with the data required for theft detection and identification
of potential theft, enables instant energy balancing, and facilitates intelligent analytics and reporting based on this data.
The aim is to provide accurate, correct, cleansed information via EAS Analytics, which internally pulls the information
from the Greenplum database.

Responsibilities:
• Developed the CIM ETL mapping using Informatica PowerCenter to load the data from an XML source into 48
staging tables in the Greenplum database.
• Created 48 individual mappings to load the data from Staging to ODS as per business requirements. This ODS
data is used for the EAS upstream requirements.
• Modified the existing XSD by adding three more classes (Series Reactor, Meter Bank, Meter Device) for the new CIM
model (XML source).
• Created two new Informatica environments (Unit_test, Sus_unit_test @INFATST) for the databases, for testing
the code before promoting it to QA and production.
• Changed and unit-tested the database functions in Dev that used old PostGIS functions so that they use
the new PostGIS functions and can be executed in GP 4.3.5. Created source-to-target (XML to staging,
then staging to ODS) mapping documents in Excel.
• Revised and modified the existing documentation in the BC Hydro SharePoint environment to incorporate the
new business rules for Smart Metering Integration (SMI).
• Used Enterprise Architect (EA) to manage the IEC Common Information Model (CIM), CIM profiles, and
CIM-based artifacts.
• Used pgAdmin as the administration and development tool for PostgreSQL.
Environment: ETL, Informatica Power Center 9.1, Oracle DBMS 10g, 11g, flat file, TIBCO, Greenplum DBMS, Informatica Power
Exchange for JMS, Pivotal Hadoop, Informatica Data Replication 9.1.
Groundswell Group, Vancouver, BC June 2014 to September 2014
Client: Provincial Health Services Authority
Sr. Integration Consultant/Informatica Administrator
The Provincial Health Services Authority's (PHSA) primary role is to ensure that BC residents have access to a coordinated network
of high-quality, specialized health care services.

Project: Data Migration: The Panorama Data Conversion Project handles the entire set of Extract, Transform and Load
(ETL) interfaces required to migrate data from the legacy iPHIS system and the Sexually Transmitted Infections
Information System (STIIS) over to the pan-Canadian health surveillance and management
system: Panorama.

Responsibilities:
• Designed and Developed Informatica Mapping to load data for STI (Sexually Transmitted Infection System)
from XML Source to target.
• Extensively used Lookup, Sorter, Aggregator, Filter, Expression, Normalizer, Mapping variables and
parameters, sequence Generator and Mapplets.
• Used JAMA for requirement capturing and used SQL Developer extensively for SQL, running queries, export
data to desired format, debugging and testing.
• Converted text file to PDF using JAVA based solution and PERL application as part of project requirement.
Used external loader to attach the file.

Responsibilities as Informatica Administrator:


• Also served as Informatica Administrator for this data migration project, performing a wide range of
administrative tasks.
• Installed and configured the Informatica server; created users, connections, and folders and managed privileges on
them.
• Performed cleanup activities and shell scripting, built maintenance procedures, handled backup and recovery, and
monitored the environment.
• Set up and configured Informatica domains and services (Repository, Integration, Metadata Manager).
• Reviewed all log files on a daily basis (Integration Service log, Repository Service log, domain log, node log, and
Catalina).

Environment: Informatica Power Center 9.5 (Designer, Workflow Manager, Workflow Monitor, Repository
Manager), Oracle 11g, SQL Developer for Oracle, UNIX, JAMA for requirement gathering.

Groundswell Group, Calgary, AB March 2014 to May 2014


Client: Calfrac Well Services
Sr. ETL Developer
Calfrac Well Services Ltd. is an oilfield services corporation operating in Western Canada; it provides oil and gas exploration
companies with acidizing, hydraulic fracturing, coiled tubing, nitrogen, and CO2 drilling and completion services.

Project: Equipment Integration


The business objective of the integration is to move equipment data from Cetaris (source) to SPIRA (target). This integration is
already in place.

Responsibilities:
 Implemented new business requirements to integrate all fleet codes into SPIRA using Informatica best practices.
The redesigned work also includes error reporting per the new logic. There are about 37 business rules (BR01-
BR37) for this integration process.
 Performed performance tuning that improved session performance by about 80%, and configured email
notifications at the workflow and session level.
 Configured all Informatica sessions for email notification to different business users.
 Created complex mappings using Unconnected Lookup, Sorter, Aggregator, dynamic Lookup, and
Router transformations to populate target tables efficiently.
 Prepared mapping specification documents and a source-to-target field matrix.
Groundswell Group, Calgary, AB February 2013 to March 2014
Client: Shaw
Integration Consultant

Shaw is a leading communications provider in Canada, providing internet, phone, and TV services to its
customers. Worked on 3 separate projects:

Project: (EDW Restatement Project)


The purpose of the EDW Restatement Project is to deliver adjustments to the business rules in the Enterprise Data Warehouse (EDW)
for reporting Cable, Phone, and Internet subscribers and how they are measured.
The Sales Tracking Summary report package will be re-pointed to the EDW as the source of truth (SOT) and will reflect accurate
subscriber counts. Group billing accounts were being handled differently from the rest of the customer base, and it was
determined that this was unnecessary.

Project: (Past period Restatement)


The Past Period Restatement (PPR) Project provides the ability to adjust past, present, and future published EDW passed-household
counts, customer counts, related net gain, and service-level counts in either a positive or a negative direction. The intent is
to provide a full audit trail by making adjustments to individual drop or customer records by key measures for a specific
reporting date (Passed Household, Video Drop, Digital Customer, Internet Customer and Subscription, Phone Customer and
Subscription).

Project: (Engineering to EDW)


The ultimate goal of this endeavor is to enable reporting across engineering and billing systems. Availability of engineering
data in an environment that can easily be accessed and used in conjunction with the data from
CBS will provide:
• Investigation into the differences between what is being provisioned and what is being paid for.
• Insight into the channels being provisioned for different packages/products within CBS.
• Analysis of the use of hardware vs. hardware that is rented or purchased.
The need for this data has already arisen: one-time pulls were done and loaded into staging tables in RDS. A
repeatable, scheduled process was desired to load data into a system designed for growth that could be used by BI and
financial controls alike. While the initial focus was on data from the cable DACs, the design also allows for the addition of
internet and phone.

Responsibilities:
• Conducted various meetings with different teams to gain knowledge of the business logic.
• Identified bugs in the logic and reported them to the business with possible solutions and implementation approaches.
• Prepared source-to-target mapping documents with detailed transformation logic for each field and the related
join tables.
• Developed complex mappings for various targets, tested them, and promoted them to QA without a single defect.
• Extracted data from different sources such as Oracle, Oracle E-Business Suite (EBS), flat files, and Mainframe.
• Extensively used the Debugger to debug mappings and to find data discrepancies coming from the
source or arising from business rules being incorrectly applied.
• Used most of the transformations, such as Aggregator, Filter, Router, Sequence Generator, Update
Strategy, Rank, Expression, and Lookups (connected and unconnected), to implement the business logic using Power
Center.
• Created mapping specification documents following Informatica best practices.
• Created and executed MOPs in development and promoted the code to Testing, Lab, and Prod.

Environment: Informatica 9.5.1 (Designer, Workflow Manager, Workflow Monitor, Repository Manager),
Oracle 11g, Toad for Oracle, UNIX.

Truven Health Analytics (formerly the Healthcare business of Thomson Reuters), IL August 2012 to January 2013
Senior ETL Developer

Project: CRM View Project


Truven Health Analytics delivers unbiased information, analytic tools, benchmarks, and services to the healthcare industry.
Hospitals, government agencies, employers, health plans, clinicians, and pharmaceutical and medical device companies have
relied on services of Truven Health for more than 30 years.
The CRM View Project matches client data against data in the national file on demographic criteria for campaigns; the
national file contains 385 columns.

Responsibilities:
• Developed PL/SQL procedures, cursors, functions, and packages for cleaning addresses, matching persons,
updating person hoh_id, inserting new records, applying NCOA updates, and setting sensitive flags on transactions to find
matching criteria.
• Involved in Performance tuning of the Informatica mappings using various components like Parameter files,
Variables and Cache.
• Created UNIX shell scripts for scheduling various data cleansing scripts and the loading process; maintained
the batch processes using UNIX shell scripts.
• Performed small enhancements (data cleansing/data quality).
• Involved in defining source-to-target data mappings, business rules, and business and data definitions.

Environment: Informatica 9.1 (Designer, Workflow Manager, Workflow Monitor, Repository Manager), Oracle 10g, UNIX, Toad
9.7
Lehigh Valley Health Networks, PA March 2011 to July 2012
Senior ETL / Informatica Developer

Project:
Lehigh Valley Health Network is one of America's best hospitals, providing a comprehensive range
of inpatient, clinical, and diagnostic services in more than 40 areas of medical specialties and subspecialties. During this
project I worked closely with the data warehouse development team, customers, business analysts, and other colleagues
in the ITS department to analyze operational data sources, determine data availability, define the data warehouse schema, and
develop ETL processes for the creation, maintenance, administration, and overall support of the data warehouse.
Responsibilities:
• Designed and developed Informatica Mappings to load data from different Source systems to Data
Warehouse.
• Extensively used Mapping Variables, Mapping Parameters, and Parameter Files for capturing delta loads and
Populated Slowly Changing Dimensions using Informatica.
• Creation of Transformations like Lookup, Joiner, Rank and Source Qualifier Transformations in the Informatica
Designer.
• Created complex mappings using Unconnected Lookup, Sorter, Aggregator, dynamic Lookup,
and Router transformations to populate target tables efficiently.
• Created Mapplet and used them in different Mappings.
• Provided Knowledge Transfer to the end users and created extensive documentation on the design,
development, implementation, daily loads and process flow of the mappings.
• Worked with session logs, Informatica Debugger, and Performance logs for error handling.

Environment: Informatica Power Center 9.1 (Workflow Manager, Workflow Monitor, Warehouse Designer, Source Analyzer,
Transformation developer, Mapplet Designer, Mapping Designer, Repository manager), PL/SQL, SQL Server, Unix.

DilsonsTek Consultancy Services, Toronto, ON March 2009 to February 2011


Consultant
Project: CIS: Confidential, a global IT services company, delivers end-to-end IT solutions and services worldwide. The
objective of project CIS (Customer Information System) is to create a data mart for the analysis of sales information of a client
company. This project involved designing and creating a data mart to fulfill their business study and analysis requirements.
Project: Sales Data Mart: Mac's is a chain of stores in North America with more than 5,000 stores. The purpose of
this project is to develop a sales data mart to extract, transform, and load sales data from all regions across Canada into the data
warehouse. Management uses this data for analysis and decision-making for profitable and smooth
operations. ODI was used as the ELT tool to load the data.

Responsibilities:
 Worked with business analysts, developers, and management professionals from various locations to gather
requirements.
 Translated business requirements into technical specifications to build the Data Warehouse.
 Developed mappings using Informatica and Oracle Data Integrator that catered to the extraction, transformation,
and loading from various source systems to target systems.
 Used Informatica to handle complex mappings, extensively using transformations such as Source
Qualifier, Aggregator, Lookup, Filter, Update Strategy, Expression, Sequence Generator, and Sorter.
 Created numerous Interfaces to load data from Flat files, CSV files and Oracle tables into staging tables and then into
respective FACT/DIMENSION/Look-up Tables in Data Warehouse.
 Loaded data from different source systems into the target warehouse using interfaces with knowledge modules
(LKM, IKM, and CKM) for data-quality checks.
 Used the ODI built-in scheduler to schedule Scenarios/Load Plans in production.
 Participated in all phases of project development, testing, deployment and support.
 Developed, implemented and enforced ETL best practices standards.
Environment: Informatica PowerCenter 8.6, Oracle Data Integrator 10g, Oracle 10g, SQL Server, TOAD, SQL Developer, UNIX
EDUCATION
 Master of Science in Electrical Engineering (Electronics) – 04/1999
 Bachelor of Science in Electronics & Telecommunication Engineering – 06/1990
