
Avinash Pulapaka

Hadoop Admin/Developer
Email: avi.hadoop@gmail.com
Ph: 408-658-0338

Professional Summary

7+ years of overall experience in Systems Administration and Enterprise Application Development across diverse industries, including hands-on experience with Big Data ecosystem technologies.

2 years of comprehensive experience as a Big Data & Analytics Administrator.


Experience in developing MapReduce programs using Apache Hadoop to work with Big Data.

Experience in the installation, configuration, support and monitoring of Hadoop clusters using the Apache and Cloudera distributions and AWS.


Experience in using Pig, Hive, Sqoop, HBase and Cloudera Manager.

Experience in importing and exporting data with Sqoop between HDFS and relational database systems.

Hands-on experience in application development using Java, RDBMS, and Linux shell scripting.

Extending Hive and Pig core functionality by writing custom UDFs.

Experience in analyzing data using HiveQL, Pig Latin, and custom MapReduce programs in Java.
Familiar with Java virtual machine (JVM) and multi-threaded processing.

Worked on NoSQL databases including HBase, Cassandra and MongoDB.

Knowledge of job workflow scheduling and monitoring tools such as Oozie and ZooKeeper.

Experience in designing, developing and implementing connectivity products that allow efficient exchange of data between our core database engine and the Hadoop ecosystem.

Experience as a Java Developer in Web/intranet and client/server technologies using Java, J2EE, Servlets, JSP, JSF, EJB, JDBC and SQL.

Good understanding of XML methodologies (XML, XSL, XSD) including Web Services and SOAP.

Familiar with data warehousing and ETL tools like Informatica and Pentaho.

Techno-functional responsibilities include interfacing with users, identifying functional and technical gaps, estimates, designing custom solutions, development, leading developers, producing documentation, and production support.

Excellent interpersonal and communication skills; creative, research-minded, technically competent and result-oriented with problem-solving and leadership skills.

Technical Skills

Hadoop/Big Data: HDFS, MapReduce, HBase, Pig, Hive, Sqoop, Flume, MongoDB, Cassandra, PowerPivot, Puppet, Oozie, ZooKeeper
Java & J2EE Technologies: Core Java, Servlets, JSP, JDBC, JNDI, Java Beans
IDEs: Eclipse, NetBeans
Big Data Analytics: Datameer 2.0.5
Frameworks: MVC, Struts, Hibernate, Spring
Programming Languages: C, C++, Java, Python, Ant scripts, Linux shell scripts
Databases: Oracle 11g/10g/9i, MySQL, DB2, MS SQL Server
Web Servers: WebLogic, WebSphere, Apache Tomcat
Web Technologies: HTML, XML, JavaScript, AJAX, SOAP, WSDL
Network Protocols: TCP/IP, UDP, HTTP, DNS, DHCP
ETL Tools: Informatica, Pentaho
Testing: WinRunner, LoadRunner, QTP

Work Experience
HP (Big Data Services), Roseville, CA
Jan 2012 - Till Date
Hadoop Engineer
The Hewlett-Packard Company, or HP, is an American multinational information technology company that provides products, technologies, software, solutions and services to consumers, small and medium-sized businesses (SMBs) and large enterprises, including customers in the government, health and education sectors.
Environment: Hadoop, MapReduce, HDFS, Hive, Java, SQL, Cloudera Manager, Pig,
Sqoop, Oozie
Responsibilities:

Responsible for building scalable distributed data solutions using Hadoop.

Responsible for cluster maintenance, adding and removing cluster nodes, cluster monitoring and troubleshooting, managing and reviewing data backups, and managing and reviewing Hadoop log files.

Worked hands-on with the ETL process.

Upgraded the Hadoop cluster from CDH3 to CDH4 and set up a High Availability cluster.
Integrated Hive with existing applications.


Configured Ethernet bonding on all nodes to double the network bandwidth.

Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from Teradata into HDFS using Sqoop.

Analyzed the data by performing Hive queries and running Pig scripts to understand user behavior.

Continuously monitored and managed the Hadoop cluster through Cloudera Manager.

Installed Oozie workflow engine to run multiple Hive and Pig jobs.

Developed Hive queries to process the data and generate data cubes for visualization (see the illustrative sketch below).
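For illustration only, a minimal sketch of how Hive queries like those above could be submitted programmatically through the HiveServer2 JDBC driver; the host name, database, table, and column names here are hypothetical, not the actual cluster or schema.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    // Hypothetical sketch: submit a HiveQL aggregation over JDBC (HiveServer2).
    public class HiveCubeQuery {
        public static void main(String[] args) throws Exception {
            // Register the HiveServer2 JDBC driver.
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            try (Connection conn = DriverManager.getConnection(
                     "jdbc:hive2://namenode-host:10000/default", "hive", "");
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery(
                     "SELECT region, COUNT(*) AS events FROM user_activity GROUP BY region")) {
                while (rs.next()) {
                    System.out.println(rs.getString("region") + "\t" + rs.getLong("events"));
                }
            }
        }
    }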

Pacific Gas and Electric Company (PG&E), San Francisco, CA
Nov 2010 - Dec 2011
Hadoop Admin/Developer
The Pacific Gas and Electric Company, commonly known as PG&E, is the investor-owned utility that provides natural gas and electricity to most of the northern two-thirds of California, from Bakersfield almost to the Oregon border. It is the leading
subsidiary of the PG&E Corporation.
Environment: Hadoop, MapReduce, HDFS, Hive, Oracle 11g, Java, Struts, Servlets,
HTML, XML, SQL, J2EE, JUnit, Tomcat 6.
Responsibilities:

Installed and configured Hadoop MapReduce and HDFS, and developed multiple MapReduce jobs in Java for data cleansing and preprocessing (see the illustrative mapper sketch after this list).

Importing and exporting data into HDFS and Hive using Sqoop.
Involved in defining job flows, managing and reviewing log files.
Extracted files from CouchDB through Sqoop, placed them in HDFS, and processed them.
Loaded and transformed large sets of structured, semi-structured and unstructured data.
Responsible for managing data coming from different sources.
Supported MapReduce programs running on the cluster.
Involved in loading data from the UNIX file system to HDFS.
Installed and configured Hive and wrote Hive UDFs.
Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.


Gained very good business knowledge of health insurance, claim processing, fraud suspect identification, the appeals process, etc.
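A minimal sketch, for illustration, of the kind of data-cleansing mapper described above; the delimiter, expected field count, and class name are assumptions, not the actual job.

    import java.io.IOException;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    // Hypothetical cleansing mapper: drops malformed records and trims each field.
    public class RecordCleansingMapper
            extends Mapper<LongWritable, Text, NullWritable, Text> {

        private static final int EXPECTED_FIELDS = 12; // assumed schema width

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split(",", -1);
            if (fields.length != EXPECTED_FIELDS) {
                return; // skip malformed record
            }
            StringBuilder cleaned = new StringBuilder();
            for (int i = 0; i < fields.length; i++) {
                if (i > 0) {
                    cleaned.append(',');
                }
                cleaned.append(fields[i].trim());
            }
            context.write(NullWritable.get(), new Text(cleaned.toString()));
        }
    }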

Centers for Medicare & Medicaid Services (CMS), MD
Sep 2009 - Oct 2010
Java/J2EE Developer
The Centers for Medicare & Medicaid Services (CMS) supports the Retiree Drug Subsidy program, a financial incentive offered to health insurance plan sponsors that encourages continued drug coverage for retirees and other plan members who are Medicare eligible.
Environment: Java 1.4, Struts, JSP, Servlets API, HTML, JDBC, WebSphere 5.1, MQ Series, MS SQL Server, XSLT, XML, EJB, EditPlus, JUnit, CSS, JMS, Hibernate, Eclipse, and WSAD
Responsibilities:

Responsible for the design and development of the framework. The system is designed using J2EE technologies based on MVC architecture.


Developed Session Beans using J2EE Design Patterns.
Implemented J2EE design patterns such as Data Access Object and Business Object, and Java design patterns such as Singleton (see the illustrative sketch after this list).

Extensively used MQ Series.
Made extensive use of the Struts framework.
Used JSP, Servlets, and EJBs on the server side.

Implemented the Home interface, Remote interface, and Bean implementation class.

Implemented business logic on the server side using Session Beans.

Wrote PL/SQL queries to access data from Oracle database.

Set up WebSphere Application Server and used Ant to build and deploy the application in WebSphere.

Developed the application using WSAD.

Prepared test plans and wrote test cases.

Worked on Hibernate.
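As an illustration of the Singleton pattern referenced above, a minimal thread-safe sketch; the class name and its purpose are hypothetical, not taken from the actual application.

    // Hypothetical Singleton sketch using double-checked locking.
    public final class ServiceLocator {
        private static volatile ServiceLocator instance;

        private ServiceLocator() {
            // private constructor prevents outside instantiation
        }

        public static ServiceLocator getInstance() {
            if (instance == null) {
                synchronized (ServiceLocator.class) {
                    if (instance == null) {
                        instance = new ServiceLocator();
                    }
                }
            }
            return instance;
        }
    }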

First Citizens Bank, Raleigh, NC
Dec 2007 - Sep 2009
Application Developer (J2EE)
First Citizens Bank is one of the largest banking institutions in the world. The bank offers various financial and banking services to its customers. Worked on an application named Access Portal, part of online banking, which allows a customer to view a quick summary of transactions and account details. It also shows mutual funds associated with the account.

Environment: Java, WebSphere 3.5, EJB, Servlets, JavaScript, JDBC, SQL, JUnit, Eclipse IDE, Apache Tomcat 6
Responsibilities:

Developed JavaScript behavior code for user interaction.

Created a database program in SQL Server to manipulate data accumulated by internet transactions.

Wrote Servlet classes to generate dynamic HTML pages.

Developed Servlets and back-end Java classes using WebSphere Application Server.

Developed an API to write XML documents from a database.

Performed unit testing for the application using JUnit.

Maintained a Java GUI application using JFC/Swing.

Created complex SQL and used JDBC connectivity to access the database (see the illustrative JDBC sketch after this list).

Involved in the design and coding of the data capture templates, presentation and
component templates.

Part of the team that designed, customized and implemented metadata search and database synchronization.


Used Oracle as the database and Toad for query execution, and was involved in writing SQL scripts and PL/SQL code for procedures and functions.
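For illustration, a minimal JDBC sketch of the kind of database access described above; the connection URL, credentials, and table and column names are hypothetical.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    // Hypothetical DAO sketch: fetch recent transactions for an account over JDBC.
    public class TransactionDao {
        public void printRecentTransactions(long accountId) throws Exception {
            try (Connection conn = DriverManager.getConnection(
                     "jdbc:oracle:thin:@dbhost:1521:ORCL", "app_user", "app_pass");
                 PreparedStatement ps = conn.prepareStatement(
                     "SELECT txn_date, amount, description FROM transactions "
                         + "WHERE account_id = ? ORDER BY txn_date DESC")) {
                ps.setLong(1, accountId);
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        System.out.println(rs.getDate("txn_date") + " "
                            + rs.getBigDecimal("amount") + " "
                            + rs.getString("description"));
                    }
                }
            }
        }
    }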

Wachovia Bank, NC
Sep 2006 - Oct 2007
Java Developer
Wachovia provided a broad range of retail banking and brokerage, asset and wealth
management, corporate and investment banking products and services to customers,
along with nationwide retail brokerage, mortgage lending, and auto finance
businesses.
Environment: Oracle 11g, Java 1.5, Struts, Servlets, HTML, XML, SQL, J2EE, JUnit,
Tomcat 6.
Responsibilities:

Responsible for and active in the analysis, design, implementation and deployment of the full Software Development Lifecycle (SDLC) of the project.


Designed and developed user interface using JSP, HTML and JavaScript.
Developed Struts action classes and action forms, performed action mapping using the Struts framework, and performed data validation in form beans and action classes (see the illustrative action sketch after this list).

Extensively used the Struts framework as the controller to handle subsequent client requests and invoke the model based upon user requests.


Defined the search criteria and pulled the customer's record from the database, made the required changes, and saved the updated record back to the database.
Validated the fields of the user registration and login screens by writing JavaScript validations.
Developed build and deployment scripts using Apache Ant to customize WAR and EAR files.
Used DAO and JDBC for database access.
Developed stored procedures and triggers using PL/SQL to calculate and update tables implementing business logic.


Designed and developed XML processing components for dynamic menus in the application.
Involved in post-production support and maintenance of the application.
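A minimal sketch, for illustration, of a Struts 1 action of the kind described above; the class name, form bean, and forward name are assumptions rather than the actual application code.

    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import org.apache.struts.action.Action;
    import org.apache.struts.action.ActionForm;
    import org.apache.struts.action.ActionForward;
    import org.apache.struts.action.ActionMapping;

    // Hypothetical Struts 1 action for a customer search flow.
    public class CustomerSearchAction extends Action {
        @Override
        public ActionForward execute(ActionMapping mapping, ActionForm form,
                                     HttpServletRequest request,
                                     HttpServletResponse response) throws Exception {
            // Pull search criteria from the form bean, look the customer up,
            // and forward to the result view configured in struts-config.xml.
            request.setAttribute("customer", "placeholder result");
            return mapping.findForward("success");
        }
    }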

Atrenta India Pvt. Ltd., India
Oct 2005 - Aug 2006
Junior Java Developer
Atrenta is one of the leading suppliers of innovative products addressing early-stage Integrated Circuit and system design, aimed at improving design efficiency for the world's leading semiconductor and consumer electronics companies.
Environment: Java, JSP, Servlets, JDBC, JavaScript, MySQL, JUnit, Eclipse IDE.
Responsibilities:

Involved in the analysis, design, implementation, and testing of the project.


Implemented the presentation layer with HTML, XHTML and JavaScript.
Developed web components using JSP, Servlets and JDBC (see the illustrative servlet sketch after this list).
Implemented database using SQL Server.
Designed tables and indexes.
Wrote complex SQL and stored procedures.
Involved in fixing bugs and unit testing with test cases using JUnit.
Developed user and technical documentation.
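For illustration, a minimal servlet sketch of the kind of web component mentioned above; the class name and the page it renders are hypothetical.

    import java.io.IOException;
    import java.io.PrintWriter;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Hypothetical servlet returning a dynamic HTML page.
    public class DesignStatusServlet extends HttpServlet {
        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            resp.setContentType("text/html");
            PrintWriter out = resp.getWriter();
            out.println("<html><body><h2>Design run status</h2></body></html>");
        }
    }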

References:

Provided upon request
