
Sai Ashokrao Kokadwar
Flat B-802, Elite Homes,
Near Akshara International School, Tathwade,
Pune-411057, Maharashtra, India
Email: saikok87@gmail.com
Cell: +91-8149055560 / 8668783943 / 9029794560

Profile:
5 years of IT experience spanning Hadoop, Core Java, Solr and Mainframe technologies.
Strong technical and architectural knowledge of Big Data and the Hadoop ecosystem.
Hands-on knowledge of the Hadoop technology stack.
Good knowledge of the NoSQL database HBase.
Good time management and communication skills.

Skill Set:

Big Data Skills:
Tools: Hadoop, Spark, Hive, MapReduce, Sqoop, HDFS, HBase, ZooKeeper, Oozie, Solr, MongoDB,
Apache Rumen, Pig
Technologies: Big Data (Hadoop), Solr, Spring Framework, RESTEasy
Operating System: Linux
Domain: Healthcare, Credit Cards
Languages: HQL, Core Java, Scala, Linux shell scripting and basic Python

Mainframe Skills:
Tools: Application Performance Analyzer (APA), MainView, IBM DB2 Tools
Technologies: JCL, COBOL, DB2, IMS DB/DC, VSAM
Domain: Healthcare

Education:
Bachelor of Engineering (Computer Engineering), 2011, Mumbai University, India. Graduated with
67.02% (First Class with Distinction).

Professional Experience:
Project 3 - Technology Analyst (Hadoop Developer), Infosys Ltd

Project Name : Credit Risk 2020 - Big Data - Risk Processes
Client : American Express, USA
Organization : Infosys Ltd.
Tools/Technologies : Hive, HBase, MapReduce, Oozie, Solr, SVN, Apache Maven, Git, Spring Framework,
RESTEasy
Duration : Oct 2015 to date
Role : Offshore Developer

Description : The Credit Risk 2020 project is envisioned to provide a new set of risk management
capabilities. It supports computation over big data from numerous enterprise systems of record to create new
information. Data generated by the application is used primarily in authorizations, new accounts and line
management processes. This accelerates time to market, develops new business opportunities through
data mining and modernizes capabilities to allow processing of large volumes of data. All capabilities are
developed on a Big Data platform to provide a more scalable and cost-effective back-end data processing
infrastructure.
Worked with HBase using the Java API and Oozie for batch processing.
Wrote and ran Hive queries and MapReduce jobs.
Connected to Oozie through its Java API.
Worked with HDFS files using the Java API.
Used the Solr Java API to read data from HBase and index it into Solr.
Indexed HDFS data into Solr.
Used Apache Rumen to convert log files into a readable format.
Created REST services for invoking big data processes.
Committed code changes through Git.
Spark (Scala): writing multiple output files.
Spark (Scala): HBase bulk load, delete and extract.
Spark (Scala): HDFS purge based on an input number of days.
Spark (Scala): basic SQL and Hive operations (see the sketch after this list).
Spark (Scala): basic RDD operations.
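
A minimal Scala sketch of the kind of Spark jobs listed above (a basic Hive/SQL operation plus an HDFS purge driven by an input number of days). It is illustrative only, not the project code; the database, table and directory names are hypothetical.

import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.spark.sql.SparkSession

object RiskBatchSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("risk-batch-sketch")
      .enableHiveSupport() // lets spark.sql() run against Hive tables
      .getOrCreate()

    // Basic Hive/SQL operation: aggregate a (hypothetical) Hive table and
    // write the result back as another Hive table.
    val summary = spark.sql(
      "SELECT account_id, SUM(exposure) AS total_exposure " +
      "FROM risk_db.daily_exposure GROUP BY account_id")
    summary.write.mode("overwrite").saveAsTable("risk_db.exposure_summary")

    // HDFS purge based on an input number of days (first program argument):
    // delete anything under the staging directory older than the cutoff.
    val retentionDays = args(0).toInt
    val cutoff = System.currentTimeMillis() - retentionDays * 24L * 60 * 60 * 1000
    val fs = FileSystem.get(spark.sparkContext.hadoopConfiguration)
    fs.listStatus(new Path("/data/risk/staging")) // hypothetical directory
      .filter(_.getModificationTime < cutoff)
      .foreach(status => fs.delete(status.getPath, true)) // recursive delete

    spark.stop()
  }
}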

Project 2 - System Engineer (Hadoop Developer), IBM India Pvt. Ltd

Anthem Inc., a client of IBM U.S., is one of the largest healthcare insurance providers in the U.S. Northeast
region. Anthem is a leader in providing health insurance coverage to customers in the CT, ME and NH states
and to nationwide accounts. It offers health insurance services through different products and riders that are
customized for both individual and corporate needs.

Imported data using Sqoop to load data from MySQL, IMS DB and DB2 into HDFS and Hive on a regular basis.
Wrote Hive queries for data analysis to meet business requirements.
Created Hive tables, and loaded and analyzed data using Hive queries (see the sketch after this list).
Created and partitioned tables using Hive.
Exported data from HDFS to MySQL using the data migration tool Sqoop.
Wrote and ran Pig scripts.
Wrote and ran MapReduce programs.

Knowledge of Amazon Elastic MapReduce.
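
A minimal sketch of the Hive table creation, partitioning and load pattern referenced above. The HiveQL is representative and the database, table and path names are hypothetical; it is submitted here through Spark SQL in Scala to keep all the sketches in one language, but the same statements run unchanged in the Hive CLI.

import org.apache.spark.sql.SparkSession

object ClaimsHiveLoadSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("claims-hive-load-sketch")
      .enableHiveSupport()
      .getOrCreate()

    spark.sql("CREATE DATABASE IF NOT EXISTS claims_db")

    // Create a partitioned Hive table for claim records (hypothetical schema).
    spark.sql(
      """CREATE TABLE IF NOT EXISTS claims_db.member_claims (
        |  claim_id STRING,
        |  member_id STRING,
        |  claim_amount DOUBLE)
        |PARTITIONED BY (claim_month STRING)
        |ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
        |STORED AS TEXTFILE""".stripMargin)

    // Load a Sqoop-imported file from HDFS into one static partition.
    spark.sql(
      """LOAD DATA INPATH '/data/claims/2015-06/part-m-00000'
        |OVERWRITE INTO TABLE claims_db.member_claims
        |PARTITION (claim_month = '2015-06')""".stripMargin)

    // Analyze the loaded partition with a Hive query.
    spark.sql(
      """SELECT member_id, SUM(claim_amount) AS total_claimed
        |FROM claims_db.member_claims
        |WHERE claim_month = '2015-06'
        |GROUP BY member_id""".stripMargin).show()

    spark.stop()
  }
}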

POCs:
Read CSV and TSV file formats and loaded the data into Hive.
Worked on Hive joins.
Worked on Twitter data analysis.
Worked on Hive SerDe.
Worked on the Yelp Dataset Challenge.
Worked on RCFile format data analysis.
Worked on Hive regex.
Worked on Hive partitions.
Worked on Oozie workflows.
Worked on HBase and Hive integration.
Worked on data migration into HBase and querying through Hive (see the sketch after this list).
Knowledge of Hadoop cluster troubleshooting.
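
A minimal Scala sketch of the HBase side of the migration and integration POCs above, using the standard HBase client API; the table, column family and row key names are hypothetical.

import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.hadoop.hbase.client.{ConnectionFactory, Get, Put}
import org.apache.hadoop.hbase.util.Bytes

object HBaseMigrationSketch {
  def main(args: Array[String]): Unit = {
    // Connection settings (ZooKeeper quorum etc.) come from hbase-site.xml on the classpath.
    val connection = ConnectionFactory.createConnection(HBaseConfiguration.create())
    val table = connection.getTable(TableName.valueOf("member_claims")) // hypothetical table

    // Migrate one record: row key plus a single column in family "d".
    val put = new Put(Bytes.toBytes("member-0001"))
    put.addColumn(Bytes.toBytes("d"), Bytes.toBytes("claim_total"), Bytes.toBytes("1250.75"))
    table.put(put)

    // Read the row back to verify the migration; the same HBase table can then be
    // mapped into Hive via the HBase storage handler and queried with HiveQL.
    val result = table.get(new Get(Bytes.toBytes("member-0001")))
    val claimTotal = Bytes.toString(result.getValue(Bytes.toBytes("d"), Bytes.toBytes("claim_total")))
    println(s"claim_total = $claimTotal")

    table.close()
    connection.close()
  }
}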

Hardware/Software:
Hadoop, Cloudera, HDFS, Hive, Java (JDK 1.6), Linux

Project 1 - Associate System Engineer (Mainframe Developer), IBM India Pvt. Ltd
WellPoint Inc., a client of IBM U.S., is one of the largest healthcare insurance providers in the U.S. It
offers healthcare insurance services through different products and riders to CPF clients, customized for
both individual and corporate needs. Worked as an application developer at WellPoint, carrying out the
following responsibilities:

Development and maintenance of the Claims application (WellPoint CT).
Worked on improving system performance.
Worked on abend reduction and provided permanent fixes.
Analyzed system issues and designed fixes to correct them.

Environment: z/OS 390, IMS DB/DC, DB2, COBOL, JCL, VSAM

Achievements:
AHM 250 Certified (Academy for Healthcare Management).
Appreciated by the client on many occasions.

Training:
Received one month of classroom training on Big Data technologies.
