
Hadoop Developer Resume Example



Jessica Claire
  • Montgomery Street, San Francisco, CA 94105
  • H: (555) 432-1000
  • resumesample@example.com
Summary
Cloudera-, IBM-, and MapR-certified Hadoop professional with around four years of big data development experience. Expertise in big data technologies: Hadoop, Hive, Oozie, Sqoop, and other Hadoop ecosystem components. Experienced in Hadoop big data installation and development. Expertise in object-oriented analysis and design, core Java development, enterprise application development, and web service development.
Highlights
  • Languages: Java, J2EE, Unix
  • Big Data: Hadoop MapReduce, Hive, Sqoop, HBase, Pig, Oozie, Cassandra
  • Big Data Environments: Cloudera, IBM BigInsights, Hortonworks, Pivotal HD
  • ETL Tools: Pentaho
  • Databases: MySQL, Oracle, Teradata, Netezza, DB2, Greenplum
  • Web Technologies: JSP, HTML, JavaScript, Web Services
  • Frameworks: Spring MVC, Hibernate
  • Web Services: RESTful
  • Version Control: Git, SVN, GitLab
  • Build/Code Review Tools: Ant, Ivy, Jenkins, Gerrit
  • Servers: Tomcat
  • IDEs: Eclipse, DbVisualizer
  • Methodology: Waterfall, Agile
Experience
Hadoop Developer
Bank of America Corporation
  • Worked as a developer on a Fleet Management proof of concept (POC).
  • The main objective of the POC was to provide software support for fleet analysis and management.
  • The system tracks and diagnoses vehicles based on the signals received from them and produces real-time alerts and graphical reports.
  • Cassandra was used as the database.
  • The report generation module was designed using Pentaho.
  • Developed the solution and loaded/retrieved data into the Cassandra database (see the sketch following this list).
  • Reported on the analysed data using Pentaho Reporting.
  • Integrated with the Twitter and Gmail APIs for real-time alerts.
  • Responsibilities: requirement analysis and preparing requirement specification documents, identifying use cases, developing Java programs to meet the requirements of the use cases, and deploying and testing the application.
  • Technologies Used: Java, J2EE, Cassandra, Pentaho.
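A minimal sketch of the kind of CQL used to load and retrieve vehicle signal data in Cassandra; the keyspace, table, and column names are illustrative assumptions, not details from the project:

    # Hypothetical schema and data; all names here are assumptions for illustration.
    cqlsh -e "
      CREATE KEYSPACE IF NOT EXISTS fleet
        WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1};
      CREATE TABLE IF NOT EXISTS fleet.vehicle_signals (
        vehicle_id  text,
        signal_ts   timestamp,
        speed_kmph  double,
        engine_temp double,
        PRIMARY KEY (vehicle_id, signal_ts)
      );
      INSERT INTO fleet.vehicle_signals (vehicle_id, signal_ts, speed_kmph, engine_temp)
      VALUES ('VH-001', '2013-05-01 10:15:00', 82.5, 96.0);
      SELECT * FROM fleet.vehicle_signals WHERE vehicle_id = 'VH-001' LIMIT 10;"

Partitioning by vehicle_id with signal_ts as a clustering column keeps each vehicle's readings together and ordered by time, which suits per-vehicle alerting and reporting queries.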
Lead Hadoop Developer, 06/2015 to Present
Bank of America Corporation, Carrollton, TX
  • Currently working as a Hadoop developer and onsite coordinator for the Walmart Stores enterprise information management data integration project, with the objective of analysing and processing various external and internal data and loading them into a data fabric so that customers and internal teams can access the data for analysis and reporting.
  • Develop shell scripts, Hive queries, and Greenplum stored procedures, and schedule the entire flow using the mainframe and the CA7 scheduler (see the sketch following this list).
  • Responsibilities: requirement analysis, framework design and development, testing, and rollout to production.
  • Data validation and quality checks, framework setup, onsite coordination, and discussions with clients on business requirements.
  • Technologies Used: Hive, Sqoop, Greenplum, Teradata, DB2, shell scripting, CA7.
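A minimal sketch of a shell step of the kind scheduled through CA7 here, combining a Hive load with a basic row-count validation; the database and table names are assumptions for illustration:

    #!/bin/sh
    # Load one day's partition and fail the scheduled job if nothing landed.
    # Database/table names (staging.sales_raw, edw.daily_sales) are assumptions.
    set -e
    LOAD_DATE="$1"

    # Run the Hive transformation for the day's partition.
    hive -e "
      INSERT OVERWRITE TABLE edw.daily_sales PARTITION (load_date='${LOAD_DATE}')
      SELECT store_id, item_id, SUM(amount)
      FROM   staging.sales_raw
      WHERE  load_date = '${LOAD_DATE}'
      GROUP  BY store_id, item_id;"

    # Basic data-quality check: a nonzero exit makes CA7 flag the job.
    ROWS=$(hive -S -e "SELECT COUNT(*) FROM edw.daily_sales WHERE load_date='${LOAD_DATE}';")
    if [ "${ROWS}" -eq 0 ]; then
      echo "Validation failed: no rows loaded for ${LOAD_DATE}" >&2
      exit 1
    fi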
Lead Hadoop Developer, 08/2014 to 06/2015
Bank of America Corporation, India
  • Project: Online Campaign Rating (OCR). A Nielsen Online Campaign Rating Hadoop migration project with the objective of migrating the existing OCR frameworks and business logic, developed in Netezza, into Hadoop.
  • Converted existing Netezza queries to Hive/Impala, loaded data using Sqoop into Oracle/HDFS, orchestrated the entire flow using Oozie workflows, and developed shell scripts wherever needed (see the sketch following this list).
  • Responsibilities: requirement analysis, framework design and development, testing, and rollout to production.
  • Data validation and quality checks, framework setup.
  • Technologies Used: MapReduce, Hive, Oozie, Sqoop, Impala, Netezza, Oracle, Java, shell scripting.
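A minimal sketch of the Sqoop ingestion and Oozie submission such a migration typically involves; the hostnames, credentials, table names, and paths are illustrative assumptions:

    # Pull one Netezza table into HDFS; connection details are assumptions.
    sqoop import \
      --connect jdbc:netezza://nz-host:5480/ocr_db \
      --username etl_user -P \
      --table CAMPAIGN_RATINGS \
      --target-dir /data/ocr/campaign_ratings \
      --num-mappers 4

    # Kick off the Oozie workflow that chains the Hive/Impala steps.
    oozie job -oozie http://oozie-host:11000/oozie \
      -config ocr_migration/job.properties -run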
Hadoop Developer, 02/2014 to 08/2014
Cognizant Technology Solutions, India
  • Project: GBI, iCloud Data Migration. A data migration project to migrate existing data (150 TB), populated by different customer applications, from Teradata to the Hadoop ecosystem (Hive).
  • Extracted data from Teradata as flat files and loaded it into the corresponding Hive tables (see the sketch following this list).
  • Developed frameworks in shell scripts.
  • Converted Teradata queries into Hive queries.
  • Responsibilities: requirement analysis, development, testing, and production rollout.
  • Data validation and quality checks, framework setup.
  • Technologies Used: Hive, Teradata, TPT scripts, shell scripts.
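A minimal sketch of the extract-and-load flow described above, assuming a TPT job script that exports delimited flat files; the file paths and table names are illustrative assumptions:

    # Export a Teradata table to flat files with a TPT job script (tbuild),
    # then load the files into a Hive table. Paths and names are assumptions.
    tbuild -f export_orders.tpt          # TPT Export operator writes delimited files

    # Stage the extracted files into HDFS.
    hadoop fs -put /staging/orders/*.dat /data/landing/orders/

    # LOAD DATA INPATH moves the HDFS files under the Hive table's location.
    hive -e "
      LOAD DATA INPATH '/data/landing/orders/'
      OVERWRITE INTO TABLE gbi.orders;"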
Hadoop Developer, 08/2013 to 01/2014
TCS BigData CoE, India
Developer, 03/2012 to 07/2013
TCS BigData CoE, India
  • Developer for one of the TCS big data products: TCS BigData Desktop provides a complete user interface for Hadoop.
  • It supports multiple distributions and multiple clusters of Hadoop.
  • The product provides end-to-end metadata management for an enterprise.
  • It also allows importing metadata from existing systems other than Hadoop.
  • Designed and developed application features related to Hive, Impala, user access, cluster details and support, and the LDAP and Kerberos security implementation (see the sketch following this list).
  • Worked on the execution automator; designed and developed web pages.
  • Cluster deployment, rolling out patches for version updates, and remote client environment deployment.
  • Responsibilities: requirements gathering, system analysis, impact analysis, high-level design preparation, low-level design preparation, identifying unit test cases, preparing unit test plans, implementation, unit testing, preparing unit test results, code reviews, and integration.
  • Technologies Used: MapReduce, Hive, HBase, Oozie, Sqoop, Pig, Impala, Java, J2EE, Spring, Hibernate, RESTful web services, Unix, shell scripting, Git, Gerrit, Jenkins, MySQL.
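A minimal sketch of the Kerberos-secured Hive access that such a security implementation enables; the keytab path, principals, and hostname are illustrative assumptions:

    # Authenticate against the KDC with a service keytab (names are assumptions).
    kinit -kt /etc/security/keytabs/bduser.keytab bduser@EXAMPLE.COM

    # Connect to a Kerberized HiveServer2 instance via Beeline; the JDBC URL
    # carries the HiveServer2 service principal.
    beeline -u "jdbc:hive2://hive-host:10000/default;principal=hive/_HOST@EXAMPLE.COM" \
      -e "SHOW TABLES;"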
Accomplishments
  • Cloudera Certified Developer for Hadoop (CCDH)
  • MapR M5 for Administrators
  • IBM InfoSphere BigInsights Technical Professional V2
Education
Bachelor of Technology: Electronics and Communication Engineering
Mahatma Gandhi University - Kottayam
Skills
Big data, CA7, client coordination, data integration, data migration, data validation, databases, DB2, enterprise applications, execution automator, J2EE, Java, LDAP, mainframe, MySQL, OCR, Oracle, quality checks, real-time alerting, reporting, requirements gathering, requirement specifications, shell scripting, software support, system analysis, Teradata, Unix, user access, user interfaces, web pages, workflow
