Hadoop Admin Resume Example

Resume Score: 80%

HADOOP ADMIN
Summary
  • 6+ years of experience operating, developing, maintaining, monitoring, and upgrading Hadoop clusters (Cloudera and Hortonworks distributions).
  • Hands-on experience installing, configuring, and maintaining Apache Hadoop clusters for application development, along with Hadoop tools such as Hive, Pig, Spark, YARN, Flume, Kafka, Impala, ZooKeeper, Hue, and Sqoop, on both Cloudera and Hortonworks.
  • Responsible for capacity planning, infrastructure planning, and version selection when building Hadoop clusters.
  • Excellent expertise in and knowledge of cloud platforms and their components (IBM private/public cloud, Kubernetes, Docker).
  • Experienced in using HDFS, Pig, Hive, Spark, Impala, Sqoop, Oozie, ZooKeeper, and Cloudera Manager.
  • Working knowledge of Solr, NiFi, and Kafka.
  • Self-starter with the ability to learn new technologies quickly.
  • Good communication and documentation skills.

Enthusiastic individual with superior skills in working in both team-based and independent capacities. Brings a strong work ethic and excellent organizational skills to any setting. Excited to begin a new challenge with a successful team.

Skills
  • Access
  • Active Directory
  • Backup
  • Big data
  • CA-7
  • Catalog
  • CPU
  • Data Integration
  • Data Warehousing
  • Databases
  • Delivery
  • Designing
  • Developer
  • Disaster Recovery
  • Eclipse
  • Encryption
  • Enterprise
  • ETL
  • Express
  • Hardware
  • IDE
  • Informatica
  • JDBC
  • LDAP
  • Linux
  • Logging
  • Managing
  • Meetings
  • Memory
  • Migration
  • MySQL
  • Network
  • NFS
  • ODBC
  • Operating systems
  • Oracle
  • Quality
  • RDBMS
  • Real Time
  • Reporting
  • Requirements
  • SAP
  • SAS
  • SDLC
  • SQL
  • SQL Server
  • SSL
  • Strategy
  • Tableau
  • Tables
  • Teradata
  • Troubleshooting
  • Unix shell scripts
  • Upgrades
  • Version control
  • Written
  • Certificate in Linux Programming and Administration
Experience
09/2018 - Current | Company Name | City, State
Hadoop Admin
Responsibilities:
  • Involved in deploying a Hadoop cluster using Hortonworks Ambari (HDP 2.2), integrated with SiteScope for monitoring and alerting.
  • Launched and set up Hadoop clusters on physical servers, including configuring the different Hadoop components.
  • Created a local YUM repository for installing and updating packages.
  • Responsible for building a system that ingests terabytes of data per day into Hadoop from a variety of data sources, providing high storage efficiency and an optimized layout for analytics.
  • Developed data pipelines that ingest data from multiple sources and process it.
  • Used Sqoop to connect to Oracle, MySQL, SQL Server, and Teradata and move the pivoted data into Hive or HBase tables.
  • Implemented the Kerberos authentication infrastructure: KDC server setup, creating the realm/domain, managing principals, and generating and managing a keytab file for each service using keytab tools.
  • Worked on SAS migration to Hadoop for fraud analytics and provided predictive analysis.
  • Developed multiple MapReduce jobs in Java for data cleansing and preprocessing.
  • Configured Kerberos for authentication, Knox for perimeter security, and Ranger for granular access control in the cluster.
  • Configured and installed several Hadoop clusters, on both physical machines and the AWS cloud, for POCs.
  • Configured and deployed the Hive metastore using MySQL and the Thrift server.
  • Developed simple to complex MapReduce jobs using Hive and Pig.
  • Involved in creating Hive tables and loading and analyzing data using Hive queries.
  • Extensively used Sqoop to move data from relational databases to HDFS (see the sketch after this section).
  • Used Flume to move data from web logs onto HDFS.
  • Used Pig to apply transformations, validations, cleaning, and deduplication to raw data sources.
  • Integrated the Tidal and Control-M schedulers with the Hadoop clusters to schedule jobs and their dependencies on the cluster.
  • Worked closely with the continuous integration team to set up tools such as GitHub, Jenkins, and Nexus for scheduling automatic deployments of new or existing code.
  • Actively monitored a 320-node Hadoop cluster running Hortonworks HDP 2.4.
  • Performed various configurations, including networking and iptables, hostname resolution, user accounts and file permissions, HTTP, FTP, and SSH keyless login.
  • Performed a minor upgrade from HDP 2.2.2 to HDP 2.2.4.
  • Upgraded the Hadoop cluster from HDP 2.2 to HDP 2.4, and from HDP 2.4 to HDP 2.5.
  • Integrated the BI tool Tableau to run visualizations over the data.
  • Resolved hardware-related issues and assessed tickets on a daily basis.
  • Automated administration tasks through scripting and job scheduling with cron.
  • Provided 24x7 on-call support as part of a scheduled rotation with other team members.
  • Worked closely with team members to deliver project requirements, develop solutions and meet deadlines.
  • Increased system security and performance with proactive changes.

Environment: Hadoop HDFS, MapReduce, Hive, Pig, Oozie, Sqoop, Ambari, Storm, AWS S3, EC2, Identity Access Management, ZooKeeper, NiFi
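
The Sqoop bullets above describe moving RDBMS tables into Hive. As a minimal, hedged sketch of what such an import can look like, the command below uses hypothetical host, schema, table, and user names; only the flags themselves are standard Sqoop options.

    #!/usr/bin/env bash
    # Minimal sketch of a Sqoop import from an RDBMS into a Hive table.
    # The host, database, table, and user names are hypothetical placeholders.
    # --password-file reads the password from a protected HDFS file;
    # --split-by and --num-mappers control how the import is parallelized.
    sqoop import \
      --connect jdbc:mysql://db-host.example.com:3306/sales \
      --username etl_user \
      --password-file /user/etl/.db_password \
      --table orders \
      --split-by order_id \
      --num-mappers 4 \
      --hive-import \
      --hive-table staging.orders \
      --hive-overwrite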

05/2016 - 07/2018 | Micron Technology Inc | City, State
Hadoop Administrator
  • TraceLink works with life sciences supply chain companies to protect patients and save lives, transforming the way the industry does business with innovative track-and-trace solutions. TraceLink is dedicated to helping its customers, from 16 of the top 20 global pharmaceutical companies to corner pharmacies, enable the global distribution of safe drugs and achieve compliance in the most cost-effective way.
  • Installed and configured the Hortonworks Data Platform (HDP 2.3) on Amazon EC2 instances.
  • Installed ZooKeeper, YARN, Slider, and Tez on EC2 instances.
  • Configured the Ambari server and Ambari Metrics server to collect metrics from the cluster.
  • Configured a cluster to run long-running jobs using Slider.
  • Enabled high availability for the NameNode, ResourceManager, and Hive.
  • Hands-on experience configuring the Capacity Scheduler.
  • Configured queues and their capacities in the cluster.
  • Configured the YARN Queue Manager to accept multiple applications by setting the user limit factor.
  • Implemented node labels in the cluster to run applications on particular nodes.
  • Imported data from SQL Server to HDFS using Sqoop.
  • Used Hive to analyze HDFS data.
  • Hands-on experience using REST APIs to start and stop services (see the sketch after this section).
  • Wrote scripts using the Ambari REST APIs to install and uninstall Hadoop services.
  • Hands-on experience with Apache Ambari 2.1.2.
  • Commissioned and decommissioned nodes in the cluster using the REST APIs.
  • Created and deleted EC2 instances in the cluster.
  • Developed and documented a procedure to replace hosts in the cluster.
  • Hands-on experience expanding volumes for Amazon EC2 instances.
  • Integrated Nagios plugins with Hortonworks to monitor Hadoop services and nodes.
  • Developed procedures and scripts to shut down services and delete instances from the clusters using the REST APIs.
  • Good exposure to supporting applications and the development team.
  • Commissioned and decommissioned nodes via Ambari.
  • Performed maintenance, monitoring, deployments, and upgrades across the infrastructure that supports all our Hadoop clusters.
  • Hands-on experience upgrading the cluster from HDP 2.0 to HDP 2.3.
  • Created Ranger policies for Hive and HDFS.
  • Good exposure to tuning Spark configurations.
  • Implemented a use case to run applications in YARN containers as long-running jobs.
  • Worked with the development team to support their long-running applications and performed root cause analysis while resolving their issues.

Environment: CentOS, Oracle, MS SQL, ZooKeeper, Oozie, MapReduce, YARN, Puppet, Nagios, Hortonworks HDP 2.3, REST APIs, Ranger, Amazon Web Services, Ambari 2.1.2, Sqoop, Hive, Spark
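
Several bullets above mention starting and stopping services through the Ambari REST API. The sketch below shows the standard pattern (a PUT that sets the desired ServiceInfo state); the Ambari host, credentials, and cluster/service names are hypothetical.

    #!/usr/bin/env bash
    # Minimal sketch: stop, then start, a Hadoop service via the Ambari REST API.
    # The host, credentials, and cluster/service names are hypothetical.
    AMBARI="http://ambari-host.example.com:8080/api/v1"
    AUTH="admin:admin"
    CLUSTER="prodcluster"
    SERVICE="HIVE"

    # Stop: setting the desired state to INSTALLED stops a running service.
    curl -u "$AUTH" -H 'X-Requested-By: ambari' -X PUT \
      -d '{"RequestInfo":{"context":"Stop via REST"},"Body":{"ServiceInfo":{"state":"INSTALLED"}}}' \
      "$AMBARI/clusters/$CLUSTER/services/$SERVICE"

    # Start: setting the desired state back to STARTED brings it up again.
    curl -u "$AUTH" -H 'X-Requested-By: ambari' -X PUT \
      -d '{"RequestInfo":{"context":"Start via REST"},"Body":{"ServiceInfo":{"state":"STARTED"}}}' \
      "$AMBARI/clusters/$CLUSTER/services/$SERVICE"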
06/2014 - 05/2016 | Axiom Technology Group | City, State
Hadoop Administrator/Developer

Responsibilities:

  • Built and supported a Hadoop-based EDW platform to support the ETL process: streamlined data ingestion from multiple source systems into the data lake, transformed data, created and supported workflows in Talend TAC and Oozie, and developed Pig, HiveQL, Spark SQL, and Spark Streaming scripts.
  • Built Master Data Management (MDM), sourcing data from the data lake; created data lineage using Talend TMM.
  • Installed, configured, and maintained the Hadoop cluster for application development and Hadoop ecosystem components such as Hive, Pig, HBase, ZooKeeper, and Sqoop.
  • Extensively worked on commissioning and decommissioning cluster nodes, replacing failed disks, running file system integrity checks, and maintaining cluster data replication.
  • Assigned the number of mappers and reducers for the MapReduce cluster.
  • Set up HDFS quotas to enforce a fair share of computing resources (see the sketch after this section).
  • Configured and maintained the YARN schedulers (Fair and Capacity).
  • Wrote shell scripts to monitor the health of the Hadoop daemon services and respond to any warning or failure conditions.
  • Developed stored procedures and triggers in MySQL to lower traffic between servers and clients.
  • Used the MySQL Workbench and query browser utilities.
  • Configured RAID for the servers; managed resources using disk quotas.
  • Responsible for preventive maintenance of the servers on a monthly basis.
  • Designed, implemented, and enforced security policies that protect systems and data from access by unauthorized users in the Hadoop environment.
  • Investigated security violations and modified procedures to prevent future incursions.
  • Gained exposure to some of the complex tasks within the job function.

  • Worked closely with team members to deliver project requirements, develop solutions and meet deadlines.
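
To make the HDFS quota bullet above concrete, here is a minimal sketch using the standard hdfs dfsadmin quota commands; the directory path and limits are hypothetical.

    #!/usr/bin/env bash
    # Minimal sketch: enforce name and space quotas on a per-team HDFS directory.
    # The path and limits are hypothetical placeholders.

    # Cap the number of files and directories the team can create.
    hdfs dfsadmin -setQuota 100000 /user/etl

    # Cap the raw space (replication included) the directory may consume.
    hdfs dfsadmin -setSpaceQuota 10t /user/etl

    # Verify: -count -q shows the configured quotas and remaining headroom.
    hdfs dfs -count -q -h /user/etl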
Education and Training
07/2013
Bachelor of Science in Computer Science, Royal University of Dhaka (RUD)
Activities and Honors
  • Photographer
  • Writer
  • Member of the RUD Student Association