Hadoop Admin Resume Example

HADOOP ADMIN
Summary
  • Around 6+ years of experience operating, maintaining, monitoring, and upgrading Hadoop clusters (Cloudera and Hortonworks distributions).
  • Hands-on experience installing, configuring, and maintaining Apache Hadoop clusters for application development, along with Hadoop tools such as Hive, Spark, YARN, Flume, Kafka, Impala, ZooKeeper, Hue, and Sqoop, on both Cloudera and Hortonworks.
  • Experience with capacity planning, validating hardware and software requirements, building and configuring small and medium-size clusters, smoke testing, and managing and performance-tuning Hadoop clusters.
  • Experience configuring NameNode high availability and NameNode federation, with in-depth knowledge of ZooKeeper for cluster coordination services.
  • Hands-on experience analyzing log files for Hadoop and ecosystem services and finding root causes (a brief log-triage sketch follows this summary).
  • Expertise in implementing Kerberos security on Hadoop clusters.
  • Responsible for capacity planning, infrastructure planning, and version selection when building Hadoop clusters.
  • Strong expertise and knowledge of cloud platforms and their components (IBM private/public cloud, Kubernetes, Docker).
  • Experienced in using HDFS, Pig, Hive, Spark, Impala, Sqoop, Oozie, ZooKeeper, and Cloudera Manager.
  • Experience scheduling Hadoop, Hive, Sqoop, and HBase jobs using Oozie.
  • Diverse background with strong analytical skills; self-starter who learns new technologies quickly.
  • Good communication and documentation skills.
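The log analysis and daemon health checks described above can be illustrated with a minimal shell sketch. This is a hedged example, not taken from the resume: the log path is an assumed HDFS default, and the commands are standard HDFS administration tools.

    # Scan recent NameNode logs for errors (log path is an assumed default location)
    grep -iE 'ERROR|FATAL' /var/log/hadoop-hdfs/hadoop-hdfs-namenode-*.log | tail -n 50

    # Quick cluster health summary: capacity, live and dead DataNodes
    hdfs dfsadmin -report | head -n 20

    # Verify filesystem integrity from the root and read the closing summary
    hdfs fsck / | tail -n 5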

Skilled System Administrator focused on performance optimization and technical improvements with an understanding of cost-effective decision making and usability.

Enthusiastic individual with superior skills in working in both team-based and independent capacities. Bringing a strong work ethic and excellent organizational skills to any setting. Excited to begin a new challenge with a successful team.

Skills
  • Active Directory, Backup, Quality, Big Data, Express, Version Control, SAS
  • CA-7, RDBMS, Catalog, Real Time, Hardware, Reporting, CPU, Requirements
  • Database Administration, Data Integration, SDLC, Databases, Shell Scripts, SQL
  • Data Lake, Data Warehousing, SSL, Delivery, Strategy, Designing, Tableau
  • Encryption, SAP, Disaster Recovery, Tables, Eclipse, Teradata
  • ETL, Troubleshooting, IDE, Unix Shell Scripts, Informatica, Upgrades, JDBC, Written
  • LDAP, Linux, Logging, Managing, Meetings, Memory, Access
  • SQLServer, Migration, MySQL, Enterprise, NFS, Network, ODBC, Operating Systems, Oracle
  • Certificate in Linux Programming and Administration
Experience
Hadoop Admin / Virtusa, Irving, TX | 09/2018 - Current
  • Project Description: Citizen Bank is the corporate and investment banking division of Citizen Bank. The project implements big data analytics in Hadoop, loading data from multiple sources such as MySQL and web server logs into Hive and querying the data as required. The main goal is to understand the customer base, buying habits, buying decisions, etc.
  • Responsibilities:
  • Extensively involved in installing and configuring the Cloudera Hadoop distribution: NameNode, Secondary NameNode, ResourceManager, NodeManagers, and DataNodes. Performed stress testing, performance testing, and benchmarking for the cluster.
  • Installed patches and packages on Unix/Linux servers. Worked with development teams on the design and ongoing operation of several clusters running Cloudera's Distribution including Apache Hadoop (CDH).
  • Worked extensively on importing metadata into Hive and migrated existing tables and applications to Hive and HBase.
  • Responsible for migrating from Hadoop MapReduce to Spark frameworks for in-memory distributed computing and real-time fraud detection.
  • Provided 24x7 system support and maintenance for Customer Experience Business Services.
  • Supported data analysts in running MapReduce programs.
  • Implemented the Fair Scheduler on the JobTracker to allocate a fair share of resources to small jobs.
  • Ran Hadoop jobs processing millions of records of text data. Troubleshot build issues in the Jenkins build process. Used Docker to create containers for Tomcat servers and Jenkins.
  • Responsible for scheduling jobs in Hadoop using the FIFO, Fair, and Capacity schedulers.
  • Expertise in Hadoop cluster capacity planning, performance tuning, cluster monitoring, and troubleshooting.
  • Worked on a live big data Hadoop production environment with 220 nodes.
  • Implemented NameNode HA to avoid a single point of failure.
  • Experience working with LDAP user accounts and configuring LDAP on client machines.
  • Automated day-to-day activities using shell scripting and used Cloudera Manager to monitor the health of Hadoop daemon services, responding to warning and failure conditions.
  • Responsible for cluster maintenance: adding and removing cluster nodes, cluster monitoring and troubleshooting, and managing and reviewing data backups and Hadoop log files.
  • Involved in planning Hadoop cluster infrastructure, resource capacity, and build plans for Hadoop cluster installations.
  • Resolved tickets submitted by users and P1 issues, troubleshooting and documenting errors through to resolution.
  • Installed and configured Hive in the Hadoop cluster and helped business users and application teams fine-tune their HiveQL for performance and efficient use of cluster resources.
  • Installed and configured the Ganglia monitoring system to collect metrics and monitor the Hadoop cluster. Also configured Hadoop log rotation and monitored the logs frequently.
  • Performed performance tuning of the Hadoop cluster, MapReduce jobs, and real-time applications, applying best practices to fix design flaws.
  • Implemented Oozie workflows for ETL processes on critical data feeds across the platform.
  • Configured Ethernet bonding on all nodes to double the network bandwidth.
  • Implemented the Kerberos security authentication protocol on the existing cluster.
  • Built high availability for a major production cluster and designed automatic failover using the ZooKeeper Failover Controller (ZKFC) and Quorum Journal Nodes (a short failover sketch follows this role).
  • Worked on Hive to expose data for further analysis and to transform files from different analytical formats to Parquet.
  • Worked closely with business stakeholders, BI analysts, developers, and SAS users to establish SLAs and acceptable performance metrics for the Hadoop-as-a-service offering.
  Environment: Hadoop, Apache Pig, Hive, Oozie, Sqoop, Spark, HBase, LDAP, CDH5, Unravel, Splunk, Tomcat, and Java
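The NameNode HA and ZKFC failover work above can be sketched with the standard HDFS HA administration commands. This is a minimal sketch; the NameNode IDs nn1 and nn2 are hypothetical placeholders, not values from this resume.

    # Check which NameNode is active and which is standby (nn1/nn2 are assumed IDs)
    hdfs haadmin -getServiceState nn1
    hdfs haadmin -getServiceState nn2

    # One-time setup: create the znode the ZooKeeper Failover Controller coordinates through
    hdfs zkfc -formatZK

    # Manually fail over from nn1 to nn2, e.g. ahead of planned maintenance
    hdfs haadmin -failover nn1 nn2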


Hadoop Admin / Virtusa, Jersey City, NJ | 05/2016 - 07/2018
  • Project Description: The project collects reports periodically for AT&T customers and stores them in HDFS. Report data is then extracted to the service layer for presentation. Performed analytics and provided insights for business needs.
  • Responsibilities:
  • Experience managing scalable Hadoop cluster environments.
  • Involved in managing, administering, and monitoring clusters in the Hadoop infrastructure.
  • Performed regular maintenance, commissioning and decommissioning nodes as disk failures occurred, using the MapR file system.
  • Used Sqoop to import and export data between HDFS and RDBMSs (a Sqoop and distcp sketch follows this role).
  • Teamed diligently with the infrastructure, network, database, application, and business intelligence teams to guarantee high data quality and availability.
  • Responsible for troubleshooting failed MapReduce jobs by inspecting and reviewing log files.
  • Collaborated with application teams to install operating system and Hadoop updates, patches, and version upgrades when required.
  • Experience in HDFS maintenance and administration.
  • Managed node connectivity and security on the Hadoop cluster.
  • Experience commissioning and decommissioning cluster nodes.
  • Experience implementing NameNode HA.
  • Worked on architected solutions that process massive amounts of data on corporate and AWS cloud-based servers.
  • Worked with data delivery teams to set up new Hadoop users.
  • Installed the Oozie workflow engine to run multiple MapReduce, Hive, and HBase jobs.
  • Configured the Metastore for the Hadoop ecosystem and management tools.
  • Installed and configured ZooKeeper.
  • Hands-on experience with the Nagios and Ganglia monitoring tools.
  • Experience with HDFS data storage and supporting MapReduce job runs.
  • Tuned and troubleshot MapReduce jobs by analyzing and reviewing Hadoop log files.
  • Installed and configured Hadoop ecosystem tools such as Sqoop, Pig, Flume, and Hive.
  • Maintained and monitored clusters. Loaded data into the cluster from dynamically generated files using Flume and from relational database management systems using Sqoop.
  • Imported and exported data between MySQL/Oracle and Hive using Sqoop.
  • Experience using distcp to migrate data between and across clusters.
  • Hands-on experience analyzing log files for Hadoop ecosystem services.
  • Coordinated root cause analysis efforts to minimize future system issues.
  • Heavily involved in Hadoop cluster operations and troubleshooting.
  • Troubleshot hardware issues and worked closely with various vendors on hardware, OS, and Hadoop issues.
  Environment: Cloudera 4.2, HDFS, Hive, Sqoop, HBase, Chef, RHEL, Mahout, Tableau, MicroStrategy, and Shell Scripting.
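A minimal sketch of the Sqoop imports/exports and distcp migration mentioned in this role. The hostnames, databases, and table names are hypothetical placeholders.

    # Import a MySQL table directly into a Hive table (connection details are made up)
    sqoop import \
      --connect jdbc:mysql://dbhost:3306/sales --username etl -P \
      --table orders --hive-import --hive-table staging.orders \
      --num-mappers 4

    # Export aggregated results from HDFS back into the RDBMS
    sqoop export \
      --connect jdbc:mysql://dbhost:3306/reporting --username etl -P \
      --table daily_agg --export-dir /user/hive/warehouse/daily_agg

    # Copy a dataset between clusters with distcp
    hadoop distcp hdfs://cluster-a:8020/data/reports hdfs://cluster-b:8020/data/reports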
Hadoop Administrator / Cognizant Technology Solutions, Edina, MN | 06/2014 - 05/2016

Project: Hadoop Admin support
John Wiley & Sons, Inc., also referred to as Wiley, is a global publishing company that specializes in academic publishing and markets its products to professionals and consumers, students and instructors in higher education, and researchers and practitioners in scientific, technical, medical, and scholarly fields. This project covers maintaining complete end-to-end Hadoop environment support.

  • Responsibilities:
  • Hands-on experience installing, configuring, supporting, and managing Hadoop clusters using Apache Hadoop and Cloudera (CDH5). Responsible for cluster maintenance, monitoring, commissioning and decommissioning DataNodes, troubleshooting, managing and reviewing data backups, and managing and reviewing log files.
  • Day-to-day responsibilities included solving developer issues, handling deployments (moving code from one environment to another), granting access to new users, and providing quick fixes to reduce impact, documenting them to prevent repeat issues.
  • Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
  • Strong experience in core Linux environments.
  • Worked on capacity planning for the production cluster.
  • Worked on configuring Kerberos authentication in the Hadoop cluster and AS400.
  • Configured queues in the Capacity Scheduler and took snapshot backups of HBase tables (a snapshot sketch follows this role).
  • Worked on fixing cluster issues and configuring high availability for the NameNode in CDH5.
  • Involved in cluster monitoring, backup, restore, and troubleshooting activities.
  • Handled imports and exports of data onto HDFS using Flume and Sqoop.
  • Used the Spark API over Cloudera Hadoop YARN to perform analytics on data in Hive.
  • Responsible for the implementation and ongoing administration of Hadoop infrastructure.
  • Managed and reviewed Hadoop log files.
  • Imported and exported data between RDBMSs, HDFS, and HBase using Sqoop.
  • Good understanding of installing and configuring Spark and Impala.
  • Successfully installed and configured queues in the Capacity Scheduler and the Oozie scheduler.
  • Worked on performance optimization of Hive queries, performing cluster-level tuning and adding users to the clusters.
  • Monitored workload, job performance, and capacity planning.
  • Analyzed system failures, identified root causes, and recommended courses of action.
  • Worked closely with team members to deliver project requirements, develop solutions, and meet deadlines.
  • Environment: RHEL, CDH 5.11, Hive, Sqoop, Flume, HBase, MySQL, Cassandra, Oozie, ZooKeeper, Puppet, Nagios, AWS (S3, EC2, IAM, EMR), GitHub
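The HBase snapshot backups noted in this role can be sketched with the standard HBase shell and ExportSnapshot tool. The table and snapshot names, and the backup cluster address, are hypothetical.

    # Take an online snapshot of a table (names are made up)
    hbase shell <<'EOF'
    snapshot 'orders', 'orders_snap_20160101'
    EOF

    # Export the snapshot to a backup cluster via a MapReduce copy
    hbase org.apache.hadoop.hbase.snapshot.ExportSnapshot \
      -snapshot orders_snap_20160101 \
      -copy-to hdfs://backup-cluster:8020/hbase -mappers 4

    # Restore the table from the snapshot (the table must be disabled first)
    hbase shell <<'EOF'
    disable 'orders'
    restore_snapshot 'orders_snap_20160101'
    enable 'orders'
    EOF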
Education and Training
Bachelor of Science: Computer Science, Royal University of Dhaka (RUD), 08/2013
Activities and Honors
  • Photographer
  • Writer
  • Member of the RUD Student Association