Jessica Claire
  • Montgomery Street, San Francisco, CA 94105
  • Home: (555) 432-1000
  • resumesample@example.com
Summary
  • 3 years of experience as a DevOps Engineer in application configuration, code compilation, packaging, building, automating, managing and releasing code from one environment to another, and deploying it to servers.
  • 9+ years of overall experience in DevOps, Big Data and Data Warehouse technologies.
  • Experienced in installing, configuring and maintaining Jenkins for continuous integration (CI) and for end-to-end automation of all builds and deployments, including creating Jenkins CI pipelines.
  • Hands-on experience with EC2, S3, RDS, VPC, ELB, EBS and Auto Scaling.
  • Experienced in branching, merging and maintaining versions using SCM tools such as Git and GitHub on Windows and Linux platforms.
  • Experienced in project management and issue-tracking tools such as JIRA.
  • Experienced in creating Docker containers and Docker consoles for managing the application life cycle.
  • Created custom Docker images using Dockerfiles for easier replication of DEV and QA environments on local machines (see the sketch below).
  • Performed and deployed builds for various environments such as QA, Integration, UAT and Production.
  • Developed and deployed Chef cookbooks and recipes and Puppet manifests.
  • Configured and monitored distributed and multi-platform servers using Nagios.
  • Strong analytical and problem-solving skills; able to work independently with little or no supervision or as a member of a team.
  • Good written and verbal communication skills, strong organizational skills and a hard-working team player, well practiced in fielding phone calls and answering business-team queries.
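A minimal sketch of the kind of Dockerfile-driven image build described above; the base image, artifact name and tag are illustrative placeholders, not details from this resume:

    #!/usr/bin/env bash
    # Build and run a custom image that mirrors a DEV/QA environment locally.
    # Every name below is a placeholder.
    set -euo pipefail

    # Write a minimal Dockerfile for a hypothetical Tomcat-hosted app.
    cat > Dockerfile <<'EOF'
    # Assumed base image; the real project may use a different one.
    FROM tomcat:8-jre8
    # app.war is a placeholder artifact produced by the build.
    COPY app.war /usr/local/tomcat/webapps/app.war
    EXPOSE 8080
    EOF

    docker build -t myorg/app:dev .                     # placeholder tag
    docker run -d -p 8080:8080 --name app-dev myorg/app:dev
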
Skills
  • Versioning Tools - Git
  • CI - Jenkins
  • Configuration Management - Chef, Puppet
  • Build Tools - Maven
  • Ticket Tracking Tool - JIRA
  • Containerization Tool -  Docker
  • Operating Systems - Windows, Unix
  • AWS - Amazon EC2, S3, RDS, ELB, EBS, Auto Scaling
  • Languages - SQL, NoSQL
  • Scripting Language - Shell, Python
  • Web server - Apache Tomcat
  • Database - Oracle, SQL Server, MySQL, Teradata, DB2, Netezza
  • Big Data - Hadoop, HDFS, MapReduce, Flume, Pig, Sqoop, Hive, Oozie, MongoDB
  • ETL - Ab Initio
  • Monitoring Tool - Nagios
Experience
DevOps Engineer, 01/2015 to Current
Two95 International Inc., Ellisville, MO
  • Managed Chef cookbooks and implemented environments, roles and templates in Chef for better environment management.
  • Used shell scripts to automate day-to-day activities and tasks.
  • Used Jenkins to automate the build process.
  • Installed and configured Jenkins master and slave nodes; built CI/CD pipelines and managed infrastructure as code using Chef and Puppet.
  • Experienced with cloud platforms such as AWS.
  • Created and implemented Chef cookbooks for deployment and used Chef recipes to deploy directly to Amazon EC2 instances.
  • Used Git to manage source code.
  • Set up the Chef server, workstation and clients, and wrote scripts to deploy applications.
  • Deployed applications to Tomcat application servers and static content to Apache web servers.
  • Automated continuous integration and deployments using Jenkins and Docker.
  • Installed, configured and managed monitoring tools such as Nagios for resource and network monitoring.
  • Worked with Docker containers to create Docker images for different environments.
  • Responsible for compiling the source code with Maven and packaging it in its distributable format, such as a WAR file (see the build sketch below).
  • Implemented processes for release management, automated code deployment, configuration management and monitoring.
Environment: Amazon EC2, S3, RDS, VPC, ELB, EBS, Auto Scaling, UNIX/Linux, Red Hat Linux 6, CentOS, Jenkins, Windows, Apache Tomcat, shell scripts, Docker, Nagios, Puppet.
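A minimal sketch of the Maven build-and-deploy step described in this role; the repository URL, branch, artifact name and Tomcat location are assumptions, and in practice a script like this would typically run as a Jenkins build step rather than by hand:

    #!/usr/bin/env bash
    # Compile and package the source with Maven, then deploy the WAR to Tomcat.
    # Repository, branch and paths are illustrative placeholders.
    set -euo pipefail

    git clone https://github.com/example/app.git      # placeholder repository
    cd app
    git checkout release                               # hypothetical branch

    mvn clean package -DskipTests                      # emits target/app.war

    # Swap the new WAR into Tomcat (CATALINA_HOME is assumed to be set).
    "$CATALINA_HOME/bin/shutdown.sh" || true
    cp target/app.war "$CATALINA_HOME/webapps/"
    "$CATALINA_HOME/bin/startup.sh"
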
 
Big Data / Data Warehouse Tech Lead, 03/2012 to 01/2015
Salient CRGT, Camp Pendleton, CA
  • Worked with the systems engineering team to plan and deploy new Hadoop environments and expand existing Hadoop clusters using agile methodology.
  • Monitored multiple Hadoop cluster environments using Control-M; tracked workload, job performance and capacity planning using Cloudera Manager.
  • Gained hands-on experience with Hadoop, Java, SQL and Python.
  • Participated in functional reviews, test specifications and documentation review.
  • Wrote MapReduce programs to transform log data into structured form and derive user location, age group and time spent.
  • Analyzed web log data using HiveQL to extract the number of unique visitors per day, page views, visit duration and the most purchased products on the website.
  • Exported the analyzed data to relational databases using Sqoop for visualization and for report generation with Business Intelligence tools (see the sketch below).
  • Documented system processes and procedures for future reference; responsible for managing data coming from different sources.
  • Managed multiple projects as a Data Warehouse tech lead in an onsite-offshore model.
Environment: Hadoop, HDFS, MapReduce, Flume, Pig, Sqoop, Hive, Oozie, MongoDB, Shell Scripting.
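A minimal sketch of the log-analysis and export flow described in this role; the table names, columns and JDBC connection string are hypothetical:

    #!/usr/bin/env bash
    # Aggregate web-log data with HiveQL, then export the result to a
    # relational database with Sqoop. All identifiers are placeholders.
    set -euo pipefail

    # Daily unique visitors and page views from a hypothetical web_logs table.
    hive -e "
      INSERT OVERWRITE TABLE daily_traffic
      SELECT to_date(event_time),
             COUNT(DISTINCT visitor_id),
             COUNT(*)
      FROM   web_logs
      GROUP  BY to_date(event_time);
    "

    # Export the aggregated table to MySQL for the BI/reporting layer.
    sqoop export \
      --connect jdbc:mysql://dbhost/reports \
      --username report_user -P \
      --table daily_traffic \
      --export-dir /user/hive/warehouse/daily_traffic \
      --input-fields-terminated-by '\001'
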
 
Systems Engineer, 02/2011 to 03/2012
Infosys Ltd, Pennington, NJ
  • Supported requirements gathering and understanding of business functionality for TPR (Technical Project Request) documents.
  • Developed graphs according to the business requirements.
  • Analyzed source system file layouts and wrote DML for extracting data from various sources such as flat files, tables, mainframe copybooks and responder layouts.
  • Analyzed data transformation logic, mapping implementations and data loading into the target database through Ab Initio graphs.
  • Developed UNIX shell scripts for process automation (see the sketch below).
  • Fixed unit and functional test case/data defects.
  • Analyzed the existing application and identified improvements.
Environment: Ab Initio 3.03, Teradata 12, UNIX (Sun Solaris, Korn Shell).
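A minimal sketch of the sort of UNIX shell automation referenced in this role; the feed name, paths and the load command are assumptions (the actual loads ran through Ab Initio graphs, which are not reproduced here):

    #!/bin/ksh
    # Wait for a source flat file to land, hand it to the load process,
    # then archive it. Every path and name is an illustrative placeholder.

    SRC=/data/incoming/customer_feed.dat
    ARCHIVE=/data/archive

    # Poll for the feed for up to 30 minutes (180 x 10s).
    i=0
    while [ ! -f "$SRC" ] && [ "$i" -lt 180 ]; do
        sleep 10
        i=$((i + 1))
    done

    if [ ! -f "$SRC" ]; then
        print "$(date): feed not received, aborting" >&2
        exit 1
    fi

    run_load_graph "$SRC"   # hypothetical wrapper that runs the Ab Initio graph
    mv "$SRC" "$ARCHIVE/customer_feed.$(date +%Y%m%d).dat"
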
 
 
Ab Initio Developer, 2009 to 02/2011
Hewlett Packard Global India Pvt Ltd., City, STATE
  • Developed graphs according to the TPR document.
  • Analyzed source system file layouts and wrote DML for extracting data from tables.
  • Analyzed data transformation logic, mapping implementations and data loading into the target database through Ab Initio graphs.
  • Developed UNIX shell scripts for automating processes.
  • Fixed unit and functional test case/data defects.
  • Analyzed the existing application and identified improvements.
  • Prepared result-analysis graphs to validate business test case results (see the validation sketch below).
  • Performance-tested all graphs against large volumes of data.
Environment: Ab Initio 3.03, Teradata 12, UNIX (Sun Solaris, Korn Shell), MOSS 2010.
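A minimal sketch of the kind of result validation described in this role; the file names are hypothetical, and the graphs themselves were built in Ab Initio, which is not shown:

    #!/bin/ksh
    # Compare a graph's actual output against the expected business
    # test-case results. File names are illustrative placeholders.

    EXPECTED=/test/expected/orders_summary.dat
    ACTUAL=/test/actual/orders_summary.dat

    # Sort both sides so record order cannot mask a match.
    sort "$EXPECTED" > /tmp/expected.sorted
    sort "$ACTUAL"   > /tmp/actual.sorted

    if diff /tmp/expected.sorted /tmp/actual.sorted > /tmp/result.diff; then
        print "PASS: output matches the expected results"
    else
        print "FAIL: $(wc -l < /tmp/result.diff) differing lines; see /tmp/result.diff"
        exit 1
    fi
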
 
Education
Bachelor of Engineering: Electronics and Communication, Expected in 2008
Anna University - Chennai, Tamil Nadu
Certifications
  • Cloudera Certified Hadoop Developer (CCD - 410).
  • Certified MongoDB Developer.
