Jessica Claire
100 Montgomery St., 10th Floor | (555) 432-1000
Professional Summary

A team player and security-driven DevOps & Cloud Platform Engineer with over 6 years' experience in systems administration, software configuration management, and cloud integration, using various CI/CD and SCM tools for end-to-end automation and deployment of software to different environments. Experienced in implementing security best practices to provision and manage secure, fault-tolerant, scalable, and highly available architectures. Hands-on experience in containerization and orchestration of microservice applications across environments, leveraging various DevOps automation tools. Familiar with Agile and Scrum SDLC methodologies.

  • Containerization Tools: Docker, Docker Swarm, EKS, ECS, Kubernetes
  • Automation Tools: Ansible, Terraform, Jenkins, CodeCommit, CodePipeline, Lambda, Fargate
  • Infrastructure as Code: Terraform, CloudFormation, Helm
  • Version Control: Git, GitHub
  • Databases: MySQL, MongoDB
  • Scripting Languages: Shell, Python, YAML, Groovy
  • AWS Cloud: IAM, VPC, EBS, S3, ELB, Auto Scaling, Route 53, CloudWatch, CloudTrail
  • Platforms: Linux (Red Hat, CentOS, SUSE, Ubuntu), Windows
  • Virtualization: Hyper-V, VMware
  • Logging & Monitoring: EFK, CloudWatch, Prometheus, Grafana, NewRelic
  • Ticketing: Jira, ServiceNow
Work History
02/2019 to Current Senior DevOps Engineer, Capital One | Brookville, NY
  • Built, configured and administered both self-managed and managed Kubernetes Clusters.
  • Used Docker to containerize monolithic and microservice applications, with Docker Swarm and Kubernetes for container orchestration via Kubernetes objects such as Deployments, ReplicaSets, and StatefulSets.
  • Used Terraform, the AWS CLI, the AWS SDK (Boto3), kOps, and Ansible to automate the provisioning and configuration of AWS infrastructure such as VPCs, subnets, route tables, DNS, IGWs, ELBs, IAM, Auto Scaling, EBS, databases, S3, security groups, NACLs, etc.
  • Used Jenkins to build complex CI/CD pipelines, integrating with other tools for end-to-end automation of builds and deployments.
  • Managed and monitored Kubernetes clusters using Prometheus for data aggregation and Grafana for data visualization, as well as data analytics and log management using EFK.
  • Troubleshot issues during builds and failed deployments of Kubernetes pods.
  • Installed and configured web/application servers (Nginx, Apache, Tomcat, JBoss/WildFly).
  • Applied best practices in cloud security and auditing, and implemented security controls.
  • Implemented security best practices in AWS, including multi-factor authentication, access key rotation, encryption using KMS, firewalls (security groups and NACLs), S3 bucket policies and ACLs, and DDoS mitigation.
  • Applied security-by-design concepts throughout the SDLC and leveraged best practices to build highly secure, fault-tolerant, highly available, and scalable architectures and applications.
  • Monitored and troubleshot server performance using Performance Monitor and performance logs and counters.
  • Used Jira to track vulnerabilities, requests and incidents.
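As an illustration of the access key rotation practice listed above, a minimal Python sketch of the kind of check a rotation job might run. The 90-day threshold and the key-record shape (mirroring Boto3's `list_access_keys` response) are assumptions for illustration, not actual Capital One policy.

```python
from datetime import datetime, timedelta, timezone

# Assumed rotation policy: flag any IAM access key older than 90 days.
ROTATION_MAX_AGE = timedelta(days=90)

def keys_due_for_rotation(keys, now=None):
    """Return the IDs of access keys whose age exceeds the rotation threshold.

    `keys` mimics the shape of Boto3's IAM list_access_keys output: a list
    of dicts with 'AccessKeyId' and 'CreateDate' (timezone-aware datetimes).
    """
    now = now or datetime.now(timezone.utc)
    return [k["AccessKeyId"] for k in keys
            if now - k["CreateDate"] > ROTATION_MAX_AGE]
```

In practice, a job like this would feed the flagged key IDs into an alert (or an automated deactivation step) rather than just returning them.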
03/2015 to 12/2018 Cloud Engineer, Deloitte & Touche L.L.P. | Ogden, UT
  • Performed builds and releases across all software cycles: engineering, test, production, updates, patches, and maintenance.
  • Designed, built, and automated build/release of software from development to production environments in a way that also satisfies internal audit and compliance requirements.
  • Created users, repositories, branching, tagging, patch fixes, pull request, and trained users on the Git version control system.
  • Automated provisioning of infrastructure on AWS using Terraform and Ansible with dynamic inventory.
  • Created Dockerfiles, built images, and pushed them to a private Docker registry.
  • Configured and managed highly available, scalable Kubernetes clusters to run containerized applications, exposing them to end users via DNS, ELB, NodePort, Ingress controllers, and autoscaling.
  • Leveraged MongoDB and MySQL with persistent volumes in our clusters.
  • Performed continuous monitoring using Prometheus, Grafana, and CloudWatch.
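To illustrate the service-exposure work above, a small Python sketch that builds a Kubernetes NodePort Service manifest as a plain dict (ready to serialize to YAML and apply with `kubectl`). The app name and port numbers are illustrative placeholders, not values from an actual deployment.

```python
def nodeport_service(app, port, node_port, target_port=None):
    """Build a Service manifest exposing `app` pods on a port of every node."""
    return {
        "apiVersion": "v1",
        "kind": "Service",
        "metadata": {"name": f"{app}-svc", "labels": {"app": app}},
        "spec": {
            "type": "NodePort",
            "selector": {"app": app},           # matches pods labeled app=<app>
            "ports": [{
                "port": port,                    # cluster-internal port
                "targetPort": target_port or port,  # container port
                "nodePort": node_port,           # must fall in 30000-32767
            }],
        },
    }
```

For public-facing traffic, the same pattern extends to an Ingress or a cloud load balancer (`type: LoadBalancer`); NodePort is shown here because it is the simplest of the exposure mechanisms the bullet lists.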
02/2013 to 01/2015 System Administrator, Alakaina Family Of Companies | Fort Bragg, NC
  • Configured various environments, including Linux and Windows, and installed relevant packages based on requirements.
  • Installed and configured Active Directory Domain Services for object management (users, groups, network printers, network guest nodes).
  • Configured sudo to grant users root privileges; reset passwords and unlocked user accounts.
  • Managed the rollout of McAfee Antivirus software to all servers and endpoints.
  • Managed, installed, and troubleshot Microsoft Windows Active Directory, DNS, and DHCP services on Windows Server 2003/2008/2012 platforms.
  • Used Active Directory to administer users and groups, granting appropriate permissions and privileges to access LAN and domain environments.
  • Identified and troubleshot network connectivity issues using TCP/IP tools.
  • Wrote PowerShell and Bash shell scripts to automate tasks, including server patching and user account creation.
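As a sketch of the account-creation automation above, a small Python helper that derives a login name from a full name, the kind of step a provisioning script would run before calling `useradd` or `New-ADUser`. The first-initial-plus-surname convention and the collision-suffix scheme are assumptions for illustration, not the actual site policy.

```python
import re
import unicodedata

def make_username(full_name, existing=()):
    """Build a lowercase ASCII username; append a number on collision."""
    # Strip accents and any non-letter characters so the login is plain ASCII.
    ascii_name = (unicodedata.normalize("NFKD", full_name)
                  .encode("ascii", "ignore").decode())
    parts = re.sub(r"[^a-zA-Z ]", "", ascii_name).lower().split()
    # Assumed convention: first initial + surname; single names pass through.
    base = (parts[0][0] + parts[-1]) if len(parts) > 1 else parts[0]
    candidate, n = base, 1
    while candidate in existing:  # avoid clashing with existing accounts
        n += 1
        candidate = f"{base}{n}"
    return candidate
```

A real script would wrap this with the platform-specific account call and an audit-log entry; the naming logic is the portable part.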
Education
B.Tech | Computer Engineering, LMU (expected)

Certifications

AWS Solutions Architect Associate

CompTIA Security+
