LiveCareer-Resume

Data Engineer Resume Example With 7+ Years of Experience

Jessica Claire
resumesample@example.com
(555) 432-1000
100 Montgomery St., 10th Floor
Summary
  • SnowPro Core and AWS Certified Cloud Practitioner professional with strong knowledge of designing and building data models for the Snowflake cloud data warehouse.
  • Experience in the design and development of ETL methodology supporting data migration, data transformation, and processing in a corporate-wide ETL solution.
  • Involved in all phases of SDLC (Systems Development Life Cycle) from analysis and planning to development and deployment.
  • Significant experience with the ETL (Extract, Transform, Load) tool Informatica PowerCenter (10.4.2/9.6.1), analyzing, designing, and developing ETL processes for data warehousing projects.
  • Experience in developing Spark applications using Spark SQL and PySpark for data extraction and transformation.
  • Strong programming experience using PL/SQL Packages, Stored Procedures, Functions, Cursors, Indexes, Views, Materialized Views.
  • Good experience in extracting, transforming, and loading (ETL) data from multiple database sources (Oracle, SQL Server, and Teradata) for medium to large enterprise data warehouses.
  • Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, and big data modeling techniques using Python.
  • Good interpersonal, analytical, and communication skills.
  • Can-do attitude; self-directed problem solver with the ability to learn new tools and put them to use.
  • Experience in various methodologies like Agile and Waterfall.
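The extract-transform-load pattern described in the summary can be sketched in a few lines. This is a minimal, hypothetical example, not any employer's actual pipeline: sqlite3 stands in for the Oracle/SQL Server sources, and the staging and mart table names are invented for illustration.

```python
import sqlite3

def run_etl(conn: sqlite3.Connection) -> int:
    """Extract rows from a staging table, transform them, load into a mart table."""
    cur = conn.cursor()
    # Extract: pull raw rows from the staging table.
    rows = cur.execute("SELECT id, name, amount FROM stg_orders").fetchall()
    # Transform: normalize names and filter out non-positive amounts.
    cleaned = [(i, name.strip().upper(), amt) for i, name, amt in rows if amt > 0]
    # Load: insert the cleaned rows into the target mart table.
    cur.executemany(
        "INSERT INTO mart_orders (id, name, amount) VALUES (?, ?, ?)", cleaned)
    conn.commit()
    return len(cleaned)

# Self-contained demo with an in-memory database and toy data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_orders (id INTEGER, name TEXT, amount REAL)")
conn.execute("CREATE TABLE mart_orders (id INTEGER, name TEXT, amount REAL)")
conn.executemany("INSERT INTO stg_orders VALUES (?, ?, ?)",
                 [(1, " alice ", 10.0), (2, "bob", -5.0), (3, "carol", 7.5)])
loaded = run_etl(conn)
```

In a production ETL tool such as Informatica PowerCenter, the transform step above corresponds to Expression and Filter transformations between source and target definitions.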
Skills
  • Methodologies: Agile (Scrum), Waterfall
  • Database: MySQL, Oracle, SQL Server, Teradata
  • Cloud: Snowflake, AWS Redshift
  • Big Data Frameworks: Hadoop (HDFS), Spark, Hive, HBase
  • DevOps: CI/CD pipeline
  • ETL Tools: Informatica PowerCenter 10.4.2/9.6.1
  • Scheduling: Maestro, Control-M, Autosys
  • Scripts: Advanced SQL, PL/SQL, Unix shell scripting, Python, XML
  • Defect Tracking Tools: Microsoft TFS
  • Operating System: Windows, Linux
  • Business Domain: Healthcare, Insurance, Retail
Education and Training
Campbellsville University, Campbellsville, KY, Expected in 08/2019: Master of Science, Information Technology Management
Northwestern Polytechnic University, Fremont, CA, Expected in 04/2015: Master of Science, Electrical Engineering
JNTU, Hyderabad, India, Expected in 05/2012: Bachelor of Technology, Electronics & Communication Engineering
Certifications
  • SnowPro Core Certification - 2022
  • AWS Certified Cloud Practitioner - 2022
Experience
Splunk - Data Engineer
Livermore, CA, 11/2017 - Current
  • Identified and analyzed end-user business needs and built project plans translating functional requirements into the technical tasks that guide project execution.
  • Involved in design, analysis, implementation, testing, and support of ETL processes for Stage, ODS, and Mart.
  • Designed and developed Informatica mappings and sessions based on business user requirements and business rules to load data from diverse sources, such as flat files and Oracle tables, into target tables.
  • Implemented a one-time migration of multistate-level data from SQL Server to Snowflake using Python and SnowSQL.
  • Used Azure Databricks for ETL (Hive, workspaces, mappings).
  • Strong Oracle, PL/SQL knowledge and ability to code complex queries against a marketing database.
  • Hands-on experience implementing a DevOps pipeline using the AWS CI/CD toolset.
  • Built data pipelines in GCP using Airflow for ETL-related jobs, leveraging a variety of Airflow operators.
  • Involved in migrating manual deployments to automated, containerized deployments (Azure DevOps, GKE, Cloud SQL).
  • As part of a multi-cloud strategy, implemented a CI/CD pipeline using GKE and Terraform.
  • Developed shell scripts to automate ETL execution in a Unix environment.
  • Participated in scrum meetings, product backlog grooming, and other Scrum activities and artifacts in collaboration with the team.
  • Performed troubleshooting, analysis, and resolution of critical issues.
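The one-time SQL Server-to-Snowflake migration above amounts to streaming a source table into a target in batches. A hedged sketch follows: in production the source would use a SQL Server driver (e.g. pyodbc) and the target snowflake-connector-python, but two in-memory sqlite3 databases stand in here, and the table and column names are invented.

```python
import sqlite3

BATCH_SIZE = 2  # tiny for illustration; typically thousands of rows per batch

def migrate_table(src: sqlite3.Connection, dst: sqlite3.Connection,
                  table: str) -> int:
    """Copy all rows of `table` from src to dst in fixed-size batches."""
    cur = src.execute(f"SELECT id, state, value FROM {table}")
    total = 0
    while True:
        batch = cur.fetchmany(BATCH_SIZE)  # stream batches, never the whole table
        if not batch:
            break
        dst.executemany(
            f"INSERT INTO {table} (id, state, value) VALUES (?, ?, ?)", batch)
        total += len(batch)
    dst.commit()
    return total

# Demo: source and target with the same schema, three rows to move.
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
for db in (src, dst):
    db.execute("CREATE TABLE claims (id INTEGER, state TEXT, value REAL)")
src.executemany("INSERT INTO claims VALUES (?, ?, ?)",
                [(1, "CA", 1.0), (2, "MA", 2.0), (3, "TX", 3.0)])
migrated = migrate_table(src, dst, "claims")
```

Batching keeps memory flat for multistate-scale tables; with Snowflake, each batch would more likely be staged as a file and loaded via SnowSQL's COPY INTO rather than row inserts.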
Genesys - Software Engineer
Burlington, MA, 04/2017 - 10/2017
  • Gathered requirements from end users and business analysts and developed strategies for ETL processes.
  • Developed slowly changing dimension Type 2 mappings to maintain the history of the data.
  • Designed and customized data models for a data warehouse supporting data from multiple sources.
  • Created web service sources and targets in the Mapping Designer and published web services.
  • Designed and coded required database structures and components.
  • Loaded data into Snowflake tables from the internal stage using SnowSQL.
  • Created file-watcher jobs to set up dependencies between cloud and PowerCenter jobs.
  • Strong knowledge of RDBMS concepts and the ability to write complex SQL and PL/SQL.
  • Heavily involved in testing Snowflake to identify the best ways to use cloud resources.
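The Slowly Changing Dimension Type 2 mappings mentioned above follow a standard pattern: when a tracked attribute changes, expire the current dimension row and insert a new current row, so full history is preserved. A minimal sketch of that logic, with dimension rows as plain dicts and illustrative names (not any employer's actual model):

```python
from datetime import date

def scd2_apply(dim_rows, business_key, new_attrs, today):
    """Apply one incoming record to an SCD Type 2 dimension."""
    # Find the current (open-ended) row for this business key, if any.
    current = next((r for r in dim_rows
                    if r["key"] == business_key and r["end_date"] is None), None)
    # No attribute change: leave the current row open, insert nothing.
    if current and all(current[k] == v for k, v in new_attrs.items()):
        return dim_rows
    # Change detected: close out the old version as of today.
    if current:
        current["end_date"] = today
    # Insert the new current version with an open end date.
    new_row = {"key": business_key, "start_date": today, "end_date": None}
    new_row.update(new_attrs)
    dim_rows.append(new_row)
    return dim_rows

# Demo: a customer moves, producing a second (current) history row.
dim = [{"key": 42, "city": "Austin",
        "start_date": date(2016, 1, 1), "end_date": None}]
dim = scd2_apply(dim, 42, {"city": "Boston"}, date(2017, 5, 1))
```

In PowerCenter this same expire-and-insert logic is expressed with Lookup and Update Strategy transformations rather than Python.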
Children's Place - ETL Cloud Developer
Bakersfield, CA, 02/2016 - 01/2017
  • Developed CloudFormation scripts to build EC2 instances on demand.
  • Configured and maintained monitoring and alerting of production and corporate servers/storage using CloudWatch.
  • Extracted the raw data from Microsoft Dynamics CRM to staging tables using Informatica Cloud.
  • Developed single dynamic ETL mapping to load more than 30 reference tables.
  • Worked with development teams to help engineer scalable, reliable, and resilient software running in the cloud.
  • Developed shell scripts for automation purposes.
  • Maintained and automated the production database to retain history backup tables for 30 days.
  • Deep understanding of Infrastructure as Code and agile methodologies.
  • Experience with cloud-based hosting solutions (AWS EC2/S3, Azure, Google Cloud).
  • Good coding exposure to various AWS services, including CloudFormation templates, CloudWatch, CloudTrail, encryption, logging, and Lambda.
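The 30-day backup-retention automation above boils down to parsing a date out of each backup table's name and flagging anything older than the retention window. A hedged sketch, assuming a hypothetical `TABLE_BKP_YYYYMMDD` naming convention:

```python
from datetime import date, timedelta

RETENTION_DAYS = 30  # retention window from the bullet above

def tables_to_drop(backup_tables, today):
    """Return backup table names older than the retention cutoff."""
    cutoff = today - timedelta(days=RETENTION_DAYS)
    drop = []
    for name in backup_tables:
        stamp = name.rsplit("_", 1)[-1]  # trailing YYYYMMDD suffix (assumed)
        backed_up = date(int(stamp[:4]), int(stamp[4:6]), int(stamp[6:]))
        if backed_up < cutoff:
            drop.append(name)
    return drop

# Demo: with "today" fixed, only tables older than 30 days are flagged.
today = date(2016, 12, 1)
old = tables_to_drop(
    ["ORDERS_BKP_20161130", "ORDERS_BKP_20161020", "ORDERS_BKP_20160901"],
    today)
```

The returned names would then feed DROP TABLE statements executed by a scheduled shell script.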
Anthem - ETL Developer
City, STATE, 08/2015 - 12/2015
  • Worked on dimensional modeling to design and develop star schemas by identifying facts and dimensions.
  • Designed logical models as per business requirements using Erwin.
  • Designed and developed ETL mappings using transformation logic for extracting data from various source systems.
  • Involved in performance tuning and optimization of Informatica mappings and sessions, using features like partitions and data/index caches to manage very large volumes of data.
  • Created complex mappings in the PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML Source Qualifier, and Stored Procedure transformations.
  • Used Informatica debugging techniques to debug mappings, and used session log files and bad files to trace errors that occurred while loading.
  • Designed and developed complex mappings, such as Slowly Changing Dimension Type 2, in the Mapping Designer to maintain a full history of transactions.
  • Developed and invoked PL/SQL stored procedures and functions for data processes used in the Informatica mappings.
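A star schema of the kind described above centers a fact table on foreign keys into dimension tables, and queries aggregate facts sliced by dimension attributes. A minimal sketch using sqlite3; the member/date/claims names are invented for illustration, not the actual warehouse model:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# One fact table keyed to two dimensions: the classic star layout.
conn.executescript("""
CREATE TABLE dim_member (member_key INTEGER PRIMARY KEY,
                         member_name TEXT, state TEXT);
CREATE TABLE dim_date   (date_key INTEGER PRIMARY KEY,
                         year INTEGER, month INTEGER);
CREATE TABLE fact_claims (
    member_key   INTEGER REFERENCES dim_member(member_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    claim_amount REAL
);
""")
conn.execute("INSERT INTO dim_member VALUES (1, 'A. Smith', 'VA')")
conn.execute("INSERT INTO dim_date VALUES (20151001, 2015, 10)")
conn.executemany("INSERT INTO fact_claims VALUES (?, ?, ?)",
                 [(1, 20151001, 120.0), (1, 20151001, 80.0)])

# Typical star-schema query: aggregate the fact measure,
# grouped by a dimension attribute reached via the surrogate key.
total_by_state = conn.execute("""
    SELECT m.state, SUM(f.claim_amount)
    FROM fact_claims f
    JOIN dim_member m ON f.member_key = m.member_key
    GROUP BY m.state
""").fetchall()
```

Keeping measures in the fact table and descriptive attributes in dimensions is what lets one join path serve many such slice-and-aggregate reports.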
