
Consultant Resume Example with 5+ Years of Experience

Jessica Claire
  • 609 Johnson Ave., Tulsa, OK 49204
  • Home: (555) 432-1000
  • resumesample@example.com
Summary

ETL Developer with 6+ years of experience and a proven track record of designing and debugging new and existing pipelines using the cloud-based ETL tool Azure Data Factory and on-premise Informatica PowerCenter. Demonstrated ability to gather requirements, assess the data landscape, and collaborate with cross-functional teams to design data warehousing systems that meet business needs. Adept at developing cloud-based solutions for integrating and migrating data from on-premise systems to the cloud.

Technical Skills

Cloud Platforms - Azure Databricks, Azure Data Factory, Azure Synapse, Snowflake, Azure Blob Storage, Azure Data Lake Storage Gen2, Azure Delta Lake, Azure Portal, Dedicated SQL Pools, and Azure Functions

On-Premise Data Warehousing - Oracle & SQL Server

ETL Tool - Informatica PowerCenter

Languages - Structured Query Language (SQL), Unix shell, Python, PySpark

Experience
Consultant, 08/2020 to 03/2023
Rockwell Automation, Inc., Omaha, NE

Role - Azure Data Engineer

Developed a solution providing an alternate method of verifying data file integrity, replacing the more common approach of building customized solutions for each vendor, across Medica Health Insurance data repository servers.

  • Involved in the full lifecycle of projects, including requirement gathering, system design, application development, enhancement, deployment, maintenance, and support.
  • Extracted raw data in the form of CSV, JSON, and Parquet files using DataFrames and the DataFrameReader & DataFrameWriter APIs.
  • Transformed the data using PySpark and Spark SQL notebooks in Azure Databricks, and converted notebook code to use Delta Lake.
  • Mounted Azure Data Lake Storage (ADLS Gen2) to Azure Databricks at separate paths (raw, processed, and presentation layers) so the data can be read efficiently and securely using Azure Key Vault & Databricks secret scopes (see the sketch after this list).
  • Defined pipelines in Azure Data Factory that run the PySpark and Spark SQL notebooks, and scheduled them using triggers.
  • Performed data refreshes and data promotions from staging to production using Azure Data Studio for the CMM & OCCT applications.
  • Developed pipelines and data flows in Azure Data Factory to read .CSV files from the SAS server and to move Azure Synapse database tables back to the SAS server.
  • Created linked services to connect to Azure Blob Storage and Azure Synapse database tables, and used these linked services when creating datasets (an illustrative sketch follows the Environment line below).
  • Reviewed technical specifications and design documents associated with the data warehouse.
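
For illustration only, a minimal sketch of how the mount-read-transform-write flow described above is typically laid out in a Databricks notebook. The storage account, secret scope, secret names, file paths, and column names are all hypothetical placeholders, not the actual implementation.

```python
# Minimal Databricks notebook sketch (dbutils and spark are notebook globals).
# Mount ADLS Gen2 via a Key Vault-backed secret scope, read raw files,
# transform with Spark SQL, and write the result in Delta format.

# Service principal credentials pulled from a hypothetical secret scope "kv-scope".
client_id     = dbutils.secrets.get(scope="kv-scope", key="sp-client-id")
client_secret = dbutils.secrets.get(scope="kv-scope", key="sp-client-secret")
tenant_id     = dbutils.secrets.get(scope="kv-scope", key="sp-tenant-id")

configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": client_id,
    "fs.azure.account.oauth2.client.secret": client_secret,
    "fs.azure.account.oauth2.client.endpoint":
        f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
}

# Mount the raw container of a hypothetical storage account at /mnt/raw.
dbutils.fs.mount(
    source="abfss://raw@examplestorageacct.dfs.core.windows.net/",
    mount_point="/mnt/raw",
    extra_configs=configs,
)

# Read a raw CSV extract with the DataFrameReader API.
raw_df = (spark.read
          .option("header", "true")
          .option("inferSchema", "true")
          .csv("/mnt/raw/claims/claims_2023.csv"))

# Example transformation with Spark SQL (columns are invented for the sketch).
raw_df.createOrReplaceTempView("claims_raw")
clean_df = spark.sql("""
    SELECT claim_id, member_id, CAST(claim_amount AS DECIMAL(18,2)) AS claim_amount
    FROM claims_raw
    WHERE claim_amount IS NOT NULL
""")

# Persist to the processed layer in Delta format.
clean_df.write.format("delta").mode("overwrite").save("/mnt/processed/claims")
```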

Environment - Azure Data Factory (ADF), ADLS Gen2, Databricks, Spark SQL, Python, Azure SQL DB, Synapse, Azure Boards.
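
Linked services such as the ones mentioned above are normally authored through the Data Factory UI; purely as an illustration, the same object can also be created with the azure-mgmt-datafactory Python SDK. Every name, ID, and connection string below is a placeholder.

```python
# Hypothetical sketch: create an Azure Blob Storage linked service in an
# existing Data Factory using the azure-mgmt-datafactory SDK.
from azure.identity import ClientSecretCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureBlobStorageLinkedService,
    LinkedServiceResource,
    SecureString,
)

# Placeholder credentials and subscription; replace with real values.
credential = ClientSecretCredential(
    tenant_id="<tenant-id>",
    client_id="<service-principal-id>",
    client_secret="<service-principal-secret>",
)
adf_client = DataFactoryManagementClient(credential, "<subscription-id>")

# The connection string is wrapped in SecureString so the API does not echo it back.
blob_ls = LinkedServiceResource(
    properties=AzureBlobStorageLinkedService(
        connection_string=SecureString(
            value="DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>"
        )
    )
)

adf_client.linked_services.create_or_update(
    "<resource-group>", "<factory-name>", "LS_AzureBlobStorage", blob_ls
)
```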

Associate Consultant, 04/2019 to 07/2020
National Financial Partners Corp., Paramus, NJ

Role - ETL Developer

Implemented Commercial and Personal Farm, Auto & Property lines of business by replacing the existing policy administration and billing systems with Guidewire PolicyCenter, BillingCenter & ClaimCenter for Wawanesa Mutual Insurance Company.

  • Involved in building the ETL framework and source-to-target mappings to load data into the data warehouse system.
  • Created ETL mappings, mapplets, workflows, and worklets using Informatica PowerCenter 10.x and prepared the corresponding documentation.
  • Designed and maintained complex ETL mappings and performed unit testing on the developed code.
  • Performed root cause analysis (RCA) and delivered solutions for frequent, repetitive errors in the production environment to stabilize the application.
  • Coordinated with the Informatica admin team to promote code from lower environments using Informatica deployment groups.
  • Participated in meetings with cross-functional teams and coordinated bug fixes in the application.
  • Developed test plans, conducted testing, and worked with business partners to perform end-user testing.
  • Provided post-production support.

Environment - Informatica PowerCenter 10.2, Oracle 10g, SQL, Oracle SQL Developer, Unix, PuTTY, Jira.

Senior Associate, 04/2017 to 04/2019
The Kemtah Group, Berkeley, CA

Role - ETL Developer

  • Developed standard and reusable mappings and mapplets using various transformations such as Expression, Aggregator, Joiner, Source Qualifier, Router, Lookup, and Filter.
  • Prepared design documents describing the data flow from source to target for all mappings developed using Informatica.
  • Extensively used transformations such as Connected & Unconnected Lookup, Aggregator, Filter, Router, Expression, Joiner, and Sequence Generator.
  • Prepared SQL queries to validate data in the source and target databases according to business needs (see the validation sketch after this list).
  • Provided weekly status reports to the team lead.
  • Involved in unit testing and prepared unit test cases as per the requirements.
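
One plausible shape for the source-vs-target validation mentioned above, assuming the staging and warehouse schemas live on the same Oracle instance; the credentials, DSN, and table names are invented for the example.

```python
# Hypothetical row-count reconciliation between a staging (source) table and
# its warehouse (target) table; all connection details and names are placeholders.
import cx_Oracle

ROW_COUNT_SQL = """
    SELECT (SELECT COUNT(*) FROM stg.policy_transactions)     AS src_count,
           (SELECT COUNT(*) FROM dw.fact_policy_transactions) AS tgt_count
    FROM dual
"""

with cx_Oracle.connect("etl_user", "etl_password", "oracle-host:1521/ORCLPDB") as conn:
    with conn.cursor() as cur:
        cur.execute(ROW_COUNT_SQL)
        src_count, tgt_count = cur.fetchone()

# Fail loudly if the counts diverge so the load can be investigated.
if src_count != tgt_count:
    raise AssertionError(f"Row count mismatch: source={src_count}, target={tgt_count}")
print(f"Validation passed: {src_count} rows in both source and target")
```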

Environment - Informatica PowerCenter, Oracle 10g, SQL, SQL*Plus, Oracle SQL Developer, Unix, PuTTY.

Education
Bachelor of Technology: Electronics and Communication Engineering, Expected in 05/2016
Jawaharlal Nehru Technological University - India
