Jessica Claire
  • Montgomery Street, San Francisco, CA 94105
  • H: (555) 432-1000
  • resumesample@example.com
  • India
  • Single
Professional Summary

Dynamic ETL Developer/Data Engineer/Data Analyst with 9 years of experience helping companies with diverse data transitions, including sensitive financial data and massive big data installations. Strong knowledge of big data analytics; promotes extensive simulation and testing to ensure smooth ETL execution. Known for providing quick, effective solutions to automate and optimize data management tasks. Talented BI professional with an advanced understanding of successful strategies for identifying, protecting and capitalizing on data assets. Able to construct new systems to answer complex business needs.

Skills
  • Data Warehousing Concepts
  • DataStage/Informatica/Teradata - ETL
  • UNIX Shell Scripting
  • SQL and PL/SQL (Oracle/SQL Server/DB2/Teradata)
  • Unit/System/UAT/Post Production testing - ETL
  • ETL Automation (Control M/AutoSys/Tivoli)
  • Big Data Analysis (PySpark/pandas/NumPy)
  • Hadoop - HDFS, Spark, Hive, Sqoop, Python and Kafka (real-time reads)
Experience Summary
ETL Developer - Data Engineer - Data Analyst, 08/2016 - 09/2019
UnitedHealth Group City, STATE,

GALAXY is the primary data warehouse of UHG and one of the most comprehensive in the entire health care industry. It follows a dimensional schema, with data and tables divided among different subject areas. The primary objective of this application is to provide a single, consolidated environment for processing claims and health service data sourced both internally and externally, making it available at a central location accessible by any entity within UnitedHealth Group with the appropriate authority.
Responsibilities:
• Extracted data from multiple data sources using various ETL tools.
• Involved in designing ETL jobs to apply transformation logic.
• Worked on complex mappings in DataStage.
• Used the FastLoad and MultiLoad utilities to load data into staging and target Teradata tables, respectively.
• Developed an automated script, as an innovation, to dynamically pick the suitable load utility based on the volume of data.
• Worked on the TPT LOAD and TPT STREAM utilities.
• Used Teradata export utilities for reporting purposes.
• Developed TWS jobs to schedule the process based on the frequency of the sources.
• Created dependencies on the source system using the hard-dependency concept rather than file watchers.
• Involved in purging data based on the retention period; developed an automated process for data purge and archival.
• Recently integrated with the Hadoop data lake, using Hive as a structured data layer on top of HDFS.
• Successfully implemented Spark as the processing backbone for Hive queries, and wrote Python functions to refine unstructured, poor-quality data so the ETL layer can consume it.
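A minimal sketch of the kind of Python refinement step described in the last bullet, written with pandas (one of the listed skills). The column names and cleansing rules are hypothetical illustrations, not details from the actual project.

```python
import pandas as pd

# Hypothetical refinement step: normalize poor-quality source records so a
# downstream ETL layer can consume them. Column names and rules are
# illustrative assumptions, not taken from the real GALAXY pipeline.
def refine(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    out["claim_id"] = out["claim_id"].astype(str).str.strip()
    out["amount"] = pd.to_numeric(out["amount"], errors="coerce")  # bad values -> NaN
    out = out.dropna(subset=["amount"])            # drop rows the ETL layer cannot price
    out["state"] = out["state"].str.upper().fillna("UNKNOWN")
    return out

raw = pd.DataFrame({
    "claim_id": [" C1 ", "C2", "C3"],
    "amount": ["100.5", "bad", "42"],
    "state": ["ca", None, "ny"],
})
clean = refine(raw)  # keeps C1 and C3; C2's unparseable amount is dropped
```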

Senior Software Engineer - ETL Developer, 07/2010 - 06/2013
UnitedHealth Group City, STATE,

UGAP is a national consolidated data warehouse for UnitedHealth Group. It contains data from multiple claim processing systems such as UNET, COSMOS, MAMSI and GALAXY, sourced from flat files and a DB2 database. Extracted data is transformed by applying business logic and loaded into the UGAP data mart, which is on Teradata. The final warehouse captures the complete details of an individual claim, including provider, member and pharmacy details.

Responsibilities:

  • Worked as an onsite/offshore coordinator and lead developer.
  • Involved in BRD gathering and in translating BRDs into functional specifications so that developers could understand and design the ETL components.
  • Involved in data modeling, from simple data marts to a complex data warehouse.
  • Involved in migrating legacy DW applications to the Spark environment.
  • As an SME, proposed the best solutions for translating complex business logic into technical designs.
  • Designed Teradata BTEQ, MultiLoad, FastLoad and TPT scripts according to the volume of data.
  • Designed and developed ETL components using IBM DataStage.
  • Involved in capacity planning, data profiling and data-quality check design.
  • Involved in performance tuning of ETL components (IBM DataStage, all versions) and Teradata components.
  • Understood the technical differences between Teradata/traditional ETL technologies and Spark technologies.
  • Gave demonstrations to customers, convinced them to migrate to advanced open technologies such as PySpark ETL pipelines, and trained them on the technical differences.
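The volume-based choice among Teradata load scripts mentioned above can be sketched as a small dispatcher. The row-count thresholds and the mapping to utilities are hypothetical assumptions for illustration, not the project's actual rules.

```python
# Illustrative sketch (not the production script): choose a Teradata load
# utility from the data volume. The 100k-row threshold and the mapping of
# volumes to utilities are assumptions made for this example.
def pick_load_utility(row_count: int) -> str:
    """Return a suitable Teradata utility name for the given volume.

    FastLoad suits bulk loads into empty staging tables; MultiLoad suits
    smaller, incremental loads into populated tables.
    """
    if row_count >= 100_000:
        return "FASTLOAD"   # bulk load into an empty staging table
    elif row_count > 0:
        return "MULTILOAD"  # incremental load into a populated table
    return "SKIP"           # nothing to load

print(pick_load_utility(250_000))  # FASTLOAD
print(pick_load_utility(500))      # MULTILOAD
```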
Software Engineer - ETL Developer, 07/2007 - 06/2010
HSBC Software Development Private Limited City, STATE, India

Worked on the Group Economic Capital project under the Basel Accords (calculating the risk capital required for the HSBC group) and on the Global Business Market project (creating a simple data mart for the global group's decision-making reports).

Responsibilities:

  • Responsible for requirement analysis, creating low-level design specifications for ETL jobs, and developing the jobs.
  • Extracted monthly financial data feeds and applied the required defaulting before processing the data with the required aggregations and business rules.
  • Validated data after applying the required business transformations.
  • Validated extracted and staging-area data before loading into the data mart for the RiskFrontier application.
  • Masked production data when dealing with third-party vendors, following firm guidelines on customer privacy.
  • Automated the production run with the Control-M scheduler.
  • Ran all ETL jobs through shell scripting.
  • Generated the required QA jobs within the ETL workflows and sent the validated values to the business for immediate confirmation within the required SLA.
  • Handled unit testing through post-production testing, documenting test cases and managing testing and defect control with HP Quality Center.
  • Maintained version control of the code in all environments.
  • Led all project-related SCM activities.
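One common way to mask production data for third-party vendors, as described above, is deterministic hashing of identifiers so masked files still join consistently. This is a minimal sketch under assumed field names and policy; the salt handling and token length are illustrative, not the firm's actual guidelines.

```python
import hashlib

# Illustrative masking sketch: deterministically hash sensitive identifiers
# before sharing data with third-party vendors. Field names, salt handling,
# and token length are hypothetical assumptions for this example.
SALT = "rotate-me-per-release"  # in practice, a managed secret, not a literal

def mask_value(value: str) -> str:
    digest = hashlib.sha256((SALT + value).encode("utf-8")).hexdigest()
    return digest[:12]  # short, stable token; same input -> same token, so joins survive

def mask_record(record: dict, sensitive: set) -> dict:
    return {k: mask_value(v) if k in sensitive else v
            for k, v in record.items()}

masked = mask_record(
    {"member_id": "M12345", "state": "CA"},
    sensitive={"member_id"},
)
# masked["member_id"] is a 12-character token; masked["state"] is untouched
```

Determinism is the design point here: the same member ID always masks to the same token, so referential integrity across extracted files is preserved without exposing the real identifier.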
Education
Master of Computer Applications
Andhra University
Activities and Honors
  • Best Performance Award
  • Best Team Award
  • Spot Award
  • Appreciations for CSR activities
