Jessica Claire
100 Montgomery St., 10th Floor
Home: (555) 432-1000 - resumesample@example.com
Education
Expected in 05/2011
Bachelor of Engineering: Computer Science & Information Technology
Sri Krishnadevaraya University - Kurnool, Andhra Pradesh
Skills
  • ETL Tools: DataStage, Informatica PowerCenter, and SSIS packages
  • Big Data Technologies: Hive, Spark, HDFS, Kafka, Sqoop
  • Databases: Oracle, SQL Server, DB2, Netezza, MongoDB, Snowflake
  • Programming Languages: UNIX shell, Python, PL/SQL
  • Working experience with Agile and Waterfall models, with tracking in JIRA and Azure DevOps
  • Configuration Tools: PVCS, Microsoft TFS, Azure CI/CD pipelines
  • Cloud Experience: Azure
  • Job Scheduling Tools: CA7, Control-M
  • Operating Systems: Windows XP, 7, 10 and UNIX
  • Adaptability
  • Data management
  • Organization and Time management
  • Teamwork
Certifications
  • IBM DataStage
  • Oracle
  • Informatica
  • Netezza

Professional Summary
  • 9 years of industry experience with ETL tools such as DataStage, Informatica, and SSIS packages.
  • Experience with Databricks, Hive SQL, Azure CI/CD pipelines, Delta Lake, Data Lake, Hadoop file system (HDFS), and Snowflake.
  • Experience building ETL pipelines using Apache Spark and Python (see the sketch after this list).
  • Extensive experience designing, developing, documenting, and testing ETL jobs and mappings (server and parallel jobs) in DataStage/Informatica to populate data warehouse, data mart, and ODS tables and large data sets.
  • Experience in working on streaming data using IBM MQ and Kafka.
  • Experience working with Agile (Scrum) and Waterfall development methodologies.
  • Experience logging tickets in ServiceNow and working with the version control tools PVCS and Azure DevOps CI/CD pipelines.
  • Experience working in Azure cloud environments.
  • Work experience with IBM Master Data Management (MDM) architecture.
  • Experience working with multiple databases, including Oracle, SQL Server, DB2, Netezza, MongoDB (NoSQL), Salesforce, and Snowflake.
  • Experience with data migrations from Oracle 9i to 10g and from DB2 to Netezza.
  • Expertise in using DataStage/Informatica to integrate with sources and targets such as Azure SQL Database, Oracle, mainframe systems, Netezza, Salesforce, SOAP and REST services, XML, SQL Server, and MongoDB.
  • Experience in UNIX, AIX and Linux server resource monitoring and load balancing.
  • Ensured that user requirements were effectively and accurately communicated to other members of the development team, and facilitated communication between business users, developers, and testing teams.
  • Conducted internal and external reviews as well as formal walkthroughs with various teams, and documented the proceedings.
  • Excellent problem-solving and troubleshooting capabilities; a quick learner, highly motivated, results-oriented, and an enthusiastic team player.
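A minimal sketch of the Spark/Python ETL pattern referenced above; the paths, column names, and table layout are illustrative placeholders rather than details taken from the projects described later:

```python
# Minimal PySpark ETL sketch: extract raw files, apply a simple
# transformation, and load the result into a partitioned warehouse location.
# All paths and column names below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: read raw records landed on HDFS/Data Lake storage as Parquet.
raw = spark.read.parquet("hdfs:///landing/policies/")

# Transform: standardize a column name, drop bad rows, derive a load date.
cleaned = (
    raw.withColumnRenamed("POL_NO", "policy_number")
       .filter(F.col("policy_number").isNotNull())
       .withColumn("load_date", F.current_date())
)

# Load: append to a partitioned warehouse path (Parquet or Delta format).
cleaned.write.mode("append").partitionBy("load_date").parquet(
    "hdfs:///warehouse/policies/"
)

spark.stop()
```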
Work History
02/2021 to Current
Senior Data Engineer, Factset Research Systems Inc., San Francisco, CA
  • The Bank Operational Data Distribution Hub is a highly available distribution center for operational data. The servers are set up to provide failover in the event of issues that could cause the hardware to shut down. The design of this system focuses on four main vendors. Data is also loaded into HDFS storage, with Hive built on top of it for analysis (see the sketch after this list).
  • Developed, implemented, supported and maintained data analytics protocols, standards and documentation.
  • Analyzed complex data and identified anomalies, trends and risks to provide useful insights to improve internal controls.
  • Contributed to internal activities for overall process improvements, efficiencies and innovation.
  • Communicated new or updated data requirements to global team.
  • Explained data results clearly and discussed how they could be used to support project objectives.
  • Planned and implemented security measures to safeguard vital business data.
  • Created and implemented database designs and data models.
  • Monitored incoming data analytics requests, executed analytics and efficiently distributed results to support strategies.
  • Built databases and table structures following OLAP/OLTP architecture methodology for web applications.
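A sketch of the Hive-on-HDFS analysis pattern mentioned in the first bullet above, assuming Spark with Hive support; the database, table, and path names are hypothetical:

```python
# Sketch: expose data already landed on HDFS through a Hive external table,
# then analyze it with SQL. Database, table, and path names are placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("hive-on-hdfs-sketch")
    .enableHiveSupport()  # assumes a Hive metastore is configured
    .getOrCreate()
)

spark.sql("CREATE DATABASE IF NOT EXISTS ops_hub")
spark.sql("""
    CREATE EXTERNAL TABLE IF NOT EXISTS ops_hub.transactions (
        txn_id   STRING,
        vendor   STRING,
        amount   DECIMAL(18, 2),
        txn_date DATE
    )
    STORED AS PARQUET
    LOCATION 'hdfs:///data/ops_hub/transactions/'
""")

# Example analysis query over the HDFS-backed table.
spark.sql("""
    SELECT vendor, txn_date, SUM(amount) AS total_amount
    FROM ops_hub.transactions
    GROUP BY vendor, txn_date
""").show()
```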
01/2018 to 01/2021
Lead Data Engineer, Cox Communications Inc., Dayton, OH

Master Data Management (MDM): EQH is the primary vehicle for customer self-service for Life and Annuity products. It displays current policy values, statements, confirmation notices, and prospectuses, and supports profile maintenance including address, phone, and email address changes, financial profiles, and investment strategies. Self-service tools include performance, financial transactions, ACH payments, and loans.

Responsibilities:

  • Led program/project application engineering teams consisting of cross-functional, global, and virtual groups; directly supervised staff, assigned responsibilities to members, and monitored progress of daily activities.
  • Monitored and managed the program/project application engineering baseline to ensure activities occurred as planned across scope, budget, and schedule, and managed variances.
  • Managed performance monitoring and tuning while identifying and repairing issues within the database realm.
  • Proactively identified risks, issues, and problems on programs/projects, leading the engineering and project/program teams to develop risk management and issue management plans; saved the equivalent of one person's monitoring effort through optimization.
  • Clearly articulated problems, proposed options and solutions, and applied judgment in implementing application engineering methodologies, processes, and practices to ensure the security, resilience, maintainability, and quality of MDM solutions.
  • Analyzed and defined detailed MDM processes, tasks, data flows, and dependencies.
  • Developed custom mapping functions.
  • Participated in system and integration testing.
  • Produced database code (SQL, stored procedures, and other database-specific code) meeting technical specifications and business requirements according to established designs (see the sketch after this list).
  • Proactively resolved issues within team.
  • Validated warehouse data structure and accuracy.
  • Cooperated fully with product owners and enterprise architects to understand requirements.
  • Collaborated with multi-functional roles to communicate and align development efforts.
  • Mapped data between source systems and warehouses.
  • Performed systems and data analysis using a variety of computer languages and procedures.
  • Documented data warehouse architecture to guarantee capacity met current and forecasted needs.
  • Developed and modified programs to meet customer requirements.
  • Quickly learned new skills and applied them to daily tasks, improving efficiency and productivity.
  • Provided global thought leadership in analytics solutions to benefit customers
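A minimal sketch of the database-code pattern referenced above, assuming an Oracle target reached through the python-oracledb driver; the connection details, table, and stored procedure names are hypothetical:

```python
# Sketch: run a parameterized query and call a stored procedure from Python.
# Connection details, table, and procedure names are hypothetical.
import datetime

import oracledb  # assumes the python-oracledb driver is installed

conn = oracledb.connect(user="mdm_app", password="change_me",
                        dsn="dbhost/ORCLPDB1")
try:
    with conn.cursor() as cur:
        # Parameterized query against a warehouse table.
        cur.execute(
            "SELECT policy_id, status FROM mdm.policy WHERE load_date = :d",
            d=datetime.date(2021, 1, 31),
        )
        for policy_id, status in cur.fetchmany(10):
            print(policy_id, status)

        # Invoke a stored procedure that refreshes a derived summary table.
        cur.callproc("mdm_pkg.refresh_policy_summary",
                     [datetime.date(2021, 1, 31)])
    conn.commit()
finally:
    conn.close()
```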
08/2017 to 12/2017
Senior ETL Developer, Epam Systems Inc., Washington, DC

Netezza Re-write: This project migrates around 70 data marts running on Oracle to Netezza in 10 phases. It includes redesigning DataStage jobs that integrate with Oracle so that they target Netezza, as well as migrating Informatica jobs to DataStage.

Role & Responsibilities:

  • Requirement Analysis, Creating mappings, Unit Testing, Defect Fixing, Documentation and Status Reporting.
  • Identified entities and cardinality and developed the logical and physical data models.
  • Mentored newly hired employees, offering insight into job duties and company policies for an easier transition into the position.
  • Prioritized and organized tasks to efficiently accomplish service goals.
  • Analyzed existing processes and scripts and prepared design documents with a performance-optimized approach.
  • Performed impact analysis on all source and target tables.
  • Responsible for estimation of Design, Development and Unit testing.
  • Analyzed the dependent objects and data involved and updated the unit testing approach for efficiency.
  • Responsible for scheduling changes, including moving nodes to the new 11.5 server, changing run-time parameters, changing predecessor or successor requirements, and removing jobs.
  • Published a playbook or implementation plan for every release.
  • Responsible for driving implementation and performing post-implementation data checks.
  • Resolved complex DataStage performance issues and other environment issues.
  • Designed and developed reusable components that parse .dsx exports and extract the input and output SQL used in DataStage code (see the sketch after this list).
  • Reviewed Netezza Deliverables and DataStage deliverables in every phase of project.
  • Resolved Netezza SQL issues for business users.
  • Coordinated with downstream systems and worked on impacted-system sign-off, review, and production preparation.
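A rough sketch of the kind of reusable .dsx parsing component described above. It treats the DataStage .dsx export as plain text with SQL embedded in quoted property values, which is an assumption; the pattern and file handling are illustrative only:

```python
# Rough sketch: scan a DataStage .dsx export for embedded SQL statements.
# The export is assumed to be plain text with SQL held in quoted property
# values; the regular expression below is an illustrative approximation.
import re
import sys

SQL_PATTERN = re.compile(
    r'"((?:SELECT|INSERT|UPDATE|DELETE)\b[^"]*)"',
    re.IGNORECASE | re.DOTALL,
)

def extract_sql(dsx_path):
    """Return the SQL statements found in a .dsx export file."""
    with open(dsx_path, encoding="utf-8", errors="replace") as fh:
        content = fh.read()
    # Collapse escaped line breaks that may appear inside property values.
    content = content.replace("\\n", " ")
    return [match.group(1).strip() for match in SQL_PATTERN.finditer(content)]

if __name__ == "__main__":
    for statement in extract_sql(sys.argv[1]):
        print(statement, end="\n\n")
```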
01/2016 to 07/2017
Senior ETL Developer, Epam Systems Inc., PA

Financial Move Forward (Inforce Data): The Equitable Enterprise Data Warehouse (EDW) manages a collection of components in both Mainframe (Information Database (IDB)) and Distributed (Open System Data Warehouse (OSDW)) environments, including batch processes, operational data stores, business intelligence (BI) data marts, and general data services for all IT lines of business.

  • The FMF-Inforce Data project consists of two phases.
  • The first phase involves migrating DB2 data to Netezza using the integrated ETL tool (DataStage 8.7).
  • The second phase covers data modeling and data mart design for the migrated tables in Netezza.
  • The first phase essentially involves an initial data load (IDL) of 400 DB2 tables into Netezza (see the sketch after this list).
  • Installed the Netezza client on the UNIX box where the DataStage client resides and ensured connectivity for designing DataStage jobs.

Role & Responsibilities:

  • Understood requirements and produced the high-level design.
  • Created the low-level design of the mapping document.
  • Created mappings and transformations as per business requirements.
  • Wrote reusable mapplets and Oracle PL/SQL stored procedures.
  • Unit tested jobs according to test plans.
  • Monitored, debugged, and scheduled mappings according to requirements.
  • Improved mapping execution performance, reducing CPU usage, execution cost, and run time.
  • Provided system testing and user testing support and IQA of mappings.
  • Assisted a team of developers, both onshore and offshore, in providing the strategic plan for executing this project.
  • Worked with key team members to ensure the solution met business requirements.
  • Provided a proof of concept (POC) for the technical approach to designing DataStage jobs.
  • Developed technical design strategies with the project team and business users.
  • Contributed to detailed estimation of development work.
  • Involved in estimating the DBA effort for this project.
  • Designed field-level mapping templates based on business rules, transformations, and validations.
  • Performed problem assessment, resolution and documentation for new and existing database objects.
  • Prepared knowledge transition documents that were appreciated by business IT staff.
  • Communicated with data architects, programmers and engineers to keep projects on track.
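A generic sketch of a table-by-table initial data load from DB2 to Netezza over ODBC. The actual project used DataStage 8.7 jobs rather than hand-written scripts; the DSN names and table list here are hypothetical:

```python
# Generic sketch: copy rows table-by-table from DB2 to Netezza through ODBC.
# DSN names and the table list are placeholders; the real load was built as
# DataStage 8.7 jobs rather than Python scripts.
import pyodbc

TABLES = ["POLICY", "COVERAGE", "CLIENT"]  # illustrative subset of ~400 tables
BATCH_SIZE = 10_000

src = pyodbc.connect("DSN=DB2_SRC")      # hypothetical DB2 ODBC data source
tgt = pyodbc.connect("DSN=NETEZZA_TGT")  # hypothetical Netezza ODBC data source
tgt.autocommit = False

for table in TABLES:
    src_cur = src.cursor()
    tgt_cur = tgt.cursor()
    src_cur.execute(f"SELECT * FROM EDW.{table}")
    placeholders = ", ".join("?" * len(src_cur.description))
    insert_sql = f"INSERT INTO EDW.{table} VALUES ({placeholders})"

    while True:
        rows = src_cur.fetchmany(BATCH_SIZE)
        if not rows:
            break
        tgt_cur.executemany(insert_sql, rows)  # batched inserts into the target
    tgt.commit()

src.close()
tgt.close()
```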
