Jessica Claire
  • Montgomery Street, San Francisco, CA 94105
  • 609 Johnson Ave., 49204, Tulsa, OK
  • Home: (555) 432-1000
  • Cell:
  • resumesample@example.com
Summary
Professionally prepared and focused IT developer with around four years of experience providing ETL/BI and data warehouse solutions, seeking a full-time opportunity where my theoretical and practical experience can be put to the test while I continue to improve my skills and positively impact the organization I work for.
Technical Skills
  • Ab Initio (Co>Op, GDE, EME)
  • Talend Data Integration
  • Talend Big Data Integration
  • Hadoop
  • UNIX and shell scripting
  • SQL, PL/SQL
  • Teradata, Oracle, MySQL
  • Windows, Linux
  • XML, HTML
  • Tivoli Workload Scheduler
  • Tableau
  • Proficient in ETL design using Ab Initio
  • Agile Development Methodologies
  • Requirement Analysis and Design Phases
  • Data warehouse
  • Data Modelling Concepts
  • Analytical
  • Self-starter
  • Team player
Experience
Ab Initio/Talend ETL Consultant, 10/2013 to Current
Alight, WA
Client: Discover Financial Services
  • Worked on several Ab Initio ETL assignments to extract, transform, and load data into tables as part of data warehouse development with highly complex data models spanning relational, star, and snowflake schemas.
  • Experience in all phases of the Software Development Life Cycle (SDLC).
  • Expertise in designing and developing Ab Initio generic graphs based on business requirements using components such as Partition by Key, Partition by Round Robin, Reformat, Rollup, Join, Scan, Normalize, Gather, and Merge.
  • Experience working with advanced metadata techniques such as Conditional DML and PDL (Parameter Definition Language).
  • Well versed in implementing Ab Initio graphs using data, component, and pipeline parallelism and Multi File System (MFS) techniques.
  • Created applications to backfill large volumes of data (3+ years) while maintaining Type II SCD history; a sketch of this pattern follows this list.
  • Enhanced ETL job performance by using lookups in place of joins and by applying data parallelism techniques.
  • Extensively used AIR commands to check in objects from the Dev to the Prod EME and to perform dependency analysis on Ab Initio objects.
  • Experience creating UNIX shell scripts to run Ab Initio and database jobs; good experience working with very large databases.
  • Converted critical Teradata stored procedures to Ab Initio ETL jobs.
  • Experience using Teradata load/unload utilities such as API, FastExport, FastLoad, MultiLoad, and TPT from Ab Initio to load data into the Teradata database.
  • Created generic ETL jobs to extract data from legacy systems to stage before cleansing and loading it into data warehouses.
  • Experience with the Talend Big Data Integration suite for design and development of ETL code and mappings for enterprise DWH ETL projects.
  • Created mapping documents based on requirements and conducted review meetings with business and architecture teams to finalize them.
  • Experience developing jobs for data ingestion from different sources such as Teradata and Oracle.
  • Extensive experience designing and developing complex mappings with varied transformation logic, using components such as tMap, tJoin, tReplicate, tParallelize, tSendMail, tDie, tUnique, tFlowToIterate, tSort, and tFilterRow in Talend jobs.
  • Developed complex Talend ETL jobs to load data from files to HDFS, HDFS to Hive, and Hive to Teradata tables.
  • Automated batch jobs using scheduling tools such as Tivoli Workload Scheduler.
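
A minimal Teradata SQL sketch of the Type II SCD maintenance pattern referenced above; the table and column names (dw.customer_dim, stg.customer_stg, eff_start_dt, eff_end_dt, curr_ind) are illustrative assumptions, not the client's actual schema:

    -- Step 1: expire the current dimension row when the staged record differs.
    -- (All object and column names below are hypothetical placeholders.)
    UPDATE dw.customer_dim
    SET eff_end_dt = CURRENT_DATE - 1,
        curr_ind   = 'N'
    WHERE curr_ind = 'Y'
      AND EXISTS (SELECT 1
                  FROM stg.customer_stg s
                  WHERE s.customer_id = dw.customer_dim.customer_id
                    AND s.addr <> dw.customer_dim.addr);

    -- Step 2: insert a new "current" version for new or changed customers.
    INSERT INTO dw.customer_dim
        (customer_id, addr, eff_start_dt, eff_end_dt, curr_ind)
    SELECT s.customer_id, s.addr, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM stg.customer_stg s
    LEFT JOIN dw.customer_dim d
           ON d.customer_id = s.customer_id
          AND d.curr_ind = 'Y'
    WHERE d.customer_id IS NULL        -- new key, or its current row was just expired
       OR s.addr <> d.addr;            -- changed attribute

For a multi-year backfill, the same expire/insert pair would typically be run once per business date, ordered oldest to newest, so history accumulates in the correct sequence.
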
SQL developer, 05/2010 to 06/2011
Hinduja Global Solutions, City, STATE

  • In-depth knowledge of data analysis, data warehousing, data marts, and ETL (Extract, Transform, and Load) techniques.
  • Worked extensively on data extraction, transformation, and loading from source to target systems using Teradata BTEQ; see the BTEQ sketch following this list.
  • Wrote scripts in Teradata SQL as per requirements.
  • Good knowledge of data warehouse concepts such as star schema, snowflake schema, and dimension and fact tables.
  • Knowledge of UNIX shell scripting.
  • Worked extensively with joins, subqueries, and set operations.
  • Fixed data quality issues.
  • Involved in unit testing and preparing test cases.
  • Responsible for enhancing existing mappings and creating new mappings.
  • Experience with Scrum (Lean development) methodologies.
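
A minimal BTEQ-style sketch (Teradata SQL wrapped in BTEQ dot commands) of the extract/transform/load scripting described above, illustrating a join, a subquery, and a set operation; the connection details and object names such as stg.daily_sales and dw.sales_fact are hypothetical placeholders:

    .LOGON tdpid/etl_user,password

    -- Join staged sales to a dimension and load yesterday's rows into the fact table.
    INSERT INTO dw.sales_fact (store_key, sale_dt, amount)
    SELECT d.store_key, s.sale_dt, s.amount
    FROM stg.daily_sales s
    JOIN dw.store_dim d
      ON d.store_id = s.store_id
    WHERE s.sale_dt = CURRENT_DATE - 1
      AND s.store_id IN (SELECT store_id FROM dw.active_stores);   -- subquery filter

    -- Set operation: staged orders not yet present in the warehouse.
    SELECT order_id FROM stg.daily_orders
    MINUS
    SELECT order_id FROM dw.order_fact;

    .IF ERRORCODE <> 0 THEN .QUIT 8
    .LOGOFF
    .QUIT 0

The .IF ERRORCODE check returns a non-zero exit code on failure, which lets the calling shell script or scheduler detect a failed step.
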
Education
Master's in Electrical Engineering, Aug 2011 – May 2013
Wright State University, College of Engineering and Computer Science, Dayton, OH

Bachelor of Technology in Electronics and Communications Engineering, Sep 2006 – May 2010
Jawaharlal Nehru Technological University, Andhra Pradesh, India
