Jessica Claire
  • Montgomery Street, San Francisco, CA 94105
  • H: (555) 432-1000
  • resumesample@example.com
Summary
ETL and Big Data Application Developer with 6+ years of experience in analyzing, designing, developing, testing and deploying EDW applications and integrating them with various databases and data sources as per requirements.
  • Strong knowledge of Data Warehousing principles using fact & dimension tables and star & snowflake schema modeling
  • Good exposure to executing multiple end-to-end Data Warehouse development projects
  • Knowledge of various ETL and Data Integration development tools like Ab Initio, Informatica, Talend and SSIS, and Data Warehousing using Teradata & SQL Server
  • Worked extensively with Dimensional Modeling, Data Migration, Data Cleansing and ETL processes for warehouses
  • Proficient in designing and developing ETL strategies and mappings from source systems to target systems
  • Good understanding of all phases of the Software Development Life Cycle (SDLC), including requirements gathering, specification documentation, design, construction, testing and maintenance
  • Experience in troubleshooting jobs and addressing production issues such as data issues, environment issues, performance tuning and enhancements
  • Extensive experience in Unit Testing, Functional Testing, System Testing, Integration Testing, Regression Testing, User Acceptance Testing (UAT) and Performance Testing
  • Strong communication, organizational, interpersonal, problem-solving and analytical skills
  • Proactive and hardworking, with the ability to meet tight schedules
  • Experience in scheduling sequence and parallel jobs using Unix scripts and scheduling tools like Tivoli Workload Scheduler and CA7 Autosys
  • Strong knowledge of Big Data/Hadoop components like HDFS, MapReduce, YARN, Sqoop, Hive, Impala and Oozie
Accomplishments
  • Reduced the runtime of an existing Ab Initio graph from 90 minutes to 3 minutes, significantly increasing performance.
  • Tuned an existing SQL Server stored procedure, improving the performance of generating the lineage of a given application from source to target and from target to source.
  • Selected as a member of the review board of the Enterprise Data Warehouse and Data Architecture team to set and review enterprise coding standards.
  • Led a 4-member offshore team and a 3-member onsite team.
  • Awarded the "TCS-GEMS" On The Spot award.
  • Tuned Teradata BTEQ scripts to reduce heavy CPU consumption and increase performance.
  • Conducted WebEx sessions on Ab Initio, Teradata, Linux and Data Warehouse architecture for TCS associates.
  • Appreciated by customers for bug-free and on-time deliverables.
Experience
Senior Big Data Engineer, 08/2016 - Present
Nike Inc, Dumont, CO, United States
Senior Data Engineer, 08/2015 - 07/2016
Kroll, Grand Prairie, TX, United States
  • Architect, design and develop data reporting structures in the enterprise data warehouse and data lake.
  • Support the reporting requirements from business customers across all the Supply Chain initiatives.
  • Orchestrate the analysis, requirements, development, and design of business intelligence solutions to enable data driven decision making for internal customers.
  • Lead report development planning and execution, including team meetings, team communication, task assignment / optimization, and escalation of resource gaps.
  • Align solution designs with overall architectures and strategic technologies.
  • Help define team development best practices and process improvements.
  • Work with cross-functional groups to coordinate design, development, and deployment activities.
Skills:
  • Ab Initio, Talend and JBO (Java Batch Orchestration) - ETL mapping and transformations.
  • Teradata - Slowly Changing Dimensions, Complex BTEQs, Fast Export, Fast Load, TPT and Multiload jobs.
  • Tivoli Workload Scheduler - Scheduling, monitoring batch jobs.
  • Linux Shell Scripting - Batch processing, Job automation.
  • SQL Server & Visual Studio - Stored Procedures, SSIS packages, Business Intelligence.
  • Big Data - Hortonworks HDFS, Hive, Sqoop, Pig, Flume and Oozie.
  • Google Cloud Platform - Google Cloud Storage, BigQuery, Bigtable, Cloud SQL, Pub/Sub.
Lead SQL Data Integration and Hadoop Developer
Roles and Responsibilities:
  • Analyzed the systems, met with end users and business units in order to define the requirements.
  • Involved in the requirement definition and analysis in support of Data Warehousing efforts.
  • Developed ETL mappings, transformations using Informatica PowerCenter.
  • Developed data mappings between source systems and warehouse components using Mapping Designer.
  • Worked extensively on different transformations like source qualifier, expressions, filter, aggregator, router, update strategy, lookup, normalizer, stored procedures, mapping variables and sequence generator.
  • Tuned ETL procedures for performance and load optimization.
  • Set up batches and sessions to schedule jobs at the required frequency using PowerCenter Workflow Manager.
  • Involved in User Acceptance Testing and provided technical guidance to business users.
Skills:
  • Informatica PowerCenter - ETL mapping and transformations.
Lead ETL and Hadoop Developer, 02/2013 - 07/2015
Clearesult, Inc. RI, State, United States
Roles and Responsibilities:
  • Analyzed the systems and met with end users and business teams to define the requirements.
  • Created high level and low level specification documents.
  • Participated in daily Scrum, Sprint Planning and Retrospective meetings.
  • Performed code reviews as part of the review board meetings.
  • Production deployment and warranty support.
  • Led onsite and offshore teams.
Skills:
  • Ab Initio - ETL mapping and complex transformations.
  • Process data from heterogeneous databases.
  • Performance tuning.
  • Teradata - Slowly Changing Dimensions, Complex BTEQs, Fast Export, Fast Load, TPT and Multiload jobs.
  • Talend - ETL mapping, Sqoop, Hive jobs.
  • Tivoli Workload Scheduler - Scheduling, monitoring batch jobs.
  • Linux Shell Scripting - Batch processing, Job automation.
Senior ETL Developer, 01/2011 - 01/2013
Tata Consultancy Services City, STATE, India
Roles and Responsibilities:
  • Requirement analysis and mapping document creation.
  • ETL job development - Ab Initio graphs and Teradata BTEQs.
  • Conducted unit testing, system testing, performance testing and user acceptance testing.
  • Production deployment and warranty support.
  • Led the offshore team and coordinated with the onsite team.
Skills:
  • Ab Initio - ETL mapping and complex transformations.
  • Process data from heterogeneous databases.
  • Performance tuning.
  • Teradata - Slowly Changing Dimensions, Complex BTEQs, Fast Export, Fast Load, TPT and Multiload jobs.
  • Tivoli Workload Scheduler - Scheduling, monitoring batch jobs.
  • Linux Shell Scripting - Batch processing, Job automation.
  • Informatica Client Tools - Source Analyzer, Warehouse Designer, Mapplet Designer, Transformation Developer, Repository Manager and Workflow Manager.
  • SQL Server & Visual Studio - Stored Procedures, SSIS packages, Business Intelligence.
  • Autosys - Scheduling Informatica and SSIS jobs.
  • Big Data - Cloudera HDFS, Hive, Sqoop and Oozie.
Education and Training
Master of Computer Applications, Expected in 2010
Jawaharlal Nehru Technological University
Bachelor of Science: Computers, Expected in 2007
Sri Venkateswara University
Skills
Architecture, automation, Big Data, Business Intelligence, Data Integration, databases, Data Warehousing, decision making, Slowly Changing Dimensions, ETL, Fast Export/Fast Load, functional testing, Ab Initio graphs, Informatica, Java, team development, Linux, enterprise reporting, optimization, requirements analysis, router transformations, scheduling, Scrum, shell scripting, specification documentation, SQL, SQL Server, update strategy, strategic technologies, Supply Chain, Teradata, Tivoli Workload Scheduler, Visual Studio, Workflow Manager
