*4 years of strong experience performing ETL processes with Informatica Power Center 9.x/8.x.
*Experience in all phases of the Data Warehouse project life cycle including Requirement Gathering, Design, Development and Testing.
*Extensive knowledge of Informatica transformations such as Source Qualifier, Aggregator, Lookup, Rank, Joiner, Filter, Router, Sorter, Sequence Generator, Union, Update Strategy, Stored Procedure, Normalizer, and XML transformations.
*Experience in repository configuration and in creating Informatica mappings, mapplets, sessions, worklets, workflows, and processing tasks using Informatica Designer / Workflow Manager to move data from multiple source systems (Oracle, SQL Server, flat files) to targets.
*Strong understanding of RDBMS concepts, Dimensional Modeling (Star Schema, Snowflake Schema), and Slowly Changing Dimensions.
*Hands-on experience with database management tools and UNIX scripts.
*Experience in maintenance and enhancement of existing systems; experience debugging failed mappings and developing error-handling methods as part of production support.
Informatica Power Center 9.x/8.x (Workflow Manager, Workflow Monitor, Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager).
Database & Related Tools
Oracle 11g/10g, MS SQL Server 2005/2008, DB2, MS Access, Toad for Oracle, Rapid SQL, SQL Developer, SQL Server Management Studio, FileZilla.
SQL, PL/SQL, Unix Shell Scripting, JIL
UNIX; Microsoft Windows 98/NT/2000/2003/XP/Vista/7/10.
MS Office 2007/2010/2013 (Word, Excel, Outlook, PowerPoint, OneNote), Notepad++, Notepad, MS Visio, Adobe Acrobat.
Assisted in writing Oracle procedures, functions, packages, and database triggers, and in troubleshooting PL/SQL and SQL.
Environments: Informatica Power Center 9.x/8.x, (Repository Manager, Mapping Designer, Workflow Manager and Workflow Monitor), MS SQL server 2008, Oracle 11g, Toad for Oracle 12.1, UNIX, MS VISIO, Microsoft Office 2013, NOTEPAD++, Micro Strategy, Tableau, ER Studio, Jira.
Informatica Developer, 09/2016 to Current
Technocraft Solutions LLC – Thermo Fisher Scientific, Carlsbad
The objective of the project is to provide a centralized repository of consolidated dependent and independent revenue forecast data sourced from the Demantra Demand Management system.
This data enabled the organization and its users to be more demand-driven and offered a wide range of decision-making options backed by real-time demand intelligence.
Key responsibilities included:
Learned the company standards and processes around data warehouse system implementation.
Worked with the analyst and the data modeler to derive high-level and detailed data models for revenue forecasting data.
Assisted the analyst in writing the technical requirements for extracting data into files from the source system, Demantra.
Developed ETL design and code for loading the source files into the Stage environment by using Autosys scheduler, UNIX scripts and Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
Translated the source-to-target mapping document and high-level ETL design specification into simple ETL coding and mapping standards.
Created ETL technical mapping documents to outline data flow from sources to targets.
Recommended tweaks and changes to the data model design based on the challenges in implementing ETL code.
Developed ETL code (Informatica, SQL stored procedures) for loading the integration and data mart fact and dimension tables after requirements, source to target mapping and Data model design were finalized.
Worked closely with the testers to debug the defects and issues found in the code.
Built scripts to generate large volumes of test data for the testers.
Worked closely with the migration team to move the code from DEV to QA and then to the PROD environment.
Developed the ETL process to migrate the historical data into the new data model.
Supported other smaller-scale projects; core development tasks included the following:
Developed mapping parameters and variables to support SQL override.
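Mapping parameters of this kind are typically supplied through a parameter file at run time; a minimal sketch, where the folder, workflow, session, and parameter names are all illustrative rather than from an actual project:

```
[REV_FCST.WF:wf_stg_load.ST:s_m_stg_rev_fcst]
$$LOAD_DATE=2016-09-30
$$SRC_FILTER=REGION_CODE = 'NA'
```

A value such as $$SRC_FILTER can then be referenced inside the Source Qualifier's SQL override, so one mapping can serve different slices of the source data without code changes.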
Created mapplets for reuse across multiple mappings.
Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
Worked on different workflow tasks such as Session, Event Raise, Event Wait, Decision, Email, Command, Worklet, Assignment, and Timer, and scheduled the workflows.
Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.
Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.
Extensively used SQL*Loader to load data from flat files into Oracle database tables.
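A SQL*Loader run is driven by a control file; a minimal sketch for a pipe-delimited flat file, where the file, table, and column names are illustrative only:

```
LOAD DATA
INFILE 'rev_fcst.dat'
APPEND
INTO TABLE stg_rev_fcst
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  product_code,
  region_code,
  fcst_qty,
  fcst_date DATE "YYYY-MM-DD"
)
```

It would be invoked along the lines of `sqlldr userid=... control=stg_rev_fcst.ctl log=stg_rev_fcst.log`, with rejects reviewed in the generated bad and log files.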
Modified existing mappings for enhancements of new business requirements.
Used Debugger to test the mappings and fixed the bugs.
Wrote UNIX shell scripts and pmcmd commands to FTP files from remote servers and to back up repositories and folders.
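A wrapper script along these lines can start a workflow with pmcmd; the service, domain, and folder names below are placeholders (not from an actual environment), and DRY_RUN is purely a testing hook:

```shell
#!/bin/sh
# Start an Informatica workflow via pmcmd.  Credentials are read from the
# environment variables named by -uv/-pv rather than passed on the command line.
run_workflow() {
    wf="$1"
    cmd="pmcmd startworkflow -sv ${INFA_SERVICE:-IS_DEV} -d ${INFA_DOMAIN:-Dom_Dev} -uv INFA_USER -pv INFA_PASS -f ${INFA_FOLDER:-REV_FCST} -wait $wf"
    if [ "${DRY_RUN:-0}" = "1" ]; then
        printf '%s\n' "$cmd"   # print instead of run, so the script can be tested without a server
    else
        $cmd
    fi
}
```

With DRY_RUN unset, the real pmcmd binary must be on the PATH; a similar wrapper around pmrep would handle the repository and folder backups mentioned above.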
Involved in Performance tuning tasks for Informatica code and query optimization.
Prepared migration document to move the mappings from development to testing and then to production repositories.
Environments: Informatica Power Center 9.1, Toad for Oracle, SQL Developer, PL/SQL, Oracle 10g/9i, Autosys, SQL Server, UNIX, SQL Server Management Studio, Tableau, WinSCP.
Informatica Developer, 01/2012 to 12/2014
Purestudy Software Services – Deemed University, Pune, India
The objective of Enhanced Telephonic Profiling (ETP) is to integrate new data sources available within various systems to the BI database (data warehouse) to facilitate more universal reporting on the service behaviors of the customers.
PICT – Pune, India
Implemented new data sources available within ASPECT systems in BI to facilitate more holistic reporting on the voice interactions received on the new call center system, ASPECT.
The effort also standardized underlying data sets to broaden reporting access for efficient reporting.
The project also included migrating historical ASPECT data to the new tables designed to house both ASPECT and Invensys data.
Studied and assessed the existing source-to-target mapping documents, the logical and physical data models, and the data definition training documents to get familiar with the system.
Created ETL framework and design document for Stage, Integration and data mart schema after the PDM (Physical Data Model) was ready.
Implemented the ETL process to consume the data files (pipe-delimited, quoted) and load them to the staging schema without any transformations.
Designed and developed Informatica mappings, defined workflows and tasks, monitored sessions, exported and imported mappings and workflows to populate the tables in Data Warehouse (DW) and Data Mart (DM) schema as per the business rules defined in STM.
Created mapping parameters and variables, and used transformations including, but not limited to, Lookup, Joiner, Expression, Source Qualifier, Router, Filter, Aggregator, and Sequence Generator as part of ETL implementations.
Participated in Data Model design sessions with Data modeler, Analysts and provided valuable input impacting the ETL build work.
Implemented Slowly Changing Dimensions (SCD).
Implemented error handling and dependencies while creating the mappings, and wrote Autosys JIL scripts to schedule the workflows.
Created Autosys JIL scripts as file watchers and to invoke Perl scripts after arrival of the source files.
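A file watcher of this sort can be declared in JIL roughly as follows; the job, machine, and path names are invented for illustration:

```
/* File watcher: wait for the inbound extract to land */
insert_job: fw_rev_fcst        job_type: FW
machine: etl_host
watch_file: /data/inbound/rev_fcst.dat
watch_interval: 60

/* Command job: run the handling script once the watcher succeeds */
insert_job: cmd_rev_fcst_load  job_type: CMD
machine: etl_host
command: /apps/etl/bin/process_rev_fcst.sh
condition: s(fw_rev_fcst)
```

The success condition `s(fw_rev_fcst)` chains the command job to the watcher, so downstream processing starts only after the file arrives.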
Used UNIX scripts to decrypt, copy, archive, delete, and validate the files and to call the Informatica workflows.
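The copy, archive, and validate steps can be sketched with standard UNIX tools; the paths and file names here are illustrative only:

```shell
#!/bin/sh
# Minimal helpers for the file-handling steps described above.

# Succeed only if the file exists and is non-empty.
validate_file() {
    [ -s "$1" ] || { echo "ERROR: missing or empty file: $1" >&2; return 1; }
}

# Copy a processed file into an archive directory with a timestamp suffix,
# then remove the original from the landing area.
archive_file() {
    src="$1"; dest="$2"
    mkdir -p "$dest" || return 1
    cp "$src" "$dest/$(basename "$src").$(date +%Y%m%d%H%M%S)" && rm -f "$src"
}
```

In the real scripts, decryption (e.g. a gpg call) would run before validation, and the Informatica workflow would be kicked off after a successful archive.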
Executed complex SQL queries for unit testing, data analysis and data profiling.
Bachelor of Engineering: Information Technology
CA Information Technology