In-depth understanding of all phases of the software development life cycle (SDLC), from requirements gathering through implementation. Excellent communication, coordination, and leadership skills. Strong troubleshooting, debugging, and problem-solving skills. Ability to effectively lead and motivate teams. Process-oriented technology professional with database design, development, and data analysis skills on various platforms. Strong knowledge of RDBMS, MPP, OOP, procedural, analytical, Spark, Hadoop, and MapReduce concepts, and strong experience in the design and development of integrated software systems. Able to collaborate with multi-disciplinary (functional and technical) teams across the organization to deliver projects on time while ensuring data quality and integrity.
Data Warehouse Architecture
Amazon Web Services (AWS)
Vertica DB development and administration
Amazon Redshift
Cloud Computing
Strategic Planning
Program Management
Management
Networking
Shell Scripting
Linux System Administration
Python
Microsoft SQL Server
SDLC
SQL
SSIS and SSAS
Tableau
Business Analysis
Continuous Integration
Informatica and Administration
Talend Real-Time Big Data
Data Analysis
Jasper Reports
Relational Data Modeling
Business Intelligence Architect, Nov 2014 to Nov 2016, Eliza Corporation, Danvers, MA
Responsible for building data marts from the data warehouse in Vertica to serve fast, efficient reporting and analytics needs.
Designed the ETL (Extract, Transform, and Load) architecture to move data from Redshift to Vertica incrementally, reconciling mismatched months, within an allocated time window.
Designed projections in Vertica primarily for read performance while also accounting for daily load times.
Designed projections using partitions on date-time fields for faster query response time as well as ease of loading, maintaining, and purging data.
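A sketch of this partitioned design (table and column names here are hypothetical, not taken from the actual project):

```sql
-- Hypothetical fact table, partitioned by day so that individual days
-- can be loaded, maintained, and purged independently
CREATE TABLE fact_events (
    member_id INT,
    event_ts  TIMESTAMP NOT NULL,
    metric    FLOAT
)
PARTITION BY event_ts::DATE;

-- Projection sorted for the dominant read pattern
CREATE PROJECTION fact_events_read AS
SELECT member_id, event_ts, metric
FROM fact_events
ORDER BY member_id, event_ts
SEGMENTED BY HASH(member_id) ALL NODES;

-- Purging an old day is a cheap partition drop rather than a DELETE
SELECT DROP_PARTITIONS('fact_events', '2016-01-01', '2016-01-01');
```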
Extensively involved in helping the team convert key-value pair data stored in Redshift to columnar data for reporting purposes.
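The key-value-to-columnar conversion can be sketched as a conditional-aggregation pivot (table, key, and column names are hypothetical):

```sql
-- One row per member, one column per key,
-- from (member_id, kv_key, kv_value) rows
SELECT member_id,
       MAX(CASE WHEN kv_key = 'age'   THEN kv_value END) AS age,
       MAX(CASE WHEN kv_key = 'state' THEN kv_value END) AS state
FROM member_attributes
GROUP BY member_id;
```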
Designed loading strategies and easily maintainable aggregate tables to support slicing and dicing of data for internal and external customers.
Designed multiple ETL strategies to extract data to load various dimensions.
Extensively involved in gathering requirements from internal customers to validate and cross-check the reporting requirements when architecting the ETL process.
In-depth knowledge of Tableau to judge what needs to be done on the ETL or database side versus with parameters in the Tableau workbooks.
Replicated SSAS and SSRS functionality and logic using Vertica analytical functions for complex data analysis in reporting.
Designed the process to ingest transactional data through Kafka/Kinesis streaming services into Spark/EMR Hadoop clusters for further processing.
Designed the partitioned read process over Avro and Parquet formats in Hive and Impala for efficient retrieval.
Vertica Database Administrator, Oct 2014 to Current, Eliza Corporation, Danvers, MA
Capacity planning, user/AD management, resource pool management and maintaining K-Safety.
Responsible for performance tuning using the Vertica profiler and explain plans to avoid broadcasts and re-segmentation and to ensure merge joins and pipelined GROUP BY execution.
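A minimal sketch of this kind of plan check (the query and table are hypothetical):

```sql
-- Inspect the plan before tuning projections
EXPLAIN
SELECT member_id, COUNT(*)
FROM fact_events
GROUP BY member_id;
-- In the output, look for GROUPBY PIPELINED rather than GROUPBY HASH,
-- MERGE JOIN rather than HASH JOIN, and the absence of
-- BROADCAST / RESEGMENT steps between operators.
```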
Implemented hourly and nightly complete/incremental local & remote backups to handle catastrophic errors/failures.
Further performance tuning included partitions, manual compression strategies, use of the Database Designer, and stress testing.
Implemented Top-K projections to extract the latest member data and avoid DISTINCTs in SQL.
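Vertica's top-k projection syntax makes this concrete (a sketch with hypothetical table and column names):

```sql
-- Keeps only the most recent row per member at load time, so readers
-- need neither SELECT DISTINCT nor window functions
CREATE PROJECTION member_latest AS
SELECT member_id, updated_at, status
FROM member_events
LIMIT 1 OVER (PARTITION BY member_id ORDER BY updated_at DESC);
```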
Designed Live Aggregate Projections (LAPs) to summarize or aggregate data at load time to accommodate reporting workloads.
Designed the architecture such that the response time for slicing and dicing about 2 billion records is less than 2 seconds.
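A live aggregate projection along these lines might look like the following (hypothetical names; LAPs restrict the query to plain GROUP BY columns and aggregates such as SUM, COUNT, MIN, and MAX):

```sql
-- Vertica maintains this aggregate incrementally as rows are loaded,
-- so reads avoid scanning the raw fact rows
CREATE PROJECTION member_totals AS
SELECT member_id,
       SUM(metric) AS total_metric,
       COUNT(*)    AS n_events
FROM fact_events
GROUP BY member_id;
```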
Ensured that all fact tables/projections have fewer than 720 partitions at any point in time.
Implemented pre-join projections to join data at load time rather than at read time.
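A pre-join projection sketch (hypothetical names; Vertica requires an enforced primary-key/foreign-key relationship between the joined tables):

```sql
-- Fact and dimension are joined once at load time; reads see the
-- denormalized result without paying for the join
CREATE PROJECTION fact_with_segment AS
SELECT f.event_ts, f.metric, d.segment_name
FROM fact_events f
JOIN dim_member d ON f.member_id = d.member_id
ORDER BY d.segment_name, f.event_ts;
```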
Implemented an Amazon S3 loader/UDx to read data from S3 directly into Vertica tables, avoiding the extra hop of downloading the data locally before running the COPY command.
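On recent Vertica versions, COPY can read s3:// URLs directly (earlier versions needed an S3 UDSource from the AWS library); the bucket and path below are hypothetical:

```sql
COPY fact_events
FROM 's3://example-bucket/exports/events_*.csv'
DELIMITER ',' ENCLOSED BY '"'
DIRECT;  -- write straight to disk storage for large bulk loads
```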
Implemented UDx C++ and Java functions from Git and the Vertica marketplace to support customized business logic.
Responsible for complete end-to-end HP Vertica database administration: preparing nodes, adding nodes, rebalancing, database design/modeling, Management Console, etc.
Senior Technical Team Lead, Mar 2012 to Oct 2014, Eliza Corporation, Danvers, MA
Responsible for Data Operations Management and Quality Control.
Responsible for upgrading the legacy operational architecture with additional lookup tables, checkpoints, and recovery mechanisms.
Comprehensive SQL Server database management and migration of legacy SSIS packages to newer versions.
Built a reporting tool to convert key-value pair data stored in Amazon Redshift to tabular/columnar/pivoted/transposed formats.
Responsible for performance tuning by creating clustered and non-clustered indexes and analyzing explain plans.
Responsible for implementing security using iptables to reject, forward, and/or accept packets when the database or SSH ports are open to the public.
Responsible for maintaining high quality code by peer-reviewing and automation testing.
Responsible for business data analysis, cleansing and loading data to production calling platforms.
In-depth analysis of post-call data, ensuring it could be fed to the data warehouse, considering all dimension and fact tables for reporting.
Responsible for in-depth requirement gathering from Customers and Project Managers.
Responsible for data migration from the legacy platform to the Amazon cloud platform.
Involved in revamping the legacy architecture on the Amazon cloud platform.
Implemented auto reporting tool using Microsoft Business Intelligence suite, batch and shell scripts.
In-depth analysis of incoming customer data using Talend Open Studio profiling tools and SQL.
Senior ETL Consultant, Jun 2010 to Dec 2011, Wipro Infocrossing, Jefferson City, MO
Strong experience in the analysis, design, development, testing, and implementation of Business Intelligence solutions using data warehouse/data mart design, ETL, OLAP, BI, and client/server applications.
Strong data warehousing ETL experience using Informatica 9.1/8.6.1/8.5/8.1/7.1.
In-depth knowledge of using and maintaining PowerCenter Client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) and Server tools (Informatica Server, Repository Server Manager).
Expertise in data warehouse/data mart, ODS, OLTP, and OLAP implementations, covering project scoping, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation, and production support.
Strong experience in Extraction, Transformation, and Loading (ETL) of data from various sources into data warehouses and data marts using Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager), PowerExchange, and PowerConnect as ETL tools on Oracle, DB2, and SQL Server databases.
Used various transformations, such as Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union, to develop robust mappings in the Informatica Designer.
Created mapplets to reuse the business logic in different mappings.
Installed, configured, and administered Informatica PowerCenter 8.x/7.x/6.x. Experienced in performance tuning of Informatica (sources, mappings, targets, and sessions) and in tuning SQL queries.
Master of Science, Embedded Software Engineering, Dec 2009, Gannon University, Erie, PA