Jessica Claire
609 Johnson Ave., Tulsa, OK 49204
Home: (555) 432-1000 - resumesample@example.com
Summary
  • Over 16 years of experience in data integration, data engineering, and building large-scale data warehouse and data lake implementations using various ETL, ELT, big data, and cloud technologies.
  • Seasoned in data management solution design, data engineering architecture, development, deployment, and cloud computing.
  • Strong experience in data modeling and deployment of multi-dimensional schemas (star and 3NF), database design, entity-relationship modeling, and NoSQL modeling.
  • Experience in developing high-performance, scalable systems.
Skills
  • Scripting languages: Python, shell scripting, SQL, advanced SQL, PySpark, Java
  • Databases & tools: Redshift, MySQL, Aurora, DynamoDB, Teradata, Oracle, MS SQL Server, DB2
  • ETL tools: AWS Glue, PyCharm, Jupyter Notebook, Talend Data Management, Talend Big Data, Talend Integration Cloud, Pipeline Designer, Talend ESB, Ab Initio (GDE, Co>Operating System, EME, Metadata Hub, Express>IT), Ab Initio vector programming, Ab Initio metaprogramming
  • Big data technologies: Apache Spark, Hadoop, Kafka, Hive, Databricks
  • Cloud technologies: Amazon Web Services (Kinesis Data Firehose, Kinesis Data Streams, Kinesis Data Analytics, AWS Backup, Athena, CloudFormation, S3, EC2, CloudWatch, ECS, EMR, EKS, SSM, Lambda, CLI, DMS, Glue, Step Functions)
  • DevOps tools: Git, Jenkins, AWS CodePipeline, AWS CodeBuild, AWS CodeDeploy, Azure DevOps Pipelines, Talend CI Builder, Talend MetaServlet API, Azure VSTS, Azure TFS
  • Streaming data processing: AWS Kinesis, Apache Kafka, Spark Streaming
  • Data modeling utilities: Lucidchart, Erwin, MySQL Workbench, Visio
  • Workflow/schedulers: AWS Step Functions, AWS Glue Workflows, Control-M, Autosys
  • Domain knowledge: retail banking, credit cards, PBM, manufacturing
  • People Management
  • Team Building and Mentoring
  • Recruitment
Experience
11/2016 to Current
Architect - Cloud | Big Data | Data Engineering, Convergence Consulting Group, Tampa, FL
  • Worked as a big data and cloud architect with business partners and application product owners to understand detailed business requirements and translate them into technical requirements.
  • Designed data warehouse and data lake solutions along with PySpark data processing pipelines using Glue, Lambda, EMR, Step Functions, Glue Workflows, Glue Data Catalog, Athena, and Redshift; the entire cloud infrastructure was created with AWS CloudFormation.
  • Performed data modeling on Aurora MySQL, Redshift, and DynamoDB for transactional and analytics needs.
  • Designed and developed a pipeline using Kinesis Data Streams and Kinesis delivery streams to load data into the data lake from the aftermarket portal.
  • Performed hands-on development and configuration of PySpark data processing on AWS Glue using PyCharm, Glue Studio, and Jupyter notebooks.
  • Developed event-based ETL loads using S3, Lambda, AWS Glue, and Redshift for the secondary manufacturing module (see the sketch after this list).
  • Designed and developed the data catalog using Glue crawlers and used it in Athena for analytics and in Glue jobs for ETL.
  • Implemented CI/CD for AWS Glue using AWS CloudFormation, AWS CodeBuild, and AWS CodeDeploy.
  • Worked on and delivered critical on-premises-to-cloud modernization projects as part of the AWS cloud migration journey.
  • Designed and developed a data lake on AWS S3 and integrated data from various systems.
  • Designed and developed a data ingestion framework using Talend Big Data to ingest data into the data lake from various data sources.
  • Designed and implemented historical data migration from on-premises databases to the cloud using DMS.
  • Created various CloudWatch dashboards for monitoring system statistics (including EC2 and RDS).
  • Implemented all PACCAR-standard security policies in the AWS cloud, including SSL security at the application level.
  • Designed the warehouse data model using Erwin, followed by physical database creation with the required standards and indexes for better performance.
  • Designed event-based ETL loads using PySpark and Step Functions with EMR to process data on a Spark cluster and load it into an Aurora database.
  • Designed and implemented CI/CD on AWS for Talend code using Jenkins for automated, faster code promotion, integrated with automated test cases running in the CI/CD pipeline; Azure VSTS and Azure pipelines were used to integrate Jenkins with Talend CI Builder/Maven/Talend MetaServlet, along with Jupiter (Cognizant's internal test automation tool).
  • Configured AWS EMR for use by Talend for data ingestion.
  • Configured all AWS services (EC2, RDS, Redshift, IAM, SSM, CloudWatch, ECS, etc.) using CloudFormation.
  • Automated the entire infrastructure backup based on tags using AWS Backup.
  • Used AWS CloudWatch and SNS for system and process monitoring and alerts.
  • Designed and developed QuickSight reports for business reporting.
  • Designed and developed REST API solutions using API Gateway and Lambda.
  • Defined, standardized, and developed the HA and DR strategy for the platform.
  • Automated shutdown and startup of on-demand infrastructure using SSM, which business users could start and stop from their own screens.
  • Automated AWS key and password rotation using AWS Lambda, AWS Secrets Manager, and CloudWatch Events.
  • Automated budget and billing dashboards using SNS.
  • Implemented microservices and distributed applications.
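A minimal sketch of the event-based load pattern referenced above, assuming an S3 ObjectCreated notification wired to a Lambda function that starts a Glue job; the job name and argument keys are illustrative placeholders, not the project's actual values.

```python
import boto3

glue = boto3.client("glue")

def handler(event, context):
    """Lambda entry point for S3 ObjectCreated notifications."""
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # Hand the newly landed object to the Glue ETL job as job arguments.
        run = glue.start_job_run(
            JobName="secondary-manufacturing-load",  # placeholder job name
            Arguments={"--source_bucket": bucket, "--source_key": key},
        )
        print(f"Started Glue run {run['JobRunId']} for s3://{bucket}/{key}")
```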

Key Projects:

On-premises to AWS Cloud Modernization

  • Integrated various on-premises systems and portals with the AWS analytics system using API Gateway, Lambda, and DynamoDB.
  • Designed and developed data ingestion and data processing pipelines, building big data algorithms with Spark, Python, PySpark, and Hadoop technologies.
  • Data modeling for Aurora MySQL and DynamoDB.
  • Real-time data ingestion using Kinesis Data Streams and Kinesis Data Firehose (see the sketch after this list).
  • Integration of data with the existing data lake and DWH.
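A hedged sketch of the real-time ingestion path above: a producer writes portal events to a Kinesis Data Stream, and a Firehose delivery stream (configured separately) drains it to the S3 data lake. The stream name, event shape, and partition key are assumptions for illustration.

```python
import json
import boto3

kinesis = boto3.client("kinesis")

def publish_event(event: dict, stream_name: str = "aftermarket-portal-events"):
    """Write one portal event to the Kinesis Data Stream."""
    kinesis.put_record(
        StreamName=stream_name,                          # placeholder stream name
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=str(event.get("dealer_id", "na")),  # assumed shard key
    )

publish_event({"dealer_id": 42, "part_no": "K370-1001", "qty": 3})
```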

Greenhouse Gas Analytics Platform Implementation

  • Designed and developed a data warehouse on Redshift.
  • Performed data modeling and developed data processing pipelines using AWS big data services (EMR, Glue, Step Functions, Lambda); see the sketch after this list.
  • Designed the data lake and various data ingestion pipelines using Talend, Glue Data Catalog, and Athena.
  • Developed and integrated MDM and UI solutions.
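An illustrative skeleton of one such Glue PySpark job, assuming a crawler-registered source table and a pre-defined Glue connection to Redshift; all database, table, and connection names are placeholders, not the platform's actual ones.

```python
import sys
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Source: a table registered by a Glue crawler over the S3 data lake.
emissions = glue_context.create_dynamic_frame.from_catalog(
    database="ghg_lake", table_name="raw_emissions"  # placeholder names
)

# Example transform: keep only validated records.
validated = emissions.filter(lambda row: row["status"] == "VALIDATED")

# Sink: Redshift, via a pre-defined Glue catalog connection.
glue_context.write_dynamic_frame.from_jdbc_conf(
    frame=validated,
    catalog_connection="redshift-ghg",  # placeholder connection name
    connection_options={"dbtable": "fact_emissions", "database": "ghg_dw"},
    redshift_tmp_dir="s3://ghg-temp/",  # placeholder staging path
)
job.commit()
```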
01/2014 to 10/2016
Senior Architect - Data Engineering, Dentsu Aegis Network, New York City, NY
  • Designed and developed end-to-end ETL solutions using Hadoop, with Ab Initio as the ETL tool.
  • Defined reusable components in Ab Initio and the Hadoop ecosystem for ETL.
  • Worked on the DB2 data model using Erwin, modeling the data with Type-II dimensional modeling and creating the database structure based on the model (see the sketch after this list).
  • Implemented CDC and various data warehouse best practices for optimal data loading and analytics.
  • Involved in end-to-end coding along with unit testing; applied Ab Initio coding standards such as metaprogramming, web services, and Ab Initio Conduct>It plans; used resource pools intensively to optimize performance and make the code more generic.
  • Designed all complex business rules in Ab Initio Express>IT so they can be managed independently whenever the business needs, without much involvement from the development team.
  • Designed a generic ETL framework for loading files from different sources with different layouts and different business transformations into a common benefit (copay, accumulator) model.
  • Developed an automated code pipeline using Jenkins, Maven, and Ab Initio utilities such as Test Runner/TF for day-to-day operations and continuous improvement of the project.
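The Type-II dimension logic above was built in Ab Initio against DB2; this PySpark sketch only illustrates the pattern itself: expire the current row when a tracked attribute changes, then append a new versioned row. Table contents and column names are illustrative.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("scd2-sketch").getOrCreate()

# Dimension with one current row per member; "9999-12-31" marks open-ended rows.
dim = spark.createDataFrame(
    [(1, "Gold", "2015-01-01", "9999-12-31", True)],
    ["member_id", "plan", "eff_from", "eff_to", "is_current"],
)
# Incoming feed carrying a changed attribute for member 1.
incoming = spark.createDataFrame(
    [(1, "Platinum", "2016-03-01")],
    ["member_id", "plan", "eff_from"],
)

# Current rows whose tracked attribute changed.
changed = (
    dim.filter("is_current").alias("d")
    .join(incoming.alias("i"), "member_id")
    .filter("d.plan <> i.plan")
)

# Expire the old version as of the new effective date...
expired = changed.select(
    "member_id",
    F.col("d.plan").alias("plan"),
    F.col("d.eff_from").alias("eff_from"),
    F.col("i.eff_from").alias("eff_to"),
    F.lit(False).alias("is_current"),
)
# ...and append the new current version.
new_rows = changed.select(
    "member_id",
    F.col("i.plan").alias("plan"),
    F.col("i.eff_from").alias("eff_from"),
    F.lit("9999-12-31").alias("eff_to"),
    F.lit(True).alias("is_current"),
)

# Keep every row except the exact current rows being expired, then append.
untouched = dim.join(
    changed.select("member_id", F.col("d.eff_from").alias("eff_from")),
    ["member_id", "eff_from"],
    "left_anti",
)
updated_dim = untouched.unionByName(expired).unionByName(new_rows)
updated_dim.show()
```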

Key Projects:

CBM-AUTOMATED GROUP/BPL Loading Automation

  • Designed and developed a data model on DB2.
  • Designed and developed an ETL pipeline using Ab Initio as the ETL tool.

BENEFIT BUILDER AUTOMATION

  • Designed and developed a data model on DB2.
  • Designed and developed an ETL pipeline using Ab Initio as the ETL tool.
  • Implemented business rules automation in Ab Initio Express>IT.
07/2011 to 12/2013
Project Lead | ETL Architect, Capital One
  • Coordinated with various enterprise architects and other groups such as BSAs, data modelers, and DBAs.
  • Coded the application using Ab Initio metaprogramming, vectors, and shell scripting.
  • Integrated the system with the Voltage tool for tokenization and detokenization; carried out user acceptance, QSA, and technical design reviews.
  • Performance-tuned the application with help from Ab Initio support.
  • Planned, estimated, and designed solutions for the different applications tokenizing their files with TGE.
  • Integrated TGE with different applications so that all applications could utilize the TGE service (see the sketch after this list).
  • Developed and mentored junior/new team members; conducted code walkthroughs and reviewed code and documentation.
  • As lead developer of an enterprise tool, conducted multiple trainings and demo sessions with various application teams on using the enterprise TGE system and helped them utilize the service.
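Voltage's own API is proprietary, so this is only a hypothetical illustration of how a consuming application might call an enterprise tokenization service such as TGE over REST; the endpoint, payload shape, and response field are all invented for the example.

```python
import requests

TGE_ENDPOINT = "https://tge.example.internal/v1/tokenize"  # hypothetical URL

def tokenize(pan: str) -> str:
    """Swap a card number for a token, keeping downstream systems out of PCI scope."""
    resp = requests.post(TGE_ENDPOINT, json={"value": pan}, timeout=10)
    resp.raise_for_status()
    return resp.json()["token"]  # hypothetical response field
```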

Key Projects:

Shield Program - PCI DSS - TGE

  • Worked as project lead developer for Capital One's PCI DSS Shield program; designed and developed the end-to-end solution for building the new enterprise TGE system using Voltage, Ab Initio as the ETL tool, and Teradata.

Digital Analytics

  • Worked on planning and estimation of the project.
  • Data modeling on Teradata.
  • Built the ETL pipeline to load the DWH.
  • Prepared project management documents to present to the customer.
  • Coordinated with team members on regular reviews and follow-ups for each task and deliverable.
03/2005 to 06/2011
Project Lead, GE Capital
  • Provided development, implementation, support, and enhancement for the CDCI vertical of GE Capital.
  • Prepared data models and designed ETL logic.
  • Prepared high-level and technical design documents.
  • Reviewed the offshore team's development work to meet timelines and ensure proper performance techniques were applied and all business needs were met.
  • Documented proofs of concept.
  • Incorporated change requests for the data warehouse.
  • Performance-tuned complex Ab Initio graphs.
  • Scheduled the ETL jobs with Autosys (see the sketch after this list).
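A small sketch of the Autosys hand-off, assuming the standard `sendevent` CLI is on the path of the host running the script; the job name is a placeholder.

```python
import subprocess

def start_autosys_job(job_name: str) -> None:
    """Send a STARTJOB event to the Autosys event processor."""
    subprocess.run(
        ["sendevent", "-E", "STARTJOB", "-J", job_name],
        check=True,  # raise if the event could not be queued
    )

start_autosys_job("CDCI_DWH_DAILY_LOAD")  # placeholder job name
```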

Key Projects:

Re-engineering an OLTP application using the ETL tool Ab Initio

  • As lead developer, was responsible for translating the conventional PL/SQL code to Ab Initio ETL for the OLTP data processing need.

Data Warehousing (DWH) and ETL Implementations

  • Responsible for design and development of the ETL process for the DWH.
  • Performance tuning of complex ETL workflows.
  • Performance tuning at the database level (Oracle).
  • Implemented an Oracle Exadata solution.
Education and Training
10/2004
Bachelor of Science: Engineering - Computer Science
Institute of Technical Education and Research - Bhubaneswar, India
Websites, Portfolios, Profiles
  • http://linkedin.com/in/santosh-Claire-b6424828
Certifications
  • AWS Certified Developer – Associate
  • AWS Certified Data Analytics – Specialty
