
Data Analyst / Data Modeler Resume Example with 3+ Years of Experience

Jessica Claire
  • Montgomery Street, San Francisco, CA 94105
  • 609 Johnson Ave., 49204, Tulsa, OK
  • H: (555) 432-1000
  • resumesample@example.com
  • Date of Birth:
  • Nationality: India
  • Marital Status: Single
Professional Summary
  • Around 4 years of strong experience in Business and Data Modeling, Data Analysis, Data Architecture, Data Profiling, Data Migration, Data Conversion, Data Quality, Data Governance, Data Integration, NoSQL, Metadata Management, and Configuration Management.
  • Extensive knowledge of Enterprise Data Warehousing, including Data Modeling, Data Architecture, Data Integration (ETL/ELT), and Business Intelligence.
  • Skilled in implementing SQL tuning techniques such as Join Indexes (JI), Aggregate Join Indexes (AJI), statistics collection, and table changes including indexing.
  • Experienced in Dimensional and Relational Data Modeling using ER/Studio, Erwin, and Sybase PowerDesigner, covering Star and Snowflake schemas, Fact and Dimension tables, and Conceptual, Logical, and Physical data models (see the sketch after this list).
  • Good experience in Production Support, identifying root causes, Troubleshooting and Submitting Change Controls.
  • Extensive experience in development of T-SQL, OLAP, PL/SQL, Stored Procedures, Triggers, Functions, Packages, performance tuning and optimization for business logic implementation.
  • Responsible for architecture design, data modeling, and implementation of Big Data platform and analytic applications.
  • Experienced in Data Scrubbing/Cleansing, Data Quality, Data Mapping, Data Profiling, Data Validation in ETL.
  • Experience in Creating Audit control system for ETL process for Big Data and Data Warehouse Application.
  • Sound knowledge of the SDLC process; involved in all phases of the Software Development Life Cycle: analysis, design, development, testing, implementation, and maintenance of applications.
  • Performed application-level DBA activities such as creating tables, and monitored and tuned Teradata BTEQ scripts using the Teradata Explain utility.
  • Thorough Knowledge in creating DDL, DML and Transaction queries in SQL for Oracle and Teradata databases.
  • Technically proficient and customer-focused, with strong experience in Mainframe development and maintenance projects built with Teradata, JCL, and IBM tools.
  • Experienced in Creating and Maintaining documentation such as Data Architecture/Technical/ETL Specifications.
  • Worked on multiple projects involving cross-platform development and testing (Mainframe, Unix).
  • Experience handling huge volumes of data moving in and out of Teradata and Big Data platforms.
  • Worked and extracted data from various database sources like Oracle, SQL Server, DB2, and Teradata.
  • Hands on experience with modeling using ERWIN in both forward and reverse engineering cases.
  • Strong exposure in writing simple and complex SQL, PL/SQL queries.
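A minimal sketch of the Fact/Dimension (star schema) DDL work described above, using SQLite from the Python standard library as a stand-in for Oracle/Teradata; the table and column names are illustrative assumptions, not taken from any actual project.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        -- one dimension table and one fact table keyed to it (hypothetical names)
        CREATE TABLE dim_customer (
            customer_key  INTEGER PRIMARY KEY,
            customer_name TEXT,
            region        TEXT
        );
        CREATE TABLE fact_sales (
            sale_id      INTEGER PRIMARY KEY,
            customer_key INTEGER REFERENCES dim_customer(customer_key),
            sale_date    TEXT,
            amount       REAL
        );
    """)
    # DML against the new structures
    conn.execute("INSERT INTO dim_customer VALUES (1, 'Acme Corp', 'West')")
    conn.execute("INSERT INTO fact_sales VALUES (1, 1, '2022-07-01', 1250.00)")
    conn.commit()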
Skills
  • Programming Languages
  • Java, SQL, Python, R.
  • ETL tools
  • SSIS, Visual Studio, BODS Designer, Informatica
  • Data Modeling
  • Erwin, ER/Studio, Sybase PowerDesigner
  • BI & Reporting tools
  • SSRS, Crystal Report
  • MS-Office Package
  • Microsoft Office (Windows, Word, Excel, PowerPoint, Visio, Project).
  • Visualization tools
  • Tableau Desktop, Python, Pandas
  • ETL Tools / Tracking tool
  • SSIS, SSAS, SSRS / JIRA.
  • Database Development
  • T-SQL and PL/SQL
  • Databases
  • MS SQL Server
  • MS Excel
  • Used advanced Excel features such as pivot tables and charts to generate graphs (see the pandas sketch after this list for the equivalent summary).
  • Designed and developed weekly and monthly reports using Excel techniques (charts, graphs, pivot tables) and PowerPoint presentations.
  • Strong Excel skills, including pivots, VLOOKUP, conditional formatting, and large record sets, as well as data manipulation and cleaning.
  • Environment: SQL Server, Oracle 9i, MS Office, Teradata, ER/Studio, XML, Hive, HDFS, Flume, Sqoop, R connector, Python, R, Tableau 9.2.
  • Additional keywords: OLTP, OLAP, ODS, ETL, DDL, DML, SQL, PL/SQL, T-SQL, RDBMS, UML diagrams, Snowflake schema, Data Mapping, Metadata, Indexing, Data Analysis, Data Conversion, Data Migration, Data Management, Data Modeling, Data Quality, Database Design, Database Development, Performance Analysis, Prototyping, Requirements Analysis, Reporting, Workflow, Agile, SDLC, Full Life Cycle
  • Additional tools and platforms: Erwin 9.6, Informatica 9.5, BODS Designer, SAS, Crystal Reports, Visual Studio, Java, Python, Oracle 9i, DB2, MS SQL Server, Sybase, Teradata, IBM tools, Windows, MS Office (Word, Excel, PowerPoint, Visio), XML
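A minimal pandas sketch of the pivot-table style summaries listed under MS Excel above; the data frame and its columns are made-up sample data.

    import pandas as pd

    sales = pd.DataFrame({
        "region": ["West", "West", "East", "East"],
        "month":  ["Jan", "Feb", "Jan", "Feb"],
        "amount": [1200, 900, 1500, 1100],
    })

    # rows = region, columns = month, values = total amount (like an Excel pivot table)
    report = pd.pivot_table(sales, index="region", columns="month",
                            values="amount", aggfunc="sum")
    print(report)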
Work History
Data Analyst/Data Modeler, 07/2022 - Current
Kbr Patuxent, MD,
  • Created Logical & Physical Data Modeling on Relational (OLTP), Dimensional Data Modeling (OLAP) on Star schema for Fact & Dimension tables using Erwin.
  • Gathered business requirements, working closely with business users, project leaders, and developers.
  • Analyzed the business requirements and designed Conceptual and Logical Data models.
  • Prepared ETL technical Mapping Documents along with test cases for each Mapping for future developments to maintain SDLC.
  • Involved in Data flow analysis, Data modeling, Physical database design, forms design and development, data conversion, performance analysis and tuning.
  • Create and maintain data model standards, including master data management (MDM).
  • Involved in extracting the data from various sources like Oracle, SQL.
  • Extensively worked on early-stage business projects, discovery efforts, and engagements initiated by Business Relationship Managers to provide appropriate architecture deliverables, such as stakeholder analyses, capability analyses, risk/value analyses, or technical analyses.
  • Worked with Database Administrators, Business Analysts, and Content Developers to conduct design reviews and validate the developed models.
  • Used Model Mart of Erwin for effective model management of sharing, dividing and reusing model information and design for productivity improvement.
  • Performed data analysis and data profiling using complex SQL on various source systems and answered complex business questions by providing data to business users.
  • Generated ad-hoc SQL queries using joins, database connections, and transformation rules to fetch data from legacy DB2 and SQL Server database systems (see the sketch after this list).
  • Analyzed functional and non-functional data elements for data profiling and source-to-target mapping using Informatica Data Quality, and developed working documents to support findings and assign specific tasks.
  • Developed Data Mapping, Data Governance, Transformation, and Cleansing rules for the Master Data Management architecture involving OLTP, ODS, and OLAP.
  • Worked with the Data Architect on a Dimensional Model utilizing both Star and Snowflake schemas.
  • Conducted design reviews with the business analysts and the developers to create a proof of concept for the reports.
  • Performed detailed data analysis of claim-processing durations and created cubes with Star schemas using facts and dimensions through SQL Server Analysis Services (SSAS).
  • Performed Data Analysis and Data Profiling using complex SQL queries on various source systems including Oracle.
  • Implemented data blending using various data sources and created stories using Tableau for a better understanding of the data.
  • Worked on creating filters, parameters and calculated sets for preparing dashboards and worksheets in Tableau.
  • Analyzed business requirements and designed and developed ETL processes to load data from various sources such as flat files, XML files, and Oracle databases.
  • Documented Source to Target mappings as per the business rules.
  • Implemented Error Handling during ETL load in SSIS packages to identify dimensions and facts that were not properly populated.
  • Handled the functional and practical implementation of data governance and was responsible for designing common Data Governance frameworks.
  • Prepared Business Requirement Documents (BRD's) after the collection of Functional Requirements from System Users that provided the appropriate scope of work for the technical team to develop a prototype and overall system.
  • Created Informatica jobs, sessions, workflows to load organization related fact and dimension tables.
  • Utilized Informatica IDQ to complete initial data profiling and matching/removing duplicate data.
  • Designed and developed Project document templates based on SDLC methodology.
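A minimal sketch of the ad-hoc join queries described in this role, run against SQLite as a stand-in for the legacy DB2/SQL Server systems; the claims/policies tables and their columns are hypothetical.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE claims   (claim_id INTEGER, policy_id INTEGER, amount REAL);
        CREATE TABLE policies (policy_id INTEGER, holder_name TEXT);
        INSERT INTO claims   VALUES (10, 1, 500.0), (11, 1, 750.0), (12, 2, 200.0);
        INSERT INTO policies VALUES (1, 'J. Smith'), (2, 'A. Jones');
    """)

    # join claims to policies and aggregate per policy holder
    rows = conn.execute("""
        SELECT p.holder_name,
               COUNT(*)      AS claim_count,
               SUM(c.amount) AS total_amount
        FROM claims c
        JOIN policies p ON p.policy_id = c.policy_id
        GROUP BY p.holder_name
    """).fetchall()
    print(rows)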
Data Analyst, 10/2021 - 07/2022
One Span City, STATE,
  • Worked with users to identify the most appropriate source of record required to define the asset data for financing.
  • Experienced in using OLAP functions such as COUNT, SUM, and CSUM.
  • Performed Data analysis and Data profiling using complex SQL on various source systems including Oracle.
  • Developed normalized Logical and Physical database models for designing an OLTP application.
  • Imported customer data into Python using the Pandas library and performed various data analyses, finding patterns in the data that informed key decisions for the company.
  • Used prompts, filters, and conditional, transformation, and level metrics to create complex reports.
  • Designed reports, which are the focus and goal of business intelligence.
  • Created documents containing groups of reports and developed dynamic dashboards summarizing key performance indicators (KPIs).
  • Exported the required data to an RDBMS using Sqoop to make it available to the claims processing team to assist in processing claims.
  • Designed and deployed rich graphic visualizations with drill-down and drop-down menu options and parameters using Tableau.
  • Created Teradata SQL scripts using OLAP functions such as RANK() to improve query performance while pulling data from large tables (see the sketch after this list).
  • Worked on MongoDB database concepts such as locking, transactions, indexes, replication, schema design, etc.
  • Performed Data analysis using Python Pandas.
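A minimal sketch of the RANK() OLAP/window-function pattern mentioned in this role, written against SQLite (window functions require SQLite 3.25 or later) as a stand-in for Teradata; the assets table and its values are made up.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE assets (asset_id INTEGER, category TEXT, value REAL);
        INSERT INTO assets VALUES
            (1, 'equipment',  9000), (2, 'equipment', 12000),
            (3, 'vehicle',   30000), (4, 'vehicle',   22000);
    """)

    # rank assets by value within each category
    rows = conn.execute("""
        SELECT category, asset_id, value,
               RANK() OVER (PARTITION BY category ORDER BY value DESC) AS value_rank
        FROM assets
    """).fetchall()
    print(rows)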
Data Analyst/Business Intelligence Developer, 05/2017 - 07/2019
SYNFORMATICA City, STATE,
  • Developed full life cycle software including defining requirements, prototyping, designing, coding, testing and maintaining software.
  • Extensively used the Agile methodology as the Organization Standard to implement the Data Models.
  • Developed an OLTP system by designing the Logical and eventually the Physical Data Model from the Conceptual Data Model.
  • Maintained referential integrity by introducing foreign keys, normalized the existing data structures, and worked with the ETL team to provide source-to-target mappings for implementing incremental, full, and initial loads into the target data mart (see the sketch after this list).
  • Worked on normalization techniques and normalized the data into Third Normal Form (3NF).
  • Created the best fit Physical Data Model based on discussions with DBAs and ETL developers.
  • Created conceptual, logical and physical data models, data dictionaries, DDL and DML to deploy and load database table structures in support of system requirements.
  • Identified required dimensions and Facts using Erwin tool for the Dimensional Model.
  • Validated business data objects to ensure the accuracy and completeness of the database.
  • Represented existing business models by UML diagrams.
  • Recognized and resolved discrepancies between executed DDL and the physical data model using Erwin Complete Compare.
  • Used Erwin tool to develop a Conceptual Model based on business requirements analysis.
  • Implemented a Snowflake schema to minimize redundancy in the database.
  • Implemented forward engineering in Erwin to generate DDL scripts and indexing strategies from the logical data model.
  • Reverse Engineered physical data models from SQL Scripts and databases.
  • Implemented ETL techniques for Data Conversion, Data Extraction and Data Mapping for different processes as well as applications.
  • Maintained Data Consistency by evaluating and updating logical and physical data models to support new and existing projects.
  • Developed and maintained the data dictionaries, Naming Conventions, Standards, and Class Words Standards Document.
  • Involved in Data profiling to detect and correct inaccurate data and maintain the data quality.
  • Developed Data Migration and Cleansing rules for the Integration Architecture (OLTP, ODS, DW).
  • Generated various reports using SQL Server Reporting Services (SSRS) for business analysts and the management team.
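A minimal sketch of the referential-integrity and normalization work described in this role, using SQLite foreign keys; the customer/orders schema is a hypothetical example.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("PRAGMA foreign_keys = ON")  # SQLite leaves FK enforcement off by default
    conn.executescript("""
        -- repeated customer attributes live in their own table (normalized form)
        CREATE TABLE customer (
            customer_id   INTEGER PRIMARY KEY,
            customer_name TEXT NOT NULL
        );
        CREATE TABLE orders (
            order_id    INTEGER PRIMARY KEY,
            customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
            order_date  TEXT
        );
    """)
    conn.execute("INSERT INTO customer VALUES (1, 'Acme Corp')")
    conn.execute("INSERT INTO orders   VALUES (100, 1, '2019-05-01')")
    conn.commit()

    # an orphan order (no matching customer) is rejected: that is the
    # referential-integrity guarantee the data model relies on
    try:
        conn.execute("INSERT INTO orders VALUES (101, 99, '2019-05-02')")
    except sqlite3.IntegrityError as err:
        print("rejected:", err)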
Education
Post Graduation: Engineering Business and Safety Management, Expected in 03/2021
Confederation College - Thunder Bay, ON
Bachelor of Technology: Mechanical Engineering, Expected in 04/2017
K L University - Vijayawada, India
