
Data Analyst Resume Example With 3+ Years of Experience

Jessica Claire
resumesample@example.com
(555) 432-1000
Montgomery Street, San Francisco, CA 94105
Professional Summary
  • Around 4 years of strong experience in Business and Data Modeling, Data Analysis, Data Architecture, Data Profiling, Data Migration, Data Conversion, Data Quality, Data Governance, Data Integration, NoSQL, Metadata Management Services, and Configuration Management.
  • Expertise and deep knowledge of Enterprise Data Warehousing, including Data Modeling, Data Architecture, Data Integration (ETL/ELT), and Business Intelligence.
  • Skilled in SQL tuning techniques such as Join Indexes (JI), Aggregate Join Indexes (AJI), statistics collection, and table changes, including index redesign.
  • Experienced in Dimensional and Relational Data Modeling using ER/Studio, Erwin, and Sybase PowerDesigner: Star/Snowflake schema modeling, fact and dimension tables, and conceptual, logical, and physical data models.
  • Good experience in production support: identifying root causes, troubleshooting, and submitting change controls.
  • Extensive experience in development of T-SQL, OLAP, PL/SQL, Stored Procedures, Triggers, Functions, Packages, performance tuning and optimization for business logic implementation.
  • Responsible for architecture design, data modeling, and implementation of Big Data platform and analytic applications.
  • Experienced in Data Scrubbing/Cleansing, Data Quality, Data Mapping, Data Profiling, Data Validation in ETL.
  • Experience in Creating Audit control system for ETL process for Big Data and Data Warehouse Application.
  • Sound knowledge of the SDLC process; involved in all phases of the Software Development Life Cycle: analysis, design, development, testing, implementation, and maintenance of applications.
  • Performed application-level DBA activities such as creating tables, and monitored and tuned Teradata BTEQ scripts using the Teradata Explain utility.
  • Thorough knowledge of creating DDL, DML, and transaction queries in SQL for Oracle and Teradata databases.
  • Experienced in Creating and Maintaining documentation such as Data Architecture/Technical/ETL Specifications.
  • Worked on multiple projects involving cross-platform development and testing (Mainframe, Unix).
  • Strong exposure in writing simple and complex SQL, PL/SQL queries.
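The data-profiling work described above can be illustrated with a minimal sketch in pandas. The `profile` helper and the sample `customers` frame are hypothetical, invented for this example; a real engagement would profile actual source-system tables.

```python
# Minimal data-profiling sketch with pandas: per-column null counts,
# distinct counts, and dtypes. Column names and data are illustrative.
import pandas as pd

def profile(df: pd.DataFrame) -> pd.DataFrame:
    """Return a per-column profile of the input frame."""
    return pd.DataFrame({
        "nulls": df.isna().sum(),        # missing values per column
        "distinct": df.nunique(),        # distinct non-null values
        "dtype": df.dtypes.astype(str),  # inferred column type
    })

customers = pd.DataFrame({
    "customer_id": [1, 2, 3, 3],
    "state": ["CA", "NY", None, "CA"],
})
report = profile(customers)
print(report)
```

A report like this is a typical first deliverable of a profiling pass: it flags nullable columns and candidate keys before mapping rules are written.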
Skills
  • Programming Languages: Java, SQL, Python, R
  • ETL Tools: SSIS, SSAS, Visual Studio, BOS Designer, Pentaho, Informatica
  • Data Modeling: Sybase PowerDesigner, IBM Data Architect
  • BI & Reporting Tools: SSRS, Crystal Reports
  • MS Office: Word, Excel, PowerPoint, Visio, Project
  • Visualization Tools: Tableau Desktop, Python (Pandas, NumPy)
  • Tracking Tool: JIRA
  • Database Development: T-SQL and PL/SQL
  • Databases: MS SQL Server
  • Data Engineering
  • Business Intelligence
  • Data Quality Assurance Processes
  • Defect Tracking
  • Strong Organizational Skills
  • Project Management Skills
  • Process Mapping
  • Use Cases
Education
Illinois Institute Of Technology, Chicago, IL, expected 05/2020: Master of Science, Information Technology
Indira Gandhi Institute Of Technology, Delhi, 06/2016: B.Tech, ECE
Work History
Abbott Laboratories - Data Analyst
New Haven, CT, 06/2019 - 06/2020
  • Created Logical & Physical Data Modeling on Relational (OLTP), Dimensional Data Modeling (OLAP) on Star schema for Fact & Dimension tables using Erwin.
  • Gathered business requirements, working closely with business users, project leaders, and developers.
  • Analyzed the business requirements and designed Conceptual and Logical Data models.
  • Prepared ETL technical mapping documents, along with test cases for each mapping, to support future development and maintain the SDLC.
  • Involved in Data flow analysis, Data modeling, Physical database design, forms design and development, data conversion, performance analysis and tuning.
  • Created and maintained data model standards, including master data management (MDM).
  • Involved in extracting data from various sources such as Oracle and SQL Server.
  • Extensively worked on early-stage business projects, discovery efforts, and engagements initiated by Business Relationship Managers to provide appropriate architecture deliverables, such as stakeholder analyses, capability analyses, risk/value analyses, or technical analyses.
  • Worked with Database Administrators, Business Analysts, and Content Developers to conduct design reviews and validate the developed models.
  • Used Model Mart of Erwin for effective model management of sharing, dividing and reusing model information and design for productivity improvement.
  • Performed data analysis and data profiling using complex SQL on various source systems, and answered complex business questions by providing data to business users.
  • Generated ad-hoc SQL queries using joins, database connections, and transformation rules to fetch data from legacy DB2 and SQL Server database systems.
  • Analyzed functional and non-functional categorized data elements for data profiling and mapping from source to target data environments using Informatica Data Quality, and developed working documents to support findings and assign specific tasks.
  • Developed Data Mapping, Data Governance, Transformation, and Cleansing rules for the Master Data Management architecture involving OLTP, ODS, and OLAP.
  • Worked with the Data Architect on a Dimensional Model utilizing both Star and Snowflake schemas.
  • Worked with normalization and denormalization concepts and design methodologies such as the Ralph Kimball and Bill Inmon approaches.
  • Conducted design reviews with the business analysts and the developers to create a proof of concept for the reports.
  • Performed detailed data analysis to analyze the duration of claim processes and created the cubes with Star Schemas using facts and dimensions through SQL Server Analysis Services (SSAS).
  • Performed data analysis and data profiling using complex SQL queries on various source systems, including Oracle.
  • Implemented data blending using various data sources and created stories using Tableau for a better understanding of the data.
  • Worked on creating filters, parameters and calculated sets for preparing dashboards and worksheets in Tableau.
  • Analyzed business requirements and designed and developed ETL processes to load data from various sources such as flat files, XML files, and Oracle databases.
  • Documented Source to Target mappings as per the business rules.
  • Implemented Error Handling during ETL load in SSIS packages to identify dimensions and facts that were not properly populated.
  • Handled functional and practical implementation of data governance, and was responsible for designing common data governance frameworks.
  • Prepared Business Requirement Documents (BRD's) after the collection of Functional Requirements from System Users that provided the appropriate scope of work for the technical team to develop a prototype and overall system.
  • Created Informatica jobs, sessions, workflows to load organization related fact and dimension tables.
  • Utilized Informatica IDQ to complete initial data profiling and matching/ removing duplicate data.
  • Designed and developed Project document templates based on SDLC methodology.
Abbott Laboratories - Data Analyst
Olympia, WA, 12/2018 - 05/2020
  • Worked with users to identify the most appropriate source of record required to define the asset data for financing.
  • Used OLAP functions such as COUNT, SUM, and CSUM.
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle.
  • Developed normalized Logical and Physical database models for designing an OLTP application.
  • Imported customer data into Python using the Pandas library and performed various data analyses; found patterns in the data that informed key decisions for the company.
  • Used Prompts, filters, Conditional, Transformation metrics and Level Metrics for creating complex reports.
  • Designed reports, which are the focus and goal of business intelligence.
  • Created documents containing groups of reports, and developed dynamic dashboards that summarize key performance indicators (KPIs).
  • Exported the required information to an RDBMS using Sqoop to make the data available to the claims-processing team and assist in processing claims based on that data.
  • Designed and deployed rich graphic visualizations with drill-down and drop-down menu options and parameters using Tableau.
  • Created Teradata SQL scripts using OLAP functions such as RANK() to improve query performance while pulling data from large tables.
  • Worked on MongoDB database concepts such as locking, transactions, indexes, replication, schema design, etc.
  • Performed Data analysis using Python Pandas.
IBM - Data Analyst/Business Intelligence Developer
City, STATE, 07/2016 - 07/2018
  • Developed full life cycle software including defining requirements, prototyping, designing, coding, testing and maintaining software.
  • Extensively used Agile methodology according to Organization Standard to implement Data Models.
  • Developed OLTP system by designing Logical and eventually Physical Data Model from the Conceptual Data Model.
  • Maintained referential integrity by introducing foreign keys and normalizing the existing data structure; worked with the ETL team and provided source-to-target mappings to implement incremental, full, and initial loads into the target data mart.
  • Worked on normalization techniques and normalized the data into Third Normal Form (3NF).
  • Created the best fit Physical Data Model based on discussions with DBAs and ETL developers.
  • Created conceptual, logical and physical data models, data dictionaries, DDL and DML to deploy and load database table structures in support of system requirements.
  • Identified required dimensions and Facts using Erwin tool for the Dimensional Model.
  • Validated business data objects to ensure the accuracy and completeness of the database.
  • Represented existing business models by UML diagrams.
  • Recognized and resolved discrepancies between executed DDL and the physical data model using Complete Compare.
  • Used Erwin tool to develop a Conceptual Model based on business requirements analysis.
  • Implemented a Snowflake schema to minimize redundancy in the database.
  • Implemented Forward Engineering by using DDL scripts and generating indexing strategies to develop the logical data model using Erwin.
  • Reverse Engineered physical data models from SQL Scripts and databases.
  • Supported SalesForce.com maintenance with services such as periodic data cleansing and workflow.
  • Implemented ETL techniques for Data Conversion, Data Extraction and Data Mapping for different processes as well as applications.
  • Maintained Data Consistency by evaluating and updating logical and physical data models to support new and existing projects.
  • Developed and maintained the data dictionaries, Naming Conventions, Standards, and Class Words Standards Document.
  • Involved in Data profiling to detect and correct inaccurate data and maintain the data quality.
  • Developed Data Migration and Cleansing rules for the Integration Architecture (OLTP, ODS, DW).
  • Generated various reports using SQL Server Reporting Services (SSRS) for business analysts and the management team.
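The referential-integrity work described in this role can be sketched with SQLite's foreign-key enforcement. The `department`/`employee` schema is hypothetical, chosen only to show an orphan row being rejected.

```python
# Sketch of enforcing referential integrity with a foreign key,
# as in the normalized OLTP models above. Schema is illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # FK checks are opt-in in SQLite
conn.executescript("""
    CREATE TABLE department (dept_id INTEGER PRIMARY KEY, name TEXT NOT NULL);
    CREATE TABLE employee (
        emp_id  INTEGER PRIMARY KEY,
        dept_id INTEGER NOT NULL REFERENCES department(dept_id)
    );
    INSERT INTO department VALUES (10, 'Analytics');
    INSERT INTO employee   VALUES (1, 10);
""")
try:
    # dept_id 99 has no parent row, so the constraint should fire
    conn.execute("INSERT INTO employee VALUES (2, 99)")
    violated = False
except sqlite3.IntegrityError:
    violated = True
print(violated)  # True: the orphan row is rejected
```

Production databases such as Oracle or SQL Server enforce foreign keys by default, but the failure mode is the same: a child row cannot reference a missing parent.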

