Jessica Claire
Montgomery Street, San Francisco, CA 94105 | (555) 432-1000 | resumesample@example.com
Professional Summary
  • Hands-on with Hadoop 2.x, MapReduce, HDFS, Hive, Pig, Sqoop, Oozie, Flume, HBase, MapR, ZooKeeper, Impala and Spark; ETL with Teradata 13.x/14.x/15.x and Informatica 8.x/9.x; databases: Teradata, MS SQL and Oracle.
  • 5+ years of experience in Tableau and Hadoop development.
  • 4 years of experience in Business Intelligence and ETL development in Informatica and Teradata.
  • Sound knowledge of the Software Development Life Cycle (SDLC) and Agile/Scrum methodologies, covering requirements gathering, analysis, design, development, testing, implementation and maintenance.
  • Experience in data warehouse environments with an emphasis on data modeling and design, coupled with strong knowledge of the end-to-end data warehouse development life cycle (data integration, logical and physical modeling, and data delivery) supporting enterprise analytics and BI solutions.
  • Expert in gathering requirements, translating Business Requirement Documents (BRDs) into functional and technical specifications, and identifying interfaces and business process flows.
  • Proficient in requirements engineering, requirements management, solution scoping and evaluation, and in creating RACI matrices, use cases/user stories, workflow diagrams and gap analyses.
  • Expertise in designing and developing views, dashboards and stories with the data visualization tool Tableau 10.x/9.x/8.x (Desktop/Server), using data from multiple sources such as RDBMS, Hadoop Hive, Oracle and Excel/flat files, and statistical sources such as R.
  • Extensive experience with Tableau Desktop, creating Tableau workbooks, dashboards and stories with interactive prompts, and expertise in Tableau administration using tabcmd/tabadmin commands.
  • Experience developing rich interactive Tableau visualizations, including heat and tree maps, bubble charts, reference lines, dual axes, line diagrams and geographic visualizations.
  • Extensive working knowledge of reporting objects such as dimensions, measures, facts, hierarchies, transformations, filters, calculated fields, sets, groups, actions, parameters and prompts in Tableau, and of publishing dashboards to Tableau Server.
  • Very strong in writing and debugging SQL/T-SQL queries for data analysis, cleansing and data quality.
  • Strong experience with RDBMS/NoSQL databases and sound knowledge of data mining using R.
  • Extensive experience using the Teradata BTEQ, FLOAD, MLOAD, TPUMP, TPT and FEXPORT utilities (see the sketch after this list).
  • In-depth knowledge of Teradata Explain and Visual Explain for analyzing and improving query performance.
  • Sound knowledge of the Teradata Administrator and Viewpoint tools for managing database server health.
  • Excellent understanding of Hadoop/Big Data architecture and its components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode and the MapReduce programming paradigm.
  • Experience importing and exporting data into HDFS and writing Hive queries, Pig scripts and Sqoop jobs.
  • Experience working with different data sources (text, delimited files, XML files, JSON files, SQL Server, Oracle) to load data into Hive, HBase and Impala tables.
  • Expertise in writing UNIX shell scripts and ingesting data into HDFS using Sqoop and Flume.
  • Solid Linux and Windows administration skills and an understanding of system performance (memory, CPU, disk I/O and network I/O).
  • Experience with real-time Big Data solutions using HBase and Hive, handling billions of records.
  • Strong experience writing database objects such as stored procedures, functions, triggers and cursors.
  • Advanced knowledge of and experience with decision support systems (DSS), querying and analysis, extraction-transformation-loading (ETL) with Teradata, and data warehousing.
  • Expertise in data warehouse architecture: designing star schemas, snowflake schemas, fact and dimension tables, and physical and logical data models using Erwin and Sybase PowerDesigner.
  • Extensive experience developing ETL to load data from/to various sources into data warehouses and data marts using Teradata tools and utilities and Informatica.
  • Experience in data quality checks and unit testing of code, coordinating with business and technology teams to identify and resolve issues.
  • Expertise in the Microsoft suite of applications: Excel, Access, Word, PowerPoint, Visio, SQL Server, SSRS/SSIS.
  • Good knowledge of AWS (Amazon Web Services): EC2, AMI, Elastic MapReduce, Redshift and S3.
  • Highly self-motivated team player with the ability to grasp things quickly; excellent interpersonal, technical and communication skills.
  • Strong teamwork, project management and leadership skills with the ability to take initiative and work effectively in a variety of demanding situations; acted as a lead and managed a team of 10 resources.
  • Excellent attention to detail and results-oriented, with a track record demonstrating strong problem-solving and analytical skills; strong oral, written and presentation skills.
  • Sound working knowledge of Banking and Finance domain projects involving the mortgage life cycle (underwriting, origination and servicing) and capital markets.
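As an illustration of the Teradata utility work claimed above, a minimal BTEQ sketch wrapped in a shell script; the host, credentials, schema and table names are invented placeholders, not details from this resume:

#!/usr/bin/env bash
# Hypothetical BTEQ session: log on, run a quick data-quality count
# against a warehouse table, then log off. Host, credentials, schema
# and table names are placeholders.
bteq <<'EOF'
.LOGON tdprod.example.com/etl_user,secret;
SELECT COUNT(*) AS missing_upb
FROM edw.loan_fact
WHERE unpaid_balance IS NULL;
.LOGOFF;
.QUIT;
EOF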
Skills
  • Visualization/Reporting Tools: Tableau 8.x/9.x/10.x, SSRS 2014/2016, SAP Business Objects
  • Big Data Ecosystems: Hadoop, MapReduce, HDFS, Hive, Pig, Sqoop, HBase, HUE, ZooKeeper, Oozie, Flume, Impala, Cloudera CDH5, MapR and Big Data architecture
  • ETL Tools: Teradata ETL Tools v13/v14, Informatica 8.x/9.x, MSBI SSIS 2012/14
  • Databases: Oracle 9i/10g/11g, Teradata, SQL Server 2005/08/12, MySQL, Postgres
  • Languages: UNIX shell programming, SQL, PL/SQL, C, HTML/XML
  • Data Mining: R, R-Studio
  • Operating Systems: UNIX (HP, Sun Solaris, AIX, Linux), Windows Server 2005/08, CentOS, Mac
  • Other Tools: Jira, Autosys, PAC 2000, HP Quality Center, MS Office Suite 2007/10, Visio
Work History
04/2015 to 02/2017 Data Scientist / Tableau SME Salient CRGT | Indianapolis, IN
  • Received the Innovator of the Year award for Q2-2016 from Wells Fargo Enterprise.
  • Received the Achieving Excellence award for Q3-2014 from Wells Fargo Home Mortgage.
  • Received the STAR Performer award in Jan-2013 from Wells Fargo Finance.
  • Received the Best Performance award for Q4-2009 from Tech Mahindra.
  EDA - Mortgage Integrated Data Environment (MIDE): The Enterprise Data Analytics (EDA) line of business in Wells Fargo is focused on different types of mortgages in the USA. The Mortgage Integrated Data Environment (MIDE) is an extensive effort requiring collaboration across multiple teams in Wells Fargo Home Mortgage. The successful implementation of the MIDE program will provide the opportunity to query and report across loan origination and servicing systems.
  Tableau responsibilities:
  • Used the Tableau interface and paradigm to build effective data visualizations.
  • Gathered and analyzed data requirements and studied existing source systems.
  • Designed functional and technical specifications and mapping documents with transformation rules.
  • Designed, developed and implemented Tableau Business Intelligence reports.
  • Created Tableau dashboards and stories and published them.
  • Created calculations including string manipulation, arithmetic, custom aggregations and ratios, date math, logic statements, level-of-detail expressions and quick table calculations.
  • Performed data modeling using Erwin and Sybase power designer.
  • Built advanced chart types and visualizations such as bar charts, bullet graphs, box-and-whisker plots, reference lines, crosstabs, scatter plots, geographic maps, pie charts, and heat and tree maps.
  • Implemented advanced geographic mapping techniques, using custom images and geocoding to build spatial visualizations of non-geographic data.
  • Used trend lines, reference lines and statistical techniques to describe the data.
  • Used groups, bins, hierarchies, sorts, sets and filters to create focused and effective visualizations.
  • Used parameters and input controls to give users control over certain values.
  • Used Device Designer to prepare visualizations for desktop, tablet and mobile layouts.
  • Connected to the Tableau Server PostgreSQL repository to monitor server health and performance (see the sketch after this list).
  • Managed user and group permissions for multiple Tableau Server sites.
  • Performed Tableau administration activities such as installation, upgrades, backup and restore.
  • Performed Tableau admin tasks using tabcmd and tabadmin commands, as in the sketch below.
  • Managed Tableau story development, publishing, scheduling and subscriptions.
  • Used R and RStudio (linear regression, clustering and classification techniques) to mine the data, and created visualizations with multiple plots.
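A minimal sketch of the Tableau Server administration routine described above, assuming a pre-TSM (tabadmin-era) installation; the server URL, site, project, workbook, file paths and credentials are hypothetical placeholders:

#!/usr/bin/env bash
# Hypothetical tabcmd/tabadmin maintenance routine: publish a workbook,
# take a dated backup, then check the PostgreSQL repository for recent
# background jobs. All names and paths are placeholders.
set -euo pipefail

SERVER="https://tableau.example.com"
REPO_PASS="${REPO_PASS:-changeme}"   # repository 'readonly' password (placeholder)

# Sign in to a specific site before running other tabcmd commands.
tabcmd login -s "$SERVER" -t Finance -u admin --password-file /secure/tabcmd.pass

# Publish a workbook into a project, overwriting any existing copy.
tabcmd publish "mide_dashboards.twbx" -n "MIDE Dashboards" --project "Home Mortgage" --overwrite

tabcmd logout

# Nightly backup; -d appends the current date to the backup file name.
tabadmin backup /backups/tableau_backup -d

# The repository listens on port 8060 once the 'readonly' user is enabled;
# query recent background jobs to monitor server health.
PGPASSWORD="$REPO_PASS" psql -h tableau.example.com -p 8060 -U readonly -d workgroup \
  -c "SELECT job_name, progress, created_at FROM background_jobs ORDER BY created_at DESC LIMIT 20;"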
  Hadoop responsibilities:
  • Involved in end-to-end Agile implementation of the project, including requirements gathering, design documents, coding and unit testing, and rollout to testing and production environments.
  • Managed data coming from different sources; involved in HDFS maintenance and in loading structured, semi-structured and unstructured data.
  • Performed data analysis in Hive by creating internal and external tables and loading them with data.
  • Created Hive external tables, loaded data into them and queried the data using HQL.
  • Developed optimal strategies for importing and exporting data into HDFS using Sqoop and Flume (see the sketch after this section).
  • Imported and ingested data into HDFS and Hive from RDBMS sources such as Oracle and Teradata.
  • Implemented Hive generic UDFs to incorporate business logic into Hive queries.
  • Developed custom Java MapReduce programs to transform data and load it into Hive tables.
  • Wrote Hive queries joining multiple tables based on business requirements.
  • Monitored workload, job performance and capacity planning using Cloudera Manager.
  • Involved in Agile methodologies, daily scrum meetings, sprint planning.
  • Involved in Build, Deployment and Integration.
  • Collaborated with the infrastructure, network, database, application and BI/BA teams to resolve production issues and ensure data quality and availability.
  Environment: Tableau 9.x/10.x, Hadoop 2.x, Impala 2.5.x, Sqoop, Hive, HBase, Pig, R, Oracle 11g/TOAD, Teradata 14, UNIX, shell scripting, Autosys, HP QC, Jira, Cloudera, Eclipse.
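A minimal sketch of the Sqoop ingestion and Hive external-table pattern described above; the JDBC connection string, credentials, schema, table, column and path names are hypothetical placeholders:

#!/usr/bin/env bash
# Hypothetical ingestion job: land an Oracle table in HDFS with Sqoop,
# then expose it through a Hive external table. All names are placeholders.
set -euo pipefail

# Import the source table as tab-delimited text with 4 parallel mappers.
sqoop import \
  --connect jdbc:oracle:thin:@dbhost.example.com:1521/ORCL \
  --username etl_user \
  --password-file /user/etl/.ora.pass \
  --table LOAN_SERVICING \
  --target-dir /data/raw/loan_servicing \
  --fields-terminated-by '\t' \
  --num-mappers 4

# Define a Hive external table over the landed directory and run a quick
# sanity query with HQL.
hive -e "
CREATE DATABASE IF NOT EXISTS raw;
CREATE EXTERNAL TABLE IF NOT EXISTS raw.loan_servicing (
  loan_id BIGINT,
  borrower STRING,
  unpaid_balance DECIMAL(12,2),
  status STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION '/data/raw/loan_servicing';
SELECT status, COUNT(*) AS loans FROM raw.loan_servicing GROUP BY status;
"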

03/2011 to 03/2015 Hadoop & Tableau Developer Infosys Ltd | San Ramon, CA
  • Project Title

05/2010 to 03/2011 ETL Developer Wells Fargo | City, STATE
  • Project Title

06/2008 to 04/2010 Informatica Developer Tech Mahindra | City, USA
  • Project Title

Education
Master of Science (expected) | BITS-Pilani, Rajasthan
Certifications
  • Certified Tableau 10 Professional
  • Certified Teradata Professional
  • Certified IBM AIX (UNIX) Server Administrator
