Sr. Business Intelligence Developer Resume Example

Resume Score: 80%

SR. BUSINESS INTELLIGENCE DEVELOPER
Professional Summary
  • Successful BI/ETL Developer & Designer with 8 years of experience.
  • Focused on creating clean, robust, secure code while meeting and exceeding customer demands.
  • Experienced in the development and support of Business Intelligence solutions using IBM DataStage, MSBI, Informatica, and Tableau.
  • Worked in various SDLC processes, including Agile Scrum and Waterfall.
  • Excellent review, analysis, and problem-solving skills across the Financial, Commercial, Healthcare, and Pharmaceutical domains.
Work History
Federal Home Loan Bank, Dallas, TX | Sr. Business Intelligence Developer | 07/2017 - Current
  • Worked as an ETL Developer during the analysis, planning, design, development, and implementation stages of projects using IBM WebSphere software (QualityStage v8.1, Web Services, Information Analyzer, ProfileStage, WISD of IIS 8.0.1).
  • Developed complex ETL packages to extract and transform data from various sources and load data to different destinations.
  • Created and automated DataStage jobs to migrate data from the legacy system (Progress) as part of the bank's de-customization to .NET applications, saving close to a quarter million dollars.
  • Developed and modified complex DataStage jobs for ad-hoc, small, and medium projects to support analysis and strategy for critical financial decisions.
  • Created DataStage jobs using different stages such as Transformer, Aggregator, Sort, Join, Merge, Lookup, Data Set, Funnel, Remove Duplicates, Copy, Modify, Filter, Change Data Capture, Change Apply, Sample, Surrogate Key, Column Generator, Row Generator, etc.
  • Involved in creating UNIX shell scripts for database connectivity and executing queries in parallel job execution.
  • Used DataStage Director to schedule and run jobs, test and debug components, and monitor performance statistics.
  • Adept at mapping source-to-target data using IBM DataStage 8.x.
  • Implemented shared containers for multiple jobs and local containers within the same job as per requirements.
  • Successfully implemented pipeline and partition parallelism techniques and ensured load balancing of data.
  • Modified existing T-SQL as required after discussion with the business team and migrated changes through the SDLC environment; wrote PL/SQL and T-SQL queries to generate reports for other teams' development efforts within the given time frame.
  • Created shared dimension tables, measures, hierarchies, levels, cubes and aggregations on MS OLAP/ Analysis Server (SSAS).
  • Analyzed requests from business and other teams, researched them, and generated PL/SQL and T-SQL queries to provide the data as needed.
  • Implemented various audit trails for critical applications based on business needs.
  • Was part of the IBM InfoSphere DataStage conversion project from DS 8.5 to 11.5, fully involved in testing and all code migration in the SDLC environment.
  • Helped the team with the FRD (Functional Requirement Document) and SDD (Solution Definition Document) for a project in its starting phase.
  • Developed UNIX shell scripts to run IBM InfoSphere DataStage jobs and transfer files to different landing zones.
  • Developed queries for generating drill-down and drill-through reports in SSRS 2017.
  • Identified and worked with parameters for parameterized reports in SSRS 2017 (see the query sketch after this entry).
  • Responsible for scheduling subscription reports with subscription report wizard.
  • Involved in modifying existing reports in SSRS and the Exact ERP system.
  • Created reports in SSRS with different types of properties such as chart controls, filters, interactive sorting, SQL parameters, etc.
  • Designed and implemented automated deployments for Database, SSIS, SSRS using Azure DevOps and TFS.
  • Created Power BI Reports and effective visualizations to publish them to Power BI service.
  • Developed and migrated Tableau reports per business requirements.
  • Scheduled IBM InfoSphere DataStage jobs using AutoSys.
  • Administered SQL server by creating user logins, dropping and locking logins, monitoring user accounts, creating groups, granting privileges to users and groups.
  • Created Distribution Databases, updated Publishers, created & managed Publications, managed Replication monitors.
  • Performed capacity monitoring and short- and long-term capacity planning in collaboration with development resources and system administrators.
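As a hedged illustration of the parameterized drill-down/drill-through reporting described in this role, the T-SQL below sketches the kind of dataset query an SSRS 2017 report parameter might feed. The table, column, and parameter names (dbo.LoanFact, dbo.DimBranch, @AsOfDate, @BranchKey) are assumptions for illustration, not objects from the actual project.

    -- Illustrative SSRS dataset query; all object names are assumed.
    -- @AsOfDate and @BranchKey would be mapped to SSRS report parameters;
    -- passing NULL for @BranchKey returns the summary (drill-down) view,
    -- while a specific key returns the drill-through detail for one branch.
    SELECT  b.BranchName,
            f.LoanType,
            COUNT(*)                  AS LoanCount,
            SUM(f.OutstandingBalance) AS TotalBalance
    FROM    dbo.LoanFact  AS f
    JOIN    dbo.DimBranch AS b
            ON b.BranchKey = f.BranchKey
    WHERE   f.AsOfDate = @AsOfDate
      AND   (@BranchKey IS NULL OR f.BranchKey = @BranchKey)
    GROUP BY b.BranchName, f.LoanType
    ORDER BY b.BranchName, TotalBalance DESC;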
Kansas City Southern Railways, Kansas City, MO | Sr. Business Intelligence Developer | 01/2016 - 07/2017
  • Extensively worked on DataStage for extracting, transforming and loading databases from sources including Oracle, DB2 and Flat files.
  • Collaborated with the EDW team on high-level design documents for the extract, transform, validate, and load (ETL) process, including data dictionaries, metadata descriptions, file layouts, and flow diagrams.
  • Worked with the EDW team on low-level design documents for mapping files from source to target and implementing business logic.
  • Created Batches (DS job controls) and Sequences to control set of jobs.
  • Extensively used DataStage Change Data Capture for DB2 and Oracle files and employed change capture stage in parallel jobs.
  • Worked on embedding reports in critical Web Applications.
  • Executed Pre and Post session commands on Source and Target database using Shell scripting.
  • Managed content packs, apps, and workspaces with user access in the Power BI service.
  • Extensively worked on Job Sequences to Control the Execution of the job flow using various Activities & Triggers (Conditional and Unconditional) like Job Activity, Wait for file, Email Notification, Sequencer, Exception handler activity and Execute Command.
  • Created SSRS reports and developed shared datasets for generating drill-down and drill-through reports.
  • Created multiple customized, on-demand, ad-hoc reports involving complex queries for analyzing multi-dimensional data in SSRS/Power BI.
  • Created roles and implemented department-level security to allow or restrict users from viewing data depending on the departments they belong to.
  • Designed ETL Packages to extract, transfer existing data into SQL Server from different environments for SSAS cubes.
  • Created dimensions in PerformancePoint Server.
  • Imported data into PerformancePoint Server from the database through flat files.
  • Developed UNIX shell scripts to automate file manipulation and data loading procedures.
  • Created parameterized stored procedures to implement business logic.
  • Improved database views for better performance and security.
  • Created DS jobs and scheduled them for nightly loads.
  • Created SSIS packages to get data from SharePoint and load cubes.
  • Deployed SSRS reports to Microsoft Office SharePoint Server.
  • Utilized Parallelism through different partition methods to optimize performance in a large database environment.
  • Involved in creating reports using SSRS and configuring SQL Server Reporting Services (SSRS).
  • Created workflow items in version control system (Visual Studio TFS) to assign work to team.
  • Tuned performance by making use of clustered and non-clustered indexes.
  • Rebuilt indexes and tables as part of performance tuning exercises.
  • Monitored SQL error logs, scheduled tasks, database activity, user counts, connections, and locks, and eliminated blocking and deadlocks.
  • Reviewed query plans to make sure every query used appropriate indexes.
  • Worked with SCDs to populate Type I and Type II slowly changing dimension tables from several operational sources (a Type II load sketch follows this entry).
  • Created UTP document and performed unit test processing.
  • Designed data masking techniques to protect sensitive information when working with offshore teams.
  • Created/Scheduled over 60 jobs for refreshing cubes and multiple cyclical jobs to move critical data for business end users.
  • Responsible for daily verification that all scripts, downloads, and file copies were executed as planned, troubleshooting any steps that failed, and providing both immediate and long-term problem resolution.
  • Monitored assessment software for performance, security and database availability throughout lifecycle.
  • Wrote and optimized SQL statements to assist business intelligence practices.
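A minimal sketch of the Type II slowly changing dimension load referenced in this role, written in T-SQL. The staging and warehouse table names (stg.Customer, dw.DimCustomer) and the tracked columns are assumptions for illustration, not the actual project schema.

    -- Illustrative Type II SCD load: expire changed rows, then insert new versions.
    BEGIN TRANSACTION;

    -- Close out the current row when a tracked attribute has changed.
    UPDATE d
    SET    d.RowEndDate = SYSDATETIME(),
           d.IsCurrent  = 0
    FROM   dw.DimCustomer AS d
    JOIN   stg.Customer   AS s
           ON s.CustomerId = d.CustomerId
    WHERE  d.IsCurrent = 1
      AND (d.CustomerName <> s.CustomerName OR d.Region <> s.Region);

    -- Insert a new current version for customers that are new or were just expired.
    INSERT INTO dw.DimCustomer (CustomerId, CustomerName, Region,
                                RowStartDate, RowEndDate, IsCurrent)
    SELECT s.CustomerId, s.CustomerName, s.Region, SYSDATETIME(), NULL, 1
    FROM   stg.Customer AS s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dw.DimCustomer AS d
                       WHERE  d.CustomerId = s.CustomerId
                         AND  d.IsCurrent = 1);

    COMMIT TRANSACTION;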
Inventiv Health, Princeton, New Jersey | Sr. SQL BI/Tableau Developer | 08/2015 - 01/2016
  • Designed application that collects client transaction volumes for WSS Custody and Sub-Custody products, applies unit costs, and produces monthly export to T&SS Costing system (CIIS).
  • Excellent knowledge of HIPAA (Health Insurance Portability and Accountability Act) transaction codes such as 270/271 (health care benefits inquiry/response), 276/277 (claim status), 470 (benefit codes), 835 (payment or remittance advice), 837 (health care claim), and 834 (benefit enrollment).
  • Created Volume Automation to enable import and maintain unit costs and other reference data, as well as providing summarized MIS reports for administrative/integrity purposes.
  • Responsible for creating databases, tables, clustered/non-clustered indexes, unique/check constraints, views, stored procedures, triggers, and rules.
  • Created functions and developed procedures to implement application functionality at database side for performance improvement.
  • Created Tableau wireframes for various lines of business.
  • Led a team of 3 Tableau developers, assigning work to meet quality standards and deadlines.
  • Utilized Tableau Server to publish and share reports with business users.
  • Built Stories in Tableau with various Data points to narrate data to end users.
  • Created and published Tableau Dashboards into Tableau Server.
  • Created action filters, parameters for preparing dashboards and worksheets using Tableau.
  • Created and Modified T-SQL stored procedures/triggers to validate integrity of data.
  • Created DTS and SSIS packages.
  • Created packages and scheduled them in SQL Agent jobs to get data from OLTP.
  • Created SSIS package to get data from different sources, consolidate and merge into one single source.
  • Created ETL package for data conversion using various transformation tasks.
  • Created Cubes from which data can be retrieved rapidly by enterprise information consumers.
  • Created dashboard to show business performance.
  • Created partitions and designed aggregations in Cubes.
  • Created perspectives in SSAS and configured security levels for Cubes.
  • Designed KPIs in SSAS and imported them to Excel.
  • Created datasets for reports using T-SQL and stored procedures (see the procedure sketch after this entry).
  • Responsible for maintaining cubes using SSAS and populating cubes with data.
  • Responsible for query optimization and performance tuning.
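A sketch of the parameterized stored-procedure pattern used for report datasets in this role, shown with assumed object names (rpt.GetMonthlyVolume, dbo.TransactionVolume); the actual procedures and columns from the WSS Custody costing application are not reproduced here.

    -- Illustrative parameterized procedure returning a report dataset.
    CREATE PROCEDURE rpt.GetMonthlyVolume
        @ProductCode varchar(20),
        @MonthStart  date
    AS
    BEGIN
        SET NOCOUNT ON;

        SELECT  ProductCode,
                ClientId,
                SUM(TransactionCount)            AS MonthlyVolume,
                SUM(TransactionCount * UnitCost) AS EstimatedCost
        FROM    dbo.TransactionVolume
        WHERE   ProductCode      = @ProductCode
          AND   TransactionDate >= @MonthStart
          AND   TransactionDate <  DATEADD(MONTH, 1, @MonthStart)
        GROUP BY ProductCode, ClientId;
    END;

    -- Example call (hypothetical values):
    -- EXEC rpt.GetMonthlyVolume @ProductCode = 'CUSTODY01', @MonthStart = '2015-09-01';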
GlaxoSmithKline, Mumbai, India | SQL BI Developer | 12/2012 - 08/2015
  • Involved in the analysis, design, and development of projects to consume transactional data, consolidate it into the data warehouse, and create reports and dashboards, including writing complex stored procedures for report and dashboard datasets.
  • Gathered business requirements, defined and designed data sources, data flows, data quality analysis, worked in conjunction with data warehouse architect on development of logical data models.
  • Created ETL packages using SSIS to move data from various heterogeneous data sources.
  • Created critical SSRS reports for analysis, which increased sales by 12% for the organization.
  • Created logs for the ETL load at package and task level to record the number of records processed by each package and task, using SSIS.
  • Implemented error handling and rollback processes in the ETL load.
  • Configured SSIS packages using the package configuration wizard to allow packages to run in different environments.
  • Used SSIS to create ETL packages to validate, extract, transform and load data to data warehouse databases, data mart databases, and process SSAS cubes to store data to OLAP databases.
  • Used XML and environmental variables in package configuration for Deploying SSIS packages.
  • Created region-level security in reports to prevent users of one region from seeing another region's data, and implemented report subscriptions using SSRS (a dataset-level sketch follows this entry).
  • Created Cross-Tabs, Drill-down and Sub-Reports using SSRS.
  • Deployed RDLs to Report Server.
  • Created Jobs Performance report that queries system tables to track duration of each job and weekly-average duration using SSRS.
  • Created reports in SSRS using different types of properties like chart controls, filters, Interactive Sorting, SQL parameters etc.
  • Designed SSAS cube to hold summary data for Targit dashboards.
  • Created User Defined Hierarchies depending on business requirement using SSAS.
  • Responsible for maintaining cubes using SSAS and populating cubes with data.
  • Created different measure groups and calculations using SSAS.
  • Created usage-based aggregations in the cube to minimize query retrieval time from the Targit client tool.
  • Implemented cell-level security in SSAS cubes using MDX expressions to restrict users of one region from viewing another region's data.
  • Populated data store with CRM Data (Oracle) using Integration Services (SSIS) as catalyst for ETL process.
  • Created Databases, Tables, Views, Indexes and Created and maintained Roles and Database users.
  • Performed unit and system testing, troubleshooting and bug fixing in development and QA environments.
  • Scheduling Jobs and Alerting using SQL Server Agent.
  • Created datasets for reports using T-SQL and stored procedures.
  • Performed DBA tasks such as backups, restoration of tables and stored procedures, and weekly database refreshes.
  • Involved in project status report updates, creation of issue items for reviews, and timely updates of technical and transformation files.
  • Planned and engineered REST web services to manipulate dynamic datasets.
  • Completed full redesigns of existing websites to improve navigation, enhance visuals and strengthen search engine rankings.
  • Reviewed code to validate structures, assess security and verify browser, device and operating system compatibility.
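To illustrate the region-level report security described in this role, the T-SQL below filters a dataset by a user-to-region mapping table. The object names (sec.UserRegion, dbo.SalesFact) and the @ReportUser parameter are assumptions; @ReportUser stands in for the value SSRS would supply from its built-in User!UserID field when the dataset parameter is defined.

    -- Illustrative region-level security: only regions mapped to the
    -- requesting user are returned.
    SELECT  f.Region,
            f.ProductLine,
            SUM(f.SalesAmount) AS TotalSales
    FROM    dbo.SalesFact  AS f
    JOIN    sec.UserRegion AS ur
            ON  ur.Region   = f.Region
            AND ur.UserName = @ReportUser
    GROUP BY f.Region, f.ProductLine
    ORDER BY f.Region, f.ProductLine;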
LinkedIn

https://www.linkedin.com/in/abhinay-potu-8b7563139/

Business Intelligence
  • Datastage(5)
  • B.O Data Integrator(2)
  • IPC 9x(2)
  • SSAS(4)
  • SSRS(4)
  • Power BI(3)
  • Informatica(3)
  • Tableau(3)
Skills
  • TOAD(5)
  • Service NOW (2)
  • IBM Tools(1)
  • SSMS(4)
  • XML Web Services(3)
  • HTML(2)
  • DB Visualizer(2)
  • SQL Profiler(5)
  • Performance Tuning(7)
  • Database Design(5)
  • Data Modeling(4)
  • Data Migration(6)
  • Jira(4)
  • Confluence(4)
  • Control-M(4)
  • Tidal(3)
  • Tivoli Workload (1)
  • QNXT(3)
  • UNIX(3)
  • Visio(5)
  • Enterprise Architect(4)
  • BCP(5)
  • Visual Basic(3)
  • Agile(5)
  • Waterfall(3)
Languages
  • PL/SQL(5)
  • Oracle(5)
  • Teradata(3)
  • SQL Server(5)
  • T-SQL(5)
  • MongoDB(1)
  • PowerShell(3)
Tools
  • DB Visualizer(4)
  • IBM Data Architect(1)
  • Visual Studio(7)
  • SAS Enterprise(2)
  • MS Office(10)
  • SQL Server BIDS(3)
  • SharePoint Portal Server(3)
  • Windows(10)
  • Unix(3)
Version Control
  • Subversion(2)
  • Azure DevOps(3)
  • GitHub(5)
  • Bitbucket(5)
  • Team Foundation Server(7)
Education
Andhra University, Andhra Pradesh, India | Bachelor of Science: Electronics and Communication Engineering
Certifications
  • Microsoft Power BI: https://courses.edx.org/certificates/91513f60bdae4a4aa7b8ed68e6faa3e6
  • AWS Certified Developer - Associate: https://www.youracclaim.com/badges/ff9a58a3-9586-4a1f-88f5-0d056143ce2a/public_url
Achievements & Events
  • Won Scrum Champion of the Month multiple times at the IT department level.
  • Completed Scrum developer training at FHLB Dallas.
  • Trained citizen developers to excel in ETL and Reporting.
  • Attended various SQL server sessions organized by Microsoft.
References

Can be provided upon request



Resume Overview

Companies Worked For:

  • Federal Home Loan Bank
  • Kansas City Southern Railways
  • Inventiv Health
  • GlaxoSmithKline

School Attended

  • Andhra University

Job Titles Held:

  • Sr. Business Intelligence Developer
  • Sr. SQL BI/Tableau Developer
  • SQL BI Developer

Degrees

  • Bachelor of Science: Electronics and Communication Engineering

