
Senior SQL Developer Resume Example with 20+ Years of Experience

Jessica Claire
609 Johnson Ave., Tulsa, OK 49204
Home: (555) 432-1000 | resumesample@example.com
Summary

Data Engineering professional with over 8 years of experience in the design, development, implementation and support of data warehousing solutions. Experienced in the complete Software Development Life Cycle (SDLC), Software Testing Life Cycle (STLC) and SDLC methodologies. Extensively worked on Informatica Designer tools (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer), Workflow Manager tools (Task Developer, Worklet and Workflow Designer), Workflow Monitor and Informatica PowerExchange. Experienced in using Informatica in SAP HANA, Oracle, MS SQL Server, Teradata and DB2 environments. Hands-on experience in migrating on-premise ETL to Google Cloud Platform (GCP) using cloud-native tools such as BigQuery, Cloud Dataproc, Google Cloud Storage and Composer.

Practical understanding of data modeling concepts such as star schema modeling, snowflake schema modeling, and fact and dimension tables. Experienced in optimizing database querying and data manipulation using SQL and PL/SQL against Oracle, flat file and SQL Server sources. Experienced in implementing Change Data Capture (CDC) using Informatica PowerCenter for Oracle and SAP systems. Experienced in debugging mappings by analyzing the data flow and evaluating transformations, and in performance tuning of data flow through sources, targets, sessions and mappings, identifying and resolving bottlenecks at various stages using techniques such as database tuning and session partitioning. Experienced in test strategy, developing test plans, detailed test cases and test scripts by decomposing business requirements, and developing test scenarios to support quality deliverables; involved in test planning and execution for unit, system and user acceptance testing phases. Expert in analyzing, designing, developing, installing, configuring and deploying the MS SQL Server Business Intelligence suite: SQL Server Reporting Services, SQL Server Analysis Services and SQL Server Integration Services. Performed data profiling, data cleansing, data conversion, exception handling and data matching using Informatica IDQ. Excellent communication and organizational skills; self-motivated, hardworking, quick to learn and open to new technologies.

Skills & Tools
  • ETL Tools: Informatica PowerCenter 10.5.1, PowerExchange 9.6.1/9.0.1, Informatica Data Quality 9.6.1, PowerConnect for SAP BW, PowerConnect for JMS, PowerConnect for IBM MQ Series, PowerConnect for Mainframes, DTS, MDM, Erwin.
  • Scheduling: Tidal, UC4, Control-M, AutoSys, OpCon.
  • Databases & Platforms: Oracle, SAP HANA, MS SQL Server, Snowflake, MongoDB, Teradata, DB2, AWS.
  • Languages & Technologies: Python, Java, SQL, Scala, UNIX, Unix shell scripts, HTML, XML, JSON, Microsoft Office, Apache Spark, Kafka, Kubernetes, Hive.
  • Data: Data analysis, data management, data warehousing, Big Data, PL/SQL, RDBMS, NoSQL, Vertica, Spark, Kafka, Oozie, Maven.
  • Core Skills: Debugging, coding, designing, quality analysis, ETL, SAP BW.
  • AWS Cloud Tools: EC2, Elastic Load Balancers, Elastic Container Service (Docker containers), S3, Elastic Beanstalk, CloudFront, Elastic File System, RDS, DynamoDB, DMS, VPC, Direct Connect, Route 53, CloudWatch, CloudTrail, CloudFormation, IAM, EMR, ELB, Lambda functions, REST API, Airflow, Data Pipeline, Redshift.
  • Google Cloud Platform: Cloud Storage, BigQuery, Composer, Cloud Dataproc, Cloud SQL, Cloud Functions, Cloud Pub/Sub, Dataflow, AI Building Blocks, Looker, Cloud Data Fusion, Dataprep, Firestore.
  • Azure: Azure Storage, Database, Azure Data Factory, Azure Analysis Services.
Experience
02/2022 to Current
Senior SQL Developer, KPMG, Orlando, FL
  • Responsible for identifying all the legacy systems and analyzing their data models.
  • Worked with the client as a business and data mapping analyst to gather business and functional requirements.
  • Experience in building large-scale data pipelines and data-centric applications using Big Data tooling such as Hadoop, Spark, Hive and Airflow in a production setting.
  • Experience in working with REST APIs for data extraction.
  • Developed ETL (Extract, Transform and Load) code components such as mappings, workflows and sessions, as well as database objects such as stored procedures, functions and Unix shell scripts, from business requirements and design plans.
  • Responsible for identifying performance bottlenecks in the ARIES systems during the Integration Testing (IT), System Testing (ST) and Performance Testing (PT) phases and implementing performance tuning techniques.
  • Conferred with the scrum master, client partners and business managers, and adhered to Agile/DevOps practices.
  • Experience in multiple database technologies such as traditional RDBMS (MS SQL Server, Oracle, MySQL, PostgreSQL), distributed processing (Spark, Hadoop, EMR) and NoSQL (MongoDB, DynamoDB).
  • Conduct code review sessions with peer developers to ensure code quality.
  • Wrote scripts and processes for data integration and bug fixes.
  • Utilized Python to handle debugging and automation scripting tasks.
  • Created and implemented complex business intelligence solutions.
  • Created and managed buckets on S3 to store database and log backups and uploaded images for the CDN server.
  • Set up databases on Amazon RDS or EC2 instances as per requirements.
  • Hands-on experience in data management, data security, data modeling, data quality, workflow automation, formulas and validations.
  • Collaborated with Legacy team to define data extraction methodologies and data source tracking protocols.
  • Used Spark SQL to read Parquet data and create tables in Hive using the Scala API.
  • Used various Spark transformations and actions for cleansing the input data.
  • Built data pipelines in Airflow on GCP for ETL-related jobs using different Airflow operators.
  • Experience with GCP Dataproc, GCS, Cloud Functions and BigQuery.
  • Analyzed existing SQL queries to identify opportunities for improvements.
  • Created optimal technical solutions to user needs with in-depth system analysis.
  • Designed and implemented T-SQL queries for reporting and complex solution development.
  • Managed workload independently but collaborated with colleagues to complete larger scale tasks in distributed team environment.
  • Supported projects enabling delivery efficiencies and technical resolution.
  • Followed standard practices for migrating changes to test and production environments.
  • Helped plan data extraction and processing tasks, aiding in eventual deployments.
  • Resolved complex business issues, proposing long-term solutions to avoid repeat problems.
  • Developed custom database objects, stored procedures and delivered application support.
  • Enhanced existing reports with introduction of new system features.
  • Developed reusable components for use in coding and documentation.
  • Developed, implemented and optimized stored procedures and functions using T-SQL (a sketch of a typical procedure follows this list).
  • Designed integration tools to combine data from multiple, varied data sources such as RDBMS, SQL and big data installations.
  • Designed and created ETL code installations, aiding in transitions from one data warehouse to another.
  • Utilized code versioning systems to reduce development times.
  • Designed intuitive graphical user interfaces to improve user experience.
  • Led version control efforts for organization, employing public and open source repositories.
  • Designed and implemented scalable applications for data extraction and analysis.
  • Conducted data modeling, performance and integration testing.
  • Created proofs of concept for innovative new solutions.
  • Built databases and table structures for web applications.
  • Reviewed code, debugged problems, and corrected issues.
  • Developed unit test cases for testing and automation.
  • Developed and maintained microservices architectures using Docker, Kubernetes and OpenShift.
  • Worked with back-end developers to design APIs.
  • Used Node.js, ORMs and SQL/NoSQL to develop and manage databases.
  • Supervised work of programmers, designers and technicians, assigned tasks and monitored performance against targets.
  • Tuned systems to boost performance.
  • Corrected, modified and upgraded software to improve performance.
  • Documented software development methodologies in technical manuals to be used by IT personnel in future projects.
  • Designed and developed analytical data structures.
  • Designed and developed forward-thinking systems that meet user needs and improve productivity.
  • Maintained complex T-SQL queries, views and stored procedures in multi-database environment with little supervision.
  • Developed functional databases, applications and servers to support websites on back-end.
  • Optimized web applications for speed, scalability and security.
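
The stored-procedure work noted in the list above typically follows a pattern along the lines of the hedged sketch below. It is illustrative only: usp_GetOrderSummary, OrderHeader, OrderDetail and the parameter names are hypothetical, not objects from the actual engagement.

    -- Hypothetical reporting procedure; all object and parameter names are illustrative.
    CREATE PROCEDURE dbo.usp_GetOrderSummary
        @StartDate DATE,
        @EndDate   DATE
    AS
    BEGIN
        SET NOCOUNT ON;  -- suppress row-count messages for cleaner reporting/ETL calls

        SELECT  oh.CustomerID,
                COUNT(*)           AS OrderCount,
                SUM(od.LineAmount) AS TotalAmount
        FROM    dbo.OrderHeader AS oh
        JOIN    dbo.OrderDetail AS od
                ON od.OrderID = oh.OrderID
        WHERE   oh.OrderDate >= @StartDate
          AND   oh.OrderDate <  DATEADD(DAY, 1, @EndDate)  -- sargable range so an index on OrderDate can be used
        GROUP BY oh.CustomerID;
    END;

    -- Example call:
    -- EXEC dbo.usp_GetOrderSummary @StartDate = '2023-01-01', @EndDate = '2023-01-31';
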
08/2010 to 02/2022
BI/ETL Developer, Altus Group Limited, Denver, CO
  • Analyzed source data coming from mainframe sources and worked with business users and developers to develop the data model.
  • Effectively and efficiently communicated systems solutions to business problems to team members, business unit representatives, management, and other impacted project teams.
  • Analyzed and modified existing stored procedures, functions and queries to integrate them into a previously built reporting application using SSIS and SQL Server Management Studio.
  • Changed the ETL process for a previously built SSIS application from importing flat files and various databases to reading data from the existing reporting server (SQL Server databases).
  • Interfaced with report users to determine requirements for new and existing reports.
  • Developed, tested and deployed ETL jobs with a reliable error/exception handling and rollback framework (see the T-SQL sketch after this list). Managed automation of file processing as well as all ETL processes within a job workflow.
  • Worked as a data analyst to match the current data mappings with the old mainframe mappings.
  • Created design documents and context diagrams for every project and performed code reviews.
  • Created a reusable lookup to send email notifications to users.
  • This lookup was used by multiple developers across multiple projects.
  • Prepared NoSQL best-practice documentation for teams.
  • As part of the Informatica object decommissioning project, used SVN to archive the ETL code and delete unwanted mappings and workflows.
  • Extracted data from flat files, staged the data in a single location and applied business logic to load it into the BigQuery database.
  • Led a few complex projects, performed unit testing and created QA documents.
  • Participated in solution brainstorming and provided technical instruction and coaching to others within organization.
  • Developed technical project deliverables.
  • Created spreadsheets using Power BI and Power Pivot by importing data from the sources directly.
  • Created visually impactful dashboards in Excel and Tableau for data reporting using PivotTables and VLOOKUP.
  • Involved in migrating ETL code through lower environments such as Dev, Test and Load to the production environment.
  • Migrated ETL code using Deployment Groups from Dev to Prod environments.
  • Created SSIS packages for loading data coming from various interfaces such as OMS, Orders, Adjustments and Objectives, and used multiple transformations in SSIS to collect data from various sources.
  • Worked on SSIS packages and DTS Import/Export for transferring data from databases (Oracle and text-format data) to SQL Server.
  • Created SSIS packages for file transfers from one location to another using the FTP task.
  • Managed and documented the platform infrastructure, from installing a new Consul server to resolving performance issues in a MongoDB cluster and setting up a continuous integration pipeline.
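
The error/exception handling and rollback framework mentioned in the list above typically wraps each load step in an explicit transaction. A minimal T-SQL sketch is shown below; StageSales, FactSales and EtlErrorLog are hypothetical names, not the actual project objects.

    -- Hypothetical ETL step with explicit transaction control and error logging.
    BEGIN TRY
        BEGIN TRANSACTION;

        INSERT INTO dbo.FactSales (OrderID, CustomerID, SaleAmount, LoadDate)
        SELECT OrderID, CustomerID, SaleAmount, GETDATE()
        FROM   dbo.StageSales;

        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0
            ROLLBACK TRANSACTION;  -- undo the partial load

        -- record the failure for support, then re-raise so the scheduler marks the job as failed
        INSERT INTO dbo.EtlErrorLog (ErrorNumber, ErrorMessage, LoggedAt)
        VALUES (ERROR_NUMBER(), ERROR_MESSAGE(), GETDATE());

        THROW;
    END CATCH;
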
08/2008 to 08/2010
Senior Consultant, Boeing, Hazelwood, MO
  • Involved in all the phases of the development like Analysis, Design, Coding, Unit Testing, System Testing and UAT.
  • Analyzed problematic areas to provide recommendations and solutions.
  • Extracted data from heterogeneous sources and performed complex transformations to load data into the target systems.
  • Closely monitored and updated company systems for efficiency, output and other factors to improve overall productivity.
  • Assessed needs for projects and made proposals to senior executives.
  • Resolved various performance issues by examining the logs, current design and removing the bottlenecks.
  • Wrote complex SQL queries and performed extensive data analysis in Oracle 11g.
  • Experience in integrating various data sources such as Oracle, DB2, SQL Server, CSV, XML and flat files into the staging area.
  • Developed code to extract, transform, and load (ETL) data from inbound flat files and various databases into outbound flat files and XML files using complex business logic.
  • Developed Slowly Changing Dimension mappings for Type 1 and Type 2 SCDs (a SQL sketch of the Type 2 pattern follows this list). Monitored and improved query performance by creating views, indexes and subqueries. Extensively involved in enhancing and managing Unix shell scripts.
  • Developed custom solutions based upon clients' strict requirements.
  • Troubleshot issues by understanding issue, diagnosing root cause and coming up with effective solutions.
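
The Type 1/Type 2 slowly changing dimension work above was implemented as Informatica mappings; the T-SQL below is only an equivalent sketch of the Type 2 pattern, with hypothetical StgCustomer/DimCustomer tables and tracked columns. A Type 1 change would simply overwrite the attribute in place instead of versioning the row.

    -- Step 1: expire the current dimension row when a tracked attribute changes.
    UPDATE d
    SET    d.CurrentFlag = 'N',
           d.EffEndDate  = CAST(GETDATE() AS DATE)
    FROM   dbo.DimCustomer AS d
    JOIN   dbo.StgCustomer AS s
           ON s.CustomerID = d.CustomerID
    WHERE  d.CurrentFlag = 'Y'
      AND (s.City <> d.City OR s.Segment <> d.Segment);

    -- Step 2: insert a new current row for both new and changed customers
    -- (changed customers no longer have a current row after step 1).
    INSERT INTO dbo.DimCustomer (CustomerID, City, Segment, CurrentFlag, EffStartDate, EffEndDate)
    SELECT s.CustomerID, s.City, s.Segment, 'Y', CAST(GETDATE() AS DATE), NULL
    FROM   dbo.StgCustomer AS s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dbo.DimCustomer AS d
                       WHERE  d.CustomerID = s.CustomerID
                         AND  d.CurrentFlag = 'Y');
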
07/1999 to 03/2008
Senior Software Developer, TradeAm International Inc, City, STATE
  • Developed databases, programs and processes for integration and implementation across enterprise.
  • Trained new developers and programmers on company standards for design and review.
  • Monitored ongoing operation of assigned programs and responded to problems by diagnosing and correcting logic and coding errors.
  • Coordinated efficient large-scale software deployments.
  • Consulted with engineering team members to determine system loads and develop improvement plans.
  • Met with stakeholders to provide detailed project reports and milestone updates.
  • Completed analysis, design and testing phases of software development life cycle.
  • Participated in highly complex projects with customers, managers and end-users.
  • Troubleshot and resolved performance issues for databases and software.
  • Monitored database performance to keep workflows running smoothly.
  • Resolved hardware and software compatibility and interface design issues.
  • Participated in requirements gathering to solidify prerequisites and determine best technical solution to meet business needs.
  • Discussed project progress with customers, collected feedback on different stages and directly addressed concerns.
  • Boosted network, system and data availability and integrity through preventive maintenance and upgrades.
  • Researched and integrated design strategies, product specifications, development schedules and user expectations into product capabilities.
  • Tuned systems to boost performance.
  • Conducted data modeling, performance and integration testing.
  • Developed conversion and system implementation plans.
  • Coordinated deployments of new software, feature updates and fixes.
Education and Training
Expected in 12/2017
Master of Science: Computer and Information Systems
LNMU, India
