Jessica Claire
  • Montgomery Street, San Francisco, CA 94105
  • H: (555) 432-1000
  • resumesample@example.com
  • Nationality: India
  • Marital Status: Single
Career Overview

Interdisciplinary Lead Software Quality Assurance Engineer experienced in high-performance, high-load environments. An advocate for Test First and other Agile software development methodologies such as Scrum and Continuous Integration.

Career Focus
  • Expertise in setting up test frameworks and writing test automation on the full stack from scratch.
  • Proficiency in integrating end-to-end testing into the build process.
  • Proficient in performing root cause analysis across the full stack, with specific expertise in debugging and resolving performance and scaling issues.
  • Proficient in developing BI (Business Intelligence) reports that provide historical, current and predictive views of business operations.
  • Demonstrated ability to leverage talents and leadership skills to infuse new ideas and consistently deliver results that exceed expectations.
Areas of Expertise
  • Performance Testing
  • Load Testing
  • Stress Testing
  • RUM - Real User Monitoring
  • Automating Continuous Integration tests
  • Automating Functional Test Cases
  • GUI Testing
  • Web Testing
  • Cross-browser, cross-platform testing
  • Leading Scrum Teams to hyper-productivity
Technical Skills
  • Windows, Unix, Linux, Mac OS X, VMware
  • LAMP Architecture, HTTP/Apache, IIS, HTML, PHP
  • Visual Basic, Java, C, C++, C#, ASP
  • MS Access, SQL Server, MySQL
  • MS Exchange, Forefront Server, SharePoint, OCS
  • MS - Windows Test Technologies (WTT), Proactis
Testing Tools
  • Performance Testing: Visual Studio Test, LoadRunner 11.0, Performance Monitor, Process Explorer, ANTS Performance & Memory Profiler, JetBrains .trace and .memory, SOASTA mPulse for RUM, New Relic application performance monitoring, apachetop, Xdebug
  • Unit Testing: NUnit 2.0, Perseus .NET 1.6, Visual Studio Test
  • Front End Automation: WinRunner 8.0, QTP, WebDriver
  • Bug Tracking Tool: Bugzilla 2.18 - 2.20 - 3.0, Product Studio 2.10, Quality Center 11.0, Microsoft TFS
  • Forensic Testing Tool: AccessData Forensic Toolkit (FTK) 1.50
  • Data Recovery Tool: EnCase 4.20
  • Virtualization Tool: VMware, Microsoft Hyper V
  • Tracing Tool: Ethereal, Wireshark, Fiddler
  • Runtime Verification: Microsoft AppVerifier 4.0
  • File Fuzzing Tool: File Fuzzer 2.5, Fuzz Guru 1.5
  • Service Verification: soapUI 4.0
Accomplishments
  • Implemented automated functional testing from scratch at Hudson Energy and Bloomberg TV.
  • Implemented Performance test automation from scratch at Bloomberg TV and Publishers Clearing House.
  • Implemented front-end performance testing from scratch at Publishers Clearing House.
  • Implemented RUM across 8 properties and multiple mobile applications at Publishers Clearing House.
  • Implemented 27 performance improvement projects across the full stack at Publishers Clearing House, increasing throughput month over month to more than a billion hits served per month.
Work Experience
Principal Performance Engineer, 08/2012 - Current
Splunk McLean, VA,

Project Description:


Publishers Clearing House is a direct-marketing company that sells merchandise and magazine subscriptions and operates several prize-based websites. While best known for the sweepstakes and Prize Patrol it uses to promote its magazine subscriptions, the majority of the company's revenue now comes from merchandise and online advertisements. PCH has been selling media online successfully since 1996, and it now also runs over eight ad-driven gaming sites, including PCH Search and Win, PCH Lotto, PCH Games, PCH Save and Win, PCH BlackJack and Candystand.


Role/Responsibilities:


•Evaluation of Application

•Evaluation of Application Design and implementation to catch potential performance issues at the design stage.

•Analyzing the Non-Functional Requirement Specs (NFRS)

•Ensuring non-functional stories are complete with clearly defined acceptance criteria.

•Development of Non-Functional Test Strategy and Master Test Plans

•Development of automated performance/load tests using Visual Studio Test.

•Execute tests, analyze results, and propose changes/fixes when bottlenecks or other performance issues are found.

•Time Line

•Resourcing

•Resourcing automation tools

•Design and implementation of performance test framework.

•Design and implement the Agile Performance Test LifeCycle within the company.

•Setting up Test Bed

•Continuous Integration Performance Testing

•Creating measurable quality gates for performance of applications

•Prioritizing the bugs and getting them fixed

•Provide risk assessment, test schedule, test results and test reports and metrics

•Use metrics to identify areas with most need.

•Design and implement the company's RUM (Real User Monitoring) implementation to identify bottlenecks not visible to synthetic tests.

•Analyzed, proposed, and implemented 27 web application performance improvement projects across the .NET and PHP stacks, including the following (a configuration sketch for the Expires headers item appears after this list):

1. Expires headers across the complete ASP and PHP stacks
2. Google PageSpeed across the PHP stack
3. Automated minification of assets (JS & CSS) across the full stack
4. Image optimization and image sizing across the full stack
5. Corrected CDN usage across the full stack
6. Reduced HTTP requests
7. Domain sharding
8. Reduced cookie size
9. Cookieless domains
10. Combined images served from the same domain
11. Implemented app caching (memcached / ScaleOut)
12. Implemented bootstrap caching for PHP
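
As a rough illustration of the Expires-headers work on the ASP.NET side (item 1), the sketch below shows one common way to attach far-future Expires/Cache-Control headers to static assets via an IHttpModule. The class name, file extensions, and lifetimes are assumptions for the example, not the production PCH code.

using System;
using System.Web;

// Hypothetical sketch only: add far-future Expires/Cache-Control headers to static assets.
public class StaticAssetExpiresModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.PreSendRequestHeaders += (sender, e) =>
        {
            HttpContext ctx = ((HttpApplication)sender).Context;
            string path = ctx.Request.Path.ToLowerInvariant();

            // Long-cache only static assets (JS, CSS, images); HTML stays short-lived.
            if (path.EndsWith(".js") || path.EndsWith(".css") ||
                path.EndsWith(".png") || path.EndsWith(".jpg") || path.EndsWith(".gif"))
            {
                ctx.Response.Cache.SetExpires(DateTime.UtcNow.AddDays(365));
                ctx.Response.Cache.SetMaxAge(TimeSpan.FromDays(365));
                ctx.Response.Cache.SetCacheability(HttpCacheability.Public);
            }
        };
    }

    public void Dispose() { }
}

A module like this would be registered in web.config; on the PHP side the equivalent effect is typically achieved with mod_expires directives in the Apache configuration.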

Lead Automation Engineer, 2012 - 08/2012
Eaton Corporation Texas, AL,

Project Description:


Bloomberg LP is a recognized leader in delivering financial market data, including securities data (debt, equities, and derivatives), commodities data, and other global economic news. Bloomberg makes financial data available through many mediums, such as the Bloomberg Terminal, Bloomberg Businessweek, Bloomberg Radio and Bloomberg Television. The Bloomberg media engineering group is responsible for all the supporting hardware and software applications required to produce, edit, re-engineer and distribute content produced in-house and syndicated through Bloomberg outlets. As the Lead Automation Engineer for Bloomberg Media Engineering, my primary responsibility was to improve the software development life cycle by removing the engineering group's biggest impediment: testing. The test cycle at the time took weeks to execute, and as the applications grew and became more complex to meet business needs, the group was unable to deliver quality products in a timely manner.


Role/Responsibilities:


•Evaluation of Application

•Analyzing the Functional Requirement Specs (FRS)

•Ensuring Stories are complete with clearly defined acceptance criteria.

•Development of Test Strategy and Master Test Plans

•Time Line

•Resourcing

•Deciding whether to test the application manually or with automated tools

•Resourcing automation tools

•Design and implementation of test framework.

•Setting up Test Bed

•Testing important phases for roll out

•Executing Functional and Non-Functional tests (including Performance, load and stress testing)

•Prioritizing the bugs and getting them fixed

•Provide risk assessment, test schedule, test results and test reports and metrics

•Use metrics to identify areas with most need.

•Gathered Performance requirements for multiple applications.

•Used Visual Studio Test for performance testing (see the sketch following this list).

•Developed web service load test scripts using SoapUI.

•Monitored Metrics for indication of bottlenecks on Application server, Web server and database server.
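
For context, a minimal Visual Studio coded web performance test of the kind referenced above might look like the sketch below; the URLs and class name are placeholders, not the actual Bloomberg test code.

using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;

// Hypothetical example of a coded web performance test that a load test can drive.
public class HomePageWebTest : WebTest
{
    public override IEnumerator<WebTestRequest> GetRequestEnumerator()
    {
        // Request the landing page and validate the HTTP status code.
        WebTestRequest home = new WebTestRequest("http://app.example.com/");
        home.ExpectedHttpStatusCode = 200;
        yield return home;

        // Request a heavier report page to exercise the backend under load.
        WebTestRequest report = new WebTestRequest("http://app.example.com/reports/daily");
        report.ExpectedHttpStatusCode = 200;
        yield return report;
    }
}

A web test like this is then usually added to a .loadtest scenario, where user counts, ramp-up, and run duration are configured.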

Sr. Quality Assurance Engineer/ Analyst, 04/2011 - 11/2011
Qvc, Inc. Spring Hill, FL,

Project Description:


Bridgewater Associates is a top-performing hedge fund that manages approximately $125 billion in global investments for a wide array of institutional clients, including foreign governments, central banks, corporate and public pension funds, university endowments and a number of charitable foundations. As part of making investment decisions, Bridgewater relies on researchers to analyze equities, non-equities and global trade data, among other factors, from external sources such as GTIS, Bloomberg, and Reuters to generate signals for buy-side and sell-side trading so the company can minimize risk and maximize profits for its clients. The research technology team supports researchers in achieving their goals by creating tools to onboard, visualize and clear new data. My role as a Scrum team member in research technology meant that I was always working through a varied set of responsibilities with the ultimate team goal of delivering business value. These responsibilities often consisted of gathering clear and crisp requirements and acceptance criteria, assuring quality in the products we were delivering, and ensuring the product met business needs.


Role/Responsibilities:


•Evaluation of Application

•Analyzing the FRS and NFRS

•Ensuring Stories are complete with clearly defined acceptance criteria.

•Development of Test Strategy and Master Test Plans

•Time Line

•Resourcing

•Deciding whether to test the application manually or with automated tools

•Setting up Test Bed

•Testing important phases for roll out

•Executing Functional and Non-Functional tests (including load testing)

•Prioritizing the bugs and getting them fixed

•Updating project manager with the status of application

•Adhering to time frames and giving on time deliverables

•Provide risk assessment, test schedule, test results and test reports and metrics

•Use metrics to identify areas with most need.

•Coordinated test walk through and follow ups

•Led product demos

•Led retrospectives, helping improve myself and the team each iteration

•Gathered Performance requirements for the application and designed performance tests for multiple clients within the organization.

•Customized LoadRunner scripts in the C language.

•Developed web service load test scripts using SoapUI.

•Responsible for setting up monitors to monitor network activities and bottlenecks.

•Analyzed results for bottlenecks and made recommendations to address them.

•Monitored Metrics on Application server, Web server and database server.

•Created rendezvous points in performance test scenarios to find deadlocks.

•Involved in Business functionality review meetings and Use-Case Analysis and developing the templates for User/Customer Training and documentation.

Lead Quality Assurance Engineer, 02/2010 - 04/2011
Svb Financial Group Portland, OR,

Project Description:


Hudson Energy is one of the largest and fastest-growing B2B suppliers of electricity and natural gas commodities in North America. Hudson has been serving commercial and residential customers since 2002, providing them with the ability to hedge electric and gas commodities or purchase fixed and index products in New York, New Jersey, Texas and Illinois. To meet customer and market needs, Hudson was building a new billing and CRM solution. My initial role on the team was to document and lead the testing efforts on the new billing system so that we were ready for the switchover from a Paradox-based solution to a new SQL-based model. As part of this role, I was in charge of leading the complete test effort from test strategy to test resourcing, including interviewing and training new testers. My role quickly evolved into Scrum Master, leading a team of six developers and two other testers on various projects within Hudson. As Scrum Master, I had the responsibility of removing impediments, which often required interacting with other groups within Hudson, understanding the business needs, relaying team needs to upper management, and providing transparency into the current status of the team. As an effective part of the scrum team itself, I also furthered BI efforts by gathering sales metrics and performance data and creating reports in SSRS so they could be accessed on demand by management. Additionally, while leading the team in completing the billing system and other supporting systems, I scripted and automated the complete migration effort from Paradox to SQL; the automation was written in C# and scripted with SQL/T-SQL.


Role/Responsibilities:


•Evaluation of Application

•Analyzing the FRS and NFRS

•Deciding whether to test the application manually or with automated tools

•Setting up Test Bed

•Creation of Master Test Plans

•Delegation of work among team members

•Training team members on the tools specific to the application

•Testing important phases for roll out

•Conducting daily and weekly meetings for progress and enhancements

•Leading the daily scrum as Scrum Master.

•Removing Impediments

•Creation of utilities to make the testing process faster.

•Prioritizing the bugs and getting them fixed

•Updating project manager with the status of application

•Development of Test Strategy

•Development of Automation Libraries in C#

•Development of migration scripts from Paradox to SQL (see the sketch following this list).

•Time Line

•Resourcing

•Adhering to time frames and giving on time deliverables

•Provide risk assessment, test schedule, test results and test reports and metrics

•Use metrics to identify areas with most need.

•Design and implementation of non functional tests - such as performance tests.

•Coordinated test walk through and follow ups

•Creating reports in SSRS

•Led product demos

•Compiled velocity, burndown, and business value metrics

•Led retrospectives and helped improve the team each iteration
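
As a rough illustration of the Paradox-to-SQL migration scripting mentioned above, the sketch below shows one plausible shape for it: reading Paradox tables through the Jet OLE DB provider and bulk-loading them into SQL Server. The provider string, table names, and class are assumptions, not the original Hudson code.

using System.Data;
using System.Data.OleDb;
using System.Data.SqlClient;

// Hypothetical sketch: copy one Paradox table into an identically named SQL Server table.
public static class ParadoxMigration
{
    public static void CopyTable(string paradoxFolder, string tableName, string sqlConnectionString)
    {
        // Paradox .db files in a folder are exposed as tables by the Jet OLE DB provider.
        string paradoxConnectionString =
            "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + paradoxFolder +
            ";Extended Properties=Paradox 5.x;";

        using (OleDbConnection source = new OleDbConnection(paradoxConnectionString))
        using (OleDbCommand select = new OleDbCommand("SELECT * FROM [" + tableName + "]", source))
        {
            source.Open();
            using (IDataReader reader = select.ExecuteReader())
            using (SqlBulkCopy bulkCopy = new SqlBulkCopy(sqlConnectionString))
            {
                // Stream rows straight into the destination table on SQL Server.
                bulkCopy.DestinationTableName = tableName;
                bulkCopy.WriteToServer(reader);
            }
        }
    }
}

In a setup like this, post-load T-SQL scripts could then handle type corrections, keys, and referential integrity.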

Sr. Software Test Engineer, 09/2007 - 02/2010
Microsoft Corporation City, STATE,

Project Description:


The Forefront Security team within Microsoft is involved with the development and testing of Microsoft Forefront Security for Exchange Server, among various other configurations. My role in the project initially (as a manual tester) involved testing the Exchange Server version of the product. The Exchange Server version of Microsoft Forefront Security includes multiple scan engines from industry-leading antivirus and anti-spam vendors, all integrated into a single solution to help businesses protect their Exchange messaging environments from viruses, worms, and spam. Although my main role on the team was testing the Exchange version of the product, I occasionally also had the opportunity to work with the SharePoint and Office Communicator versions of Forefront.


Role/Responsibilities:


•Creation of Test Cases.

•Functionality Testing.

•Smoke Testing.

•Verify bugs – reproduce by re-creating customer environments. Since each customer environment is different, this could mean recreating several virtual machines with unique configurations each time.

•Verify Fixes

•Regression test


My role eventually expanded to include automation of test cases for anti-virus and anti-spam engine testing. Additionally, I was given the responsibility of monitoring our submissions mailbox and creating a solution for an automated response with a tracking ID. The code for the automation, as well as the code for the auto-reply add-in, was written in C#, and I wrote the design document for both. I also trained new junior testers and familiarized them with the role and responsibilities of a tester on our team.
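
A minimal sketch of what such an auto-reply routine can look like with the Outlook object model is shown below; the method, subject format, and tracking-ID scheme are hypothetical, not the shipped add-in code.

using System;
using Outlook = Microsoft.Office.Interop.Outlook;

// Hypothetical sketch: acknowledge a submission email with a generated tracking ID.
public static class SubmissionAutoReply
{
    public static void SendAcknowledgement(Outlook.MailItem submission)
    {
        // Derive a short tracking ID; a real system would persist it for follow-up.
        string trackingId = Guid.NewGuid().ToString("N").Substring(0, 12).ToUpperInvariant();

        Outlook.MailItem reply = submission.Reply();
        reply.Subject = "[Tracking " + trackingId + "] " + submission.Subject;
        reply.Body = "Thank you for your submission. Your tracking ID is " + trackingId + ".\r\n\r\n"
                     + reply.Body;
        reply.Send();
    }
}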


Role/Responsibilities:


•Automation of test cases using the NUnit and Perseus frameworks (see the test sketch following this list).

•Set up test environments using VMware and Hyper V

•Unit Testing

•Execution of tests

•Functionality Test

•Performance Test

•Memory Leak Test

•Runtime Verification Test

•File Fuzzing Test

•Logging bugs in Product Studio and getting them fixed

•Regression Test

•Creation of an Automated Response Add-in for Outlook

•Responding to Enterprise customer submissions

•Submitting virus samples to vendors

•Following up on vendor responses.

•Adhering to time frames and giving on time deliverables

•Training Junior testers.
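
To give a flavor of the NUnit-based automation referenced at the top of this list, the sketch below is an illustrative NUnit 2.x-style test; the engine wrapper is a hypothetical stand-in, not the actual Forefront test code.

using NUnit.Framework;

[TestFixture]
public class ScanEngineTests
{
    // Standard EICAR anti-virus test string; every compliant engine should flag it.
    private const string EicarTestString =
        @"X5O!P%@AP[4\PZX54(P^)7CC)7}$EICAR-STANDARD-ANTIVIRUS-TEST-FILE!$H+H*";

    [Test]
    public void Scan_EicarSample_IsReportedAsInfected()
    {
        var engine = new FakeScanEngine();   // hypothetical stand-in for the real engine wrapper
        bool infected = engine.Scan(EicarTestString);
        Assert.IsTrue(infected, "EICAR sample should be detected");
    }

    // Minimal fake so the example compiles on its own.
    private class FakeScanEngine
    {
        public bool Scan(string content)
        {
            return content.Contains("EICAR-STANDARD-ANTIVIRUS-TEST-FILE");
        }
    }
}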

Relevant Professional Experience

Systems Analyst / QA Tester, 01/2006 - 09/2007
Certified General Holding Corporation

Systems Analyst, 03/2005 - 12/2005
World Trade Zone

Systems Administrator, 01/2004 - 02/2005
New York Restoration Inc, NY

Education
Bachelor of Science: Computer Forensics (expected)
SUNY Farmingdale - Farmingdale, NY
Certifications
  • Diploma in Network Administration (CompTechnology, NY)
  • Certified ScrumMaster, CSM (ScrumAlliance, NY)
