Software Engineer with thorough hands-on experience in all levels of testing, including performance, functional, integration, system, regression and user acceptance testing.
|Skills|Experience|Total Years|Last Used|
|ALM Performance Center 11.0|Medium|3|2013|
|ALM Quality Center 11.0|Expert|10|2013|
Awarded the GM OnStar CIO Award after the successful implementation of OnStar in China.
Responsible for handling multiple highly complex performance test projects end-to-end and advising application teams on performance tuning recommendations.
Installed, configured, and maintained LoadRunner software and performance monitoring tools, including J2EE Diagnostics and SiteScope.
Provided project teams and LOB partners with performance and capacity assessments and deployment risk analysis using Key Performance Indicators.
Provided application teams with performance test sign-off based on the agreed success criteria.
Developed Test Strategies and Test Plans for an Iterative Incremental Development Process.
Led multiple Performance and Functional testing projects that used agile development methodologies.
Led testing activities on OnStar's Verizon API for activation, deactivation, Change Mobile Number, and ESN Swap.
Supported Test Coordination efforts between UAT, SIT, Performance and offshore (China, India and Brazil) teams.
Developed Test Summary Reports for GO/NO-GO decision making.
Conducted high level requirements analysis, wrote proposals for performance test requirements, and Statements of Work for various client engagements.
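KPI-driven performance assessments of the kind described above commonly rest on percentile response times checked against an SLA. A minimal sketch of that calculation; the function name, sample timings, and SLA value are illustrative, not taken from the original engagements:

```python
def percentile(samples, pct):
    """Return the pct-th percentile of response-time samples (nearest-rank method)."""
    ordered = sorted(samples)
    # Nearest-rank: ceil(pct/100 * n), computed with integer ceiling division.
    rank = max(1, -(-pct * len(ordered) // 100))
    return ordered[rank - 1]

# Hypothetical response times in seconds from one load test run.
response_times = [0.8, 1.1, 0.9, 2.4, 1.0, 1.3, 0.7, 3.1, 1.2, 0.95]

p90 = percentile(response_times, 90)
# A typical sign-off criterion: the 90th percentile must stay under an agreed SLA.
sla_seconds = 3.0
print(f"90th percentile: {p90:.2f}s -> {'PASS' if p90 <= sla_seconds else 'FAIL'}")
```

With the sample data above, the 90th percentile is 2.40 s, so the run would pass a 3-second SLA.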
Client: American Electric Power Inc.
Role: Performance Test Engineer Sr.
Performance tested PeopleSoft Financials 8.4 (Billing, Account Reconciliation, Accounts Payable, and General Ledger) and PeopleSoft HRMS 8.3 (Time and Labor, Benefit Enrollment) using the WEB/HTML protocol.
Monitored web server, application server, and database server operating systems using LoadRunner's rstatd and PerfMon monitors.
Monitored Oracle Database V$SESSTAT, V$SYSSTAT, V$VIEW, and V$INSTANCE dynamic performance views and provided tuning recommendations.
Monitored WebLogic queues and threads in the WebLogic console and UNIX logs.
Presented the final test reports to the customer.
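Tuning recommendations derived from V$SYSSTAT-style counters often start from ratios such as the classic buffer cache hit ratio (logical reads served from memory versus physical disk reads). A hedged sketch; the counter values below are invented for illustration:

```python
# Illustrative V$SYSSTAT-style counters (values invented for the example).
sysstat = {
    "db block gets": 120_000,
    "consistent gets": 880_000,
    "physical reads": 45_000,
}

def buffer_cache_hit_ratio(stats):
    """Classic buffer cache hit ratio: 1 - physical reads / logical reads."""
    logical = stats["db block gets"] + stats["consistent gets"]
    return 1 - stats["physical reads"] / logical

ratio = buffer_cache_hit_ratio(sysstat)
# Values well below ~0.9 sustained under load would prompt a closer look
# at the buffer cache sizing or the workload's I/O pattern.
print(f"buffer cache hit ratio: {ratio:.3f}")
```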
Client: Ford Motor Company
Role: Performance Test Engineer Sr.
Gathered load test requirements from Subject Matter Experts (SMEs) and Application Owners.
Designed and developed the load test framework for the WAS 4 to WAS 5 and Oracle8i to Oracle9i migration in the ADS porting center using LoadRunner 8.0.
Monitored memory leaks in WebSphere applications using IBM's Tivoli Performance Viewer.
Used the SAR, rstatd, and PerfMon remote monitoring utilities to monitor UNIX/Linux and Windows operating systems, respectively.
Created the final test report upon successful completion of the ported environment test.
Met with the IT Application Owner and IT Subject Matter Expert to discuss testing results.
Client: Sterling Commerce (SBC)
Role: QA Analyst Sr.
Designed and developed the test framework for the Commerce:Centre and WAT applications using Mercury's TSL language.
Performed system testing and regression testing using WinRunner 7.01.
Designed, developed, and maintained automated regression scripts using TSL.
Tested the functionality of all protocols (SMTP, FTP, X400, AS1, AS2, GISB, Connect:Mailbox, and POP3) using WinRunner and SilkPerformer.
Manually tested the anti-spam functionality for the SMTP, FTP, X400, AS1, AS2, and GISB protocols, and the functionality of the MQ Pickup Time Processor, Loader, and Extractor.
Performed load and stress tests using Segue's SilkPerformer 5.1.1 on the protocols, sending about 20,000 EDI messages/hour.
Monitored network traffic using the network protocol analyzer Ethereal.
Scripted automated test cases in 4Test using SilkTest.
Tested CPCPS commercial auto, property, general liability, personal umbrella, commercial umbrella, pollution, and crime functionalities.
Performed unit, system, integration, and regression tests, analyzing results and reporting application bugs in the defect tracking system. Retested problems as fixed and updated problem statuses and notes.
Created SQL scripts to run against the DB2 database for database testing.
Reviewed test plans and identified tests that should not be automated due to insufficient return on investment for the client; moved these to separate test plans and executed them manually.
Participated in reviews and updates to exit criteria following the final regression test run subsequent to production rollout.
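A load level like the ~20,000 EDI messages/hour above is typically translated into per-virtual-user pacing before a SilkPerformer or LoadRunner run. A small sketch of that arithmetic; the virtual-user count is a hypothetical input, not a figure from the original project:

```python
def pacing_seconds(target_per_hour, virtual_users):
    """Seconds each virtual user should wait between iterations to hit the target rate."""
    per_user_per_hour = target_per_hour / virtual_users
    return 3600 / per_user_per_hour

# Hypothetical scenario: 20,000 messages/hour spread across 50 virtual users.
print(f"pacing: {pacing_seconds(20_000, 50):.1f}s per iteration")
```

With 50 virtual users, each user sends 400 messages/hour, i.e. one iteration every 9 seconds.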