Over 7 years of professional experience in the IT industry, with hands-on experience developing, testing, implementing, and maintaining applications using Hadoop, Java, and SPSS technologies.
Good exposure to Apache Hadoop MapReduce programming, Hive, and Pig scripting.
Designed and created product requirements and functional specifications, and helped develop test strategy, test plan, and test case documents.
Interacted with business analysts and software developers for bug reviews and participated in QA meetings.
Performed data analysis using Hive
Extensive experience in the Software Development Life Cycle (SDLC), including requirements and system analysis, design, programming, testing, implementation, and application maintenance.
Hands-on experience writing queries, stored procedures, functions, and triggers in SQL.
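As an illustration of the kind of SQL trigger work described above, the sketch below uses a hypothetical customers table and audit table (all names are assumptions, MySQL syntax):

```sql
-- Hypothetical audit table: records which customer row changed and when.
CREATE TABLE customer_audit (
    audit_id    INT AUTO_INCREMENT PRIMARY KEY,
    customer_id INT,
    changed_at  DATETIME
);

-- Trigger fires after every UPDATE on the (assumed) customers table
-- and writes one audit row per modified record.
DELIMITER //
CREATE TRIGGER trg_customer_update
AFTER UPDATE ON customers
FOR EACH ROW
BEGIN
    INSERT INTO customer_audit (customer_id, changed_at)
    VALUES (NEW.id, NOW());
END //
DELIMITER ;
```

The same pattern extends to stored procedures and functions for reusable query logic.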
Supported development, testing, and operations teams during new system rollouts.
Evaluated and proposed new tools and technologies to meet the needs of the organization.
Excellent understanding of Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and the MapReduce programming paradigm.
Senior Survey Programmer adept at survey development, testing, and optimization. Excels in development, including coordinating ground-up planning, programming, and implementation for core modules. mrInterview (SPSS) and Confirmit developer with 5 years of experience leading cross-functional teams and completing projects on time. Seamlessly manages workload to meet deadlines.
Experience creating dynamic web interfaces using JavaScript and jQuery.
Experience creating metadata and testing databases.
Experienced in object-oriented programming; integrating and testing implementations; collecting business specifications and user requirements; design confirmation; development; and documenting the entire software development life cycle and QA.
Experienced in managing and coordinating teams of 4 to 7 members.
Able to learn and adapt quickly to emerging technologies and paradigms; learned many technologies on the job as project requirements demanded.
Excellent communication, interpersonal, and analytical skills; a highly motivated team player with the ability to work independently.
Requirements analysis and design phases
Interface design and implementation
Questionnaire analysis with respect to data requirements
Hadoop Developer, Mar 2014 to Jan 2015, Bank of America － Newark, DE
Bank of America is a multinational banking and financial services provider. This project captured customers' login device details and processed that data; the Hadoop ecosystem was used to handle the large data volumes involved. Once the information was gathered, we ran strategies on the collected data and loaded the results into Hive tables.
Involved in requirements analysis, design, coding, and implementation.
Processed data into HDFS by developing solutions, analyzed the data using MapReduce, Pig, and Hive, and produced summary results from Hadoop.
Used Sqoop to move data between the Hadoop Distributed File System (HDFS) and relational databases.
Used Pig Latin to clean unwanted data and analyze the remaining data.
Involved in loading and transforming sets of structured, semi-structured, and unstructured data, and analyzed them by running Hive queries and Pig scripts.
Worked on file formats like text files and Sequence Files.
Managed and reviewed log files.
Responsible for setting up password-less SSH between Hadoop cluster nodes.
Developed Pig Latin scripts to extract data from web server output files and load it into HDFS.
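A minimal Pig Latin sketch of the kind of log-cleaning script described above (paths and field layout are hypothetical):

```pig
-- Load raw web server logs from HDFS; the field layout is an assumption.
raw_logs = LOAD '/logs/webserver/access_log' USING PigStorage(' ')
           AS (ip:chararray, ts:chararray, url:chararray, status:int);

-- Drop malformed rows that failed to parse into the expected fields.
valid = FILTER raw_logs BY status IS NOT NULL AND ip IS NOT NULL;

-- Store the cleaned data back to HDFS as tab-separated text.
STORE valid INTO '/data/cleaned/access_log' USING PigStorage('\t');
```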
Hadoop Developer, Jan 2013 to May 2014, AT&T － Alpharetta, GA
AT&T is an American multinational telecommunications corporation. It is the largest provider of both mobile and landline telephone service, and also provides broadband and subscription television services. As one of the largest telecommunications providers, AT&T has huge volumes of customer data that can be analyzed and taken advantage of. Data about mobile network users is highly valuable to consumer marketing professionals, so the US-based network operator is turning access to and collaboration on its data into a new business service. To ensure secure data sharing while easing access and use of the data, good data management is required, which involves aggregating data from multiple sources. AT&T has created programmable interfaces to each of its data sets that ensure read-only access to the data.
Evaluated business requirements and prepared detailed specifications that followed project guidelines for developing the programs.
Responsible for building scalable distributed data solutions using Hadoop.
Analyzed large data sets to determine the optimal way to aggregate and report on them.
Developed simple to complex MapReduce jobs using Hive and Pig.
Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from MySQL into HDFS using Sqoop.
Exported the analyzed data to the relational databases using Sqoop for
visualization and to generate reports.
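The Sqoop import and export steps above can be sketched as shell commands; connection strings, user, and table names below are all hypothetical:

```shell
# Import a MySQL table into HDFS (database, user, and paths are assumptions).
sqoop import \
  --connect jdbc:mysql://dbhost/warehouse \
  --username etl_user -P \
  --table customer_events \
  --target-dir /data/raw/customer_events \
  --num-mappers 4

# Export aggregated results from HDFS back to MySQL for reporting.
sqoop export \
  --connect jdbc:mysql://dbhost/reports \
  --username etl_user -P \
  --table event_summary \
  --export-dir /data/summary/customer_events
```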
Extensively used Pig for data cleansing and pre-processing.
Created partitioned tables in Hive.
Managed and reviewed Hadoop log files.
Involved in creating Hive tables, loading them with data, and writing Hive queries.
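A short HiveQL sketch of a partitioned table of the kind described above; the schema and paths are hypothetical:

```sql
-- Hypothetical call-record table, partitioned by date for pruned scans.
CREATE TABLE call_records (
    caller   STRING,
    callee   STRING,
    duration INT
)
PARTITIONED BY (call_date STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t';

-- Load a day's worth of files already staged in HDFS into one partition.
LOAD DATA INPATH '/data/cleaned/2013-06-01'
INTO TABLE call_records PARTITION (call_date = '2013-06-01');

-- Queries that filter on the partition column scan only that partition.
SELECT caller, SUM(duration)
FROM call_records
WHERE call_date = '2013-06-01'
GROUP BY caller;
```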
Installed and configured Pig, and wrote Pig Latin scripts.
Developed Pig Latin
scripts to extract the data from the web server output files to load into HDFS.
Loaded and transformed large sets of structured, semi-structured, and unstructured data.
Responsible for managing data coming from different sources.
Worked with application teams to install operating system and Hadoop updates, patches, and version upgrades as required.
Senior Survey Programmer, Apr 2011 to Jan 2012, Toluna Technology Services Pvt. Ltd. － Gurgaon, India
Managed the road map for project design and development. Updated, modified, and expanded existing software programs. Drafted technical documents with detailed design objectives, and prepared architectural and project design documents.
Coordinated with QA testers for end-to-end unit testing and post-production testing.
Supervised the process work of other survey developers.
Gathered, analyzed, and documented business and technical requirements from the client-side PD.
Performed requirements analysis and developed use cases, scenarios, and activity diagrams.
Prioritized Requirements to ensure timely reviews and availability.
Intricately involved in Project Process Flow development, Process Modeling, and Analysis.
Identified feeds/recons with issues and functioned as an escalation point to keep all parties aware of the overall impact on the associated recon, facilitating management decisions at the earliest possible time.
Front-end GUI development using browser-friendly HTML/HTML5, CSS3, jQuery, and the DOM.
Implemented various validation controls for form validation, and built custom validation controls.
Scrutinized the data and prepared preliminary status reconciliations; applied changes, finalized reports, and delivered accurate data to the customer.
Kept track of ongoing projects with junior developers.
Designed and implemented complex quotas: in-loop quotas, least-full quotas, and multi-punch quotas.
Conducted and managed both quantitative and qualitative studies, with an understanding of reports, study objectives, and quality per quotas, industry verticals, and the quantities required by the client.
Enabled and facilitated project improvements using advanced Ms-Excel functions such as VLOOKUP, HLOOKUP, pivot tables, sorting, IF, and COUNT.
Prepared monthly plans for projects with respect to timelines, priority, complexity, and resources.
Drafted and edited questionnaires according to the research subject of the survey.
Assisted in performance analysis of groups: executed reports in a timely, accurate manner and supported overall analysis efforts as needed.
Coached delivery teams, which resulted in early project delivery and reduced process waste caused by lack of common understanding.
Senior Survey Programmer, Dec 2007 to Jul 2010, Annik Technology Services Pvt. Ltd. － Gurgaon, India
Primarily involved in developing projects and handling team projects while mentoring junior developers. Analyzed requirements and used systematic approaches to implement and document tasks. Led working groups to develop strategies and prepare standard project processes. Developed custom software solutions for clients. Recommended and executed plans to improve development tools and processes. Oversaw deployment, configuration, and documentation procedures.
Designed, developed, and executed consumer surveys, B2B surveys, retargeted surveys, and segment surveys.
Programmed and handled multilingual surveys.
Built complex surveys (including complicated piping logic and conditional logic).
Designed, reviewed, and updated questionnaires with the client.
Applied research methodologies (Conjoint/DCM, MaxDiff, and Segmentation).
Delivered projects in different domains, including medical, sports, automotive, finance, advertising & marketing, travel, shopping, internet, media & custom research, video games, beauty, health care, and employment surveys.
Customization of appearance
Highly customized error messages
Constant sum scale questions
Randomization and Block rotation
Scaling questions & Ranking questions
Built different front-end forms and dynamic pages using CSS and HTML, and delivered various reports in Ms-Excel (Count Report, List Report, Mismatch Report) on a weekly basis.
Mentored junior programmers on complex surveys.
Estimated timelines and sketched the project life cycle.
Changed project plans while meeting deadlines, despite frequent interruptions and redefined priorities from clients.
Demonstrated aptitude for working in a virtual team environment across multiple time zones.
Strong model development disciplines: model approach, transparency, interpretation, implementation, and documentation.
Designed and maintained summary reports and source data.
Provided training for staff members and providers when new procedures were implemented.
Programmer/QA, Aug 2006 to Nov 2007, Exevo India Ltd. － Gurgaon, India
Responsibilities with the QA team included creating test plans that met project requirements and served their purpose effectively, and testing the project to make sure it met expected standards and specified requirements. Developed surveys covering a variety of domains and interacted with clients throughout the project life cycle.
Ability to work in a fast-paced, iterative development environment with short turnaround times.
Executed the full testing lifecycle: created test cases after referring to requirements and release-notes documents, executed test cases, and handled defect raising, tracking, reporting, and retesting.
Adept at understanding requirements and solution design documents; able to understand complex technical information related to application functionality.
Created daily status reports, arranged meetings with developers to fix issues promptly, and shared the reports with the respective PD within prescribed timelines.
Tested web based applications with logic flow, design flow and data storage.
Ensured that tested deliverables were delivered as per the delivery plan and within agreed timelines.
Became part of the development team.
Hadoop Developer, Jan 2015 to present, Verizon － West Street, NY
Verizon is an American broadband and telecommunication
Company and a corporate component of the Dow Jones Industrial Average.
Installed, configured, and maintained Apache Hadoop clusters for application development, along with Hadoop tools like Hive, Pig, and Sqoop.
Deployed a Hadoop cluster in standalone mode.
Developed Pig Latin scripts to analyze data from different aspects.
Developed Hive queries for analyzing data.
Collected log data and integrated it into HDFS.
Managed and reviewed Hadoop log files.
Created and maintained technical documentation for launching Hadoop clusters and for executing Hive queries and Pig scripts.
PROJECT: CEG-CoE Hadoop
Description: Executed proofs of concept and built Hadoop competency for related projects across various ISUs. Installed, configured, and administered a Hadoop cluster in pseudo-distributed mode.
Solution Environment: Red Hat 6.0, Apache Hadoop 1.2.1, shell scripting
PROJECT: Call Data Records Analysis
Description: Call Data Records (CDRs) are downloaded from production servers. Created a master table containing extensions, i.e., the first four digits of each number, which identify the vendor. Hive was used to perform ETL on the data, and conclusive reports were produced in Excel with graphs.
Solution Environment: Red Hat 6.2, Apache Hadoop 1.0.2, Apache Hive 0.10.0, Java 1.7, Sqoop 1.4.4, MySQL Server 5.6
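A HiveQL sketch of the vendor lookup described in this project, joining CDRs against the master table of four-digit extensions; all table and column names are hypothetical:

```sql
-- Attribute each call to a vendor by matching the first four digits of the
-- called number against the (assumed) extension_master table.
SELECT m.vendor_name,
       COUNT(*)            AS total_calls,
       SUM(c.duration_sec) AS total_duration
FROM cdr c
JOIN extension_master m
  ON substr(c.called_number, 1, 4) = m.extension
GROUP BY m.vendor_name;
```

The aggregated output of a query like this is what would be exported to Excel for the graphed reports.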
PROJECT : Email log analysis
Description: Data is analyzed in Hadoop; various cases of email logs were analyzed using Pig scripts.
Solution Environment: Red Hat 6.2, Apache Hadoop 1.0.2, Apache Pig 0.11.0, Java 1.7, Sqoop 1.4.4
Health Care Projects
Custom web-based project in an agile development environment, with high complexity in both logic and design.
Created program documentation and generated reports in support of the full production development lifecycle.
Implemented and programmed the logical components.
Communicated regularly with the client and team members to keep the project flow on track in each phase until deployment.
SPSS DATA PROCESSING TRAINING: Introduction to data management, data validation, data cleaning, and reporting.
CMMI TRAINING: Introduction to the 5 levels of CMMI; provides an understanding of business process concepts. The program focuses on customer satisfaction, maintaining high quality, and data/information security.
SIX SIGMA PROGRAM
Data & Information Security Awareness as per BS 7799 (British Standard 7799) for customer data & information.
Culture training, including U.S. culture and the American lifestyle.
M.S (Computer), Guru Nanak Dev University
Bachelor of Computer Application, BBK DAV College, Guru Nanak Dev University