• Understand the high-level requirements and the customer technology landscape (ETL/BI tools) through review of documents (e.g., the Component Design Document or Requirements Document).
• Document the understanding as part of the reverse KT.
• Seek sign-off from the client.
• Partner with the Business Analyst to provide suggestions on the requirements, driving clarity based on experience from earlier projects.
• Update/Review KT documents created by Test Analyst.
• Seek review of updated documents with Business Analyst.
• Create an induction manual and share it with the onsite team on a need basis.
Test Requirements:
• Study, organize, and drive the requirement walkthrough sessions.
• Analyze the requirements and brainstorm with the team to identify the gaps.
• Validate the requirements for testability.
• Identify the preliminary test data requirements.
• Seek clarification on requirements, if any, and update the clarification tracker with the same.
• Prepare/Review the data flow diagrams.
• Seek clarification on the requirements from the customer and participate in requirement discussions with the customer, if required.
• Perform proofs of concept or evaluate different tools (ETL/BI tools), if required.
• Prepare high level flow documents, if required.
• Update the document of understanding and the Requirement Traceability Matrix based on further inputs received from the customer during the study phase.
• Share updated documents with the stakeholders.
• Follow up on query logs and ensure inputs from query logs are incorporated into the KT documents.
• EIM QA: Seek inputs from the stakeholders on the data sources and the quality of data.
• Gather inputs on the data source analysis and data profiling results from the client.
• Understand the impact of data quality on the already captured requirements.
• Participate in the business data mapping meetings to understand the requirements.
• Participate in the data quality meeting to understand the requirements related to data cleansing.
• BI QA: Coordinate with the EIM QA team in case of any discrepancy in the data quality.
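The data-source and data-quality inputs above are typically grounded in simple column profiling. A minimal sketch, using an in-memory SQLite table as a stand-in for a client staging table (all table and column names here are illustrative, not from the project):

```python
import sqlite3

def profile_column(conn, table, column):
    """Basic quality metrics for one column: total rows, NULL count,
    and distinct non-NULL values (COUNT(DISTINCT ...) ignores NULLs)."""
    cur = conn.cursor()
    total = cur.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    nulls = cur.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL").fetchone()[0]
    distinct = cur.execute(
        f"SELECT COUNT(DISTINCT {column}) FROM {table}").fetchone()[0]
    return {"rows": total, "nulls": nulls, "distinct": distinct}

# Tiny in-memory sample standing in for a real client data source.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER, email TEXT)")
conn.executemany("INSERT INTO customer VALUES (?, ?)",
                 [(1, "a@x.com"), (2, None), (3, "a@x.com")])
metrics = profile_column(conn, "customer", "email")
```

Metrics like these (null rates, distinct counts) feed directly into the impact analysis on already captured requirements.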
Test Planning:
• Provide inputs to the Test Lead to create the test strategy (including release dates, types of testing, project scope, a detailed explanation of the particular release, risks, etc.).
• Gather inputs from the client (e.g., types of testing, development teams, client managers, etc.).
• Prepare the test plan covering timelines and resource (man/machine) deployment (how it will be tested).
• Seek sign-off on the test plan.
• Provide Test Environment Requests.
• Identify the need for test automation along with the Test Lead.
• Provide inputs on the entry/exit criteria and the functionality to be tested.
• Incorporate learnings from earlier test cases into the current test plan.
• Provide inputs from EIM data quality.
• Test the data quality and share the inputs with BI QA.
• Based on the inputs gathered from EIM QA, update the test strategy if required.
Test Design:
• Identify the test scenarios based on the understanding of systems, interfaces and application.
• Identify end to end business critical scenarios.
• Create/Review the test scenarios (created by the Test Analyst) and RTM.
• Validate with the BA to ensure comprehensiveness.
• Participate in customer review meetings if required.
• Based on changes in the requirements, identify/create regression scenarios and identify the impacted areas, if required.
• Review the test cases created by the Test Analyst.
• If required, provide a test case walkthrough and seek customer sign-off, including prioritization of test cases and optimization options.
• Create/ review test scripts.
• Share test scripts with the Test Lead.
• Identify and validate test scenarios for automation.
• Create automation test scripts if required.
• Share created test scripts with Test Lead for review.
• Conduct pilot automation test run to validate the test scripts.
• Upload the test cases into the test management tool (e.g., QC).
• EIM QA: Identify the reconciliation framework points for the ETL testing.
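The scenario-and-RTM review above reduces to a traceability check: every requirement should map to at least one test case, and unmapped requirements are gaps. A minimal sketch with hypothetical IDs (a real project would pull these from the RTM or a QC export):

```python
def rtm_coverage(requirement_ids, test_cases):
    """Map each requirement to the test cases covering it and list the
    requirements left uncovered -- the gap check behind an RTM review."""
    coverage = {req: [] for req in requirement_ids}
    for tc_id in sorted(test_cases):
        for req in test_cases[tc_id]:
            if req in coverage:
                coverage[req].append(tc_id)
    uncovered = sorted(r for r, tcs in coverage.items() if not tcs)
    return coverage, uncovered

# Illustrative IDs only.
reqs = ["REQ-1", "REQ-2", "REQ-3"]
cases = {"TC-01": ["REQ-1"], "TC-02": ["REQ-1", "REQ-3"]}
coverage, uncovered = rtm_coverage(reqs, cases)
```

The `uncovered` list is exactly what gets raised back to the BA during the comprehensiveness validation.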
Test Development:
• Identify the test data sources based on the analysis of the requirements.
• Identify the test data requirements to ensure test coverage and share the same with the Test Analyst.
• Seek approval from the stakeholders on the same.
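One common way to turn test data requirements into concrete values is boundary-value expansion per column. A minimal sketch, assuming a simple (name, min, max, nullable) spec format invented here for illustration:

```python
def expand_test_data(spec):
    """Expand a column spec into concrete test-data values: the min and
    max boundary for every column, plus a NULL case when the column is
    nullable. The spec tuple format is an assumption for illustration."""
    values = []
    for name, lo, hi, nullable in spec:
        values.append({"column": name, "value": lo})
        values.append({"column": name, "value": hi})
        if nullable:
            values.append({"column": name, "value": None})
    return values

spec = [
    ("order_qty", 1, 999, False),    # mandatory numeric measure
    ("discount_pct", 0, 100, True),  # optional, so a NULL case is added
]
rows = expand_test_data(spec)
```

The expanded list doubles as the coverage evidence shared with the Test Analyst and stakeholders for approval.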
Test Execution/Functional Testing:
• Retest defects and update their status, if required.
• Perform high-level/sanity testing to ensure the build is intact.
• Identify defects and log failures.
• Track defects to closure (via the defect log generated from the QC tool).
• Participate in defect triages on a periodic basis to gather evidence for defect identification, prioritization, and fixes.
• Conduct acceptance testing, if required.
• Track defect metrics (as generated by the QC tool) to ensure testing effectiveness.
• Track the number of test cases executed as part of the daily tracker to ensure productivity.
• Provide automation/white box and other NFR status to Test Lead.
• Conduct performance testing for ETL batches/reporting.
• Validate the internal layers using the reconciliation model/framework.
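Layer validation through a reconciliation framework usually compares record counts and measure totals between adjacent ETL layers. A minimal sketch, with in-memory SQLite tables and staging/warehouse names chosen purely for illustration:

```python
import sqlite3

def reconcile(conn, source, target, measure):
    """Compare row counts and the sum of one numeric measure between two
    ETL layers; a mismatch flags records lost or duplicated in the load."""
    cur = conn.cursor()
    src_rows = cur.execute(f"SELECT COUNT(*) FROM {source}").fetchone()[0]
    tgt_rows = cur.execute(f"SELECT COUNT(*) FROM {target}").fetchone()[0]
    src_sum = cur.execute(
        f"SELECT COALESCE(SUM({measure}), 0) FROM {source}").fetchone()[0]
    tgt_sum = cur.execute(
        f"SELECT COALESCE(SUM({measure}), 0) FROM {target}").fetchone()[0]
    return {"row_counts_match": src_rows == tgt_rows,
            "measure_sums_match": src_sum == tgt_sum}

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_sales (amount REAL)")
conn.execute("CREATE TABLE dw_sales (amount REAL)")
conn.executemany("INSERT INTO stg_sales VALUES (?)", [(10.0,), (20.0,)])
# Simulate a load that dropped one record.
conn.execute("INSERT INTO dw_sales VALUES (10.0)")
result = reconcile(conn, "stg_sales", "dw_sales", "amount")
```

In practice each reconciliation point identified during test design gets a checkpoint of this shape, and any failing checkpoint becomes a logged defect.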
Test Closure:
• Prepare the Test Summary Report and share it for review.
• Identify learnings from the project.
• If requested, participate in release management to share impact of production dates.
• Periodically generate metrics tailored to the audience.
Production Support QA / Production Validation:
• Identify known issues and track them in the known-issue tracker.
• Share the same with the stakeholders.
• Conduct training for the new users if required.
• Participate in knowledge transition from incumbent or development stakeholders; gather and maintain the test inventory (test scenarios, cases, scripts, known-issue tracker, regression suite details, etc.).
• Conduct regression testing to approve minor enhancements, releases, etc.
• Create a deviation document capturing challenges, etc., on a need basis.
• Identify data checks for production data.
• Perform production data checks post production release or as ongoing production data validation.
• Collaborate with the support analyst within the AVM team to support code development, test data preparation, test environment availability, etc.
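Production data checks like those above are typically simple invariants run after each release. A minimal sketch with illustrative table and key names, again using in-memory SQLite as a stand-in for the production database:

```python
import sqlite3

def production_data_checks(conn, table, key):
    """Two post-release sanity checks: the table loaded at least one row,
    and the business key carries no duplicates."""
    cur = conn.cursor()
    rows = cur.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    dupes = cur.execute(
        f"SELECT COUNT(*) FROM (SELECT {key} FROM {table} "
        f"GROUP BY {key} HAVING COUNT(*) > 1)").fetchone()[0]
    return {"non_empty": rows > 0, "unique_keys": dupes == 0}

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fact_orders (order_id INTEGER)")
# A duplicated key, as might appear after a double-run of a load job.
conn.executemany("INSERT INTO fact_orders VALUES (?)", [(1,), (2,), (2,)])
checks = production_data_checks(conn, "fact_orders", "order_id")
```

A failed check here would feed the known-issue tracker and be shared with stakeholders.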
Delivery Management:
• Provide inputs on the test execution metrics to the Test Lead.
Knowledge Management:
• Create and upload reusable assets (like the application flow document, etc.) to the KM portal.
• Review and update KM documents (e.g., the functional document for each release) as per tool/application upgrades and customer/project needs.
• Capture and document the business/application-level requirement details in the WIKI so they can be used for induction of new members into the project teams.
• Conduct KT for new team members.
Stakeholder/Client Management:
• Identify and track the risks along with the mitigation for closure.
• Prepare status reports (tasks planned for the current and next week, tasks accomplished, action items, analysis of resource workload, etc.).
• Share information with the Test Lead on the project health.
Proficiency Level *
Webservices - SOAP
Infa MDM Testing
Java Message Service
* Proficiency Legends
Level 1: The associate has basic awareness and comprehension of the skill and is in the process of acquiring it through various channels.
Level 2: The associate possesses working knowledge of the skill, and can actively and independently apply it in engagements and projects.
Level 3: The associate has comprehensive, in-depth, and specialized knowledge of the skill. She/he has extensively demonstrated successful application of the skill in engagements or projects.
Level 4: The associate can function as a subject matter expert for this skill, and is capable of analyzing, evaluating, and synthesizing solutions using it.