Monday, August 29, 2016

MACHTESTED by David Tzemach

After four years of hard work, my blog has become one of the most popular software testing blogs in the world (almost 250K page views and more than 20K followers).

To celebrate the occasion, I decided to make a few major changes (with the great help of my colleague Jaden Turner) that will help me take my blog to the next level.

I want to thank you, the readers, for visiting my blog and sharing your thoughts and comments. I promise to answer all open questions and article requests as soon as possible.

Friday, August 26, 2016

Test Planning (TP) General Checklist | David Tzemach

Overview

The main purpose of this document is to provide a solid checklist that any team, group, or project manager can use to ensure that all activities related to project test planning (TP) have been addressed before the project starts.

This checklist covers the main activities related to the different phases of the project (design, preparation, implementation, review and reporting).


Checklist

Preparation and Research

  • Define the requirements traceability matrix to ensure that each requirement is developed and tested (see the sketch after this list).
  • Create guidelines the team can follow whenever it needs to deviate from the original plan.
  • Determine the levels of authority for handling different aspects of the project.
  • Is the project relevant to a specific customer that has specific demands?
  • How do you intend to measure the effectiveness of the testing process?
  • Did you read all mandatory documents (SRS, PRD, MRD, etc.)?
  • Should the project follow other international standards?
  • Does the test plan involve external testing services?
  • Are there any special cases that you need to cover?
  • Did you review the application technical design?
  • Determine the reporting channels.
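
For the requirements traceability item above, here is a minimal sketch of how a traceability matrix can be represented and checked, assuming requirement and test case IDs are simple strings (real projects usually keep this in a test management tool rather than in code):

# Map each requirement to the test cases that cover it (illustrative IDs).
rtm = {
    "REQ-001": ["TC-101", "TC-102"],
    "REQ-002": ["TC-103"],
    "REQ-003": [],  # not covered yet
}

def uncovered_requirements(matrix):
    """Return the requirements that have no covering test case."""
    return [req for req, tests in matrix.items() if not tests]

print(uncovered_requirements(rtm))  # ['REQ-003']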


Automation

  • Do you have the knowledge to use the selected framework?
  • Who will monitor and investigate the automation results?
  • Do you need to schedule training for the team?
  • Do you intend to use an automation framework (see the sketch after this list)?
  • Who will be responsible for automating the tests (a dedicated automation team, or the tester who designed them)?
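
As a rough illustration of the framework question above, here is a minimal automated check, assuming pytest is the selected framework; add() is a hypothetical unit under test, not part of any real project:

import pytest

def add(a, b):
    return a + b

@pytest.mark.parametrize("a, b, expected", [(2, 3, 5), (-1, 1, 0), (0, 0, 0)])
def test_add(a, b, expected):
    # pytest discovers functions prefixed with "test_" and checks each assert.
    assert add(a, b) == expected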

Defect Management

  • Did you determine the defect management system that you are going to use?
  • Who will decide whether each defect will be fixed or not?
  • Is there any bug template that you want to enforce (see the sketch after this list)?
  • Are there any guidelines to open/close defects?
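
For the bug template item above, here is a minimal sketch of what an enforced defect record could look like, assuming defects are tracked as structured records; the field names are illustrative and not tied to any specific defect management system:

from dataclasses import dataclass, field

@dataclass
class Defect:
    title: str
    severity: str                  # e.g. "Critical", "Major", "Minor"
    steps_to_reproduce: list = field(default_factory=list)
    expected_result: str = ""
    actual_result: str = ""
    status: str = "Open"           # Open -> Fixed -> Verified -> Closed

bug = Defect(
    title="Login fails with valid credentials",
    severity="Critical",
    steps_to_reproduce=["Open the login page", "Enter a valid user/password", "Click Login"],
    expected_result="The user is logged in",
    actual_result="Error 500 is returned",
)
print(bug)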

Risks

  • Identify risks related to the testing process.
  • Do you have the tools to mitigate the risks?
  • How do you intend to mitigate the risks?

Testing criteria

  • Determine the Exit criteria for completing the testing process (see the sketch after this list).
  • Determine the Entry criteria for starting the testing process.
  • Determine the conditions that can halt the testing process (for example, a major issue found during the tests).
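
As a rough illustration of the Exit criteria item above, here is a minimal check, assuming the criteria are a pass-rate threshold and a cap on open critical defects (the numbers are illustrative only):

def exit_criteria_met(passed, total, open_critical_defects,
                      min_pass_rate=0.95, max_critical=0):
    """Return True when the testing phase may be closed."""
    pass_rate = passed / total if total else 0.0
    return pass_rate >= min_pass_rate and open_critical_defects <= max_critical

print(exit_criteria_met(passed=190, total=200, open_critical_defects=0))  # True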

Project Timelines

  • How much time have you allocated for each testing cycle?
  • When do you receive the first build of the application from the dev team?
  • Is there any time allocated to re-test the application after bug fixes?
  • When do you intend to release the first version of the software?
  • Is the test plan schedule consistent with the overall project plan?
  • Who can create a new build when the test team needs it?
  • When do you intend to start/end each testing cycle?

Test Environment

  • Make sure that you have the relevant resources to build the test environment.
  • Do you have enough budget to order the hardware equipment?
  • Are there any limitations to creating realistic test scenarios?
  • Determine the topology of the environment.
  • Determine the hardware set up timelines.
  • Are any special testing tools required?
  • Prepare the test environment.

Test Design and Strategy

  • Determine the test strategy and methodology that you are going to use to accomplish the project goals.
  • Define the deliverables of the testing effort (test coverage, test design, reports, etc.).
  • What areas of the application are not going to be covered, and what are the reasons for that?
  • Define the levels of testing (unit, component, integration, system and acceptance).
  • What types of tests do you intend to use (performance, usability, exploratory, etc.)?
  • Determine the test execution approach (Manual Vs. Automated).
  • Define the conditions that will stop/continue the testing process.
  • Validate that all functions are identified and covered in the TP.
  • Write the test plan that you are going to use on the project.
  • What is the testing coverage that you intend to achieve?
  • What areas of the application are going to be covered?
  • Create the test data that you need to use per test case.
  • Determine the testing effort that needs to be invested.
  • Who is responsible for approving the test plan?
  • Determine the weekly/daily report template.
  • Determine the pass/fail criteria for all tests.
  • Create the test scenarios and test cases.
  • Determine the priority of the test cases (see the sketch after this list).
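
For the test case priority item above, here is a minimal sketch of ordering execution by priority, assuming each test case record carries a numeric priority (the IDs and titles are illustrative):

test_cases = [
    {"id": "TC-101", "title": "Login with valid credentials", "priority": 1},
    {"id": "TC-205", "title": "Export report to PDF", "priority": 3},
    {"id": "TC-150", "title": "Password reset flow", "priority": 2},
]

# Execute the most important cases first (1 = highest priority).
for case in sorted(test_cases, key=lambda c: c["priority"]):
    print(case["id"], case["title"])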


Test execution

  • Validate that the test "Entry" criteria is fulfilled prior to the start of the tests.
  • Validate that each tester understands the tests that he needs to execute.
  • Monitor the test progress and validate that they meet the timelines.
  • Validate that you adjust the test plan based on the testing results (Example: many bugs in a single location can indicate that we need to make more and deeper tests.)
  • Validate that the test team understand their responsibilities.
  • Make sure that you review the defects that testers identify.
  • Review the daily/weekly testing reports.
  • Validate that the test plan is enforced.
  • Review the test plan with the team members (high level) and make sure that you answer any open questions.

At the End of the testing phase

  • Validate that the test "Exit" criteria is fulfilled prior to moving the application to 'Alpha' testing phase.
  • Make sure that the user guide contains the "known Issues" that you find during the testing cycles.
  • Have the requirements specifications been updated, examined and approved?
  • Validate that the team accomplished the testing goals.
  • Summarize the test results in a dedicated summary test report (STR).
  • Review the test results with the team members.

Saturday, August 6, 2016

Performance Testing Checklist | David Tzemach



Performance tests are performed to determine the response time of a system under a specific workload. This type of testing helps us improve the user experience when working with the software.
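
To make the idea concrete, here is a minimal sketch of measuring response times under a small concurrent load, using only the Python standard library; the URL and the number of simulated users are illustrative assumptions, not a recommendation for a specific tool:

import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "http://example.com/"   # hypothetical system under test
USERS = 10                    # simulated concurrent users

def timed_request(_):
    start = time.perf_counter()
    with urllib.request.urlopen(URL, timeout=10) as response:
        response.read()
    return time.perf_counter() - start

with ThreadPoolExecutor(max_workers=USERS) as pool:
    durations = list(pool.map(timed_request, range(USERS)))

print(f"avg: {sum(durations) / len(durations):.3f}s, max: {max(durations):.3f}s")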

For more information about performance testing, you can read my previous articles:



My checklist for performance testing 


The preliminary questions

  • Do you need to compare the performance results against your competitors?
  • What is the time frame that you have to execute the process?
  • How do you intend to execute the tests in the real world?
  • What is the purpose of doing performance testing?
  • What are the performance test objectives?
  • What types of failures do you anticipate?
  • How flexible are the timelines?


Test Analysis prior to starting the project

  • What do you need to measure during the testing process (response time, volume, memory, etc.)?
  • What types of performance tests do you need to perform (volume, spike, load, etc.)?
  • Are all functional tests of the application done prior to the performance tests?
  • What do you need to cover during the testing process?
  • What are the Entry criteria to start the tests?
  • What are the interfaces of the application?
  • What are the pass/fail criteria used to evaluate the test results (response time, memory consumption, I/O, etc.)?

Team

  • What are the areas of responsibility (developers, testers, managers, etc.)?
  • Do you have the relevant people to perform the process?
  • Do you need to provide training for your team?


The available resources

  • Do you need to integrate with external support teams that will consume the budget?
  • How do you intend to create the testing data (automated tool or manually)?
  • Do you have permanent or temporary resources?
  • What budget is available to perform the process?


Reporting

  • Do you need to use a 3rd party tool to generate the reports?
  • Who will be responsible for reporting on the testing process?
  • What data should be included in the report (see the sketch after this list)?
  • How will the report be produced?
  • How often should the report be delivered (monthly, weekly, daily)?
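
For the report content question above, here is a minimal sketch of a daily report, assuming the results are kept as (scenario, response time in seconds, passed) tuples; the scenarios and values are illustrative only:

from datetime import date

results = [
    ("login", 0.42, True),
    ("search", 1.75, True),
    ("checkout", 3.90, False),
]

def daily_report(results):
    passed = sum(1 for _, _, ok in results if ok)
    lines = [f"Performance report - {date.today()}",
             f"Passed {passed}/{len(results)} scenarios", ""]
    for name, seconds, ok in results:
        lines.append(f"{name:<10} {seconds:5.2f}s  {'PASS' if ok else 'FAIL'}")
    return "\n".join(lines)

print(daily_report(results))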


Performance testing considerations


Test Environment

  • Do you have the resources to create a performance test environment that is as similar as possible to real production?
  • What testing environment do we need to use to execute the tests?
  • How much time do you need to invest in creating the test environment?
  • Are secondary servers available in case of a server failure?
  • Who will be responsible for creating and maintaining the environment?
  • Do you intend to run on a virtual architecture (VMware/Hyper-V)?
  • Do you have the knowledge to build a virtual environment?
  • Do you need to buy new hardware to support the tests?

Test Data  

  • What amount of data do we need to create to support the testing scenarios (see the sketch after this list)?
  • How do you intend to create the testing data (automated tool or manually)?
  • Do you have enough storage to handle the testing data?
  • Can you replicate the data to reduce the creation time?
  • Can you use the test data in future projects?
  • What is the effort to create the testing data?
  • Does the testing data need to be unique?
  • Where do you intend to store the data?
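
For the test data items above, here is a minimal sketch of generating synthetic data with the standard library; the field names, record count and output file are illustrative assumptions:

import csv
import random
import string

def random_user(i):
    name = "".join(random.choices(string.ascii_lowercase, k=8))
    return {"id": i, "username": name, "email": f"{name}@example.com",
            "balance": round(random.uniform(0, 10_000), 2)}

with open("test_users.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["id", "username", "email", "balance"])
    writer.writeheader()
    for i in range(1, 1001):  # 1,000 records for a small load scenario
        writer.writerow(random_user(i))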


Testing Baseline

  • Who is responsible for providing the current performance baseline?
  • What baseline do you use to compare the test results against (see the sketch after this list)?
  • What is the acceptable response time for each function?
  • Who will validate that the baseline is good enough?
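
For the baseline items above, here is a minimal sketch of comparing measured response times against a baseline, assuming a 10% regression tolerance; the numbers and scenario names are illustrative only:

baseline = {"login": 0.50, "search": 1.20, "checkout": 2.00}   # seconds
measured = {"login": 0.48, "search": 1.45, "checkout": 2.05}

TOLERANCE = 0.10  # allow up to a 10% slowdown before flagging a regression

for name, base in baseline.items():
    current = measured[name]
    regressed = current > base * (1 + TOLERANCE)
    print(f"{name:<10} baseline={base:.2f}s measured={current:.2f}s "
          f"{'REGRESSION' if regressed else 'OK'}")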


Consider the costs

  • What are the hardware costs needed to accomplish the test project goals?
  • Do you have the budget to perform all the testing cycles that you need?
  • How can you reduce the costs without affecting the testing process?
  • Do you need to use 3rd party tools that will increase the costs?
  • Can you use open-source tools to reduce the costs?
  • What are the costs if you need external support?
  • What is the cost of each day of testing?


Testing Considerations

  • How do you intend to monitor the process when running on large volumes?
  • Can you reset the testing environment between testing cycles?
  • How many users should be involved in the testing process?
  • How many test cycles do you intend to run?


Resources to monitor

  • Define the physical resources to be monitored throughout the test execution, for example:
    - Memory (RAM)
    - Network transactions
    - Processor (CPU)
    - Disk I/O
  • Define the software resources to be monitored throughout the test execution, for example (see the sketch after this list):
    - Web server
    - Database server
    - .NET
    - Classification engine
    - System services
    - Different software caches
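
For the resource monitoring items above, here is a minimal sketch of sampling physical resources during a test run, assuming the third-party psutil package is installed (pip install psutil); the sample count and interval are illustrative:

import psutil

for _ in range(5):                                # five samples, roughly one per second
    cpu = psutil.cpu_percent(interval=1)          # CPU utilization in percent
    mem = psutil.virtual_memory().percent         # RAM utilization in percent
    disk = psutil.disk_io_counters()              # cumulative disk I/O counters
    net = psutil.net_io_counters()                # cumulative network counters
    print(f"cpu={cpu:.1f}% mem={mem:.1f}% "
          f"disk_read={disk.read_bytes} net_sent={net.bytes_sent}")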
