Test Strategy

1. Document Revision History 

Date         Version No.   Author             Description

08.12.2022   1.1           Viktoriia Malysh   Writing the common information about the project test strategy.

2. Approvals 

The Test Strategy is approved by the following stakeholders: 

Stakeholder             Full Name           Status   Date of Sign Off

Client Representative   Easol

Project Manager         Anna Potiuk

FE + BE Lead            Elvin Agaragimow

3. Purpose

The purpose of the test strategy is to define the testing approach, the types of tests, the test environments, the tools to be used for testing, and the high-level details of how the test strategy will be aligned with other processes. The test strategy document is intended to be a living document and will be updated as more clarity is gained on the requirements, the test environment, the build management approach, etc.

4. Project Overview

The Easol: Bingo-Loco is a site that helps users buy tickets to the Bingo Loco game-rave in different countries. It is built with the help of the Easol Sandbox.

5. Tools for QA planning and testing purposes

Confluence will be used for storing all project-related information.

JIRA will be used as a bug tracking system, as well as for planning, tracking, and analyzing project activities and tasks.

DevTools will be used for testing the styles and the animations.

Browserstack will be used for cross-browser/device testing.

6. Requirements references for The Easol: Bingo-Loco project

7. Testing Types

The following testing types will be executed during the “The Easol: Bingo-Loco” project:

7.1.  Smoke testing

The smoke testing will be performed to ensure that the most important functions work and all expected functional areas are available for testing. The results of smoke testing are used to decide if a build is stable enough to proceed with further testing. In other words, smoke tests play the role of acceptance criteria for each new build.

7.2. Functional testing

The functional testing will be executed to evaluate the compliance of a system, component, or third-party integration with the specified functional requirements and the corresponding expected results. Functional testing is performed for each planned feature and is guided by the approved client requirements.

7.3. Regression testing

The regression testing will be performed to ensure that the reported bugs have been fixed, that no previously working functions have broken as a result of the fixes, and that newly added features have not caused any problems in previous versions of the software.

Regression testing is usually performed once all components have been tested against the high-priority test cases and no critical or blocking bugs found during component testing remain open.

The regression testing is usually done after the code freeze and is always done before the deployment to production.

7.4. Design (Responsive) testing

The design testing will be performed at all testing levels to ensure that the solution meets the design-related specifications.

Responsive testing on tablet and mobile devices is focused on the business logic of the project within the agreed feature scope.

Design testing will be based on the approved scope of the UI designs – https://xd.adobe.com/view/d139a8f5-a9b2-47b8-8865-15d45489887b-846a/

Note: Design testing will NOT be based on pixel-to-pixel verification.

Note: Responsive testing on other intermediate resolution values is out of scope (OOS).

7.5. Cross-browser compatibility testing

The cross-browser compatibility testing will be performed to check the ability of the solution to interact with the agreed list of browsers.

Cross-browser testing will be covered manually, on the Test Environment only, using the browsers defined for cross-browser testing.

8. Planned testing types on the test environments (browsers, devices)

Mobile:

  Android:
    • Samsung Galaxy S21 Ultra (Chrome)
    • Samsung Galaxy S8 (Chrome)

  iOS:
    • iPhone 12 (Safari & Chrome)
    • iPhone 13 (Safari & Chrome)
    • iPhone 6 (Safari & Chrome)

Tablet:

  Android:
    • Samsung Galaxy Tab S8

  iOS:
    • iPad 9th Gen

Desktop (macOS):

  • 2560px (Chrome, Firefox)
  • 1920 x 1080 (Chrome, Firefox)
  • 1536 x 864 (Chrome, Firefox)
  • 1440 x 900 (Chrome, Firefox)
  • 1024 x 768 (Chrome, Firefox)

Desktop (Windows):

  • 2560px (Chrome, Firefox)
  • 1920 x 1080 (Chrome, Firefox)
  • 1536 x 864 (Chrome, Firefox)
  • 1440 x 900 (Chrome, Firefox)
  • 1024 x 768 (Chrome, Firefox)
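
For illustration only, the matrix above can also be kept in a machine-readable form, for example to drive Browserstack sessions or to generate per-release checklists. The sketch below is a non-authoritative TypeScript example; the type and field names (platform, target, browsers) are assumptions, not an agreed project artifact.

  // Sketch only: the browser/device matrix from section 8 expressed as data.
  // The interface and field names are illustrative assumptions, not project conventions.
  interface TestTarget {
    platform: 'Desktop (macOS)' | 'Desktop (Windows)' | 'Android' | 'iOS';
    target: string;      // screen resolution or device name
    browsers: string[];  // browsers agreed for this target
  }

  const desktopResolutions = ['2560px', '1920 x 1080', '1536 x 864', '1440 x 900', '1024 x 768'];

  const testMatrix: TestTarget[] = [
    // Desktop resolutions are checked in Chrome and Firefox on both operating systems.
    ...desktopResolutions.flatMap((resolution): TestTarget[] => [
      { platform: 'Desktop (macOS)', target: resolution, browsers: ['Chrome', 'Firefox'] },
      { platform: 'Desktop (Windows)', target: resolution, browsers: ['Chrome', 'Firefox'] },
    ]),
    // Mobile and tablet devices.
    { platform: 'Android', target: 'Samsung Galaxy S21 Ultra', browsers: ['Chrome'] },
    { platform: 'Android', target: 'Samsung Galaxy S8', browsers: ['Chrome'] },
    { platform: 'Android', target: 'Samsung Galaxy Tab S8', browsers: [] },   // browser not specified in the matrix
    { platform: 'iOS', target: 'iPhone 12', browsers: ['Safari', 'Chrome'] },
    { platform: 'iOS', target: 'iPhone 13', browsers: ['Safari', 'Chrome'] },
    { platform: 'iOS', target: 'iPhone 6', browsers: ['Safari', 'Chrome'] },
    { platform: 'iOS', target: 'iPad 9th Gen', browsers: [] },                // browser not specified in the matrix
  ];

Whether such a file is actually kept alongside the project is a team decision; the table above remains the source of truth for the agreed environments.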


9. Approach for Process Flow

9.1. Work with Tasks

Tasks will be split into BE and FE tasks.

Pay attention:

All specific statuses and labels should be defined according to the project.

  1. All Tasks selected for the current/next Sprint can be picked up for Test design.

  2. All Tasks that have the status “Ready for QA” should be assigned to QA.

  3. All found issues that relate to the Task should be linked to it.
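
As an illustration of item 2 above, the “Ready for QA” queue can be pulled from JIRA with a JQL filter. The sketch below is a non-authoritative TypeScript example against the JIRA Cloud REST API; the project key (BINGO), site URL, and credentials are placeholder assumptions, not real project values.

  // Sketch only: list Tasks in the "Ready for QA" status of the open sprint.
  // Assumed placeholders: project key BINGO, site your-site.atlassian.net, email/API token.
  const jql =
    'project = BINGO AND status = "Ready for QA" AND sprint in openSprints() ORDER BY priority DESC';

  async function listReadyForQa(): Promise<void> {
    const url =
      'https://your-site.atlassian.net/rest/api/3/search' +
      `?jql=${encodeURIComponent(jql)}&fields=summary,assignee`;
    const response = await fetch(url, {
      headers: {
        // Basic auth with an Atlassian account email and API token.
        Authorization: 'Basic ' + Buffer.from('qa@example.com:API_TOKEN').toString('base64'),
        Accept: 'application/json',
      },
    });
    const data = await response.json();
    for (const issue of data.issues) {
      console.log(`${issue.key}: ${issue.fields.summary}`);
    }
  }

In practice the same filter can simply be saved as a board quick filter in JIRA; the script form is shown only to make the query explicit.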

9.2. Work with Bugs

Bug creation tips:

  • The bugs should be created according to the task titles – for example, “Bugs – Homepage” (“Bugs – page name – block name”). If more than one bug relates to the same task ticket, the bugs are written in one ticket. If the QA Engineer finds new bugs after the reported ticket has been fixed, the QA Engineer needs to write a new bug ticket and assign it to the Developer.

  • In case the found bug is related to a certain task – it should be linked to the task.

  • In case QA finds a Blocker/Critical bug while testing a ticket and the bug is not related to the task – it should be added to the Active sprint.

  • In case QA finds a Major/Minor/Trivial bug while testing a ticket and the bug is not related to the task – it should be reported and added to the backlog.

Bug verification tips:

  • In case the ticket passes – QA should add a detailed comment with a screenshot (video if needed) and move it to the “Approved”/“Done” status.

  • In case the ticket fails – QA should add a detailed comment with a screenshot (video if needed) and the ticket should be moved to the “Reopened” status.

10. Regression testing procedure

The regression testing will be performed before the UAT based on impact analysis to ensure that the reported bugs have been fixed, that no previously working functions have broken as a result of the changes, and that newly added features have not caused any problems in previous versions of the software.

The scope of regression testing is planned based on the priorities of the planned test cases and is supplemented by impact analysis, if any.

Entrance criteria:

  • Planned Tasks are done; all the found defects are registered in JIRA;

  • All blocker and critical defects for all features are fixed and acceptance criteria are met;

  • The features are deployed to the test environment – DEV.

  • The Production Candidate build is accepted by the QA team.

Exit criteria:

  • All blocker and critical defects found during Regression testing for all features are fixed, and all acceptance criteria are met.

  • The PO (Product Owner) confirms that the tested product is acceptable.

  • The PO provides the final Go/No-Go decision.

General tips:

  • All blocker/critical bugs found during regression should be fixed prior to the release.

  • The tested product should be reviewed by the PO.
