Category: QA Department

  • UAT Procedure (clients report feedback via JIRA) – Draft

    Stakeholder | Name | Status | Date of Sign Off
    Client Representative | | |
    BE Lead | | |
    FE Lead | | |
    Project Manager | | |
    QA Lead (Author) | | |

    1. Purpose

    The purpose of this document is to provide all information needed before the User Acceptance Testing (UAT) phase, including build readiness, testware, and instructions on how to work during the UAT phase.

    2. Environment list

    Environment | Link to WP admin | Link to the site | Comment
    Development | | | Regression testing environment; UAT environment
    Staging | | | Content and functionality configuration
    Production | | |

    3. Browsers/Devices

    Browser/device | Component testing (full scope) | Smoke/Sanity testing | Regression testing

    Column notes: Component testing (full scope) – detailed testing per component using the whole testing scope, based on the created test cases. Smoke/Sanity testing – use only high-priority tests plus exploratory testing (an experience-based technique). Regression testing – use regression test scenarios to confirm the correct behavior of previously delivered critical functionality when a new build is deployed.

    Win10 + Chrome (the latest) * primary device for Desktop scope | (blue star) | (blue star) | (blue star)
    Win10 + Edge (the latest) | (blue star) | (blue star) | (blue star)
    Win10 + Firefox (the latest) | (blue star) | (blue star) | (blue star)
    MacOS + Safari (the latest) * primary device for Desktop scope | (blue star) | (blue star) | (blue star)
    iPhone 14 Pro Max (the latest iOS + Safari) * primary device for Mobile scope | (blue star) | (blue star) | (blue star)
    iPhone 13 Pro (the latest iOS + Safari) | (blue star) | (blue star) | (blue star)
    Samsung Galaxy S22 Ultra (the latest Android + Chrome) * primary device for Mobile scope | (blue star) | (blue star) | (blue star)
    Samsung Galaxy S21 Plus (the latest Android + Chrome) | (blue star) | (blue star) | (blue star)

    4. Requirements references

    Type | Link | Comment
    Backlog | |
    Requirements | |
    Design | |

    5. Testing documentation

    Document | Link
    Test cases |
    Test data |

    6. Known Defects

    This section describes all defects found during the QA phase.
    (blue star) IMPORTANT: Please use this information to avoid creating duplicates during the UAT phase.

    <link to Existing Defects in Jira> – here you can find all open/in progress/resolved defects from the QA phase of the project.

    <UAT Defects Board> – here you can find all issues found by the client team during UAT.

    7. Types of tickets to be used

    • Bug – an issue that shows a difference between the site and the designs; sometimes it can also be irregular behavior.

    • Task as a Change Request (an appropriate "CR" label should be added to the ticket) – an issue that stems from a Bug or a recommendation; it requires discussion with the client and approval (see <link to Change Request Log>). This issue type is used to update already implemented functionality or to develop new functionality.

    8. Process Flow

    This section provides recommendations for organizing the UAT phase that help avoid misunderstandings and improve the efficiency of communication and problem solving.

    8.1 UAT Timeframe

    The UAT will last from dd/mm/yyyy till dd/mm/yyyy. During this time, all reported bugs and issues will be investigated by the Development team and qualified as Bugs or Change Requests. Bugs will be fixed in order of priority. If the team does not have time to fix all bugs by the end of the UAT phase, the lower-priority bugs will be transferred to the post-launch warranty phase.

    8.2 UAT Entry Criteria

    1. UAT environment is up and running.

    2. Resources are sufficient and available to support UAT.

    3. Roles and responsibilities are identified.

    4. The UAT procedure is defined and the communication flow is agreed.

    5. UAT Procedure page is available (link to be provided via e-mail).

    6. Reporting-related specifics are agreed.

    7. UAT exit criteria are revised/defined.

    8. The component testing for the planned devices and browsers has been completed.

    9. The regression testing for the planned devices and browsers has been completed.

    10. All bugs with priority blocker and critical have been fixed and closed.

    8.3 UAT Exit Criteria

    1. All planned UAT test cases have been executed and passed.

    2. All bugs with priority blocker and critical have been resolved and re-tested.

    3. Any remaining bugs have been prioritized and placed in the development schedule.

    4. Any CRs/overlooked requirements have been negotiated and a plan is in place for addressing them.

    5. Successful executive go/no-go decision for launch.

    6. Planned reporting and analysis have been done.

    UAT should start only when:

    • the Project Development phase, with all planned Development activities, is finished for all Site Components;

    • the UAT Entry Criteria are met;

    otherwise the timeline needs to be revised.

    (info) Note: To leave enough time for bug fixing during the UAT phase, it is strongly recommended to report defects for site Components/Features right after they are discovered, rather than logging a large batch of JIRA defects during the last UAT days. If the development team receives all defects only during the last UAT days, there will be no time to fix and retest them before the Release Date.

    8.4 Communications during UAT

    UAT sync-up meetings will be scheduled according to the UAT Timeframe.
    All defects/issues should be reported in JIRA.

    To create an issue, go to <link to the project in JIRA>, click Create Issue, and fill in the fields as follows:

    Project: <Project Name>

    Issue type: Bug

    Summary: A brief one-line summary of the issue. The Summary field should follow this template:
    [Area name] <Short, concrete description of the problem>

    Component: <Component's name>, selected from the dropdown. The list of components is specific to each project.

    Description: The Description field should be filled in using the following template:

    Preconditions:
    REPLACE the TEXT with the needed configuration, if applicable.

    Steps to reproduce:
    1. Step_1.
    2. Step_2.
    3. Step_3.

    Actual result:
    Clear description of what actually happened.
    See the attached screenshot for more details.
    <Screenshot>

    Expected result:
    Clear description of what should have happened.
    See the attached design for more details.
    <Screenshot> / <link to the design>

    Additional information:
    REPLACE the TEXT with additional information, if applicable.

    Priority: The degree of importance for the business to resolve the defect. It is driven by business value and indicates how soon the defect should be fixed.

    Labels: UAT (required)

    Environment: one of DEV, STG, PROD

    Attachment: If you can supplement your bug report with a picture that shows the problem, or a recording that helps others reproduce, fix, and verify the problem quickly, attach these files to the bug report. Attached files can be pictures, video recordings, or other file types, if needed.

    Linked Issues: <link to the related issue>

    Assignee: The person to whom the bug is assigned (QA Lead)

    Epic Link: A link to the Epic, in accordance with the overall structure of epics on the project. Choose from the dropdown.

    Sprint: The active sprint. Choose from the dropdown.

    QA Scope: <testing scope>

    (info) NOTE THAT

    • The Preconditions section is optional; fill it in when additional configuration is required to reproduce the bug (e.g. a product with a specific variation or allocation is configured, or a specific promotion is configured);

    • The Additional information section is optional; fill it in only when extra information is required to reproduce the bug, or to list any additional places where the issue can be reproduced.

    (blue star) IMPORTANT: Defects that do not follow the template can be re-assigned back to their reporter in case of an unclear description. We encourage the UAT team to support every Bug report with a screenshot/attachment for better understanding.
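
    For teams that prefer scripting bulk reports, the same field template can be submitted through the JIRA REST API. Below is a minimal sketch: the JIRA URL, credentials, project key, and example summary are placeholders, and availability of custom fields such as Epic Link or QA Scope depends on the project's configuration, so treat this as an illustration rather than project tooling.

    ```python
    import requests

    JIRA_URL = "https://your-company.atlassian.net"  # placeholder base URL
    AUTH = ("reporter@example.com", "api-token")     # placeholder credentials

    # Description assembled from the template above.
    description = (
        "Preconditions:\n<needed configuration, if applicable>\n\n"
        "Steps to reproduce:\n1. Step_1.\n2. Step_2.\n3. Step_3.\n\n"
        "Actual result:\n<what actually happened>\n\n"
        "Expected result:\n<what should have happened>\n"
    )

    payload = {
        "fields": {
            "project": {"key": "PROJ"},   # placeholder project key
            "issuetype": {"name": "Bug"},
            "summary": "[Checkout] Payment button stays disabled after applying a coupon",
            "description": description,
            "priority": {"name": "High"},
            "labels": ["UAT"],            # the required UAT label
        }
    }

    response = requests.post(f"{JIRA_URL}/rest/api/2/issue", json=payload, auth=AUTH)
    response.raise_for_status()
    print("Created issue:", response.json()["key"])
    ```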

    9. Ticket Flow

    The Development team will work on UAT issues according to issue priority (from highest to lowest), so please pay attention to the Defects priority section below when setting an issue's priority.

    After investigation and testing, a ticket can be (a sketch of these status transitions follows the list below):

    1. Assigned to a Developer for fixing (the ticket is clear enough for the dev team to work on it).

    2. Assigned back to the reporter:

    a) The ticket is resolved with a specific status explained in the comments (Done, Cannot reproduce, Won't Do, Duplicate, etc.). In this case the reporter can:

    • Close the ticket (move it to the Done status) if the comment makes it clear;

    • Reopen the ticket (move it to the Open status) if it is still valid.

    3. After fixing the defect, the Developer assigns it to QA for verification. There are two possible outcomes:

    a) QA approves the fix and assigns the defect to the reporter for final verification (see the actions described in #2);

    b) QA rejects the fix, changes the issue status to Reopened, and assigns it back to the Developer.

    4. All fixes will be verified on the UAT environment by QA and assigned to the Reporter on the Client side with the Ready for Testing status.
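
    A minimal sketch of the status transitions implied by this flow, useful for sanity-checking a board configuration. The status names mirror the list above; the helper itself is hypothetical, not part of the project tooling.

    ```python
    # Hypothetical helper: allowed status transitions in the UAT ticket flow above.
    ALLOWED_TRANSITIONS = {
        "Open": {"In Progress", "Done", "Cannot Reproduce", "Won't Do", "Duplicate"},
        "In Progress": {"Ready for Testing"},              # developer finished the fix
        "Ready for Testing": {"QA Approved", "Reopened"},  # QA verification outcome
        "Reopened": {"In Progress"},                       # back to the developer
        "QA Approved": {"Done", "Reopened"},               # reporter's final verification
    }

    def validate_transition(current: str, target: str) -> bool:
        """Return True if moving a ticket from `current` to `target` follows the flow."""
        return target in ALLOWED_TRANSITIONS.get(current, set())

    # Example: a ticket cannot jump from Open straight to QA Approved.
    assert validate_transition("Ready for Testing", "QA Approved")
    assert not validate_transition("Open", "QA Approved")
    ```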

    10. Defects priority

    Priority shows the degree of importance for the business to resolve the issue. In other words, the priority is driven by business value and indicates how soon the Issue should be fixed.

    Priority
    Level

    Description

    Highest
    (Blocker)

    Indicates that the Defect needs to be fixed immediately: core functionality fails, test execution is completely blocked, and/or (some part of) the project is blocked.

    Examples:

    • Inability to add product to cart

    • Inability to proceed to Checkout

    • Inability to complete payment

    • Mini cart disappears, cart page can’t be opened

    • Inability to log in with valid credentials

    High
    (Critical)

    Indicates that this issue has a significant impact and the Defect needs to be fixed as early as possible, though not necessarily immediately: important functionality fails, but it does not block testing right now and a workaround exists.

    Examples:

    • Accordions can’t be expanded

    • Wrong validation behavior for input fields

    • Inability to add new address within MA

    • There is no validation for e-mail field on Login page

    • An error occurs when trying to add a product to the Cart from the Wishlist

    Medium
    (Major)

    Indicates that this Issue is causing a problem and requires urgent attention.

    Examples:

    • Incorrect display of the alternative images carousel

    • Navigation arrows don’t work on Mobile

    • Required field validation is missing

    Low
    (Minor)

    Indicates that this issue has a relatively minor impact that does not break the business logic, e.g. a user interface problem.

    Examples:

    • Incorrect font size/type is used for content

    Lowest
    (Trivial)

    Indicates that fixing this Defect can be deferred until all other priority Issues are fixed.

    Examples:

    • Minor look and feel issues

    • Minor glitches

    • Less obvious spelling mistakes

    • Any other minor issue causing an inconvenience

  • UAT Procedure (clients report feedback via doc) – Draft

    Stakeholder | Name | Status | Date of Sign Off
    Client Representative | | |
    BE Lead | | |
    FE Lead | | |
    Project Manager | | |
    QA Lead (Author) | | |

    1. Purpose

    The purpose of this document is to provide all information needed before the User Acceptance Testing (UAT) phase, including build readiness, testware, and instructions on how to work during the UAT phase.

    2. Environment list

    Environment | Link to WP admin | Link to the site | Comment
    Development | | | Regression testing environment; UAT environment
    Staging | | | Content and functionality configuration
    Production | | |

    3. Browsers/Devices

    Browser/device | Component testing (full scope) | Smoke/Sanity testing | Regression testing

    Column notes: Component testing (full scope) – detailed testing per component using the whole testing scope, based on the created test cases. Smoke/Sanity testing – use only high-priority tests plus exploratory testing (an experience-based technique). Regression testing – use regression test scenarios to confirm the correct behavior of previously delivered critical functionality when a new build is deployed.

    Win10 + Chrome (the latest) * primary device for Desktop scope | (blue star) | (blue star) | (blue star)
    Win10 + Edge (the latest) | (blue star) | (blue star) | (blue star)
    Win10 + Firefox (the latest) | (blue star) | (blue star) | (blue star)
    MacOS + Safari (the latest) * primary device for Desktop scope | (blue star) | (blue star) | (blue star)
    iPhone 13 Pro Max (the latest iOS + Safari) * primary device for Mobile scope | (blue star) | (blue star) | (blue star)
    iPhone 12 Pro Max (the latest iOS + Safari) | (blue star) | (blue star) | (blue star)
    Samsung Galaxy S22 Ultra (the latest Android + Chrome) * primary device for Mobile scope | (blue star) | (blue star) | (blue star)
    Samsung Galaxy S21 Plus (the latest Android + Chrome) | (blue star) | (blue star) | (blue star)

    4. Requirements references

    Type | Link | Comment
    Backlog | |
    Requirements | |
    Design | |

    5. Testing documentation

    Document | Link
    Test cases |
    Test data |

    6. Known Defects

    This section describes all defects found during the QA phase.
    (blue star) IMPORTANT: Please use this information to avoid creating duplicates during the UAT phase.

    <link to Existing Defects in Jira> – here you can find all open/in progress/resolved defects from the QA phase of the project.

    <UAT Defects Board> – here you can find all issues found by the client team during UAT.

    7. Types of tickets to be used

    • Bug – an issue that shows a difference between the site and the designs; sometimes it can also be irregular behavior.

    • Task as a Change Request (an appropriate "CR" label should be added to the ticket) – an issue that stems from a Bug or a recommendation; it requires discussion with the client and approval (see <link to Change Request Log>). This issue type is used to update already implemented functionality or to develop new functionality.

    8. Process Flow

    This section provides recommendations for organizing the UAT phase that help avoid misunderstandings and improve the efficiency of communication and problem solving.

    8.1 UAT Timeframe

    The UAT will last from dd/mm/yyyy till dd/mm/yyyy. During this time, all reported bugs and issues will be investigated by the Development team and qualified as Bugs or Change Requests. Bugs will be fixed in order of priority. If the team does not have time to fix all bugs by the end of the UAT phase, the lower-priority bugs will be transferred to the post-launch warranty phase.

    8.2 UAT Entry Criteria

    1. UAT environment is up and running.

    2. Resources are sufficient and available to support UAT.

    3. Roles and responsibilities are identified.

    4. The UAT procedure is defined and the communication flow is agreed.

    5. UAT Procedure page is available (link to be provided via e-mail).

    6. Reporting-related specifics are agreed.

    7. UAT exit criteria are revised/defined.

    8. The component testing for the planned devices and browsers has been completed.

    9. The regression testing for the planned devices and browsers has been completed.

    10. All bugs with priority blocker and critical have been fixed and closed.

    8.3 UAT Exit Criteria

    1. All planned UAT test cases have been executed and passed.

    2. All bugs with priority blocker and critical have been resolved and re-tested.

    3. Any remaining bugs have been prioritized and placed in the development schedule.

    4. Any CRs/overlooked requirements have been negotiated and a plan is in place for addressing them.

    5. Successful executive go/no-go decision for launch.

    6. Planned reporting and analysis have been done.

    UAT should start only when:

    • the Project Development phase, with all planned Development activities, is finished for all Site Components;

    • the UAT Entry Criteria are met;

    otherwise the timeline needs to be revised.

    (info) Note: To leave enough time for bug fixing during the UAT phase, it is strongly recommended to report defects for site Components/Features right after they are discovered, rather than logging a large batch of JIRA defects during the last UAT days. If the development team receives all defects only during the last UAT days, there will be no time to fix and retest them before the Release Date.

    8.4 Communications during UAT

    UAT sync-up meetings will be scheduled according to the UAT Timeframe.

    All defects/issues found by the client will be shared via a spreadsheet using the prepared template (<link_to_spreadsheet_template>).
    After investigation and agreement, those defects/issues will be reported in JIRA (by QA).

    To create an issue, go to <link to the project in JIRA>, click Create Issue, and fill in the fields as follows:

    Project: <Project Name>

    Issue type: Bug

    Summary: A brief one-line summary of the issue. The Summary field should follow this template:
    [Area name] <Short, concrete description of the problem>

    Component: <Component's name>, selected from the dropdown. The list of components is specific to each project.

    Description: The Description field should be filled in using the following template:

    Preconditions:
    REPLACE the TEXT with the needed configuration, if applicable.

    Steps to reproduce:
    1. Step_1.
    2. Step_2.
    3. Step_3.

    Actual result:
    Clear description of what actually happened.
    See the attached screenshot for more details.
    <Screenshot>

    Expected result:
    Clear description of what should have happened.
    See the attached design for more details.
    <Screenshot> / <link to the design>

    Additional information:
    REPLACE the TEXT with additional information, if applicable.

    Priority: The degree of importance for the business to resolve the defect. It is driven by business value and indicates how soon the defect should be fixed.

    Labels: UAT (required)

    Environment: one of DEV, STG, PROD

    Attachment: If you can supplement your bug report with a picture that shows the problem, or a recording that helps others reproduce, fix, and verify the problem quickly, attach these files to the bug report. Attached files can be pictures, video recordings, or other file types, if needed.

    Linked Issues: <link to the related issue>

    Assignee: The person to whom the bug is assigned (QA Lead)

    Epic Link: A link to the Epic, in accordance with the overall structure of epics on the project. Choose from the dropdown.

    Sprint: The active sprint. Choose from the dropdown.

    QA Scope: <testing scope>

    (info) NOTE THAT

    • The Preconditions section is optional; fill it in when additional configuration is required to reproduce the bug (e.g. a product with a specific variation or allocation is configured, or a specific promotion is configured);

    • The Additional information section is optional; fill it in only when extra information is required to reproduce the bug, or to list any additional places where the issue can be reproduced.

    9. Ticket Flow

    The Development team will work on UAT issues according to issue priority (from highest to lowest), so please pay attention to the Defects priority section below when setting an issue's priority.

    Initially, clients report and send feedback via a spreadsheet using the prepared template (<link_to_spreadsheet_template>).

    After investigation, items from the spreadsheet can be (a sketch of the status mapping follows this list):

    1. Raised as bugs in JIRA (if the issue is clear enough):

    a) QA creates a ticket in JIRA, assigns it to the responsible BE or FE developer, and marks the corresponding item in the client's spreadsheet as In Progress;

    b) after fixing the defect, the Developer assigns it to QA for verification in the "Ready for Testing" status:

    – QA approves that the fix is acceptable, moves the ticket to the "QA Approved" status, and marks the corresponding item in the client's spreadsheet as Fixed;

    – the client verifies the issue in the spreadsheet:

    * if it is fixed – the client marks the item in the spreadsheet as Done, and QA moves the corresponding ticket to the "Done" status in JIRA;

    * if it is not fixed – the client marks the item in the spreadsheet as Reopen, and QA moves the corresponding ticket to the "Opened"/"Reopened" status in Jira and assigns it back to the developer.

    2. Returned to the client with a comment (it can be: Done, Cannot reproduce, Won't Do, Duplicate, etc.).

    In this case the client can:

    a) Close the item in the spreadsheet (mark it as Done) if the comment makes it clear → QA then moves the corresponding ticket to the "Done" status in JIRA;

    b) Reopen the item in the spreadsheet (mark it as Open) if the issue is still valid → QA then creates a ticket in JIRA and assigns it to the responsible BE or FE developer (see the actions described in #1).

    3. Defined as a CR (change request):

    a) QA sends the list of CR items to the PM;

    b) the PM discusses them with the client;

    c) approved CRs are created as tasks in JIRA (with the "CR" label) and processed according to the workflow (see the actions described in #1).
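
    A minimal sketch of the mapping between the client's spreadsheet marks and the resulting JIRA actions described above. The mark and status names come from the list; the helper itself is hypothetical.

    ```python
    # Hypothetical mapping: spreadsheet mark set by the client -> QA action in JIRA.
    SPREADSHEET_TO_JIRA_ACTION = {
        "In Progress": "ticket created and assigned to the responsible BE/FE developer",
        "Fixed": "ticket moved to 'QA Approved' after QA verified the fix",
        "Done": "ticket moved to 'Done'",
        "Reopen": "ticket moved to 'Opened'/'Reopened' and assigned back to the developer",
    }

    def qa_action(mark: str) -> str:
        """Return the JIRA action QA performs for a given spreadsheet mark."""
        if mark not in SPREADSHEET_TO_JIRA_ACTION:
            raise ValueError(f"Unknown spreadsheet mark: {mark!r}")
        return SPREADSHEET_TO_JIRA_ACTION[mark]

    print(qa_action("Reopen"))
    ```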

    10. Defects priority

    Priority shows the degree of importance for the business to resolve the issue. In other words, the priority is driven by business value and indicates how soon the Issue should be fixed.

    Priority
    Level

    Description

    Highest
    (Blocker)

    Indicates that the Defect needs to be fixed immediately: core functionality fails, test execution is completely blocked, and/or (some part of) the project is blocked.

    Examples:

    • Inability to add product to cart

    • Inability to proceed to Checkout

    • Inability to complete payment

    • Mini cart disappears, cart page can’t be opened

    • Inability to log in with valid credentials

    High
    (Critical)

    Indicates that this issue has a significant impact and the Defect needs to be fixed as early as possible, though not necessarily immediately: important functionality fails, but it does not block testing right now and a workaround exists.

    Examples:

    • Accordions can’t be expanded

    • Wrong validation behavior for input fields

    • Inability to add new address within MA

    • There is no validation for e-mail field on Login page

    • An error occurs when trying to add a product to the Cart from the Wishlist

    Medium
    (Major)

    Indicates that this Issue is causing a problem and requires urgent attention.

    Examples:

    • Incorrect display of the alternative images carousel

    • Navigation arrows don’t work on Mobile

    • Required field validation is missing

    Low
    (Minor)

    Indicates that this issue has a relatively minor impact that does not break the business logic, e.g. a user interface problem.

    Examples:

    • Incorrect font size/type is used for content

    Lowest
    (Trivial)

    Indicates that fixing this Defect can be deferred until all other priority Issues are fixed.

    Examples:

    • Minor look and feel issues

    • Minor glitches

    • Less obvious spelling mistakes

    • Any other minor issue causing an inconvenience

  • Checklist template

    ID | Summary | Status | Comment / Link to the bug in JIRA
    # | Summary of verification | Pass / Fail | Put a link to the bug in Jira here in case of a failed verification

  • Test Case template

    ID: #

    Summary: Summary of the test case

    Priority: Critical / Major / Minor

    Component: Optional

    Description: Short description of the test case or its preconditions

    Documentation / Based on: Optional

    Labels: Use existing labels or create new ones

    Test Step / Activity: Steps to execute

    Expected Result: Expected result after test case execution

    Verification marks or comments / Test data: Test data for test case execution (optional)
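
    Where test cases are exported from this template into structured data (e.g. from the spreadsheet), each row can be modeled as a small record. A minimal sketch; the class and field names follow the template above but are illustrative assumptions.

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class TestCase:
        """One row of the test case template above (illustrative model)."""
        id: str
        summary: str
        priority: str                    # Critical / Major / Minor
        steps: list[str]                 # Test Step / Activity
        expected_result: str
        component: str | None = None     # optional
        description: str | None = None   # short description or preconditions
        based_on: str | None = None      # documentation reference, optional
        labels: list[str] = field(default_factory=list)
        test_data: str | None = None     # optional

    tc = TestCase(
        id="TC-001",
        summary="Subscription form accepts a valid e-mail",
        priority="Critical",
        steps=["Open the footer subscription form", "Enter a valid e-mail", "Click Subscribe"],
        expected_result="A success message is displayed",
    )
    print(tc.summary, "-", tc.priority)
    ```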

  • Testing checklists

    Frontend

    • design – pixel perfect by default (discuss with PM)

    • font, color, spacing, margin, padding etc

    • desktop/tablet/mobile + browsers

    • Scroll / Swipe on MacOS mice/touchpads

    Backend

    • submit forms → check data in WP admin

    • 404 page styling

    WP admin

    • usability

    • configure pages with different blocks, text, images etc

    • check menus for all required fields

    • check if all is understandable in the WP admin

    • check if all is configurable in the WP admin

    For all ONELINE projects, make sure that the information about CHE IT is deleted, i.e.:

    • Information about devs; check the site – here

    • Personal emails in the contact form configurations, admins' emails

    • Names of users in the admin panel

    To discuss:

    • Check for a trailing slash at the end of links, as requested by the SEO department (Pavlo)

    • Unsubscribe the test subscriber and delete them from the 3rd-party service (Katia)

    • Mandatory and correct display of messages about an error, a successful subscription, and a user who has already subscribed (Pavlo)

  • Supported browsers/devices

    Approvals 

    Stakeholder | Full Name | Status | Date of Sign Off
    Client Representative | | |
    Project Manager | | |
    DEV Lead | | REVIEW |
    FE Lead | | NOT STARTED |
    QA Lead | | APPROVED |

    Browsers and Devices Scope

    Browser/device | Component testing (full scope) | Smoke/Sanity testing | Regression testing

    Column notes: Component testing (full scope) – detailed testing per component using the whole testing scope, based on the created test cases. Smoke/Sanity testing – use only high-priority tests plus exploratory testing (an experience-based technique). Regression testing – use regression test scenarios to confirm the correct behavior of previously delivered critical functionality when a new build is deployed.

    Win10 + Chrome (the latest) * primary device for Desktop scope | (blue star) | (blue star) | (blue star)
    Win10 + Edge (the latest) | (blue star) | (blue star) | (blue star)
    Win10 + Firefox (the latest) | (blue star) | (blue star) | (blue star)
    MacOS + Safari (the latest) * primary device for Desktop scope | (blue star) | (blue star) | (blue star)
    iPhone 13 Pro Max (the latest iOS + Safari) * primary device for Mobile scope | (blue star) | (blue star) | (blue star)
    iPhone 12 Pro Max (the latest iOS + Safari) | (blue star) | (blue star) | (blue star)
    Samsung Galaxy S22 Ultra (the latest Android + Chrome) * primary device for Mobile scope | (blue star) | (blue star) | (blue star)
    Samsung Galaxy S21 Plus (the latest Android + Chrome) | (blue star) | (blue star) | (blue star)

    Clarification:

    (blue star) – used
    (empty cell) – not used

    Email Clients

    Browser/Device | Email Client
    Chrome, Windows | Gmail web
    Android | Gmail application
    iOS | Default Mail application

    To test on other browsers/devices, please refer to BrowserStack:

    https://www.browserstack.com/
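
    A minimal sketch of driving one of the scoped combinations remotely through BrowserStack's Selenium grid (Selenium 4). The user name, access key, and site URL are placeholders; the `bstack:options` capability keys follow BrowserStack's documented convention.

    ```python
    from selenium import webdriver

    # Remote Win10 + Chrome (latest) session on BrowserStack.
    options = webdriver.ChromeOptions()
    options.browser_version = "latest"
    options.set_capability("bstack:options", {
        "os": "Windows",
        "osVersion": "10",
        "userName": "YOUR_USERNAME",     # placeholder credential
        "accessKey": "YOUR_ACCESS_KEY",  # placeholder credential
    })

    driver = webdriver.Remote(
        command_executor="https://hub-cloud.browserstack.com/wd/hub",
        options=options,
    )
    driver.get("https://example.com")    # placeholder site under test
    print(driver.title)
    driver.quit()
    ```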

  • Bug report instructions

    1. Bug report template

    1.1 Summary

    Rule: The title should be self-descriptive ("What?", "How does it behave?", "Under what conditions?").
    Example: [Vacancies]. Error page is displayed after visiting the Vacancy page and changing its status.

    Rule: Bug specifics should be stated first (e.g. reproducible only on a certain device type, intermittent issue, environment-specific, or any other unique attribute).
    Example: [Tablet] [Mobile] [RT]. Profile page is not adapting to the new layout after rotating the device.
    [Intermittent] [PROD] [Companies]. Updates are not displayed on the company preview page after clicking the "Vorschau" button.

    Rule: The component name should be stated at the beginning of the title.
    Example: [Applicants]. User is not shown in the Applicants tab after applying to the publication.

    Rule: Avoid inexact phrases such as "working not appropriately", "not the proper way", "not per design". Try to be as specific as possible.
    Example (of what to avoid): [PaaS]. Report is downloaded in a not proper format.
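
    A minimal sketch of an automated check that a summary follows the conventions above; the regex, the phrase list, and the helper are illustrative assumptions, not a project standard.

    ```python
    import re

    # Illustrative pattern: one or more bracketed tags, then a non-empty description.
    SUMMARY_PATTERN = re.compile(r"^(\[[^\]]+\]\s*)+\.?\s*\S.+$")

    # Inexact phrases the rules above tell reporters to avoid.
    VAGUE_PHRASES = ("working not appropriately", "not proper", "not per design")

    def check_summary(summary: str) -> list[str]:
        """Return a list of problems with a bug summary; an empty list means it looks fine."""
        problems = []
        if not SUMMARY_PATTERN.match(summary):
            problems.append("missing [Area name] prefix or description")
        for phrase in VAGUE_PHRASES:
            if phrase in summary.lower():
                problems.append(f"inexact phrase: {phrase!r}")
        return problems

    print(check_summary("[PaaS]. Report is downloaded in a not proper format."))
    # -> ["inexact phrase: 'not proper'"]
    ```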

    1.2 Description

    • Steps should be as specific as possible;

    • Examples of pages, profiles, vacancies, etc. that can be used to reproduce the bug more easily should be provided in any suitable form (URL, ID, etc.);

    • Actual and Expected results should be provided with appropriate screenshots, whenever applicable;

    • When creating bugs, separate critical/major/minor issues and create a separate bug for each one; bugs may be combined if they have the same priority, but if the priorities differ, please create two or more bugs;

    • If, during retesting, the issues initially described in the bug are fixed, close the bug; for any new issues that appeared after the fix and weren't described in the original bug, create a NEW bug.

    1.3 Bug template

    Fill in the fields as follows:

    Project: <Project Name>

    Issue type: Bug / CB – client issue

    Summary: A brief one-line summary of the issue. The Summary field should follow this template:
    [Area name] <Short, concrete description of the problem>

    Component: <Component's name>, selected from the dropdown. The list of components is specific to each project.

    Description: The Description field should be filled in using the following template:

    Preconditions:
    REPLACE the TEXT with the needed configuration, if applicable.

    Steps to reproduce:
    1. Step_1.
    2. Step_2.
    3. Step_3.

    Actual result:
    Clear description of what actually happened.
    See the attached screenshot for more details.
    <Screenshot>

    Expected result:
    Clear description of what should have happened.
    See the attached design for more details.
    <Screenshot> / <link to the design>

    Additional information:
    REPLACE the TEXT with additional information, if applicable.

    Priority: The degree of importance for the business to resolve the defect. It is driven by business value and indicates how soon the defect should be fixed.

    Labels: Choose from the dropdown. The label indicates that the bug is related to a certain component. The list of labels is specific to each project.

    Environment: one of DEV, STG, PROD

    Attachment: If you can supplement your bug report with a picture that shows the problem, or a recording that helps others reproduce, fix, and verify the problem quickly, attach these files to the bug report. Attached files can be pictures, video recordings, or other file types, if needed.

    Linked Issues: <link to the related issue>

    Assignee: The person to whom the bug is assigned (backend issues – BE/Dev Lead; frontend issues – FE Lead)

    Epic Link: A link to the Epic, in accordance with the overall structure of epics on the project.

    Severity: The degree of impact that a defect has on the functionality of the tested component or system. Please find additional info here.

    Sprint: <sprint>

    QA Scope: <testing scope>

    2. Verification statuses templates

    Passed

    Verification status: (blue star) Passed
    Environment: <link>
    Device/Browser: Win10 + Chrome (the latest)
    Screenshot(s): <screenshot>

    Reopened

    Verification status: (blue star) Reopened
    Environment: <link>
    Device/Browser: iPhone 13 Pro Max (the latest iOS + Safari)
    Actual result: <Actual result>
    Screenshot(s): <screenshot>
    Expected result: <Expected result>

    Blocked

    Verification status: (blue star) Blocked
    Environment: <link>
    Device/Browser: Samsung Galaxy S22 Ultra (the latest Android + Chrome)
    Screenshot(s): <screenshot>
    Comment: <Additional info about the blocking issue>
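
    A minimal sketch that fills these templates programmatically so every verification comment stays uniform. The status names and fields mirror the templates above (field ordering simplified); the helper itself is hypothetical.

    ```python
    def verification_comment(status: str, environment: str, device: str,
                             screenshot: str = "<screenshot>", **extra: str) -> str:
        """Build a verification comment matching the Passed/Reopened/Blocked templates."""
        lines = [
            f"Verification status: {status}",
            f"Environment: {environment}",
            f"Device/Browser: {device}",
        ]
        # Reopened adds Actual/Expected result; Blocked adds a comment about the blocker.
        for key, value in extra.items():
            lines.append(f"{key.replace('_', ' ').capitalize()}: {value}")
        lines.append(f"Screenshot(s): {screenshot}")
        return "\n".join(lines)

    print(verification_comment(
        "Reopened",
        "<link>",
        "iPhone 13 Pro Max (the latest iOS + Safari)",
        actual_result="<Actual result>",
        expected_result="<Expected result>",
    ))
    ```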

  • Test strategy

    1. Document Revision History 

    Date | Version No. | Author | Description
    <dd/mm/yyyy> | <x.y> | | <Document revision details / Approvals>

    2. Approvals 

    The Test Strategy is approved by the following stakeholders: 

    Stakeholder | Full Name | Status | Date of Sign Off
    Client Representative | | |
    Project Manager | | |
    DEV Lead | | |
    FE Lead | | |

    3. Purpose

    The purpose of the test strategy is to define the testing approach, the types of tests, test environments, tools to be used for testing, and the high-level details of how the test strategy will be aligned with other processes. The test strategy document is intended to be a living document and will be updated when we get more clarity on Requirements, Test environment and Build management approach, etc.

    4. Project Overview

    The main purpose of the <Project Name> project is to create and style the site based on the requirements.

    5. Tools for QA planning and testing purposes

    Confluence will be used for storing all project-related information.

    JIRA will be used as a bug tracking system, as well as for planning, tracking, and analyzing the project activities and tasks.

    Google docs will be used for creating different supporting documentation (for ex. checklists or test cases).

    Browserstack will be used for cross-browser/device testing.

    6. Requirements references for <Project Name>

    Type | Link | Comment
    Active Sprint | |
    Backlog | |
    Requirements | |
    Design | |

    7. Test Environment

    Name | Link
    <Test environment 1> | <link>
    <Test environment 2> | <link>

    8. Testing Types

    The following testing types will be executed during the <Project Name> project:

    8.1.  Smoke testing

    The smoke testing will be performed to ensure that the most important functions work and all expected functional areas are available for testing. The results of smoke testing are used to decide if a build is stable enough to proceed with further testing. In other words, smoke tests play the role of acceptance criteria for each new build.
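
    Where any of these checks are automated, the smoke subset is commonly selected with a test marker. A minimal sketch using pytest; the base URL, the test names, and the marker registration are illustrative assumptions.

    ```python
    # Register the marker once in pytest.ini:  markers = smoke: build acceptance checks
    import pytest
    import requests

    BASE_URL = "https://staging.example.com"  # placeholder test environment

    @pytest.mark.smoke
    def test_homepage_responds():
        """Smoke check: the build serves the homepage at all."""
        assert requests.get(BASE_URL + "/").status_code == 200

    @pytest.mark.smoke
    def test_login_page_available():
        """Smoke check: a core functional area is reachable."""
        assert requests.get(BASE_URL + "/login").status_code == 200
    ```

    Running `pytest -m smoke` then executes only this acceptance subset against each new build.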

    8.2. Functional testing

    Functional testing will be executed to evaluate the compliance of the system, its components, and third-party integrations with the specified functional requirements and the corresponding expected results. Functional testing is performed for each planned feature and is guided by the approved client requirements.

    8.3. Regression testing

    Regression testing will be performed to ensure that bugs have been fixed, that no previously working functions have failed as a result of the fixes, and that newly added features have not caused problems in previous versions of the software.

    Regression testing is usually performed when all components have been tested based on the created high-priority test cases and no critical or blocking bugs found during component testing remain open.

    Regression testing is usually done after the code freeze and is always done before the deployment to production.

    8.4. Design (Responsive) testing

    Design testing will be performed at all testing levels to ensure the solution meets the design-related specifications.

    Responsive testing on tablet and mobile devices focuses on the project's business logic within the scope of features.

    Design testing will be based on the approved scope of the UI designs – <link to the design>.

    (blue star) Design testing will NOT be based on pixel-to-pixel verification, but on the general look only (element positions, colors).

    (blue star) Responsive testing on other intermediate resolution values is OOS (out of scope).

    8.5. Cross-browser compatibility testing

    Cross-browser compatibility testing will be performed to check that the solution works correctly across the agreed list of browsers.

    Cross-browser testing will be covered manually, on the Test Environment only, on the browsers defined for cross-browser testing.

    9. Planned testing types on the test environments (browsers, devices, email clients)

    9.1. Browsers and Devices Scope

    Browser/device | Component testing (full scope) | Smoke/Sanity testing | Regression testing

    Column notes: Component testing (full scope) – detailed testing per component using the whole testing scope, based on the created test cases. Smoke/Sanity testing – use only high-priority tests plus exploratory testing (an experience-based technique). Regression testing – use regression test scenarios to confirm the correct behavior of previously delivered critical functionality when a new build is deployed.

    Win10 + Chrome (the latest) * primary device for Desktop scope | (blue star) | (blue star) | (blue star)
    Win10 + Edge (the latest) | (blue star) | (blue star) | (blue star)
    Win10 + Firefox (the latest) | (blue star) | (blue star) | (blue star)
    MacOS + Safari (the latest) * primary device for Desktop scope | (blue star) | (blue star) | (blue star)
    iPhone 13 Pro Max (the latest iOS + Safari) * primary device for Mobile scope | (blue star) | (blue star) | (blue star)
    iPhone 12 Pro Max (the latest iOS + Safari) | (blue star) | (blue star) | (blue star)
    Samsung Galaxy S22 Ultra (the latest Android + Chrome) * primary device for Mobile scope | (blue star) | (blue star) | (blue star)
    Samsung Galaxy S21 Plus (the latest Android + Chrome) | (blue star) | (blue star) | (blue star)

    Clarification:

    (blue star) – used
    (empty cell) – not used

    9.2. Email Clients

    Browser/Device | Email Client
    Chrome, Windows | Gmail web
    Android | Gmail application
    iOS | Default Mail application

    10. Approach for Process Flow

    10.1. Work with Tasks

    Tasks will be split into BE and FE.

    (warning) Pay attention:

    All specific statuses and labels should be defined according to the project.

    1. All Tasks selected for the current/next Sprint can be picked up for test design.

    2. All Tasks that have the status “Ready for QA” should be assigned to QA.

    3. All found issues that relate to the Task should be linked to it.

    4. In case acceptance criteria are met, QA leaves a comment that the Task is "Ready for Demo".

    5. After the Demo, all Accepted Tasks will be closed by the responsible person.

    10.2. Work with Bugs

    Bug creation tips:

    • If the found bug is related to a certain task, it should be linked to that task.

    • If QA finds a Blocker/Critical bug while testing a ticket and the bug is not related to the task, it should be added to the Active sprint.

    • If QA finds a Major/Minor/Trivial bug while testing a ticket and the bug is not related to the task, it should be reported and added to the backlog.

    Bug verification tips:

    • If the ticket passes verification, QA should add a detailed comment with a screenshot (video if needed) and move it to the "Approved"/"Done" status.

    • If the ticket fails verification, QA should add a detailed comment with a screenshot (video if needed) and move the ticket to the "Reopened" status.

    Bugs priority

    Priority shows the degree of importance for the business to resolve the Issue. In other words, the priority is driven by business value and indicates how soon the Issue should be fixed.

    Blocker

    Indicates that the Defect needs to be fixed immediately: core functionality fails, test execution is completely blocked, and/or (some part of) the project is blocked.

    Critical

    Indicates that this Issue is causing a problem and requires urgent attention.

    Major

    Indicates that this issue has a significant impact and the Defect needs to be fixed as early as possible, though not necessarily immediately: important functionality fails, but it does not block testing right now and a workaround exists.

    Minor

    Indicates that this issue has a relatively minor impact. So, it may be fixed after the release / in the next release.

    Trivial

    Indicates that fixing of this Defect can be deferred until all other priority Issues are fixed.

    10.3. Regression testing procedure

    Regression testing will be performed before UAT, based on impact analysis, to ensure that bugs have been fixed, that no previously working functions have failed as a result of the changes, and that newly added features have not caused problems in previous versions of the software.

    The scope of regression testing is planned based on the priorities of the planned test cases and is complemented by impact analysis, if any.

    Entrance criteria:

    • Planned Tasks are done; all the found defects are registered in JIRA;

    • All blocker and critical defects for all features are fixed and acceptance criteria are met;

    • The features are deployed to the test environment - DEV.

    • The Production Candidate build is accepted by the QA team.

    Exit criteria:

    • All blocker and critical defects, found during Regression testing for all features are fixed and all acceptance criteria are met.

    • PO (product owner) confirms that all is good.

    • PO provides the final Go/NoGo decision.

    General tips:

    • After the QA team finishes regression testing, an official email with the results and the list of issues should be sent to the client team;

    • All regression bugs have to be reviewed by the PO to confirm the business priority;

    • All blocker/critical bugs found during regression should be fixed prior to the release.

    11. Test Cases creation approach

    Test design will start once any initial requirements (project passport, design, etc.) are provided to the development team.

    As a result, high-level test cases for component testing will be created in the internal checklists for component and regression testing.

    Whenever existing requirements are changed or new ones are received, the test documentation will be expanded with new checklists and test cases.

    12. Build procedure

    The main QA activities after a new build is deployed are described below:

    12.1. In case it's a Standard build with new features and bug fixes:

    The QA team performs smoke testing:

    • if no blocker/critical issues are found and the build is stable, the QA team prepares a list of bugs found during smoke testing (if any) and informs that the build is accepted. The Product Owner prioritizes which bugs should be part of the Sprint and which can go to the backlog;

    • if blocker/critical issues are found during smoke testing, the QA team prepares a list of bugs found during smoke testing and informs that the build is declined (after this, the issues should be fixed and a new build should be deployed asap).

    After smoke testing is done and the build is accepted, the QA team verifies fixed bugs and tests the included tasks.

    12.2. In case it's a Release Candidate (Code Freeze):

    The QA team performs smoke testing:

    • if no blocker/critical issues are found and the build is stable, the QA team prepares a list of bugs found during smoke testing (if any) and informs that the build is accepted;

    • if blocker/critical issues are found during smoke testing, the QA team prepares a list of bugs found during smoke testing and informs that the build is declined (after this, the issues should be fixed and a new build should be deployed asap).

    After smoke testing is done and the build is accepted, the QA team verifies fixed bugs and starts/continues Regression testing.

    12.3. In case it's a Release build:

    The QA team performs smoke testing:

    • if no blocker/critical issues are found and the build is stable, the QA team prepares a list of bugs found during smoke testing (if any) and informs that the build is accepted;

    • if blocker/critical issues are found during smoke testing, the QA team prepares a list of bugs found during smoke testing (after discussion with the Product Owner, two ways are possible: 1. Rollback; 2. Hotfix).
