QA workflow
Note: Specific teams may develop their own variations on this workflow. This is generally how QA works at Mattermost.
Daily
PR reviews (see below)
Jira tickets (see below)
Investigate and reproduce potential issues from reports in channels
Submit bug tickets as issues arise
Help engage with contributor questions in channels
Weekly
Participate in team triage/planning meetings
Participate in QA Guild meeting
Consider giving a demo in QA meeting and/or R&D meeting
(QA Platform) Rainforest release testing and maintenance
(QA Platform) Cypress webapp release testing and maintenance
(QA Platform) Manual release testing and maintenance
Monthly
SDET (Software Development Engineer in Test, the test engineers in QA) rotation to monitor nightly automated tests and investigate and update as needed (see below)
Oversee your team's release testing and QA-approve releases (cadence may vary by team/product)
(QA Platform) Detox mobile release testing and maintenance
As needed
Develop test plans for new features, add test cases to Zephyr test management
Approve new features after manual testing passes
(QA Platform) Oversee testing infrastructure and server needs
Help tend to community campaigns such as test automation Hackathons and QA community feature testing
Add or update documentation as you find need or opportunity
PR reviews
PR author adds the 3: QA Review label to every PR
PR author adds a specific QA person as a reviewer after PM/UX and dev reviews are done
Exceptions can be made for general smoke testing or other agreed-upon scenarios when it makes sense for QA review to be done in parallel with other reviewers
QA review assignee defaults to the QA person on that team
Read through the PR and its associated Jira ticket to establish context
Create a test build if needed
Labels can be added to trigger test builds for webapp/server, mobile apps
Test server info auto-posts in a comment on the PR
Mobile test builds auto-post in the QA: Builds channel; uninstall the existing Mattermost app from your device, then install the test build from the APK file or plist link
Desktop: Ask for a test build or new RC on the PR or request in the Developers: Desktop App channel
Establish or verify test plan and scope if not already spelled out
Can use Test Plan Template as appropriate, especially for major features
Ensure test cases are added or updated in Zephyr test management
For most PRs/bug fixes, note test details in the PR and related Jira ticket (no separate document needed)
Ask questions and report issues in comments on the PR, mentioning the PR author (and the PM/another dev as needed)
Request changes via GitHub's Review Changes flow, which can make the PR easier to manage in your to-do list
Verify unit and E2E tests are present and provide adequate coverage
If needed, add comment mentioning the author to ask about adding tests
Important: Ensure that E2E tests contain the proper mapping key to connect with Zephyr test management (see the spec sketch at the end of this section)
Note that some pre-approved scenarios, such as many Server team PRs that don't require QA testing, may use the QA Deferred label
When complete, select Review Changes, add comment, and approve
Remove the 3: QA Review label
If other reviews are still open, add the QA Review Done label
If yours was the last review (commonly the case), add the 4: Reviews Done label
You may be the one to merge the PR if you're the last reviewer; this varies by team
On the related Jira ticket, assign yourself as QA Assignee
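For the mapping-key item above, here is a minimal, hypothetical Cypress spec sketch. The MM-T number, route, and selectors are placeholders rather than real Mattermost identifiers; the point is only that the Zephyr test case key is carried in the test title so the automated test can be traced back to Zephyr test management.

```typescript
describe('Messaging', () => {
    // "MM-T0000" is a placeholder Zephyr test case key. Using the real key as
    // the title prefix is what maps this automated test back to the Zephyr
    // test case.
    it('MM-T0000 Posted message is visible in the channel', () => {
        // # Open the app and post a message (route and selectors are illustrative)
        cy.visit('/');
        cy.get('#post_textbox').type('Hello from QA{enter}');

        // * Verify the message renders in the channel
        cy.contains('.post-message__text', 'Hello from QA').should('be.visible');
    });
});
```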
Verifying and closing resolved Jira tickets
Handling various ticket resolutions
Won't Fix, Won't Do, Invalid: Leave in Resolved (PMs go through these periodically)
Duplicate: Ensure the duplicate ticket is linked in a comment and close as a duplicate
Done: Test according to test steps / test plan as appropriate
May not need to re-test for self-managed release if already satisfactorily tested for previous Cloud release
Be sure to test across Web App, Mobile apps, and Desktop App
Ensure it is noted in the ticket and test case if any of the expected behavior or the bug itself was environment-specific (e.g. Android-only, Safari-only, etc.)
Ensure related Zephyr test cases are linked to the ticket for traceability (you can link the test case from the Zephyr section of the ticket)
Update test cases in Zephyr as needed
Always leave a comment on the Zephyr test case explaining your changes
Add a link to the related ticket in the test step field, as it gives testers an easy way to see if it’s a known issue or perhaps a “Won’t Fix” situation
And triple-check that the test case shows the related ticket in Traceability
Issues found? Communicate with the dev Assignee and either reopen or file a new ticket as desired. Can at-mention the Assignee in a comment and/or in a channel to discuss what's best
Tested and all passed? Note testing results in a comment and close the ticket
Leaving open to re-test on other fix versions
If you test on Cloud and want to re-test when the next self-managed RC is cut, add the done-cloud label and leave in Resolved
If a hotfix is being backported to additional versions, add done-5.38, for example, as the fix is tested on each version
Always leave a comment explaining what testing has been completed and what is left to test
UI test automation
Happy-path tests for new features and bug fixes should be written or updated by the dev writing the feature or fix
Help to review E2E submissions and to add tests for more in-depth test cases
Ensure test case keys from Zephyr test cases are included in E2E tests
Help maintain nightly automated tests
Monitored by SDET on weekly rotation
See QA: UI Test Automation channel and QA: Test Automation Reports channel
Test failures can be routed to the appropriate team, but SDET rotation can help monitor and answer questions
Demote and promote nightly tests from and to production as needed (see the spec header sketch after this list)
Help review and shepherd community contributions
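A rough sketch of how a spec might be marked for the nightly production run, assuming a metadata-comment convention for stage and group (the stage and group values, test key, and assertion are illustrative). Under that assumption, promoting a stable test means adding the production stage tag, and demoting a flaky one means removing that tag rather than deleting the test.

```typescript
// Stage: @prod               <- stage metadata (assumed convention): remove to demote a flaky test
// Group: @channels @messaging

describe('Channels > Messaging', () => {
    it('MM-T0000 Placeholder for a promoted nightly test', () => {
        cy.visit('/');
        cy.title().should('contain', 'Mattermost'); // illustrative assertion
    });
});
```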