
Rainforest process


Running tests, reviewing failures, and commenting in Zephyr

Which run groups to run for which release?

Note: The browser tests for Cloud releases have been split into groups as the original groups were getting too big. For releases going out in the first half of the month, run the (1st Half of Month) groups. Similarly, for releases going out in the second half of the month, run the (2nd Half of Month) groups.

  1. Self-managed releases

    a. All run groups starting with (RFA) Browser_.

  2. Mobile releases

    a. All run groups starting with Android_ and iOS_.

  3. Cloud releases

    a. All run groups starting with Cloud Workspace_.

Additional notes

  • Run groups starting with Browser_ are the old crowd-tested groups. Once conversion of all groups to Rainforest automation (RFA) is complete, these groups will be deprecated.

  • Currently, we are not running any Desktop App tests in Rainforest.
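
Because the selection above is purely name-based, it's easy to script if you ever need to. Here's a minimal sketch in Python (illustrative only — the prefixes come from the rules above, and it assumes the half-of-month label appears in the Cloud group names as described in the note at the top of this section):

```python
# Illustrative only: pick run groups for a release by the naming conventions
# described above. `run_groups` would be the list of group names as shown in
# the Rainforest UI.
from datetime import date

PREFIXES = {
    "self-managed": ["(RFA) Browser_"],
    "mobile": ["Android_", "iOS_"],
    "cloud": ["Cloud Workspace_"],
}

def groups_for_release(release_type: str, run_groups: list[str], on: date) -> list[str]:
    """Select the run groups to set off for a given release type and date."""
    prefixes = PREFIXES[release_type]
    selected = [g for g in run_groups if any(g.startswith(p) for p in prefixes)]
    if release_type == "cloud":
        # Cloud browser groups are split by half of month; keep only the half
        # matching the release date (assumes the label is part of the name).
        half = "(1st Half of Month)" if on.day <= 15 else "(2nd Half of Month)"
        selected = [g for g in selected if "Half of Month" not in g or half in g]
    return selected

# Example: groups_for_release("mobile", all_group_names, date.today())
```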

Webhook for reporting to Zephyr

Before setting off any group of tests to run, the webhook must be enabled in Rainforest so that it reports to Zephyr. See the wiki for details. To set the webhook, go to the Sites page in Rainforest.

  1. Click on Staging.

  2. Check the box to the left of Webhook Enabled, then click on the Save button.

The webhook is now enabled, and all runs set off after enabling it will report to Zephyr in the Rainforest Tests folder. Run results in this folder will have the same title, plus run ID, as the run group you set off (e.g. Android - Account Settings), so all you need to do is drag and drop each result to the release folder it belongs to.
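
If you ever want to script that matching, the title convention above is enough to recover the run group name. A minimal sketch, assuming the run ID is appended to the title as a trailing number (the exact format may differ from what Zephyr shows, so adjust the parsing to what you actually see in the folder):

```python
# Hypothetical helper: match incoming Zephyr results to release folders by
# run group name. The trailing-run-ID title format is an assumption.
import re

def run_group_from_title(result_title: str) -> str:
    """Strip a trailing run ID (e.g. '#880602' or '(880602)') from a result title."""
    return re.sub(r"\s*[#(]\d+\)?\s*$", "", result_title).strip()

def release_folder_for(result_title: str, folder_by_group: dict[str, str]) -> str | None:
    """Return the release folder a result belongs to, or None if unknown."""
    return folder_by_group.get(run_group_from_title(result_title))

if __name__ == "__main__":
    folders = {"Android - Account Settings": "v1.50 Mobile Release"}
    print(release_folder_for("Android - Account Settings #880602", folders))
    # -> v1.50 Mobile Release
```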

Preparing a run group to run

Go to https://app.rainforestqa.com/run_groups.

  1. Click on the run group you want to run, e.g. Android - Account Settings. The run group's detail screen opens.

  2. Click on the blue Run button at the top right of the screen.

  3. Click on Additional Options at the bottom left.

  4. Type the release name into the Should this run be tied to your release process field, e.g. v1.50 release.

  5. Click the back arrow to the left of Additional Options to return to the previous screen.

  6. Scroll down so that Platform is visible and check that the correct platform to test on is selected (for mobile runs, you can click on Mobile VM to display only the mobile options).

  7. Once you are certain that the correct platform is selected (for this example, Android 12.0 (Beta) is correct), click on Start Run.

The run will begin and you can view results at https://app.rainforestqa.com/runs?page=1&mode=standard.

Note: On the run results page, you can set a filter to show only “Standard” runs to filter out any draft runs.
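
Runs can also be kicked off outside the UI. Below is a hedged sketch using the Rainforest REST API: the /api/1/runs endpoint and CLIENT_TOKEN header are what the public rainforest-cli has historically used, and the release field mirroring the "Should this run be tied to your release process" box is an assumption — check the current Rainforest API docs before relying on this.

```python
# Hedged sketch: start a run group via the Rainforest REST API instead of the
# UI. Endpoint, header name, and the `release` field are assumptions based on
# rainforest-cli; verify against the current Rainforest API docs.
import os
import requests

def start_run_group(run_group_id: int, release: str) -> dict:
    resp = requests.post(
        "https://app.rainforestqa.com/api/1/runs",
        headers={"CLIENT_TOKEN": os.environ["RAINFOREST_API_TOKEN"]},
        json={"run_group_id": run_group_id, "release": release},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

# Example: start_run_group(12345, "v1.50 release")
```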

Reviewing failures

Crowd-tested run groups

From the run results page (https://app.rainforestqa.com/runs?page=1&mode=standard), click on the run group you want to review. For this example, we’ll use Android - Account Settings.

  1. On the run group’s results page, notice that 2 tests failed. Click on the first test that failed.

  2. Scroll down until you see a red X to the right of the step that failed, then click on the X.

    Note: Sometimes one crowd tester will get further along in a test than another tester, so there may be more than one point of failure, as there is in this example. In cases like this, click on both X’s to see what failed.

    Let’s look at the failure at the first red X. You can see that 2 testers passed the step but one tester failed it because the language in Mattermost was not English. (This sometimes happens when previous testers have not reset the app to English before ending the test.) In cases like this, add a comment that this was the case.

    Let’s look at the failure at the second red X. You can see that one tester failed the step because he didn’t know how to change the language back to English in the incognito browser window. In this case, he did not read or follow the instructions, which read: “Do you see language change to Chinese in the Mobile app? IMPORTANT: IF THIS STEP FAILS, do not LEAVE THE TEST BEFORE CHANGING THE LANGUAGE BACK TO ENGLISH IN THE INCOGNITO WINDOW BY FOLLOWING STEPS 9 AND 10”.

  3. When it starts to look like testers are confused, open the test and preview it to ensure instructions are clear and test steps are correct. To do this, click on Open Test at the top of the screen.

    a. Once you’ve clicked on Open Test, the test will open in a new browser window. Click on the Preview button at the top of the screen.

    b. In preview mode, no platform is selected, and you need to select one before previewing the test. In this example, we need to click on Android 12. Once you’ve selected the platform, click on the blue Preview button at the bottom right of the screen.

    c. A new browser window will open showing the test preview. Click on I understand and go ahead with previewing the test.

    d. The test preview is exactly what testers see as they are testing. It’s usually helpful to have the test open in another window so you can edit any steps that are unclear as you go through the preview.

  4. Once you have previewed the test (and made any updates to the test itself), go back to the failed test in the run group. For this example, it’s this page: https://app.rainforestqa.com/runs/880602/tests/237283/browsers/android_phone_12/steps/103213942. As these examples have already been reviewed and categorized, you can see that I refactored this test for clarity, but if you were reviewing the test on failure, you’d:

    a. Click on the Categorize Failure button at the top right of the screen.

    b. Select Needs Refactor, then click on the blue Next button (the option to refactor the test yourself is already checked).

    c. Type in your comments/reason for the refactor, then click the blue Save button.

    d. Once you’ve clicked Save, you’ll see that the Categorize Failure button has changed to Categorized: Needs Refactor.

    Note: Although the button shows NEEDS REFACTOR, you have already refactored the test. To see all tests categorized as Needs Refactor, go to the Dashboard. There you can click the radio buttons to the left of the tests you’ve categorized as Needs Refactor and they will disappear from the screen. This is the process I’ve followed, as it’s easier for me to refactor a test while I’m reviewing failures. If you’d prefer to review all tests first and then go to the Dashboard and work through the tests you need to refactor, that’s also totally possible :)

To review the next failure in a run group, you can click on the Next Failure or Previous Failure buttons at the bottom of the screen. Once all failures have been reviewed, these buttons are both grayed out.

Automated run groups

From the run results page (https://app.rainforestqa.com/runs?page=1&mode=standard), click on the run group you want to review failures in. For this example, we’ll look at (RFA) Browser - Multi-Team & DM (Mac/Catalina).

  1. On the run group’s results page, click on the first failed test.

  2. The test page will open and automatically scroll to the failed step, showing the corresponding UI on the right-hand side.

  3. At first glance, this looked like a bug, as Team Open was not in view. However, when reviewing previous steps in the automated test run, I couldn’t work out why the test had failed. Sometimes it’s difficult to see exactly what was clicked on in the automated replay. In this instance, I opened the test and ran a preview. To do this, click on the Open Test option at the top right of the screen.

  4. You’ll be taken to the test preview page. Once the VM has loaded on the right-hand side, click the Select All button at the top left of the screen, then manually deselect the first 2 steps, as these only load the VM to the starting page.

  5. Click on the blue Preview xx Actions button and the automated preview of the test will start. If any step fails while you are previewing, you can determine why, recapture the screenshot, add in a “wait” step, and so on. We can do some practice runs to review automated failures :)

  6. For this particular example, I ran the test preview and it passed, so I categorized this test as Other, as I had no idea why the test had failed in the run.

  7. To review the next failure in the run group, click on the Next Failure or Previous Failure options at the bottom of the screen.

Commenting on test failures in Zephyr

Having enabled the webhook before running tests, once the run groups have completed, results are posted in the ~QA: Rainforest Results channel in Mattermost as well as in the Rainforest Tests folder in Zephyr.

Note: The run groups in Zephyr can be dragged and dropped to the release folder they belong to.

For ease of workflow, I usually have three windows open: the failed test in Rainforest, the group of results in the ~QA: Rainforest Results channel in Mattermost, and the run group in Zephyr.

We will use the example run above to explain the commenting process in Zephyr:

  1. The test in Rainforest (RF-B577 MM-T4173 - Boards Plugin - (Mac/Catalina) - Empty Board - Groups) failed due to browser rendering issues after Rainforest updated the browsers on their VMs, so, after updating the screenshot, the test can be categorized as “System Issues”.

  2. In Mattermost, reply to the run group post that contains failures (for this example, Boards Plugin - macOS) and start a thread giving the reasons for the failures.

  3. In Zephyr, find the failed test (for this example, “Boards App - Empty Board - Groups”), then click on Start a new test execution.

  4. Click on Not executed and select Comment. This will put a pink bar to the left of the test.

  5. In the Comment field, add a reason for the failure (in this example, browser rendering issues).

  6. Continue with the process above until all failures have been commented on in Zephyr.

    Note: Of course, if the failure was due to a bug, it will remain in the “Fail” status in Zephyr and you can add a link to the Jira bug ticket in the Comment field. (No need to “Start a new test execution” for failures due to actual bugs).
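
The triage rules from the last three sections condense into a small decision table. Here's a sketch for reference (the names are ours, not Rainforest or Zephyr API calls):

```python
# Illustrative decision table for failure triage, condensing the process
# described above. None of these names are Rainforest or Zephyr API calls.
from enum import Enum

class Category(Enum):
    NEEDS_REFACTOR = "Needs Refactor"   # unclear instructions/steps: refactor the test
    SYSTEM_ISSUES = "System Issues"     # e.g. browser rendering changes on Rainforest VMs
    OTHER = "Other"                     # preview passes and the failure can't be explained
    BUG = "Bug"                         # a real product defect

def zephyr_action(category: Category) -> str:
    """What to do in Zephyr for a reviewed Rainforest failure."""
    if category is Category.BUG:
        # Real bugs keep the Fail status; link the Jira ticket in the Comment field.
        return "leave status as Fail and add the Jira ticket link as a comment"
    # Everything else: start a new test execution, set it to Comment (pink bar),
    # and record the reason for the failure in the Comment field.
    return "start a new test execution, select Comment, and add the reason"
```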

