Overview

A visual builder that helps Jira admins create automation without feeling like they need to be experts first. Clear steps, safer testing, fewer “did I just break something?” moments.

See it in action
A short walkthrough showing how Jira admins build, test, and validate automations using the Visual Workflow Builder.

The problem

Automation is powerful, but most builders assume deep knowledge. For everyday admins, setup felt risky, slow, and easy to mess up.

Context
Automation is essential at scale

Jira teams rely on automation to reduce repetitive work, but the setup experience often assumes expert-level comfort.

Goals
  • Help more admins successfully set up automation
  • Increase the number of working rules created and reused
  • Reduce trial-and-error mistakes during setup

What admins felt
High risk, low confidence

It wasn’t obvious what would happen after a change, so people avoided exploring or kept rules overly basic.

What that caused
Drop-off before value
  • More abandoned setups
  • More mistakes and rework
  • Low reuse across teams

Research and insights


Methodology

We ran a moderated usability study to understand how Jira admins actually build automations today, where they lose confidence, and what would make setup feel safer and faster.


Sessions were recorded, transcribed, and scored using task completion and a short post-session questionnaire.

Recruiting and participants

We recruited 16 participants through 3 channels to get a realistic mix of experience levels and environments.


User sample:

  • 7 from UserTesting (general pool)
  • 5 internal Appfire users (not on the JMWE team)
  • 1 HubSpot, 2 LinkedIn, 1 Roblox

Personas

We needed the experience to feel approachable for day-to-day admins without slowing down power users who build complex automations. The personas helped us sanity-check every decision: simple when you start, scalable when you grow.

Journey map

We mapped the end-to-end flow from “install” to “first working rule” and marked the exact moments where confidence drops, usually right before committing changes.

Study design

To keep the study grounded in real work, we asked participants to complete 3 common JMWE tasks, then fill out a short questionnaire at the end of the 60-minute session. A sketch of the rule logic behind Task 1 follows the task list.


  • Task 1: Automatically set an Epic to Done when all issues under it are Done
  • Task 2: Reopen a parent issue when one of its subtasks is reopened
  • Task 3: Create multiple subtasks based on parent issue settings
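
To make Task 1 concrete, here is a minimal sketch of the rule logic it asks admins to build. This is illustrative Python with a hypothetical Issue model and on_issue_transitioned hook; it is not JMWE's actual API.

from dataclasses import dataclass, field

@dataclass
class Issue:
    # Hypothetical stand-in for a Jira issue; not a real Jira or JMWE type.
    key: str
    status: str = "To Do"
    children: list["Issue"] = field(default_factory=list)

def on_issue_transitioned(issue: Issue, epic: Issue) -> None:
    # Task 1: when an issue under the Epic reaches Done, close the Epic
    # once every issue under it is Done.
    all_children_done = bool(epic.children) and all(
        child.status == "Done" for child in epic.children
    )
    if issue.status == "Done" and all_children_done:
        epic.status = "Done"

# Closing the last open story under the epic closes the epic as well.
epic = Issue("EPIC-1", "In Progress")
epic.children = [Issue("ST-1", "Done"), Issue("ST-2", "Done")]
on_issue_transitioned(epic.children[1], epic)
print(epic.status)  # Done

Tasks 2 and 3 exercise the same pattern in reverse (reopening a parent when a subtask reopens) and in bulk (creating multiple subtasks from parent settings).
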
Post-test questionnaire

This survey helped us understand how admins prioritize speed, simplicity, and control when evaluating automation tools.


How we scored the experience

Alongside task success rate, we aggregated the post-test questionnaire into a single UX score out of 10, based on five core UX criteria.
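
As one illustration of how such a roll-up could work, here is a sketch that assumes an equal-weight average of the five criterion scores, rounded to the nearest half point; the study's actual weighting and rounding may have differed.

def ux_score(criteria: dict[str, float]) -> float:
    # Equal-weight average of 0-10 criterion scores, snapped to 0.5
    # increments. Assumed formula for illustration only.
    mean = sum(criteria.values()) / len(criteria)
    return round(mean * 2) / 2

baseline = {
    "effectiveness": 2.0,
    "efficiency": 0.0,
    "satisfaction": 0.0,
    "learnability": 0.0,
    "loyalty": 0.0,
}
print(ux_score(baseline))  # 0.5 / 10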


2 / 10
Effectiveness

Users couldn’t reliably complete the tasks.

0 / 10
Efficiency

Progress was slow, with heavy assistance required.

0 / 10
Satisfaction

The experience felt frustrating and discouraging.

0 / 10
Learnability

People didn’t build confidence as they went.

0 / 10
Loyalty

Participants didn’t feel confident enough to reuse or recommend the product.

0.5 / 10
Overall UX score

A single roll-up score across effectiveness, efficiency, satisfaction, learnability, and loyalty.

What we learned
  • The experience feels disjointed: users move between screens without clear context.
  • Critical choices are buried in long lists, forcing users to scan, guess, and reread.
  • Walls of text create cognitive overload, especially for first-time setup.
  • The lack of visual guidance makes even simple workflows feel intimidating.

Shipped solution

Instead of listing every small issue, we grouped the work into the few changes that drove the biggest clarity and confidence gains.

Structure

Make the workflow visual and readable, even as it grows in complexity.


Before
Users had to choose from a long, flat list of actions with little context. It was hard to understand what each option did or whether it was the right one.
After
Actions are grouped by intent, explained in plain language, and previewed in context, so admins can confidently pick the right step without guessing.

Clarity

Make it obvious what a control does before you click it.


Before
Long lists, no guidance. Users had to guess what to pick and hope it was right.
After
Tooltips guide users through their first setup, explaining each step and pointing to what to do next.

Validation

We ran moderated, recorded sessions with admins across a range of experience levels, focusing on whether they could complete key setup tasks confidently.

100%
Task success

All participants completed the core tasks.

2 min
Avg. completion time

Fast time to a first working rule.

92%
Felt intuitive

Participants rated the experience as enjoyable and clear.

Impact

Clearer setup, fewer mistakes, and faster time-to-value for Jira admins.

+30%
Activation

More users successfully reached their first working automation.

8%→100%
Task success

Usability testing moved from near-zero success to reliable completion.

~2 min (-92%)
Time to automation

Admins could build a basic automation in minutes.

3.5→4.2
Customer Effort Score

CES improved meaningfully after the rollout.


Next case study
Want the next story? It focuses on how users reach value faster and what we changed to reduce drop-off.
Next Case Study: Getting Users to Value →