Debugging flows
Diagnose failing Maestro flows with the view hierarchy, debug output, and step-by-step replay in Maestro Deck.
Debugging a failing Maestro flow comes down to three questions: what was on screen, what selector the flow looked for, and what actually matched (or didn't). Maestro Deck and the CLI give you the tools to answer all three. This guide is the minimum you need to fix a failure without resorting to "I'll just rerun it."
TL;DR
When a step fails, open the run in Maestro Deck and read the view hierarchy at the failure frame, not the screenshot. Most failures fall into three buckets: (1) the selector is wrong, (2) the timing is wrong, (3) the screen never reached the expected state. The fix differs for each.
Read the failure frame
Maestro Deck records a view hierarchy snapshot for every step. Click the failed step in the run timeline; the right pane shows:
- A screenshot at that frame.
- The full view hierarchy (XML/JSON tree).
- The selector your step used.
- Every node that matched the selector.
If the selector matched zero nodes, the screenshot is misleading: the element you see may not be the one your selector describes. Native containers, web views, and shadow elements can all look identical in a screenshot.
Three failure patterns
1. Selector mismatch
tapOn: "Sign in" fails because the button reads "Sign In" or "Log in" or has no text at all (icon-only). The hierarchy will show the actual text/id; copy it into the flow.
Prefer accessibility IDs over text where possible:
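A minimal sketch of the swap (the id `sign_in_button` is hypothetical; use whatever the hierarchy actually shows):

```yaml
# Fragile: breaks if the copy changes or the app is translated
- tapOn: "Sign in"

# Robust: matches the accessibility ID regardless of the visible label
- tapOn:
    id: "sign_in_button"
```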
IDs survive translations and copy changes; text doesn't.
2. Timing
tapOn runs before the element is hittable. Symptoms: passes locally, fails in CI, fails after a backend slowdown.
Fix with extendedWaitUntil:
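A sketch of the pattern (the element text and timeout are assumptions; adjust to your flow):

```yaml
# Wait up to 10 seconds for the button to appear before tapping it
- extendedWaitUntil:
    visible: "Sign in"
    timeout: 10000   # milliseconds
- tapOn: "Sign in"
```

Unlike a sleep, the wait returns as soon as the element is visible, so the happy path stays fast.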
Avoid blanket sleep — it makes the suite slower and doesn't actually fix the flake. See reducing flaky tests for the deeper take.
3. Wrong screen
The flow asserts on a screen the app never reached. Usually a backend error, a logged-out session, or a feature flag that flipped. The screenshot at failure shows a login screen, an error toast, or a paywall.
This is not a flow bug. Stop fixing the flow; fix the environment.
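That said, asserting the expected screen before acting makes the failure report point at the right step and produce a clearer message. A sketch (the screen text is an assumption):

```yaml
# Fail here, with a clear assertion message, if the app never reached checkout
- assertVisible: "Checkout"
- tapOn: "Place order"
```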
Capturing evidence
Run with --debug-output ./debug to write screenshots, hierarchy, and logs to disk on every step:
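For example (the flow filename is an assumption):

```shell
maestro test checkout.yaml --debug-output ./debug
```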
In CI, upload ./debug as an artifact. When a flow fails on the runner but passes locally, the debug output is the only way to know why.
Per-step screenshots
Force a screenshot at any step to compare runs:
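A sketch using takeScreenshot (the name is arbitrary; Maestro writes it out as a PNG):

```yaml
- takeScreenshot: after-login   # saved as after-login.png
- tapOn: "Continue"
```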
Combined with --debug-output, this gives you a frame-by-frame diff of the run.
Live debugging in Maestro Deck
Open the flow in the editor pane and click "Step". The runner pauses at each step, and you can inspect the view hierarchy of the running app between steps. Useful for authoring new flows or working out why a tapOn finds the wrong element.
Reproducing CI failures locally
The mismatch between "passes on my laptop, fails on CI" usually comes from one of:
- Different Maestro CLI version. Pin it.
- Different OS version on the simulator. Match the CI image.
- Different locale. Set LANG=en_US.UTF-8 on both.
- Network isolation. CI may not reach localhost; staging may not reach a developer endpoint.
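For instance, to run locally with the same locale CI uses (the flow filename is an assumption):

```shell
LANG=en_US.UTF-8 maestro test checkout.yaml
```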
When you can't reproduce, run the exact CI workflow locally with act or the GitLab equivalent before assuming it's a Maestro issue.
When to give up and instrument
If a flow has failed three times for three different reasons, stop. The flow is testing too much. Split it: one flow for sign-in, one for add-to-cart, one for checkout. Smaller flows fail at the right place and are easier to fix.
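One way to split while still reusing shared steps is runFlow; a sketch (the file paths and appId are assumptions):

```yaml
# checkout.yaml — assumes sign-in and cart setup live in their own flow files
appId: com.example.shop
---
- runFlow: sign-in.yaml
- runFlow: add-to-cart.yaml
- tapOn: "Checkout"
- assertVisible: "Order confirmed"
```

Each sub-flow can also run on its own, so a sign-in failure shows up in the sign-in flow, not three steps into checkout.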