Troubleshoot a Workflow

Most of the time, your workflow just works - and that’s the point!

When something doesn’t look right, the cause is usually something small. This article helps you get comfortable diagnosing those moments quickly.

Ask AI

The easiest way to find the root cause of an error is to chat with the built-in AI expert, which has extensive knowledge of the StackAI platform.

The Ask AI feature can also help when there is no error. It is located at the bottom of the screen and can handle advanced tasks, such as adding or updating nodes in the workflow based on your prompt.

Inspect Nodes

The Analytics page provides node-level information so you can dig into the data flowing through the workflow and trace the actions taken at each step.

Inspect Versions

For each iteration, it is recommended to add a description of the changes for future reference.

Using Version History, you can revert to a previous snapshot of the workflow.

Ways to Improve Testing

  1. Test within the nodes

Some nodes, such as Python and Send HTTP Request, require iterating on code and parameters.

To speed up the building process, you can run incremental tests directly in the node and avoid running the entire workflow end to end.

Similarly, many actions provide a Test Action option that lets you preview the results.
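
For instance, if a Python node transforms an API response, you can paste a small sample of the payload into the node and run the logic by itself. The sketch below is illustrative only; the function and field names are hypothetical, not part of any StackAI API.

```python
# Illustrative only: a small transform you might iterate on inside a Python
# node. The payload structure and field names here are hypothetical.

def extract_order_summary(payload: dict) -> dict:
    """Pull out the fields that downstream nodes need."""
    return {
        "order_id": payload.get("id"),
        "total": sum(item.get("price", 0) for item in payload.get("items", [])),
    }

# Quick incremental test with representative sample data, no full run needed.
sample = {"id": "A-123", "items": [{"price": 9.99}, {"price": 5.00}]}
print(extract_order_summary(sample))  # {'order_id': 'A-123', 'total': 14.99}
```

Once the sample input produces the expected output, you can run the workflow end to end with more confidence.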

  2. Pin and Skip nodes

Skip node allows you to bypass selected nodes in the workflow while observing the results of the remaining nodes.

Pin node allows you to freeze the output of large language model (LLM) nodes. LLM responses can vary for the same input due to their probabilistic nature, so pinning the output helps maintain consistency during testing.
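
Conceptually, pinning works like caching a model response and replaying it on later runs, so downstream nodes always see the same input. The sketch below illustrates that idea only; `call_llm` is a hypothetical stand-in, not a StackAI function.

```python
import random

def call_llm(prompt: str) -> str:
    """Hypothetical stand-in for a model call; responses vary run to run."""
    return f"{prompt} -> answer #{random.randint(1, 100)}"

llm_cache: dict[str, str] = {}

def pinned_llm(prompt: str) -> str:
    """Return a cached ('pinned') response so repeated test runs stay stable."""
    if prompt not in llm_cache:
        llm_cache[prompt] = call_llm(prompt)  # real call happens only once
    return llm_cache[prompt]

print(pinned_llm("Summarize the report"))  # first call: output may vary
print(pinned_llm("Summarize the report"))  # same pinned output as above
```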

Enable Preventative Settings

Builders often encounter two common errors related to large language model (LLM) nodes when they first get started:

  1. The input length exceeds the LLM's context window.

  2. The LLM provider experiences an outage.

For each of these errors, there is a setting in the LLM node that prevents the workflow from failing. See the LLM node documentation for details on Safe Context Window and Fallback Branch.
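
To make the two safeguards concrete, here is a minimal sketch of the underlying ideas, assuming a hypothetical 8,000-token context window and placeholder provider functions; it is not how StackAI implements these settings.

```python
# Illustrative sketch of the two safeguards, not StackAI's implementation.
# A rough token budget trims oversized input (the idea behind Safe Context
# Window), and a try/except routes to a second provider during an outage
# (the idea behind Fallback Branch). Both provider functions below are
# hypothetical stand-ins.

MAX_TOKENS = 8000  # assumed context window for the primary model

def rough_token_count(text: str) -> int:
    return len(text) // 4  # crude heuristic: ~4 characters per token

def safe_truncate(text: str, budget: int = MAX_TOKENS) -> str:
    """Trim input so it fits within the model's context window."""
    return text if rough_token_count(text) <= budget else text[: budget * 4]

def primary_provider(prompt: str) -> str:
    raise ConnectionError("simulated provider outage")

def fallback_provider(prompt: str) -> str:
    return f"fallback model answered: {prompt[:30]}..."

def call_with_fallback(prompt: str) -> str:
    trimmed = safe_truncate(prompt)
    try:
        return primary_provider(trimmed)
    except ConnectionError:
        # Instead of failing the workflow, fall back to a second provider.
        return fallback_provider(trimmed)

print(call_with_fallback("Summarize this very long document..."))
```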
