Prompting

In the Prompting section of the LLM Node, you will see two sections: the System Prompt and the User Prompt.

System Prompt

The system prompt sets the overall behavior, tone, and role of the AI assistant for the entire conversation. It acts as a set of instructions or context that the model should always keep in mind when generating responses. This message is passed into the model's context each time you interact with it, so the model will always "remember" what you say here. Keep this section short and informative, and include only the most important details. It's best used for setting rules, style, or persona (e.g., "You are a helpful tutor. Always explain things simply.").

User Prompt

The user prompt is the main message or question that the LLM will answer. It can include direct user input, references to other nodes, or additional context. This is the main content the LLM will respond to, after considering the system prompt.

Include placeholders in your user prompt if you'd like to import output from other nodes (e.g., user input). You can do this by typing a backslash and then selecting the node whose output you'd like to include.
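Under the hood, a system prompt and a user prompt are typically combined into a chat-style message list, with placeholders filled in from other nodes' outputs. The sketch below illustrates this pattern; the node name `UserInput` and the `{{...}}` placeholder syntax are illustrative assumptions, not the product's exact syntax.

```python
# Illustrative sketch: combining a system prompt and a user prompt into a
# chat-style message list. The node name "UserInput" and the {{...}}
# placeholder syntax are assumptions for demonstration only.

system_prompt = "You are a helpful tutor. Always explain things simply."
user_prompt = "Explain this concept to a beginner: {{UserInput}}"

# Output imported from another node (hypothetical example value)
node_outputs = {"UserInput": "recursion"}

def fill_placeholders(template: str, outputs: dict) -> str:
    """Replace each {{NodeName}} placeholder with that node's output."""
    for name, value in outputs.items():
        template = template.replace("{{" + name + "}}", value)
    return template

messages = [
    # The system prompt is included on every interaction, so the model
    # always "remembers" it.
    {"role": "system", "content": system_prompt},
    # The user prompt is the main content the model responds to.
    {"role": "user", "content": fill_placeholders(user_prompt, node_outputs)},
]

print(messages[1]["content"])
# → Explain this concept to a beginner: recursion
```

Because the system prompt rides along with every message, keeping it short leaves more of the context window for the user prompt and any imported node output.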
