Tools

Add tools directly to your LLM Node to let the LLM decide when to use them. The LLM determines when and how to call these tools based on the context of the conversation and the user's inputs. Unlike an outside node, whose output is always passed to the LLM, a tool is integrated into the LLM itself. This approach works best when:

  • You don't need the tool to be invoked on every query; it's fine for the LLM to decide autonomously when to use the tool.

  • You would like the LLM to have access to multiple tools at once.

Tool Provider

Before using a tool, it's important to understand how tools are organized in Stack AI. Tools are grouped under "Providers" - these are the main services or systems that contain related functionality. Think of a Provider as a container for multiple related tools.

For example:

  • Salesforce (Provider)

    • Create Lead (Tool)

    • Update Contact (Tool)

    • Search Records (Tool)

This organization makes it easy to find and use related tools. Additionally, a provider's tools share authentication headers and connection details, so every tool within the same provider can reuse the same credentials when performing its actions.

LLM tool to execute another StackAI project

How It Works

  • The LLM analyzes the user's request or query to understand what action needs to be taken

  • It identifies which tool (API endpoint) is most appropriate for fulfilling that request

  • It automatically constructs the API request by filling in:

    • Query parameters

    • Body parameters

    • Path parameters

    • Headers

    • Any other required request data
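The steps above can be sketched in code. The helper below is illustrative only (it is not part of the Stack AI SDK): it takes a tool definition and the arguments the LLM filled in, then assembles path parameters, query parameters, a JSON body, and headers into a single HTTP request.

```python
import json
from urllib import parse, request

def build_tool_request(base_url, tool, args, headers=None):
    """Assemble the HTTP request for a tool call the LLM produced.

    `tool` is a dict with "path", "method", and the names of its path
    parameters; `args` holds the values the LLM filled in. Hypothetical
    shape, for illustration only.
    """
    path_params = tool.get("path_params", [])

    # Substitute path parameters, e.g. /pets/{petId} -> /pets/42
    path = tool["path"].format(**{k: args[k] for k in path_params})

    # Remaining args become query parameters (GET) or the JSON body (POST/PUT)
    rest = {k: v for k, v in args.items() if k not in path_params}
    url = base_url + path
    data = None
    if tool["method"] == "GET":
        if rest:
            url += "?" + parse.urlencode(rest)
    else:
        data = json.dumps(rest).encode()

    return request.Request(
        url,
        data=data,
        method=tool["method"],
        headers={"Content-Type": "application/json", **(headers or {})},
    )

tool = {"path": "/pets/{petId}", "method": "GET", "path_params": ["petId"]}
req = build_tool_request("https://api.example.com", tool, {"petId": 42, "limit": 5})
```

The LLM's job is to supply the `args`; the platform handles the mechanical request construction shown here.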

Tools vs. Separate Node

When should you use a tool, and when should you use a separate node? It depends on what you want to accomplish. If you want to enforce that the app is used at every invocation, use an outside node: the LLM will have to use the app every time. If you only want the app to be used when necessary, and you want the LLM to decide, use a tool!

Tools are also a great choice if you want the LLM to have options. For example, if you want it to search LinkedIn, the Web, and your own knowledge base, you can add those tools to the same LLM and it may use one, two, or all of the options to answer your query.

On the other hand, if you want to guarantee that a search is carried out across LinkedIn, the Web, and your KB, then it's better to have three separate nodes delivering their output to the LLM. In this case, be careful! Concatenating inputs could exceed your chosen model's context window.

Prompt Optimization with Tools

If you'd like to reference the tool directly in your user prompt, type @ and then select the tool.

When using custom tools with an LLM node, it's important to provide clear prompting to help the LLM understand how and when to use your tools effectively:

  1. Describe the Tool's Purpose: Include a clear description of what the tool does and when it should be used in your system prompt. For example: "Use the addPet tool to add a new pet to the store database."

  2. Provide Usage Examples: Give examples of proper tool usage in your prompts to demonstrate the expected input/output patterns. For example: "addPet(name='Max', category='dog', status='available')"

  3. Set Clear Instructions: Specify any requirements or constraints for using the tool in your prompts. For example: "When using addPet, ensure all required fields (name, category, status) are provided."

  4. Handle Errors: Include guidance on how to handle potential errors or edge cases when using the tool. For example: "If addPet returns an error, verify the input data and try again with corrected values."

Example system prompt:

When the user wants to include a new pet, follow these steps:

1. Ask for the name of the pet.
2. Use the listPets tool to check if the name already exists. If it does, ask the user for a different name that is not in the list.
3. If the pet name is unique, collect all required information for addPet.
   3.1. If any information is missing, ask the user for it.
4. Use the addPet tool to create the new pet entry.
5. Use the getPetById tool to retrieve the newly created pet.
6. Provide a summary confirming the successful pet addition with the key details.

Custom Tools

Custom Tools enable AI agents to execute custom actions by integrating with your API systems and services. When you define API endpoints in your custom tools, each endpoint becomes a distinct tool that the LLM can utilize.

A custom tool represents a specific API endpoint and its functionality. Each tool has several key components:

  • Name: A unique identifier for the tool that can be referenced in LLM prompts. For example, if you name a tool addPet, you would reference it as "addPet" when instructing the LLM to use it.

  • Description: A clear explanation of what the tool does. This helps the LLM understand when and how to use the tool appropriately.

  • Path: The API endpoint path that the tool will call (e.g., /api/v1/pets)

  • Method: The HTTP method to use (GET, POST, PUT, DELETE, etc.)

When the LLM needs to create a new pet, it can reference the addPet tool by name and provide the necessary parameters based on the tool's description and requirements.
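These components can be pictured as a simple record. The class below is a sketch, not part of the Stack AI SDK; it just mirrors the four fields described above, using the addPet example.

```python
from dataclasses import dataclass

@dataclass
class CustomTool:
    """One API endpoint exposed to the LLM as a callable tool.

    Illustrative only: field names mirror the components described
    in the documentation, not an actual Stack AI class.
    """
    name: str         # unique identifier referenced in prompts, e.g. "addPet"
    description: str  # tells the LLM when and how to use the tool
    path: str         # API endpoint path, e.g. "/api/v1/pets"
    method: str       # HTTP method: GET, POST, PUT, DELETE, ...

add_pet = CustomTool(
    name="addPet",
    description="Add a new pet to the store database.",
    path="/api/v1/pets",
    method="POST",
)
```

A clear `description` matters most here: it is the main signal the LLM uses to decide when the tool applies.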

Create a Custom Tool

Custom tools are defined through API services, allowing you to integrate external functionality into your LLM. When you create a custom tool, you'll describe your API endpoints and their capabilities.

Each API endpoint becomes a distinct tool that represents a specific action or operation in your system. The LLM will automatically understand how to use these endpoints and fill in the required parameters (like body and query parameters) based on the context and user input.

For example, if you have an e-commerce API:

  • The /products endpoint becomes a tool for retrieving product information, where the LLM can fill search parameters

  • The /orders/create endpoint becomes a tool for placing new orders, with the LLM providing order details in the request body

  • The /inventory/update endpoint becomes a tool for managing stock levels, where the LLM determines the updated quantities
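A compact way to picture this mapping is one entry per endpoint, noting which part of the request the LLM fills in. The tool names and parameter details below are hypothetical, for illustration only.

```python
# Hypothetical tools derived from the e-commerce endpoints above.
# "llm_fills" notes which part of the request the LLM supplies.
ecommerce_tools = {
    "getProducts": {
        "path": "/products",
        "method": "GET",
        "llm_fills": "query parameters (e.g. search term, category)",
    },
    "createOrder": {
        "path": "/orders/create",
        "method": "POST",
        "llm_fills": "request body (product IDs, quantities, address)",
    },
    "updateInventory": {
        "path": "/inventory/update",
        "method": "POST",
        "llm_fills": "request body (item ID and updated quantity)",
    },
}
```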

This approach lets you transform your existing APIs into reusable tools that can be easily incorporated into any LLM, making your external services and systems accessible to AI agents. The LLM handles the complexity of constructing proper API requests by intelligently filling parameters based on the conversation context. Custom tools help you build more maintainable and scalable flows by promoting code reuse and modular design.

To create a custom tool:

  1. Navigate to an LLM Node that supports Tools (like GPT-4 or Claude)

  2. Click the "Tools" button in the Tools section

  3. Select the "Custom tools" tab where your custom tools will appear. Click the "Add Custom Tool" button.

This will open the custom tool creation interface where you can define your tool's functionality.

Adding Tool Information

To create a custom tool, you need to include:

  1. Tool Provider Name: Give your tool provider a descriptive name that represents the service or system

  2. OpenAPI Schema: Provide the OpenAPI specification that defines your API endpoints. The schema must include:

    • Server URLs for the API endpoints

    • Complete endpoint definitions with:

      • Important! Clear descriptions explaining what each endpoint does and its purpose to help the LLM understand how to use them correctly

      • HTTP methods (GET, POST, PUT, etc.)

      • Path parameters

      • Query parameters for GET requests

      • Detailed request body schemas for POST/PUT requests

      • Response schemas

      • Required headers specific to endpoints

  3. Common Headers (Optional): Define headers that should be applied across all endpoints, such as:

    • Authentication headers (e.g. API keys)

    • Custom headers required by your API

Each API endpoint defined in your OpenAPI schema will be automatically transformed into an individual tool that you can use in your LLMs. Taking time to properly configure these settings will make your tools more user-friendly and reliable.
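To make the schema requirements concrete, here is a minimal OpenAPI 3.0 document for a hypothetical pet-store API, written as a Python dict. It shows the pieces listed above: a server URL, a description per operation, query parameters for GET, a request body schema for POST, and response schemas. Each operation (`listPets`, `addPet`) would become its own tool; common headers such as API keys would be configured separately as described in step 3.

```python
# Minimal OpenAPI 3.0 schema for a hypothetical pet-store API.
# Endpoint names and fields are illustrative only.
openapi_schema = {
    "openapi": "3.0.0",
    "info": {"title": "Pet Store", "version": "1.0.0"},
    "servers": [{"url": "https://petstore.example.com/api/v1"}],
    "paths": {
        "/pets": {
            "get": {
                "operationId": "listPets",
                "description": "List all pets, optionally filtered by status.",
                "parameters": [{
                    "name": "status",
                    "in": "query",
                    "required": False,
                    "schema": {"type": "string",
                               "enum": ["available", "sold"]},
                }],
                "responses": {"200": {"description": "A list of pets."}},
            },
            "post": {
                "operationId": "addPet",
                "description": "Add a new pet to the store database.",
                "requestBody": {
                    "required": True,
                    "content": {"application/json": {"schema": {
                        "type": "object",
                        "required": ["name", "category", "status"],
                        "properties": {
                            "name": {"type": "string"},
                            "category": {"type": "string"},
                            "status": {"type": "string"},
                        },
                    }}},
                },
                "responses": {"201": {"description": "Pet created."}},
            },
        },
    },
}
```

Note how each operation carries its own `description`; as emphasized above, these descriptions are what let the LLM choose the right tool and fill its parameters correctly.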

Your custom tool will now appear in the tools panel and can be used in any LLM!
