Get Started with Workflow Builder
Our no-code approach is anchored in a visual workflow builder that prioritizes ease of use: an intuitive drag-and-drop interface, built-in chatbot assistance, and a level of abstraction that suits both technical and non-technical teams.

Technical teams can extend these capabilities further with a custom “Code” node (e.g., Python), a custom “API” node, or their own tools for orchestrating LLM-driven actions, as sketched below.
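As a rough illustration of what a Python “Code” node might contain, the sketch below filters a JSON payload from an upstream node. The input name (`in_0`) and the convention of returning a dict are illustrative assumptions, not StackAI’s documented node interface.

```python
# Hypothetical body for a Python "Code" node: the input variable name (in_0)
# and the returned dict are illustrative assumptions, not a documented contract.
import json

def run(in_0: str) -> dict:
    """Parse a JSON payload from an upstream node and keep only open tickets."""
    records = json.loads(in_0)
    open_tickets = [r for r in records if r.get("status") == "open"]
    return {
        "count": len(open_tickets),
        "tickets": open_tickets,
    }
```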
Our users can integrate with a broad ecosystem of applications, covering the most common use cases across departments:
| Department / Use Case | Integrations |
| --- | --- |
| Data & Analytics | Power BI, BigQuery, Databricks, Snowflake, FRED, Excel (SharePoint), GSheets, Typeform, etc. |
| Engineering & Dev Tools | GitHub, Regex, SerpAPI, Weaviate, etc. |
| AI & Machine Learning | E2B, Pinecone, Wolfram Alpha, HyperBrowser, Reducto, VLM |
| CRM & Sales | Salesforce, HubSpot, LinkedIn, PitchBook, Yahoo Finance |
| Marketing | HubSpot, LinkedIn, Gmail, Outlook, YouTube |
| Project & Task Management | Asana, ClickUp, Jira, Notion, Make, Coda, Miro |
| Collaboration & Communication | Slack, Loom, Gmail, Outlook, GDocs, Knowledge Base |
| ERP & Business Operations | Oracle, NetSuite, Workday, Veeva |
| Storage & File Systems | Google Drive, Dropbox, OneDrive, SharePoint, SharePoint (NTLM), Azure Blob Storage, AWS S3 |
| Finance & Reporting | Excel, Airtable, Power BI, Yahoo Finance |
| Forms & Surveys | Typeform, GSheets |
| HR & People Ops | Workday, Outlook, LinkedIn |
| Automation & Integration | Hightouch, Make, Slack |
| Web & Social Monitoring | YouTube, Firecrawl, Exa AI |
Our users can connect to a wide range of AI models:
| Model Type | Integrations |
| --- | --- |
| LLMs | OpenAI, Anthropic, Google, Meta, xAI, Mistral, Perplexity, TogetherAI, Cerebras, Replicate, Groq, Azure, Bedrock, any local LLM via endpoint, StackAI’s fine-tuned text-to-SQL model |
| Voice models | Deepgram, ElevenLabs |
| Image models | Stable Diffusion, Flux |
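For the “any local LLM via endpoint” entry above, the workflow can point at a self-hosted model behind an HTTP endpoint. The sketch below shows what such a call typically looks like, assuming an OpenAI-compatible server; the URL, port, and model name are placeholders, not values prescribed by StackAI.

```python
# Minimal sketch of calling a locally hosted, OpenAI-compatible LLM endpoint.
# The base URL, port, and model name below are illustrative assumptions.
import requests

resp = requests.post(
    "http://localhost:8000/v1/chat/completions",
    json={
        "model": "local-llama",
        "messages": [{"role": "user", "content": "Summarize this week's support tickets."}],
        "temperature": 0.2,
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```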

StackAI also supports connections to MCP servers, so your workflows can use not only the integrations developed by StackAI’s team but also tools served by third parties over the MCP protocol.
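To make that concrete, the sketch below shows a minimal third-party MCP server exposing a single tool, written with the MCP Python SDK’s FastMCP helper. The tool itself (a toy currency converter with a hard-coded rate) is a hypothetical example and not a StackAI component.

```python
# Minimal MCP server exposing one tool, using the MCP Python SDK (pip install "mcp[cli]").
# The tool below (a toy currency converter with a fixed rate) is purely illustrative.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("currency-tools")

@mcp.tool()
def usd_to_eur(amount_usd: float) -> float:
    """Convert US dollars to euros using a fixed, illustrative rate."""
    return round(amount_usd * 0.92, 2)

if __name__ == "__main__":
    # Serve over stdio; any MCP-aware client can then list and call this tool.
    mcp.run(transport="stdio")
```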
Letting LLMs take autonomous actions is straightforward: users add ‘tools’ (i.e., function calling) directly in the LLM node by selecting from a long list of available actions, and more advanced users can register their own custom tools, as sketched below.
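For orientation, a custom tool in a typical function-calling setup boils down to a schema describing the tool plus a handler the orchestrator invokes when the model requests it. The sketch below follows the widely used OpenAI-style tool format; the handler and its names are hypothetical, not StackAI’s specific custom-tool API.

```python
# Illustrative custom tool for LLM function calling. The schema follows the
# common OpenAI-style tool format; the handler and its data are hypothetical.
import json

lookup_order_tool = {
    "type": "function",
    "function": {
        "name": "lookup_order",
        "description": "Look up the status of a customer order by its ID.",
        "parameters": {
            "type": "object",
            "properties": {
                "order_id": {"type": "string", "description": "Internal order identifier."},
            },
            "required": ["order_id"],
        },
    },
}

def lookup_order(order_id: str) -> str:
    """Toy handler the orchestrator would call when the model selects this tool."""
    fake_db = {"A-1001": "shipped", "A-1002": "processing"}
    return json.dumps({"order_id": order_id, "status": fake_db.get(order_id, "unknown")})
```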


Users can interact with the assistant directly within the workflow builder to ask questions, get suggestions for improving their project, and easily access documentation for specific features.
