LLM Provider Governance

Control which LLMs your organization can access, and where your information is sent and stored.

Local LLMs

Stack AI allows organizations to connect and use their own local LLMs (such as models hosted on private infrastructure or on-premises servers) instead of relying solely on cloud-based providers such as OpenAI or Amazon Bedrock.

Governance Benefits:

  • Data Privacy: Your data never leaves your infrastructure, ensuring compliance with strict privacy or regulatory requirements.

  • Custom Control: You can select, update, or fine-tune models as needed, and restrict which users or workflows can access them.

  • Auditability: All usage of the local LLM can be logged and monitored within your organization’s security perimeter.

How it works in Stack:

  • Admins can add a local LLM as a provider in the Stack AI admin console (a connection sketch follows this list). See our guide.

  • Once added, the local LLM appears as an option in the workflow builder, just like any other provider.

  • You can set permissions to control which users or teams can access the local LLM.
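
A common way to make an on-premises model consumable is to serve it behind an OpenAI-compatible endpoint (for example, with vLLM or Ollama). The sketch below shows what a call against such an endpoint looks like; the URL, port, and model name are placeholders for your own deployment, not Stack AI values.

```python
# Minimal sketch: query a locally hosted model through an OpenAI-compatible
# endpoint (e.g., served by vLLM or Ollama). URL, port, and model name are
# placeholders for your own deployment, not Stack AI values.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # assumed local endpoint
    api_key="unused",  # local servers typically ignore the key, but the client requires one
)

response = client.chat.completions.create(
    model="my-local-model",  # placeholder name registered on the local server
    messages=[{"role": "user", "content": "Summarize our data retention policy."}],
)
print(response.choices[0].message.content)
```

Because the request stays inside your network, prompts and completions never leave your infrastructure.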


Turning Off Stack API Keys

By default, Stack AI provides hosted API keys for popular providers (like OpenAI, Anthropic, etc.) so users can get started quickly. However, for governance and security, organizations may want to require the use of their own API keys or connections.

Governance Benefits:

  • Credential Control: Prevents users from accidentally or intentionally using Stack’s shared keys, ensuring all API usage is billed to and controlled by your organization.

  • Security: Reduces risk of data leakage or misuse of shared credentials.

  • Compliance: Ensures all API access is auditable and tied to your organization’s own accounts.

How it works in Stack:

  • Admins can disable Stack-provided API keys for any provider in the admin console.

  • Once disabled, users must add their own connection (API key) to use that provider in workflows; a sketch of the org-owned-key pattern follows this list.

  • This setting can be enforced globally or per-provider.
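
Once Stack-provided keys are disabled, every request runs on credentials your organization owns. As a minimal sketch of that pattern, the snippet below reads a provider key from your own environment or secret manager rather than a shared credential; the variable name ORG_OPENAI_API_KEY is an assumption for illustration, not a Stack AI setting.

```python
# Minimal sketch: use an org-owned provider key sourced from your own
# environment/secret manager instead of a shared, platform-provided key.
import os
from openai import OpenAI

# ORG_OPENAI_API_KEY is a placeholder variable name, not a Stack AI setting.
api_key = os.environ.get("ORG_OPENAI_API_KEY")
if not api_key:
    raise RuntimeError("No org-owned API key configured; refusing to fall back to a shared credential.")

client = OpenAI(api_key=api_key)
response = client.chat.completions.create(
    model="gpt-4o-mini",  # any model your org's key is entitled to use
    messages=[{"role": "user", "content": "Hello from an org-owned connection."}],
)
print(response.choices[0].message.content)
```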


Deactivating Certain Providers

Stack AI supports a wide range of providers (OpenAI, Bedrock, Google, Slack, etc.). For governance, you may want to restrict which providers are available to your users.

Governance Benefits:

  • Risk Mitigation: Prevents use of unapproved or high-risk providers.

  • Simplified Compliance: Ensures only vetted and compliant services are available.

  • User Experience: Reduces clutter and confusion by hiding unused or irrelevant providers.

How it works in Stack:

  • Admins can deactivate (hide or block) any provider from the admin console; a sketch of the underlying allow-list idea follows this list.

  • Deactivated providers will not appear in the workflow builder or connection menus for end users.

  • This can be managed at the organization or workspace level.
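
Conceptually, deactivating providers is an allow-list: only vetted services remain visible, and everything else is filtered out before it reaches end users. The sketch below illustrates that idea in plain Python; it is not Stack AI code, and the provider names and policy structure are assumptions for illustration.

```python
# Illustrative only: not Stack AI code. Provider names and the policy shape
# are assumptions used to show the allow-list idea behind deactivation.
APPROVED_PROVIDERS = {"openai", "bedrock"}  # vetted by security/compliance
ALL_PROVIDERS = {"openai", "bedrock", "google", "slack", "anthropic"}

def visible_providers(all_providers: set[str], approved: set[str]) -> set[str]:
    """Return only the providers an end user should see in the workflow builder."""
    return all_providers & approved

print(sorted(visible_providers(ALL_PROVIDERS, APPROVED_PROVIDERS)))
# ['bedrock', 'openai']
```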


Summary Table

| Feature | What It Does | Governance Benefit |
| --- | --- | --- |
| Add Local LLM | Use your own on-prem/private LLM | Data privacy, control, auditability |
| Turn Off Stack API Keys | Require org-owned API keys for providers | Credential control, security, compliance |
| Deactivate Certain Providers | Hide/block specific providers from user access | Risk mitigation, compliance, simplicity |
