Using AI assistance in OpenOps requires enabling AI in OpenOps settings and configuring at least one connection to a large language model (LLM) provider. To enable AI and configure an LLM connection:
  1. In the OpenOps left sidebar, click the Settings icon at the bottom.
  2. In the Settings view, click AI. The AI providers section opens.
  3. Switch on the Enable AI toggle.
  4. In the Connection dropdown, select Create Connection. The Create AI Connection view opens.
  5. In the Provider dropdown, select one of the supported LLM providers: Anthropic, Azure OpenAI, Cerebras, Cohere, Deep Infra, DeepSeek, Google Generative AI, Google Vertex AI, Groq, Mistral, OpenAI, OpenAI-compatible providers, Perplexity, Together.ai, and xAI Grok.
  6. In the Model dropdown, select one of the models your LLM provider supports.
  7. (Optional) If the model you’re looking for is not listed, specify it in the Custom model field. A custom model overrides any selection in the Model dropdown.
  8. Enter your API key for the selected LLM provider in the API Key field.
  9. (Optional) Enter a Base URL for the selected model. This is useful if you want to route requests through a proxy or if your LLM provider does not use the default base URL (see the example after this list). If you selected OpenAI Compatible as the provider, the base URL is required.
  10. (Optional) Use the Provider settings and Model settings fields to specify custom parameters as JSON. The JSON schema varies depending on the chosen provider and model. For example, if you’ve selected OpenAI, you can use Provider settings for JSON that you’d normally pass to the createOpenAI function, and Model settings for JSON that you’d normally pass to the streamText function (see the sketches after this list). For more details, see the OpenAI documentation.
  11. Click Save to apply your changes in the Create AI Connection view.
  12. (Optional) Back in the AI providers section, if you’re working with AWS and you want your AI connection to have access to AWS MCP servers, click AWS Cost in the MCP section, and select an AWS connection to use. This enables access to AWS Cost Explorer MCP Server, AWS Pricing MCP Server, and AWS Billing and Cost Management MCP Server.
  13. Click Save to apply your changes in the AI providers section.
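As an illustration of the Base URL in step 9: if you run a local OpenAI-compatible server such as Ollama, the base URL would typically point at its OpenAI-compatible endpoint. The value below assumes Ollama’s default port; the exact URL depends on your setup:

```text
http://localhost:11434/v1
```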
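As a sketch of Provider settings in step 10, assuming OpenAI is the selected provider: the JSON might carry options that the AI SDK’s createOpenAI function accepts, such as an organization ID or extra request headers. The values here are placeholders, not real credentials:

```json
{
  "organization": "org-your-organization-id",
  "headers": { "OpenAI-Beta": "assistants=v2" }
}
```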
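Similarly, a sketch of Model settings: the JSON might carry generation options that streamText accepts, such as a sampling temperature and an output token cap. Exact key names (for example, maxTokens versus maxOutputTokens) depend on the AI SDK version in use, so treat this as an assumption to verify against your provider’s documentation:

```json
{
  "temperature": 0.2,
  "maxTokens": 2048
}
```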
Configuring an LLM connection enables all AI assistance features in OpenOps.