
Providers & Models

Introduction

PromptQL supports configuring different LLM providers and models to tailor the experience to your application's needs.

This gives you the flexibility to choose the most performant and cost-effective solution for your use case, and the freedom to switch between providers and models as needed, depending on the task.

For example, you can use one model for conversational tasks and another for advanced AI primitives:

```yaml
kind: PromptQlConfig
version: v1
definition:
  llm:
    provider: openai
    model: o3-mini
  ai_primitives_llm:
    provider: openai
    model: gpt-4o
```

llm

The llm configuration is used to define the LLM provider and model for conversational tasks in your application. In the example above, we're using openai as the provider with the o3-mini model.

ai_primitives_llm

The ai_primitives_llm configuration is used to define the LLM provider and model for AI primitives in your application. This is used for tasks such as program generation and execution. In the example above, we're using openai as the provider with the gpt-4o model.
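Because the two keys are independent, you can also mix providers. The sketch below pairs an OpenAI model for conversation with an Anthropic model for AI primitives; the provider and model values are drawn from the tested lists later on this page, but verify they match your own account access:

```yaml
kind: PromptQlConfig
version: v1
definition:
  llm:
    provider: openai
    model: o3-mini
  ai_primitives_llm:
    provider: anthropic
    model: claude-3-5-sonnet-latest
```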

Available providers & models

Anthropic

You can use any Anthropic model; the following have been tested with PromptQL:

  • claude-3-5-sonnet-latest
  • claude-3-7-sonnet-latest

AWS Bedrock

You can use any Bedrock-wrapped model; the following have been tested with PromptQL:

  • Claude 3.5 Sonnet
  • Claude 3.7 Sonnet

NB: For Bedrock models, you'll need to provide a model_id that resembles this string:

arn:aws:bedrock:<AWS region>:<AWS account ID>:inference-profile/us.anthropic.claude-3-5-sonnet-20241022-v2:0
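Putting that together, a Bedrock configuration might look like the sketch below. The exact provider key and the use of model_id in place of model are assumptions here; consult the configuration schema for your PromptQL version, and substitute your own region and account ID for the placeholders:

```yaml
llm:
  # provider key assumed; check your version's PromptQlConfig schema
  provider: bedrock
  model_id: arn:aws:bedrock:<AWS region>:<AWS account ID>:inference-profile/us.anthropic.claude-3-5-sonnet-20241022-v2:0
```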

Google Gemini

You can use any Google Gemini model; the following have been tested with PromptQL:

  • gemini-1.5-flash
  • gemini-2.0-flash

Hasura

Hasura is the default provider; with it, no specific model is necessary in your configuration.

Microsoft Azure

You can use any Azure foundational model.

OpenAI

You can use any OpenAI model; the following have been tested with PromptQL:

  • o1
  • o3-mini
  • gpt-4o

Base Model

When using the hasura provider, the default model is claude-3-5-sonnet-latest. This is the recommended model for PromptQL program generation.
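Since the hasura provider rejects the model key (see the considerations below), a minimal configuration using it is just the provider itself:

```yaml
kind: PromptQlConfig
version: v1
definition:
  llm:
    provider: hasura
```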

Considerations

  • The model key is not supported when using the hasura provider.
  • The value for a model key is always in the dialect of the provider's API.
  • If ai_primitives_llm is not defined, it defaults to the provider specified in the llm configuration.
  • system_instructions are optional but recommended to customize the behavior of your LLM.
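As a sketch of the last point, system_instructions could be added alongside the llm configuration; the exact placement within the definition block is an assumption here, so check the configuration schema for your PromptQL version:

```yaml
kind: PromptQlConfig
version: v1
definition:
  # placement of system_instructions is assumed; verify against your schema
  system_instructions: |
    You are an assistant for an e-commerce analytics team.
    Prefer concise answers and always state currency units.
  llm:
    provider: anthropic
    model: claude-3-5-sonnet-latest
```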