
PromptQL Configuration

Overview

You can manage PromptQL's LLM settings and system instructions from a single file, promptql_config.yaml, which is automatically created at the root of your DDN project when you initialize the project with the --with-promptql flag using the Hasura DDN CLI.
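For reference, project initialization looks roughly like the following sketch. The project name `my-project` is a placeholder, and the exact subcommand may vary across DDN CLI versions; only the --with-promptql flag is taken from this page.

```shell
# Scaffold a new DDN project with PromptQL enabled.
# This creates promptql_config.yaml at the project root.
ddn supergraph init my-project --with-promptql
```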

Examples

Minimal configuration:

kind: PromptQlConfig
version: v1
definition:
  llm:
    provider: hasura
Custom providers for the LLM & AI primitives, plus custom system instructions:

kind: PromptQlConfig
version: v1
definition:
  llm:
    provider: openai
    model: o3-mini
  ai_primitives_llm:
    provider: openai
    model: gpt-4o
  system_instructions: |
    You are a helpful AI Assistant.

With this file, you can:

  • Set the LLM provider and model used across the application.
  • Define a separate LLM for AI primitives such as classification, summarization, and extraction.
  • Add system instructions that apply to every PromptQL interaction.

Next steps