The Gateway server is configured through environment variables, set in a .env file placed alongside the Terraform files used to deploy it on your cloud. Here’s a skeleton for your convenience:
QUALITY_LEVEL=medium

CLASSIFIER_LLM_NAME=gpt-4o

OPENAI_API_KEY=
OPENAI_BASE_URL=https://api.openai.com/v1

BETA_ENABLE_ATOMIC_FILTER=true
BETA_LLM_SNIPPETS=true
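
The skeleton above is a plain KEY=VALUE file. As an illustration only (this is a minimal sketch, not the Gateway’s actual loader), such a file can be read into the process environment like so:

```python
import os

def load_env(path=".env"):
    """Minimal .env loader sketch: one KEY=VALUE per line; blank lines
    and '#' comments are skipped. Values already present in the
    environment take precedence over the file."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())
```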

Environment variables

QUALITY_LEVEL

The quality level of the generated content. Can be low or medium (we’re working on high).

CLASSIFIER_LLM_NAME

The name of the LLM used for classification. We recommend gpt-4o, but any model capable of structured output will work. On Azure OpenAI, this should be your deployment name.
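
To make the “structured output” requirement concrete, here is a hedged sketch of what a classification request body in the OpenAI Chat Completions shape might look like. The prompt and label scheme are hypothetical, not the Gateway’s actual internals; only the model name comes from CLASSIFIER_LLM_NAME:

```python
import os

def build_classification_request(claim: str) -> dict:
    """Sketch of a structured-output request body (OpenAI Chat
    Completions shape). Prompt and schema here are illustrative."""
    return {
        # Falls back to the recommended default if the variable is unset.
        "model": os.environ.get("CLASSIFIER_LLM_NAME", "gpt-4o"),
        "messages": [
            {"role": "system",
             "content": 'Reply with a JSON object like {"label": "..."}.'},
            {"role": "user", "content": claim},
        ],
        # This is why the model must be capable of structured output.
        "response_format": {"type": "json_object"},
    }
```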

OPENAI_API_KEY

Your OpenAI API key. Not necessary if you are using the SDK in local mode.

OPENAI_BASE_URL

The base URL for the OpenAI API. If you are using Azure OpenAI, set this to https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/. Like the API key, this is not necessary when using the SDK in local mode.
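
Putting the Azure OpenAI pieces together, the relevant lines of the .env file would look like the following (the resource and deployment names are placeholders you must replace):

CLASSIFIER_LLM_NAME=YOUR-DEPLOYMENT-NAME
OPENAI_API_KEY=YOUR-AZURE-API-KEY
OPENAI_BASE_URL=https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/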

BETA_ENABLE_ATOMIC_FILTER

Whether to enable the atomic filter, which judges only the information-heavy parts of a claim rather than the claim as a whole. true or false.

BETA_LLM_SNIPPETS

Whether to use an experimental, LLM-based evidence filter. true or false.