Providers
Elyra supports subscription-based providers via OAuth, and API-key providers via environment variables or an auth file. The model list is updated with every release.
Subscriptions
Use `/login` in interactive mode, then select a provider. Use `/logout` to clear credentials. Tokens are stored in `~/.elyra/agent/auth.json` and auto-refresh when expired.
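After `/login`, the stored OAuth entry in `auth.json` looks roughly like the following. The field names here are illustrative only, not the exact schema; the file is managed automatically and should not need hand-editing:

```json
{
  "anthropic": {
    "type": "oauth",
    "access": "...",
    "refresh": "...",
    "expires": 1735689600
  }
}
```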
API Keys
Use `/login` in interactive mode and select a provider to store an API key in `auth.json`, or set credentials via an environment variable:

```shell
export ANTHROPIC_API_KEY=sk-ant-...
elyra
```
All Providers
| Provider | Environment Variable | auth.json Key |
|---|---|---|
| Anthropic | ANTHROPIC_API_KEY | anthropic |
| OpenAI | OPENAI_API_KEY | openai |
| Google Gemini | GEMINI_API_KEY | google |
| Azure OpenAI | AZURE_OPENAI_API_KEY | azure-openai-responses |
| DeepSeek | DEEPSEEK_API_KEY | deepseek |
| Mistral | MISTRAL_API_KEY | mistral |
| Groq | GROQ_API_KEY | groq |
| Cerebras | CEREBRAS_API_KEY | cerebras |
| xAI | XAI_API_KEY | xai |
| OpenRouter | OPENROUTER_API_KEY | openrouter |
| Hugging Face | HF_TOKEN | huggingface |
| Fireworks | FIREWORKS_API_KEY | fireworks |
| Together AI | TOGETHER_API_KEY | together |
| Cloudflare AI Gateway | CLOUDFLARE_API_KEY + CLOUDFLARE_ACCOUNT_ID + CLOUDFLARE_GATEWAY_ID | cloudflare-ai-gateway |
| Cloudflare Workers AI | CLOUDFLARE_API_KEY + CLOUDFLARE_ACCOUNT_ID | cloudflare-workers-ai |
| Vercel AI Gateway | AI_GATEWAY_API_KEY | vercel-ai-gateway |
| ZAI | ZAI_API_KEY | zai |
| OpenCode Zen | OPENCODE_API_KEY | opencode |
| OpenCode Go | OPENCODE_API_KEY | opencode-go |
| Kimi For Coding | KIMI_API_KEY | kimi-coding |
| MiniMax | MINIMAX_API_KEY | minimax |
| MiniMax (China) | MINIMAX_CN_API_KEY | minimax-cn |
| Xiaomi MiMo | XIAOMI_API_KEY | xiaomi |
| Xiaomi Token Plan (China) | XIAOMI_TOKEN_PLAN_CN_API_KEY | xiaomi-token-plan-cn |
| Xiaomi Token Plan (Amsterdam) | XIAOMI_TOKEN_PLAN_AMS_API_KEY | xiaomi-token-plan-ams |
| Xiaomi Token Plan (Singapore) | XIAOMI_TOKEN_PLAN_SGP_API_KEY | xiaomi-token-plan-sgp |
Auth File
Store credentials in `~/.elyra/agent/auth.json`. The file is created with `0600` permissions (user read/write only). Auth file credentials take priority over environment variables.

```json
{
  "anthropic": { "type": "api_key", "key": "sk-ant-..." },
  "openai": { "type": "api_key", "key": "sk-..." },
  "deepseek": { "type": "api_key", "key": "sk-..." },
  "google": { "type": "api_key", "key": "..." },
  "together": { "type": "api_key", "key": "..." }
}
```
Key Resolution
The `key` field in `auth.json` supports three formats:

Shell command

Prefix with `!` to execute a command and use its stdout (cached for the lifetime of the process):

```json
{ "type": "api_key", "key": "!security find-generic-password -ws 'anthropic'" }
{ "type": "api_key", "key": "!op read 'op://vault/item/credential'" }
```

Environment variable

Uses the value of the named variable:

```json
{ "type": "api_key", "key": "MY_ANTHROPIC_KEY" }
```

Literal value

Used directly:

```json
{ "type": "api_key", "key": "sk-ant-..." }
```

OAuth credentials are also stored in `auth.json` after `/login` and are managed automatically.
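The three formats above amount to a first-match check on the key's shape. The following is an illustrative Python model of that behavior, not Elyra's implementation (the per-process caching of shell-command output is elided):

```python
import os
import subprocess

def resolve_key(key: str) -> str:
    """Illustrative model of auth.json 'key' resolution (not Elyra's actual code)."""
    if key.startswith("!"):
        # Shell command: execute it and use its trimmed stdout.
        result = subprocess.run(
            key[1:], shell=True, capture_output=True, text=True, check=True
        )
        return result.stdout.strip()
    if key in os.environ:
        # Name of an environment variable: use its value.
        return os.environ[key]
    # Otherwise the value is used directly as a literal API key.
    return key

os.environ["MY_ANTHROPIC_KEY"] = "sk-ant-from-env"
print(resolve_key("!echo sk-from-command"))  # shell command
print(resolve_key("MY_ANTHROPIC_KEY"))       # environment variable
print(resolve_key("sk-ant-literal"))         # literal value
```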
Azure OpenAI
```shell
export AZURE_OPENAI_API_KEY=...
export AZURE_OPENAI_BASE_URL=https://your-resource.openai.azure.com
# Or use the resource name instead of a base URL
export AZURE_OPENAI_RESOURCE_NAME=your-resource
# Optional
export AZURE_OPENAI_API_VERSION=2024-02-01
export AZURE_OPENAI_DEPLOYMENT_NAME_MAP=gpt-4=my-gpt4,gpt-4o=my-gpt4o
```

Root endpoints are auto-normalized to `/openai/v1`. Both `.openai.azure.com` and `.cognitiveservices.azure.com` endpoints are supported.
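With credentials configured, a model can be selected at launch. This invocation is a hedged example: the provider id `azure-openai-responses` comes from the provider table above, and `gpt-4o` assumes a deployment of that name or an entry in the deployment name map:

```shell
elyra --provider azure-openai-responses --model gpt-4o
```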
Amazon Bedrock
```shell
# Option 1: AWS profile
export AWS_PROFILE=your-profile
# Option 2: IAM keys
export AWS_ACCESS_KEY_ID=AKIA...
export AWS_SECRET_ACCESS_KEY=...
# Option 3: Bearer token
export AWS_BEARER_TOKEN_BEDROCK=...
# Optional region (defaults to us-east-1)
export AWS_REGION=us-west-2
```

ECS task roles (`AWS_CONTAINER_CREDENTIALS_*`) and IRSA (`AWS_WEB_IDENTITY_TOKEN_FILE`) are also supported.

```shell
elyra --provider amazon-bedrock --model us.anthropic.claude-sonnet-4-20250514-v1:0
```

Prompt caching is enabled automatically for Claude models. For application inference profiles, set `AWS_BEDROCK_FORCE_CACHE=1` to enable cache points.
Bedrock Proxy
```shell
# Set the URL for the Bedrock proxy
export AWS_ENDPOINT_URL_BEDROCK_RUNTIME=https://my.corp.proxy/bedrock
# Set if your proxy does not require authentication
export AWS_BEDROCK_SKIP_AUTH=1
# Set if your proxy only supports HTTP/1.1
export AWS_BEDROCK_FORCE_HTTP1=1
```
Cloudflare AI Gateway
```shell
export CLOUDFLARE_API_KEY=...    # or use /login
export CLOUDFLARE_ACCOUNT_ID=...
export CLOUDFLARE_GATEWAY_ID=... # create at dash.cloudflare.com
elyra --provider cloudflare-ai-gateway --model "claude-sonnet-4-5"
```

Routes to OpenAI, Anthropic, and Workers AI through Cloudflare AI Gateway. Authentication uses `CLOUDFLARE_API_KEY` as the `cf-aig-authorization` header.

| Mode | Request Auth | Upstream Auth |
|---|---|---|
| Workers AI | Cloudflare token only | Cloudflare-native |
| Unified billing | Cloudflare token only | Cloudflare handles upstream |
| Stored BYOK | Cloudflare token only | Cloudflare injects stored keys |
| Inline BYOK | Cloudflare token + upstream header | Request supplies provider key |
Cloudflare Workers AI
```shell
export CLOUDFLARE_API_KEY=...    # or use /login
export CLOUDFLARE_ACCOUNT_ID=...
elyra --provider cloudflare-workers-ai --model "@cf/moonshotai/kimi-k2.6"
```

Elyra automatically sets `x-session-affinity` for prefix-caching discounts.
Google Vertex AI
Uses Application Default Credentials:

```shell
gcloud auth application-default login
export GOOGLE_CLOUD_PROJECT=your-project
export GOOGLE_CLOUD_LOCATION=us-central1
```

Or set `GOOGLE_APPLICATION_CREDENTIALS` to a service account key file.
Custom Providers
Via models.json: Add Ollama, LM Studio, vLLM, or any provider that speaks a supported API (OpenAI Completions, OpenAI Responses, Anthropic Messages, Google Generative AI).
Via extensions: For providers that need custom API implementations or OAuth flows, create an extension. See the Extensions documentation.
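As a rough illustration of the `models.json` route, a local Ollama endpoint speaking the OpenAI Completions API might be declared along these lines. The field names below are hypothetical; consult the `models.json` documentation for the real schema:

```json
{
  "providers": {
    "ollama": {
      "api": "openai-completions",
      "baseUrl": "http://localhost:11434/v1",
      "models": ["llama3.1"]
    }
  }
}
```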
Resolution Order
When resolving credentials for a provider:

1. CLI `--api-key` flag
2. `auth.json` entry (API key or OAuth token)
3. Environment variable
4. Custom provider keys from `models.json`
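The order above can be modeled as a first-match-wins chain. This is an illustrative sketch, not Elyra's implementation, and the `apiKey` field name for `models.json` entries is hypothetical:

```python
# Illustrative first-match-wins credential resolution (not Elyra's actual code).
ENV_VARS = {"anthropic": "ANTHROPIC_API_KEY", "openai": "OPENAI_API_KEY"}

def resolve_credential(provider, cli_key=None, auth_json=None, env=None, models_json=None):
    auth_json, env, models_json = auth_json or {}, env or {}, models_json or {}
    if cli_key:                                # 1. CLI --api-key flag
        return cli_key
    if provider in auth_json:                  # 2. auth.json entry
        return auth_json[provider].get("key")
    var = ENV_VARS.get(provider)
    if var and var in env:                     # 3. environment variable
        return env[var]
    # 4. custom provider key from models.json ("apiKey" is a hypothetical field)
    return models_json.get(provider, {}).get("apiKey")

# auth.json beats the environment, matching the Auth File section above:
print(resolve_credential(
    "anthropic",
    auth_json={"anthropic": {"type": "api_key", "key": "sk-from-auth-json"}},
    env={"ANTHROPIC_API_KEY": "sk-from-env"},
))
```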