# Supported Model Providers

OpenResponses supports multiple model providers through the `provider@model_name` convention. This page lists each supported provider and provides examples of how to use them with curl commands.
| Provider | API Endpoint |
|---|---|
| openai | https://api.openai.com/v1 |
| claude | https://api.anthropic.com/v1 |
| anthropic | https://api.anthropic.com/v1 |
| groq | https://api.groq.com/openai/v1 |
| togetherai | https://api.together.xyz/v1 |
| gemini | https://generativelanguage.googleapis.com/v1beta/openai/ |
| google | https://generativelanguage.googleapis.com/v1beta/openai/ |
| deepseek | https://api.deepseek.com |
| ollama | http://localhost:11434/v1 |
| xai | https://api.x.ai/v1 |
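A request using the `provider@model_name` convention might look like the following. This is a minimal sketch, not taken from this page: the OpenResponses base URL (`http://localhost:8080`), the `/v1/responses` path, the Authorization header carrying the provider's API key, and the Groq model name are all assumptions to be replaced with the values for your deployment.

```shell
# Hypothetical example: call OpenResponses with a Groq-hosted model.
# Base URL, path, auth header, and model name are assumptions; adjust for your setup.
curl http://localhost:8080/v1/responses \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $GROQ_API_KEY" \
  -d '{
    "model": "groq@llama-3.3-70b-versatile",
    "input": "Hello, world!"
  }'
```

Because `groq` appears in the provider table, OpenResponses can resolve the prefix to `https://api.groq.com/openai/v1` and forward the request there.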
You can also use the `model_endpoint@model_name` convention by directly specifying the full API endpoint URL.

Model names follow the format `provider@model-name` or `model_endpoint@model_name`, where:

- `provider`: The model provider (e.g., claude, deepseek, google, openai)
- `model_endpoint`: For locally deployed models or any custom model provider, the endpoint URL where `chat/completions` is available
- `model-name`: The specific model to use from that provider or endpoint
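As a sketch of the `model_endpoint@model_name` form, the request below targets a local Ollama server by its full endpoint URL instead of a provider name. The OpenResponses base URL, the `/v1/responses` path, and the model name are illustrative assumptions; substitute your own endpoint and model.

```shell
# Hypothetical example: route to a locally deployed model by full endpoint URL.
# Base URL, path, and model name are assumptions; adjust for your setup.
curl http://localhost:8080/v1/responses \
  -H "Content-Type: application/json" \
  -d '{
    "model": "http://localhost:11434/v1@qwen2.5:7b",
    "input": "Hello, world!"
  }'
```

Here everything before the `@` is treated as the endpoint serving `chat/completions`, and everything after it as the model name, so no entry in the provider table is needed.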