OpenAI direct
OpenAI direct is a special variant of the OpenAI adapter, designed for interacting with proxied GPT services without restrictions imposed by Caila.
- The OpenAI adapter lets you interact with proxy services through a single standard API.
- OpenAI direct lets you interact with proxy services through their own APIs, while Caila only handles billing.
API
The API is available at: https://caila.io/api/adapters/openai-direct
Supported methods:
- chat-completion
To access Caila services via OpenAI direct:
- In the Authorization header, specify the API key created in Caila.
- In the model field of the request, specify the model ID in the format:
<author>/<service>[/model]
- author is the first part of the model identifier: the account name of the service owner (not your own account, but that of the user who published the service). For example, just-ai.
- service is the name of the service in Caila. For example, openai-proxy.
- model is an optional part that defines the value of the model field sent in the request to the service. For example, gpt-4o.
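The model identifier can also be assembled programmatically. A minimal stdlib sketch (the helper name `build_model_id` is illustrative, not part of any Caila SDK):

```python
def build_model_id(author, service, model=None):
    """Assemble a Caila model ID in the form <author>/<service>[/model]."""
    parts = [author, service]
    if model:
        parts.append(model)  # the optional third segment
    return "/".join(parts)

# Examples from this page:
print(build_model_id("just-ai", "openai-proxy", "gpt-4o"))  # just-ai/openai-proxy/gpt-4o
print(build_model_id("just-ai", "claude"))                  # just-ai/claude
```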
Request examples
ChatGPT
curl https://caila.io/api/adapters/openai-direct/chat/completions \
-H 'Authorization: <key from Caila>' \
-H 'Content-Type: application/json' \
-d '{"model":"just-ai/openai-proxy/gpt-3.5-turbo","messages":[{"role":"user","content":"Write a text in 20 words"}],"stream":true}'
Claude
curl https://caila.io/api/adapters/openai-direct/chat/completions \
-H 'Authorization: <key from Caila>' \
-H 'Content-Type: application/json' \
-d '{"model":"just-ai/claude/claude-3-5-sonnet-20240620","max_tokens":1024,"messages":[{"role":"user","content":[{"type":"text","text":"Write a text in 20 words"}]}],"stream":true}'
Please note:
- the max_tokens field is required.
- the content field changes format: the Claude service does not support a plain string value in direct mode.
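Both constraints can be enforced when assembling the request body. A stdlib sketch (the helper name is illustrative):

```python
import json

def claude_direct_body(model, prompt, max_tokens=1024, stream=True):
    """Build a chat-completion body for Claude in direct mode:
    max_tokens is required, and content must be a list of blocks, not a string."""
    return {
        "model": model,
        "max_tokens": max_tokens,
        "messages": [
            {"role": "user", "content": [{"type": "text", "text": prompt}]}
        ],
        "stream": stream,
    }

body = claude_direct_body("just-ai/claude/claude-3-5-sonnet-20240620",
                          "Write a text in 20 words")
print(json.dumps(body))
```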
Technical details
OpenAI direct uses the same services as the OpenAI adapter, but requests are sent to the service with a different data type:
- https://caila.io/specs/mlp-data-gpt.yml#/ChatCompletionDirectProxy for the openai-proxy service
- https://caila.io/specs/mlp-data-common.yml#/JsonObject for all other services.
A similar “direct” request can also be sent through the Predict API, for example:
export IMAGE=$(base64 -w 0 cat.jpg)
curl -X 'POST' \
'https://caila.io/api/mlpgate/account/just-ai/model/claude/predict-with-config' \
-H 'accept: application/json' \
-H 'Content-Type: application/json' \
-H "MLP-API-KEY: ${MLP_API_KEY}" \
-d '{"config":{},"data":{"model":"claude-3-5-sonnet-20240620","max_tokens":1024,"messages":[{"role":"user","content":[{"type":"image","source":{"type":"base64","media_type":"image/jpeg","data":"'"$IMAGE"'"}},{"type":"text","text":"What is depicted in the picture?"}]}]},"dataType":"https://caila.io/specs/mlp-data-common.yml#/JsonObject"}'