Claude (Anthropic) Compatibility Guide
Important Note: We provide API services compatible with the Anthropic standard, so we strongly recommend referring directly to the official Anthropic API documentation for the most comprehensive and up-to-date parameter details and examples. The following is an overview of the simplified interface, focusing on core fields and usage.
Overview
The UModelverse platform provides a Messages interface compatible with the Anthropic Claude API, allowing developers to invoke models on UModelverse directly with the Anthropic SDK or other supported tools.
- Request Method: POST
- Endpoint: `https://api.umodelverse.ai/v1/messages`
- Authentication: pass your API key via the `x-api-key: {api_key}` header or `Authorization: Bearer {api_key}`
Note: The `/v1/messages` endpoint only supports Claude-series models. For other models (such as GPT, Gemini, DeepSeek, etc.), use the OpenAI-compatible `/v1/chat/completions` endpoint. Although both the Claude and OpenAI interfaces use `/v1/models` to obtain the model list, their call interfaces differ.
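As an illustration of this routing rule, a small helper (hypothetical, not part of any SDK) might look like the sketch below; matching on the `claude` prefix is a simplification for demonstration purposes.

```python
# Hypothetical helper illustrating the routing rule above: Claude-series
# models use /v1/messages, everything else uses the OpenAI-compatible
# /v1/chat/completions endpoint. (Prefix matching is a simplification.)
BASE_URL = "https://api.umodelverse.ai"

def endpoint_for(model: str) -> str:
    if model.startswith("claude"):
        return f"{BASE_URL}/v1/messages"
    return f"{BASE_URL}/v1/chat/completions"

print(endpoint_for("claude-sonnet-4-5-20250929"))  # .../v1/messages
print(endpoint_for("deepseek-v3"))                 # .../v1/chat/completions
```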
Core Fields
Request Fields
| Field | Type | Required | Default | Description |
|---|---|---|---|---|
| model | string | Yes | None | Model ID, such as claude-sonnet-4-5-20250929. Specifies the model for generating responses. |
| messages | array | Yes | None | List of conversation messages. Each message includes role (user/assistant) and content (text/image). |
| max_tokens | integer | Yes | None | Maximum number of tokens to generate. Controls the response length to prevent overly lengthy outputs. |
| system | string/array | No | None | System prompts. Defines model behavior and roles. |
| temperature | number | No | 1.0 | Sampling temperature (0 to 1). Controls randomness, with higher values being more creative and lower values being more deterministic. |
| top_p | number | No | None | Nucleus sampling (0 to 1). Controls diversity. |
| top_k | integer | No | None | Top-K sampling. Samples only from the top K tokens with the highest probabilities. |
| stop_sequences | array | No | None | List of stop sequences. Stops generation upon reaching these sequences. |
| stream | boolean | No | false | Whether to stream responses. Returns content incrementally in real-time. |
| metadata | object | No | None | Metadata object, which can include user_id for tracking. |
| tools | array | No | None | List of available tools. Enables function calling features. |
| tool_choice | object | No | auto | Tool selection strategy. For example, {"type": "auto"} lets the model decide to invoke tools. |
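Putting these fields together, a request body might look like the sketch below; the values are illustrative, not recommendations.

```python
# A sketch of a request body combining the core fields above.
payload = {
    "model": "claude-sonnet-4-5-20250929",
    "max_tokens": 1024,                        # required: caps response length
    "system": "You are a concise assistant.",  # optional system prompt
    "temperature": 0.7,                        # 0 to 1; higher = more creative
    "stop_sequences": ["END"],                 # stop generation at these strings
    "messages": [
        {"role": "user", "content": "Explain nucleus sampling in one sentence."}
    ],
}

# model, messages, and max_tokens are the only required fields.
required = {"model", "messages", "max_tokens"}
assert required <= payload.keys()
```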
Response Fields
| Field | Type | Description |
|---|---|---|
| id | string | Response ID. Uniquely identifies this completion. |
| type | string | Object type: message. |
| role | string | Role: assistant. |
| content | array | List of content blocks. Each block contains type and corresponding content (such as text). |
| model | string | The model ID used. |
| stop_reason | string | Stop reason: end_turn, max_tokens, stop_sequence, tool_use. |
| stop_sequence | string | Sequence that triggered the stop (if applicable). |
| usage | object | Usage statistics. Includes input_tokens and output_tokens. |
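A hand-written example response (illustrative values, not captured API output) shows how these fields fit together and how to extract the generated text and token usage:

```python
# Illustrative response body; the id and token counts are placeholders.
response = {
    "id": "msg_example_0123",
    "type": "message",
    "role": "assistant",
    "model": "claude-sonnet-4-5-20250929",
    "content": [{"type": "text", "text": "Hello! How can I help?"}],
    "stop_reason": "end_turn",
    "stop_sequence": None,
    "usage": {"input_tokens": 12, "output_tokens": 8},
}

# Concatenate the text blocks and total up token usage.
text = "".join(b["text"] for b in response["content"] if b["type"] == "text")
total = response["usage"]["input_tokens"] + response["usage"]["output_tokens"]
print(text)
print(total)  # 20
```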
Quick Start
Install Anthropic SDK
```shell
pip install anthropic
```

Sample Code

**Python**

```python
import anthropic

client = anthropic.Anthropic(
    api_key="<MODELVERSE_API_KEY>",
    base_url="https://api.umodelverse.ai"
)

message = client.messages.create(
    model="claude-sonnet-4-5-20250929",
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Hello, how are you?"}
    ]
)

print(message.content[0].text)
```

**Python Streaming**
```python
import anthropic

client = anthropic.Anthropic(
    api_key="<MODELVERSE_API_KEY>",
    base_url="https://api.umodelverse.ai"
)

with client.messages.stream(
    model="claude-sonnet-4-5-20250929",
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Write a short story about AI."}
    ]
) as stream:
    for text in stream.text_stream:
        print(text, end="", flush=True)
```

**curl**
```shell
curl https://api.umodelverse.ai/v1/messages \
  -H "Content-Type: application/json" \
  -H "x-api-key: $MODELVERSE_API_KEY" \
  -H "anthropic-version: 2023-06-01" \
  -d '{
    "model": "claude-sonnet-4-5-20250929",
    "max_tokens": 1024,
    "messages": [
      {"role": "user", "content": "Hello, how are you?"}
    ]
  }'
```

**curl Streaming**
```shell
curl https://api.umodelverse.ai/v1/messages \
  -H "Content-Type: application/json" \
  -H "x-api-key: $MODELVERSE_API_KEY" \
  -H "anthropic-version: 2023-06-01" \
  -d '{
    "model": "claude-sonnet-4-5-20250929",
    "max_tokens": 1024,
    "stream": true,
    "messages": [
      {"role": "user", "content": "Hello, how are you?"}
    ]
  }'
```

Multimodal Support
Claude-compatible interfaces support image input, delivered either as a URL or as base64-encoded data:
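The two `source` shapes can be sketched as plain content blocks before looking at the full examples; the URL and data values here are placeholders.

```python
# The two image "source" shapes accepted in a content block.
url_image = {
    "type": "image",
    "source": {"type": "url", "url": "https://example.com/photo.jpg"},
}
base64_image = {
    "type": "image",
    "source": {"type": "base64", "media_type": "image/jpeg", "data": "<base64 data>"},
}

# Either block is combined with a text block in the message content list.
content = [url_image, {"type": "text", "text": "What's in this image?"}]
print(content[0]["source"]["type"])  # url
```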
Using URL
```python
import anthropic

client = anthropic.Anthropic(
    api_key="<MODELVERSE_API_KEY>",
    base_url="https://api.umodelverse.ai"
)

# Pass the image by URL
message = client.messages.create(
    model="claude-sonnet-4-5-20250929",
    max_tokens=1024,
    messages=[
        {
            "role": "user",
            "content": [
                {
                    "type": "image",
                    "source": {
                        "type": "url",
                        "url": "https://umodelverse-inference.cn-wlcb.ufileos.com/ucloud-maxcot.jpg"
                    }
                },
                {
                    "type": "text",
                    "text": "What's in this image?"
                }
            ]
        }
    ]
)

print(message.content[0].text)
```

Using base64 Encoding
```python
import anthropic
import base64

client = anthropic.Anthropic(
    api_key="<MODELVERSE_API_KEY>",
    base_url="https://api.umodelverse.ai"
)

# Encode the image as base64
with open("image.png", "rb") as f:
    image_data = base64.standard_b64encode(f.read()).decode("utf-8")

message = client.messages.create(
    model="claude-sonnet-4-5-20250929",
    max_tokens=1024,
    messages=[
        {
            "role": "user",
            "content": [
                {
                    "type": "image",
                    "source": {
                        "type": "base64",
                        "media_type": "image/png",
                        "data": image_data
                    }
                },
                {
                    "type": "text",
                    "text": "What's in this image?"
                }
            ]
        }
    ]
)

print(message.content[0].text)
```

Tool Calling (Function Calling)
```python
import anthropic

client = anthropic.Anthropic(
    api_key="<MODELVERSE_API_KEY>",
    base_url="https://api.umodelverse.ai"
)

tools = [
    {
        "name": "get_weather",
        "description": "Get the current weather in a given location",
        "input_schema": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The city and state, e.g. San Francisco, CA"
                }
            },
            "required": ["location"]
        }
    }
]

message = client.messages.create(
    model="claude-sonnet-4-5-20250929",
    max_tokens=1024,
    tools=tools,
    messages=[
        {"role": "user", "content": "What's the weather like in Beijing?"}
    ]
)

print(message.content)

# When stop_reason is "tool_use", the content list contains a tool_use block
# carrying the name and input of the requested call.
for block in message.content:
    if block.type == "tool_use":
        print(f"Tool requested: {block.name}, input: {block.input}")
```

Model List
For more supported Claude-compatible models, please see [Get Model List] or visit the UModelverse Model Center.
For more details on these fields, see the official Anthropic documentation.