# Amazon Bedrock
Auto-instrument Amazon Bedrock for multi-model inference.
Risicare automatically instruments Amazon Bedrock via the boto3 SDK.
> **Bedrock Runtime only**
>
> Only `bedrock-runtime` operations are traced. Other AWS services (S3, DynamoDB, etc.) are not affected.
## Installation
```bash
pip install risicare boto3
```

## Auto-Instrumentation
```python
import json

import boto3
import risicare

risicare.init()

client = boto3.client("bedrock-runtime", region_name="us-east-1")

# Automatically traced
response = client.invoke_model(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    body=json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 1024,
        "messages": [{"role": "user", "content": "Hello!"}],
    }),
)
```

## Captured Attributes
| Attribute | Description |
|---|---|
| `gen_ai.system` | `bedrock` |
| `gen_ai.request.model` | Model ID, shortened (see Model ID Shortening below) |
| `gen_ai.request.bedrock.operation` | `Converse`, `ConverseStream`, `InvokeModel`, or `InvokeModelWithResponseStream` |
| `gen_ai.usage.prompt_tokens` | Input tokens (from `inputTokens`) |
| `gen_ai.usage.completion_tokens` | Output tokens (from `outputTokens`) |
| `gen_ai.usage.total_tokens` | Total tokens |
| `gen_ai.response.stop_reason` | Stop reason |
| `gen_ai.completion.tool_uses` | Number of tool-use blocks |
| `gen_ai.completion.tool_use.{j}.name` | Name of each tool use |
| `gen_ai.response.bedrock.latency_ms` | Bedrock-reported response latency |
| `gen_ai.latency_ms` | Request latency in milliseconds |
| `gen_ai.response.stream` | Whether the response was streamed |
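The token and stop-reason attributes above are taken from the `usage` and `stopReason` fields of the Bedrock Runtime response. A minimal sketch of that mapping for a Converse-shaped response, with illustrative stand-in values rather than a live call:

```python
# Shape of a Converse API response (field names per the Bedrock Runtime
# API); the values here are illustrative, not from a real invocation.
response = {
    "output": {"message": {"role": "assistant",
                           "content": [{"text": "Hello! How can I help?"}]}},
    "stopReason": "end_turn",
    "usage": {"inputTokens": 12, "outputTokens": 9, "totalTokens": 21},
    "metrics": {"latencyMs": 540},
}

# These fields feed gen_ai.usage.prompt_tokens, gen_ai.usage.completion_tokens,
# gen_ai.usage.total_tokens, and gen_ai.response.stop_reason respectively.
usage = response["usage"]
prompt_tokens = usage["inputTokens"]
completion_tokens = usage["outputTokens"]
total_tokens = usage["totalTokens"]
stop_reason = response["stopReason"]
```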
## Converse API
The newer Converse API is also supported:
```python
response = client.converse(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    messages=[
        {"role": "user", "content": [{"text": "Hello!"}]}
    ],
)
```

## Streaming
```python
response = client.invoke_model_with_response_stream(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    body=json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 1024,
        "messages": [{"role": "user", "content": "Write a story"}],
    }),
)

for event in response["body"]:
    chunk = json.loads(event["chunk"]["bytes"])
    if chunk["type"] == "content_block_delta":
        print(chunk["delta"]["text"], end="")
```

## Supported Models
| Provider | Model ID |
|---|---|
| Anthropic | `anthropic.claude-3-5-sonnet-20241022-v2:0` |
| Anthropic | `anthropic.claude-3-sonnet-20240229-v1:0` |
| Anthropic | `anthropic.claude-3-haiku-20240307-v1:0` |
| Meta | `meta.llama3-70b-instruct-v1:0` |
| Meta | `meta.llama3-8b-instruct-v1:0` |
| Mistral | `mistral.mistral-large-2407-v1:0` |
| Amazon | `amazon.titan-text-express-v1` |
| Cohere | `cohere.command-r-plus-v1:0` |
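One convenience of the Converse API is that the request shape stays the same across all of the providers listed above, so switching models is a one-line change. A small sketch using model IDs from the table (the helper function is ours for illustration, not part of Risicare or boto3):

```python
# Model IDs from the supported-models table; the request shape is
# identical for every provider under the Converse API.
MODELS = [
    "anthropic.claude-3-haiku-20240307-v1:0",
    "meta.llama3-8b-instruct-v1:0",
    "mistral.mistral-large-2407-v1:0",
]

def build_converse_request(model_id: str, prompt: str) -> dict:
    # Keyword arguments for client.converse(**request); inferenceConfig
    # holds the provider-agnostic knobs defined by the Converse API.
    return {
        "modelId": model_id,
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
        "inferenceConfig": {"maxTokens": 1024},
    }

requests = [build_converse_request(m, "Hello!") for m in MODELS]
```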
## Model ID Shortening
Risicare automatically shortens verbose Bedrock model IDs:
```text
anthropic.claude-3-sonnet-20240229-v1:0  →  claude-3-sonnet
meta.llama3-70b-instruct-v1:0            →  llama3-70b
```
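The exact shortening rules are internal to Risicare, but the mappings above can be approximated with a few string rules. A rough, hypothetical sketch — not the library's actual implementation:

```python
import re

def shorten_model_id(model_id: str) -> str:
    """Approximate Risicare-style shortening of a Bedrock model ID."""
    # Drop the provider prefix before the first dot:
    # "anthropic.claude-3-sonnet-20240229-v1:0" -> "claude-3-sonnet-20240229-v1:0"
    name = model_id.split(".", 1)[-1]
    # Strip the trailing version segment ("-v1" or "-v1:0").
    name = re.sub(r"-v\d+(:\d+)?$", "", name)
    # Strip a trailing 8-digit date segment ("-20240229").
    name = re.sub(r"-\d{8}$", "", name)
    # Strip a trailing "-instruct" variant marker.
    name = re.sub(r"-instruct$", "", name)
    return name
```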