# Add AWS Bedrock to PraisonAI
AWS Bedrock provides access to high-performing foundation
models from leading AI companies like Anthropic, Cohere, Meta,
Stability AI, and Amazon through a single API.
## Prerequisites

Make sure you have AWS credentials configured:
### Environment Variables

Set up your AWS credentials:

```bash
export AWS_ACCESS_KEY_ID=your_access_key_id
export AWS_SECRET_ACCESS_KEY=your_secret_access_key
export AWS_REGION=us-east-1
```
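Before running agents, a quick sanity check can confirm the variables are actually set. The helper below is illustrative and not part of PraisonAI:

```python
import os

REQUIRED_VARS = ["AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY", "AWS_REGION"]

def missing_aws_vars(env=os.environ):
    """Return the names of required AWS variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]

if __name__ == "__main__":
    missing = missing_aws_vars()
    if missing:
        print("Missing AWS credentials:", ", ".join(missing))
    else:
        print("AWS credentials configured.")
```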
## Using AWS Bedrock Models

### Available Models

AWS Bedrock supports various model providers:
- **Anthropic Claude**: `bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0`
- **Anthropic Claude Instant**: `bedrock/anthropic.claude-instant-v1`
- **Amazon Titan**: `bedrock/amazon.titan-text-express-v1`
- **Cohere Command**: `bedrock/cohere.command-text-v14`
- **Meta Llama**: `bedrock/meta.llama2-70b-chat-v1`
### agents.yaml Configuration

```yaml
framework: crewai
topic: create movie script about cat in mars
roles:
  researcher:
    backstory: Skilled in finding and organizing information, with a focus on research efficiency.
    goal: Gather information about Mars and cats
    role: Researcher
    llm:
      model: "bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0"
      temperature: 0.7
    tasks:
      gather_research:
        description: Research and gather information about Mars, its environment, and cats, including their behavior and characteristics.
        expected_output: Document with research findings, including interesting facts and information.
    tools:
      - ''
```
### Python Code Example

```python
from praisonaiagents import Agent

# Using Anthropic Claude via Bedrock
agent = Agent(
    instructions="You are a helpful assistant",
    llm={
        "model": "bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0",
        "temperature": 0.7
    }
)

# Using Amazon Titan via Bedrock
titan_agent = Agent(
    instructions="You are a helpful assistant",
    llm={
        "model": "bedrock/amazon.titan-text-express-v1",
        "temperature": 0.7
    }
)

response = agent.ask("What is artificial intelligence?")
print(response)
```
## IAM Permissions

Ensure your AWS IAM user/role has the necessary permissions to access Bedrock:

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeModel",
                "bedrock:InvokeModelWithResponseStream"
            ],
            "Resource": "*"
        }
    ]
}
```
## Regional Availability

AWS Bedrock is available in regions including:

- `us-east-1` (N. Virginia)
- `us-west-2` (Oregon)
- `ap-southeast-1` (Singapore)
- `ap-northeast-1` (Tokyo)
- `eu-central-1` (Frankfurt)
- `eu-west-3` (Paris)

Make sure to set your `AWS_REGION` environment variable to a supported region; note that model availability varies by region.
## Cost Optimization

AWS Bedrock charges are based on:

- **Input tokens**: Text sent to the model
- **Output tokens**: Text generated by the model

Consider using smaller models for development and testing to optimize costs.
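The token-based pricing above can be turned into a simple cost estimate. The helper below is a sketch; the prices in the example are placeholders, not real Bedrock rates — always take per-1,000-token prices from the current AWS Bedrock pricing page:

```python
def estimate_cost(input_tokens, output_tokens, price_in_per_1k, price_out_per_1k):
    """Estimate a single invocation's cost in USD.

    Prices are per 1,000 tokens, taken from the AWS Bedrock pricing page.
    """
    return (input_tokens / 1000) * price_in_per_1k + (output_tokens / 1000) * price_out_per_1k

# Placeholder prices for illustration only (NOT real Bedrock rates):
cost = estimate_cost(2000, 500, price_in_per_1k=0.003, price_out_per_1k=0.015)
print(f"Estimated cost: ${cost:.4f}")  # 2.0 * 0.003 + 0.5 * 0.015 = 0.0135
```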
## Error Handling

Common errors and solutions:

- **AccessDeniedException**: Check your IAM permissions
- **ResourceNotFoundException**: Verify the model ID is correct and available in your region
- **ThrottlingException**: Implement retry logic with exponential backoff
- **ValidationException**: Check your input parameters and format
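For `ThrottlingException`, the retry-with-exponential-backoff advice above can be sketched as a generic wrapper. The wrapper itself is illustrative (not a PraisonAI API); pass it whatever callable invokes the model and the exception types you consider retryable:

```python
import random
import time

def with_backoff(fn, max_retries=5, base_delay=1.0, retry_on=(Exception,)):
    """Call fn(), retrying on the given exceptions with exponential backoff."""
    for attempt in range(max_retries):
        try:
            return fn()
        except retry_on:
            if attempt == max_retries - 1:
                raise  # out of retries, surface the error
            # Wait base_delay * 2^attempt, plus jitter to spread out retries.
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.1))

# Usage sketch: retry only on throttling-style errors, e.g.
# response = with_backoff(lambda: agent.ask("..."), retry_on=(ThrottlingError,))
```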
## Advanced Configuration

### Custom Endpoint

For specific regions or custom endpoints:

```python
agent = Agent(
    instructions="You are a helpful assistant",
    llm={
        "model": "bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0",
        "aws_region": "us-west-2",
        "temperature": 0.7
    }
)
```
### Streaming Responses

For real-time responses:

```python
agent = Agent(
    instructions="You are a helpful assistant",
    llm={
        "model": "bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0",
        "stream": True
    }
)
```