https://github.com/aws-samples/sample-agentic-cost-optimizer

# Sample Agentic Cost Optimizer

> [!NOTE]
> This is a sample project and a work in progress. Use it as a reference for building AI agents with Amazon Bedrock AgentCore.

An AI agent that analyzes your AWS Lambda functions and generates cost optimization reports. It discovers AWS Lambda resources, collects metrics, identifies savings opportunities, and produces detailed recommendations.

## What It Does

The agent runs a two-phase workflow:

1. **Analysis Phase**: Discovers AWS Lambda functions, collects Amazon CloudWatch metrics and logs, analyzes memory usage and invocation patterns, identifies optimization opportunities
2. **Report Phase**: Generates a cost optimization report with specific recommendations, estimated savings, implementation steps, and supporting evidence
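
The two-phase hand-off can be sketched in plain Python. Everything below is illustrative (the function names, findings, and report format are not the project's actual code); in the real agent the analysis phase queries AWS APIs and the report phase writes to Amazon S3:

```python
from dataclasses import dataclass, field


@dataclass
class AnalysisResult:
    """Findings produced by the analysis phase (illustrative schema)."""
    functions: list = field(default_factory=list)
    opportunities: list = field(default_factory=list)


def analysis_phase() -> AnalysisResult:
    # The real agent discovers Lambda functions and pulls CloudWatch
    # metrics here; this sketch returns canned findings.
    result = AnalysisResult()
    result.functions = ["img-resize", "etl-nightly"]
    result.opportunities = [
        {"function": "img-resize", "action": "reduce memory 1024MB -> 512MB"},
    ]
    return result


def report_phase(analysis: AnalysisResult) -> str:
    # The real report phase saves to S3; this sketch renders plain text.
    lines = ["# Cost Optimization Report"]
    for opp in analysis.opportunities:
        lines.append(f"- {opp['function']}: {opp['action']}")
    return "\n".join(lines)


report = report_phase(analysis_phase())
print(report)
```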

Results are saved to Amazon Simple Storage Service (Amazon S3) and tracked in Amazon DynamoDB for audit trails.
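
For the Amazon DynamoDB audit trail, records typically carry a TTL attribute so old events expire automatically (see `TTL_DAYS` under Configuration). A minimal sketch of building such an item; the key and attribute names are hypothetical, not this project's actual schema:

```python
import time
import uuid


def build_journal_item(event_type: str, detail: str, ttl_days: int = 30) -> dict:
    """Build a DynamoDB event-journal item with an expiry timestamp."""
    now = int(time.time())
    return {
        "pk": f"event#{uuid.uuid4()}",  # partition key (illustrative schema)
        "event_type": event_type,
        "detail": detail,
        "created_at": now,
        # DynamoDB TTL expects an epoch-seconds number attribute
        "ttl": now + ttl_days * 24 * 60 * 60,
    }


item = build_journal_item("workflow_started", "analysis phase begun")
print(item["ttl"] - item["created_at"])  # 2592000 seconds = 30 days
```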

## How It Works

- **Agent Runtime**: Python agent using [Strands Agents](https://strandsagents.com/latest/) and Amazon Bedrock models
- **Orchestration**: Step Functions workflow triggered by EventBridge (scheduled or manual)
- **Storage**: Amazon S3 for reports, Amazon DynamoDB for event journaling
- **Infrastructure**: AWS CDK (TypeScript) for deployment

## Architecture

![Architecture Diagram](docs/architecture_diagram.png)

The diagram shows the agent workflow with 4 tools:

- **use_aws tool**: Collects cost data, AWS Lambda logs, and AWS Lambda configurations from your AWS account
- **calculator tool**: Performs time range calculations and cost computations
- **storage tool**: Saves analysis results and final reports to Amazon S3
- **journal tool**: Records workflow events to Amazon DynamoDB for audit trails

The **Analysis Agent** uses all 4 tools to discover resources and analyze costs. The **Report Agent** uses storage and journal tools to generate reports. **Amazon Bedrock** provides the AI model powering both agents.
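
To illustrate the kind of arithmetic the calculator tool performs: Lambda compute cost is billed per GB-second, so right-sizing memory scales cost roughly linearly, assuming duration stays the same (in practice, lowering memory can lengthen duration). The price below is the approximate public x86 rate and the workload numbers are made up:

```python
PRICE_PER_GB_SECOND = 0.0000166667  # approximate x86 on-demand rate


def monthly_compute_cost(memory_mb: int, avg_duration_ms: float,
                         invocations_per_month: int) -> float:
    """Estimate monthly Lambda compute cost (excludes request charges)."""
    gb_seconds = (memory_mb / 1024) * (avg_duration_ms / 1000) * invocations_per_month
    return gb_seconds * PRICE_PER_GB_SECOND


before = monthly_compute_cost(1024, 200, 5_000_000)
after = monthly_compute_cost(512, 200, 5_000_000)
print(f"before=${before:.2f} after=${after:.2f} savings=${before - after:.2f}")
```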

## Prerequisites

To deploy the agent, you need the following tools installed:

- **Python 3.12+** - [Download Python](https://www.python.org/downloads/)
- **uv** - Python package manager - [Installation guide](https://docs.astral.sh/uv/getting-started/installation/)
- **Node.js 20+** - Required for AWS CDK - [Download Node.js](https://nodejs.org/)
- **AWS CLI** - [Installation guide](https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html)

Configure AWS credentials:

```bash
aws configure
```

## Quick Start

1. **Install dependencies**:
```bash
make setup
```

2. **Deploy to AWS** (first time only):
```bash
make cdk-bootstrap
```

3. **Enable AgentCore observability** (one-time setup):

To view metrics, spans, and traces generated by the AgentCore service, you need to enable Amazon CloudWatch Transaction Search. Follow the [observability configuration guide](https://docs.aws.amazon.com/bedrock-agentcore/latest/devguide/observability-configure.html) to complete this setup.

4. **Deploy the stack**:
```bash
make cdk-deploy
```

5. **Trigger the agent**:
```bash
make trigger-workflow
```

The agent will analyze your AWS Lambda functions and save a cost optimization report to Amazon S3.

## Project Structure

```
.
├── src/
│ ├── agents/ # Agent code and prompts
│ ├── tools/ # Custom tools (journal, storage)
│ └── shared/ # Shared utilities
├── infra/ # CDK infrastructure (TypeScript)
│ ├── lib/ # Stack definitions
│ └── lambda/ # Lambda function handlers
├── tests/ # Test files
└── Makefile # Common commands
```

## Configuration

Set these environment variables (or use defaults):

- `ENVIRONMENT`: Environment name (default: `dev`)
- `VERSION`: Version tag (default: `v1`)
- `MODEL_ID`: Amazon Bedrock model ID (default: Claude Sonnet 4)
- `TTL_DAYS`: Amazon DynamoDB record retention (default: `30`)
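
These variables can be read with `os.environ.get`, using the documented values as fallbacks. This is a sketch of the pattern, not the project's actual configuration code, and the model ID is a placeholder (check the repository for the real default):

```python
import os

# Read configuration with the documented defaults as fallbacks.
ENVIRONMENT = os.environ.get("ENVIRONMENT", "dev")
VERSION = os.environ.get("VERSION", "v1")
MODEL_ID = os.environ.get("MODEL_ID", "claude-sonnet-4-placeholder")  # placeholder ID
TTL_DAYS = int(os.environ.get("TTL_DAYS", "30"))  # numeric vars need casting

print(ENVIRONMENT, VERSION, TTL_DAYS)
```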

## Deployment

Deploy changes:

```bash
make cdk-deploy # Full deployment
make cdk-hotswap # Fast AWS Lambda-only updates
make cdk-watch # Auto-deploy on file changes
```

## Local Development

Run the agent locally for testing:

**Terminal 1** - Start the agent:
```bash
make run-agent-local
```

**Terminal 2** - Invoke the agent:
```bash
make invoke-agent-local
```

The agent runs on `http://localhost:8080` with auto-reload enabled. Logs appear in Terminal 1.

## Triggering the Agent (AWS)

**Scheduled**: Runs daily at 6am UTC (configured in EventBridge)

**Manual**: Trigger anytime:

```bash
make trigger-workflow
```

**Custom trigger**:

```bash
aws events put-events --entries '[{
"Source": "manual-trigger",
"DetailType": "execute-agent",
"Detail": "{}"
}]'
```

## Monitoring

Check the Step Functions console for workflow execution status. View reports in the Amazon S3 bucket (the bucket name is output by the CDK deployment). Query Amazon DynamoDB for event history and audit trails.

## Testing

Run tests:

```bash
make test # All tests
make check # Linting and formatting
```

## Cleanup

Remove AWS resources:

```bash
make cdk-destroy
```

Remove local artifacts:

```bash
make clean
```

## Common Issues

**Agent fails to start**: Check AWS credentials are configured and have necessary permissions (AWS Lambda, Amazon CloudWatch, Amazon S3, Amazon DynamoDB, Amazon Bedrock).

**No AWS Lambda functions found**: The agent analyzes AWS Lambda functions in `us-east-1`. Ensure you have functions in that region or modify the region in the code.

**Deployment fails**: Run `make cdk-bootstrap` first if this is your first CDK deployment in the account/region.

## Contributing

See [CONTRIBUTING.md](CONTRIBUTING.md) for guidelines.

## Security

See [CONTRIBUTING](CONTRIBUTING.md#security-issue-notifications) for security issue reporting.

## License

This library is licensed under the MIT-0 License. See the LICENSE file.