jparkerweb/bedrock-proxy-endpoint

🔀 Bedrock Proxy Endpoint ⇢ Spin up your own custom OpenAI API server endpoint for easy AWS Bedrock inference (using standard baseUrl and apiKey params)

Score: 48 / 100 (Emerging)

This tool is for developers building applications with Large Language Models (LLMs) inside the AWS Bedrock ecosystem. It lets you send requests with the familiar OpenAI API client and routes them transparently to AWS Bedrock's LLM services: you feed it standard OpenAI API calls, and it handles the translation to Bedrock and returns the inference results.
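For illustration, here is a minimal client-side sketch using the official openai npm package. The baseURL (a proxy assumed to be running locally on port 3000), the apiKey placeholder, and the Bedrock model ID are assumptions for this example, not values taken from the project; check the repo README for the exact values your proxy expects.

// Minimal sketch: point the standard OpenAI client at a locally running
// bedrock-proxy-endpoint instance instead of api.openai.com.
// baseURL, apiKey, and model below are illustrative assumptions.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "http://localhost:3000/v1", // hypothetical local proxy address
  apiKey: "YOUR_PROXY_KEY",            // whatever key material your proxy expects
});

const response = await client.chat.completions.create({
  model: "anthropic.claude-3-haiku-20240307-v1:0", // example Bedrock model ID
  messages: [{ role: "user", content: "Hello via Bedrock!" }],
});

console.log(response.choices[0].message.content);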

Use this if you have existing applications or new projects built against the OpenAI API standard and want them to use AWS Bedrock's LLMs without rewriting your code for the Bedrock SDK.

Not ideal if you develop exclusively against AWS Bedrock's native SDKs and have no need for OpenAI API compatibility.

LLM application development · cloud infrastructure · API integration · AWS Bedrock · developer tools
No Package · No Dependents
Maintenance: 10 / 25
Adoption: 6 / 25
Maturity: 16 / 25
Community: 16 / 25

Stars: 16
Forks: 6
Language: JavaScript
License: MIT
Last pushed: Feb 20, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/jparkerweb/bedrock-proxy-endpoint"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
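The same data can also be fetched from code. A minimal sketch for Node 18+ (built-in fetch), assuming the endpoint returns JSON, which the URL shape suggests but this listing does not confirm:

// Hypothetical sketch: fetch the quality data for this repo.
// Assumes a JSON response; adjust if the endpoint returns something else.
const url =
  "https://pt-edge.onrender.com/api/v1/quality/llm-tools/jparkerweb/bedrock-proxy-endpoint";

const res = await fetch(url);
if (!res.ok) throw new Error(`Request failed: ${res.status}`);
const data = await res.json();
console.log(data);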