bedrock-wrapper and bedrock-proxy-endpoint
These are complementary tools that solve the same problem at different architectural levels: the wrapper provides a programmatic abstraction layer for Bedrock integration, while the proxy endpoint offers a drop-in server that exposes Bedrock through a standard OpenAI-compatible interface.
About bedrock-wrapper
jparkerweb/bedrock-wrapper
🪨 Bedrock Wrapper is an npm package that simplifies the integration of existing OpenAI-compatible API objects with AWS Bedrock's serverless inference LLMs.
This tool is aimed at software developers and AI engineers who are building applications with Large Language Models (LLMs) from AWS Bedrock but want to keep using the more familiar OpenAI API format. It takes your existing OpenAI-compatible chat completion requests and routes them to various powerful Bedrock models, letting you integrate AWS's LLMs into your projects with minimal changes.
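The core idea can be sketched as follows. This is a minimal illustration of the kind of translation the wrapper performs (flattening an OpenAI-style `messages` array into a single prompt string of the sort Bedrock text models consume), not the package's actual code; the function name, model alias, and prompt format below are assumptions made for the example.

```javascript
// Sketch: convert an OpenAI-style messages array into one prompt string,
// roughly what a Bedrock text-completion model expects. The real package
// additionally handles each model's specific prompt template, parameter
// mapping, and streaming protocol.
function messagesToPrompt(messages) {
  return (
    messages
      .map(({ role, content }) => `${role}: ${content}`)
      .join("\n") + "\nassistant:"
  );
}

// A standard OpenAI-compatible chat completion request object.
const openaiRequest = {
  model: "llama-3-8b", // model alias, assumed for illustration
  messages: [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: "What is AWS Bedrock?" },
  ],
  max_tokens: 256,
  temperature: 0.7,
};

const prompt = messagesToPrompt(openaiRequest.messages);
console.log(prompt);
```

In practice you would hand the whole OpenAI-style request object to the wrapper along with AWS credentials, and it would perform this kind of conversion internally before invoking the chosen Bedrock model.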
About bedrock-proxy-endpoint
jparkerweb/bedrock-proxy-endpoint
🔀 Bedrock Proxy Endpoint ⇢ Spin up your own custom OpenAI API server endpoint for easy AWS Bedrock inference (using the standard baseUrl and apiKey params)
This tool helps developers who are building LLM applications within the AWS Bedrock ecosystem. It lets you keep using the familiar OpenAI API client: you send it standard OpenAI API calls, and it handles the translation needed to interact with Bedrock's inference services, returning the LLM results in the expected format.
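Because the proxy speaks the OpenAI wire format, pointing an existing client at Bedrock is mostly a matter of swapping the base URL and API key. The sketch below builds the HTTP request an OpenAI-compatible client would send to such a proxy; the host, port, path, and model alias are assumed values for illustration, not documented defaults of the package.

```javascript
// Sketch: the HTTP request an OpenAI-compatible client would issue
// against a locally running proxy. Host/port and model alias are
// assumptions for this example.
function buildChatCompletionRequest(baseURL, apiKey, payload) {
  return {
    url: `${baseURL}/chat/completions`,
    options: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${apiKey}`, // standard OpenAI auth header
      },
      body: JSON.stringify(payload),
    },
  };
}

const request = buildChatCompletionRequest(
  "http://localhost:3000/v1", // assumed proxy address
  "my-api-key",               // placeholder credential
  {
    model: "llama-3-8b", // assumed Bedrock model alias
    messages: [{ role: "user", content: "Hello from the proxy!" }],
  }
);

console.log(request.url);
```

With the proxy running, the request could then be dispatched with `fetch(request.url, request.options)`, or you could simply configure any off-the-shelf OpenAI client with that base URL and key.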