Reverse proxy server for various LLM APIs. Features translation between API formats, user management, anti-abuse, API key rotation, DALL-E support, and optional prompt/response logging.
---
title: oai-reverse-proxy
emoji: 🔁
colorFrom: green
colorTo: purple
sdk: docker
pinned: false
---

# OAI Reverse Proxy Server

Simple reverse proxy server for the OpenAI API.

Run on Repl.it

## What is this?

If you have an API key you want to share with a friend, you can use this to keep your key safe while still allowing them to generate text with the API.
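From the friend's side, using the proxy just means sending a normal OpenAI-style request to the proxy's address instead of `api.openai.com`. The sketch below illustrates this; the base URL, request path, and access key are placeholder assumptions, so substitute whatever your deployment actually exposes.

```typescript
// Sketch of how a friend would call the proxy instead of api.openai.com.
// PROXY_BASE and PROXY_KEY are placeholders: substitute your deployment's
// URL and whatever access key (if any) you configured on the proxy. The
// request path is also an assumption -- use the route your proxy exposes.
const PROXY_BASE = "https://my-proxy.example.com";
const PROXY_KEY = "shared-secret";

function buildChatRequest(prompt: string) {
  // The request body is standard OpenAI chat-completion JSON; only the
  // URL and the Authorization header differ from talking to OpenAI directly.
  return {
    url: `${PROXY_BASE}/v1/chat/completions`,
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // The proxy key stands in for the real OpenAI key, which never
      // leaves the server.
      Authorization: `Bearer ${PROXY_KEY}`,
    } as Record<string, string>,
    body: JSON.stringify({
      model: "gpt-3.5-turbo",
      messages: [{ role: "user", content: prompt }],
    }),
  };
}

const req = buildChatRequest("Hello!");
console.log(req.url); // targets the proxy, not api.openai.com
```

Because the real key only ever lives on the server, the person calling the proxy never sees it.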

## Why?

OpenAI keys have full account permissions. They can revoke themselves, generate new keys, modify spend quotas, etc. You absolutely should not share them.

If you still want to share access to your key, you can put it behind this proxy to ensure it can't be used to do anything but generate text. You can also set a separate key on the proxy to gatekeep access.
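Both keys are typically supplied to the server as environment variables. The variable names below are illustrative only; check `.env.example` in the repo for the names the proxy actually reads.

```shell
# Hypothetical variable names -- see .env.example for the real ones.
OPENAI_KEY=sk-xxxxxxxx     # your real OpenAI key; stays on the server
PROXY_KEY=shared-secret    # optional gatekeeper key users must present
```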

## How to use

Since this is a server, you'll need to deploy it somewhere. A few options are available:

### Deploy to Huggingface Space

See the `docs` directory for instructions on deploying to a Huggingface Space.