Serverless LLM Proxy ✨ (OpenAI-Compatible)

An OpenAI-compatible proxy running on AWS Lambda and Amazon API Gateway, powered by LiteLLM. Switch models seamlessly without changing your client code.

👨‍💻 All code and documentation are available at github.com/JGalego/Serverless-LLM-Proxy.
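Because the proxy speaks the OpenAI API, any OpenAI-style client can talk to it. Here is a minimal sketch of what a chat-completions call might look like; the invoke URL, API key, and model names below are placeholders, not values defined by this project:

```python
import json
import urllib.request

# Placeholder: substitute your API Gateway invoke URL after deployment.
BASE_URL = "https://<api-id>.execute-api.<region>.amazonaws.com"

def chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible /v1/chat/completions request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            # Placeholder key; use whatever auth your deployment expects.
            "Authorization": "Bearer sk-your-key",
        },
        method="POST",
    )

# Switching models is just a different string; the calling code is unchanged:
req = chat_request("gpt-4o", "Hello!")
# req = chat_request("bedrock/anthropic.claude-3-sonnet-...", "Hello!")
# urllib.request.urlopen(req) would send it once BASE_URL is real.
```

The same request shape works for every backend LiteLLM routes to, which is what makes the model swap a one-string change.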


© João Galego | Built with ❤️ using Jekyll