
LLM Gateway
Launching today
Use any AI model with just one API
236 followers
Route, manage, and analyze your LLM requests across multiple providers with a unified API interface.
LLM Gateway
Hey all! We're excited to finally launch llmgateway.io on Product Hunt 🚀
A fully open-source AI gateway for your LLM needs: route, manage, and analyze your LLM requests across multiple providers with a unified API interface.
Since launch, LLM Gateway has continued to grow, mostly through subreddits, X, and LinkedIn. Now it's time to launch it on Product Hunt.
Luca and I (Ismail) are excited to launch LLM Gateway with the Product Hunt community, with an exceptional offer: 50% off the Pro plan forever! Use promo code: "PRODUCTHUNT"
Here is a list of our top features:
◆ Simple usage overview dashboard (total requests, tokens used, cost estimate, organization credits, etc.)
◆ BYOK (bring your own keys), use credits, or go hybrid (we support 11+ providers and 60+ models)
◆ Configure caching to your preferences
◆ Activity stats: keep track of each model's usage, with charts showing cost estimates per provider, request volume, and more
◆ Advanced activity: detailed overview of each request, including prompt, model, provider, time, response metrics, and cost
◆ Self-hostable: check out the docs
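If you're wondering what the unified API looks like in practice, here's a minimal sketch. It assumes the gateway exposes an OpenAI-compatible chat completions endpoint; the base URL, model slug, and environment variable name are illustrative placeholders, so check the docs for the exact values.

```typescript
// Minimal sketch: calling LLM Gateway through the standard OpenAI client.
// baseURL, model, and the env var name are assumptions, not confirmed values.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://api.llmgateway.io/v1", // assumed gateway endpoint
  apiKey: process.env.LLM_GATEWAY_API_KEY, // gateway key (or BYOK credential)
});

async function main() {
  // One request shape, regardless of which provider the gateway routes to.
  const completion = await client.chat.completions.create({
    model: "gpt-4o-mini", // any model slug the gateway supports
    messages: [{ role: "user", content: "Hello from LLM Gateway!" }],
  });
  console.log(completion.choices[0].message.content);
}

main();
```

With this pattern, switching providers is mostly a matter of changing the model string rather than rewriting integration code.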
We've put a lot of time, effort, and care into building LLM Gateway, and we honestly hope you like the product! Let us know if you have any feedback and we'll work on it. Thank you for your support!
@smakosh Firstly, congratulations on your first Product Hunt launch. And by the way, I genuinely like the idea you guys are working on. Will try to support this further!
Tidyread
Great to see an open-source solution like LLM Gateway! 🚀 The BYOK feature is intriguing. How does it manage security when users bring their own keys?
LLM Gateway
@jaredl They are stored in plaintext at this time, but we may add an encryption wrapper around them to make it harder to leak the secret. This applies to both cloud and self-hosted.
BestPage.ai
I saw it is open source, so does that mean we can combine it with our own product and either charge users for LLM usage through our product, or let them bring their own keys and use our product without charging? And in both cases, can they manage their token usage in the Gateway dashboard?
LLM Gateway
@joey_zhu1 Not sure I understand, to be honest. Right now you're free to do whatever you want. We might restrict reselling LLM Gateway directly by charging users credits, since that's not the intention, but using the gateway as an end user will always stay free and open source.