Banana.dev

Serverless GPUs for Machine Learning Inference

508 followers

Banana provides inference hosting for ML models in three easy steps and a single line of code. Stop paying for idle GPU time and deploy models to production instantly with our serverless GPU infrastructure. Use Banana for scale. 🍌

Blake Peeling
Hi all 👋, makers Kyle + Erik + Sahil + Blake + Candice here! We’re excited for you to try out Banana. Banana is an ML inference hosting platform on serverless GPUs.

Why did we build Banana? We used to operate an ML consulting firm that helped companies build & host models in production. Through this work we realized how difficult it was to get models deployed, and how incredibly expensive it was. Customer models had to run on a fleet of always-on GPUs that would get maybe 10% utilization, which felt like a really big money pit and a waste of compute.

During our time as consultants we built really efficient deployment infrastructure underneath us. Six months ago we made a pivot to focus solely on productizing our deployment infra into a hosting platform for ML teams, one that would remove the pain of deployment and reduce the cost of hosting models.

That brings us to today. Banana provides inference hosting for ML models in three easy steps and a single line of code. We are 100% self-serve, meaning users can sign up and deploy to production without ever talking to our 🍌 team. And thanks to being on serverless GPUs, customers see their compute costs reduced by upwards of 90%. Try it out and let us know what you think!
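To make the utilization math above concrete, here is a back-of-envelope sketch of the savings claim. The $1.00/hr GPU rate is purely an illustrative assumption, not Banana's actual pricing; only the 10% utilization and ~90% savings figures come from the comment above.

```python
# Back-of-envelope: always-on GPU vs. pay-per-use billing at 10% utilization.
# The hourly rate is an assumed, illustrative number.
HOURLY_RATE = 1.00       # assumed GPU cost per hour (illustrative)
HOURS_PER_MONTH = 730    # average hours in a month
UTILIZATION = 0.10       # ~10% utilization, as cited above

# Always-on: you pay for every hour, busy or idle.
always_on_cost = HOURLY_RATE * HOURS_PER_MONTH

# Serverless: you pay only for the hours actually doing inference.
serverless_cost = HOURLY_RATE * HOURS_PER_MONTH * UTILIZATION

savings = 1 - serverless_cost / always_on_cost
print(f"Always-on:  ${always_on_cost:.2f}/mo")
print(f"Serverless: ${serverless_cost:.2f}/mo")
print(f"Savings:    {savings:.0%}")
```

At 10% utilization the savings come out to exactly 90%, which matches the "upwards of 90%" figure: lower utilization means larger savings.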
Erik Dunteman
@bp_banana Hi friends :) Looking forward to a great launch day. Let us know any questions you may have.
Abishek Muthian
@bp_banana Congratulations on the launch! Banana could be addressing the need-gap 'Democratisation of AI/ML/DL hardware', posted on my problem-validation forum: https://needgap.com/problems/50-... . You're welcome to explain there how Banana addresses that problem, so those who need it can find Banana easily.
Morgan Gallant
Been using Banana in production for a good bit now, nothing but great things to share! A few notes:

- Product is insanely good. Specifically, we use it for indexing jobs requiring a good bit of GPU compute. These jobs are huge, sometimes involving up to 1M inferences of a large NL model. Banana is perfect for this use case, as we can burst up to 10+ GPUs, only pay for the compute we use, and quickly scale back down to near zero.
- Team is very strong, super responsive to questions, and experts at deploying & scaling ML models. We often get advice and recommendations from their team on how best to do something, and it's been really appreciated!
- Lastly, velocity / speed of iteration has been ridiculous. They're moving really quick, have an ambitious roadmap, and ship new features and improvements daily. It's been really cool to watch.

Would highly recommend anyone check them out!
Kyle Morris
@gallantlabs Thank you for your kind words & support, Morgan! Fantastic having you as a customer and inspiring seeing your progress as a team! Always a message away :)
Erik Dunteman
@gallantlabs Morgan is an awesome customer! Thanks for the love
Ryan Hoover
Did you invest, @turnernovak?
Erik Dunteman
@turnernovak @rrhoover Not sure he could handle as much potassium as us