Aqueduct

Taking Data Science to Production

Aqueduct automates the engineering required to take data science to production. By abstracting away low-level cloud infrastructure, Aqueduct enables data teams to run models anywhere, publish predictions where they're needed, and monitor results reliably.
This is the 2nd launch from Aqueduct.

The easiest way to run open source LLMs
Aqueduct's LLM support makes it easy for you to run open-source LLMs on any infrastructure that you use. With a single API call, you can run an LLM on a single prompt or even on a whole dataset!

Vikram Sreekanti
Hi everyone! LLMs have taken the world by storm, but using them is a pain (or a non-starter) for most people due to concerns around data privacy, IP ownership, and cost. Open-source LLMs like LLaMA, Dolly, and Vicuna have enabled enterprises to consider using LLMs, but they're a pain to operate. At Aqueduct, our goal has been to enable ML teams to use the best technology without the operational nightmare of running ML in the cloud, and we're super excited to share that Aqueduct now allows you to run open-source LLMs with a single API call.

➡️ Aqueduct's Python API allows you to call an LLM with a single line of code (see the first image above). No need to worry about installing drivers, managing library dependencies, or debugging configuration parameters.

☁️ Aqueduct is designed to work with any infrastructure you use: you can run your LLMs on a large server or on a Kubernetes cluster. You can even have Aqueduct spin up a cluster for you.

🔁 You can publish your LLM-based workflows to run ad hoc or on a fixed schedule using Aqueduct.

💡 Aqueduct's visibility features extend naturally to LLMs, so you can see what parameters or prompts you used and how performance evolves over time.

We'd love to hear what you think! Check out our open-source project or join our Slack community.
GitHub: https://github.com/aqueducthq/aq...
Slack: https://slack.aqueducthq.com
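The single-call pattern described above can be sketched as follows. This is an illustrative stub, not Aqueduct's verified API: the `llm_op` name and the `vicuna_7b` model name are assumptions for the example, and a local function stands in for the cloud-backed operator so the snippet is self-contained.

```python
# Illustrative sketch of a "run an LLM with one call" API.
# NOTE: `llm_op` and "vicuna_7b" are assumed names, not Aqueduct's
# documented interface; this local stub stands in for the real operator.
def llm_op(name: str):
    """Return a callable that applies the named LLM to a prompt or a
    whole dataset (a list of prompts), mirroring the pattern above."""
    def run(prompt):
        if isinstance(prompt, list):
            # Batch mode: one generation per row of the dataset.
            return [f"[{name}] {p}" for p in prompt]
        return f"[{name}] {prompt}"
    return run

generate = llm_op(name="vicuna_7b")

# Single prompt:
print(generate("Summarize this review: great product!"))
# Whole dataset:
print(generate(["row one", "row two"]))
```

The design point the comment is making is that the caller only names a model and supplies data; provisioning, drivers, and dependencies are handled behind the call.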
Joseph Gonzalez
I am really excited to see how people actually use LLMs to solve real problems. How will you use LLMs?
Joe Hellerstein
The next generation of AI is in all our hands, not behind hyperscaler moats. This launch lets us run our own LLMs, on prem or in a secure cloud. Aqueduct makes it easy, using infrastructure you already understand.