PromptForge

The ultimate prompt engineering workbench

AI prompt engineering workbench for crafting, testing, and systematically evaluating prompts with powerful analysis tools. - insaaniManav/prompt-forge

Manav Sethi
Hey Product Hunt! I built PromptForge because I was tired of the endless trial-and-error cycle of prompt engineering. As someone who works with AI daily, I found myself constantly tweaking prompts, losing track of what worked, and having no systematic way to evaluate improvements.

The problem: most prompt tools are just fancy text editors. You write, you test, you hope for the best. But there's no real engineering discipline.

What makes PromptForge different:
- Systematic evaluation - generate comprehensive test suites automatically
- Built-in analytics - track what actually works across different scenarios
- Variable testing - test edge cases, robustness, and consistency
- Prompt library - never lose a working prompt again
- Dual analysis - AI-powered feedback on your prompts before you even test them

Why I built it: I wanted to bring the same rigor to prompt engineering that we have in software engineering - version control, testing, systematic improvement.

Currently supporting Claude, GPT-4, and Azure OpenAI, with more providers coming soon. Docker deployment makes it dead simple to get started.

What would you want to see in a prompt engineering tool? Always looking for feedback from fellow AI builders!
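For readers who want to try the Docker route, a minimal setup sketch might look like the following. The environment variable names and the compose command are assumptions for illustration; the repository's own setup instructions are the authoritative source.

```shell
# Hypothetical quickstart - variable names and port are illustrative, not confirmed.
git clone https://github.com/insaaniManav/prompt-forge
cd prompt-forge

# Plug in API keys for whichever providers you use (names assumed):
export ANTHROPIC_API_KEY=sk-ant-...
export OPENAI_API_KEY=sk-...

# Bring the workbench up locally in the background:
docker compose up -d
```

Everything runs on your own machine; the keys are only needed so the workbench can call the hosted model providers.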
Django vinci

@insaanimanav nice work. Good luck.

Smart name tag.

Manav Sethi

@django_vinci Haha thanks

Dan League

@insaanimanav Awesome idea, thanks for sharing. When will this be ready for actual use?

Manav Sethi

@the_league It's out right now. You can clone the repository and follow the setup instructions to start using it.

Chris Messina

This is cool — would love to be able to run it locally and use @LM Studio or @Ollama for inference. Would that be possible? (I also have Docker installed but it'd be nice to avoid unnecessary $$ cloud bills)

Manav Sethi

@chrismessina Thanks, love this feedback! Local inference is definitely on the roadmap!

The Docker setup already runs locally - you just need to plug in your API keys. But adding LM Studio/Ollama support would be AMAZING for cost-free local inference!

Would you prefer Ollama integration first? And what models are you running locally? Always looking to prioritize based on real usage!

Thanks for the great suggestion!

Manav Sethi

@chrismessina We now have support for local inference in prompt-forge: you can use Ollama, thanks to the work by https://github.com/halilugur
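For anyone curious what local inference via Ollama involves, the sketch below calls Ollama's standard local REST API (`POST http://localhost:11434/api/generate`) with the Python standard library. This is not PromptForge's internal code; the function names and the model name are illustrative, and it assumes an Ollama daemon is running locally.

```python
import json
import urllib.request

# Ollama's local REST API listens on port 11434 by default.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint.

    stream=False asks for a single JSON reply instead of a token stream.
    """
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """Send a prompt to a locally running Ollama instance and return its reply."""
    body = json.dumps(build_request(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Usage (requires a running Ollama daemon with the model pulled):
#   print(generate("llama3", "Summarize prompt engineering in one sentence."))
```

Because everything stays on localhost, there are no per-token cloud bills, which is exactly the cost concern raised above.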

Nishant Arora

This is great! Congratulations on the launch @insaanimanav

Manav Sethi

@nshntarora Thanks