Prompting shouldn’t be a new kind of programming.
When I first started using AI tools to code, I ran into this wall:
“Wait… I thought I could just describe what I wanted? Why do I need to learn how to prompt like a pro?”
If I have to study how to talk to the AI, something’s broken.
That’s not the future. That’s just a new kind of gatekeeping.
We’re building a tool that tries to actually understand what you mean —
even if you’re vague, even if you’re unsure, even if you’re just brainstorming out loud.
And honestly? Sometimes the best prompt is when you don’t know exactly what you want.
That’s when the AI should inspire you, not punish you for not being specific.
What do you think?
Should AI adapt to us more — instead of us adapting to AI?
Replies
Velocity
I think the move to engage with prompting in a more productive, business-oriented way is actually going to be a boom time for philosophy and the humanities. Imagine a whole world interested in reading and writing in order to communicate more clearly and to impress upon others wonderful creations with words. Sounds like more writers to me.
Trickle
@kevin_mcdonagh1 Love this take — it’s such a refreshing lens on prompting and creativity.
At the same time, for a lot of people, expressing themselves clearly is hard, especially when ideas are still half-formed.
That’s why we think AI tools should help people explore, not just execute — and support them even when the words aren’t perfect yet.
That’s a fascinating question; it really comes down to who you’re building for. Seasoned developers thrive on precision as they develop, but newcomers might need AI to read between the lines and spark ideas. The difference between the two feels less like a communication issue and more like users not knowing exactly what it is they want. Adapting AI to developers might mean supporting specificity, though I’m not so sure about newcomers, who rely heavily on natural language to build.
Trickle
@dheerajdotexe That’s such a great point — totally agree it depends on who you're designing for.
Experienced devs thrive on clear, specific prompts — but for newcomers, the idea is often still forming while they’re typing. That’s why we’re trying to make our tool more forgiving and exploratory, especially when the prompt is fuzzy.
We want AI to help shape ideas, not just respond to instructions.
Appreciate your thoughtful take — this is exactly the nuance we care about 🙌
Vibe coding is tough in a different way. I asked it to fix a simple bug, and it changed the entire layout, which it didn't have to. I think most of my credits are spent fixing bugs and reverting to the last working version rather than adding features. I feel that, at times, the models try to be more helpful than needed, and that's a big drawback for me.
Trickle
@sanchitak This is such a key issue — and I think it points to a deeper challenge in how AI “interprets intent.”
Sometimes, when we give vague prompts, the model fills in the blanks too eagerly. Instead of asking “do you want me to change X?”, it just assumes “let me help by changing everything.”
Maybe what we need isn’t just smarter models — but better feedback loops, so users can say: “Yes to this change, no to that.”
Curious — would you prefer a tool that gives multiple fix options before applying changes? Or do you prefer instant but tightly scoped updates?
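To make the feedback-loop idea concrete, here's a minimal sketch of a review-before-apply flow: the assistant proposes scoped edits, and the user approves or rejects each one individually before anything touches the code. All names here (`ProposedEdit`, `review_and_apply`) are hypothetical, not an actual Trickle API.

```python
# Sketch of a per-change feedback loop: nothing is applied until the
# user says "yes to this change, no to that". Names are illustrative.
from dataclasses import dataclass

@dataclass
class ProposedEdit:
    file: str
    description: str
    patch: str  # e.g. a unified diff for this one change

def review_and_apply(edits, approve):
    """Keep only the edits the user approves; return the applied subset."""
    applied = []
    for edit in edits:
        if approve(edit):  # per-edit confirmation, not all-or-nothing
            applied.append(edit)
    return applied

# Example: the user accepts the bug fix but rejects the layout rewrite.
edits = [
    ProposedEdit("app.py", "fix off-by-one in pagination", "--- a/app.py ..."),
    ProposedEdit("index.html", "restructure page layout", "--- a/index.html ..."),
]
approved = review_and_apply(edits, approve=lambda e: "fix" in e.description)
print([e.description for e in approved])  # only the scoped bug fix survives
```

The key design point is that each proposed change is a separate, rejectable unit, so an over-eager model can still only land the edits the user actually wanted.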