Chathouse: Chat with AI

AI chat for iPhone/iPad, with ollama and reasoning support

22 followers
AI Chatbots • Writing assistants
Chathouse is your all-in-one gateway to a universe of AI-powered conversations. It supports lots of state-of-the-art models, including the best reasoning LLMs, and even lets you connect to self-hosted ollama instances. Of course, there's a free tier as well!
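For anyone curious what the self-hosted option involves: Ollama exposes a plain HTTP chat endpoint on the local network, so a client can talk to it with a request along these lines. A minimal sketch, assuming a non-streaming call; the host address and model name are placeholders, not Chathouse's actual code:

```swift
import Foundation

// Minimal shapes for Ollama's /api/chat endpoint (non-streaming).
struct OllamaMessage: Codable {
    let role: String
    let content: String
}

struct OllamaChatRequest: Codable {
    let model: String
    let messages: [OllamaMessage]
    let stream: Bool
}

struct OllamaChatResponse: Codable {
    let message: OllamaMessage
}

// Send one prompt to a self-hosted Ollama instance on the local network.
// Host, port, and model name are placeholders; adjust to your own setup.
func askOllama(prompt: String) async throws -> String {
    var request = URLRequest(url: URL(string: "http://192.168.1.10:11434/api/chat")!)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONEncoder().encode(
        OllamaChatRequest(
            model: "llama3.1",
            messages: [OllamaMessage(role: "user", content: prompt)],
            stream: false
        )
    )
    let (data, _) = try await URLSession.shared.data(for: request)
    return try JSONDecoder().decode(OllamaChatResponse.self, from: data).message.content
}
```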
Company Info
App Store: apple.co/3F2Rrl2
Launched in 2025 (1 launch)
Forum: p/chathouse-chat-with-ai
Free Options
Launch tags: Productivity • Artificial Intelligence • Tech
Launch Team: Marco Quinten, Jann Schafranek
Built With: Swift


Marco Quinten (Xpolyglot)
Maker 📌
Hello, one of the developers here! Chathouse is designed to be as private as possible. We're an EU company, and we do not log any of your chats. Speech recognition is done on-device, fully local. If you connect to your own ollama/exo/etc. instance, your chats never leave your local network. If you use the built-in or provider-specific models, the data protection policies and terms of use of those providers apply. We do not log any prompts, but we can't guarantee that the inference provider doesn't either. We hope you enjoy Chathouse, and if you have any questions, feel free to ask!
6mo ago
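On-device speech recognition of the kind described above is typically done with Apple's Speech framework, which can be told to keep transcription fully local. A minimal sketch, assuming an English locale and an audio file URL; Chathouse's actual implementation isn't public:

```swift
import Speech

// Transcribe an audio file without sending audio off the device.
// (A real app must first call SFSpeechRecognizer.requestAuthorization.)
func transcribeLocally(fileURL: URL) {
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
          recognizer.supportsOnDeviceRecognition else {
        print("On-device recognition unavailable for this locale")
        return
    }

    let request = SFSpeechURLRecognitionRequest(url: fileURL)
    request.requiresOnDeviceRecognition = true   // audio and text never leave the device

    _ = recognizer.recognitionTask(with: request, resultHandler: { result, error in
        if let result, result.isFinal {
            print(result.bestTranscription.formattedString)
        } else if let error {
            print("Recognition failed: \(error)")
        }
    })
}
```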
Logan King

@fivesheep I love the multi-model support.

6mo ago
Logan King

Does Chathouse provide performance insights to help users choose the best model for their needs?

6mo ago
Marco Quinten (Xpolyglot)
Maker

@logan_king Thanks for the question! Chathouse doesn't provide performance insights, but I think we do something that not many people are doing right now.

The Chathouse models are basically "proxy" models with a dynamic underlying model. The user doesn't have to care what the model is, and I think most users have no idea which model to use anyway. You gotta be very deep into the LLM rabbit hole to have any idea about the specific differences between state-of-the-art models.

So in the Free and Pro plans, we automatically use one of the latest and greatest models available. Currently, that's Gemini 2.0 Flash Lite for Free and Gemini 2.0 Flash for Pro, which performs incredibly well in benchmarks and provides great latency, paired with good instruction following and problem-solving capabilities.

The Chathouse Pro+ model goes one step further and uses Not Diamond's prompt routing, so the prompt is actually sent to the model that's most likely to give the best answer. So with Pro+, not only can you freely choose between all available models; you are also very likely to get the best possible answer across all providers just by staying on the "default" Chathouse Pro+ model.

TL;DR: You'll get very good results without having to know anything about LLMs, and on Pro+ you'll even get better results on average than if you chose the model manually.

6mo ago
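The plan-based "proxy model" routing described above can be sketched roughly as follows; the type names, model identifiers, and router stub are illustrative assumptions, not Chathouse's actual code:

```swift
// Illustrative only: resolve a "proxy" model to a concrete backend per plan.
enum Plan {
    case free, pro, proPlus
}

struct ResolvedModel {
    let provider: String
    let identifier: String
}

func resolveProxyModel(for plan: Plan, prompt: String) -> ResolvedModel {
    switch plan {
    case .free:
        // Free tier pins a fast, inexpensive default.
        return ResolvedModel(provider: "google", identifier: "gemini-2.0-flash-lite")
    case .pro:
        return ResolvedModel(provider: "google", identifier: "gemini-2.0-flash")
    case .proPlus:
        // Pro+ delegates to a prompt router (e.g. a Not Diamond-style service)
        // that picks the model most likely to answer this prompt best.
        return promptRouterPick(for: prompt)
    }
}

// Stand-in for the external routing call; a real client would hit the router's API.
func promptRouterPick(for prompt: String) -> ResolvedModel {
    ResolvedModel(provider: "openai", identifier: "gpt-4o")
}
```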
Charles Reveley

Great job! Switching between models might be overwhelming for non-tech users. Are there any plans for a beginner mode with simplified options?

6mo ago
Marco Quinten (Xpolyglot)
Maker

@charles_reveley Hi Charles, you are absolutely correct. That's why the default model choice is our in-house "Chathouse" proxy model, which runs a modern state-of-the-art LLM underneath.

New users or beginners never have to bother with switching between models; they can literally just write messages and get high-quality responses by default.

Similarly, users can switch between "regular" and "reasoning" models simply by pressing the "Deep Thinking" button. They don't have to know anything about the underlying models: everything will just work.

Of course, advanced users have full control over the model they're using. You can always connect to providers using your own API key completely for free, or use any model from our huge model catalog on the Pro+ plan.

6mo ago
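The "Deep Thinking" behaviour described above amounts to swapping the underlying model behind a single switch. A minimal sketch with placeholder model identifiers, not Chathouse's real catalog:

```swift
// A "Deep Thinking" toggle: the user flips one switch and the app silently
// swaps between a regular and a reasoning model.
struct ChatSettings {
    var deepThinking: Bool = false

    var effectiveModel: String {
        deepThinking ? "o3-mini" : "gemini-2.0-flash"
    }
}

var settings = ChatSettings()
print(settings.effectiveModel)   // regular model by default
settings.deepThinking = true     // user taps "Deep Thinking"
print(settings.effectiveModel)   // reasoning model, no model picker needed
```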

Reviews
Be the first to review Chathouse: Chat with AI.