Qwen 1.5 MoE Reviews (2025) | Product Hunt
Qwen 1.5 MoE
Highly efficient mixture-of-experts (MoE) model from Alibaba
5.0 • 1 review • 51 followers
Qwen1.5-MoE-A2.7B is a small mixture-of-experts (MoE) model with only 2.7 billion activated parameters, yet it matches the performance of state-of-the-art 7B models such as Mistral 7B and Qwen1.5-7B.
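For readers who want to try the model locally, below is a minimal Python sketch of loading it with Hugging Face transformers. The model ID "Qwen/Qwen1.5-MoE-A2.7B-Chat" and the dtype/device settings are assumptions for illustration; check the Qwen organization on Hugging Face for the exact checkpoint name and a transformers version that supports the Qwen MoE architecture.

from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed checkpoint name; verify on Hugging Face before running.
model_id = "Qwen/Qwen1.5-MoE-A2.7B-Chat"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

# Only ~2.7B parameters are activated per token, so inference cost is closer
# to a small dense model despite the 7B-class quality the listing describes.
messages = [{"role": "user", "content": "Explain mixture-of-experts models in one paragraph."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))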
Similar Products
Airtrain.ai LLM Playground • Vibe-check many open-source and proprietary LLMs at once • 4.7 (3 reviews) • Data analysis tools, AI Coding Assistants
LLM Explorer • Find the best large language model for local inference • AI Coding Assistants, LLMs
Qwen 1.5 MoE reviews
The community has submitted 1 review describing what they like about Qwen 1.5 MoE and what it could do better.
5.0 • Based on 1 review
Founder Reviews (1)
Salman Paracha used Qwen 1.5 MoE to build Arch (262 points) • 9mo ago
"Highly performant base models that can be used for task-specific training, such as the function calling experience built into Arch."