Qwen 1.5 MoE
Highly efficient mixture-of-experts (MoE) model from Alibaba
5.0 • 1 review • 51 followers
Qwen1.5-MoE-A2.7B is a small mixture-of-experts (MoE) model with only 2.7 billion activated parameters, yet it matches the performance of state-of-the-art 7B models such as Mistral 7B and Qwen1.5-7B.
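
For context, here is a minimal sketch of running the model with Hugging Face transformers. It assumes the checkpoint is published as Qwen/Qwen1.5-MoE-A2.7B and that your installed transformers version includes Qwen2 MoE support; check the official model card for exact requirements.

# Sketch: load and sample from Qwen1.5-MoE-A2.7B.
# Model id and version requirements are assumptions; verify against the model card.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen1.5-MoE-A2.7B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" needs the `accelerate` package; it spreads layers
# across available GPUs/CPU automatically.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Mixture-of-experts models are efficient because"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

Because only about 2.7B parameters are activated per token, per-token inference cost is closer to that of a ~3B dense model than to the 7B models it is compared against, though the full set of expert weights still has to fit in memory.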
Similar Products

Airtrain.ai LLM Playground
Vibe-check many open-source and proprietary LLMs at once
4.7 (3 reviews)
Data analysis tools • AI Coding Assistants

LLM Explorer
Find the best large language model for local inference
AI Coding Assistants • LLMs
Qwen 1.5 MoE launches

Qwen 1.5 MoE
Highly efficient mixture-of-experts (MoE) model from Alibaba
Launched on April 3rd, 2024