Qwen 1.5 MoE

Highly efficient mixture-of-experts (MoE) model from Alibaba

5.0 • 1 review • 51 followers

Qwen1.5-MoE-A2.7B is a small mixture-of-experts (MoE) model with only 2.7 billion activated parameters, yet it matches the performance of state-of-the-art 7B models such as Mistral 7B and Qwen1.5-7B.
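
For context, a minimal sketch of loading and querying the model with Hugging Face transformers is shown below. The repo id Qwen/Qwen1.5-MoE-A2.7B-Chat and the prompt are illustrative assumptions, and the script expects a transformers release that ships the Qwen2 MoE architecture (plus accelerate for device_map="auto"):

# Minimal, unofficial sketch: run Qwen1.5-MoE-A2.7B-Chat locally with transformers.
# Only ~2.7B of the model's parameters are activated per token, so inference cost
# is closer to a small dense model despite 7B-class quality.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen1.5-MoE-A2.7B-Chat"  # assumed Hugging Face repo id (see huggingface.co/Qwen)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

messages = [{"role": "user", "content": "Explain mixture-of-experts in one paragraph."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output_ids = model.generate(input_ids, max_new_tokens=200)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))

On a CPU-only machine, drop device_map="auto" and expect noticeably slower generation.
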
Company Info
huggingface.co/Qwen
Qwen 1.5 MoE Info
Launched in 2024 (1 launch)
Forum
p/qwen-1-5-moe

Similar Products

Airtrain.ai LLM Playground
Vibe-check many open-source and proprietary LLMs at once
4.7 (3 reviews)
Data analysis tools · AI Coding Assistants

LLM Explorer
Find the best large language model for local inference
AI Coding Assistants · LLMs

Qwen 1.5 MoE launches

Qwen 1.5 MoE: Highly efficient mixture-of-experts (MoE) model from Alibaba
Launched on April 3rd, 2024