
Inference engine for Apple Silicon
We've launched our own inference engine, written from scratch for Apple Silicon. It's open source:
https://github.com/trymirai/uzu
With it, you can easily run LLMs on your Mac. You can learn more on our website: https://trymirai.com/
Replies
Amazing work! What about iPhones?