Mistral Small 4 Collection A state-of-the-art, open-weight model with a granular Mixture-of-Experts architecture that fuses instruct, reasoning, and agentic skills. • 3 items • Updated Mar 16 • 71
FastRTC Custom UIs Collection A collection of FastRTC demos that showcase how to build a custom UI for your server • 4 items • Updated Apr 7, 2025 • 2
MOE/Mixture of Experts Models (see also the "source" collection) Collection Mixture-of-Experts models by me. These combine multiple models at the same time during generation, routing each token through several experts for next-level performance. • 60 items • Updated 3 days ago • 17
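The routing idea behind these MoE models can be sketched minimally: a gate scores every expert, only the top-k experts actually run for a given token, and their outputs are combined with renormalized gate weights. The function and toy experts below are hypothetical illustrations, not code from any of the listed models.

```python
import math

def softmax(xs):
    # numerically stable softmax over a list of scores
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def moe_output(token, experts, gate_scores, k=2):
    """Sparse Mixture-of-Experts step: route the token to the top-k
    experts and mix their outputs by renormalized gate probabilities."""
    # pick the k highest-scoring experts for this token
    topk = sorted(range(len(gate_scores)),
                  key=lambda i: gate_scores[i], reverse=True)[:k]
    weights = softmax([gate_scores[i] for i in topk])
    # only the selected experts execute, keeping compute sparse
    return sum(w * experts[i](token) for w, i in zip(weights, topk))

# toy experts: scalar functions standing in for full FFN blocks
experts = [lambda x: x + 1, lambda x: 2 * x, lambda x: x * x]
print(moe_output(3.0, experts, gate_scores=[0.1, 0.9, 0.4], k=2))
```

With k equal to the number of experts this reduces to a dense weighted ensemble; keeping k small is what gives MoE its compute savings.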
MobileLLM Collection Optimizing Sub-billion Parameter Language Models for On-Device Use Cases (ICML 2024) https://arxiv.org/abs/2402.14905 • 49 items • Updated Mar 2 • 141
LLM in a flash: Efficient Large Language Model Inference with Limited Memory Paper • 2312.11514 • Published Dec 12, 2023 • 264