mullama vs llama-gguf vs ushi
Side-by-side comparison of Rust crates
- mullama v0.1.1 (experimental), health 49: Comprehensive Rust bindings for llama.cpp with memory-safe API and advanced features
- llama-gguf v0.14.0 (experimental), health 45: A high-performance Rust implementation of llama.cpp - LLM inference engine with full GGUF support
- ushi v0.1.1 (experimental), health 41: High-performance LLM inference server with llama.cpp FFI bindings
Core Metrics
| Metric | mullama | llama-gguf | ushi |
|---|---|---|---|
| Health Score | 49 | 45 | 41 |
| Total Downloads | 66 | 796 | 754 |
| 30d Downloads | 7 | 346 | 5 |
| Dependents | 0 | 2 | 0 |
| Releases | 2 | 27 | 2 |
| Last Updated | 70d ago | 2d ago | 229d ago |
| Age | 3m | 1m | 7m |
Health Breakdown
| Category | mullama | llama-gguf | ushi |
|---|---|---|---|
| Maintenance | 13 | 13 | 8 |
| Quality | 13 | 12 | 11 |
| Community | 6 | 7 | 6 |
| Popularity | 2 | 3 | 3 |
| Documentation | 15 | 10 | 13 |
Technical Details
| Detail | mullama | llama-gguf | ushi |
|---|---|---|---|
| Version | 0.1.1 | 0.14.0 | 0.1.1 |
| Stable (≥1.0) | ✗ No | ✗ No | ✗ No |
| License | MIT | MIT OR Apache-2.0 | GPL-3.0-or-later |
| Dependencies | 53 | 43 | 41 |
| Crate Size | 426KB | 708KB | 177KB |
| Features | 17 | 16 | 2 |
| Yanked % | 0.0% | 0.0% | 0.0% |
| Edition | 2021 | 2024 | 2024 |
| MSRV | 1.75 | — | — |
| Owners | 1 | 1 | 1 |
Quick Verdict
- mullama leads with a health score of 49/100, but none of the three options scores above 80.
- llama-gguf has the most total downloads (796), the most 30-day downloads (346), and the most recent release (2 days ago), suggesting wider adoption and more active maintenance.
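Whichever crate is chosen, all three are pre-1.0, so semver allows breaking changes between minor releases. A minimal Cargo.toml sketch (using the versions listed in the comparison above; pinning exactly is one conservative option, not a requirement):

```toml
[dependencies]
# All three crates are below 1.0; an exact pin avoids surprise
# breaking changes on `cargo update`. Pick exactly one of these.
llama-gguf = "=0.14.0"
# mullama = "=0.1.1"
# ushi = "=0.1.1"
```

One non-score differentiator worth weighing: ushi is licensed GPL-3.0-or-later, a copyleft license, while mullama (MIT) and llama-gguf (MIT OR Apache-2.0) are permissive, which may decide the question on its own for proprietary projects.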