Decision Workspace
ushi vs mullama vs llama-gguf
Side-by-side comparison of Rust crates
ushi — experimental, v0.1.1 (health score 41)
High-performance LLM inference server with llama.cpp FFI bindings

mullama — experimental, v0.1.1 (health score 49)
Comprehensive Rust bindings for llama.cpp with memory-safe API and advanced features

llama-gguf — experimental, v0.14.0 (health score 45)
A high-performance Rust implementation of llama.cpp - LLM inference engine with full GGUF support
Core Metrics
| Metric | ushi | mullama | llama-gguf |
|---|---|---|---|
| Health Score | 41 | 49 | 45 |
| Total Downloads | 754 | 66 | 796 |
| 30d Downloads | 5 | 7 | 346 |
| Dependents | 0 | 0 | 2 |
| Releases | 2 | 2 | 27 |
| Last Updated | 229d ago | 70d ago | 2d ago |
| Age | 7m | 3m | 1m |
Health Breakdown
| Category | ushi | mullama | llama-gguf |
|---|---|---|---|
| Maintenance | 8 | 13 | 13 |
| Quality | 11 | 13 | 12 |
| Community | 6 | 6 | 7 |
| Popularity | 3 | 2 | 3 |
| Documentation | 13 | 15 | 10 |
Technical Details
| Detail | ushi | mullama | llama-gguf |
|---|---|---|---|
| Version | 0.1.1 | 0.1.1 | 0.14.0 |
| Stable (≥1.0) | ✗ No | ✗ No | ✗ No |
| License | GPL-3.0-or-later | MIT | MIT OR Apache-2.0 |
| Dependencies | 41 | 53 | 43 |
| Crate Size | 177KB | 426KB | 708KB |
| Features | 2 | 17 | 16 |
| Yanked % | 0.0% | 0.0% | 0.0% |
| Edition | 2024 | 2021 | 2024 |
| MSRV | — | 1.75 | — |
| Owners | 1 | 1 | 1 |
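Based on the versions and licenses in the table above, depending on one of these crates would look roughly like this in Cargo.toml (crate names and versions are taken from the table; feature flags are omitted since their names aren't listed here):

```toml
[dependencies]
# Pick one of the three; versions match the Technical Details table.
# llama-gguf: MIT OR Apache-2.0, 27 releases, most recently updated
llama-gguf = "0.14"
# mullama: MIT, declares MSRV 1.75
# mullama = "0.1"
# ushi: GPL-3.0-or-later — check the copyleft terms before depending on it
# ushi = "0.1"
```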
Quick Verdict
- mullama leads with a health score of 49/100, but none of the options score above 80.
- llama-gguf has the most total downloads (796) and by far the most recent activity (346 downloads in the last 30 days, updated 2 days ago), suggesting wider adoption.