# Decision Workspace

## simple-llm-client vs ezllama vs llama-runner

Side-by-side comparison of Rust crates.

- **simple-llm-client** (experimental, v0.2.8) — health score 38. A Rust crate for interacting with Large Language Model APIs.
- **ezllama** (experimental, v0.3.1) — health score 55. An opinionated, simple Rust interface for local LLMs, powered by llama-cpp-2.
- **llama-runner** (experimental, v1.0.0) — health score 47. A straightforward Rust library for running llama.cpp models locally on device.
## Core Metrics

| Metric | simple-llm-client | ezllama | llama-runner |
|---|---|---|---|
| Health Score | 38 | 55 | 47 |
| Total Downloads | 3.9K | 3.6K | 36 |
| 30d Downloads | 11 | 45 | 36 |
| Dependents | 0 | 0 | 0 |
| Releases | 10 | 6 | 3 |
| Last Updated | 232d ago | 342d ago | 3d ago |
| Age | 9m | 11m | 3d |
## Health Breakdown

| Category | simple-llm-client | ezllama | llama-runner |
|---|---|---|---|
| Maintenance | 10 | 15 | 14 |
| Quality | 8 | 15 | 15 |
| Community | 6 | 6 | 6 |
| Popularity | 4 | 4 | 2 |
| Documentation | 10 | 15 | 10 |
## Technical Details

| Detail | simple-llm-client | ezllama | llama-runner |
|---|---|---|---|
| Version | 0.2.8 | 0.3.1 | 1.0.0 |
| Stable (≥1.0) | ✗ No | ✗ No | ✓ Yes |
| License | MIT | MIT | Apache-2.0 |
| Dependencies | 11 | 7 | 10 |
| Crate Size | 22KB | 19KB | 291KB |
| Features | 3 | 3 | 4 |
| Yanked % | 30.0% | 0.0% | 0.0% |
| Edition | 2021 | 2024 | 2024 |
| MSRV | — | 1.85.0 | — |
| Owners | 1 | 1 | 1 |
## Quick Verdict

- ezllama leads with a health score of 55/100, but none of the three crates scores above 80.
- simple-llm-client has the most total downloads (3.9K), suggesting wider adoption, though its 30-day downloads (11) trail both alternatives.
- simple-llm-client and ezllama are pre-1.0, so their APIs may change between releases.
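For the two pre-1.0 crates, pinning an exact version in `Cargo.toml` guards against breaking API changes until they stabilize. A minimal sketch, using the crate names and versions from the tables above (the `=` exact-version requirement is standard Cargo syntax):

```toml
[dependencies]
# Pre-1.0 crates: pin exactly, since 0.x releases may break the API.
simple-llm-client = "=0.2.8"
ezllama = "=0.3.1"

# llama-runner is >=1.0, so the default caret requirement accepts
# semver-compatible updates (1.x).
llama-runner = "1.0.0"
```

Note that ezllama declares an MSRV of 1.85.0 and uses the 2024 edition, so a toolchain of at least Rust 1.85 is needed to build against it.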