# Decision Workspace
compression-prompt vs zeta-quantization vs llm-utl: a side-by-side comparison of Rust crates.
## compression-prompt (experimental, v0.1.2, health 47)
Fast statistical compression for LLM prompts: 50% token reduction with 91% quality retention.

## zeta-quantization (experimental, v0.1.0, health 45)
Advanced quantization engine for efficient LLM inference.

## llm-utl (experimental, v0.1.5, health 58)
Convert code repositories into LLM-friendly prompts with smart chunking and filtering.
## Core Metrics
| Metric | compression-prompt | zeta-quantization | llm-utl |
|---|---|---|---|
| Health Score | 47 | 45 | 58 |
| Total Downloads | 261 | 651 | 182 |
| 30d Downloads | 9 | 52 | 25 |
| Dependents | 0 | 3 | 2 |
| Releases | 2 | 1 | 6 |
| Last Updated | 141d ago | 186d ago | 100d ago |
| Age | 5m | 6m | 3m |
## Health Breakdown
| Category | compression-prompt | zeta-quantization | llm-utl |
|---|---|---|---|
| Maintenance | 10 | 5 | 20 |
| Quality | 13 | 14 | 13 |
| Community | 6 | 8 | 7 |
| Popularity | 3 | 3 | 3 |
| Documentation | 15 | 15 | 15 |
## Technical Details
| Attribute | compression-prompt | zeta-quantization | llm-utl |
|---|---|---|---|
| Version | 0.1.2 | 0.1.0 | 0.1.5 |
| Stable (≥1.0) | ✗ No | ✗ No | ✗ No |
| License | MIT | MIT OR Apache-2.0 | MIT |
| Dependencies | 14 | 6 | 18 |
| Crate Size | 258KB | 18KB | 72KB |
| Features | 4 | 1 | 2 |
| Yanked % | 0.0% | 0.0% | 0.0% |
| Edition | 2024 | 2021 | 2024 |
| MSRV | 1.85 | 1.70 | 1.85 |
| Owners | 1 | 1 | 1 |
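Since all three crates are pre-1.0 (see the Stable row above), Cargo's caret semantics matter: for a `0.x.y` version, the default requirement allows patch updates but treats a minor bump (e.g. `0.2.0`) as breaking. A minimal `Cargo.toml` sketch, using the versions from the table above (whether you would depend on all three at once is situational):

```toml
[dependencies]
# Default (caret) requirements: "0.1.2" resolves to >=0.1.2, <0.2.0,
# since Cargo treats 0.x minor bumps as semver-breaking.
compression-prompt = "0.1.2"
zeta-quantization = "0.1.0"
llm-utl = "0.1.5"
```

Pinning exactly (`=0.1.2`) is stricter but forgoes patch fixes; the default is usually the right trade-off for experimental crates.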
## Quick Verdict
- llm-utl leads with a health score of 58/100, but none of the options scores above 80.
- zeta-quantization has the most total downloads (651), suggesting wider adoption.