rusty-gradients v0.2.0 (Experimental)
Health Score: 50/100
A full-stack deep learning framework in Rust for training and deploying Transformer models. Features multi-backend support (CPU/CUDA/Metal/WASM), 62x GPU acceleration, Safetensors serialization, and BPE tokenization.
License: MIT · Edition: 2021
Quick Verdict
- ✓ Actively maintained (updated 67 days ago)
- ⚠ Pre-1.0: API may have breaking changes
- ⚠ Heavy dependency tree (26 direct dependencies)
- ✓ Permissive license (MIT)
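Because the crate is pre-1.0, a patch or minor bump can break your build. One common mitigation is pinning the exact version in Cargo.toml and upgrading deliberately; a minimal sketch (the version comes from this report, the pinning choice is an assumption about your risk tolerance):

```toml
[dependencies]
# "=0.2.0" pins the exact release; Cargo will not auto-upgrade
# past it, so pre-1.0 API breakage cannot land silently.
rusty-gradients = "=0.2.0"
```

Loosening this to `"0.2.0"` would still confine Cargo to the 0.2.x line under semver caret rules, which is a reasonable middle ground once the API settles.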
Downloads: 683
Dependents: 0
Releases: 3
Size: 129 KB
Deep Insights
Stable downloads
5 downloads in the last 30 days (0/day). Volume is roughly flat compared to the previous period.
Heavy dependency tree
26 direct dependencies. Consider the impact on compile times and supply chain complexity.
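To judge the supply-chain impact for yourself, Cargo can print the crate's direct dependency list from inside any project that depends on it. A minimal sketch using the stock `cargo tree` subcommand (the flags shown are standard Cargo options; the project context is assumed):

```shell
# List only the 26 direct dependencies of rusty-gradients,
# without expanding their transitive trees.
cargo tree --package rusty-gradients --depth 1
```

Dropping `--depth 1` shows the full transitive tree, which is the number that actually drives compile times.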
Health Breakdown
Maintenance: 15/25 (recency, release consistency, active ratio)
Quality: 13/25 (yanked ratio, deps, size, maturity, features)
Community: 6/20 (reverse deps, ownership, ecosystem)
Popularity: 3/15 (downloads, momentum, growth trend)
Documentation: 13/15 (docs, repo, license, metadata)
Download Trend
Daily downloads · last 90 days: 0/day average, down 72% vs the previous period
Version Adoption
v0.1.1: 55%
v0.1.0: 43%
v0.2.0: 3%
Release Timeline
3 releasessince 2025
J
F
M
A
M
J
J
A
S
O
N
D
2025
2
2026
1
LessMore
Feature Flags
default = ["cpu"]
Available features: cpu (default), cuda, simd, metal, candle, legacy, cpu-blas, accelerate, wasm-debug, huggingface, tokenization, data-parallel, metal-backend, serialization
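Feature flags are selected in the dependent crate's Cargo.toml. A minimal sketch of swapping the default CPU backend for others; the feature names come from the crate's published list above, but the specific combination (`cuda` plus `tokenization`) is a hypothetical choice for illustration:

```toml
[dependencies]
# default-features = false drops the implicit "cpu" feature;
# the features array then opts back in to exactly what you need.
rusty-gradients = { version = "0.2.0", default-features = false, features = ["cuda", "tokenization"] }
```

Note that Cargo unifies features across a workspace, so another dependency enabling `cpu` would re-activate it even with `default-features = false` here.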
Dependencies: 26 direct dependencies
Dependents: 0 crates depend on rusty-gradients