tllama
Health Score: 37/100
v0.1.1 · Experimental · Lightweight Local LLM Inference Engine
MIT · Edition 2024
Quick Verdict
- ⚠ Pre-1.0: API may have breaking changes
- ⚠ Heavy dependency tree (23 direct deps)
- ✓ Permissive license (MIT)
Security
Downloads
283
Dependents
0
Releases
1
Size
43KB
Deep Insights
Download activity
3 downloads in the last 30 days (0/day avg).
Heavy dependency tree
23 direct dependencies. Consider the impact on compile times and supply chain complexity.
Compact crate
At 43KB, tllama is lightweight. Small crate size correlates with focused, well-scoped functionality.
Health Breakdown
Maintenance 5/25
Recency, release consistency, active ratio
Quality 15/25
Yanked ratio, deps, size, maturity, features
Community 1/20
Reverse deps, ownership, ecosystem
Popularity 3/15
Downloads, momentum, growth trend
Documentation 13/15
Docs, repo, license, metadata
Download Trend
Daily downloads ยท last 90 days
0/day avg · +58%
Version Adoption
v0.1.1
100%
Release Timeline
1 release since 2025
Feature Flags
default = ["tpl-gtmpl", "engine-llama-cpp", "api", "chat"]
api* · chat* · hw-cuda · hw-metal · engine-hf · hw-native · hw-vulkan · tpl-gotpl · tpl-gtmpl* · tpl-minijinja · engine-llama-cpp*
(* = enabled by default)
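Since tllama is pre-1.0 with a heavy default dependency tree, consumers may want to opt out of the default features and pick only what they need. The sketch below is a hypothetical Cargo.toml fragment using the feature names listed above; whether these particular flags compose cleanly is an assumption, not something the page confirms.

```toml
# Hypothetical dependency declaration for tllama v0.1.1.
# default-features = false drops the defaults (tpl-gtmpl, engine-llama-cpp,
# api, chat); the features array then re-enables a minimal set, swapping in
# the hw-metal hardware backend. Flag names are from the crate's feature
# list; their interaction is assumed.
[dependencies]
tllama = { version = "0.1.1", default-features = false, features = [
    "api",
    "chat",
    "engine-llama-cpp",
    "hw-metal",
] }
```

Pinning to `0.1.1` is deliberate here: with a pre-1.0 crate, Cargo's default caret semantics treat any 0.1.x as compatible, but the page itself warns that the API may break before 1.0.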
README
Maintainers
Dependencies
23
direct dependencies
Dependents
0
crates depend on tllama