AXIOM TOKEN MONITOR

Context Reasoning Manager — Local LLM Token Tracking  |  Contexter Dashboard →
Service status: DB · Ollama · API

Token Summary

Uptime: --
Total Tokens: 0
Est. Cost: $0.0000
Requests: 0
Avg Headroom: --%
Avg Latency: 0 ms

Bucket Distribution

Total: 0 tokens
Bucket A — Map: 0 (max: 5%)
Bucket B — Target: 0 (max: 20%)
Bucket C — Deltas: 0 (max: 10%)
Bucket D — Compliance: 0 (max: 5%)
Reasoning Free: 0%
Context Used: 0%
Model Window: 0
Compression: None
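The bucket caps above can be read as fixed fractions of the model's context window, with the remainder left as reasoning headroom. A minimal sketch of that arithmetic, assuming the percentages shown on the dashboard (A 5%, B 20%, C 10%, D 5%); the function and bucket names are illustrative, not the monitor's actual API:

```python
# Derive per-bucket token budgets from a model's context window,
# using the max percentages shown above. Names are hypothetical.
BUCKET_CAPS = {
    "A_map": 0.05,
    "B_target": 0.20,
    "C_deltas": 0.10,
    "D_compliance": 0.05,
}

def bucket_budgets(context_window: int) -> dict[str, int]:
    """Max token budget per bucket for a given context window."""
    return {name: int(context_window * cap) for name, cap in BUCKET_CAPS.items()}

def reasoning_free(context_window: int, used: dict[str, int]) -> float:
    """Fraction of the window left free after context assembly."""
    return 1.0 - sum(used.values()) / context_window

budgets = bucket_budgets(8192)  # e.g. an 8K window
# Even at full caps (40% of the window), at least 60% stays reasoning-free.
```

With all four buckets at their caps, context assembly can consume at most 40% of the window, which is what the "Reasoning Free" and "Context Used" gauges track.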

Token Flow Over Time

Auto-refresh: 10s

Recent Context Assemblies

0 entries
Time | File | Model | Total | A | B | C | Tier | Latency | Cost | Headroom

Local LLM (Ollama) Statistics

Status: checking...
Embed Model: --
Summarize Model: --
Embed Dim: --
Compressions: 0
Embeddings: 0

Model Context Windows

Model | Context Window | Type | Bucket A (5%) | Bucket B (20%) | Bucket C (10%)