MCP Server
Scored via MCP protocol probing: initialize handshake, tools/list conformance, and ping + tool-invocation performance.

threadline

Persistent memory and context layer for AI agents. Call inject() before your LLM call and update() after. Relevance-scored injection, grant-based access control, user-owned context. Works with OpenAI, Anthropic, the Vercel AI SDK, and LangChain. < 50ms retrieval. GDPR-ready. Free tier: 2,500 calls/month.
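A minimal sketch of the inject()/update() pattern described above. The real threadline SDK's signatures are not documented on this page; the stub class, its method shapes, and the word-overlap scoring are assumptions for illustration only.

```python
class ThreadlineStub:
    """Toy in-memory stand-in for a persistent context layer.
    Hypothetical; not the real threadline client."""

    def __init__(self):
        self.memory = []  # stored exchange snippets

    def inject(self, query, top_k=3):
        # Relevance-scored injection: naive scoring by shared words,
        # standing in for whatever ranking the real service uses.
        scored = sorted(
            self.memory,
            key=lambda entry: len(set(entry.split()) & set(query.split())),
            reverse=True,
        )
        return scored[:top_k]

    def update(self, user_msg, assistant_msg):
        # Persist the exchange so later calls can retrieve it.
        self.memory.append(user_msg)
        self.memory.append(assistant_msg)


tl = ThreadlineStub()
tl.update("user prefers metric units", "noted: metric units")
# Before the LLM call, pull relevant context and prepend it to the prompt.
context = tl.inject("what units does the user prefer")
prompt = "\n".join(context) + "\nwhat units does the user prefer"
```

The pattern, not the scoring, is the point: retrieve before the model call, write back after it.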

57/100
Operational Score
Score Breakdown
Availability: 30/30
Conformance: 10/30
Performance: 17/40
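The operational score appears to be a straight sum of the three sub-scores, each capped by the weight shown in the breakdown (30 + 30 + 40 = 100). A quick check, assuming simple additive weighting:

```python
# Weights and sub-scores as shown in the breakdown above.
# The additive model is an assumption inferred from the numbers.
weights = {"availability": 30, "conformance": 30, "performance": 40}
scores = {"availability": 30, "conformance": 10, "performance": 17}

assert all(scores[k] <= weights[k] for k in weights)
total = sum(scores.values())      # 57
maximum = sum(weights.values())   # 100
```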
Key Metrics
Uptime (30d): 100.0%
P95 Latency: 305.1ms
Conformance: Fail
Trend: Stable
What's Being Tested
Availability
HTTP health check to the service endpoint
Responded with HTTP 401 in 213ms
Conformance (not tested)
MCP initialize handshake + tools/list
Performance
MCP ping + zero-arg tool invocation benchmarking
P95 latency: 305ms, task completion: 0%
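The probes above are JSON-RPC exchanges. A sketch of the messages such a probe would send: the method names (initialize, tools/list, ping) come from the MCP specification, but the exact parameters, protocol version string, and conformance checks here are assumptions, not Chiark's actual probe.

```python
from itertools import count

_ids = count(1)

def request(method, params=None):
    """Build a JSON-RPC 2.0 request envelope."""
    msg = {"jsonrpc": "2.0", "id": next(_ids), "method": method}
    if params is not None:
        msg["params"] = params
    return msg

# The three probe messages, in order.
handshake = request("initialize", {
    "protocolVersion": "2025-03-26",  # version string assumed
    "capabilities": {},
    "clientInfo": {"name": "probe", "version": "0.1"},
})
list_tools = request("tools/list")
ping = request("ping")

def conforms(tools_response):
    """Minimal tools/list check: result.tools is a list of objects
    that each carry a name and an inputSchema."""
    tools = tools_response.get("result", {}).get("tools")
    return isinstance(tools, list) and all(
        "name" in t and "inputSchema" in t for t in tools
    )
```

A real probe would send these over the server's transport and time the ping and a zero-argument tool call for the performance score.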
Recent Probe Results
Timestamp     Status   Latency  Conformance
Apr 10, 2026  success  213.4ms  Pass
Apr 10, 2026  success  332.4ms  Pass
Apr 10, 2026  success  305.1ms  Pass
Apr 10, 2026  success  216.7ms  Pass
Apr 10, 2026  success  185.2ms  Pass
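For reference, a nearest-rank P95 over just the five probe latencies shown above. The page's headline 305.1ms figure presumably comes from a larger rolling window, so this small-sample value differs:

```python
import math

latencies_ms = [213.4, 332.4, 305.1, 216.7, 185.2]

def p95(samples):
    """Nearest-rank percentile: the value at rank ceil(0.95 * n)."""
    ordered = sorted(samples)
    rank = math.ceil(0.95 * len(ordered))
    return ordered[rank - 1]

p95(latencies_ms)  # 332.4 for these five samples
```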
Source Registries: smithery
First Seen: Apr 10, 2026
Last Seen: Apr 10, 2026
Last Probed: Apr 10, 2026
threadline — Chiark Agent Quality Index