Data Flow

Agent Session

How an agent session flows from initial orientation through tool execution to memory persistence.


An agent session has three phases: Orient (deterministic context preparation), Work (tool execution loop), and Persist (memory and mechanical write-back). The Orient phase is performed by the Context Pipeline — a deterministic infrastructure component, not an agent — before any LLM tokens are consumed.


The pipeline prepares context in two steps:

Step 1 — Select (~100ms, 0 tokens)

The pipeline executes three parallel queries against the workspace:

  • FTS5 full-text search against the prompt
  • Semantic embedding search (nearest neighbours in the embedding index)
  • Wiki-link graph traversal from recently accessed pages

The output is a ranked set of candidate context items (pages, blocks, memory entries).
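The merge of the three retrieval sources into one ranked candidate set can be sketched as follows. This is a minimal illustration, not the actual pipeline code: the `Candidate` struct, the additive scoring, and the item ids are all assumptions; the real ranking function is not documented here.

```rust
use std::collections::HashMap;

/// A candidate context item (page, block, or memory entry) with a
/// retrieval score. This shape is illustrative, not the real API.
#[derive(Debug, Clone)]
struct Candidate {
    id: String,
    score: f64,
}

/// Merge results from the three parallel sources (FTS5, embedding
/// search, link-graph traversal) into one ranked list. An item found
/// by multiple sources accumulates score, so it ranks higher.
fn merge_candidates(sources: Vec<Vec<Candidate>>) -> Vec<Candidate> {
    let mut scores: HashMap<String, f64> = HashMap::new();
    for source in sources {
        for c in source {
            *scores.entry(c.id).or_insert(0.0) += c.score;
        }
    }
    let mut merged: Vec<Candidate> = scores
        .into_iter()
        .map(|(id, score)| Candidate { id, score })
        .collect();
    // Sort descending by accumulated score.
    merged.sort_by(|a, b| b.score.partial_cmp(&a.score).unwrap());
    merged
}

fn main() {
    let fts = vec![Candidate { id: "page:42".into(), score: 0.8 }];
    let emb = vec![
        Candidate { id: "page:42".into(), score: 0.6 },
        Candidate { id: "mem:7".into(), score: 0.9 },
    ];
    let links = vec![Candidate { id: "page:13".into(), score: 0.3 }];

    let ranked = merge_candidates(vec![fts, emb, links]);
    // "page:42" appears in two sources (0.8 + 0.6 = 1.4), so it ranks first.
    assert_eq!(ranked[0].id, "page:42");
}
```

Because this step is pure data manipulation, it runs in roughly 100ms and consumes zero LLM tokens.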

Step 2 — Refine (~500ms, cheap model)

The Refinement Gate uses a small, cheap LLM to rank and trim candidates to fit the context window. It produces the final SessionContext handed to the Orchestrator.
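The trim half of the Refinement Gate can be sketched deterministically. The cheap-LLM ranking itself is out of scope here; this hypothetical sketch assumes items arrive already ranked with rough token counts, and shows only the cut to a context-window budget.

```rust
/// A candidate that survived the select step, with an estimated token
/// count. Field names are assumptions for illustration.
struct RankedItem {
    text: String,
    tokens: usize,
}

/// Keep ranked items, in rank order, until the token budget is
/// exhausted. Trimming is a prefix cut: once an item does not fit,
/// nothing after it is admitted, preserving the LLM's ranking.
fn trim_to_budget(ranked: Vec<RankedItem>, budget: usize) -> Vec<RankedItem> {
    let mut used = 0;
    ranked
        .into_iter()
        .take_while(|item| {
            if used + item.tokens <= budget {
                used += item.tokens;
                true
            } else {
                false
            }
        })
        .collect()
}

fn main() {
    let ranked = vec![
        RankedItem { text: "top hit".into(), tokens: 600 },
        RankedItem { text: "second".into(), tokens: 500 },
        RankedItem { text: "third".into(), tokens: 300 },
    ];
    let ctx = trim_to_budget(ranked, 1000);
    // 600 fits; 600 + 500 exceeds 1000, so the cut happens there,
    // even though "third" alone would have fit.
    assert_eq!(ctx.len(), 1);
}
```

The prefix cut is a design choice: it keeps the Refinement Gate's ordering authoritative instead of re-packing lower-ranked items into leftover budget.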

The Context Pipeline is activated by three verbs, each with different selection strategies:

| Verb | Used By | Strategy |
| --- | --- | --- |
| select | Pipeline | Deterministic retrieval (FTS5 + embeddings + links) |
| execute | Worker | Direct tool call with known parameters |
| investigate | Researcher | Iterative exploration across multiple tool calls |

The Orchestrator delegates tasks to Worker or Researcher process types based on task structure. All tool execution is permission-gated:

  • All tools are visible in the LLM schema (the model can see every tool)
  • Enforcement happens at execution time via PermissionGuard::require(capability)
  • A denied capability returns a structured error — the agent does not crash
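The permission model above can be sketched in a few lines. This is a hypothetical reconstruction of `PermissionGuard::require(capability)`, assuming string capability names and a simple error struct; the real types are not shown on this page.

```rust
use std::collections::HashSet;

/// Structured error returned on a denied capability. The agent
/// receives this as a value and continues; it does not crash.
#[derive(Debug, PartialEq)]
struct PermissionDenied {
    capability: String,
}

/// Enforcement point for tool execution. The LLM schema exposes every
/// tool; this guard is what actually gates each call at run time.
struct PermissionGuard {
    granted: HashSet<String>,
}

impl PermissionGuard {
    fn require(&self, capability: &str) -> Result<(), PermissionDenied> {
        if self.granted.contains(capability) {
            Ok(())
        } else {
            Err(PermissionDenied { capability: capability.to_string() })
        }
    }
}

fn main() {
    let guard = PermissionGuard {
        granted: ["fs.read".to_string()].into_iter().collect(),
    };
    assert!(guard.require("fs.read").is_ok());

    // A denied capability yields a structured error, not a panic.
    let err = guard.require("fs.write").unwrap_err();
    assert_eq!(err.capability, "fs.write");
}
```

Returning `Result` rather than panicking is what lets the model observe the denial and plan around it.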

| Type | Role |
| --- | --- |
| Orchestrator | Decomposes goals into tasks; delegates to Workers/Researchers |
| Researcher | Investigates via iterative tool calls; returns structured findings |
| Worker | Executes discrete tool calls with known parameters |
| Skill Composer | Assembles and executes skill artifacts (prompt templates, DSPy modules) |
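Delegation "based on task structure" can be illustrated with a minimal rule: tasks whose parameters are fully known map to the execute verb (Worker), and open-ended tasks map to investigate (Researcher). The `Task` struct and the `params_known` flag are assumptions for the sketch; the Orchestrator's real task representation is not documented here.

```rust
/// Hypothetical shape of a decomposed task.
struct Task {
    tool: String,
    params_known: bool,
}

#[derive(Debug, PartialEq)]
enum ProcessType {
    Worker,
    Researcher,
}

/// Illustrative delegation rule: known parameters mean a direct tool
/// call (Worker / execute); unknown parameters require iterative
/// exploration (Researcher / investigate).
fn delegate(task: &Task) -> ProcessType {
    if task.params_known {
        ProcessType::Worker
    } else {
        ProcessType::Researcher
    }
}

fn main() {
    let lookup = Task { tool: "read_page".into(), params_known: true };
    let probe = Task { tool: "search".into(), params_known: false };
    assert_eq!(delegate(&lookup), ProcessType::Worker);
    assert_eq!(delegate(&probe), ProcessType::Researcher);
}
```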

After the work loop completes, the Orchestrator triggers memory persistence. Persist is:

  • Background — Does not block the user-facing response
  • Best-effort — Persistence failures are logged but do not fail the session
  • Tier-aware — Observations persist at the appropriate memory tier (Conversation, Channel, Workspace, or Account)

Memory entries are written via MemoryRepository with an LLM-assigned importance score in [0.0, 1.0]. The AgentMemoryFacade coordinates writes across tiers.
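A best-effort, tier-aware write can be sketched as below. The `MemoryEntry` fields and the `Vec` standing in for MemoryRepository are assumptions; the point is the clamp to [0.0, 1.0] and the fact that persistence never propagates an error back to the session.

```rust
/// The four memory tiers named in this page.
#[derive(Debug, PartialEq)]
enum MemoryTier {
    Conversation,
    Channel,
    Workspace,
    Account,
}

/// Illustrative entry shape; the real MemoryRepository schema is not
/// shown on this page.
struct MemoryEntry {
    tier: MemoryTier,
    text: String,
    importance: f64, // LLM-assigned, expected in [0.0, 1.0]
}

/// Best-effort persist: the importance score is clamped into range and
/// the write never returns an error to the caller, so the user-facing
/// response is never blocked. A real write could fail; the failure
/// would be logged here instead of propagated.
fn persist(entry: MemoryEntry, store: &mut Vec<MemoryEntry>) {
    let clamped = MemoryEntry {
        importance: entry.importance.clamp(0.0, 1.0),
        ..entry
    };
    store.push(clamped);
}

fn main() {
    let mut store = Vec::new();
    persist(
        MemoryEntry {
            tier: MemoryTier::Workspace,
            text: "user prefers dark mode".into(),
            importance: 1.3, // out-of-range score from the LLM
        },
        &mut store,
    );
    assert_eq!(store[0].importance, 1.0);
    assert_eq!(store[0].tier, MemoryTier::Workspace);
}
```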


| Property | Value |
| --- | --- |
| Context Pipeline latency | ~100ms (select) + ~500ms (refine) |
| Context Pipeline tokens | 0 (select phase is token-free) |
| Permission model | All tools visible; enforcement at execution via PermissionGuard |
| Memory persistence | Background, best-effort; does not block user response |
| Session storage | Conversation-scoped MemoryRepository + ScratchpadRepository |
