
Python Sidecar IPC Reference

JSON-RPC protocol specification for communication between the Rust host (dspy_daemon.rs) and the Python sidecar (apps/python-sidecar/).

Transport

The sidecar communicates via stdin/stdout, one JSON object per line (newline-delimited JSON). The Rust host writes requests to the sidecar’s stdin and reads responses from its stdout. Stderr is reserved for logging and DSPy’s internal print output.
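The framing above can be sketched in a few lines of Python. The helper names (`encode_message`, `decode_message`, `serve`) are illustrative, not the sidecar's actual API; the echo response exists only to show the one-line-in, one-line-out shape.

```python
import json
import sys

def encode_message(obj: dict) -> str:
    """Serialize one message as a single line (no embedded newlines)."""
    return json.dumps(obj, separators=(",", ":")) + "\n"

def decode_message(line: str) -> dict:
    """Parse one newline-delimited JSON message."""
    return json.loads(line)

def serve(stdin=sys.stdin, stdout=sys.stdout):
    """Minimal sidecar loop: one request line in, one response line out."""
    for line in stdin:
        line = line.strip()
        if not line:
            continue
        request = decode_message(line)
        # A real dispatcher routes on request["type"]; this just echoes it.
        response = {"type": "response", "status": "ok",
                    "result": {"echo": request.get("type")}}
        stdout.write(encode_message(response))
        stdout.flush()  # flush per line so the Rust host sees it immediately
```

Flushing after every line matters: without it, responses can sit in the stdout buffer while the host blocks waiting.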

Request Format

{
  "type": "<request_type>",
  "params": { ... }
}

All requests have a type field and an optional params object.

Response Format

Success

{
  "type": "response",
  "status": "ok",
  "result": { ... }
}

Error

{
  "type": "response",
  "status": "error",
  "error_code": "<machine_readable_code>",
  "message": "<human_readable_description>",
  "details": { ... }
}

The details field is optional and provides additional context (e.g., available templates when a template is not found).


Request Types

health_check

Verify the sidecar is running and report version/configuration status.

Request:

{ "type": "health_check" }

Response:

{
  "type": "response",
  "status": "ok",
  "result": {
    "healthy": true,
    "dspy_available": true,
    "sidecar_version": "0.1.0",
    "dspy_version": "2.6.1",
    "llm_configured": false
  }
}
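On the client side, a readiness check over this response might look like the following sketch. Treating `healthy` and `dspy_available` as the readiness criteria is an assumption; the function name is illustrative.

```python
def is_ready(response: dict) -> bool:
    """True when the sidecar reports itself healthy and DSPy importable."""
    if response.get("type") != "response" or response.get("status") != "ok":
        return False
    result = response.get("result", {})
    return bool(result.get("healthy")) and bool(result.get("dspy_available"))
```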

configure

Configure the LLM connection. Must be called before execute or optimize.

Request:

{
  "type": "configure",
  "params": {
    "model": "anthropic/claude-sonnet-4-20250514",
    "credentials": {
      "api_key": "sk-..."
    }
  }
}

Response:

{
  "type": "response",
  "status": "ok",
  "result": { "configured": true }
}

Note: In production, LLM calls are routed through the InklingsLM adapter back to Rust via IPC. The credentials field is used only for direct LLM access during development/testing.


execute

Run a template with inputs and optional saved state.

Request:

{
  "type": "execute",
  "params": {
    "template_id": "chain_of_thought",
    "signature": "question -> answer",
    "params": {
      "question": "What is the capital of France?"
    },
    "state_blob": null
  }
}

| Field | Type | Required | Description |
| --- | --- | --- | --- |
| template_id | string | yes | ID from the template registry |
| signature | string | no | DSPy signature (overrides template default) |
| params | object | no | Input fields matching the signature |
| state_blob | object/null | no | Previously saved state to restore before execution |

Success Response:

{
  "type": "response",
  "status": "ok",
  "result": {
    "answer": "Paris"
  },
  "state_blob": { "...": "serialized DSPy module state" }
}

The state_blob in the response contains the module’s state after execution. Store it to restore the same state in future calls.
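A client-side round trip over two execute calls can be sketched as below: the state_blob returned by the first call is fed back into the second. `make_execute`, `record_response`, and the in-memory `saved_state` are illustrative; a real host would persist the blob durably.

```python
saved_state = None  # carries module state between calls; persist in practice

def make_execute(template_id: str, inputs: dict) -> dict:
    """Build an execute request, restoring any previously saved state."""
    return {
        "type": "execute",
        "params": {
            "template_id": template_id,
            "params": inputs,
            "state_blob": saved_state,  # null on the very first call
        },
    }

def record_response(response: dict) -> None:
    """Store the post-execution state for the next make_execute call."""
    global saved_state
    if response.get("status") == "ok":
        saved_state = response.get("state_blob")
```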

Error Responses:

| Error Code | Condition |
| --- | --- |
| llm_not_configured | No configure message received yet |
| missing_field | template_id not provided |
| unknown_template | template_id not in registry (includes available list in details) |
| invalid_state | state_blob could not be deserialized |
| execution_failed | Template forward() raised an exception |

optimize

Run a DSPy optimizer on a template to produce an improved state blob.

Request:

{
  "type": "optimize",
  "params": {
    "template_id": "predict",
    "optimizer": "BootstrapFewShot",
    "signature": "input -> output",
    "examples": [
      { "input": "hello", "output": "world" }
    ]
  }
}

| Field | Type | Required | Description |
| --- | --- | --- | --- |
| template_id | string | yes | Template to optimize |
| optimizer | string | yes | DSPy optimizer class name (must be in allowlist) |
| signature | string | no | DSPy signature |
| examples | array | yes | Training examples for the optimizer |

Allowed optimizers: BootstrapFewShot, BootstrapFewShotWithRandomSearch, LabeledFewShot, COPRO, MIPRO, MIPROv2, SignatureOptimizer, BayesianSignatureOptimizer, BootstrapFinetune, KNNFewShot

Error Responses:

| Error Code | Condition |
| --- | --- |
| unsupported_optimizer | Optimizer name not in allowlist |
| optimizer_not_found | Optimizer class not found in DSPy |
| execution_failed | Optimizer raised an exception |

manifest

List all available templates with their metadata.

Request:

{ "type": "manifest" }

Response:

{
  "type": "response",
  "status": "ok",
  "result": {
    "templates": {
      "predict": {
        "description": "Basic DSPy Predict -- single LLM call with signature",
        "signature": "input -> output",
        "parameters": {}
      },
      "chain_of_thought": {
        "description": "Chain of Thought -- step-by-step reasoning before output",
        "signature": "input -> output",
        "parameters": {}
      }
    }
  }
}
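A host might flatten this response into a template-to-signature map before deciding what to execute; the sketch below is illustrative and assumes the response shape shown above.

```python
# Sample manifest response, mirroring the structure above.
manifest_response = {
    "type": "response",
    "status": "ok",
    "result": {
        "templates": {
            "predict": {
                "description": "Basic DSPy Predict -- single LLM call with signature",
                "signature": "input -> output",
                "parameters": {},
            },
            "chain_of_thought": {
                "description": "Chain of Thought -- step-by-step reasoning before output",
                "signature": "input -> output",
                "parameters": {},
            },
        }
    },
}

def template_signatures(response: dict) -> dict:
    """Map template_id -> default signature from a manifest response."""
    templates = response["result"]["templates"]
    return {tid: meta["signature"] for tid, meta in templates.items()}
```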

Error Codes Reference

| Code | Description |
| --- | --- |
| unknown_template | Template ID not found in registry |
| missing_field | Required field not present in request |
| invalid_state | State blob deserialization failed |
| unsupported_optimizer | Optimizer not in the allowed set |
| optimizer_not_found | Optimizer class not available in DSPy |
| execution_failed | Runtime error during template execution |
| unknown_request_type | Unrecognized request type field |
| sandbox_required | Template requires sandbox execution (not available) |
| llm_not_configured | LLM not configured — send configure first |

Lifecycle

  1. Rust spawns the sidecar process with kill_on_drop(true) and sanitized environment
  2. Rust sends configure with LLM credentials (or InklingsLM IPC mode)
  3. Rust sends health_check to verify readiness
  4. Rust sends execute / optimize / manifest as needed
  5. Sidecar auto-shuts down after DEFAULT_IDLE_TIMEOUT (5 minutes) of inactivity
  6. RESPONSE_TIMEOUT (60 seconds) bounds wall-clock time per request
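The idle-timeout behavior in step 5 can be sketched with asyncio, which matches main.py's async stdin/stdout loop. The constant value mirrors the text; the function name and loop shape are assumptions, not the actual implementation.

```python
import asyncio

DEFAULT_IDLE_TIMEOUT = 300.0  # seconds (5 minutes), per the lifecycle above

async def read_with_idle_timeout(reader: asyncio.StreamReader,
                                 timeout: float = DEFAULT_IDLE_TIMEOUT):
    """Wait for the next request line; None signals idle-timeout shutdown."""
    try:
        return await asyncio.wait_for(reader.readline(), timeout=timeout)
    except asyncio.TimeoutError:
        return None  # no traffic within the window: exit the serve loop
```

Pairing this with kill_on_drop(true) on the Rust side means neither process can outlive the other indefinitely: the host reaps the sidecar on drop, and the sidecar exits on its own if the host goes quiet.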

Key Files

| File | Role |
| --- | --- |
| crates/infrastructure/agent-harness/src/dspy_daemon.rs | Rust-side daemon management and IPC |
| apps/python-sidecar/main.py | Python-side async stdin/stdout loop |
| apps/python-sidecar/src/inklings/dspy/dispatcher.py | Request routing |
| apps/python-sidecar/src/inklings/dspy/errors.py | Error codes and structured error responses |
