# Dual-Output Logging Infrastructure with Command Spans and Correlation
## Problem
Building a desktop application with Tauri requires comprehensive logging for production debugging. The challenges include:
- Dual Output Need: Developers need human-readable console output during development, while production debugging requires machine-parseable JSON logs for analysis
- Frontend/Backend Correlation: Operations span both TypeScript frontend and Rust backend - logs need to be correlated
- Security: File paths in logs can expose sensitive filesystem structure
- Retention Management: Log files need automatic rotation and cleanup
Symptoms:
- Unable to trace operations across frontend/backend boundary
- Logs missing context when debugging production issues
- File paths in logs exposing user home directories
- Log files accumulating indefinitely
## Investigation

### Steps Tried
- `tauri-plugin-log` - Uses the `fern`/`log` crate; would require migrating away from the existing `tracing` setup
- Console-only logging - Insufficient for production debugging; logs are lost on app restart
- Simple file logging - Missing the structured format needed for analysis
### What Worked

The `tracing` + `tracing-appender` combination with layered subscribers provides both human-readable console output and structured JSON file output from a single logging configuration.
## Root Cause
Desktop applications need different logging strategies for development vs. production:
- Development: Quick visual scanning of console output
- Production: Structured data for filtering, searching, and correlation
The Rust `tracing` ecosystem supports this through its layer-based architecture, where multiple output targets can share a single filter configuration.
## Solution

### Architecture Overview
```
┌──────────────────────────────────────────────────────┐
│ Application Startup (main.rs)                        │
├──────────────────────────────────────────────────────┤
│ 1. Detect paths (dev/.data vs production paths)      │
│ 2. logging::init(&paths)                             │
│    ├── Prepare logs directory with 0o700 perms       │
│    ├── Create daily-rotating file appender           │
│    └── Initialize dual layers (console + JSON file)  │
│ 3. Load settings (includes log_retention_days)       │
│ 4. logging::cleanup_old_logs(retention_days)         │
└──────────────────────────────────────────────────────┘
```

### Dependencies
```toml
# Cargo.toml (workspace)
tracing = "0.1"
tracing-subscriber = { version = "0.3", features = ["env-filter", "json"] }
tracing-appender = "0.2"
```

package.json (frontend):

```json
"loglevel": "^1.9.2"
```

### Backend Logging (Rust)
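The `init` function below calls a `prepare_logs_directory` helper whose body is not shown in this document. A plausible std-only sketch follows; the function name and the 0o700 mode come from the code and architecture overview above, but the exact implementation in the codebase may differ:

```rust
use std::fs;
use std::io;
use std::path::Path;

/// Create the logs directory if needed and restrict it to the owner (0o700).
/// Sketch only; the real helper in logging.rs may differ.
pub fn prepare_logs_directory(logs_dir: &Path) -> io::Result<()> {
    fs::create_dir_all(logs_dir)?;

    // Owner-only permissions are a Unix concept; skip them elsewhere.
    #[cfg(unix)]
    {
        use std::os::unix::fs::PermissionsExt;
        fs::set_permissions(logs_dir, fs::Permissions::from_mode(0o700))?;
    }

    Ok(())
}
```

Returning `io::Result` lets `init` degrade gracefully (console-only logging) instead of panicking when the directory cannot be created.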
File: `apps/desktop/src-tauri/src/logging.rs`

```rust
use tracing_appender::non_blocking::WorkerGuard;
use tracing_appender::rolling::{RollingFileAppender, Rotation};
use tracing_subscriber::fmt::format::FmtSpan;
use tracing_subscriber::{EnvFilter, fmt, layer::SubscriberExt, util::SubscriberInitExt};

pub fn init(paths: &AppPaths) -> WorkerGuard {
    let logs_dir = paths.logs_dir();

    // Prepare directory with secure permissions
    if let Err(e) = prepare_logs_directory(&logs_dir) {
        eprintln!("Warning: {}. File logging may be unavailable.", e);
    }

    // Create file appender with daily rotation
    let file_appender = RollingFileAppender::new(Rotation::DAILY, &logs_dir, "inklings.log");
    let (non_blocking, guard) = tracing_appender::non_blocking(file_appender);

    // Environment filter: info in debug, warn in release
    let env_filter = EnvFilter::try_from_default_env().unwrap_or_else(|_| {
        if cfg!(debug_assertions) {
            EnvFilter::new("info")
        } else {
            EnvFilter::new("warn")
        }
    });

    // Console layer: human-readable
    let console_layer = fmt::layer()
        .with_target(true)
        .with_thread_ids(false)
        .with_file(false)
        .with_line_number(false);

    // File layer: JSON format
    let file_layer = fmt::layer()
        .json()
        .with_writer(non_blocking)
        .with_target(true)
        .with_span_events(FmtSpan::CLOSE)
        .with_current_span(true);

    tracing_subscriber::registry()
        .with(env_filter)
        .with(console_layer)
        .with(file_layer)
        .init();

    guard // CRITICAL: must be kept alive for the app's lifetime
}
```

### Command Spans for Correlation
File: `apps/desktop/src-tauri/src/context.rs`

```rust
use std::sync::atomic::{AtomicU64, Ordering};
use tracing::Span;

static OPERATION_COUNTER: AtomicU64 = AtomicU64::new(1);

/// Create a span for a Tauri command with an auto-generated op_id
pub fn command_span(command: &str) -> Span {
    let op_id = OPERATION_COUNTER.fetch_add(1, Ordering::Relaxed);
    tracing::info_span!("command", op_id = op_id, cmd = command)
}

/// Create a span with a frontend-provided op_id for end-to-end correlation
pub fn command_span_with_id(command: &str, frontend_op_id: Option<String>) -> Span {
    let op_id = frontend_op_id.unwrap_or_else(|| {
        OPERATION_COUNTER.fetch_add(1, Ordering::Relaxed).to_string()
    });
    tracing::info_span!("command", op_id = %op_id, cmd = command)
}
```

### Usage in Tauri Commands
```rust
#[tauri::command]
pub fn create_page(
    state: State<AppState>,
    title: String,
    parent_slug: Option<String>,
) -> Result<CreatePageResult, CommandError> {
    let _span = command_span("create_page").entered();
    tracing::info!(title = %title, parent_slug = ?parent_slug, "page_create_started");

    // ... operation logic ...

    tracing::info!(slug = %response.slug, "page_create_success");
    Ok(result)
}
```

### Path Sanitization
```rust
/// Prevent exposing full filesystem paths in production logs
pub fn sanitize_path(path: &Path) -> String {
    if cfg!(debug_assertions) {
        return path.display().to_string(); // Full path in dev
    }

    // Release: only show the last 2 components
    let components: Vec<_> = path.components().collect();
    if components.len() <= 2 {
        return path.display().to_string();
    }

    let last_two: std::path::PathBuf = components
        .iter()
        .skip(components.len().saturating_sub(2))
        .collect();
    format!(".../{}", last_two.display())
}
```

### Frontend Logging (TypeScript)
File: `apps/desktop/src-react/lib/logger.ts`

```typescript
import log from 'loglevel';

const rootLogger = log.getLogger('inklings');
rootLogger.setLevel(import.meta.env.DEV ? 'debug' : 'warn');

export interface LogContext {
  event: string;
  op_id?: string;
  [key: string]: unknown;
}

function formatLog(level: string, ctx: LogContext): string {
  return JSON.stringify({
    timestamp: new Date().toISOString(),
    level,
    svc: 'inklings-ui',
    ...ctx,
  });
}

export const logger = {
  debug: (ctx: LogContext): void => rootLogger.debug(formatLog('debug', ctx)),
  info: (ctx: LogContext): void => rootLogger.info(formatLog('info', ctx)),
  warn: (ctx: LogContext): void => rootLogger.warn(formatLog('warn', ctx)),
  error: (ctx: LogContext): void => rootLogger.error(formatLog('error', ctx)),
  newOpId: (): string => crypto.randomUUID(),
};
```

### Log Retention Configuration
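The startup flow above calls `logging::cleanup_old_logs(retention_days)`, whose body this document does not show. A std-only sketch of what such a sweep might look like; the signature and exact behavior (mtime-based, non-recursive) are assumptions for illustration:

```rust
use std::fs;
use std::path::Path;
use std::time::{Duration, SystemTime};

/// True if a file's modification time is older than the retention window.
fn is_expired(modified: SystemTime, now: SystemTime, retention_days: u64) -> bool {
    match now.duration_since(modified) {
        Ok(age) => age > Duration::from_secs(retention_days * 24 * 60 * 60),
        Err(_) => false, // mtime in the future (clock skew): keep the file
    }
}

/// Delete rotated log files older than `retention_days`. Sketch only;
/// the real cleanup_old_logs in logging.rs may differ.
pub fn cleanup_old_logs(logs_dir: &Path, retention_days: u64) -> std::io::Result<usize> {
    let now = SystemTime::now();
    let mut removed = 0;
    for entry in fs::read_dir(logs_dir)? {
        let entry = entry?;
        let meta = entry.metadata()?;
        if meta.is_file() && is_expired(meta.modified()?, now, retention_days) {
            fs::remove_file(entry.path())?;
            removed += 1;
        }
    }
    Ok(removed)
}
```

Keeping the age check in a separate `is_expired` helper makes the retention logic unit-testable without touching the filesystem.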
File: `crates/domain/src/settings.rs`

```rust
pub const MIN_LOG_RETENTION_DAYS: u32 = 1;
pub const MAX_LOG_RETENTION_DAYS: u32 = 365;

impl Settings {
    pub fn set_log_retention_days(&mut self, days: u32) {
        self.log_retention_days = days.clamp(MIN_LOG_RETENTION_DAYS, MAX_LOG_RETENTION_DAYS);
    }
}
```

### Log Schemas
Backend JSON (file output):

```json
{
  "timestamp": "2024-01-15T10:30:00.000Z",
  "level": "INFO",
  "target": "inklings_desktop::commands::page",
  "fields": { "op_id": 42, "cmd": "create_page", "slug": "my-page" },
  "spans": [{ "op_id": 42, "cmd": "create_page", "name": "command" }]
}
```

Frontend JSON (browser console):

```json
{
  "timestamp": "2024-01-15T10:30:00.000Z",
  "level": "info",
  "svc": "inklings-ui",
  "event": "page_create_started",
  "op_id": "uuid-string"
}
```

## Prevention
### Best Practices
- Always use command spans for mutating operations
- Follow the event naming convention `domain_action_status` (e.g., `page_create_started`)
- Sanitize paths before logging: `tracing::info!(path = %sanitize_path(&path), ...)`
- Keep the `WorkerGuard` alive for the entire application lifetime
- Use structured fields instead of string interpolation
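The `WorkerGuard` bullet hides a Rust gotcha: `let _ = logging::init(...)` drops the guard immediately, because the `_` wildcard pattern does not bind, while `let _guard = ...` holds it until the end of scope. Dropping the guard early means `tracing-appender`'s background writer shuts down and buffered log lines are lost. A toy illustration with a `Drop` type standing in for the guard (not the actual tracing-appender internals):

```rust
use std::cell::RefCell;
use std::rc::Rc;

type Events = Rc<RefCell<Vec<&'static str>>>;

/// Toy stand-in for WorkerGuard: records its name when dropped.
struct Guard {
    name: &'static str,
    events: Events,
}

impl Drop for Guard {
    fn drop(&mut self) {
        // In tracing-appender, this is where buffered writes get flushed.
        self.events.borrow_mut().push(self.name);
    }
}

fn demo() -> Vec<&'static str> {
    let events: Events = Rc::new(RefCell::new(Vec::new()));

    let _ = Guard { name: "wildcard", events: events.clone() }; // dropped right here!
    let _guard = Guard { name: "held", events: events.clone() }; // lives to end of demo

    events.borrow_mut().push("work");
    let snapshot = events.borrow().clone();
    snapshot // "held" has not dropped yet; only "wildcard" and "work" were recorded
}
```

Calling `demo()` shows the wildcard guard dropping before the work happens, which is exactly the failure mode the best practice warns against.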
### Event Naming Convention
| Pattern | Example |
|---|---|
| `noun_verb_started` | `page_create_started` |
| `noun_verb_success` | `workspace_open_success` |
| `noun_verb_failed` | `import_file_failed` |
| `noun_action_completed` | `log_cleanup_completed` |
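The convention in the table can be captured in a tiny helper; this is a hypothetical function for illustration, not something in the codebase (the commands above write event names inline):

```rust
/// Build an event name following the `domain_action_status` convention.
/// Hypothetical helper; the codebase writes event names inline.
pub fn event_name(domain: &str, action: &str, status: &str) -> String {
    format!("{}_{}_{}", domain, action, status)
}
```

A helper like this (or a lint over log output) makes it harder for one-off names such as `created_page` to slip into the logs and break queries.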
### Warning Signs
- Logs missing the `op_id` field - check that `command_span` is being used
- Full file paths appearing in production logs - use `sanitize_path()`
- Log files growing indefinitely - check retention settings
- Unable to correlate frontend/backend logs - pass `op_id` from the frontend
## Log File Locations

| Environment | Directory |
|---|---|
| Development | `{project}/.data/.inklings/logs/` |
| Production | `~/Inklings/.inklings/logs/` |
## References

- Commit: `eaeeb07` feat(logging): add production-ready dual-output logging infrastructure
- Files:
  - `apps/desktop/src-tauri/src/logging.rs`
  - `apps/desktop/src-tauri/src/context.rs`
  - `apps/desktop/src-react/lib/logger.ts`
  - `crates/domain/src/settings.rs`