The pattern here is one stream per task, job, or session. A producer appends events as they happen, and consumers read them in real time. The same read API serves both historical and live data — when a reader catches up to the tail, the connection stays open and new records arrive as they’re appended.
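To make the catch-up-then-tail semantics concrete, here is a minimal in-memory sketch. The `Stream` class and its methods are hypothetical — they illustrate the read behavior described above (serve history first, then block for live appends on the same call), not a real client API.

```python
import threading
from typing import Iterator, List


class Stream:
    """In-memory sketch of a single task/job stream (hypothetical)."""

    def __init__(self) -> None:
        self._records: List[str] = []
        self._cond = threading.Condition()
        self._closed = False

    def append(self, record: str) -> None:
        """Producer side: append an event and wake any tailing readers."""
        with self._cond:
            self._records.append(record)
            self._cond.notify_all()

    def close(self) -> None:
        """Signal that no more records will arrive."""
        with self._cond:
            self._closed = True
            self._cond.notify_all()

    def read(self, start: int = 0) -> Iterator[str]:
        """One API for both modes: yield historical records from `start`,
        then, once caught up to the tail, block until new records arrive."""
        pos = start
        while True:
            with self._cond:
                # Caught up to the tail: stay "connected" and wait.
                while pos >= len(self._records) and not self._closed:
                    self._cond.wait()
                if pos >= len(self._records) and self._closed:
                    return
                record = self._records[pos]
            pos += 1
            yield record
```

A reader that starts after some records exist sees the backlog first, then receives live appends from the same `read()` call — no separate "historical" and "live" code paths.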
If you run sandboxed execution environments — CI/CD runners, browsers-as-a-service, coding sandboxes — your customers likely want to see what’s happening in real time. Store each job’s output in its own stream. When a user kicks off a job, create a read-only access token scoped to that stream and return it alongside the job ID. The executor appends logs directly; the customer reads and tails them live. See the Browser Infra demo for a live example of this pattern — click into tasks to watch their progress in real time.
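The job-kickoff flow above can be sketched as follows. The stream naming scheme (`jobs/<id>/logs`), the token claims, and the HMAC signing are all illustrative assumptions — the point is that the token encodes a read-only scope bound to exactly one stream, so it can be handed to the customer safely.

```python
import hashlib
import hmac
import json

# Hypothetical server-side signing secret; never shared with clients.
SIGNING_KEY = b"server-side-secret"


def create_job(job_id: str) -> dict:
    """Create a per-job stream name and mint a read-only token scoped to it.
    Returned alongside the job ID so the customer can tail logs directly."""
    stream = f"jobs/{job_id}/logs"
    claims = {"stream": stream, "ops": ["read"]}  # read-only, single stream
    payload = json.dumps(claims, sort_keys=True)
    sig = hmac.new(SIGNING_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"job_id": job_id, "stream": stream, "read_token": f"{payload}.{sig}"}


def verify(token: str) -> dict:
    """Check the token's signature and return its claims (illustrative)."""
    payload, sig = token.rsplit(".", 1)
    expected = hmac.new(SIGNING_KEY, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        raise PermissionError("invalid token")
    return json.loads(payload)
```

The executor keeps full append rights via its own credentials; the customer's token can only read and tail that one job's stream.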
The same pattern applies to build pipelines: stream build output to users as it happens. Delete-on-empty can clean up streams automatically once they’ve been fully consumed and are no longer needed.
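A rough sketch of delete-on-empty semantics, as inferred from the text: once every appended record has been trimmed (i.e. fully consumed), the stream removes itself. The class, registry, and `trim` offset API are hypothetical, not a real client surface.

```python
class EphemeralStream:
    """Sketch of a build-log stream with delete-on-empty (hypothetical)."""

    def __init__(self, name: str, registry: dict) -> None:
        self.name = name
        self._registry = registry
        self._appended = 0    # total records ever appended
        self._trim_point = 0  # records before this offset are consumed
        registry[name] = self

    def append(self, record: str) -> None:
        self._appended += 1

    def trim(self, upto: int) -> None:
        """Acknowledge consumption up to offset `upto` (exclusive).
        When the trim point reaches the tail, the stream is empty
        and deletes itself from the registry."""
        self._trim_point = max(self._trim_point, upto)
        if self._trim_point >= self._appended:
            del self._registry[self.name]
```

This keeps cleanup automatic: no cron job or explicit delete call per build, just trimming as output is consumed.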