Say you’re an infra provider: you run sandboxed execution environments, CI/CD runners, browsers-as-a-service, that kind of thing. Your customers kick off jobs on your platform and want to watch logs from their running environments in real time. Maybe you ingest into ClickHouse or similar for historical analytics, but customers still need to follow their live jobs as they run. You need to provide dashboards that let them peek into the state of all their N running jobs and M previous jobs. S2 streams are perfect for this. Store the logs of each job in its own stream:
s2://infra-corp/jobs/user-12345/job-12345
When a user kicks off a job, that call schedules the execution and creates a new read-only S2 token scoped to just that job’s stream. Return the job ID and token to the user. Meanwhile, as the job runs, the executor appends logs directly to the stream. The customer can read and tail the logs in real time using their token, with access to both live and historical data through the same API.
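The flow above can be sketched end to end. This is a minimal, self-contained illustration that uses an in-memory stand-in for S2 rather than a real client; the function names (`start_job`, `append_log`, `read_logs`) and token scheme are hypothetical, chosen only to show the shape of scoped read-only tokens plus append/read on a per-job stream.

```python
import secrets

# In-memory stand-in for S2: each stream is an append-only list of records.
streams: dict[str, list[str]] = {}
# Read-only tokens, each scoped to exactly one stream.
tokens: dict[str, str] = {}

def start_job(user_id: str, job_id: str) -> tuple[str, str]:
    """Schedule a job: create its stream and a read-only token scoped to it."""
    stream = f"s2://infra-corp/jobs/{user_id}/{job_id}"
    streams[stream] = []
    token = secrets.token_urlsafe(16)
    tokens[token] = stream
    return job_id, token  # returned to the user

def append_log(stream: str, record: str) -> None:
    """Executor path: append a log record to the job's stream."""
    streams[stream].append(record)

def read_logs(token: str, stream: str, from_seq: int = 0) -> list[str]:
    """Customer path: read from a sequence offset; a token scoped to a
    different stream is rejected, so one token leaks nothing else."""
    if tokens.get(token) != stream:
        raise PermissionError("token not scoped to this stream")
    return streams[stream][from_seq:]

job_id, token = start_job("user-12345", "job-12345")
stream = f"s2://infra-corp/jobs/user-12345/{job_id}"
append_log(stream, "job started")
append_log(stream, "step 1 complete")
print(read_logs(token, stream))               # full history so far
print(read_logs(token, stream, from_seq=2))   # tail from the latest offset
```

Tailing live output is the same read repeated from the last-seen offset; because both live and historical reads go through one API, the dashboard needs no special-casing for finished versus running jobs.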

See it in action

Check out our live demo of this pattern: you can click into tasks and view their progress while they run.