Real-time logs and observability for serverless job execution
Say you’re an infra provider - you run sandboxed execution environments, CI/CD runners, browsers-as-a-service, that kind of thing. Your customers kick off jobs on your platform and want to see what’s happening in real time, with logs from their running environments. Maybe you ingest into ClickHouse or similar for historical analytics, but customers still need to follow their live jobs as they run. You need to provide dashboards that let them peek into the state of all their N running jobs and M previous jobs.

S2 streams are perfect for this. Store the logs of each job in its own stream:
s2://infra-corp/jobs/user-12345/job-12345
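A tiny sketch of that naming convention (the helper name and `basin` parameter are assumptions for illustration, not part of the S2 SDK): deriving the per-job stream path from the customer and job IDs keeps each job's logs in an isolated stream.

```python
# Hypothetical helper (not from the S2 SDK): build the per-job stream
# path so each job's logs land in their own isolated stream.
def job_stream(basin: str, user_id: str, job_id: str) -> str:
    return f"s2://{basin}/jobs/{user_id}/{job_id}"

print(job_stream("infra-corp", "user-12345", "job-12345"))
# → s2://infra-corp/jobs/user-12345/job-12345
```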
When a user kicks off a job, that call schedules the execution and creates a new read-only S2 token scoped to just that job’s stream. Return the job ID and token to the user.

Meanwhile, as the job runs, the executor appends logs directly to the stream. The customer can read and tail the logs in real time using their token, with access to both live and historical data through the same API.
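The whole flow can be sketched with an in-memory stand-in (this is not the S2 API - the `Platform` and `Stream` classes and their methods are assumptions that mimic the described behavior): starting a job mints a read-only token scoped to one stream, the executor appends records, and a reader holding the token sees historical records and new ones through the same read call.

```python
import secrets

class Stream:
    """In-memory stand-in for a single job's log stream."""

    def __init__(self):
        self.records: list[str] = []

    def append(self, record: str) -> int:
        self.records.append(record)
        return len(self.records) - 1  # sequence number of the record

    def read(self, start_seq: int = 0) -> list[str]:
        # Historical and live reads share one call: everything
        # from start_seq onward that has been appended so far.
        return self.records[start_seq:]

class Platform:
    """Schedules jobs and issues per-stream read-only tokens."""

    def __init__(self):
        self.streams: dict[str, Stream] = {}
        self.tokens: dict[str, str] = {}  # token -> the one stream it can read

    def start_job(self, user_id: str, job_id: str) -> tuple[str, str]:
        stream = f"s2://infra-corp/jobs/{user_id}/{job_id}"
        self.streams[stream] = Stream()
        token = secrets.token_hex(8)
        self.tokens[token] = stream  # scope: just this job's stream
        return job_id, token

    def append(self, stream: str, record: str) -> int:
        # The executor writes directly to the job's stream.
        return self.streams[stream].append(record)

    def read(self, token: str, start_seq: int = 0) -> list[str]:
        stream = self.tokens[token]  # the token authorizes only its stream
        return self.streams[stream].read(start_seq)

platform = Platform()
job_id, token = platform.start_job("user-12345", "job-42")
stream = f"s2://infra-corp/jobs/user-12345/{job_id}"

platform.append(stream, "build started")   # executor writes...
print(platform.read(token))                # ...customer reads history
platform.append(stream, "tests passed")    # job keeps running
print(platform.read(token, start_seq=1))   # tail only new records
```

The key design point the sketch illustrates: because the token maps to exactly one stream, handing it to the customer exposes that job's logs and nothing else, and the same read path serves both catch-up and tailing.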