Performant support for petabytes of streaming data
Our product philosophy at Orb has always been to reduce the data infrastructure your team needs to build. We've delivered on that with a fully extensible SQL layer for configuring metrics, so you don't have to maintain that business logic outside of Orb. In many cases, that means all you need to send Orb is a raw stream of events — if you serve a large customer base or handle a lot of traffic, you should be able to use Orb without any aggregation or transformations up front.
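To illustrate the idea of defining a metric as SQL over raw events, here's a minimal sketch using an in-memory SQLite table as a stand-in for the ingested event stream. The table layout and field names (`customer_id`, `event_name`, `compute_ms`) are invented for this example; Orb's actual metric SQL operates on the event schema you ingest.

```python
import sqlite3

# Toy events table standing in for a raw event stream.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE events (customer_id TEXT, event_name TEXT, compute_ms INTEGER)"
)
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [
        ("cust-1", "api_request", 120),
        ("cust-1", "api_request", 80),
        ("cust-2", "api_request", 40),
        ("cust-2", "batch_job", 500),
    ],
)

# A metric expressed as plain SQL over raw events: total compute time per
# customer for one event type. The billing logic lives in the query itself,
# not in a preprocessing pipeline you have to run before sending data.
metric_sql = """
    SELECT customer_id, SUM(compute_ms) AS total_compute_ms
    FROM events
    WHERE event_name = 'api_request'
    GROUP BY customer_id
    ORDER BY customer_id
"""
rows = conn.execute(metric_sql).fetchall()
# rows == [("cust-1", 200), ("cust-2", 40)]
```

Because the aggregation is declared in SQL, changing the metric (say, counting requests instead of summing compute time) is a query edit rather than a change to your ingestion pipeline.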
Whereas Orb's ingestion API is meant for services that emit tens of thousands of events a minute, this new functionality is built to support hundreds of thousands of events a second.
Orb's streaming aggregation functionality requires you to set up an S3 bucket in your account — we handle listening for updates to the bucket (no specific path structure required!), de-duplicating and rolling up the data, and ingesting events directly into your Orb account.
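On the producer side, this setup reduces to writing flat event files into the bucket. The sketch below serializes a batch of events as gzipped newline-delimited JSON and shows the corresponding S3 upload. Everything here is an assumption for illustration — the bucket name, object key, and event fields are hypothetical; see Orb's high-throughput docs for the actual required format.

```python
import gzip
import json
import uuid
from datetime import datetime, timezone

def to_ndjson(events):
    """Serialize events as newline-delimited JSON, a typical flat-file
    format for S3-based ingestion pipelines."""
    lines = "\n".join(json.dumps(e, separators=(",", ":")) for e in events)
    return (lines + "\n").encode()

events = [
    {
        # Hypothetical event shape — consult Orb's docs for the real fields.
        "idempotency_key": str(uuid.uuid4()),
        "external_customer_id": "customer-123",
        "event_name": "api_request",
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "properties": {"region": "us-east-1", "compute_ms": 42},
    },
]

payload = gzip.compress(to_ndjson(events))

# Uploading is then a plain S3 put. Since Orb listens for updates to the
# bucket with no required path structure, any key works:
#
# import boto3
# boto3.client("s3").put_object(
#     Bucket="acme-orb-events",            # hypothetical bucket in your account
#     Key=f"events/{uuid.uuid4()}.json.gz",
#     Body=payload,
# )
```

De-duplication on Orb's side is why each event carries an idempotency key: if the same file (or event) lands in the bucket twice, it's only counted once.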
To learn more about what's required for this setup, see the docs on high throughput workloads.