Azure-docs: Using capture for replay

Created on 13 Sep 2018 · 9 comments · Source: MicrosoftDocs/azure-docs

Someone else left this same comment, but it was closed with no change.
I would like to know how to use Capture and be able to replay events from the Capture location.


Document Details

⚠ Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.

Pri1 assigned-to-author doc-enhancement event-hubsvc triaged

All 9 comments

@kevinfms01 Thanks for the feedback! We are currently investigating and will update you shortly.

@ShubhaVijayasarathy could you take a look at this feedback?
We had a similar request:
https://github.com/MicrosoftDocs/azure-docs/issues/8422

@kevinfms01 Thank you for your question! Could you elaborate on what you mean by replaying Captured events? After data is streamed into your event hub and captured as Avro files to a storage location, the data stream is still retained within your event hub according to your retention policy and can be read by other subscribers. You can also process the captured Avro files from your persistent storage destination.
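For anyone locating those captured files: Capture writes Avro blobs using a configurable naming convention, by default `{Namespace}/{EventHub}/{PartitionId}/{Year}/{Month}/{Day}/{Hour}/{Minute}/{Second}`. A minimal sketch (assuming the default convention; the container and blob names below are made up) of parsing a blob path to find the files for a given partition and time window:

```python
from datetime import datetime, timezone

def parse_capture_path(blob_name: str) -> dict:
    """Parse a blob name written by the default Event Hubs Capture
    naming convention:
    {Namespace}/{EventHub}/{PartitionId}/{Year}/{Month}/{Day}/{Hour}/{Minute}/{Second}
    """
    if blob_name.endswith(".avro"):
        blob_name = blob_name[: -len(".avro")]
    parts = blob_name.split("/")
    year, month, day, hour, minute, second = (int(p) for p in parts[3:9])
    return {
        "namespace": parts[0],
        "event_hub": parts[1],
        "partition_id": parts[2],
        # Start of the capture window this blob covers, as UTC.
        "window_start": datetime(year, month, day, hour, minute, second,
                                 tzinfo=timezone.utc),
    }

info = parse_capture_path("myns/myhub/0/2018/09/13/10/05/00.avro")
print(info["partition_id"], info["window_start"].isoformat())
```

With this you can list the blobs in your capture container and keep only those whose `window_start` falls inside the range you want to reprocess.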

For example: I have Capture set up for my event hub and, for some reason, want to go back in time beyond 7 days and replay the events back through the event hub (say my downstream system was unable to process the original events correctly).

@xurui203 Any update? Thanks.

@kevinfms01 Event hubs are not intended as a permanent data store, so if you wish to replay data from more than 7 days ago you can read the captured data that has been written to a more permanent storage destination like Azure Storage. If your downstream systems need to process the events, they can read them directly from your Azure Storage blobs.

I opened the previous comment. This should be an out the box behaviour to be more like Kafka. In Kafka I can keep logs forever and just read from the start if I have a new consumer. The plumbing required in Event Hub to make this happen should just not be needed. You have capture - we need the capability to easily replay the captured events.

@xurui203 Are there any guidelines on how to get all events from an Azure Storage blob? From what I can see this is impossible without Apache Spark. Is there at least a way to do it with U-SQL and then push the data to a query store like Cosmos DB?
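Spark is not strictly required: the captured files are plain Avro, so any Avro reader (for example the `avro` or `fastavro` Python packages) can deserialize them, and the record's `Body` field holds the original event payload as bytes. A hedged sketch of the last step, turning bodies into documents that could be upserted into a store such as Cosmos DB, assuming the payloads are UTF-8 encoded JSON:

```python
import json

def bodies_to_documents(bodies):
    """Decode captured event bodies (bytes) into JSON documents.
    Assumes each payload is UTF-8 encoded JSON; adapt the decoding
    step if your events use a different serialization."""
    docs = []
    for body in bodies:
        docs.append(json.loads(body.decode("utf-8")))
    return docs

docs = bodies_to_documents([b'{"deviceId": "d1", "temp": 21.5}'])
print(docs[0]["deviceId"])
```

The resulting dicts can be written to Cosmos DB with its SDK of your choice; the decoding logic above is independent of which store you target.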

Is there an update on this? This capability is still very much needed.

