Azure-docs: Template for Azure Durable Function

Created on 26 Apr 2019 · 8 comments · Source: MicrosoftDocs/azure-docs

I would find it very helpful if you provided an integrated sample of running Azure Durable Functions (functions with more than 4 minutes of runtime) in Azure Data Factory.
Most interesting would be an activity that does this automatically (i.e. the built-in Azure Function activity recognizing a durable function and handling it), but as explained in
https://docs.microsoft.com/en-us/azure/data-factory/control-flow-azure-function-activity
you are expected to build it yourself with an Azure Function activity and, I guess, an Until loop with a Web activity to manually implement the polling against the status URL... (a sketch of what that polling amounts to is shown below)

I would recommend extending the documentation at https://docs.microsoft.com/en-us/azure/data-factory/control-flow-azure-function-activity to point to an existing sample, or to create one.
(Because it needs quite a lot of logic.)
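
A minimal Python sketch of the pattern described above, assuming a durable function exposed through the standard HTTP starter (the endpoint URL, input payload, and polling interval here are illustrative, not from this thread):

```python
import time
import requests

# Hypothetical HTTP-triggered starter of a durable function; the URL is an assumption.
START_URL = "https://myfuncapp.azurewebsites.net/api/orchestrators/LongRunningOrchestrator"

# 1. Start the orchestration. The starter returns 202 plus management URLs,
#    including statusQueryGetUri.
start_response = requests.post(START_URL, json={"input": "example"})
start_response.raise_for_status()
status_url = start_response.json()["statusQueryGetUri"]

# 2. Poll the status URL until the orchestration reaches a terminal state.
while True:
    body = requests.get(status_url).json()
    if body["runtimeStatus"] in ("Completed", "Failed", "Terminated"):
        break
    time.sleep(30)  # illustrative polling interval

print(body["runtimeStatus"], body.get("output"))
```

Inside Data Factory the same flow would become an Azure Function (or Web) activity to start the orchestration, followed by an Until loop containing a Wait activity and a Web activity that checks statusQueryGetUri.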




All 8 comments

@hmayer1980 Thanks for the feedback! I have assigned the issue to the content author to evaluate and update as appropriate.

I have built myself an example, but I am still not happy with it.
How would I make the waiting dynamic, i.e. a different (configurable) duration for different functions, to keep it generic?
And how would I implement a retry strategy with this?

(image)
I currently use a custom SQL procedure to "re-throw" the error if one happened (response is not OK, or timeout exceeded). Not very nice...

@hmayer1980 I really appreciate the work you have done. Very cool. Hmm, as to how to make the wait duration dynamic, I do have an idea (see the sketch below). I see that you are receiving both an ID and a status from the Azure Function. If the function were made to also return a wait time, you could base the wait on that.

To clarify, by retry, do you mean wait-and-check-status-again, or start the function over from the beginning? I have a few ideas for the wait-and-check-again.
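
A minimal sketch of that idea, assuming the Python durable-functions programming model; the orchestrator name "LongRunningOrchestrator" and the suggestedWaitSeconds field are illustrative additions, not part of the standard payload:

```python
import json
import azure.functions as func
import azure.durable_functions as df

async def main(req: func.HttpRequest, starter: str) -> func.HttpResponse:
    client = df.DurableOrchestrationClient(starter)

    # Start the orchestration ("LongRunningOrchestrator" is a hypothetical name).
    instance_id = await client.start_new("LongRunningOrchestrator", None, None)

    # Standard check-status payload (id, statusQueryGetUri, ...).
    check = client.create_check_status_response(req, instance_id)
    payload = json.loads(check.get_body())

    # Illustrative extra field: a per-function hint for the polling pipeline.
    payload["suggestedWaitSeconds"] = 30

    return func.HttpResponse(
        body=json.dumps(payload),
        status_code=202,
        mimetype="application/json",
    )
```

In the pipeline, an expression along the lines of @activity('StartFunction').output.suggestedWaitSeconds could then drive the Wait activity inside the Until loop, so each function supplies its own polling interval.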

@MartinJaffer-MSFT
I actually mean starting the function all over again.

The wait was the thing that confused me the most at the beginning. I already have something that waits a dynamic duration at https://github.com/MicrosoftDocs/azure-docs/issues/29820,
but I would be very much interested in your idea.

The web request for GetDurableFunctionStatus is blocking - it waits for as long as the function takes. That was the most unintuitive part for me at the beginning, because in a web browser you get the response immediately (non-blocking).
The documentation at https://docs.microsoft.com/en-us/azure/data-factory/control-flow-azure-function-activity#timeout-and-long-running-functions states

Because statusQueryGetUri returns HTTP Status 202 while the function is running, you can poll the status of the function by using a Web Activity

In reality, the web request must be doing this internally; you do not have to implement the polling yourself.
But since you can only set a static timeout on the web request, the whole timeout handling stays static.

Or do you know a way to make the web request non-blocking and actually implement the polling yourself?
Because that could enable a dynamic solution.
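
For what it's worth, a single GET against statusQueryGetUri is itself non-blocking (it returns 202 with runtimeStatus "Running" while the orchestration is in flight), so the loop is the part you implement yourself. A small Python sketch of such a self-implemented poll with a per-call (i.e. dynamic) timeout, assuming the standard durable-function status payload; the function and parameter names are made up for illustration:

```python
import time
import requests

def poll_durable_status(status_url: str, timeout_seconds: int, interval_seconds: int = 15) -> dict:
    """Poll statusQueryGetUri until a terminal state or until timeout_seconds elapses.

    Each GET returns immediately: 202 while the orchestration is Pending/Running,
    200 once it has reached a terminal state. The timeout is a parameter here,
    so it can differ per function instead of being a single static setting.
    """
    deadline = time.monotonic() + timeout_seconds
    while time.monotonic() < deadline:
        body = requests.get(status_url).json()
        if body["runtimeStatus"] in ("Completed", "Failed", "Terminated", "Canceled"):
            return body
        time.sleep(interval_seconds)
    raise TimeoutError(f"Orchestration did not finish within {timeout_seconds}s")
```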

Ideally that could look like a Validation activity using an HTTP or REST dataset. I have not tried it before. Today's workload looks busy, so I probably won't get to try it this week.

As for starting the entire process over again...
If I recall correctly, an Execute Pipeline activity is prohibited from calling itself. However, I have a hunch that
PipelineA -> PipelineB -> PipelineA -> PipelineB ...
would be allowed, much like the workaround used to achieve incrementing variables.
I know that pipelines can be polled, so a hybrid solution should be possible. Have you used an external service or self-hosted solution to trigger and monitor Data Factory runs? (E.g. a local crontab job making a REST call to trigger a Data Factory run; that would be useful for business-logic-based scheduling. See the sketch below.)
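
A minimal Python sketch of that external trigger-and-monitor approach, assuming a recent azure-mgmt-datafactory SDK; the subscription, resource group, factory, and pipeline names are placeholders, and the same operations are also available as plain REST endpoints:

```python
import time
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# Placeholder identifiers - substitute your own.
SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "my-resource-group"
FACTORY_NAME = "my-data-factory"
PIPELINE_NAME = "PipelineA"

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Trigger the pipeline run (e.g. from a crontab job or another scheduler).
run = client.pipelines.create_run(RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME, parameters={})

# Monitor the run until it reaches a terminal state.
while True:
    status = client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id).status
    if status in ("Succeeded", "Failed", "Cancelled"):
        break
    time.sleep(30)

print(f"Pipeline run {run.run_id} finished with status {status}")
```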

Starting over again is easily possible - I just wrap the whole thing in an Execute Pipeline activity and run it a second time with a failure dependency on the first...

This is all running on regular schedule-trigger executions - no user interaction...

To me it all boils down to this: durable functions need their own activity to run them.
I have implemented a retry myself - but when a pipeline fails I also get an alert from the alerting system, because it's not an "ADF-internal retry" but one I implemented manually...

I also posted it on Ideas (https://feedback.azure.com/forums/270578-data-factory/suggestions/37747735-durable-function-activity)

Hi @hmayer1980,

Sorry for the delay in response and thanks for adding your feedback! Outside of the feature request, do you have any other open inquiries with this issue? If not, do I have permission to close it?

Thanks,
Daniel

Closing due to lack of activity. If this issue is still relevant, feel free to reopen!

