We are using Azure Functions to process some imports; for this we have a few queue-triggered functions.
The issue we are facing is that, after working properly for some time, the Azure Function stops processing messages from the queue (even though messages still exist in the queue).
To restart processing we have to nudge the corresponding function app, e.g. by opening the code in the portal.
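For context, the queue-triggered functions look roughly like this. This is only a minimal sketch; the queue name, connection setting, and function name are placeholders, not our actual code:

```csharp
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class ImportFunctions
{
    // Fires whenever a message lands on the (hypothetical) import queue.
    [FunctionName("ProcessImport")]
    public static void Run(
        [QueueTrigger("import-queue", Connection = "AzureWebJobsStorage")] string message,
        ILogger log)
    {
        log.LogInformation($"Processing import message: {message}");
        // ... import processing ...
    }
}
```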
Please share the information in the original issue template. We need to know your Function App name to investigate issues like this.
I can share the Function App name, but is there a more private way to share this information?
@christopheranderson Is there a private way to share Function App information?
Please refer to this document for information on how to privately share your app name:
https://github.com/Azure/azure-webjobs-sdk-script/wiki/Sharing-Your-Function-App-name-privately
@fabiocav Thanks, below are the details:
Execution Time : 2017-06-20T05:30:14.474
Function Run Id=f6d05873-52f2-412c-840e-773b510c74f4
Region : SouthEastAsia
Also, the issue happened between 2017-06-19 11:45 AM UTC and 2017-06-19 1:45 PM UTC.
@christopheranderson @fabiocav Do I need to provide any more details? We are still seeing this issue...
I noticed the same issue with a function listening on a Service Bus queue and running on the consumption plan. Scheduling a second function on the same function app to run every 20 minutes (doing nothing) "solved" the issue; see the sketch below. I guess the function app somehow goes inactive when there is a longer period without any messages arriving on the queue.
Running the function app on a dedicated service plan works correctly and will process messages even after an idle period.
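The keep-alive function is just an empty timer trigger. A minimal sketch, assuming the C# attribute model; the function name and the 20-minute NCRONTAB schedule are illustrative:

```csharp
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class KeepAlive
{
    // Fires every 20 minutes. The body is intentionally empty: its only
    // purpose is to keep the function app from going idle on the
    // consumption plan so the queue trigger keeps listening.
    [FunctionName("KeepAlive")]
    public static void Run(
        [TimerTrigger("0 */20 * * * *")] TimerInfo timer,
        ILogger log)
    {
        log.LogInformation($"Keep-alive tick at {System.DateTime.UtcNow:o}");
    }
}
```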
@akhilaj and I belong to the same team, and we are still facing the same issue.
Any updates on this?
We are facing the same problem with our functions. It is not clear what is going on, but it seems to happen mostly when we have too many queue items to process. We have to manually restart when this happens.
We are facing this problem on functions with consumption plans and also dedicated service plans; the exceptions are not really clear on what is going on.
Here are the details for one of our functions on a dedicated service plan that is having problems:
Execution Time : 2018-05-18T12:55:23.621
Function Run Id=e850be27-a9f1-4fe5-9718-19043ff09583
Region : Brazil South
This isn't just Functions. It happens with WebJobs too, e.g. https://stackoverflow.com/questions/35941713/servicebus-triggers-just-stop-working-randomly-on-web-jobs-why, and I'm still getting it today with the .NET Framework SDK. Is there an underlying issue with the technology stack that WebJobs and Functions share?
I am having the same problem on the Azure consumption plan. I have a function that stops after processing around 1,000 queue messages. When I restart the function app it drains the queue again; when the queue is empty it stops again until you restart. Interestingly, the Azure portal's function view also gets stuck when this happens.
Just recently a customer hit this issue, which caused them to move off WebJobs/Functions. Basically the customer thinks functions shouldn't get stuck retrieving messages, and should move on to the next message when that happens. For now they are using simple .NET Core apps for improved reliability/resiliency.
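For reference, the kind of plain polling worker the customer moved to might look like this. A sketch only, assuming the Azure.Storage.Queues SDK; the queue name, environment variable, and back-off interval are placeholders:

```csharp
using System;
using System.Threading.Tasks;
using Azure.Storage.Queues;

class Worker
{
    static async Task Main()
    {
        // Placeholder connection string setting and queue name.
        var queue = new QueueClient(
            Environment.GetEnvironmentVariable("STORAGE_CONNECTION"), "import-queue");

        while (true)
        {
            // Pull up to 16 messages per iteration.
            var messages = (await queue.ReceiveMessagesAsync(maxMessages: 16)).Value;
            foreach (var msg in messages)
            {
                Console.WriteLine($"Processing {msg.MessageId}");
                // ... process the message; if processing throws, the message
                // becomes visible again after its visibility timeout ...
                await queue.DeleteMessageAsync(msg.MessageId, msg.PopReceipt);
            }

            // Back off briefly when the queue is empty instead of hot-polling.
            if (messages.Length == 0)
                await Task.Delay(TimeSpan.FromSeconds(5));
        }
    }
}
```

The loop owns its own polling behavior, which is the reliability point being made: there is no hidden listener that can silently stop.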
Any updates on the issue? Will it ever be fixed?
My problem turned out to be thread starvation. To solve it I had to create my HttpClient once in the constructor and reuse it. I recommend going through the recent documentation and making sure you have applied the best practices for Azure Functions.
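Concretely: instead of constructing a new HttpClient per invocation (which exhausts sockets and can starve the thread pool), hold one shared instance. A sketch under the classic static model; the commenter used a constructor-injected instance, but a static field achieves the same reuse. The function name and endpoint are hypothetical:

```csharp
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class ImportWithHttp
{
    // Created once and reused across invocations; a per-invocation
    // HttpClient leaks sockets and contributes to thread starvation.
    private static readonly HttpClient Http = new HttpClient();

    [FunctionName("ProcessImportWithHttp")]
    public static async Task Run(
        [QueueTrigger("import-queue", Connection = "AzureWebJobsStorage")] string message,
        ILogger log)
    {
        // Hypothetical downstream call made per queue message.
        var response = await Http.GetAsync("https://example.com/api/import");
        log.LogInformation($"Downstream call returned {response.StatusCode}");
    }
}
```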