Azure-docs: Having a namespace with more than 5000 concurrent connections

Created on 2 Aug 2019 · 9 comments · Source: MicrosoftDocs/azure-docs

Hello,

is there any workaround (perhaps an architectural design) for having a namespace with more than 5000 concurrent connections, other than provisioning multiple namespaces concurrently?




All 9 comments

Hi @mhabibal13 - Thank you for your feedback! We will review and update as appropriate.

@mhabibal13 If you are on the premium tier, one way to work around the maximum connection limit that I can think of would be to use the Azure Event Grid integration for receivers of queues/topics that have low volumes and/or can tolerate (really small) delays in message delivery.
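For context, a rough sketch of what that could look like on the receiver side: instead of every client holding an AMQP listener open, a notification-driven handler (for example, one wired to the Service Bus Event Grid integration) connects only when messages are waiting, drains the queue, and disconnects. This is a minimal sketch, not code from this thread; the queue name, connection string, and trigger wiring are placeholder assumptions, and it assumes the `azure-servicebus` v7 Python SDK.

```python
# Minimal sketch: drain a queue on demand instead of keeping a listener connected.
# CONN_STR and QUEUE_NAME are placeholders; the Event Grid trigger that would
# invoke drain_queue() is omitted here.
from azure.servicebus import ServiceBusClient

CONN_STR = "<service-bus-connection-string>"  # placeholder
QUEUE_NAME = "device-q-42"                    # placeholder queue name

def drain_queue():
    # Open a short-lived connection only when notified that messages are
    # waiting, then close it; no standing AMQP listener is held in between.
    with ServiceBusClient.from_connection_string(CONN_STR) as client:
        with client.get_queue_receiver(queue_name=QUEUE_NAME) as receiver:
            while True:
                batch = receiver.receive_messages(max_message_count=100, max_wait_time=5)
                if not batch:
                    break
                for msg in batch:
                    print(str(msg))                 # process the message here
                    receiver.complete_message(msg)
```

The trade-off, as noted above, is a small delay between a message arriving and the next drain.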

@axisc @clemensv Could you please share other recommended alternatives for such scenarios if any?

@mhabibal13 Another optimization is to have your senders use the Send Message Batch HTTP API instead, which frees up connections for more concurrent receivers over AMQP.
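To illustrate the shape of that, here is a hedged sketch of sending a batch of messages in a single HTTPS request with a SAS token, so senders do not occupy AMQP connections. The namespace, queue, and SAS policy names are placeholder assumptions, and the exact payload and header requirements should be checked against the Service Bus REST API reference.

```python
# Sketch only: batch-send over HTTPS to a Service Bus queue.
# NAMESPACE, QUEUE, SAS_KEY_NAME, and SAS_KEY are placeholders.
import base64, hashlib, hmac, json, time, urllib.parse
import requests

NAMESPACE = "mynamespace"        # placeholder
QUEUE = "telemetry"              # placeholder
SAS_KEY_NAME = "send-policy"     # placeholder SAS policy name
SAS_KEY = "<shared-access-key>"  # placeholder

def make_sas_token(resource_uri, key_name, key, ttl_seconds=300):
    # Standard Service Bus SAS token: HMAC-SHA256 over "<encoded-uri>\n<expiry>".
    expiry = str(int(time.time()) + ttl_seconds)
    encoded_uri = urllib.parse.quote_plus(resource_uri)
    to_sign = f"{encoded_uri}\n{expiry}".encode("utf-8")
    signature = base64.b64encode(
        hmac.new(key.encode("utf-8"), to_sign, hashlib.sha256).digest()
    ).decode("utf-8")
    return (f"SharedAccessSignature sr={encoded_uri}"
            f"&sig={urllib.parse.quote_plus(signature)}&se={expiry}&skn={key_name}")

resource = f"https://{NAMESPACE}.servicebus.windows.net/{QUEUE}"
token = make_sas_token(resource, SAS_KEY_NAME, SAS_KEY)

# One HTTP request carries several messages; no AMQP connection is held open.
batch = [{"Body": "reading-1"}, {"Body": "reading-2"}, {"Body": "reading-3"}]
resp = requests.post(
    f"{resource}/messages",
    data=json.dumps(batch),
    headers={"Authorization": token,
             "Content-Type": "application/vnd.microsoft.servicebus.json"},
)
resp.raise_for_status()  # expect 201 Created on success
```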

@mhabibal13 Just following up here... I hope my previous comments help a bit.

To answer your query better, could you please share why your scenario entails that many listeners and what those clients do?

Hello Pramod,

thank you for your suggestions/workarounds. I will look into the Azure Event Grid integration; that might be helpful. However, the HTTP API suggestion is irrelevant for me, since I have an application with more than 20,000 clients receiving/sending messages to Azure Service Bus namespaces. Thus I need at least 4 namespaces.

These clients might all be communicating with the system at once, yet at night and on weekends that number of 20,000 clients is massively reduced. Having 4 namespaces would cost a lot of money.

@mhabibal13 Could you confirm if all these clients would be continuously active throughout the day and are all unique clients (separate processes)?

Yes, these are independent, unique clients running continuously during the day.

The connection count is as much of a bounding quota limitation of a namespace as the messaging units are, even though messaging units don't map linearly to the connection count. Since Service Bus is typically used as a "middle tier" broker, the quota limitation is commonly not an obstacle, and to keep resource usage balanced, we can't adjust those quotas individually. If you need 80,000 msg/sec, you'll need 4 namespaces, and if you need 20,000 concurrent clients, you'll need 4 namespaces - that's how the capacity boundaries are structured today. I'm sorry that your scenario stretches outside of that box.

For very high numbers of concurrently connected external clients that each need a queue for upstream traffic and that need to be managed under one roof, IoT Hub might be an option.
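For completeness, the "4 namespaces" approach mentioned above usually comes down to a deterministic client-to-namespace mapping. The following is only a sketch of one such scheme (stable hashing by client ID), not something prescribed in this thread; the connection strings are placeholders.

```python
# Sketch: spread ~20,000 clients across 4 namespaces so each namespace stays
# under the 5,000 concurrent connection quota. All values are placeholders.
import hashlib

NAMESPACE_CONN_STRS = [
    "<connection-string-namespace-1>",
    "<connection-string-namespace-2>",
    "<connection-string-namespace-3>",
    "<connection-string-namespace-4>",
]

def namespace_for_client(client_id: str) -> str:
    # Stable hash so a given client always lands on the same namespace,
    # regardless of which process computes the mapping.
    digest = hashlib.sha256(client_id.encode("utf-8")).digest()
    index = int.from_bytes(digest[:4], "big") % len(NAMESPACE_CONN_STRS)
    return NAMESPACE_CONN_STRS[index]

print(namespace_for_client("device-0042"))
```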

@clemensv Thanks for the insights!

@mhabibal13 Hope the information shared is helpful. Since there is no doc update required here at the moment, we will now proceed to close this thread. If there are further questions regarding this matter, please tag me in your reply; we will gladly continue the discussion and reopen the issue.

