Calling hundreds of Azure Functions in parallel

Asked 2019-07-10 03:40

I have an application that executes rules using a rules engine. There are around 500 rules, and the application receives around 10,000 entries; each of those 10,000 entries must be validated against all 500 rules individually. We are currently planning to migrate all our rules to Azure Functions, so that each Azure Function corresponds to a single rule.

So I tried a proof of concept and created a couple of Azure Functions that use Task.Delay to mock real rule validation. I need to execute all 500 rules for each and every request, so I first created a Durable Function whose orchestrator calls all 500 activity triggers (rules) via Task.WhenAll, but that didn't work: the Functions host hung with just one request. So instead I created an individual HttpTrigger function per rule. From a C# class library I called all 500 functions using Task.WhenAll for one request and it worked like a charm. However, when I tried calling the Azure Functions for all entries (starting with 50), Task.WhenAll started throwing errors saying a task was canceled, or a TimeoutException for the HTTP call. Below is the code, followed by a sketch of the orchestrator attempt for comparison:

// "client" is a shared HttpClient and CreateRequestObject() builds the
// request for the mocked rule function (both elided here).
// ConcurrentBag (System.Collections.Concurrent) replaces List<T> because
// Parallel.For runs the lambda on multiple threads and List<T>.Add is not
// thread-safe.
var tasksForCallingAzureFunctions = new ConcurrentBag<List<Task<HttpResponseMessage>>>();

Parallel.For(0, 50, index =>
{
  var tasksForUser = new List<Task<HttpResponseMessage>>();

  // Call the mocked Azure Function 500 times (once per rule).
  for (int j = 0; j < 500; j++)
  {
    tasksForUser.Add(client.SendAsync(CreateRequestObject()));
  }

  // Collect each entry's tasks so all the requests run in parallel.
  tasksForCallingAzureFunctions.Add(tasksForUser);
});

// Block until all 50 * 500 requests have completed.
Parallel.ForEach(tasksForCallingAzureFunctions, tasks => Task.WhenAll(tasks).GetAwaiter().GetResult());
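
For comparison, the Durable Functions orchestrator I tried first looked roughly like this. This is only a sketch (Durable Functions 2.x syntax); "ValidateRule" and the RuleInput/Entry types are placeholder names, not my real rules:

// Orchestrator: fan out to 500 rule activities, then fan in with Task.WhenAll.
// Requires Microsoft.Azure.WebJobs.Extensions.DurableTask.
[FunctionName("ValidateEntry")]
public static async Task<bool[]> RunOrchestrator(
    [OrchestrationTrigger] IDurableOrchestrationContext context)
{
  var entry = context.GetInput<Entry>();

  var tasks = new List<Task<bool>>();
  for (int ruleId = 0; ruleId < 500; ruleId++)
  {
    // Each activity invocation validates one rule against the entry.
    tasks.Add(context.CallActivityAsync<bool>(
        "ValidateRule", new RuleInput { Entry = entry, RuleId = ruleId }));
  }

  // Fan in: wait for all 500 rule results.
  return await Task.WhenAll(tasks);
}

public class RuleInput
{
  public Entry Entry { get; set; }   // Entry is a placeholder for my real type
  public int RuleId { get; set; }
}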

So my question is: is this a recommended approach? I'm making 50 * 500 I/O calls, and the requirement is that all 10,000 entries go through the 500 rules. The other option I thought of was to send all the entries to each Azure Function, so that each function loops through the entries and validates them itself. That way I would only have to make 500 calls, something like the batch endpoint sketched below.
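
A minimal sketch of that batch idea (HTTP-triggered Functions v2 syntax); "Rule42", Entry, and ValidateRule42 are hypothetical placeholders:

// One HTTP call per rule; the request body carries the full batch of entries.
[FunctionName("Rule42")]
public static async Task<IActionResult> Run(
    [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req)
{
  // Deserialize the batch of entries from the request body.
  string body = await new StreamReader(req.Body).ReadToEndAsync();
  var entries = JsonConvert.DeserializeObject<List<Entry>>(body);

  // Run this one rule over every entry and return all the results.
  var results = entries.Select(entry => ValidateRule42(entry)).ToList();
  return new OkObjectResult(results);
}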

Please advise.

1 Answer

混吃等死 · answered 2019-07-10 04:02

I would implement this scenario with queue-triggered Functions. Send a queue message per entry per rule; each function invocation picks one up, does its work, and saves the result to some storage. This should be far more resilient than doing a gazillion synchronous HTTP requests.
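
Roughly this shape (just a sketch: the "rule-validation" queue, "RuleResults" table, and the RuleMessage/RuleResult/RulesEngine names are all assumptions, not real APIs of yours):

// Triggered once per (entry, rule) queue message; writes the outcome to
// table storage via an output binding.
[FunctionName("ValidateRule")]
public static void Run(
    [QueueTrigger("rule-validation")] RuleMessage message,
    [Table("RuleResults")] out RuleResult result)
{
  result = new RuleResult
  {
    PartitionKey = message.EntryId,        // one partition per entry
    RowKey = message.RuleId.ToString(),    // one row per rule
    Passed = RulesEngine.Validate(message.RuleId, message.Entry) // your rule logic goes here
  };
}

public class RuleMessage
{
  public string EntryId { get; set; }
  public string Entry { get; set; }
  public int RuleId { get; set; }
}

public class RuleResult
{
  public string PartitionKey { get; set; }
  public string RowKey { get; set; }
  public bool Passed { get; set; }
}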

Now, Durable Functions basically do the same thing behind the scenes, so you were on the right path. Maybe post a new question with a minimal repro of your hanging problem.
