Make .NET Web API queue requests and process them one at a time


We've got a C# .NET Web API service that calls code which cannot handle more than one database request at a time. The system handles online bill payments with relatively small demand.

We don't have control of that code, so we can't make the change that would actually fix the issue. Another group using the same code built a WCF API and used a service configuration to limit concurrent requests to 1, which effectively fixed the issue for them. We now have to either rewrite in WCF or figure out how to make Web API work.

Is there a way to make the Web API queue up requests and process them one at a time, while still keeping things operating in 'real-time' for the end users?

Sorry if this is vague, I am asking on behalf of the actual programmer in hopes of assisting.

1 Answer

Well, the only way I'm aware of is "do it yourself", i.e.:

  • move the part that serves the request into some shared component (if it isn't structured like that already)
  • create a table in persistent storage (database, cloud storage, ...) to queue operations, and store the incoming requests there (this should be easy since the request has to be serializable somehow; I'd choose XML for persistent storage)
  • change your service so that instead of processing the request it just validates it, stores it in persistent storage (the database) and returns
  • create a new assembly that processes the queued requests asynchronously (typically a Windows service if you're on Windows infrastructure), fetching requests from persistent storage one at a time, processing them and logging failures; a rough sketch follows this list
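
A rough sketch of what that could look like. All the type names here (PaymentRequest, QueuedPayment, PaymentsDbContext, IPaymentProcessor) are placeholders, not anything from your code, and it assumes Entity Framework and Json.NET, but any persistence and serialization would do:

using System;
using System.Linq;
using System.Web.Http;
using Newtonsoft.Json;

// Web API controller: validate, persist the serialized request and return right away.
public class PaymentsController : ApiController
{
    public IHttpActionResult Post(PaymentRequest request)
    {
        if (!ModelState.IsValid)
            return BadRequest(ModelState);

        using (var db = new PaymentsDbContext())
        {
            db.QueuedPayments.Add(new QueuedPayment
            {
                ReceivedUtc = DateTime.UtcNow,
                Status = "Pending",
                Payload = JsonConvert.SerializeObject(request) // XML works just as well
            });
            db.SaveChanges();
        }
        return Ok("queued");
    }
}

// Worker (e.g. a Windows service loop): fetch and process one queued request at a time.
public class PaymentQueueWorker
{
    private readonly IPaymentProcessor _processor; // the component that can't run concurrently

    public PaymentQueueWorker(IPaymentProcessor processor) { _processor = processor; }

    public void ProcessNext()
    {
        using (var db = new PaymentsDbContext())
        {
            var next = db.QueuedPayments
                         .Where(p => p.Status == "Pending")
                         .OrderBy(p => p.ReceivedUtc)
                         .FirstOrDefault();
            if (next == null) return; // queue is empty

            try
            {
                var request = JsonConvert.DeserializeObject<PaymentRequest>(next.Payload);
                _processor.Process(request); // only ever called from this single worker
                next.Status = "Done";
            }
            catch (Exception ex)
            {
                next.Status = "Failed"; // log failures, as mentioned above
                next.Error = ex.ToString();
            }
            db.SaveChanges();
        }
    }
}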

Alternatively, you can change the service to use a fire & forget pattern with a lock statement to limit access to the payment part, like:

private static readonly object s_SyncRoot = new object();

...
// fire & forget: hand the work off to a background task and return immediately
new Task(() =>
{
  lock (s_SyncRoot)
  {
    // process payment - the lock ensures only one payment runs at a time
  }
}).Start();
return "ok";
...

This is quick & easy to implement; however, you're sacrificing reliability a bit - see for example Fire and Forget with ASP.NET MVC. It also doesn't scale well: if there is a peak of requests and you open too many parallel threads, it can effectively take down your API service.
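
If you want to keep the fire & forget shape but avoid opening a thread per request, one variant (my own sketch, not part of the pattern above; PaymentRequest and ProcessPayment are placeholders) is a single long-running consumer fed by an in-memory BlockingCollection. It's still not durable across app-pool recycles, but at most one payment runs at a time and bursts just queue up instead of spawning threads:

using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

// Hypothetical in-process queue: one long-running consumer, so payments are
// processed strictly one at a time without a new thread per request.
public static class PaymentQueue
{
    private static readonly BlockingCollection<PaymentRequest> s_Queue =
        new BlockingCollection<PaymentRequest>(boundedCapacity: 1000);

    static PaymentQueue()
    {
        // Single consumer loop, started once per process.
        Task.Factory.StartNew(() =>
        {
            foreach (var request in s_Queue.GetConsumingEnumerable())
            {
                try
                {
                    ProcessPayment(request); // the non-thread-safe payment call
                }
                catch (Exception ex)
                {
                    // log and continue; a failed item shouldn't kill the consumer
                    Console.Error.WriteLine(ex);
                }
            }
        }, TaskCreationOptions.LongRunning);
    }

    // Called from the controller; returns immediately ("fire & forget").
    // Returns false if the queue is full, so the caller can reject the request.
    public static bool TryEnqueue(PaymentRequest request) => s_Queue.TryAdd(request);

    private static void ProcessPayment(PaymentRequest request)
    {
        // call into the shared payment component here
    }
}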
