We have a C# .NET Web API service whose underlying code cannot handle more than one database request at a time. The system handles online bill payments with relatively small demand.
We don't have control of the code, so we can't fix the issue at its source. Another group using the same code exposed it through a WCF API and used a service configuration setting to limit concurrent requests to 1, which effectively fixed the issue. We now have to either rewrite in WCF or figure out how to make Web API work.
Is there a way to make the Web API queue up requests and process them one at a time, while still keeping things operating in 'real-time' for the end users?
Sorry if this is vague; I'm asking on behalf of the actual programmer in the hope of assisting.
Well, the only way I'm aware of is "do it yourself": serialize the requests inside your own service code.
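For example, here's a minimal sketch of the DIY approach using a `SemaphoreSlim` with a single permit to serialize the database work while each caller still waits for, and receives, a real-time response. The controller, DTO, and `ProcessPayment` names are placeholders for your actual code:

```csharp
using System.Threading;
using System.Threading.Tasks;
using System.Web.Http;

public class PaymentsController : ApiController
{
    // Single permit: only one request runs the legacy database code at a time.
    // Other requests wait asynchronously (no blocked threads) until it's free.
    private static readonly SemaphoreSlim Gate = new SemaphoreSlim(1, 1);

    [HttpPost]
    public async Task<IHttpActionResult> Pay(PaymentDto payment)
    {
        await Gate.WaitAsync();
        try
        {
            // Call the legacy, single-threaded payment/database code here.
            var receipt = ProcessPayment(payment); // placeholder
            return Ok(receipt);
        }
        finally
        {
            Gate.Release();
        }
    }

    public class PaymentDto { public decimal Amount { get; set; } }

    private static string ProcessPayment(PaymentDto p)
    {
        // placeholder for the legacy call you can't modify
        return "OK";
    }
}
```

Because `WaitAsync` suspends instead of blocking, waiting requests don't tie up thread-pool threads, and every caller still gets the actual result of their payment.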
Alternatively, you can change the service to use a fire & forget pattern with a "lock" statement limiting access to the payment part.
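A sketch of that fire & forget variant (again, the names are placeholders): the caller gets an immediate response while the actual payment runs on a background thread, serialized by the lock:

```csharp
using System.Threading.Tasks;
using System.Web.Http;

public class PaymentsController : ApiController
{
    // All background workers serialize on this lock,
    // so only one touches the legacy code at a time.
    private static readonly object PaymentLock = new object();

    [HttpPost]
    public IHttpActionResult Pay(PaymentDto payment)
    {
        // Fire & forget: queue the work and return immediately.
        // If the app pool recycles, queued work is lost (the reliability trade-off).
        Task.Run(() =>
        {
            lock (PaymentLock)
            {
                ProcessPayment(payment); // placeholder for the legacy call
            }
        });
        return Ok(); // caller gets 200 before the payment actually completes
    }

    public class PaymentDto { public decimal Amount { get; set; } }

    private static void ProcessPayment(PaymentDto p)
    {
        // legacy, single-threaded payment code goes here
    }
}
```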
This is quick & easy to implement; however, you're sacrificing reliability a bit (see, for example, Fire and Forget with ASP.NET MVC). It also doesn't scale well: if there's a peak of requests and you open too many parallel threads, it can effectively take down your API service.