Afternoon,
I need a hand splitting over 1,200 results into "chunks" of 10 so I can process them with the Amazon MWS API. Can anyone provide any guidance on how I would go about doing this, please?
List<string> prodASIN = dc.aboProducts.Select(a => a.asin).Take(10).ToList();
I currently have this, which works, but I have 1,200+ results and need to loop through them 10 at a time so I can process each batch and pass it over to the Amazon MWS API.
Why not try something like:
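The code block from this answer didn't survive extraction. A minimal sketch of the Skip/Take batching it most likely showed — the `allAsins` list below is a stand-in for `dc.aboProducts.Select(a => a.asin).ToList()`:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Stand-in for dc.aboProducts.Select(a => a.asin).ToList()
List<string> allAsins = Enumerable.Range(1, 25).Select(i => "ASIN" + i).ToList();

const int batchSize = 10;
for (int i = 0; i < allAsins.Count; i += batchSize)
{
    // Skip past the batches already processed, then take the next 10.
    List<string> batch = allAsins.Skip(i).Take(batchSize).ToList();

    // Pass `batch` (at most 10 ASINs) to the MWS API here.
    Console.WriteLine($"Batch of {batch.Count}");
}
```

With 25 items this produces two batches of 10 and a final batch of 5.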
Or, if you don't want to load them all into memory:
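The memory-friendly snippet was also lost; it most likely paged the query itself, so only one batch of 10 is materialized at a time. A sketch with an in-memory stand-in for `dc.aboProducts` (the names here are assumptions):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// In-memory stand-in for the dc.aboProducts query source.
IQueryable<string> products = Enumerable.Range(1, 23)
    .Select(i => "ASIN" + i)
    .AsQueryable();

const int pageSize = 10;
for (int page = 0; ; page++)
{
    // Only one page of up to 10 ASINs is pulled into memory at a time.
    // Against a real database, a stable OrderBy is required for paging.
    List<string> batch = products
        .OrderBy(asin => asin)
        .Skip(page * pageSize)
        .Take(pageSize)
        .ToList();

    if (batch.Count == 0)
        break;

    // Send `batch` to the MWS API here.
    Console.WriteLine($"Page {page}: {batch.Count} items");
}
```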
I know the question is answered but I can't withhold from you this little extension method I once made and that has served me well since.
You can do:
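Both the extension method and its usage were lost from this answer. A plausible reconstruction — the name `ChunkBy` and its exact shape are my assumptions, not the author's original code:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class EnumerableExtensions
{
    // Splits any sequence into consecutive batches of `chunkSize`,
    // yielding each batch as soon as it fills up.
    public static IEnumerable<List<T>> ChunkBy<T>(this IEnumerable<T> source, int chunkSize)
    {
        var batch = new List<T>(chunkSize);
        foreach (var item in source)
        {
            batch.Add(item);
            if (batch.Count == chunkSize)
            {
                yield return batch;
                batch = new List<T>(chunkSize);
            }
        }
        if (batch.Count > 0)
            yield return batch;   // trailing partial batch
    }
}

public static class Demo
{
    public static void Main()
    {
        var prodASIN = Enumerable.Range(1, 23).Select(i => "ASIN" + i);
        foreach (var batch in prodASIN.ChunkBy(10))
            Console.WriteLine($"Sending {batch.Count} ASINs to MWS");
    }
}
```

Note that .NET 6 later shipped a built-in `Enumerable.Chunk` that does essentially this, yielding `T[]` batches.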
List<T> has a built-in method called GetRange(), which was made specifically for what you're trying to do. It's extremely fast and doesn't need LINQ, casting, etc. That's it. Very simple.
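The snippet this answer showed was lost in extraction; the GetRange approach it describes looks like this (a sketch — `Math.Min` guards the final partial batch):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

List<string> list = Enumerable.Range(1, 23).Select(i => "ASIN" + i).ToList();

const int size = 10;
for (int i = 0; i < list.Count; i += size)
{
    // GetRange(index, count) copies `count` elements in one internal
    // Array.Copy; count must not run past the end of the list.
    List<string> batch = list.GetRange(i, Math.Min(size, list.Count - i));

    // Process the batch here.
    Console.WriteLine($"Batch starting at {i}: {batch.Count} items");
}
```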
Test results: GetRange vs. Slice vs. LINQ, chunking 5,000 strings in a List<string>.
As you can clearly see, the Skip/Take approach using LINQ is over 383 times slower than Slice<T>() and 4,736 times slower than GetRange().
==================================================================================
Test method used (try it yourself):
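The original benchmark code was stripped from the post. A simple Stopwatch-based sketch in the same spirit (not the author's exact harness; absolute numbers will vary by machine and runtime):

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;

var data = Enumerable.Range(0, 5000).Select(i => "s" + i).ToList();
const int size = 10;

// Time GetRange-based chunking.
var sw = Stopwatch.StartNew();
for (int i = 0; i < data.Count; i += size)
    data.GetRange(i, Math.Min(size, data.Count - i));
sw.Stop();
Console.WriteLine($"GetRange:  {sw.ElapsedTicks} ticks");

// Time LINQ Skip/Take chunking over the same data.
sw.Restart();
for (int i = 0; i < data.Count; i += size)
    data.Skip(i).Take(size).ToList();
sw.Stop();
Console.WriteLine($"Skip/Take: {sw.ElapsedTicks} ticks");
```

Skip/Take re-walks the list from the start for every batch, which is why its cost grows quadratically with the number of batches while GetRange stays linear.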
Slice Extension (for Arrays):
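The extension itself was stripped from the post; an `Array.Copy`-based version matching the description would be (a sketch):

```csharp
using System;

public static class ArrayExtensions
{
    // Returns a new array holding `length` elements of `source`,
    // starting at `start`, copied in one Array.Copy call.
    public static T[] Slice<T>(this T[] source, int start, int length)
    {
        if (start < 0 || length < 0 || start + length > source.Length)
            throw new ArgumentOutOfRangeException();
        var result = new T[length];
        Array.Copy(source, start, result, 0, length);
        return result;
    }
}
```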
Array.Copy is extremely fast, a lot faster than the Select/Skip/Take pattern. Although this method is not the fastest I've found, recent tests show that it's nearly 400 times faster than the Skip/Take pattern used to split lists and arrays.
To use it as is:
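The usage snippet was lost as well; a self-contained sketch (the `Slice` extension is restated here so the example compiles on its own):

```csharp
using System;

public static class ArrayExtensions
{
    // Copies `length` elements of `source` starting at `start`.
    public static T[] Slice<T>(this T[] source, int start, int length)
    {
        var result = new T[length];
        Array.Copy(source, start, result, 0, length);
        return result;
    }
}

public static class Program
{
    public static void Main()
    {
        string[] asins = { "A1", "A2", "A3", "A4", "A5", "A6",
                           "A7", "A8", "A9", "A10", "A11", "A12" };

        const int size = 10;
        for (int i = 0; i < asins.Length; i += size)
        {
            // Math.Min keeps the final slice inside the array bounds.
            string[] batch = asins.Slice(i, Math.Min(size, asins.Length - i));
            Console.WriteLine($"Slice at {i}: {batch.Length} items");
        }
    }
}
```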
Sorry, this isn't LINQ-specific, but perhaps it will help...
One of the things I have done when working with data with MWS and ERP software is adding a control column to the database, something like "addedASIN". In the database I define the control column as a boolean value (or TINYINT(1) in MySQL), default the flag to 0 for all new entries, and set it to 1 when the entry has been added.
If you are able to do that, then you can do something like:
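The query from this answer was lost in extraction; it would look something like the following (the table and column names are illustrative, matching the `addedASIN` flag described above):

```sql
-- Pull only entries that have not yet been sent to MWS
SELECT asin
FROM aboProducts
WHERE addedASIN = 0
LIMIT 10;
```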
Then, once MWS returns success for the additions, update the flag using:
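The update statement was also lost; a sketch with the same illustrative names:

```sql
-- Mark the batch as sent once MWS acknowledges the additions
UPDATE aboProducts
SET addedASIN = 1
WHERE asin IN ('ASIN1', 'ASIN2' /* ... the batch just sent */);
```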
The benefit I have found with this is that your import process can stop and restart with minimal repetition of data. For instance, in my case (and what started this control column), our network connection can be flaky, so during order imports I would lose connectivity, resulting in lost orders or orders being uploaded to our system twice.
This solution has mitigated that: at most one order can be added twice as a result of a connectivity loss, and for that to happen, connectivity must drop between sending the data to our ERP system, the ERP system acknowledging it was added, and the database being updated, a round trip that takes approximately 30 seconds.