Problem: Fetch 2000 items from DynamoDB and process them batch by batch (batch size = 100), creating a POST request from each batch of 100 items.
Question: Is there any way I can achieve this through configuration in AWS?
PS: I've configured a cron schedule to run my Lambda function, and I'm using Java. I've written a multi-threaded application that does this synchronously, but it drastically increases my computation time.
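For reference, the batching step described above (2000 items into 100-item batches) can be sketched in plain Java, with placeholder integers standing in for the DynamoDB items and no AWS calls:

```java
import java.util.ArrayList;
import java.util.List;

public class Batcher {
    // Split a list into consecutive batches of at most batchSize items.
    public static <T> List<List<T>> partition(List<T> items, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < items.size(); i += batchSize) {
            batches.add(items.subList(i, Math.min(i + batchSize, items.size())));
        }
        return batches;
    }

    public static void main(String[] args) {
        // 2000 placeholder items stand in for the DynamoDB result set.
        List<Integer> items = new ArrayList<>();
        for (int i = 0; i < 2000; i++) items.add(i);

        List<List<Integer>> batches = partition(items, 100);
        System.out.println(batches.size());        // 20 batches
        System.out.println(batches.get(0).size()); // 100 items each
    }
}
```

Each batch would then be serialized into one POST request; the HTTP call itself is omitted here.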
I have the same problem and am thinking of solving it in the following way. Please let me know if you try it.
1. A scheduled job fetches N items from DynamoDB using a Lambda function.
2. The Lambda function in #1 submits M messages to SQS, one per item, which triggers the processing Lambda function M times.
3. Each processing Lambda function handles the request contained in its message.
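A rough sketch of step 2, assuming the items have already been serialized to message bodies. SQS `SendMessageBatch` accepts at most 10 entries per request, so the messages must be chunked first; the actual SDK call is left as a comment (it needs the AWS SDK and a real queue URL), so the sketch itself stays runnable:

```java
import java.util.ArrayList;
import java.util.List;

public class SqsFanOut {
    // SQS SendMessageBatch accepts at most 10 entries per request.
    static final int SQS_BATCH_LIMIT = 10;

    // Group message bodies into chunks small enough for SendMessageBatch.
    public static List<List<String>> chunkForSqs(List<String> messageBodies) {
        List<List<String>> chunks = new ArrayList<>();
        for (int i = 0; i < messageBodies.size(); i += SQS_BATCH_LIMIT) {
            chunks.add(messageBodies.subList(i,
                    Math.min(i + SQS_BATCH_LIMIT, messageBodies.size())));
        }
        return chunks;
    }

    public static void main(String[] args) {
        // 25 hypothetical item payloads fetched in step 1.
        List<String> bodies = new ArrayList<>();
        for (int i = 0; i < 25; i++) bodies.add("item-" + i);

        for (List<String> chunk : chunkForSqs(bodies)) {
            // Hypothetical AWS SDK v2 call, omitted from this sketch:
            // sqsClient.sendMessageBatch(b -> b.queueUrl(queueUrl).entries(...));
            System.out.println("would send " + chunk.size() + " messages");
        }
    }
}
```

With an SQS trigger on the processing Lambda, each delivered message (or batch of messages, depending on the trigger's batch size) invokes one execution.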
To achieve this, you need to schedule an event via CloudWatch Events, set up an SQS queue, and create a Lambda function triggered by SQS events.
Honestly, I'm not sure whether this is cost-effective, but it should work. Given that your fetch size is fairly small, the cost should be reasonable.
You could also try SNS instead; in that case you don't need to worry about SQS message polling.