I was wondering what the best way is to send data from DynamoDB to Elasticsearch.
Using the AWS SDK for JavaScript in a Lambda function: https://github.com/Stockflare/lambda-dynamo-to-elasticsearch/blob/master/index.js
DynamoDB logstash plugin: https://github.com/awslabs/logstash-input-dynamodb
Follow this AWS blog post; it describes in detail how this can and should be done:
https://aws.amazon.com/blogs/compute/indexing-amazon-dynamodb-content-with-amazon-elasticsearch-service-using-aws-lambda/
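The core of the Lambda approach described in that post is turning DynamoDB stream records into Elasticsearch index operations. A minimal sketch of that transformation is below; the hand-rolled `unmarshall` only covers the `S`, `N`, and `BOOL` attribute types for brevity (the AWS SDK's `AWS.DynamoDB.Converter.unmarshall` handles every type), and actually sending the resulting `_bulk` body to the Elasticsearch endpoint is omitted:

```javascript
// Minimal unmarshaller for the DynamoDB AttributeValue maps found in
// stream records. Handles only S (string), N (number) and BOOL here;
// use AWS.DynamoDB.Converter.unmarshall in real code.
function unmarshall(image) {
  const out = {};
  for (const [key, av] of Object.entries(image)) {
    if ('S' in av) out[key] = av.S;
    else if ('N' in av) out[key] = Number(av.N);
    else if ('BOOL' in av) out[key] = av.BOOL;
  }
  return out;
}

// Turn a batch of DynamoDB stream records into an Elasticsearch _bulk
// request body: one action line plus one document line per record
// (REMOVE events would map to delete actions; skipped here for brevity).
function toBulkBody(event, index) {
  const lines = [];
  for (const record of event.Records) {
    if (record.eventName === 'REMOVE') continue;
    const doc = unmarshall(record.dynamodb.NewImage);
    // Derive a stable document id from the table's key attributes.
    const id = Object.values(record.dynamodb.Keys)
      .map(av => av.S || av.N)
      .join('|');
    lines.push(JSON.stringify({ index: { _index: index, _id: id } }));
    lines.push(JSON.stringify(doc));
  }
  return lines.join('\n') + '\n'; // _bulk bodies must end with a newline
}
```

The `_bulk` body format (alternating action and document lines, newline-terminated) is what the Elasticsearch bulk endpoint expects, so one Lambda invocation can index a whole stream batch in a single request.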
Edit:
I'm assuming you're using the managed Amazon Elasticsearch Service.
- Use DynamoDB Streams to listen for changes (among others, you'll get events for new items added to the table).
- Create a new Kinesis Data Firehose delivery stream configured to deliver all records to your Elasticsearch domain.
- Create a new Lambda function that is triggered by new-item events on the DynamoDB stream.
- The Lambda receives the unique DynamoDB record key; use it to fetch the record's payload, then ingest that payload into the Firehose delivery stream.
- Depending on your record size, you might enable the option to include the record's payload in the stream event itself, so you won't need to fetch it from the table and consume the provisioned read capacity you've set.
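That last option corresponds to the stream's `StreamViewType`: with `NEW_IMAGE` (or `NEW_AND_OLD_IMAGES`) the full item arrives inside the stream record, while `KEYS_ONLY` forces an extra read. A small sketch of how the Lambda can branch on this; `fetchItem` is a placeholder for whatever GetItem wrapper you'd use:

```javascript
// Pick the item payload for a single DynamoDB stream record.
// With StreamViewType NEW_IMAGE the record already carries the full item,
// so we can skip the GetItem call and the read capacity it would consume.
// fetchItem is a hypothetical helper that reads the item by its keys.
function payloadFor(record, fetchItem) {
  if (record.dynamodb.NewImage) {
    return record.dynamodb.NewImage;      // full item shipped in the event
  }
  return fetchItem(record.dynamodb.Keys); // KEYS_ONLY stream: one extra read
}
```

For small-to-medium items, shipping the image in the stream is usually the simpler and cheaper choice.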
I recommend enabling a DynamoDB stream on your table with a Lambda trigger, then taking the data from the Lambda and writing it into Elasticsearch.
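The final write from the Lambda is just an HTTP POST to the Elasticsearch `_bulk` endpoint. A sketch of building that request with Node's standard library follows; the host name is a placeholder, and against the managed Amazon Elasticsearch Service the request must additionally be SigV4-signed (e.g. with the `aws4` npm package), which is omitted here:

```javascript
// Build the options for an https.request POST to the Elasticsearch _bulk
// endpoint. body is an NDJSON bulk payload (action line + document line
// pairs, newline-terminated).
function bulkRequestOptions(host, body) {
  return {
    host,
    method: 'POST',
    path: '/_bulk',
    headers: {
      'Content-Type': 'application/x-ndjson',     // required by _bulk
      'Content-Length': Buffer.byteLength(body),  // byte length, not chars
    },
  };
}

// Sending would then look like:
//   const req = require('https').request(bulkRequestOptions(host, body), cb);
//   req.end(body);
```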