I am using Logstash to index data from Postgres (via the jdbc input plugin) into Elasticsearch. There is no time-based information in the database. The Postgres table to import, `users`, has two columns: `userid` (unique) and `uname`. In Elasticsearch, the document `_id` is set to `userid`. I export this data every hour using a cron schedule in Logstash:
input {
  jdbc {
    schedule => "0 */1 * * *"
    statement => "SELECT userid, uname FROM users"
  }
}

output {
  elasticsearch {
    hosts => ["elastic_search_host"]
    index => "user_data"
    document_id => "%{userid}"
  }
}
This Logstash config indexes the data correctly, but only for the insert and update cases. If a user row is deleted from the table, the corresponding document is not deleted from the Elasticsearch index. Can someone please help me with the delete case?
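To make the failure mode concrete, here is a minimal Python sketch (hypothetical data, no real Postgres or Elasticsearch involved) simulating what the hourly sync does: every run upserts all rows currently in the table, keyed by `userid`, so a document whose source row was deleted is never removed.

```python
def sync(index: dict, rows: list) -> None:
    """Upsert each row into the index keyed by userid.

    This mirrors what document_id => "%{userid}" does in the
    elasticsearch output: existing docs are overwritten, new docs
    are created, but nothing is ever deleted.
    """
    for row in rows:
        index[row["userid"]] = row["uname"]

index = {}

# First hourly run: two users exist in Postgres.
sync(index, [{"userid": 1, "uname": "alice"},
             {"userid": 2, "uname": "bob"}])

# User 2 is then deleted from the Postgres table.
# Next hourly run only sees user 1...
sync(index, [{"userid": 1, "uname": "alice"}])

# ...but the stale document for user 2 is still in the index.
print(index)  # {1: 'alice', 2: 'bob'}
```

This is why an upsert-only sync cannot propagate deletes on its own: the query result contains no trace of rows that no longer exist.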