I'm a complete newbie to the ELK stack, so please excuse my ignorance. I've been able to get Logstash to send data from my database to Elasticsearch, but it exits once it's done with the transfer. How do I keep it running so it keeps them in sync? Thanks
Answer 1:
You need to specify a schedule in your jdbc input. The schedule below (* * * * *) uses cron syntax and runs the query every minute, selecting only the records that have been updated since the last time the query ran. Your updated timestamp field might be named differently; feel free to adjust it to fit your case.
input {
  jdbc {
    jdbc_driver_library => "mysql-connector-java-5.1.36-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "mysql"
    parameters => { "some_field" => "value" }
    # Cron-style schedule: run the query every minute
    schedule => "* * * * *"
    # :sql_last_value holds the time the query last ran, so only
    # rows updated since then are fetched on each run
    statement => "SELECT * from songs WHERE some_field = :some_field AND updated > :sql_last_value"
  }
}
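
For completeness, the pipeline also needs an output section so the fetched rows actually land in Elasticsearch. Below is a minimal sketch; the host, the songs index name, and the id column are assumptions on my part, not part of the config above. Reusing the table's primary key as the document_id means a re-fetched row overwrites its existing document instead of creating a duplicate, which is what keeps the index in sync with the database.

output {
  elasticsearch {
    # Assumed local Elasticsearch instance; adjust to your cluster
    hosts => ["localhost:9200"]
    # Hypothetical index name
    index => "songs"
    # Assumes the table has an `id` primary-key column; using it as the
    # document id makes repeated runs update documents rather than duplicate them
    document_id => "%{id}"
  }
}

You can then start the pipeline with bin/logstash -f your-config.conf and leave it running; with a schedule set, Logstash stays alive between query runs instead of exiting after the first transfer.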