I'm a complete newbie to the ELK stack, so please excuse my ignorance. I've been able to get Logstash to send data from my database to Elasticsearch, but it exits once it's done with the transfer. How do I keep it running so it keeps them in sync? Thanks
You need to specify a `schedule` in your `jdbc` input. The schedule below (`* * * * *`) runs the query every minute and selects only the records that have been updated since the last time the query ran. Your `updated` timestamp field might be named differently, so feel free to adjust it to fit your case.
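Here is a minimal sketch of such a pipeline, assuming a MySQL database, a table called `my_table` with an `updated` timestamp column and an `id` primary key, and an index named `my_index` (all of these names, paths, and credentials are placeholders):

```
input {
  jdbc {
    # Connection details below are placeholders -- adjust for your database.
    jdbc_driver_library => "/path/to/mysql-connector-j.jar"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "myuser"
    jdbc_password => "mypassword"
    # Run the query every minute (cron syntax).
    schedule => "* * * * *"
    # :sql_last_value holds the tracking column's value from the previous run,
    # so each run only picks up rows changed since then.
    statement => "SELECT * FROM my_table WHERE updated > :sql_last_value"
    use_column_value => true
    tracking_column => "updated"
    tracking_column_type => "timestamp"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "my_index"
    # Reuse the table's primary key so updated rows overwrite their old documents
    # instead of creating duplicates.
    document_id => "%{id}"
  }
}
```

Logstash persists the last `:sql_last_value` to a metadata file between runs (by default `.logstash_jdbc_last_run` in the Logstash user's home directory), so the sync picks up where it left off even after a restart.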