Sync PostgreSQL data with Elasticsearch

Published 2020-05-19 05:30

Ultimately I want to have a scalable search solution for the data in PostgreSQL. My findings point me towards using Logstash to ship write events from Postgres to Elasticsearch; however, I have not found a usable solution. The solutions I have found involve using the jdbc input to query all data from Postgres on an interval, and delete events are not captured.

I think this is a common use case, so I hope you can share your experience or give me some pointers on how to proceed.

3 Answers
我命由我不由天
#2 · 2020-05-19 05:58

If you also need to be notified of DELETEs and remove the corresponding documents from Elasticsearch, it is true that the Logstash jdbc input will not help. You'd have to use a solution that works off the database's transaction log (the WAL in PostgreSQL, the counterpart of MySQL's binlog), as suggested here.

However, if you still want to use the Logstash jdbc input, you could simply soft-delete records in PostgreSQL, i.e. add a new BOOLEAN column to mark your records as deleted. The same flag would then exist in Elasticsearch, and you can exclude soft-deleted records from your searches with a simple term query on the deleted field.

Whenever you need to perform some cleanup, you can delete all records flagged as deleted from both PostgreSQL and Elasticsearch.
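
As a rough illustration, here is a minimal sketch of both sides of that approach, using the official elasticsearch Python client (8.x API) and psycopg2. The products table/index, the deleted column, and the connection details are placeholder assumptions, and Logstash's jdbc input is assumed to keep the flag mirrored into Elasticsearch:

# Minimal sketch of the soft-delete approach. The "products" table/index,
# the "deleted" BOOLEAN column, and connection details are assumptions;
# Logstash's jdbc input is assumed to mirror the flag into Elasticsearch.
import psycopg2
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

# Searching: exclude soft-deleted documents with a must_not term clause.
results = es.search(
    index="products",
    query={
        "bool": {
            "must": [{"match": {"title": "postgres"}}],
            "must_not": [{"term": {"deleted": True}}],
        }
    },
)

# Cleanup: purge records flagged as deleted from both stores.
es.delete_by_query(index="products", query={"term": {"deleted": True}})
with psycopg2.connect("dbname=mydb user=postgres") as conn:
    with conn.cursor() as cur:
        cur.execute("DELETE FROM products WHERE deleted = true")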

劳资没心,怎么记你
#3 · 2020-05-19 06:08

You can also take a look at PGSync.

It's similar to Debezium but a lot easier to get up and running.

PGSync is a change data capture (CDC) tool for moving data from Postgres to Elasticsearch. It allows you to keep Postgres as your source of truth and expose structured, denormalized documents in Elasticsearch.

You simply define a JSON schema describing the structure of the data in Elasticsearch.

Here is an example schema (nested objects are also supported):

{
    "nodes": [
        {
            "table": "book",
            "columns": [
                "isbn",
                "title",
                "description"
            ]
        }
    ]
}

PGSync generates the queries for your documents on the fly, so unlike Logstash there is no need to write queries yourself. It also supports and tracks delete operations.

It combines a polling model with an event-driven model: the initial sync polls the database for changes made since the last time the daemon was run, and thereafter changes are captured as they occur through event notifications (based on triggers and handled by pg_notify).
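
PGSync wires all of this up for you, but to give a rough idea of the underlying mechanism, here is a minimal sketch of triggers plus LISTEN/NOTIFY with psycopg2. The channel name, trigger function, and book table are hypothetical illustrations, not PGSync's actual internals:

# Sketch of the trigger + LISTEN/NOTIFY mechanism PGSync builds on.
# The "table_changes" channel, the trigger function, and the "book" table
# are hypothetical; PGSync manages its own triggers and channels.
import select
import psycopg2
import psycopg2.extensions

conn = psycopg2.connect("dbname=mydb user=postgres")
conn.set_isolation_level(psycopg2.extensions.ISOLATION_LEVEL_AUTOCOMMIT)
cur = conn.cursor()

# A trigger function that emits a notification for every write (PG 11+ syntax).
cur.execute("""
    CREATE OR REPLACE FUNCTION notify_change() RETURNS trigger AS $$
    BEGIN
        PERFORM pg_notify('table_changes', TG_TABLE_NAME || ':' || TG_OP);
        RETURN COALESCE(NEW, OLD);
    END;
    $$ LANGUAGE plpgsql;

    CREATE TRIGGER book_notify
    AFTER INSERT OR UPDATE OR DELETE ON book
    FOR EACH ROW EXECUTE FUNCTION notify_change();
""")

# Subscribe and wait for change events.
cur.execute("LISTEN table_changes;")
while True:
    if select.select([conn], [], [], 5) == ([], [], []):
        continue  # timeout: no notifications yet
    conn.poll()
    while conn.notifies:
        notify = conn.notifies.pop(0)
        print(f"change: {notify.payload}")  # e.g. "book:INSERT"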

It has very little development overhead.

  • Create a schema as described above.
  • Point pgsync at your Postgres database and Elasticsearch cluster.
  • Start the daemon.

You can easily create a document that includes multiple relations as nested objects. PGSync tracks any changes for you.

Have a look at the GitHub repo for more details.

You can install the package from PyPI with pip install pgsync.

forever°为你锁心
#4 · 2020-05-19 06:11

Please take a look at Debezium. It's a change data capture (CDC) platform which allows you to stream your data changes.

I created a simple GitHub repository that shows how it works.
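
To give a rough idea of the setup (not necessarily what the repository does): Debezium connectors are typically registered through Kafka Connect's REST API. A minimal Python sketch follows, with placeholder hostnames, credentials, and table names; note that Debezium lands change events in Kafka, so you still need a sink (e.g. the Elasticsearch sink connector) to push them on into Elasticsearch:

# Sketch: registering a Debezium Postgres connector via Kafka Connect's
# REST API (default port 8083). Hostnames, credentials, and the table
# list are placeholders; property names follow Debezium 2.x.
import requests

connector = {
    "name": "books-connector",
    "config": {
        "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
        "plugin.name": "pgoutput",        # built-in logical decoding plugin
        "database.hostname": "localhost",
        "database.port": "5432",
        "database.user": "postgres",
        "database.password": "postgres",
        "database.dbname": "mydb",
        "topic.prefix": "mydb",           # namespace for the Kafka topics
        "table.include.list": "public.book",
    },
}

resp = requests.post("http://localhost:8083/connectors", json=connector)
resp.raise_for_status()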

