Is there a way (plugin or tool) to export the data from the database (or the database itself)? I'm looking for this feature as I need to migrate a DB from the present host to another one.
As ezotrank says, you can dump each table. There's a missing "-d" in ezotrank's answer though. It should be:
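A sketch of the corrected call (host, database, and file names are placeholders for the old 0.8-era HTTP API):

```
curl -X POST -d @series_name.json \
  "http://newhost:8086/db/dbname/series?u=root&p=root"
```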
(Ezotrank, sorry, I would've just posted a comment directly on your answer, but I don't have enough reputation points to do that yet.)
From 1.5 onwards, the InfluxDB OSS backup utility provides a newer option which is much more convenient:

-portable: Generates backup files in the newer InfluxDB Enterprise-compatible format. Highly recommended for all InfluxDB OSS users.

To back up everything:
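For example (a sketch; the backup directory path is a placeholder):

```
influxd backup -portable /path/to/backup_dir
```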
To back up only the myperf database:
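(Again, a sketch with a placeholder path:)

```
influxd backup -portable -db myperf /path/to/backup_dir
```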
To restore all databases found within the backup directory:
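(Placeholder path as above:)

```
influxd restore -portable /path/to/backup_dir
```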
To restore only the myperf database (myperf database must not exist):
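(Placeholder path as above:)

```
influxd restore -portable -db myperf /path/to/backup_dir
```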
Additional options include specifying timestamp, shard, etc. See all the other supported options here.

If you want to export in a readable format, the influx_inspect export command is preferable. To export the database named HomeData, the command is:
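A sketch, assuming the default data and WAL directories of a packaged 1.x install (adjust to match your influxdb.conf):

```
sudo influx_inspect export \
  -waldir /var/lib/influxdb/wal \
  -datadir /var/lib/influxdb/data \
  -database HomeData \
  -out influx_backup.db
```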
The values for -waldir and -datadir can be found in /etc/influxdb/influxdb.conf.
To import this file again, the command is:
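A sketch, using the output file from the export above (add -precision if your data needs it):

```
influx -import -path=influx_backup.db
```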
If you have access to the machine running InfluxDB, I would say use the influx_inspect export command. The command is simple and very fast. It will dump your database in line protocol. You can then import this dump using the influx -import command, as sketched below.
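A minimal sketch of that workflow, assuming default data/WAL paths and using the gzip options (-compress on export, -compressed on import) to keep the dump small:

```
# On the source host: dump everything as gzipped line protocol
sudo influx_inspect export \
  -datadir /var/lib/influxdb/data \
  -waldir /var/lib/influxdb/wal \
  -compress \
  -out /tmp/influx_export.gz

# On the target host: import the dump
influx -import -path=/tmp/influx_export.gz -compressed
```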
You could dump each table and load it through the REST interface:
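A sketch of that dump/load pair against the legacy 0.8-era HTTP API (host names, credentials, database, and series names are placeholders; note the -d fix pointed out in the answer above):

```
# Dump one series to JSON from the old host
curl "http://oldhost:8086/db/dbname/series?u=root&p=root&q=select%20*%20from%20series_name" \
  > series_name.json

# Load it into the new host
curl -X POST -d @series_name.json \
  "http://newhost:8086/db/dbname/series?u=root&p=root"
```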
Or maybe you want to add a new host to the cluster? It's easy, and you'll get a master-master replica for free. See Cluster Setup.
If I use curl, I get timeouts, and if I use influxd backup, it's not in a format I can read. I'm getting fine results like this:
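One approach that produces readable output via the influx CLI's CSV format (a sketch; the database and measurement names are placeholders):

```
influx -database 'mydb' -execute 'SELECT * FROM "some_measurement"' -format csv > dump.csv
```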