How to export all collections in MongoDB?

Published 2019-01-16 00:07

Question:

I want to export all collections in MongoDB with this command:

mongoexport -d dbname -o Mongo.json

The result is:
No collection specified!

The manual says that if you do not specify a collection, all collections will be exported.
So why doesn't this work?

http://docs.mongodb.org/manual/reference/mongoexport/#cmdoption-mongoexport--collection

My MongoDB version is 2.0.6

Answer 1:

For lazy people like me, I use mongodump; it's faster:

mongodump -d <database_name> -o <directory_backup>

And to "restore/import" that, i used (from directory_backup/dump/):

mongorestore -d <database_name> <directory_backup>

With this solution, you don't need to list every collection and export them one by one; just specify the database. That said, I would recommend against mongodump/mongorestore for big data stores: it is very slow, and once you get past 10-20 GB of data it can take hours to restore.



Answer 2:

I wrote a bash script for that. Just run it with two parameters (the database name and the directory to store the files in).

#!/bin/bash
# Export every collection of the given database to JSON with mongoexport.

if [ -z "$1" ]; then
        echo "Example of use: $0 database_name [dir_to_store]"
        exit 1
fi
db="$1"
out_dir="$2"
if [ -z "$out_dir" ]; then
        out_dir="./"
else
        mkdir -p "$out_dir"
fi

# Ask the mongo shell for the collection names, prefixed with '_ ' so the
# line is easy to pick out of the shell's output.
tmp_file="fadlfhsdofheinwvw.js"
echo "print('_ ' + db.getCollectionNames())" > "$tmp_file"
cols=$(mongo "$db" "$tmp_file" | grep '_' | awk '{print $2}' | tr ',' ' ')

# Export each collection to its own JSON file.
for c in $cols
do
    mongoexport -d "$db" -c "$c" -o "$out_dir/exp_${db}_${c}.json"
done
rm "$tmp_file"
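
With the script saved as, say, export_all.sh (the file name is just an example), usage would look like this:

chmod +x export_all.sh
./export_all.sh mydb ./backup    # writes ./backup/exp_mydb_<collection>.json for each collection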


Answer 3:

Where have you installed your MongoDB (Ubuntu or Windows)?

  • For Windows:

    1. Before exporting, connect to your MongoDB from a cmd prompt and make sure you can reach your local host.
    2. Now open a new cmd prompt and execute the command below:

    mongodump --db <database name> --out <path to save to>
    e.g.: mongodump --db mydb --out c:\TEMP\dump (note that --out names a directory; mongodump writes BSON dump files into it, not a single JSON file)

    3. Visit https://www.youtube.com/watch?v=hOCp3Jv6yKo for more details.
  • For Ubuntu:

    1. Log in to the terminal where MongoDB is installed and make sure you can connect to your MongoDB.
    2. Now open a new terminal and execute the command below:

    mongodump -d <database name> -o <directory to save to>
    e.g.: mongodump -d mydb -o output (again, -o names a directory of BSON dump files, not a single JSON file)

    3. Visit https://www.youtube.com/watch?v=5Fwd2ZB86gg for more details.


Answer 4:

Follow the steps below to create a mongodump from a server and import it on another server/local machine, using a username and a password:

1. mongodump -d dbname -o dumpname -u username -p password
2. scp -r user@remote:~/location/of/dumpname ./
3. mongorestore -d dbname dumpname/dbname/ -u username -p password


Answer 5:

To export all collections using mongodump, use the following command:

mongodump -d database_name -o directory_to_store_dumps

To restore, use this command:

mongorestore -d database_name directory_backup_where_mongodb_tobe_restored


Answer 6:

If you are OK with the BSON format, you can use the mongodump utility with the same -d flag. It will dump all the collections to the dump directory (the default, which can be changed via the -o option) in BSON format. You can then import those files using the mongorestore utility.
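
A minimal sketch of that workflow, with mydb standing in for your database name:

# dump every collection of mydb in BSON format into ./dump/mydb/
mongodump -d mydb

# later, restore all of it from that directory
mongorestore -d mydb dump/mydb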



Answer 7:

You can use mongo --eval 'printjson(db.getCollectionNames())' to get the list of collections and then run mongoexport on each of them. Here is an example in Ruby:

  out = `mongo  #{DB_HOST}/#{DB_NAME} --eval "printjson(db.getCollectionNames())"`

  collections = out.scan(/\".+\"/).map { |s| s.gsub('"', '') }

  collections.each do |collection|
    system "mongoexport --db #{DB_NAME}  --collection #{collection}  --host '#{DB_HOST}' --out #{collection}_dump"
  end


Answer 8:

I needed the Windows batch script version. This thread was useful, so I thought I'd contribute my answer too.

mongo "{YOUR SERVER}/{YOUR DATABASE}" --eval "rs.slaveOk();db.getCollectionNames()" --quiet>__collections.txt
for /f %%a in ('type __collections.txt') do @set COLLECTIONS=%%a
for %%a in (%COLLECTIONS%) do mongoexport --host {YOUR SERVER} --db {YOUR DATABASE} --collection %%a --out data\%%a.json
del __collections.txt

I had some issues using set /p COLLECTIONS=<__collections.txt, hence the convoluted for /f method.



Answer 9:

If you want, you can export all collections to CSV without specifying --fields (all fields will be exported).

From http://drzon.net/export-mongodb-collections-to-csv-without-specifying-fields/, run this bash script:

OIFS=$IFS;
IFS=",";

# fill in your details here
dbname=DBNAME
user=USERNAME
pass=PASSWORD
host=HOSTNAME:PORT

# first get all collections in the database
collections=`mongo "$host/$dbname" -u $user -p $pass --eval "rs.slaveOk();db.getCollectionNames();"`;
# without authentication, use instead: collections=`mongo $dbname --eval "rs.slaveOk();db.getCollectionNames();"`;
collectionArray=($collections);

# for each collection
for ((i=0; i<${#collectionArray[@]}; ++i));
do
    echo 'exporting collection' ${collectionArray[$i]}
    # get comma separated list of keys. do this by peeking into the first document in the collection and get his set of keys
    keys=`mongo "$host/$dbname" -u $user -p $pass --eval "rs.slaveOk();var keys = []; for(var key in db.${collectionArray[$i]}.find().sort({_id: -1}).limit(1)[0]) { keys.push(key); }; keys;" --quiet`;
    # now use mongoexport with the set of keys to export the collection to csv
    mongoexport --host $host -u $user -p $pass -d $dbname -c ${collectionArray[$i]} --fields "$keys" --csv --out $dbname.${collectionArray[$i]}.csv;
done

IFS=$OIFS;


Answer 10:

After trying lots of convoluted examples, I found that a very simple approach worked for me.

I just wanted to take a dump of a local db and import it on a remote instance:

on the local machine:

mongodump -d databasename

then I scp'd my dump to my server machine:

scp -r dump user@xx.xxx.xxx.xxx:~

then from the parent dir of the dump simply:

mongorestore 

and that imported the database.

assuming the mongod service is running, of course.



Answer 11:

If you want to dump all collections in all databases (which is an expansive interpretation of the original questioner's intent), then use:

mongodump

All the databases and collections will be written to a directory called 'dump' in the current working directory.
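
For example (a sketch; both commands are run from the same working directory):

mongodump             # dumps every database into ./dump/<db name>/
mongorestore dump/    # later, restores every database found under ./dump/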



Answer 12:

You can create a gzipped archive of the database {dbname} using the following command, and later import that archive back into your MongoDB.

Windows file path example: filepath=C:\Users\Username\mongo

mongodump --archive={filepath}\{filename}.gz --gzip --db {dbname}
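
To bring that archive back in later, mongorestore accepts the same --archive and --gzip flags (a sketch using the same placeholders; it restores whatever the archive contains):

mongorestore --gzip --archive={filepath}\{filename}.gz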


Answer 13:

In case you want to connect to a remote MongoDB server, such as mongolab.com, you should pass connection credentials, e.g.:

mongoexport -h id.mongolab.com:60599 -u username -p password -d mydb -c mycollection -o mybackup.json


Answer 14:

Previous answers explained it well; I am adding mine to help in case you are dealing with a remote, password-protected database:

mongodump --host xx.xxx.xx.xx --port 27017 --db your_db_name --username your_user_name --password your_password --out /target/folder/path


Answer 15:

Here's what worked for me when restoring an exported database:

mongorestore -d 0 ./0 --drop

where ./0 contained the exported BSON files. Note that --drop will overwrite existing data.



Answer 16:

I realize that this is quite an old question and that mongodump/mongorestore is clearly the right way if you want a 100% faithful result, including indexes.

However, I needed a quick and dirty solution that would likely be forwards and backwards compatible between old and new versions of MongoDB, provided there's nothing especially wacky going on. And for that I wanted the answer to the original question.

There are other acceptable solutions above, but this Unix pipeline is relatively short and sweet:

mongo --quiet mydatabase --eval "db.getCollectionNames().join('\n')" | \
grep -v system.indexes | \
xargs -L 1 -I {} mongoexport -d mydatabase -c {} --out {}.json

This produces an appropriately named .json file for each collection.

Note that the database name ("mydatabase") appears twice. I'm assuming the database is local and you don't need to pass credentials, but it's easy to do that with both mongo and mongoexport, as sketched below.
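
If you do need authentication, the same pipeline with credentials added to both commands might look like this (the username, password, and authentication database are placeholders):

mongo --quiet -u myuser -p mypassword --authenticationDatabase admin mydatabase \
  --eval "db.getCollectionNames().join('\n')" | \
grep -v system.indexes | \
xargs -L 1 -I {} mongoexport -u myuser -p mypassword --authenticationDatabase admin \
  -d mydatabase -c {} --out {}.json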

Note that I'm using grep -v to discard system.indexes, because I don't want an older version of MongoDB to try to interpret a system collection from a newer one. Instead I'm allowing my application to make its usual ensureIndex calls to recreate the indexes.



Answer 17:

If you want to use mongoexport and mongoimport to export/import each collection of a database, this utility can be helpful. I've used a similar utility a couple of times:

LOADING=false

usage()
{
    cat << EOF
    usage: $0 [options] dbname

    OPTIONS:
        -h      Show this help.
        -l      Load instead of export
        -u      Mongo username
        -p      Mongo password
        -H      Mongo host string (ex. localhost:27017)
EOF
}

while getopts "hlu:p:H:" opt; do
    MAXOPTIND=$OPTIND

    case $opt in 
        h)
            usage
            exit
            ;;
        l)
            LOADING=true
            ;;
        u)
            USERNAME="$OPTARG"
            ;;
        p) 
            PASSWORD="$OPTARG"
            ;;
        H)
            HOST="$OPTARG"
            ;;
        \?)
            echo "Invalid option $opt"
            exit 1
            ;;
    esac
done

shift $(($MAXOPTIND-1))

if [ -z "$1" ]; then
    echo "Usage: export-mongo [opts] <dbname>"
    exit 1
fi

DB="$1"
if [ -z "$HOST" ]; then
    CONN="localhost:27017/$DB"
else
    CONN="$HOST/$DB"
fi

ARGS=""
if [ -n "$USERNAME" ]; then
    ARGS="-u $USERNAME"
fi
if [ -n "$PASSWORD" ]; then
    ARGS="$ARGS -p $PASSWORD"
fi

echo "*************************** Mongo Export ************************"
echo "**** Host:      $HOST"
echo "**** Database:  $DB"
echo "**** Username:  $USERNAME"
echo "**** Password:  $PASSWORD"
echo "**** Loading:   $LOADING"
echo "*****************************************************************"

if $LOADING ; then
    echo "Loading into $CONN"
    tar -xzf $DB.tar.gz
    pushd $DB >/dev/null

    for path in *.json; do
        collection=${path%.json}
        echo "Loading into $DB/$collection from $path"
        mongoimport $ARGS -d $DB -c $collection $path
    done

    popd >/dev/null
    rm -rf $DB
else
    DATABASE_COLLECTIONS=$(mongo $CONN $ARGS --quiet --eval 'db.getCollectionNames()' | sed 's/,/ /g')

    mkdir /tmp/$DB
    pushd /tmp/$DB 2>/dev/null

    for collection in $DATABASE_COLLECTIONS; do
        mongoexport --host $HOST -u $USERNAME -p $PASSWORD -d $DB -c $collection --jsonArray -o $collection.json >/dev/null
    done

    pushd /tmp 2>/dev/null
    tar -czf "$DB.tar.gz" $DB 2>/dev/null
    popd 2>/dev/null
    popd 2>/dev/null
    mv /tmp/$DB.tar.gz ./ 2>/dev/null
    rm -rf /tmp/$DB 2>/dev/null
fi
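
With the script saved as, say, export-mongo.sh (an arbitrary name), typical invocations might look like this; note that the export branch passes --host, -u and -p unconditionally, so it effectively expects all three:

# export every collection of mydb into mydb.tar.gz
./export-mongo.sh -H localhost:27017 -u myuser -p mypassword mydb

# later, unpack that archive and load the JSON files back with mongoimport
./export-mongo.sh -l -u myuser -p mypassword mydb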


Answer 18:

If you want to back up all the DBs on the server without having to worry about what the DBs are called, use the following shell script:

#!/bin/sh

md=`which mongodump`
pidof=`which pidof`
mdi=`$pidof mongod`
dir='/var/backup/mongo'

if [ ! -z "$mdi" ]
   then
        if [ ! -d "$dir" ]
           then
               mkdir -p $dir
           fi
        $md --out $dir >/dev/null 2>&1
   fi

This uses the mongodump utility, which will back up all DBs if none is specified.

You can put this in your crontab (an example entry is shown below), and it will only run if the mongod process is running. It will also create the backup directory if it does not exist.

Each DB backup is written to its own directory, so you can restore individual DBs from the global dump.
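
A minimal crontab entry for that, assuming the script is saved as /usr/local/bin/mongo-backup.sh (the path and schedule are placeholders):

# run the Mongo backup script every night at 03:00
0 3 * * * /usr/local/bin/mongo-backup.sh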



Answer 19:

Alternatively, you can do these kinds of operations with a GUI such as Robomongo or Mongochef.



Answer 20:

#!/bin/bash
# mongodump backup script

TIMESTAMP=`date +%F-%H%M`
APP_NAME="folder_name"
BACKUPS_DIR="/xxxx/tst_file_bcup/$APP_NAME"
BACKUP_NAME="$APP_NAME-$TIMESTAMP"

# dump the database into a timestamped directory, then archive it
/usr/bin/mongodump -h 127.0.0.1 -d <dbname> -o $BACKUPS_DIR/$APP_NAME/$BACKUP_NAME
tar -zcvf $BACKUPS_DIR/$BACKUP_NAME.tgz $BACKUPS_DIR/$APP_NAME/$BACKUP_NAME

# remove the uncompressed dump once it has been archived
rm -rf $BACKUPS_DIR/$APP_NAME/$BACKUP_NAME

### delete backups older than 7 days automatically
find $BACKUPS_DIR/ -mindepth 1 -mtime +7 -delete


Answer 21:

  1. Open the connection
  2. Start the server
  3. Open a new command prompt

Export:

mongo/bin> mongoexport -d webmitta -c domain -o domain-k.json

Import:

mongoimport -d dbname -c newCollectionName --file domain-k.json

where:

  • webmitta is the database name
  • domain is the collection name
  • domain-k.json is the output file name


Answer 22:

To export in JSON format, use the following command:

mongoexport --db dbname --collection collectionName --out directoryPATH/JSONfileName.json