odoo 9 migrate binary field db to filestore

Published 2019-09-17 15:33

Question:

In an Odoo 9 custom module, the attachment=True parameter was added to a Binary field only later, so records created after that change are stored in the filestore. The old records were written before attachment=True was used, so no entry was created for them in the ir_attachment table and nothing was saved to the filesystem. How can I migrate the old records' binary field values into the filestore? How do I create/insert the ir_attachment rows based on the old records' binary field values? Is there a script available for this?
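For reference, this is the kind of field definition involved. A minimal sketch with made-up model and field names (my.model, datas are placeholders, not from the original question):

# Hypothetical Odoo 9 model for illustration only.
from openerp import models, fields

class MyModel(models.Model):
    _name = 'my.model'

    # Originally declared as: datas = fields.Binary('Data')
    # After adding attachment=True, new writes go to the filestore via
    # ir_attachment, but values written before the change stay in the
    # database column and have no ir_attachment row.
    datas = fields.Binary('Data', attachment=True)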

Answer 1:

You have to include the PostgreSQL bin path in pg_path in your configuration file. This will restore the filestore that contains the binary fields:

pg_path = D:\fx\upsynth_Postgres\bin


Answer 2:

I'm sure you no longer need a solution to this, as you asked 18 months ago, but I have just had the same issue (many gigabytes of binary data in the database) and this question came up on Google, so I thought I would share my solution.

When you set attachment=True, the binary column remains in the database, but the system looks in the filestore for the data instead. This left me unable to access the data through the Odoo API, so I needed to retrieve the binary data directly from the database, re-write it to each record through the Odoo API (which moves it into the filestore), and finally drop the column and vacuum the table.

Here is my script, which is inspired by this solution for migrating attachments, but this solution will work for any field in any model, and it reads the binary data from the database rather than from the Odoo API.

import xmlrpclib
import psycopg2

username = 'your_odoo_username'
pwd = 'your_odoo_password'
url = 'http://ip-address:8069'
dbname = 'database-name'

model = 'model.name'
field = 'field_name'

dbuser = 'postgres_user'
dbpwd = 'postgres_password'
dbhost = 'postgres_host'

conn = psycopg2.connect(database=dbname, user=dbuser, password=dbpwd, host=dbhost, port='5432')
cr = conn.cursor()

# Get the uid
sock_common = xmlrpclib.ServerProxy('%s/xmlrpc/common' % url)
uid = sock_common.login(dbname, username, pwd)
sock = xmlrpclib.ServerProxy('%s/xmlrpc/object' % url)

table = model.replace('.', '_')

def migrate_attachment(res_id):
    # 1. Read the base64-encoded value directly from the database column,
    # because once attachment=True is set the ORM looks in the filestore
    # and no longer returns the old in-database value.
    cr.execute("SELECT %s FROM %s WHERE id = %s" % (field, table, res_id))
    data = cr.fetchone()[0]

    # 2. Re-write the value through the API so Odoo stores it in the
    # filestore and creates the corresponding ir_attachment row.
    if data:
        sock.execute(dbname, uid, pwd, model, 'write', [res_id], {field: str(data)})
        return True
    else:
        return False

# Select all records of the model:
records = sock.execute(dbname, uid, pwd, model, 'search', [])
cnt = len(records)
print cnt

for i, res_id in enumerate(records, 1):
    status = migrate_attachment(res_id)
    print 'Migrated ID %s (record %s of %s) [Contained data: %s]' % (res_id, i, cnt, status)

cr.close()
conn.close()

print "done ..."

Afterwards, drop the column and vacuum the table in psql.
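For example, assuming the placeholder names from the script above (the table name is the model name with dots replaced by underscores):

-- Run in psql once the migration script has finished successfully.
-- 'model_name' and 'field_name' are the placeholders used above.
ALTER TABLE model_name DROP COLUMN field_name;
VACUUM FULL model_name;

Note that VACUUM FULL rewrites the table and takes an exclusive lock on it, but it is what actually returns the freed disk space to the operating system, so run it during a quiet period.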