urllib2 POST progress monitoring

Published 2019-01-22 00:16

Question:

I'm uploading a fairly large file to a server-side script via a urllib2 POST. I want to display a progress indicator that shows the current upload progress. Is there a hook or callback in urllib2 that lets me monitor upload progress? I know you can do this for downloads with successive calls to the connection's read() method, but I don't see a write() method; you just hand the data to the request.

Answer 1:

It is possible, but you need to do two things:

  • Fake out the urllib2 subsystem into passing a file handle down to httplib by defining a __len__ method so that len(data) returns the correct size; urllib2 uses this to populate the Content-Length header.
  • Override the read() method on your file handle: httplib calls read() repeatedly, so each call invokes your callback, letting you compute the percentage and update your progress bar.

This could work with any file-like object, but I've wrapped file to show how it could work with a really large file streamed from disk:

import os, urllib2

class Progress(object):
    def __init__(self):
        self._seen = 0.0

    def update(self, total, size, name):
        self._seen += size
        pct = (self._seen / total) * 100.0
        print '%s progress: %.2f' % (name, pct)

class file_with_callback(file):
    def __init__(self, path, mode, callback, *args):
        file.__init__(self, path, mode)
        self.seek(0, os.SEEK_END)
        self._total = self.tell()
        self.seek(0)
        self._callback = callback
        self._args = args

    def __len__(self):
        # urllib2 calls len() on the request data to set Content-Length.
        return self._total

    def read(self, size=-1):
        # httplib reads the body in blocks; report each block to the callback.
        data = file.read(self, size)
        self._callback(self._total, len(data), *self._args)
        return data

path = 'large_file.txt'
progress = Progress()
stream = file_with_callback(path, 'rb', progress.update, path)
req = urllib2.Request(url, stream)
res = urllib2.urlopen(req)

Output:

large_file.txt progress: 0.68
large_file.txt progress: 1.36
large_file.txt progress: 2.04
large_file.txt progress: 2.72
large_file.txt progress: 3.40
...
large_file.txt progress: 99.20
large_file.txt progress: 99.87
large_file.txt progress: 100.00
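For what it's worth, the same trick ports to Python 3's urllib.request: give a file wrapper a __len__ method so Content-Length can be computed, and override read() to report progress. A minimal sketch under that assumption (the commented-out Request call and URL at the end are placeholders, not tested against a real server):

```python
import io
import os

class FileWithCallback(io.FileIO):
    """File wrapper that reports read progress via a callback."""

    def __init__(self, path, callback):
        # FileIO is always binary, so mode 'r' reads raw bytes.
        super().__init__(path, 'r')
        # Measure the total size up front so __len__ can report it.
        self.seek(0, os.SEEK_END)
        self._total = self.tell()
        self.seek(0)
        self._callback = callback

    def __len__(self):
        # urllib.request calls len() on the body to set Content-Length.
        return self._total

    def read(self, size=-1):
        data = super().read(size)
        if data:
            self._callback(self._total, len(data))
        return data

# Usage sketch (placeholder URL):
# stream = FileWithCallback('large_file.txt',
#                           lambda total, n: print('%d/%d' % (n, total)))
# req = urllib.request.Request('http://example.com/upload', data=stream,
#                              method='POST')
# urllib.request.urlopen(req)
```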


Answer 2:

requests 2.0.0 supports streaming uploads: you can pass a generator that yields small chunks as the request body and print the progress between chunks.



Answer 3:

I don't think urllib2 exposes this directly, but pycurl does have upload/download progress callbacks you can use.



Answer 4:

The poster library supports this:

import json
import os
import sys
import urllib2

from poster.encode import multipart_encode
from poster.streaminghttp import register_openers

def _upload_progress(param, current, total):
    sys.stdout.write(
        "\r{} - {:.0f}%                "
        .format(param.name,
                (float(current) / float(total)) * 100.0))
    sys.stdout.flush()

def upload(request_resource, large_file_path):
    register_openers()
    with open(large_file_path, 'rb') as large_file:
        request_data, request_headers = multipart_encode(
            [('file', large_file)],
            cb=_upload_progress)

        request_headers.update({
            'X-HockeyAppToken': 'we use this for hockeyapp upload'
        })

        upload_request = urllib2.Request(request_resource,
                                         request_data, 
                                         request_headers)
        upload_connection = urllib2.urlopen(upload_request)
        upload_response = json.load(upload_connection)
    print "Done"