Prevent timeout during large request in PHP

Posted 2019-04-03 16:20

I'm making a large request to the Brightcove servers to batch-change metadata on my videos. It seems to make it through only about 1,000 iterations and then stops. Can anyone help me adjust this code to prevent a timeout? It needs to run about 7,000–8,000 iterations.

<?php
include 'echove.php';

$e = new Echove(
    'xxxxx',
    'xxxxx'
);

// Read Video IDs
# Define our parameters
$params = array(
    'fields' => 'id,referenceId'
);

# Make our API call
$videos = $e->findAll('video', $params);


//print_r($videos);
foreach ($videos as $video) {

    //print_r($video);
    $ref_id = $video->referenceId;
    $vid_id = $video->id;

    switch ($ref_id) {
        case "":
            $metaData = array(
                'id'          => $vid_id,
                'referenceId' => $vid_id
            );

            # Update the video with the new metadata
            $e->update('video', $metaData);
            echo "$vid_id updated successfully!<br />";
            break;
        default:
            echo "$ref_id was not updated.<br />";
            break;
    }
}
?>

Thanks!

3 answers
老娘就宠你
#2 · 2019-04-03 16:42

Also call ignore_user_abort() so a browser abort doesn't stop the batch. The script will keep running even if you close the browser (use with caution).
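A minimal sketch of how this would sit at the top of the asker's script, assuming the `$videos` loop from the question follows (combining it with `set_time_limit(0)`, since `ignore_user_abort()` alone does not lift PHP's execution time limit):

```php
<?php
// Sketch: keep the batch running even if the client disconnects.
ignore_user_abort(true);   // don't kill the script on a browser abort
set_time_limit(0);         // also remove PHP's execution time limit

// ... then run the findAll()/update loop from the question ...
```

Note that `ignore_user_abort()` only helps if the connection drop is what kills the script; a `max_execution_time` timeout still needs `set_time_limit()`.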

狗以群分
#3 · 2019-04-03 16:46

Try sending a 'Status: 102 Processing' every now and then to prevent the browser from timing out (every 15 to 30 seconds in between is a good interval). After the request has been processed you can send the final response.

The browser shouldn't time out any more this way.
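PHP's common SAPIs can't easily emit an interim 1xx response once the script is producing output, so a workaround with the same effect is to flush a little output to the client periodically to keep the connection alive. A sketch, assuming the `$videos` loop and per-video update logic from the question:

```php
<?php
// Sketch: keep the HTTP connection alive during a long batch by
// flushing a byte of output roughly every 15 seconds.
$lastPing = time();

foreach ($videos as $video) {
    // ... update the video as in the question ...

    if (time() - $lastPing >= 15) {
        echo ' ';       // a single space keeps the HTML output valid
        @ob_flush();    // flush PHP's output buffer, if one is active
        flush();        // push the buffered output to the client
        $lastPing = time();
    }
}
```

Whether the bytes actually reach the browser immediately also depends on the web server's own buffering (e.g. gzip or proxy buffering may need to be disabled).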

何必那么认真
#4 · 2019-04-03 16:51

Try the set_time_limit() function. Calling set_time_limit(0) removes the time limit on script execution entirely.
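Applied to the asker's script, this is a one-line change before the loop; a sketch, assuming the rest of the script from the question follows unchanged:

```php
<?php
// Sketch: lift PHP's execution time limit before the long batch runs.
// 0 means "no limit"; place this at the top of the script, before the
// findAll()/update loop from the question.
set_time_limit(0);

// ... rest of the script from the question ...
```

Each call to set_time_limit() also restarts the timeout counter, so calling it once per loop iteration (with a nonzero value) works too if you'd rather keep a per-iteration safety limit.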
