Maybe I'm missing something easy here... or maybe I've just been coding on this so long I can't see past my problem, but doing a select (MySQL) through CakePHP (1.3) should be able to return results even if there are, say, 2,000 records... right?
$options = array(
    'conditions' => array(
        'Lead.campaign_id' => $campaign['Campaign']['id']
    ),
    'recursive' => -1,
    'order' => 'Lead.goal DESC',
    'contain' => false,
);
$leads = $this->Lead->find('all', $options);
Yet this query gives me a white screen of death (with define('LOG_ERROR', 2); set).
If I add 'limit' => 300 (or a similar number; it doesn't have to be exactly 300) to the $options array, then it will pull results, so I know the query itself isn't the problem.
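To be concrete, the only change that makes the same find() succeed is one extra key on the same array:

$options['limit'] = 300; // any smallish limit works; remove it and the white screen returns
$leads = $this->Lead->find('all', $options);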
Additionally, I've tried to limit the columns returned by using the 'fields' parameter in the $options array... even telling it to return only one field, but that didn't work either.
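For example (I used the primary key here, but the specific column didn't seem to matter):

$options['fields'] = array('Lead.id'); // even a single column still white-screens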
I've also echoed out the SQL statement that it uses via
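// getLog($sorted, $clear): passing false for $clear keeps the log intact after reading it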
$log = $this->Lead->getDataSource()->getLog(false, false);
debug($log);
Of course, I had to put the 'limit' in for that to show up (otherwise I just get the white screen of death). Then I copy/paste that query (without the limit) into phpMyAdmin and it returns the records just fine (even with 'show me 1,000 rows' on). This leads me to believe the system can handle the query and the output, but maybe I'm making an assumption I shouldn't be.
The reason I'm trying to get all the rows instead of paginating them is that I need to output them to a CSV file for download by the user (that part is already working... when the query works).
It was suggested to me that it's probably a timeout issue and I should just raise it, but that seems like a hack to me and there must be a better solution.
Since I'm producing a CSV download anyway, maybe there's a way for me to stream the query results to a file that then gets downloaded? Or perhaps there's an even better solution?
You are probably running out of memory (check your web server's error logs): Cake will load all 2,000 rows into a single array, which is probably going over your script's memory limit.
You could raise your script memory limit as a quick fix (sketched below), but there are other options worth considering first.
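The quick fix is a one-liner (a sketch only; the right value depends on your data size, and some shared hosts block runtime overrides):

ini_set('memory_limit', '256M'); // raise the ceiling for this request only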
Do you really need to fetch back all 2,000 rows? Displaying that much data at once would be incomprehensible to the user. Would pagination work better here?
If you are planning on doing calculations based on all the data, perhaps you can re-write your query so the calculations happen in the query itself?
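For example, if the CSV really just feeds a report, something like this keeps the heavy lifting in MySQL (a sketch; lead_count and total_goal are made-up aliases, and in Cake 1.x computed columns come back under the 0 key of the result):

$totals = $this->Lead->find('first', array(
    'fields' => array(
        'COUNT(Lead.id) AS lead_count',
        'SUM(Lead.goal) AS total_goal'
    ),
    'conditions' => array('Lead.campaign_id' => $campaign['Campaign']['id']),
    'recursive' => -1,
));
// e.g. $totals[0]['total_goal'] now holds the sum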
If you have to use PHP to process the data, fetch it back in chunks (say, 100 records at a time). You may need to set the script timeout to a high value if the process takes a long time...
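Here is a sketch of that chunked approach wired up to the CSV requirement from the question. Writing straight to php://output means only one chunk is ever in memory at a time (the 100-row chunk size and the extra Lead.id tie-breaker in the order are my assumptions):

$fp = fopen('php://output', 'w');
$limit = 100;
$page = 1;
do {
    $leads = $this->Lead->find('all', array(
        'conditions' => array('Lead.campaign_id' => $campaign['Campaign']['id']),
        'recursive' => -1,
        // tie-break on the primary key so pages never overlap or skip rows
        'order' => 'Lead.goal DESC, Lead.id DESC',
        'limit' => $limit,
        'page' => $page,
    ));
    foreach ($leads as $lead) {
        fputcsv($fp, $lead['Lead']); // one CSV row per record
    }
    $page++;
} while (count($leads) === $limit);
fclose($fp);

If you'd rather build a file on disk and serve it afterwards, swap php://output for a temp-file path; either way the memory used stays bounded by the chunk size instead of the full result set.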