I'm using a PHP proxy to get the contents of a file. I want to search through that file using the powerful jQuery selectors, without having to write all kinds of queries in PHP. Here is my PHP code:
$page = file_get_contents( filter_var( $_POST['url'], FILTER_SANITIZE_URL ) );
die( json_encode( $page ) );
If the loaded page gets too big, PHP still reads the entire document, but json_encode()ing it only gives the first part of the file, not the entire file. I can't find anything about a size limit on JSON-encoded data, but apparently there is one.
The question: is there a workaround to prevent only part of the file being transferred?
I need to grab files from other domains, so reading the contents of a file in jQuery is not really an option.
PHP 5.3: ext/json/json.c
PHP 7 (current): ext/json/json.c
There is no built-in restriction on the size of JSON-serialized data, not for strings anyway. I would therefore assume you've run into PHP's memory limit or something similar.
json_encode()ing a string consistently just adds some escaping and the outer double quotes. Internally that means a bit of memory doubling (temporary string concatenation and a utf8_to_utf16 conversion/check), so I ran into my 32 MB PHP memory limit with an 8 MB string already. But other than that, there seem to be no arbitrary limits in json.c
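A quick way to see this in practice (a sketch, assuming CLI PHP; the exact peak figures vary by PHP version):

```php
<?php
// Sketch: json_encode() on a plain string only adds escaping and the outer
// quotes, but the temporary buffers can roughly double peak memory usage,
// which is what trips memory_limit on large inputs.
$payload = str_repeat('a', 8 * 1024 * 1024); // 8 MB string

$before = memory_get_peak_usage();
$json   = json_encode($payload);
$after  = memory_get_peak_usage();

if ($json === false) {
    // On failure json_encode() returns false, not truncated output
    echo 'encode failed: ', json_last_error_msg(), PHP_EOL;
} else {
    printf("input: %d bytes, json: %d bytes, extra peak: ~%d bytes\n",
        strlen($payload), strlen($json), $after - $before);
}
```

Raising `memory_limit` in php.ini (or via `ini_set`) should therefore make larger files fit.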
To help others who may be running into problems they can't explain with json_encode(): I've found it helps to know about the JSON error message function.
I was having a similar problem, but it wasn't related to file size. I had malformed UTF-8 in the database. You can check your JSON like this:
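A minimal sketch of that check (assuming PHP 5.5+ for json_last_error_msg(); on older versions compare json_last_error() against the JSON_ERROR_* constants instead):

```php
<?php
// Strings containing malformed UTF-8 make json_encode() return false
// rather than a truncated result; json_last_error_msg() tells you why.
$bad = "valid text \xB1\x31 invalid byte"; // \xB1 is not valid UTF-8 here

$json = json_encode($bad);
if ($json === false) {
    echo 'json_encode failed: ', json_last_error_msg(), PHP_EOL;
    // Typically: "Malformed UTF-8 characters, possibly incorrectly encoded"
} else {
    echo $json, PHP_EOL;
}
```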
PHP docs here json_last_error_msg