I have a REST API that returns JSON responses. Sometimes (and seemingly at complete random), the JSON response gets cut off half-way through, so the returned JSON string looks like:
...route_short_name":"135","route_long_name":"Secte // end of response
I'm pretty sure it's not an encoding issue because the cut-off point keeps changing position depending on the JSON string that's returned. I haven't found a particular response size at which the cut-off happens either (I've seen a 65 KB response come through fine, whereas a 40 KB one would get cut off).
Looking at the response headers when the cut-off does happen:
{
"Cache-Control" = "must-revalidate, private, max-age=0";
Connection = "keep-alive";
"Content-Type" = "application/json; charset=utf-8";
Date = "Fri, 11 May 2012 19:58:36 GMT";
Etag = "\"f36e55529c131f9c043b01e965e5f291\"";
Server = "nginx/1.0.14";
"Transfer-Encoding" = Identity;
"X-Rack-Cache" = miss;
"X-Runtime" = "0.739158";
"X-UA-Compatible" = "IE=Edge,chrome=1";
}
Doesn't ring a bell either. Anyone?
I had the same problem:
Nginx cut off some responses from the FastCGI backend. For example, I couldn't generate a proper SQL backup from PhpMyAdmin. I checked the logs and found this:
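What nginx logs in this situation is a permission error on its fastcgi_temp directory. The entry is roughly of this shape – the timestamp, IDs, paths and request below are illustrative only, not my actual log line:

    2012/05/11 19:58:36 [crit] 1234#0: *99 open() "/usr/local/nginx/fastcgi_temp/1/00/0000000001"
        failed (13: Permission denied) while reading upstream, client: 203.0.113.5,
        server: example.com, request: "GET /export.sql HTTP/1.1",
        upstream: "fastcgi://127.0.0.1:9000", host: "example.com"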
All I had to do to fix it was to give proper permissions to the /usr/local/nginx/fastcgi_temp folder, as well as client_body_temp. Fixed!
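For reference, one way to do that (assuming the nginx workers run as www-data and that these are the default temp paths of your install – check both against your own setup):

    # see which user the nginx worker processes run as
    ps aux | grep 'nginx: worker'

    # give that user the temp directories (www-data is an assumption here)
    chown -R www-data:www-data /usr/local/nginx/fastcgi_temp /usr/local/nginx/client_body_temp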
Thanks a lot samvermette, your Question & Answer put me on the right track.
I've also had this issue – JSON parsing client-side was faulty, the response was being cut off or, worse still, the response was stale and was read from some random memory buffer.

I went through some guides – Serving Static Content Via POST From Nginx as well as Nginx: Fix to “405 Not Allowed” when using POST serving static – while trying to configure nginx to serve a simple JSON file.

In my case, I had to add a directive so that the browser doesn't get any funny ideas when nginx adds Accept-Ranges: bytes in the response header, as well as another one, in my server block for the proxy which serves the static files (a sketch of what that might look like is below). Adding them to the location block which would finally serve the found JSON file didn't help.
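A rough sketch of the kind of server block I mean – max_ranges is my guess at a reasonable way to get rid of the Accept-Ranges behaviour (it needs nginx 1.1.2+), and the names and paths are placeholders:

    server {
        listen      80;
        server_name example.com;      # placeholder name
        max_ranges  0;                # disable byte ranges, so no "Accept-Ranges: bytes" header

        location /static/ {
            root /var/www/example;    # placeholder path to the static JSON files
        }
    }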
Other searches yielded folder permission issues – nginx is cutting the end of dynamic pages and cache it – or proxy buffering issues – Getting a chunked request through nginx – but that was not my case.

Another protip for serving static JSONs would also be not forgetting the response type:
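For instance (a minimal sketch – the path is a placeholder):

    location /data/ {
        root         /var/www/example;     # placeholder path
        default_type application/json;     # Content-Type to use when mime.types doesn't map the file
    }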
I had a similar problem with the response from the server getting cut off. It happened only when I added the JSON header before returning the response:
header('Content-type: application/json');
In my case gzip caused the issue. I solved it by specifying gzip_types in nginx.conf and adding application/json to the list before turning on gzip:
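Something along these lines (a sketch – the exact list of types is up to you; application/json is the important addition here):

    # in the http block of nginx.conf
    gzip_types text/plain text/css application/json application/javascript;
    gzip       on;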
Thanks for the question and the great answers, it saved me a lot of time. In the end, the answers from clement and sam helped me solve my issue, so the credits go to them.
Just wanted to point out that, after reading a bit about the topic, it seems it is not recommended to disable proxy_buffering, since it could make your server stall if the clients (the users of your system) have a bad internet connection, for example.

I found this discussion very useful to understand more. The example given by Francis Daly made it very clear for me.
Looked up my nginx error.log file; it looks like nginx's proxy was trying to save the response content (passed in by thin) to a file and hitting a permission error. It only does so when the response size exceeds proxy_buffers (64kb by default on 64-bit platforms), so in the end the bug was connected to my response size.

I ended up fixing my issue by setting proxy_buffering to off in my nginx config file, instead of upping proxy_buffers or fixing the file permission issue (both options are sketched below).

Still not sure about the purpose of nginx's buffers. I'd appreciate it if anyone could add to that. Is disabling the buffering completely a bad idea?
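For what it's worth, the two routes look roughly like this in the config (a sketch only – the upstream address and buffer sizes are placeholders):

    location /api/ {
        proxy_pass http://127.0.0.1:3000;   # placeholder upstream (thin)

        # option 1: stream the response straight to the client, never spill to a temp file
        proxy_buffering off;

        # option 2 (instead of option 1): keep buffering but raise the in-memory limit
        # proxy_buffer_size 16k;
        # proxy_buffers     8 32k;
    }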
It's possible you ran out of inodes, which prevents NginX from using the fastcgi_temp directory properly.
Try df -i and if you have 0% inodes free, that's a problem.

Try find /tmp -mtime +10 (older than 10 days) to see what might be filling up your disk.

Or maybe it's another directory with too many files. For example, go to /home/www-data/example.com and count the files:
find . -print | wc -l