JSON must be no more than 1000000 bytes

Posted on 2019-08-15 10:27

Question:

We have a Jenkins-Chef setup with a QA build project for a client's website. The build gets the code from Bitbucket, and a script uploads the cookbooks from the Chef Client to the Chef Server.

These builds ran fine for a long time. Two days ago the automated and manual builds started failing with the following error (taken from the Jenkins console output):

Updated Environment qa

Uploading example-deployment [0.1.314]

ERROR: Request Entity Too Large

Response: JSON must be no more than 1000000 bytes.

From what I understand, JSON files are supposed to be related to Node.js, which is what the developers use on this web server.

We looked all over the config files for Jenkins, the Chef Server, and the QA server. We couldn't find a way to change the 1 MB limit that is causing this error.

We tried changing client_max_body_size, but that didn't work. We checked the sizes of the JSON files, and none of them come anywhere near this limit. Any idea where we can find a solution? Can this limit be changed? Is there anything we can do infrastructure-wise, or should this be fixed on the developer side?
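For reference, this is roughly what we tried on the Chef Server machine (the nginx attribute name and paths assume a standard omnibus install, so they may differ on other setups):

# Raise nginx's request body limit in /etc/opscode/chef-server.rb,
# then apply the change:
echo "nginx['client_max_body_size'] = '250m'" | sudo tee -a /etc/opscode/chef-server.rb
sudo chef-server-ctl reconfigure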

Answer 1:

So first of all, the 1 MB value is more or less hardcoded; the Chef Server is not intended to store large objects.

What happens is that before a cookbook is uploaded, a JSON file describing it is created. Since this file will be stored in the database and indexed, it must not grow too large, to avoid performance problems.
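A quick way to see whether this manifest is the problem is to count the files in the cookbook, since the manifest carries an entry (name, path, checksum) for every file. A rough check, assuming the cookbook lives under cookbooks/example-deployment:

# Every file in the tree becomes an entry in the upload manifest,
# so a large file count inflates the JSON toward the 1 MB cap:
find cookbooks/example-deployment -type f | wc -l
# VCS metadata is a common offender:
du -sh cookbooks/example-deployment/.git 2>/dev/null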

The idea is to upload to the Chef Server only what is absolutely necessary: strip VCS directories, any IDE build/project files, and so on.

The simplest way to achieve this is with a chefignore file. It has to be created directly under the cookbook_path.

This file contains wildcard patterns to ignore while uploading the cookbook, so an example could be:

# Strip Subversion directories
*/.svn/*
# Strip Git directories
*/.git/*
# Ignore vim backup files
*~
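With the chefignore in place, knife applies it automatically on the next upload; no extra flag is needed. A minimal sketch, reusing the cookbook name from the question (the -o path is an assumption, adjust it to your cookbook_path):

# knife picks up the chefignore found under the cookbook path
knife cookbook upload example-deployment -o cookbooks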