What is the difference between _A0 & _S0 log files

Posted 2019-07-27 02:17

Question:

I have turned on the switch to send the logs of a GAE Standard app to a GCS bucket. As expected, I see there a folder for each day. For every hour of every day I see a very big JSON file whose name ends in _S0.json. For some hours I also see a much smaller file whose name contains _A0: instead. For instance:

01:00:00_01:59:59_S0.json & 01:00:00_01:59:59_A0:4679580000.json

What is the difference? I am trying to post-process the files and need to know.
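For post-processing, a first step is telling the two kinds of files apart. Below is a minimal sketch, assuming the naming pattern shown in the example above (`HH:MM:SS_HH:MM:SS_Sn.json` or `HH:MM:SS_HH:MM:SS_An:<digits>.json`); the numeric tail after the colon in the `_A0:` files is treated as an opaque identifier and ignored:

```python
import re

# Pattern derived from the example file names above. The trailing
# ":<digits>" part (seen on the _A0 files) is optional and discarded.
SHARD_RE = re.compile(
    r"^(?P<start>\d{2}:\d{2}:\d{2})_(?P<end>\d{2}:\d{2}:\d{2})"
    r"_(?P<kind>[AS])(?P<shard>\d+)(?::\d+)?\.json$"
)

def parse_shard_name(name):
    """Return (start, end, kind, shard_number), or None if the name doesn't match."""
    m = SHARD_RE.match(name)
    if not m:
        return None
    return m.group("start"), m.group("end"), m.group("kind"), int(m.group("shard"))
```

For example, `parse_shard_name("01:00:00_01:59:59_A0:4679580000.json")` yields `("01:00:00", "01:59:59", "A", 0)`.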

Answer 1:

Logs exported to GCS are sharded; _A0 and _S0 are simply identifiers of the log shards.

From Log entries in Google Cloud Storage (emphasis mine):

The leaf directories (DD/) contain multiple files, each of which holds the exported log entries for a time period specified in the file name. The files are sharded and their names end in a shard number, Sn or An (n=0, 1, 2, ...). For example, here are two files that might be stored within the directory my-gcs-bucket/syslog/2015/01/13/:

08:00:00_08:59:59_S0.json
08:00:00_08:59:59_S1.json

These two files together contain the syslog log entries for all instances during the hour beginning 0800 UTC. To get all the log entries, you must read all the shards for each time period—in this case, file shards 0 and 1. The number of file shards written can change for every time period depending on the volume of log entries.
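Since the entries for one time period can be spread across several shards, a post-processor should group all files for the same period and read every one of them. A minimal sketch, assuming the file names follow the pattern quoted above (in practice `names` would come from listing the bucket's `DD/` directory, e.g. with the google-cloud-storage client):

```python
from collections import defaultdict

def group_by_period(names):
    """Group shard file names by their (start, end) time period.

    Every name is expected to look like "HH:MM:SS_HH:MM:SS_<shard>.json",
    so splitting on the first two underscores isolates the period.
    """
    periods = defaultdict(list)
    for name in names:
        start, end, _shard = name.split("_", 2)
        periods[(start, end)].append(name)
    return periods
```

To get all log entries for the hour beginning 0100, you would then read every file in `periods[("01:00:00", "01:59:59")]`, whether its shard identifier is an Sn or an An.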

I got to the above page via the last link in the section quoted below from Quotas and limits:

Logs ingestion allotment

Logging for App Engine apps is provided by Stackdriver. By default, logs are stored for an application free of charge for up to 7 days and 5GB. Logs older than the maximum retention time are deleted, and attempts to store above the free ingestion limit of 5 gigabytes will result in an error. You can update to the Premium Tier for greater storage capacity and retention length. See Stackdriver pricing for more information on logging rates and limits. If you want to retain your logs for longer than what Stackdriver allows, you can export logs to Google Cloud Storage, Google BigQuery, or Google Cloud Pub/Sub.