Pointing multiple projects' log sinks to one bucket

Published 2019-08-17 07:03

Question:

I have a few GCP projects with log sinks to different storage buckets. I'd like to combine them into a single bucket, but the Stackdriver export doesn't add any distinguishing information to the object names it creates; they all look like cloudaudit.googleapis.com/activity/2017/11/14/00:00:00_00:59:59_S0.json

What will happen if I start pushing them all to a single bucket? Will the different project sinks overwrite each other's objects? Is there any way to distinguish which project created the logs just from the object?

If not, I guess I should switch to Pub/Sub sinks and then write some code that produces objects with more desirable names. Are there any established patterns or examples for doing this?

Update: I filed https://issuetracker.google.com/issues/69371200 for this issue.

Answer 1:

To enable this, select a custom destination when creating the sink and point it to the bucket using this format: storage.googleapis.com/[BUCKET_ID].
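
From the command line, that sink would be created roughly like this; a minimal sketch where the sink name, project ID, bucket name, and filter are all placeholders:

    # Hypothetical names; run once per source project.
    gcloud logging sinks create shared-audit-sink \
        storage.googleapis.com/my-shared-log-bucket \
        --project=my-project-a \
        --log-filter='logName:"cloudaudit.googleapis.com"'

Each sink gets its own writer identity (visible via gcloud logging sinks describe), and that identity needs object-create permission on the shared bucket.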

I've just enabled this in a couple of my projects, as I'm curious to see the results when exporting to a bucket. However, I have been using a single BQ sink for all my projects, and the tables it creates have all the logs mixed together, so no logs are lost when using a single BQ sink.

I'm assuming a GCS sink will work the same way, but I'll let you know in a couple of days.

If a single bucket sink does not work, you can always use a single BQ sink (which will help in analyzing the logs), and when you no longer want to keep them in BQ, export them and store the files wherever you want.
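
A sketch of that export step, assuming hypothetical dataset, table, and bucket names; bq extract writes the table out to GCS as newline-delimited JSON:

    # Hypothetical dataset, table, and bucket names.
    bq extract --destination_format=NEWLINE_DELIMITED_JSON \
        'my-project:logs.cloudaudit_googleapis_com_activity' \
        gs://my-archive-bucket/activity-*.json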

Also, since you'll be writing to your sink constantly, you can't use Nearline or Coldline storage, so the storage pricing is better in BQ than in a regional bucket (0.02 USD/GB in BQ vs. somewhere between 0.02 and 0.035 USD/GB for regional storage, depending on the region; BQ has 10 GB free monthly, GCS 5 GB).

I would generally recommend using a BQ sink, but I'll tell you what happens with my bucket logs.

Update:

A few hours later, I've verified that shared bucket sinks work pretty much as you would expect: the export concatenates logs chronologically regardless of the originating project, and only creates a single file for each time window. Hope this helps! (I still prefer BQ as a log sink...)

Update 2:

For the behavior you seek in the feature request, I would use BQ, but you could just as easily grep for the project ID and separate the logs:

grep '"logName":"projects/<your-project-id>/' mixed-log.json > single-project-log.json

Or have a Cloud Function triggered by bucket updates (i.e., every time a log file lands in the sink bucket) run this for you.
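
Deploying that could look something like the sketch below; the function name and bucket are hypothetical, and split_logs stands in for whatever entry point implements the grep-style split:

    # Hypothetical function name and bucket; split_logs would apply
    # the grep-style split to each newly written log file.
    gcloud functions deploy split_logs \
        --runtime=python39 \
        --trigger-bucket=my-shared-log-bucket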

Or namespace your buckets and have a Cloud Function move the files wherever you need as soon as they are written.

The possibilities are endless!



Answer 2:

If you have an organization or folder which includes all the projects that you want to collect logs from, then you can create a sink that collects from all projects in that org/folder.

Unfortunately, you cannot do this from the Cloud Console. Instead you must use gcloud with the --organization or --folder option, or the API.
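
A minimal sketch of such an aggregated sink, with a hypothetical organization ID, sink name, and bucket; --include-children is what makes the sink collect logs from every project under the organization:

    # Hypothetical organization ID and bucket name.
    gcloud logging sinks create org-audit-sink \
        storage.googleapis.com/my-shared-log-bucket \
        --organization=123456789012 \
        --include-children \
        --log-filter='logName:"cloudaudit.googleapis.com"'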