Convert timestamp timezone in Logstash for output

Posted on 2019-02-10 21:42

Question:

In my scenario, the "timestamp" of the syslog lines Logstash receives is in UTC and we use the event "timestamp" in the Elasticsearch output:

output {
    elasticsearch {
        embedded => false
        host => localhost
        port => 9200
        protocol => http
        cluster => 'elasticsearch'
        index => "syslog-%{+YYYY.MM.dd}"
    }
}

My problem is that at UTC midnight, Logstash starts sending logs to the next day's index before the end of the day in our timezone (GMT-4 => America/Montreal), so each index has no logs after 20:00 (8 PM) local time because the "timestamp" is in UTC.
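
For illustration, here is a minimal Ruby sketch (the timestamp is hypothetical) of why the split happens: an event logged late in the local evening already belongs to the next UTC day, so %{+YYYY.MM.dd} picks the wrong index.

require 'time'

# Hypothetical event received at 21:00 local time (UTC-4, as in the question)
local = Time.parse('2019-06-10 21:00:00 -04:00')
utc   = local.utc

puts local.strftime('%Y.%m.%d')  # => 2019.06.10 (the index we want)
puts utc.strftime('%Y.%m.%d')    # => 2019.06.11 (the index %{+YYYY.MM.dd} selects)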

We've implemented a workaround to convert the timezone, but we're seeing significant performance degradation:

filter {
    mutate {
        add_field => {
            # Create a new field with string value of the UTC event date
            "timestamp_zoned" => "%{@timestamp}"
        }
    }

    date {
        # Parse UTC string value and convert it to my timezone into a new field
        match => [ "timestamp_zoned", "ISO8601" ]
        timezone => "America/Montreal"
        locale => "en"
        remove_field => [ "timestamp_zoned" ]
        target => "timestamp_zoned_obj"
    }

    ruby {
        # Output the zoned date to a new field
        code => "event['index_day'] = event['timestamp_zoned_obj'].strftime('%Y.%m.%d')"
        remove_field => [ "timestamp_zoned_obj" ]
    }
}

output {
    elasticsearch {
        embedded => false
        host => localhost
        port => 9200
        protocol => http
        cluster => 'elasticsearch'
        # Use of the string value
        index => "syslog-%{index_day}"
    }
}

Is there a way to optimize this config?

Answer 1:

Here is an optimized config; please give it a try and test the performance.

You don't need the mutate and date plugins; use the ruby plugin directly.

input {
    stdin {
    }
}

filter {
    ruby {
            code => "
                    event['index_day'] = event['@timestamp'].localtime.strftime('%Y.%m.%d')
            "
    }
}

output {
    stdout { codec => rubydebug }
}

Example output:

{
       "message" => "test",
      "@version" => "1",
    "@timestamp" => "2015-03-30T05:27:06.310Z",
          "host" => "BEN_LIM",
     "index_day" => "2015.03.29"
}
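
Once index_day is populated, the elasticsearch output from the question can reference it in the index name, exactly as the workaround did; a sketch reusing the same 1.x output options from the question:

output {
    elasticsearch {
        embedded => false
        host => localhost
        port => 9200
        protocol => http
        cluster => 'elasticsearch'
        # Index name built from the locally-zoned day set by the ruby filter
        index => "syslog-%{index_day}"
    }
}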


Answer 2:

In version 1.5.0, we can convert the timestamp to the local timezone for the index name. Here is my configuration:

filter {
    ruby {
        code => "event['index_day'] = event.timestamp.time.localtime.strftime('%Y.%m.%d')"
    }
}
output {
    elasticsearch {
        host => localhost
        index => "thrall-%{index_day}"
    }
}


Answer 3:

In Logstash version 5.0.2, the event API was modified, but we can still convert the timestamp to the local timezone for the index name. Here is my configuration:

filter {
    ruby {
        code => "event.set('index_day', event.get('[@timestamp]').time.localtime.strftime('%Y.%m.%d'))"
    }
}


Answer 4:

In Logstash version 5.0 and later, you can use this:

filter {
    ruby {
        code => "event.set('index_day', event.get('[@timestamp]').time.localtime.strftime('%Y%m%d'))"
    }
}
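
Putting it together on 5.x, a minimal sketch (the hosts value is a placeholder, and the day format is switched to %Y.%m.%d to match the syslog-YYYY.MM.dd naming used in the question):

filter {
    ruby {
        # Build the local-day string with the 5.x get/set event API
        code => "event.set('index_day', event.get('[@timestamp]').time.localtime.strftime('%Y.%m.%d'))"
    }
}

output {
    elasticsearch {
        # Placeholder address; adjust to your cluster
        hosts => ["localhost:9200"]
        index => "syslog-%{index_day}"
    }
}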