Logstash to Convert Epoch Timestamp

I am trying to parse some epoch timestamps into something more readable.

I looked around for how to parse them into normal timestamps, and from what I understand, all I need to do is something like this:

 mutate { remove_field => [ "..." ] }
 grok { match => { 'message' => '%{NUMBER:time}%{SPACE}%{NUMBER:time2}...' } }
 date { match => [ "time", "UNIX" ] }

Example message: 1410811884.84 1406931111.00 .... The first two values are supposed to be UNIX time values.

My grok works, because all the fields show up in Kibana with the expected values, and the fields I removed are gone, so the mutate works too. The date section does nothing.

From what I understand, match => [ "time", "UNIX" ] should do what I want (convert the value of time into a proper date format and show it in Kibana as a field). Apparently I do not understand.

2 answers

The date {} filter replaces the @timestamp value with the parsed date, so you should see @timestamp with the same value as the [time] field. This is usually what you want, since there is some lag in shipping, processing, and storing logs, so it is preferable to use the event's own time.
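For example, with no target, a date filter like the one in the question overwrites @timestamp (the sample value comes from the question's message; the resulting UTC instant is worked out by hand):

 date { match => [ "time", "UNIX" ] }
 # With time = "1410811884.84", @timestamp becomes 2014-09-15T20:11:24.840Z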

Since you have several date fields, you will want to use the "target" parameter of the date filter to indicate the destination field for the parsed date, for example:

 date { match => [ "time","UNIX" ] target => "myTime" } 

This converts a string field named [time] into a date field named [myTime]. Kibana knows how to display date fields, and you can configure that in Kibana's settings.
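Putting this together for the two timestamps in the question, a minimal filter sketch could look like the following (the target names myTime and myTime2 are illustrative):

 filter {
   grok { match => { 'message' => '%{NUMBER:time}%{SPACE}%{NUMBER:time2}' } }
   # Parse each captured string into its own date field.
   date { match => [ "time", "UNIX" ] target => "myTime" }
   date { match => [ "time2", "UNIX" ] target => "myTime2" }
 }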

Since you probably do not need both the string version and the date version of the same data, you can delete the string version as part of the conversion:

 date { match => [ "time","UNIX" ] target => "myTime" remove_field => [ "time" ] } 

Also consider trying UNIX_MS if your timestamps are in milliseconds.

 date { timezone => "UTC" match => ["timestamp", "UNIX_MS"] target => "@timestamp" } 
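As a rough rule (sample values are illustrative): UNIX is for epoch seconds, optionally with a fractional part, while UNIX_MS is for integer epoch milliseconds:

 # "1410811884.84"  -> match => [ "timestamp", "UNIX" ]     (epoch seconds)
 # "1410811884840"  -> match => [ "timestamp", "UNIX_MS" ]  (epoch milliseconds)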