Common trace logging and integration with Logstash, Kibana, and Elasticsearch

I implemented a distributed transaction logging library with a tree structure, as described in Google Dapper ( http://research.google.com/pubs/pub36356.html ) and eBay's CAL transaction logging framework ( http://devopsdotcom.files.wordpress.com/2012/11/screen-shot-2012-11-11-at-10-06-39-am.png ).

Log format

TIMESTAMP  HOSTNAME  DATACENTER  ENVIRONMENT  EVENT_GUID  PARENT_GUID  TRACE_GUID  APPLICATION_ID  TREE_LEVEL  TRANSACTION_TYPE  TRANSACTION_NAME  STATUS_CODE  DURATION(in ms)  PAYLOAD(key1=value1,key2=value2)
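
For example, a single event in this format might look like this (all values below are made up for illustration):

    2014-01-15T10:22:31.337Z app01 DC1 PROD a3f91c27-21-1389781351337-43 a3f91c27-21-1389781351337-41 a3f91c27-21-1389781351337-40 order-service 2 SQL getOrderById 0 125 rows=3,cache=miss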

GUID HEX NUMBER FORMAT

MURMUR_HASH(HOSTNAME + DATACENTER + ENVIRONMENT)-JVM_THREAD_ID-(TIMESTAMP + Atomic Counter)
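
A minimal Java sketch of how such a GUID could be assembled (the use of Guava's Murmur3 hash, the hex encoding, and the separators are assumptions for illustration, not the library's actual code):

    import com.google.common.hash.Hashing;

    import java.nio.charset.StandardCharsets;
    import java.util.concurrent.atomic.AtomicLong;

    public final class EventGuidFactory {
        private static final AtomicLong COUNTER = new AtomicLong();

        /** MURMUR_HASH(host + dc + env)-JVM_THREAD_ID-(timestamp + atomic counter). */
        public static String newGuid(String hostname, String datacenter, String environment) {
            String hostHash = Hashing.murmur3_32()
                    .hashString(hostname + datacenter + environment, StandardCharsets.UTF_8)
                    .toString();                                  // hex-encoded 32-bit Murmur hash
            long threadId = Thread.currentThread().getId();       // JVM thread id
            long suffix = System.currentTimeMillis() + COUNTER.incrementAndGet(); // timestamp + counter
            return hostHash + "-" + threadId + "-" + Long.toHexString(suffix);
        }
    }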

What I would like to do is integrate this format with the Kibana user interface, so that when a user runs a search and clicks on a TRACE_GUID, they are shown something similar to a distributed call graph that shows where the time was spent. Here is the kind of UI I mean: http://twitter.imtqy.com/zipkin/ . I am not a UI developer, so if someone can tell me how to do this, that would be great.
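
On the data side, such a Zipkin-style view could presumably be driven by pulling every event of one trace and rebuilding the tree from EVENT_GUID/PARENT_GUID on the client. A sketch of the Elasticsearch query (the index name, field names, and the TRACE_GUID value are made up):

    GET logstash-*/_search
    {
      "query": { "term": { "trace_guid": "a3f91c27-21-1389781351337-40" } },
      "sort":  [ { "tree_level": { "order": "asc" } } ],
      "size":  1000
    }

For the term query to match, trace_guid would need to be mapped as a non-analyzed (keyword-style) field; otherwise the GUID gets tokenized on the hyphens.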

In addition, I would like to know how I can index the payload data in Elasticsearch so that a user can write some kind of expression against the payload (for example duration > 1000) and Elasticsearch will list all log entries that satisfy the condition. I would also like to index the payload as name=value pairs so that the user can query something like (key3=value2 OR key4=exception), ideally with some kind of regular expression support. Please let me know if this can be achieved. Any help with indexing would be great.
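
Assuming each payload key ends up indexed as its own field (see the answer below), those examples translate roughly into Lucene query-string syntax, which is what the Kibana search bar accepts:

    duration:>1000
    key3:value2 OR key4:exception
    key4:/.*Exception/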

Thanks, Bhavesh

1 answer

Logstash can parse this format for you: a grok {} filter breaks each log line into named fields, and the resulting events are then shipped to Elasticsearch. A sketch of such a filter is shown below.
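
Here is what that filter could look like for the format above (the grok pattern, the field names, and the kv split of the payload are assumptions based on the question, not a tested configuration):

    filter {
      grok {
        # One pattern per column of the log format described in the question.
        match => {
          "message" => "%{TIMESTAMP_ISO8601:timestamp} %{HOSTNAME:hostname} %{WORD:datacenter} %{WORD:environment} %{NOTSPACE:event_guid} %{NOTSPACE:parent_guid} %{NOTSPACE:trace_guid} %{NOTSPACE:application_id} %{INT:tree_level:int} %{WORD:transaction_type} %{NOTSPACE:transaction_name} %{INT:status_code:int} %{INT:duration:int} %{GREEDYDATA:payload}"
        }
      }
      # Split the trailing payload (key1=value1,key2=value2) into individual fields.
      kv {
        source      => "payload"
        field_split => ","
        value_split => "="
      }
    }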

If you tell grok the type of each field (for example %{INT:duration:int}), the values are indexed as numbers, so in Elasticsearch a query such as "duration:>1000" will return exactly the matching events.

Elasticsearch is built on Lucene, so the Lucene query syntax (including wildcards and regular expressions) is available for queries like the key3=value2 OR key4=exception example above.



