I am trying to build a way to navigate my log files, and the two main functions I need are:
- searching for lines within a log file (and returning the matching entries);
- pagination from line x to line y.
Now, I checked Logstash, and it looks great for my first function (search), but not for the second. I was under the impression that I could somehow index each record's line number in the file along with the rest of the log information, but I cannot find a way to do it.
Is there any Logstash filter for this, or any Filebeat processor? I cannot make it work.
I also thought that perhaps I could have all my processes write the processed information directly to the database, but that is also impossible (or very difficult), because the logging handler does not know which line of the log file it is currently writing.
In the end, the only way I can see to paginate my log file (through the service) would be to actually open the file, seek to a specific line, and return it from the service. That is not very optimal, since the file can be very large, and I am already indexing it into Elasticsearch (using Logstash).
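For reference, the naive approach I describe above would look something like the sketch below (the function name and 1-based line convention are just my own illustration). It re-reads the file on every request, which is exactly what I want to avoid:

```python
# Naive pagination: open the log file and return lines x..y directly.
# This scans the file from the beginning each time, which is slow for
# very large files and ignores the existing Elasticsearch index.
from itertools import islice

def read_lines(path, start, end):
    """Return lines start..end (1-based, inclusive) of the file at `path`."""
    with open(path, "r") as f:
        # islice skips the first start-1 lines, then yields up to line `end`.
        return list(islice(f, start - 1, end))
```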
My current configuration is very simple:
Filebeat
filebeat.prospectors:
  - type: log
    paths:
      - /path/of/logs/*.log
output.logstash:
  hosts: ["localhost:5044"]
Logstash
input {
  beats {
    port => "5044"
  }
}
output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
  }
}
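One direction I have considered (a rough sketch only, not something I have verified in production) is a `ruby` filter in Logstash that keeps a per-file counter and stamps each event with it. This assumes events arrive in order and a single pipeline worker (`pipeline.workers: 1`), and the counter resets whenever Logstash restarts, so it is fragile:

```
filter {
  ruby {
    # Hypothetical per-file line counter; resets on restart.
    init => "@line_numbers = Hash.new(0)"
    code => "
      src = event.get('source')
      @line_numbers[src] += 1
      event.set('line_number', @line_numbers[src])
    "
  }
}
```

I am not sure whether this is reliable enough, which is why I am asking if there is a proper filter or processor for it.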
Right now, for example, I get an element like:
{
  "beat": {
    "hostname": "my.local",
    "name": "my.local",
    "version": "6.2.2"
  },
  "@timestamp": "2018-02-26T04:25:16.832Z",
  "host": "my.local",
  "tags": [
    "beats_input_codec_plain_applied"
  ],
  "prospector": {
    "type": "log"
  },
  "@version": "1",
  "message": "2018-02-25 22:37:55 [mylibrary] INFO: this is an example log line",
  "source": "/path/of/logs/example.log",
  "offset": 1124
}
If I could somehow include a field like "line_number": 1 in this document, it would be great, since I could then use Elasticsearch filters to actually navigate through all the logs.
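To show what I mean, this is the kind of query I would run from my service if such a `line_number` field existed (the field itself, and the index name `filebeat-*` in the comment, are assumptions, not part of my current pipeline). The sketch only builds the query body, which could then be passed to a client such as elasticsearch-py:

```python
# Sketch: build an Elasticsearch query body that pages through one log file
# by a hypothetical "line_number" field added at ingest time.

def line_range_query(path, start_line, end_line):
    """Return a query body selecting lines start_line..end_line of one file."""
    return {
        "query": {
            "bool": {
                "filter": [
                    # Restrict to a single log file via Filebeat's "source" field.
                    {"term": {"source": path}},
                    # Then select the requested line range.
                    {"range": {"line_number": {"gte": start_line, "lte": end_line}}},
                ]
            }
        },
        "sort": [{"line_number": "asc"}],
        "size": end_line - start_line + 1,
    }

# e.g. es.search(index="filebeat-*",
#                body=line_range_query("/path/of/logs/example.log", 100, 149))
```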
If you have ideas on different ways to store (and navigate) my logs, please let me know.