I myself have been using this system happily! I no longer had to dig through raw logs on different servers; I could browse, search, and sort log records from all servers in one place.
The problem arises when the number of logs hits the millions. In my case I had to deal with more than 30 million log records each week, which starts to become a disaster after a couple of weeks.
The solution is to work out what you actually need to find in the logs and use Kibana's features to do exactly that. In this post I cover two scenarios. One is finding crawlers! The other is finding daily raw traffic (by the number of requests).
So here we go. First you need the basic Kibana setup, meaning that at least the Discover tab should be working.
To create the visualisation for daily traffic, go to the Visualise tab, create a visualisation from a new search, and select the vertical bar chart.
Now you need to add the metrics. For the Y-axis, choose the “Unique Count” aggregation and “client_ip” as the field.
Then add an X-axis with the “Date Histogram” aggregation, choose your timestamp field, and select an interval (e.g. Daily). Click apply, and save the visualisation if it looks right.
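Under the hood, Kibana turns this chart into an Elasticsearch aggregation: a date histogram bucketed by day, with a cardinality (that is Kibana's “Unique Count”) sub-aggregation on the client IP. Here is a rough sketch of that query body as a Python dict; the field name “@timestamp” is an assumption (use whatever timestamp field your index has), while “client_ip” is the field from this post.

```python
# Sketch of the Elasticsearch query behind the daily-traffic bar chart.
# Assumptions: your timestamp field is "@timestamp" and your IP field
# is "client_ip" -- adjust both to match your own index mapping.
daily_unique_visitors = {
    "size": 0,  # we only want aggregation buckets, not the raw log hits
    "aggs": {
        "per_day": {
            "date_histogram": {
                "field": "@timestamp",
                # The "Daily" interval from the Kibana UI. Older
                # Elasticsearch versions call this option "interval".
                "calendar_interval": "day",
            },
            "aggs": {
                "unique_visitors": {
                    # Kibana's "Unique Count" is the cardinality aggregation
                    "cardinality": {"field": "client_ip"}
                }
            },
        }
    },
}
```

You could send this body to `POST <your-index>/_search` (with curl or an Elasticsearch client) to get the same numbers the chart shows, which is handy for scripting or sanity-checking.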
To create the visualisation for finding crawlers, go to the Visualise tab again, create a visualisation from a new search, and select the data table.
Select the “Count” aggregation as the metric. Then add a “Split Rows” bucket: choose “Terms” as the aggregation, “client_ip” as the field, and finally set the number of results (Size) and the sorting option (Order).
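This data table maps to a terms aggregation on the client IP, ordered by document count, so the heaviest clients (likely crawlers) float to the top. A minimal sketch of that query body, assuming the “client_ip” field from this post and an arbitrary size of 20:

```python
# Sketch of the Elasticsearch query behind the crawler data table.
# Assumptions: the IP field is "client_ip"; the size of 20 is arbitrary,
# matching whatever "Size" you set in the Kibana bucket options.
top_clients_by_requests = {
    "size": 0,  # skip raw hits; we only want the per-IP buckets
    "aggs": {
        "top_clients": {
            "terms": {
                "field": "client_ip",
                "size": 20,                   # the "Size" option in Kibana
                "order": {"_count": "desc"},  # the "Order" option: most requests first
            }
        }
    },
}
```

An IP with an abnormally high request count relative to the rest of the table is a strong crawler candidate; from there you can filter on that IP in Discover and inspect its user agent and request paths.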