logging - Grok filter for my log pattern


I am experimenting with ELK. I am trying to input a log with the following pattern into Logstash:

14:25:43.324 [http-nio-9090-exec-116] INFO  com.app.mainapp - request has been detected

I have tried the following grok patterns in the filter section of logstash.conf:

match => { "message" => [ " (?<timestamp>%{HOUR}:%{MINUTE}:%{SECOND}) \[%{NOTSPACE:thread}\] %{LOGLEVEL:loglevel} %{DATA:class}\- %{GREEDYDATA:message}" ] }

match => { "message" => [ " %{TIME:timestamp} \[%{NOTSPACE:thread}\] %{LOGLEVEL:loglevel} %{DATA:class}\- %{GREEDYDATA:message}" ] }

but when I input the log into Logstash, I get the following error:

   [0] "_grokparsefailure" 

Can anyone suggest the correct grok filter for the above log pattern?
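The leading space inside the pattern string (right before the timestamp capture) looks suspicious. Below is a minimal sketch for the single sample line above, with that space dropped and the captured tail renamed to msg so it does not collide with the existing message field; the field names are carried over from the question, and the rest is an assumption rather than a tested config:

filter {
  grok {
    # No leading space before %{TIME}; \s+ absorbs the double space after the log level
    match => { "message" => "%{TIME:timestamp} \[%{NOTSPACE:thread}\] %{LOGLEVEL:loglevel}\s+%{DATA:class} - %{GREEDYDATA:msg}" }
  }
}

Testing the expression against the raw line in Kibana's Grok Debugger before wiring it into logstash.conf makes whitespace issues like this much easier to spot.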

This parse failure got fixed after removing the starting space in the pattern. The working logstash.conf after removing the space is below:

input {
  file {
    path => ["./debug.log"]
    codec => multiline {
      # Grok pattern names are valid here! :)
      pattern => "^%{TIMESTAMP_ISO8601} "
      negate => true
      what => previous
    }
  }
}

filter {
  grok {
    match => [ "message", "%{TIMESTAMP_ISO8601:timestamp} \[%{NOTSPACE:uid}\] \[%{NOTSPACE:thread}\] %{LOGLEVEL:loglevel} %{DATA:class}\-%{GREEDYDATA:message}" ]
    overwrite => [ "message" ]
  }
  date {
    match => [ "timestamp" , "yyyy-MM-dd HH:mm:ss" ]
  }
}

output {
  elasticsearch { hosts => ["localhost"] }
  stdout { codec => rubydebug }
}
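Note that the TIMESTAMP_ISO8601 pattern and the yyyy-MM-dd HH:mm:ss date format above assume the real log lines start with a full date and time. If a log only carries the time of day, as in the sample line at the top, a hedged variant of the date block might look like this (the format string is an assumption based on that sample, not part of the original config):

  date {
    # "HH:mm:ss.SSS" matches a time-only stamp such as 14:25:43.324; without a date
    # component the date filter falls back to a default date, so only use this if
    # the full log lines really have no date in front of the time.
    match => [ "timestamp", "HH:mm:ss.SSS" ]
  }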
