How to handle multiple heterogeneous inputs with Logstash?

Tags: Logging, Elasticsearch, Logstash, Graylog2

Logging Problem Overview


Let's say you have two very different types of logs, such as technical and business logs, and you want:

  • raw technical logs to be routed to a Graylog2 server using a gelf output,
  • JSON business logs to be stored in an Elasticsearch cluster using the dedicated elasticsearch_http output.

I know that with Syslog-NG, for instance, the configuration file allows you to define several distinct inputs which can then be processed separately before being dispatched; Logstash seems unable to do this. Even if one instance can be started with two specific configuration files, all logs go through the same channel and have the same processing applied to them...

Should I run as many instances as I have different types of logs?

Logging Solutions


Solution 1 - Logging

> Should I run as many instances as I have different types of logs?

No! You only need to run one instance to handle the different types of logs.

In the Logstash configuration file, you can give each input a different type. Then in the filter section you can use if conditionals to apply different processing, and in the output section you can likewise use if to route events to different destinations.

input {
    file {
            type => "technical"
            path => "/home/technical/log"
    }
    file {
            type => "business"
            path => "/home/business/log"
    }
} 
filter {
    if [type] == "technical" {
            # processing .......
    }
    if [type] == "business" {
            # processing .......
    }
}
output {
    if [type] == "technical" {
            # output to gelf
    }
    if [type] == "business" {
            # output to elasticsearch
    }
}
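For instance, the output section could be filled in along these lines; the Graylog2 host/port and the Elasticsearch URL below are placeholders, not values from the question (note that in recent Logstash versions the old elasticsearch_http output has been folded into the elasticsearch output):

output {
    if [type] == "technical" {
            # send raw technical logs to Graylog2 over GELF
            gelf {
                    host => "graylog.example.com"   # placeholder Graylog2 host
                    port => 12201                   # default GELF port
            }
    }
    if [type] == "business" {
            # index JSON business logs into Elasticsearch
            elasticsearch {
                    hosts => ["http://localhost:9200"]   # placeholder Elasticsearch node
                    index => "business-%{+YYYY.MM.dd}"
            }
    }
}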

Hope this can help you :)

Solution 2 - Logging

I used tags for multiple file inputs:

input {
	file {
		type => "java"
		path => "/usr/aaa/logs/stdout.log"
		codec => multiline {
			...
		}
		tags => ["aaa"]
	}

	file {
		type => "java"
		path => "/usr/bbb/logs/stdout.log"
		codec => multiline {
				...
		}
		tags => ["bbb"]
	}
}
output {
    stdout {
        codec => rubydebug
    }
    if "aaa" in [tags] {
		elasticsearch {
			hosts => ["192.168.100.211:9200"]
			index => "aaa"
			document_type => "aaa-%{+YYYY.MM.dd}"
		}
	}

    if "bbb" in [tags] {
		elasticsearch {
			hosts => ["192.168.100.211:9200"]
			index => "bbb"
			document_type => "bbb-%{+YYYY.MM.dd}"
		}
	}
}
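For completeness, the multiline codec settings elided above might look something like the following for Java stack traces; the pattern here is my own assumption, not part of the original answer:

	codec => multiline {
		# assumed pattern: lines that do not start with a timestamp are
		# continuations of the previous event (typical for stack traces)
		pattern => "^%{TIMESTAMP_ISO8601}"
		negate => true
		what => "previous"
	}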

Solution 3 - Logging

I think Logstash can read more than two files in the input section. Try the below:

input {
    file {
            type => "technical"
            path => "/home/technical/log"
    }
    file {
            type => "business"
            path => "/home/business/log"
    }
    file {
            type => "business1"
            path => "/home/business/log1"
    }
} 
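As a side note, a single file input can also watch several paths (or a glob pattern), so files of the same type can share one block; the paths below simply reuse the ones from the snippet above:

input {
    file {
            type => "business"
            # one file input can watch an array of paths or a glob such as /home/business/log*
            path => ["/home/business/log", "/home/business/log1"]
    }
}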

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Question: David (View Question on Stackoverflow)
Solution 1 - Logging: Ben Lim (View Answer on Stackoverflow)
Solution 2 - Logging: Robin Wang (View Answer on Stackoverflow)
Solution 3 - Logging: KM Prak (View Answer on Stackoverflow)