# Use {{filebeat}} modules with Kafka
This section shows how to set up {{filebeat}} modules to work with {{ls}} when you are using Kafka in between {{filebeat}} and {{ls}} in your publishing pipeline. The main goal of this example is to show how to load ingest pipelines from {{filebeat}} and use them with {{ls}}.
The examples in this section show simple configurations with topic names hard coded. For a full list of configuration options, see documentation about configuring the Kafka input plugin. Also see Configure the Kafka output in the {{filebeat}} Reference.
If you haven’t already set up the {{filebeat}} index template and sample {{kib}} dashboards, run the {{filebeat}} setup command to do that now:
```shell
filebeat -e setup
```

The `-e` flag is optional and sends output to standard error instead of syslog.
A connection to {{es}} and {{kib}} is required for this one-time setup step because {{filebeat}} needs to create the index template in {{es}} and load the sample dashboards into {{kib}}. For more information about configuring the connection to {{es}}, see the Filebeat quick start.
After the template and dashboards are loaded, you’ll see the message `INFO {{kib}} dashboards successfully loaded. Loaded dashboards.`
Run the modules enable command to enable the modules that you want to run. For example:
```shell
filebeat modules enable system
```
You can further configure the module by editing the config file under the {{filebeat}} `modules.d` directory. For example, if the log files are not in the location expected by the module, you can set the `var.paths` option.
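For example, a minimal sketch of a `modules.d/system.yml` that overrides the log paths (the paths shown are placeholders, not the module defaults):

```yaml
- module: system
  syslog:
    enabled: true
    var.paths: ["/path/to/syslog*"]
  auth:
    enabled: true
    var.paths: ["/path/to/auth.log*"]
```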
::::{note}
You must enable at least one fileset in the module. Filesets are disabled by default.
::::
Run the `setup` command with the `--pipelines` and `--modules` options specified to load ingest pipelines for the modules you’ve enabled. This step also requires a connection to {{es}}. If you want to use a {{ls}} pipeline instead of ingest node to parse the data, skip this step.
```shell
filebeat setup --pipelines --modules system
```
Configure {{filebeat}} to send log lines to Kafka. To do this, in the `filebeat.yml` config file, disable the {{es}} output by commenting it out, and enable the Kafka output. For example:
```yaml
#output.elasticsearch:
  #hosts: ["localhost:9200"]
output.kafka:
  hosts: ["kafka:9092"]
  topic: "filebeat"
  codec.json:
    pretty: false
```
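With `codec.json` and `pretty: false`, each event is published to Kafka as one compact, single-line JSON message. A rough stdlib-Python sketch of what a consumer sees on the topic (the event fields here are illustrative; exact fields vary by module and {{filebeat}} version):

```python
import json

# An illustrative Filebeat event; the @metadata fields are what the
# Logstash pipeline later uses for index naming and pipeline routing.
event = {
    "@timestamp": "2024-05-01T12:00:00.000Z",
    "message": "May  1 12:00:00 myhost sshd[1234]: Accepted publickey for user",
    "@metadata": {"beat": "filebeat", "version": "8.13.0"},
}

# pretty: false means compact single-line JSON -- one event per Kafka message.
payload = json.dumps(event, separators=(",", ":"))
assert "\n" not in payload
print(payload)
```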
Start {{filebeat}}. For example:
```shell
filebeat -e
```
{{filebeat}} will attempt to send messages to Kafka and continue until Kafka is available to receive them.
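To confirm that events are actually reaching the topic, you can tail it with Kafka's console consumer. This is a sketch assuming a standard Kafka installation and the broker address used in the output config above:

```shell
# Run from the Kafka installation directory; reads the "filebeat"
# topic from the beginning and prints each JSON event to stdout.
bin/kafka-console-consumer.sh --bootstrap-server kafka:9092 \
  --topic filebeat --from-beginning
```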
::::{note}
Depending on how you’ve installed {{filebeat}}, you might see errors related to file ownership or permissions when you try to run {{filebeat}} modules. If you do, see Config File Ownership and Permissions in the Beats Platform Reference.
::::
On the system where {{ls}} is installed, create a {{ls}} pipeline configuration that reads from a Kafka input and sends events to an {{es}} output:
```
input {
  kafka {
    bootstrap_servers => "myhost:9092"
    topics => ["filebeat"]
    codec => json
  }
}

output {
  if [@metadata][pipeline] {
    elasticsearch {
      hosts => "https://myEShost:9200"
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
      pipeline => "%{[@metadata][pipeline]}" <1>
      user => "elastic"
      password => "secret"
    }
  } else {
    elasticsearch {
      hosts => "https://myEShost:9200"
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
      user => "elastic"
      password => "secret"
    }
  }
}
```
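The conditional keys off `[@metadata][pipeline]`, and the `index` and `pipeline` settings use {{ls}} sprintf field references that resolve against each event. A rough stdlib-Python sketch of that resolution (the event values are illustrative, not real output):

```python
from datetime import datetime

def resolve_index(event: dict) -> str:
    """Mimic "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"."""
    meta = event["@metadata"]
    ts = datetime.fromisoformat(event["@timestamp"])
    return f"{meta['beat']}-{meta['version']}-{ts:%Y.%m.%d}"

event = {
    "@timestamp": "2024-05-01T12:00:00+00:00",
    "@metadata": {"beat": "filebeat", "version": "8.13.0",
                  "pipeline": "filebeat-8.13.0-system-syslog-pipeline"},
}

# Events carrying a pipeline name are routed through that ingest pipeline;
# events without one fall through to the plain elasticsearch output.
print(resolve_index(event))               # filebeat-8.13.0-2024.05.01
print(event["@metadata"].get("pipeline"))
```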
1. Set the `pipeline` option to `%{[@metadata][pipeline]}`. This setting configures {{ls}} to select the correct ingest pipeline based on metadata passed in the event.

Start {{ls}}, passing in the pipeline configuration file you just defined. For example:
```shell
bin/logstash -f mypipeline.conf
```
{{ls}} should start a pipeline and begin receiving events from the Kafka input.
To visualize the data in {{kib}}, launch the {{kib}} web interface by pointing your browser to port 5601, for example, `http://127.0.0.1:5601`. Click **Dashboards**, then view the {{filebeat}} dashboards.