# How to process Cowrie output in an ELK stack
(Note: work in progress, instructions are not verified)
## Prerequisites
* Working Cowrie installation
* Cowrie JSON log file (enable the JSON output in `cowrie.cfg`)
* Java 8
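
Enabling the JSON log in `cowrie.cfg` looks roughly like this (a sketch; the section name and log path vary between Cowrie versions, so check your own config file):

```ini
[output_jsonlog]
enabled = true
logfile = log/cowrie.json
```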
## Installation
We'll walk through a simple installation, with the whole ELK stack running on the same machine as Cowrie.
* Add Elastic's repository and key
```
wget -qO - https://packages.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
echo "deb https://artifacts.elastic.co/packages/5.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-5.x.list
apt-get update
```
* Install logstash, elasticsearch and kibana
```
sudo apt-get install elasticsearch logstash kibana
```
* Set them to autostart
```
sudo update-rc.d elasticsearch defaults 95 10
sudo update-rc.d logstash defaults 95 10
sudo update-rc.d kibana defaults 95 10
```
## ElasticSearch Configuration
TBD
## Kibana Configuration
* Make a folder for logs
```
sudo mkdir /var/log/kibana
sudo chown kibana:kibana /var/log/kibana
```
* Change the following parameters in `/etc/kibana/kibana.yml` to reflect your server setup:
```
"server.host" - set to "localhost" if you use nginx for basic authentication, or to an external interface if you use X-Pack (see below)
"server.name" - the name of the server
"elasticsearch.url" - the address of the Elasticsearch instance
"elasticsearch.username", "elasticsearch.password" - needed only if you use X-Pack (see below)
"logging.dest" - the path to the log file (/var/log/kibana/kibana.log)
```
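
Put together, the edited part of `/etc/kibana/kibana.yml` might look like this (all values below are examples for a single-host setup without X-Pack; adjust them to your environment):

```yaml
server.host: "localhost"
server.name: "cowrie-kibana"
elasticsearch.url: "http://localhost:9200"
logging.dest: /var/log/kibana/kibana.log
```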
* Make sure the file `/etc/kibana/kibana.yml` contains a line like
```
tilemap.url: https://tiles.elastic.co/v2/default/{z}/{x}/{y}.png?elastic_tile_service_tos=agree&my_app_name=kibana
```
or your map visualizations won't have any background. When this file is created during the installation
of Kibana, it does _not_ contain such a line, not even in commented out form.
## Logstash Configuration
* Download GeoIP data
```
wget http://geolite.maxmind.com/download/geoip/database/GeoLite2-City.mmdb.gz
gunzip GeoLite2-City.mmdb.gz
```
* Place the database file somewhere in your filesystem and make sure the "logstash" user can read it
```
sudo mkdir -p /var/opt/logstash/vendor/geoip/
sudo mv GeoLite2-City.mmdb /var/opt/logstash/vendor/geoip
```
* Configure logstash
```
sudo cp logstash-cowrie.conf /etc/logstash/conf.d
```
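
For orientation, a minimal `logstash-cowrie.conf` follows this shape (a sketch: the log path, database location and hostnames below are assumptions and must match your actual setup):

```
input {
  file {
    path => "/home/cowrie/cowrie/log/cowrie.json"
    codec => json
  }
}

filter {
  geoip {
    source => "src_ip"
    database => "/var/opt/logstash/vendor/geoip/GeoLite2-City.mmdb"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  file {
    path => "/tmp/cowrie-logstash.log"
  }
  stdout {
    codec => rubydebug
  }
}
```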
* Make sure the configuration file is correct: check the input section (path), the filter section (GeoIP database location) and the output section (Elasticsearch hostname). Then restart logstash:
```
sudo service logstash restart
```
* By default, logstash writes debug logs to /tmp.
* To check whether logstash is working correctly, inspect that file:
```
tail /tmp/cowrie-logstash.log
```
* To test whether data is loaded into ElasticSearch, run the following query:
```
curl 'http://<hostname>:9200/_search?q=cowrie&size=5'
```
(Replace `<hostname>` with the name or IP address of the machine on which ElasticSearch is running, e.g. `localhost`.)
* If this gives output, your data is correctly loaded into ElasticSearch
* Once logstash is configured and working, remove the "file" and "stdout" blocks from the output section of the logstash configuration, so that /tmp does not keep filling up with debug logs.
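
For a quick sanity check you can pull just the hit count out of the search response instead of reading the whole JSON by eye. A sketch, with a canned response standing in for the real `curl` output:

```shell
# Canned response standing in for:
#   curl -s 'http://<hostname>:9200/_search?q=cowrie&size=0'
response='{"took":3,"hits":{"total":1234,"hits":[]}}'

# Extract the total hit count; anything above zero means events were indexed
echo "$response" | grep -o '"total":[0-9]*' | head -n1 | cut -d: -f2
# prints 1234
```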
## Distributed setup of sensors or multiple sensors on the same host
If you have multiple sensors, you will need to set up Filebeat to feed logstash with logs from all of them.
On the logstash server:
* Change "input" section of the logstash to the following:
```
input {
beats {
port => 5044
}
}
```
On the sensor servers:
* Install filebeat
```
wget -qO - https://packages.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
echo "deb https://artifacts.elastic.co/packages/5.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-5.x.list
sudo apt-get update
sudo apt-get install filebeat
```
* Set it to autostart
```
sudo update-rc.d filebeat defaults 95 10
```
* Configure filebeat
```
sudo cp filebeat-cowrie.conf /etc/filebeat/filebeat.yml
```
* Check the following parameters
```
paths - the path to Cowrie's JSON logs
logstash - the IP address of the logstash host
```
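
For reference, a minimal Filebeat 5.x configuration along these lines (the log path is an assumption; replace `<logstash-host>` with your logstash server's address):

```yaml
filebeat.prospectors:
- input_type: log
  paths:
    - /home/cowrie/cowrie/log/cowrie.json*

output.logstash:
  hosts: ["<logstash-host>:5044"]
```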
* Start filebeat
```
sudo service filebeat start
```
## Tuning ELK stack
* Refer to Elastic's documentation on proper system configuration for the best Elasticsearch performance
* Instead of installing nginx to restrict access to Kibana, you can use Elastic's official plugin "X-Pack" (https://www.elastic.co/products/x-pack)