# How to process Cowrie output in an ELK stack
(Note: work in progress, instructions are not verified)
## Prerequisites
* Working Cowrie installation
* Cowrie JSON log file (enable the JSON database logging in `cowrie.cfg`)
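Enabling the JSON log looks roughly like the fragment below. The exact section name and log path depend on your Cowrie version; compare against the shipped sample configuration before copying this:

```
[database_jsonlog]
logfile = log/cowrie.json
```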
## Installation
* Install Logstash and ElasticSearch
```
apt-get install logstash
apt-get install elasticsearch
```
* Install Kibana. The procedure differs per operating system, and Kibana needs additional components such as a web server.
## ElasticSearch Configuration
TBD
## Logstash Configuration
* Download GeoIP data
```
wget http://geolite.maxmind.com/download/geoip/database/GeoLiteCity.dat.gz
wget http://download.maxmind.com/download/geoip/database/asnum/GeoIPASNum.dat.gz
```
* Place these somewhere in your filesystem.
* Configure logstash
```
cp logstash-cowrie.conf /etc/logstash/conf.d
```
* Make sure the configuration file is correct: check the input section (log file path), the filter (GeoIP database locations) and the output (ElasticSearch hostname)
* Restart Logstash
```
service logstash restart
```
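The shipped `logstash-cowrie.conf` has roughly the structure sketched below. The paths and hostname here are placeholders you must adapt to your own install, and option names can differ between Logstash versions, so treat this as an outline rather than the actual file:

```
input {
  file {
    # path to the Cowrie JSON log -- adjust to your install
    path => "/home/cowrie/cowrie/log/cowrie.json"
    codec => json
  }
}

filter {
  geoip {
    source => "src_ip"
    # wherever you placed the downloaded GeoLiteCity database
    database => "/etc/logstash/GeoLiteCity.dat"
  }
}

output {
  elasticsearch {
    # hostname of your ElasticSearch instance
    host => "localhost"
  }
}
```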
* By default, this Logstash configuration also writes debug logs to `/tmp`
* To test whether Logstash is working correctly, check that file:
```
tail /tmp/cowrie-logstash.log
```
* To test whether data is loaded into ElasticSearch, run the following query:
```
http://<hostname>:9200/_search?q=cowrie&size=5
```
* If this query returns results, your data is correctly loaded into ElasticSearch