Merge pull request #71 from thyrlian/master

Add Elasticsearch-Logstash-Kibana (ELK) example
Romain Bélorgey 2020-07-24 11:34:58 +02:00 committed by GitHub
commit 72bc6b1a16
5 changed files with 8269 additions and 0 deletions

README.md

@@ -16,6 +16,7 @@ These samples provide a starting point for how to integrate different services u
## Samples of Docker Compose applications with multiple integrated services
- [`ASP.NET / MS-SQL`](https://github.com/docker/awesome-compose/tree/master/aspnet-mssql) - Sample ASP.NET core application
with MS SQL server database.
- [`Elasticsearch / Logstash / Kibana`](https://github.com/docker/awesome-compose/tree/master/elasticsearch-logstash-kibana) - Sample Elasticsearch, Logstash, and Kibana stack.
- [`Go / NGINX / MySQL`](https://github.com/docker/awesome-compose/tree/master/nginx-golang-mysql) - Sample Go application
with an Nginx proxy and a MySQL database.
- [`Go / NGINX / PostgreSQL`](https://github.com/docker/awesome-compose/tree/master/nginx-golang-postgres) - Sample Go

elasticsearch-logstash-kibana/README.md

@@ -0,0 +1,58 @@
## Compose sample application
### Elasticsearch, Logstash, and Kibana (ELK) in single-node mode
Project structure:
```
.
└── docker-compose.yml
```
[_docker-compose.yml_](docker-compose.yml)
```
services:
elasticsearch:
image: elasticsearch:7.8.0
...
logstash:
image: logstash:7.8.0
...
kibana:
image: kibana:7.8.0
...
```
## Deploy with docker-compose
```
$ docker-compose up -d
Creating network "elasticsearch-logstash-kibana_elastic" with driver "bridge"
Creating es ... done
Creating log ... done
Creating kib ... done
```
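To watch Logstash pick up and parse the sample log file while the stack starts, you can follow the service logs (an optional check; `logstash` is the service name defined in the compose file):
```
$ docker-compose logs -f logstash
```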
## Expected result
Listing the running containers should show three containers and their port mappings as below:
```
$ docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
173f0634ed33 logstash:7.8.0 "/usr/local/bin/dock…" 43 seconds ago Up 41 seconds 0.0.0.0:5000->5000/tcp, 0.0.0.0:5044->5044/tcp, 0.0.0.0:9600->9600/tcp, 0.0.0.0:5000->5000/udp log
b448fd3e9b30 kibana:7.8.0 "/usr/local/bin/dumb…" 43 seconds ago Up 42 seconds 0.0.0.0:5601->5601/tcp kib
366d358fb03d elasticsearch:7.8.0 "/tini -- /usr/local…" 43 seconds ago Up 42 seconds (healthy) 0.0.0.0:9200->9200/tcp, 0.0.0.0:9300->9300/tcp es
```
After the application starts, navigate to the following URLs in your web browser (a quick command-line check is also shown after the list):
* Elasticsearch: [`http://localhost:9200`](http://localhost:9200)
* Logstash: [`http://localhost:9600`](http://localhost:9600)
* Kibana: [`http://localhost:5601`](http://localhost:5601)
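As a quick sanity check from the command line (assuming the default port mappings shown above), both Elasticsearch and Logstash expose HTTP endpoints that should respond once the containers are up:
```
$ curl -s localhost:9200              # Elasticsearch cluster and version info
$ curl -s localhost:9600/_node/stats  # Logstash node stats API
```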
Stop and remove the containers:
```
$ docker-compose down
```
## Attribution
The [example Nginx logs](https://github.com/docker/awesome-compose/tree/master/elasticsearch-logstash-kibana/logstash/nginx.log) are copied from [here](https://github.com/elastic/examples/blob/master/Common%20Data%20Formats/nginx_json_logs/nginx_json_logs).

elasticsearch-logstash-kibana/docker-compose.yml

@@ -0,0 +1,50 @@
version: '3.8'
services:
  elasticsearch:
    image: elasticsearch:7.8.0
    container_name: es
    environment:
      discovery.type: single-node
      ES_JAVA_OPTS: "-Xms512m -Xmx512m"
    ports:
      - "9200:9200"
      - "9300:9300"
    healthcheck:
      test: ["CMD-SHELL", "curl --silent --fail localhost:9200/_cluster/health || exit 1"]
      interval: 10s
      timeout: 10s
      retries: 3
    networks:
      - elastic
  logstash:
    image: logstash:7.8.0
    container_name: log
    environment:
      discovery.seed_hosts: logstash
      LS_JAVA_OPTS: "-Xms512m -Xmx512m"
    volumes:
      - ./logstash/pipeline/logstash-nginx.config:/usr/share/logstash/pipeline/logstash-nginx.config
      - ./logstash/nginx.log:/home/nginx.log
    ports:
      - "5000:5000/tcp"
      - "5000:5000/udp"
      - "5044:5044"
      - "9600:9600"
    depends_on:
      - elasticsearch
    networks:
      - elastic
    command: logstash -f /usr/share/logstash/pipeline/logstash-nginx.config
  kibana:
    image: kibana:7.8.0
    container_name: kib
    ports:
      - "5601:5601"
    depends_on:
      - elasticsearch
    networks:
      - elastic
networks:
  elastic:
    driver: bridge
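The `elasticsearch` service above declares a healthcheck that curls `_cluster/health` inside the container. Once the stack is running, one way to confirm Docker considers the node healthy is to inspect the container state (a minimal check, using the `es` container name set in the compose file):
```
$ docker inspect --format '{{ .State.Health.Status }}' es
```
This should print `healthy` once the check has passed, matching the `(healthy)` status shown by `docker ps` earlier.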

elasticsearch-logstash-kibana/logstash/nginx.log (file diff suppressed because it is too large)

elasticsearch-logstash-kibana/logstash/pipeline/logstash-nginx.config

@@ -0,0 +1,30 @@
input {
  file {
    path => "/home/nginx.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  json {
    source => "message"
  }
  geoip {
    source => "remote_ip"
  }
  useragent {
    source => "agent"
    target => "useragent"
  }
}

output {
  elasticsearch {
    hosts => ["http://es:9200"]
    index => "nginx"
  }
  stdout {
    codec => rubydebug
  }
}
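This pipeline reads the mounted `nginx.log`, parses each line as JSON, enriches it with GeoIP and user-agent fields, and indexes the result into an `nginx` index on the `es` host. A quick way to verify that documents actually arrived (assuming the stack is running and ingestion has had time to finish) is to query the index through the published Elasticsearch port:
```
$ curl -s 'localhost:9200/nginx/_count?pretty'
$ curl -s 'localhost:9200/nginx/_search?size=1&pretty'
```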