In this post, I use Docker to implement what I previously did by running elasticsearch, kibana, and logstash separately to monitor a PostgreSQL DB.
The post on setting up ELK in a local environment can be found at the link below.
Deploying with Docker lets you copy someone else's development environment, versions, and settings and stand up the whole system with a single command, so I decided to study it and build the ELK stack this way, since the stack otherwise needs a lot of fiddly preparation. The up-front work has to be done carefully, but once it's properly in place, next time should be a breeze...
The full code is uploaded here.
Building Docker ELK
I start from the original open-source implementation.
- Run in the order: setup - elasticsearch - kibana - logstash
- Tie everything together under the same network (`elk`, `driver=bridge`)
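The compose snippets below only show each service joining `elk`; the network itself is declared once at the top level of `docker-compose.yml`. A minimal sketch, assuming the name and driver from the bullet above:

```yaml
# docker-compose.yml (top level) -- assumed declaration matching the bullet above
networks:
  elk:
    driver: bridge
```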
1. Setup Container
- Runs the setup Dockerfile
- entrypoint
- user passwords
- roles
- roles files
```yaml
setup:
  profiles:
    - setup
  build:
    context: setup/
    args:
      ELASTIC_VERSION: 8.8.1
  init: true
  volumes:
    - ./setup/entrypoint.sh:/entrypoint.sh:ro,Z
    - ./setup/lib.sh:/lib.sh:ro,Z
    - ./setup/roles:/roles:ro,Z
  environment:
    ELASTIC_PASSWORD: ${ELASTIC_PASSWORD:-}
    LOGSTASH_INTERNAL_PASSWORD: ${LOGSTASH_INTERNAL_PASSWORD:-}
    KIBANA_SYSTEM_PASSWORD: ${KIBANA_SYSTEM_PASSWORD:-}
    MONITORING_INTERNAL_PASSWORD: ${MONITORING_INTERNAL_PASSWORD:-}
  networks:
    - elk
  depends_on:
    - elasticsearch
```
Roles & User setup
Looking at the `entrypoint.sh` file:
```bash
declare -A users_roles
users_roles=(
  [logstash_internal]='logstash_writer'
  [monitoring_internal]='remote_monitoring_collector'
)

# --------------------------------------------------------
# Roles declarations

declare -A roles_files
roles_files=(
  [logstash_writer]='logstash_writer.json'
)

for role in "${!roles_files[@]}"; do
  log "Role '$role'"

  declare body_file
  body_file="${BASH_SOURCE[0]%/*}/roles/${roles_files[$role]:-}"

  if [[ ! -f "${body_file:-}" ]]; then
    sublog "No role body found at '${body_file}', skipping"
    continue
  fi

  sublog 'Creating/updating'
  ensure_role "$role" "$(<"${body_file}")"
done
```
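One detail worth noting in the loop above is how the role file path is built: `${BASH_SOURCE[0]%/*}` strips the script name from the script's own path, so role files are resolved relative to wherever `entrypoint.sh` lives (here, the container root, since the compose file mounts it at `/entrypoint.sh` next to `/roles`). A standalone sketch of just that expansion:

```bash
#!/usr/bin/env bash
# "${path%/*}" removes the shortest trailing "/..." segment, i.e. the file name.
# With the script mounted at /entrypoint.sh this leaves the empty string,
# so the role file resolves to /roles/<name>, matching the /roles volume mount.
script_path="/entrypoint.sh"      # stand-in for "${BASH_SOURCE[0]}"
role_file="logstash_writer.json"
body_file="${script_path%/*}/roles/${role_file}"
echo "$body_file"
```

Running it prints `/roles/logstash_writer.json`.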
Roles are created based on the role files under the `./roles/` directory (the `ensure_role` command). In our case, the role we need is `logstash_writer`.
```json
// roles/logstash_writer.json
{
  "cluster": ["manage_index_templates", "monitor", "manage_ilm"],
  "indices": [
    {
      "names": ["logs-generic-default", "logstash-*", "ecs-logstash-*"],
      "privileges": ["write", "create", "create_index", "manage", "manage_ilm"]
    },
    {
      "names": ["logstash", "ecs-logstash"],
      "privileges": ["write", "manage"]
    }
  ]
}
```
- Once that's done, the users are created.
```bash
declare -A users_passwords
users_passwords=(
  [logstash_internal]="${LOGSTASH_INTERNAL_PASSWORD:-}"
  [kibana_system]="${KIBANA_SYSTEM_PASSWORD:-}"
  [monitoring_internal]="${MONITORING_INTERNAL_PASSWORD:-}"
)

for user in "${!users_passwords[@]}"; do
  log "User '$user'"

  if [[ -z "${users_passwords[$user]:-}" ]]; then
    sublog 'No password defined, skipping'
    continue
  fi

  declare -i user_exists=0
  user_exists="$(check_user_exists "$user")"

  if ((user_exists)); then
    sublog 'User exists, setting password'
    set_user_password "$user" "${users_passwords[$user]}"
  else
    if [[ -z "${users_roles[$user]:-}" ]]; then
      suberr '  No role defined, skipping creation'
      continue
    fi

    sublog 'User does not exist, creating'
    create_user "$user" "${users_passwords[$user]}" "${users_roles[$user]}"
  fi
done
```
The predefined users are created through the `create_user` command. The passwords and other values live in the `.env` file.
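The skip/create/update branching above can be traced without a live cluster by stubbing out the Elasticsearch helpers (`check_user_exists` and the result strings below are stand-ins for illustration, not the real `lib.sh` implementations):

```bash
#!/usr/bin/env bash
# Stubbed trace of the user loop: the real API helpers are replaced so the
# control flow can be followed on its own.
declare -A users_passwords=( [logstash_internal]='changeme' [kibana_system]='' )
declare -A users_roles=( [logstash_internal]='logstash_writer' )

check_user_exists() { echo 0; }   # stub: pretend no user exists yet

results=()
for user in "${!users_passwords[@]}"; do
  if [[ -z "${users_passwords[$user]:-}" ]]; then
    results+=("skip $user: no password")          # empty password => skipped
  elif (( $(check_user_exists "$user") )); then
    results+=("set password for $user")           # existing user => password reset
  elif [[ -n "${users_roles[$user]:-}" ]]; then
    results+=("create $user with role ${users_roles[$user]}")
  else
    results+=("skip $user: no role")              # no role mapping => not created
  fi
done
printf '%s\n' "${results[@]}"
```

With these stubs, `logstash_internal` gets created with the `logstash_writer` role, while `kibana_system` (given an empty password here) is skipped.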
2. Elasticsearch Container
- After the first run, certificates are generated under the `config/certs` folder. Copy these certificates verbatim into `logstash/config/certs`, or you'll hit a permission error.
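A sketch of that copy step, exercised here on throwaway directories (in the real project the source is the `elasticsearch/config/certs` generated on first run and the destination is `logstash/config/certs`):

```bash
#!/usr/bin/env bash
set -euo pipefail
# Throwaway stand-ins for the real directories so the copy can be tried anywhere.
src="elasticsearch/config/certs"
dst="logstash/config/certs"
mkdir -p "$src" "$dst"
touch "$src/http_ca.crt"        # stand-in for the CA cert Elasticsearch generates
cp -r "$src/." "$dst/"          # copy the certificates verbatim, preserving layout
ls "$dst"
```

After the copy, `logstash/config/certs/http_ca.crt` exists and can be mounted into the logstash container.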
```yaml
elasticsearch:
  build:
    context: elasticsearch/
    args:
      ELASTIC_VERSION: ${ELASTIC_VERSION}
  volumes:
    - ./elasticsearch/config/elasticsearch.yml:/usr/share/elasticsearch/config/elasticsearch.yml:ro,Z
    - elasticsearch:/usr/share/elasticsearch/data:Z
  ports:
    - 9200:9200
    - 9300:9300
  environment:
    node.name: elasticsearch
    ES_JAVA_OPTS: -Xms512m -Xmx512m
    # Bootstrap password.
    # Used to initialize the keystore during the initial startup of
    # Elasticsearch. Ignored on subsequent runs.
    ELASTIC_PASSWORD: ${ELASTIC_PASSWORD:-}
    # Use single node discovery in order to disable production mode and avoid bootstrap checks.
    # see: https://www.elastic.co/guide/en/elasticsearch/reference/current/bootstrap-checks.html
    discovery.type: single-node
  networks:
    - elk
  restart: unless-stopped
```
3. Kibana Container
- Nothing much to tweak here
```yaml
kibana:
  build:
    context: kibana/
    args:
      ELASTIC_VERSION: ${ELASTIC_VERSION}
  volumes:
    - ./kibana/config/kibana.yml:/usr/share/kibana/config/kibana.yml:ro,Z
    - ./kibana/ca_1688430453313.crt:/usr/share/kibana/certs/ca_1688430453313.crt
  ports:
    - 5601:5601
  environment:
    KIBANA_SYSTEM_PASSWORD: ${KIBANA_SYSTEM_PASSWORD:-}
  networks:
    - elk
  depends_on:
    - elasticsearch
  restart: unless-stopped
```
4. Logstash Container
```yaml
logstash:
  build:
    context: logstash/
    args:
      ELASTIC_VERSION: ${ELASTIC_VERSION}
  volumes:
    - ./logstash/config/logstash.yml:/usr/share/logstash/config/logstash.yml:ro,Z
    - ./logstash/config:/usr/share/logstash/config:ro,Z
    - ./logstash/pipeline:/usr/share/logstash/pipeline:ro,Z
  ports:
    - 5044:5044
    - 50000:50000/tcp
    - 50000:50000/udp
    - 9600:9600
  environment:
    LS_JAVA_OPTS: -Xms256m -Xmx256m
    LOGSTASH_INTERNAL_PASSWORD: ${LOGSTASH_INTERNAL_PASSWORD:-}
  networks:
    - elk
  depends_on:
    - elasticsearch
  restart: unless-stopped
```
- This configuration is the baseline
- Depending on how you use logstash, additional setup is needed
https://github.com/elastic/logstash
- Files required inside `config`:
  - logstash.yml
  - jvm.options (included in the repo above)
  - log4j2.properties (included in the repo above)
  - certs
Integrating Docker ELK with PostgreSQL
1. PostgreSQL JDBC Driver mount
Mount the JDBC driver file as a volume on the logstash container in `docker-compose.yml`. Don't mount it read-only!

```yaml
- ./logstash/config/postgresql-42.6.0.jar:/usr/share/logstash/logstash-core/lib/jars/postgresql-42.6.0.jar
```

- Appending `:ro` to a path mounts it read-only; appending `:z` lets the mount be shared between containers.
2. Editing the logstash pipeline configuration file
The configuration I wrote when running ELK in the local environment looked like this:
```conf
input {
  jdbc {
    jdbc_driver_library => "./logstash/logstash-core/lib/jars/postgresql-42.6.0.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/searching"
    jdbc_user => "{USERNAME}"
    jdbc_password => "{PASSWORD}"
    jdbc_fetch_size => 2
    schedule => "* * * * *"
    statement => "select * from contents WHERE id > :sql_last_value ORDER BY id ASC"
    last_run_metadata_path => "./logstash/data/plugins/inputs/jdbc/logstash_jdbc_last_run"
    use_column_value => true
    tracking_column_type => "numeric"
    tracking_column => "id"
    type => "data"
  }
}

output {
  elasticsearch {
    hosts => ["https://localhost:9200"]
    cacert => './logstash/config/certs/http_ca.crt'
    ssl => true
    user => "logstash_internal"
    password => "x-pack-test-password"
    index => "contents"
  }
}
```
What needs to change:
- Rewrite the paths as Docker-container-internal paths
- Change the postgresql and elasticsearch addresses
- Insert the index template
Rewriting paths for the Docker container
```conf
input {
  jdbc {
    jdbc_driver_library => "/usr/share/logstash/logstash-core/lib/jars/postgresql-42.6.0.jar" # here
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/searching"
    jdbc_user => "{USERNAME}"
    jdbc_password => "{PASSWORD}"
    jdbc_fetch_size => 2
    schedule => "* * * * *"
    statement => "select * from contents WHERE id > :sql_last_value ORDER BY id ASC"
    last_run_metadata_path => "/usr/share/logstash/jdbc_last_run/logstash_jdbc_last_run_keyword" # here
    use_column_value => true
    tracking_column_type => "numeric"
    tracking_column => "id"
    type => "data"
  }
}

output {
  elasticsearch {
    hosts => ["https://localhost:9200"]
    cacert => '/usr/share/logstash/config/certs/http_ca.crt' # here
    ssl => true
    user => "logstash_internal"
    password => "x-pack-test-password"
    index => "contents"
  }
}
```
- The logstash container creates a `logstash` folder under `/usr/share/`, and its file system is laid out beneath it. Write the paths to match the mount paths in the `docker-compose.yml` file. The `jdbc_last_run` folder is newly attached.
```yaml
# docker-compose.yml
volumes:
  - ./logstash/config/logstash.yml:/usr/share/logstash/config/logstash.yml:ro,Z
  - ./logstash/pipeline:/usr/share/logstash/pipeline:ro,Z
  - ./logstash/config:/usr/share/logstash/config:ro,Z
  # new
  - ./logstash/config/postgresql-42.6.0.jar:/usr/share/logstash/logstash-core/lib/jars/postgresql-42.6.0.jar
  - ./logstash/jdbc_last_run:/usr/share/logstash/jdbc_last_run
```
Changing the postgresql and elasticsearch addresses
- Assuming docker compose is not running on the same machine as the DB, the postgreSQL connection address must be changed to an IP.
```conf
# pipeline.conf
...
jdbc_connection_string => "jdbc:postgresql://{IP ADDRESS}:5432/{SERVER NAME}"
```
- elasticsearch can be reached per container through the `elk` network set up inside Docker, instead of localhost (this part may be wrong)
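If that holds, the pipeline's output block would point at the compose service name rather than localhost (a sketch; the service name `elasticsearch` comes from this compose file):

```conf
# pipeline.conf
output {
  elasticsearch {
    hosts => ["https://elasticsearch:9200"]
    ...
  }
}
```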
Inserting the index template
- When running elasticsearch, kibana, and logstash separately, you can set the index template in kibana first and then start logstash; but here everything starts at once, so it's better to write the template as JSON in advance and configure it.
- *If you use elasticsearch 7.8.x or later, the index template structure changed to composable index templates. The format is different, so be careful.*
- Below is the index template written for this example
```json
// templates/content_template.json
{
  "index_patterns": ["contents*"],
  "template": {
    "settings": {
      "analysis": {
        "filter": {
          "custom_shingle_filter": {
            "max_shingle_size": "3",
            "min_shingle_size": "2",
            "type": "shingle"
          }
        },
        "char_filter": {
          "default_character_filter": {
            "type": "html_strip"
          }
        },
        "analyzer": {
          "whitespace_analyzer": {
            "filter": ["lowercase", "word_delimiter", "custom_shingle_filter"],
            "char_filter": ["html_strip"],
            "type": "custom",
            "tokenizer": "whitespace"
          },
          "nori_analyzer": {
            "filter": ["lowercase"],
            "char_filter": ["html_strip"],
            "type": "custom",
            "tokenizer": "korean_nori_tokenizer"
          }
        },
        "tokenizer": {
          "korean_nori_tokenizer": {
            "type": "nori_tokenizer",
            "decompound_mode": "mixed"
          },
          "custom-edge-ngram": {
            "token_chars": ["letter", "digit"],
            "min_gram": "1",
            "type": "edge_ngram",
            "max_gram": "10"
          }
        }
      },
      "number_of_shards": "48",
      "number_of_replicas": "0"
    },
    "mappings": {
      "properties": {
        "content": {
          "type": "text",
          "fields": {
            "search": {
              "type": "search_as_you_type",
              "doc_values": false,
              "max_shingle_size": 3,
              "analyzer": "whitespace_analyzer",
              "search_analyzer": "whitespace_analyzer",
              "search_quote_analyzer": "standard"
            }
          },
          "analyzer": "nori_analyzer",
          "search_analyzer": "nori_analyzer",
          "search_quote_analyzer": "standard"
        },
        "id": {
          "type": "long",
          "ignore_malformed": false,
          "coerce": true
        },
        "title": {
          "type": "text",
          "fields": {
            "search": {
              "type": "search_as_you_type",
              "doc_values": false,
              "max_shingle_size": 3,
              "analyzer": "whitespace_analyzer",
              "search_analyzer": "whitespace_analyzer",
              "search_quote_analyzer": "standard"
            }
          },
          "analyzer": "nori_analyzer",
          "search_analyzer": "nori_analyzer",
          "search_quote_analyzer": "standard"
        }
      }
    },
    "aliases": {}
  }
}
```
- Then mount the template you wrote as a volume in docker compose:

```yaml
...
- ./logstash/templates:/usr/share/logstash/templates
```
- Add the index template in the `pipeline.conf` file
```conf
output {
  ...
  index => "contents"
  manage_template => true
  template => "/usr/share/logstash/templates/content_template.json"
  template_name => "contents"
  template_overwrite => true
}
```
3. Run

```shell
$ docker-compose up
```
Common Issue
No Available connections
```
docker-elk-logstash-1 | [2023-12-19T07:33:09,096][ERROR][logstash.licensechecker.licensereader] Unable to retrieve license information from license server {:message=>"No Available connections"}
docker-elk-logstash-1 | [2023-12-19T07:33:09,103][ERROR][logstash.monitoring.internalpipelinesource] Failed to fetch X-Pack information from Elasticsearch. This is likely due to failure to reach a live Elasticsearch cluster.
```
=> A certificate problem: resolved by copying `elasticsearch/config/certs` into `logstash/config/certs` and mounting it as a volume.
+) Also check whether logstash `ssl` is set to true; it only worked properly when set to false.
jdbc driver - class not found
```
docker-elk-logstash-1 | [2023-12-19T08:15:49,228][ERROR][logstash.javapipeline ][postgresql] Pipeline error {
  :pipeline_id=>"postgresql",
  :exception=>
    #<LogStash::PluginLoadingError:
      #<Java::JavaLang::ClassNotFoundException: org.postgresql.Driver>. Are you sure you've included the correct jdbc driver in :jdbc_driver_library?>,
```
- The JDBC driver isn't being picked up.
- Redo the volume mount in docker compose. Note that mounting it read-only causes an error.

```yaml
# docker-compose.yml
logstash:
  ...
  volumes:
    ...
    - ./logstash/config/postgresql-42.6.0.jar:/usr/share/logstash/logstash-core/lib/jars/postgresql-42.6.0.jar
```
jdbc last run permission denied
- Most likely because the folder sat under a read-only mounted directory
- Resolved by pulling the folder out and mounting it separately:

```yaml
- ./logstash/jdbc_last_run:/usr/share/logstash/jdbc_last_run
```

Failed to install template - 400
- From elasticsearch version 7.8 onward, index templates must be specified in the composable index template format, so the template wouldn't take.
- It worked after rewriting the index template in that format.
```
Failed to install template {:message=>"Got response code '400' contacting Elasticsearch at URL 'http://elasticsearch:9200/_index_template/content-template'",
```
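For reference, the structural difference behind that 400: the legacy (pre-7.8) format keeps `settings`/`mappings` at the top level, while the composable format that the `_index_template` endpoint expects nests them under a `template` key, as in the template JSON above. Schematically (bodies elided):

```json
// legacy format -- rejected by _index_template with a 400
{ "index_patterns": ["contents*"], "settings": { }, "mappings": { } }

// composable format -- what _index_template expects
{ "index_patterns": ["contents*"], "template": { "settings": { }, "mappings": { } } }
```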