- What is Logstash?
  - A pipeline that collects and transforms data and forwards it to an analysis engine such as Elasticsearch.
  - The entry and exit points, i.e. INPUTS and OUTPUTS, must always be configured; FILTERS are optional (a minimal sketch of this three-block layout follows below).
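Conceptually, a pipeline definition is just those blocks written together. The snippet below is only a minimal sketch, not the configuration built in this post; the stdin/stdout plugins and the added "source" field are purely illustrative:

input {
  stdin { }                                          # read events typed into the terminal
}
filter {
  mutate { add_field => { "source" => "stdin" } }    # optional enrichment step
}
output {
  stdout { codec => rubydebug }                      # print each event as a structured map
}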
- Build environment
  - OS: Ubuntu 22.04
  - CPU: 4 vCPU
  - RAM: 16GB
  - DISK: 100GB (disk for the OS and program installation) + 100GB (disk for ELK analysis/storage files)
  - ETC: OpenStack (IaaS)
Installation and configuration
root@ubuntu:/usr/share/kibana# apt install logstash
Reading package lists... Done
Building dependency tree... Done
Reading state information... Done
The following NEW packages will be installed:
logstash
0 upgraded, 1 newly installed, 0 to remove and 37 not upgraded.
Need to get 352 MB of archives.
After this operation, 610 MB of additional disk space will be used.
Get:1 https://artifacts.elastic.co/packages/8.x/apt stable/main amd64 logstash amd64 1:8.12.0-1 [352 MB]
Fetched 352 MB in 1min 19s (4481 kB/s)
Selecting previously unselected package logstash.
(Reading database ... 186260 files and directories currently installed.)
Preparing to unpack .../logstash_1%3a8.12.0-1_amd64.deb ...
Unpacking logstash (1:8.12.0-1) ...
Setting up logstash (1:8.12.0-1) ...
Scanning processes...
Scanning candidates...
Scanning linux images...
Restarting services...
Service restarts being deferred:
/etc/needrestart/restart.d/dbus.service
systemctl restart getty@tty1.service
systemctl restart networkd-dispatcher.service
systemctl restart systemd-logind.service
systemctl restart unattended-upgrades.service
No containers need to be restarted.
No user sessions are running outdated binaries.
No VM guests are running outdated hypervisor (qemu) binaries on this host.
root@ubuntu:/etc/logstash# vi logstash.yml
# I changed the data path because the data collected by the pipeline will be stored on a separate disk.
path.data: /mnt/log_mnt1/logstash
:wq
# save
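If path.data points at a directory that does not exist yet, it should be created and handed over to the logstash user before the service runs, otherwise Logstash cannot write its queue there. A short sketch, assuming the mount point used above:

mkdir -p /mnt/log_mnt1/logstash
chown -R logstash:logstash /mnt/log_mnt1/logstash   # the .deb package runs Logstash as the logstash user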
Configuring Inputs and Outputs
vi /etc/logstash/conf.d/beats-input.conf
# create
input {
  beats {
    port => 5044
  }
}
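This opens TCP port 5044 for the Beats protocol; the Beats agents on client machines are what connect to it. For reference, the counterpart setting on the shipper side would look roughly like the lines below (the host address is a placeholder, and Filebeat itself is covered in a separate post):

# filebeat.yml on the client (sketch; replace the placeholder host)
output.logstash:
  hosts: ["<logstash-host>:5044"]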
root@ubuntu:/etc/logstash/conf.d# vi /etc/logstash/conf.d/elasticsearch-output.conf
# create
output {
  if [@metadata][pipeline] {
    elasticsearch {
      hosts => ["localhost:9200"]
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
      pipeline => "%{[@metadata][pipeline]}"
    }
  } else {
    elasticsearch {
      hosts => ["localhost:9200"]
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    }
  }
}
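As noted at the top, FILTERS are optional and none are used in this build, so events pass straight from the beats input to Elasticsearch. If parsing were needed later, another file in conf.d would be merged into the same pipeline, along the lines of this sketch (the file name and grok pattern are illustrative only):

# /etc/logstash/conf.d/filter.conf (hypothetical)
filter {
  grok {
    match => { "message" => "%{SYSLOGLINE}" }   # split syslog-style lines into fields
  }
}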
root@ubuntu:/mnt/log_mnt1# sudo -u logstash /usr/share/logstash/bin/logstash --path.settings /etc/logstash -t
# configuration test
Using bundled JDK: /usr/share/logstash/jdk
/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/concurrent-ruby-1.1.9/lib/concurrent-ruby/concurrent/executor/java_thread_pool_executor.rb:13: warning: method redefined; discarding old to_int
/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/concurrent-ruby-1.1.9/lib/concurrent-ruby/concurrent/executor/java_thread_pool_executor.rb:13: warning: method redefined; discarding old to_f
Sending Logstash logs to /var/log/logstash which is now configured via log4j2.properties
[2024-02-05T07:05:46,722][INFO ][logstash.runner ] Log4j configuration path used is: /etc/logstash/log4j2.properties
[2024-02-05T07:05:46,730][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"8.12.0", "jruby.version"=>"jruby 9.4.5.0 (3.1.4) 2023-11-02 1abae2700f OpenJDK 64-Bit Server VM 17.0.9+9 on 17.0.9+9 +indy +jit [x86_64-linux]"}
[2024-02-05T07:05:46,733][INFO ][logstash.runner ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dlogstash.jackson.stream-read-constraints.max-string-length=200000000, -Dlogstash.jackson.stream-read-constraints.max-number-length=10000, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED]
[2024-02-05T07:05:46,735][INFO ][logstash.runner ] Jackson default value override `logstash.jackson.stream-read-constraints.max-string-length` configured to `200000000`
[2024-02-05T07:05:46,736][INFO ][logstash.runner ] Jackson default value override `logstash.jackson.stream-read-constraints.max-number-length` configured to `10000`
[2024-02-05T07:05:46,747][INFO ][logstash.settings ] Creating directory {:setting=>"path.queue", :path=>"/mnt/log_mnt1/logstash/queue"}
[2024-02-05T07:05:46,748][INFO ][logstash.settings ] Creating directory {:setting=>"path.dead_letter_queue", :path=>"/mnt/log_mnt1/logstash/dead_letter_queue"}
[2024-02-05T07:05:47,333][INFO ][org.reflections.Reflections] Reflections took 103 ms to scan 1 urls, producing 132 keys and 468 values
[2024-02-05T07:05:47,647][INFO ][logstash.javapipeline ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
Configuration OK
[2024-02-05T07:05:47,648][INFO ][logstash.runner ] Using config.test_and_exit mode. Config Validation Result: OK. Exiting Logstash
# test succeeded
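The -t flag only validates the configuration and exits, so the service still has to be enabled and started. A sketch of the usual follow-up, assuming the .deb package installed the systemd unit (it does in recent releases) and that Elasticsearch is reachable without TLS/authentication on localhost:9200, as in the output configuration above:

systemctl enable --now logstash
systemctl status logstash --no-pager      # confirm the service is active
ss -lntp | grep 5044                      # the beats input should be listening
curl -s "localhost:9200/_cat/indices?v"   # indices appear once a Beat starts shipping data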