
ELK Setup Guide

2022-06-12 12:02:00 Drunken fish!

I haven't posted in a while, so today I'm sharing a lighter, hands-on operations article. More in-depth articles will follow later.

Spring Boot + Logstash + Elasticsearch + Kibana

Versions

  • elasticsearch 7.4.2
  • logStash 7.4.2
  • springboot 2.1.10

Download address

Select the product and version you need, then download it from the link below.

https://www.elastic.co/cn/downloads/past-releases

Deploy

Start Elasticsearch

  • Edit the elasticsearch.yml configuration file

    cluster.name: my-application
    node.name: node-1
    path.data: /cxt/software/maces/7.4.2/elasticsearch-7.4.2/data
    path.logs: /cxt/software/maces/7.4.2/elasticsearch-7.4.2/logs
    network.host: 0.0.0.0
    http.port: 9200
    discovery.seed_hosts: ["127.0.0.1"]
    cluster.initial_master_nodes: ["node-1"]
    
  • Start

    bin/elasticsearch
    
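  • Optional sanity check: assuming Elasticsearch is reachable on localhost:9200 as configured above, these requests should return the cluster info and health

    curl http://localhost:9200
    curl "http://localhost:9200/_cat/health?v"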

Start Kibana

  • Start it directly from the bin directory; everything runs locally, so no configuration changes are needed (a quick check follows the start command)

    bin/kibana
    
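  • Kibana listens on port 5601 by default; to confirm it is up you can open http://localhost:5601 in a browser, or make a rough check from the command line

    curl -I http://localhost:5601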

Start Logstash

  • Create springboot-log.conf under the config folder. This configuration makes Logstash listen on port 9600 on this machine, so the Spring Boot application can send its logs straight to port 9600. The input block receives the logs; the output block writes them to Elasticsearch. (A quick manual test of the input is shown after the start command below.)

    input {
      # Listen on TCP port 9600; each incoming JSON line becomes an event
      tcp {
        mode => "server"
        host => "0.0.0.0"
        port => 9600
        codec => json_lines
      }
    }

    output {
      elasticsearch {
        hosts => ["192.168.123.166:9200"]
        index => "springboot-logstash-%{+YYYY.MM.dd}"
      }
      # Uncomment to also print events to the console
      # stdout {
      #   codec => rubydebug
      # }
    }

  • Start

    bin/logstash -f config/springboot-log.conf
    
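  • Optional: a minimal way to test the TCP input by hand, assuming nc (netcat) is available and the host/port values from the configs above. The first command pushes one JSON line into Logstash; the second searches for it in Elasticsearch

    echo '{"message":"hello from netcat"}' | nc 192.168.123.166 9600
    curl "http://192.168.123.166:9200/springboot-logstash-*/_search?pretty&q=message:hello"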

Start the Spring Boot application

  • Add the logstash-logback-encoder dependency to pom.xml

    <dependency>
        <groupId>net.logstash.logback</groupId>
        <artifactId>logstash-logback-encoder</artifactId>
        <version>7.0</version>
    </dependency>
    
  • Add a test controller (it can be called with curl, as shown after the code)

    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;
    import org.springframework.web.bind.annotation.RequestMapping;
    import org.springframework.web.bind.annotation.RestController;

    @RestController
    public class TestController {
        public static final Logger log = LoggerFactory.getLogger(TestController.class);

        @RequestMapping("/test")
        public String test() {
            log.info("this is a log from springboot");
            log.trace("this is a trace log");
            return "success";
        }
    }
    
    
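  • Call the endpoint to generate a log entry (assuming the application runs on Spring Boot's default port 8080)

    curl http://localhost:8080/test
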
  • In the main class, add code that generates log messages on a schedule

    import java.util.Random;
    import java.util.concurrent.Executors;
    import java.util.concurrent.TimeUnit;

    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;
    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;

    @SpringBootApplication
    public class ElkApplication {
        public static final Logger log = LoggerFactory.getLogger(ElkApplication.class);
        Random random = new Random(10000);

        public static void main(String[] args) {
            SpringApplication.run(ElkApplication.class, args);
            new ElkApplication().initTask();
        }

        private void initTask() {
            // Log a random number every 100 ms so there is a steady stream of log data
            Executors.newSingleThreadScheduledExecutor().scheduleAtFixedRate(new Runnable() {
                @Override
                public void run() {
                    log.info("seed info msg :" + random.nextInt(999999));
                }
            }, 100, 100, TimeUnit.MILLISECONDS);
        }
    }
    
  • Create logback-spring.xml under resources

    <?xml version="1.0" encoding="UTF-8"?>
    <configuration>
        <include resource="org/springframework/boot/logging/logback/base.xml" />
        <appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
            <!-- Logstash service address -->
            <destination>192.168.123.166:9600</destination>
            <!-- Log output encoder -->
            <encoder charset="UTF-8" class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
                <providers>
                    <timestamp>
                        <timeZone>UTC</timeZone>
                    </timestamp>
                    <pattern>
                        <pattern>
                            {
                            "logLevel": "%level",
                            "serviceName": "${springAppName:-}",
                            "pid": "${PID:-}",
                            "thread": "%thread",
                            "class": "%logger{40}",
                            "detail": "%message"
                            }
                        </pattern>
                    </pattern>
                </providers>
            </encoder>
        </appender>
    
        <root level="INFO">
            <appender-ref ref="LOGSTASH" />
            <appender-ref ref="CONSOLE" />
        </root>
    </configuration>
    
    
    

Verification

  • Start everything in order, then open the elasticsearch-head plugin to view the index information. You can see both the messages from calling the test interface and the application startup messages. (If you do not use es-head, the curl command below lists the indices.)

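  • Without the es-head plugin, the same index information can be listed with curl (assuming Elasticsearch is at 192.168.123.166:9200)

    curl "http://192.168.123.166:9200/_cat/indices/springboot-logstash-*?v"
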
  • View the data in Kibana

    • Create an index pattern

      After entering the index pattern, choose the timestamp field so Kibana can filter by time

    • To view the data, open Discover

Adding Filebeat

  • In the Logstash directory, create filebeat-logstash-log.conf

    input {
      beats {
        host => "192.168.123.166"
        port => 9600
      }
    }

    output {
      elasticsearch {
        hosts => ["192.168.123.166:9200"]
        index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
      }
    }

  • Start

    bin/logstash -f filebeat-logstash-log.conf
    
  • Modify the Filebeat configuration file (filebeat.yml). The relevant changes are below: the log files to monitor and the Logstash server address to output to. (A start command for Filebeat is sketched after the configuration.)

    filebeat.inputs:
    
    - type: log
      enabled: true
      paths:
        - /cxt/codework/java/springboot-demo/logs/springboot-elk/2022-06-04/info.2022-06-04.0.log
    setup.kibana:
      host: "192.168.123.166:5601"
    #----------------------------- Logstash output --------------------------------
    output.logstash:
      # The Logstash hosts
      hosts: ["192.168.123.166:9600"]
    
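  • Start Filebeat from its installation directory (a typical invocation: -e logs to stderr, -c points at the configuration file), then confirm that a filebeat index shows up in Elasticsearch

    ./filebeat -e -c filebeat.yml
    curl "http://192.168.123.166:9200/_cat/indices/filebeat-*?v"
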
  • Configure where the Spring Boot application writes its log files: create logback-spring-file.xml under resources

    <?xml version="1.0" encoding="UTF-8"?>
    <configuration debug="false" scan="false">
    
        <!-- Log file path -->
        <property name="log.path" value="logs/springboot-elk"/>
    
        <!-- Console log output -->
        <appender name="console" class="ch.qos.logback.core.ConsoleAppender">
            <encoder>
                <pattern>%d{MM-dd HH:mm:ss.SSS} %-5level [%logger{50}] - %msg%n
                </pattern>
            </encoder>
        </appender>
    
        <!-- Log file info output -->
        <appender name="fileRolling_info" class="ch.qos.logback.core.rolling.RollingFileAppender">
            <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
                <fileNamePattern>${log.path}/%d{yyyy-MM-dd}/info.%d{yyyy-MM-dd}.%i.log</fileNamePattern>
                <timeBasedFileNamingAndTriggeringPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedFNATP">
                    <maxFileSize>50MB</maxFileSize>
                </timeBasedFileNamingAndTriggeringPolicy>
            </rollingPolicy>
            <encoder>
                <pattern>%date [%thread] %-5level [%logger{50}] %file:%line - %msg%n
                </pattern>
            </encoder>
            <!--<filter class="ch.qos.logback.classic.filter.LevelFilter"> <level>ERROR</level> <onMatch>DENY</onMatch> <onMismatch>NEUTRAL</onMismatch> </filter> -->
        </appender>
        <!-- Log file error output -->
        <appender name="fileRolling_error" class="ch.qos.logback.core.rolling.RollingFileAppender">
            <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
                <fileNamePattern>${log.path}/%d{yyyy-MM-dd}/error.%d{yyyy-MM-dd}.%i.log</fileNamePattern>
                <timeBasedFileNamingAndTriggeringPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedFNATP">
                    <maxFileSize>50MB</maxFileSize>
                </timeBasedFileNamingAndTriggeringPolicy>
            </rollingPolicy>
            <encoder>
                <pattern>%date [%thread] %-5level [%logger{50}] %file:%line - %msg%n
                </pattern>
            </encoder>
            <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
                <level>ERROR</level>
            </filter>
        </appender>
    
        <!-- Level: FATAL 0 ERROR 3 WARN 4 INFO 6 DEBUG 7 -->
        <root level="info">
            <!--{dev.start}-->
            <appender-ref ref="console"/>
            <!--{dev.end}-->
            <!--{alpha.start} <appender-ref ref="fileRolling_info" /> {alpha.end}-->
            <!-- {release.start}-->
            <appender-ref ref="fileRolling_info"/>
            <!-- {release.end}-->
            <appender-ref ref="fileRolling_error"/>
        </root>
        <!-- Framework level setting -->
        <!-- <include resource="config/logger-core.xml" />-->
    
        <!-- Project level setting -->
        <!-- <logger name="your.package" level="DEBUG" /> -->
        <logger name="org.springframework" level="INFO"></logger>
        <logger name="org.mybatis" level="INFO"></logger>
    </configuration>
    
    
  • Point application.yml at logback-spring-file.xml (a quick check of the resulting log files follows)

    logging:
      # The default logback-spring.xml ships logs to ES through Logstash;
      # switch to logback-spring-file.xml to write logs to rolling files and let Filebeat monitor them
      config: classpath:logback-spring-file.xml
    
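  • After restarting the application you should see the rolling log files being written; the path below follows the fileNamePattern configured above (run it from the application's working directory)

    tail -f logs/springboot-elk/$(date +%F)/info.$(date +%F).0.log
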
  • That completes the setup. The data flow is now:

    Spring Boot writes logs to logs/springboot-elk/*.log
    Filebeat monitors logs/springboot-elk/*.log and sends the data to Logstash
    Logstash writes the data to Elasticsearch
    Kibana queries Elasticsearch and renders the data

Okay, that's the whole ELK setup process, plus using Filebeat to watch log files; that's roughly the idea. It's been a while since I put this much effort into an article. More in-depth theoretical pieces will follow, and you're welcome to follow my WeChat official account 《Drunken fish JAVA》 to learn together.

github
