Log4j logging directly to Elasticsearch server


Problem Description

I'm a bit confused about how I can send my log entries directly to Elasticsearch (not Logstash). So far I found a few appenders (log4j.appender.SocketAppender, log4j.appender.server, etc.) that allow sending logs to a remote host, plus a ConversionPattern option that seems to let us format logs in an "elastic-friendly" way, but this approach looks freaky... or am I mistaken? Is this the only way to send logs to Elastic?

So far I have this config:

log4j.rootLogger=DEBUG, server
log4j.appender.server=org.apache.log4j.net.SocketAppender
log4j.appender.server.Port=9200
log4j.appender.server.RemoteHost=localhost
log4j.appender.server.ReconnectionDelay=10000
log4j.appender.server.layout.ConversionPattern={"debug_level":"%p","debug_timestamp":"%d{ISO8601}","debug_thread":"%t","debug_file":"%F", "debug_line":"%L","debug_message":"%m"}%n

But I get an error:

log4j:WARN Detected problem with connection: java.net.SocketException: Broken pipe (Write failed)

I can't find any useful examples, so I can't understand what I'm doing wrong and how to fix it. Thanks.
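The broken pipe is expected here: Log4j 1.x's SocketAppender writes serialized Java LoggingEvent objects over raw TCP, while port 9200 is Elasticsearch's HTTP API, so the server closes the connection. Sending a document directly would instead mean POSTing JSON over HTTP to the index API. A minimal sketch, assuming Elasticsearch 7+ (`_doc` endpoint) and a hypothetical index name `app-logs`, reusing the field names from the ConversionPattern above:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class DirectEsLogger {
    // Build a JSON document using the same field names as the
    // ConversionPattern above. Naive quote escaping; real code
    // should use a JSON library.
    static String buildLogJson(String level, String timestamp,
                               String thread, String message) {
        return String.format(
            "{\"debug_level\":\"%s\",\"debug_timestamp\":\"%s\","
            + "\"debug_thread\":\"%s\",\"debug_message\":\"%s\"}",
            level, timestamp, thread, message.replace("\"", "\\\""));
    }

    // POST one document to the index API (index name "app-logs" is
    // an assumption for this sketch).
    static void send(HttpClient client, String json) throws Exception {
        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("http://localhost:9200/app-logs/_doc"))
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(json))
            .build();
        client.send(request, HttpResponse.BodyHandlers.ofString());
    }
}
```

Doing one HTTP request per log event is slow, which is why appender libraries buffer events and use the `_bulk` API instead.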

Accepted Answer

I found a solution that fits my requirements best: Graylog. Since it's built on top of Elasticsearch, the usage is familiar, so I was able to switch to it immediately.

To use it, I added this dependency along with the basic log4j2 dependencies:

<dependency>
    <groupId>org.graylog2.log4j2</groupId>
    <artifactId>log4j2-gelf</artifactId>
    <version>1.3.2</version>
</dependency>

and used this log4j2.json configuration:

{
  "configuration": {
    "status": "info",
    "name": "LOGGER",
    "packages": "org.graylog2.log4j2",
    "appenders": {
      "GELF": {
        "name": "GELF",
        "server": "log.myapp.com",
        "port": "12201",
        "hostName": "my-awsome-app",
        "JSONLayout": {
          "compact": "false",
          "locationInfo": "true",
          "complete": "true",
          "eventEol": "true",
          "properties": "true",
          "propertiesAsList": "true"
        },
        "ThresholdFilter": {
          "level": "info"
        }
      }
    },
    "loggers": {
      "logger": [
        {
          "name": "io.netty",
          "level": "info",
          "additivity": "false",
          "AppenderRef": {
            "ref": "GELF"
          }
        }        
      ],
      "root": {
        "level": "info",
        "AppenderRef": [
          {
            "ref": "GELF"
          }
        ]
      }
    }
  }
}
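The GELF appender handles the wire format itself, but for illustration, the message it ships to Graylog is a small JSON document. A minimal sketch of the required GELF 1.1 fields (host name and message below are made up):

```java
public class GelfMessage {
    // Minimal GELF 1.1 payload: "version", "host", and "short_message"
    // are required; "timestamp" is seconds since the epoch and "level"
    // uses syslog severities (6 = informational).
    static String toGelf(String host, String shortMessage,
                         double timestamp, int level) {
        return String.format(
            "{\"version\":\"1.1\",\"host\":\"%s\",\"short_message\":\"%s\","
            + "\"timestamp\":%s,\"level\":%d}",
            host, shortMessage, timestamp, level);
    }
}
```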

Another Answer

I've written an appender for this: Log4J2 Elastic REST Appender, if you want to use it. It can buffer log events based on time and/or the number of events before sending them to Elastic (using the _bulk API so that it sends them all in one go). It has been published to Maven Central, so it's pretty straightforward.
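The `_bulk` endpoint mentioned above takes newline-delimited JSON: one action line followed by one source line per document, with every line (including the last) terminated by a newline. A sketch of assembling such a body, with a hypothetical index name:

```java
public class BulkBodyBuilder {
    // The _bulk API body is NDJSON: for each document, an action line
    // ({"index":{...}}) followed by the document itself, each line
    // terminated by '\n' -- including the final one.
    static String buildBulkBody(String index, String... jsonDocs) {
        StringBuilder body = new StringBuilder();
        for (String doc : jsonDocs) {
            body.append("{\"index\":{\"_index\":\"")
                .append(index).append("\"}}\n");
            body.append(doc).append('\n');
        }
        return body.toString();
    }
}
```

Batching this way is what makes buffering appenders cheap: one HTTP request carries many log events instead of one request per event.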

Another Answer

If you'd like to check out something new, my Log4j2 Elasticsearch Appenders will give you async logging in batches with failover.