Problem description
I am pretty new to Kafka. For a particular requirement, I have to push my Log4j logs directly to a Kafka topic.
I have a standalone Kafka installation running on CentOS, which I have verified with the Kafka publisher and consumer clients. I am also using the bundled ZooKeeper instance.
Now I have also created a standalone Java app with Log4j logging enabled, and I have edited the log4j.properties file as follows:
log4j.rootCategory=INFO

log4j.appender.file=org.apache.log4j.DailyRollingFileAppender
log4j.appender.file.DatePattern='.'yyyy-MM-dd-HH
log4j.appender.file.File=/home/edureka/Desktop/Anurag/logMe
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd'T'HH:mm:ss.SSS'Z'}{UTC} %p %C %m%n

log4j.logger.com=INFO,file,KAFKA

#Kafka Appender
log4j.appender.KAFKA=kafka.producer.KafkaLog4jAppender
log4j.appender.KAFKA.layout=org.apache.log4j.PatternLayout
log4j.appender.KAFKA.layout.ConversionPattern=%d{yyyy-MM-dd'T'HH:mm:ss.SSS'Z'}{UTC} %p %C %m%n
log4j.appender.KAFKA.ProducerType=async
log4j.appender.KAFKA.BrokerList=localhost:2181
log4j.appender.KAFKA.Topic=test
log4j.appender.KAFKA.Serializer=kafka.test.AppenderStringSerializer
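One detail in this configuration worth flagging: BrokerList is set to localhost:2181, which is the default ZooKeeper port. The old kafka.producer.KafkaLog4jAppender expects the address of a Kafka broker, which listens on port 9092 by default. A sketch of the likely fix, assuming the broker runs locally on its default port:

```
# Point the appender at the Kafka broker, not at ZooKeeper
# (2181 is ZooKeeper's port; 9092 is the broker's default).
log4j.appender.KAFKA.BrokerList=localhost:9092
```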
Now when I run the application, all the logs go into the local log file, but the consumer still does not show any entries. The topic I am using is test in both cases.
Also, no error log is being generated; the detailed output of the log4j library is as follows:
log4j: Trying to find [log4j.xml] using context classloader sun.misc.Launcher$AppClassLoader@a1d92a.
log4j: Trying to find [log4j.xml] using sun.misc.Launcher$AppClassLoader@a1d92a class loader.
log4j: Trying to find [log4j.xml] using ClassLoader.getSystemResource().
log4j: Trying to find [log4j.properties] using context classloader sun.misc.Launcher$AppClassLoader@a1d92a.
log4j: Using URL [file:/home/edureka/workspace/TestKafkaLog4J/bin/log4j.properties] for automatic log4j configuration.
log4j: Reading configuration from URL file:/home/edureka/workspace/TestKafkaLog4J/bin/log4j.properties
log4j: Parsing for [root] with value=[DEBUG, stdout, file].
log4j: Level token is [DEBUG].
log4j: Category root set to DEBUG
log4j: Parsing appender named "stdout".
log4j: Parsing layout options for "stdout".
log4j: Setting property [conversionPattern] to [%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n].
log4j: End of parsing for "stdout".
log4j: Setting property [target] to [System.out].
log4j: Parsed "stdout" options.
log4j: Parsing appender named "file".
log4j: Parsing layout options for "file".
log4j: Setting property [conversionPattern] to [%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n].
log4j: End of parsing for "file".
log4j: Setting property [file] to [/home/edureka/Desktop/Anurag/logMe].
log4j: Setting property [maxBackupIndex] to [10].
log4j: Setting property [maxFileSize] to [5MB].
log4j: setFile called: /home/edureka/Desktop/Anurag/logMe, true
log4j: setFile ended
log4j: Parsed "file" options.
log4j: Finished configuring.
2015-05-11 19:44:40 DEBUG TestMe:19 - This is debug : anurag
2015-05-11 19:44:40 INFO TestMe:23 - This is info : anurag
2015-05-11 19:44:40 WARN TestMe:26 - This is warn : anurag
2015-05-11 19:44:40 ERROR TestMe:27 - This is error : anurag
2015-05-11 19:44:40 FATAL TestMe:28 - This is fatal : anurag
2015-05-11 19:44:40 INFO TestMe:29 - message from log4j appender
Any help would be greatly appreciated. Thanks, AJ
Recommended answer
In your output, I don't see the KAFKA appender being created, so it's no wonder nothing is logged to Kafka. My guess is that this happens because you only log from a class named TestMe (probably in the default package), while the KAFKA appender is only attached to the logger named "com".
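One way to act on this (a sketch, not tested against the poster's setup): attach the KAFKA appender to the root logger, so that every class, including TestMe in the default package, logs through it:

```
# Route all logging through the file and KAFKA appenders,
# regardless of which package the logging class lives in.
log4j.rootLogger=INFO, file, KAFKA
```

Alternatively, keep the existing log4j.logger.com line and obtain the logger under that hierarchy, e.g. Logger.getLogger("com.example.TestMe") (the name com.example.TestMe is made up here; any logger whose name starts with "com" would inherit the KAFKA appender).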