
hadoop - How to configure MODERN log4j for compression - Stack Overflow


Previous log files (.1, .2, etc., and those from previous days) are not being compressed.

I have seen other articles, but they use different property names and settings than the official documentation.

I am trying to configure this for Hadoop 3, but what I see in Hadoop's log4j.properties and what I see in the Log4j documentation are different.

I believe the current log file is set by this (from Hadoop's log4j.properties):

log4j.appender.RFA=org.apache.log4j.RollingFileAppender
log4j.appender.RFA.File=${hadoop.log.dir}/${hadoop.log.file}

but the documentation says the current log is configured with fileName, not File, and other articles made reference to activeLog, which, I can only assume, belonged to an older iteration of the logger.
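As far as I can tell, the mismatch is a version difference: File, MaxFileSize, and MaxBackupIndex are Log4j 1.x RollingFileAppender settings, and that appender (to my knowledge) has no compression support at all, while fileName and filePattern belong to Log4j 2. A sketch of the 1.x style that Hadoop ships (property names from the stock log4j.properties; the size/index values here are illustrative):

```properties
# Log4j 1.x style -- what Hadoop's log4j.properties uses.
# Rolls by size only; rolled files get .1, .2, ... suffixes, never compressed.
log4j.appender.RFA=org.apache.log4j.RollingFileAppender
log4j.appender.RFA.File=${hadoop.log.dir}/${hadoop.log.file}
log4j.appender.RFA.MaxFileSize=256MB
log4j.appender.RFA.MaxBackupIndex=20
log4j.appender.RFA.layout=org.apache.log4j.PatternLayout
log4j.appender.RFA.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
```

If that reading is right, adding Log4j 2 keys like filePattern to a 1.x appender would simply be ignored as unknown properties.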

Furthermore, I am not really sure how

hadoop.log.file=hadoop.log

translates into

hadoop-hdfs-namenode-dr1-hmaster01.log
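My assumption is that the daemon start scripts override hadoop.log.file on the JVM command line (via -Dhadoop.log.file=...) with a name built from the user, the daemon command, and the hostname, so the hadoop.log default in the file is never used for daemons. A sketch of that substitution (variable names are illustrative, not the exact ones in hadoop-env.sh / the daemon scripts):

```shell
# Illustrative reconstruction of how the daemon scripts name the log file.
HADOOP_IDENT_STRING=hdfs          # the user the daemon runs as (assumption)
DAEMON_COMMAND=namenode           # the daemon being started (assumption)
HOSTNAME=dr1-hmaster01            # the machine's hostname

HADOOP_LOGFILE="hadoop-${HADOOP_IDENT_STRING}-${DAEMON_COMMAND}-${HOSTNAME}.log"
echo "$HADOOP_LOGFILE"            # hadoop-hdfs-namenode-dr1-hmaster01.log
# The scripts would then pass: -Dhadoop.log.file=$HADOOP_LOGFILE
```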

According to the documentation, compressed archives seem to be configured via filePattern, so I also added

log4j.appender.RFA.filePattern=${hadoop.log.dir}/${hadoop.log.file}.gz

but did not get any compressed file. However, when I changed

log4j.appender.RFA.File=${hadoop.log.dir}/${hadoop.log.file}

to

log4j.appender.RFA.File=${hadoop.log.dir}/${hadoop.log.file}.gz

I did get compressed files, but the current log also had a filename ending in .gz.

I adjusted the properties according to the Apache Log4j documentation.
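For reference, what I believe a working Log4j 2 configuration would look like, assuming the process is actually running Log4j 2 (e.g. reading a log4j2.properties) rather than the 1.x log4j.properties: in Log4j 2, fileName is the active log, filePattern must contain %i or %d, and a filePattern ending in .gz triggers gzip compression on rollover. A sketch (appender/property names on the Hadoop side are my assumption):

```properties
# Log4j 2 properties format -- only honored by Log4j 2, not the 1.x runtime.
appender.rfa.type = RollingFile
appender.rfa.name = RFA
appender.rfa.fileName = ${sys:hadoop.log.dir}/${sys:hadoop.log.file}
# %i is the rollover index; the .gz suffix makes Log4j 2 gzip each archive.
appender.rfa.filePattern = ${sys:hadoop.log.dir}/${sys:hadoop.log.file}.%i.gz
appender.rfa.layout.type = PatternLayout
appender.rfa.layout.pattern = %d{ISO8601} %p %c: %m%n
appender.rfa.policies.type = Policies
appender.rfa.policies.size.type = SizeBasedTriggeringPolicy
appender.rfa.policies.size.size = 256MB
appender.rfa.strategy.type = DefaultRolloverStrategy
appender.rfa.strategy.max = 20

rootLogger.level = info
rootLogger.appenderRef.rfa.ref = RFA
```

This would also explain the symptom above: setting File to a .gz name in the 1.x appender just renames the active log, it does not compress anything.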
