5. Add our logback.xml configuration file to Hadoop's configuration directory:
sudo mv ~/logback/logback.xml /etc/hadoop/conf/
The contents of the configuration file are as follows:
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <property name="LOG_DIR" value="${hadoop.log.dir}"/>
  <property name="LOG_FILE_NAME" value="${hadoop.log.file}"/>
  <!-- Output to file, rotating when necessary -->
  <appender name="ROLLING" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <file>${LOG_DIR}/${LOG_FILE_NAME}</file>
    <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
      <!-- roll over daily -->
      <fileNamePattern>${LOG_DIR}/${LOG_FILE_NAME}-%d{yyyy-MM-dd}.%i.log</fileNamePattern>
      <timeBasedFileNamingAndTriggeringPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedFNATP">
        <!-- or whenever the file size reaches 100MB -->
        <maxFileSize>100MB</maxFileSize>
      </timeBasedFileNamingAndTriggeringPolicy>
    </rollingPolicy>
    <encoder>
      <pattern>%date %level [%thread] %10logger [%file:%line] %msg%n</pattern>
    </encoder>
  </appender>
  <!-- Output to central logging via our in-house appender -->
  <appender name="CentralLogging">
    <appId>901240</appId>
    <serverIp>192.168.82.58</serverIp>
    <serverPort>63100</serverPort>
  </appender>
  <root level="INFO">
    <appender-ref ref="ROLLING"/>
    <appender-ref ref="CentralLogging"/>
  </root>
</configuration>
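One quick sanity check worth doing before deploying: if logback.xml is malformed, logback silently falls back to its default configuration. A minimal sketch of a well-formedness check, using a stand-in config written to a temp file (run the same `xmllint` line against the real `/etc/hadoop/conf/logback.xml`; `xmllint` ships with libxml2):

```shell
# Write a minimal stand-in config to a temp file:
CFG=$(mktemp /tmp/logback-XXXXXX.xml)
cat > "$CFG" <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <property name="LOG_DIR" value="/var/log/hadoop"/>
</configuration>
EOF
# xmllint exits non-zero on malformed XML, so RESULT is only set on success:
xmllint --noout "$CFG" && RESULT=well-formed
echo "$RESULT"
```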
Note that we have added our own CentralLogging appender.
6. Get the slf4j-log4j12-1.6.1 jar off the classpath! If it stays, the log4j and logback SLF4J bindings compete: SLF4J binds to only one of them, and in our case it picked the log4j binding, so logging still went through log4j... This pitfall cost us half a day.
sudo mv /usr/lib/hadoop/lib/slf4j-log4j12-1.6.1.jar /usr/lib/hadoop/lib/slf4j-log4j12-1.6.1.jar.bak
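Renaming the jar to `.bak` works because the classpath only picks up files matching `*.jar`. A minimal before/after sketch, using a scratch directory in place of `/usr/lib/hadoop/lib` (the paths and jar versions here are illustrative only):

```shell
# Scratch directory standing in for /usr/lib/hadoop/lib:
LIB=$(mktemp -d)
touch "$LIB/logback-classic-1.0.13.jar" "$LIB/slf4j-log4j12-1.6.1.jar"

# Two SLF4J bindings on the classpath -> SLF4J will use only one of them:
BEFORE=$(ls "$LIB" | grep -cE 'logback-classic|slf4j-log4j12.*jar$')

# Neutralize the log4j binding the same way as above:
mv "$LIB/slf4j-log4j12-1.6.1.jar" "$LIB/slf4j-log4j12-1.6.1.jar.bak"

# Only the logback binding still matches *.jar:
AFTER=$(ls "$LIB"/*.jar | grep -c 'slf4j-log4j12')
echo "bindings before: $BEFORE, log4j bindings after: $AFTER"
```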
OK, with the configuration in place, restart the Hadoop machines and we're done. But we're still not satisfied: it would be great to pull HBase's logs in as well.