Compiling the Hadoop source, step by step: the most concise procedure

Java: 1.7.0_79
Hadoop: hadoop-2.6.5-src.tar.gz

Maven: 3.3.9

protobuf: 2.5.0
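
Before starting, a quick sanity check that the JDK is installed and on the PATH:

java -version

It should report 1.7.0_79 (or whichever JDK is actually in use).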

Extract the downloaded archives with tar -zxvf, for example:
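
Assuming the archives were downloaded into /root/compileHadoop (the protobuf archive name below is an assumption based on the version listed above):

cd /root/compileHadoop
tar -zxvf hadoop-2.6.5-src.tar.gz
tar -zxvf protobuf-2.5.0.tar.gz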


1 Configure the Maven environment variables (add the exports below to ~/.bash_profile, then reload it)

export MAVEN_HOME=/root/compileHadoop/maven-3.3.9
export PATH=$PATH:$MAVEN_HOME/bin
source ~/.bash_profile

Verify that Maven was installed successfully:


mvn -version

[root@bigdatahadoop protobuf-2.5.0]# mvn -version
Apache Maven 3.3.9 (bb52d8502b132ec0a5a3f4c09453c07478323dc5; 2015-11-11T00:41:47+08:00)
Maven home: /root/compileHadoop/maven-3.3.9
Java version: 1.7.0_79, vendor: Oracle Corporation
Java home: /usr/java/jdk1.7.0_79/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-431.el6.x86_64", arch: "amd64", family: "unix"
 


2 Build and install protobuf (an install prefix can be specified at configure time; see the sketch after these commands)
  cd protobuf-2.5.0
  ./configure
  make
  make install
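
If a custom install location is preferred, pass --prefix to configure and add the resulting bin directory to the PATH (the prefix below is only an example):

  ./configure --prefix=/usr/local/protobuf
  make
  make install
  export PATH=$PATH:/usr/local/protobuf/bin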

Verify the installation: protoc --version

[root@bigdatahadoop protobuf-2.5.0]# protoc --version
libprotoc 2.5.0

Check that the C++ compiler is present, and install it if it is missing:

yum install  gcc-c++
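
To check whether it is already installed before running yum (either command works):

rpm -q gcc-c++
g++ --version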



3 With the prerequisites in place, start the build (for the full list of build requirements, see the BUILDING.txt file in the source root)

cd hadoop-2.6.5-src

mvn clean package -Pdist,native -DskipTests -Dtar
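
If the build succeeds, the binary distribution (including the native libraries) is packaged under hadoop-dist/target; a quick way to confirm, keeping in mind the exact file name depends on the version:

ls hadoop-dist/target/hadoop-2.6.5.tar.gz
ls hadoop-dist/target/hadoop-2.6.5/lib/native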

--------------------------------

Error 1

[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 07:39 min
[INFO] Finished at: 2016-10-12T21:50:39+08:00
[INFO] Final Memory: 36M/87M
[INFO] ------------------------------------------------------------------------
[ERROR] Unknown lifecycle phase "–Pdist,native". You must specify a valid lifecycle phase or a goal in the format <plugin-prefix>:<goal> or <plugin-group-id>:<plugin-artifact-id>[:<plugin-version>]:<goal>. Available lifecycle phases are: validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy, pre-clean, clean, post-clean, pre-site, site, post-site, site-deploy. -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1]
Cause: the command contained a non-ASCII dash (–), typically produced by a Chinese input method, instead of the ASCII hyphen (-) before Pdist,native. Retype the option with a plain hyphen.
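
One way to catch this kind of stray character is to save the command into a script file and search it for non-ASCII bytes (the file name build.sh here is just an example):

LC_ALL=C grep -n '[^ -~]' build.sh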

Error 2

[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:06 h
[INFO] Finished at: 2016-10-12T23:07:53+08:00
[INFO] Final Memory: 81M/320M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-common: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program "cmake" (in directory "/root/compileHadoop/hadoop-2.6.5-src/hadoop-common-project/hadoop-common/target/native"): error=2, No such file or directory
[ERROR] around Ant part ...<exec dir="/root/compileHadoop/hadoop-2.6.5-src/hadoop-common-project/hadoop-common/target/native" executable="cmake" failonerror="true">... @ 4:140 in /root/compileHadoop/hadoop-2.6.5-src/hadoop-common-project/hadoop-common/target/antrun/build-main.xml
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1]
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-common
[root@bigdatahadoop hadoop-2.6.5-src]#

Fix: install cmake, and extract Apache Ant:

yum install cmake
tar zxvf apache-ant-1.9.4-bin.tar.gz
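
Once Ant is extracted, it can be put on the PATH the same way as Maven, and the build resumed from the failed module as the log above suggests (the Ant directory name and the resume command are a sketch, not taken verbatim from the original steps):

export ANT_HOME=/root/compileHadoop/apache-ant-1.9.4
export PATH=$PATH:$ANT_HOME/bin
mvn package -Pdist,native -DskipTests -Dtar -rf :hadoop-common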
