Hadoop cmake maven protobuf
Problem description: Hadoop installed on 64-bit Linux frequently warns "libhadoop.so.1.0.0 which might have disabled stack guard". The cause is that the bundled native library is 32-bit, so Hadoop has to be recompiled by hand.
Hadoop version is 2.2.0; the operating system is Oracle Linux 6.3, 64-bit.
A concrete example and the resolution. The problem encountered:
[hadoop@hadoop01 input]$ hadoop dfs -put ./in
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /app/hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
13/10/24 04:08:55 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
put: `in': No such file or directory
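The final `put` error is a separate issue from the native-library warning: a relative HDFS path resolves under /user/<username>, and that directory does not exist on a fresh cluster. A minimal sketch (the `hdfs_home` helper is hypothetical, and the commented `hdfs dfs` commands need a running cluster):

```shell
# Relative HDFS paths resolve under the user's HDFS home directory.
# hdfs_home (a hypothetical helper) just builds that path.
hdfs_home() { echo "/user/$1"; }

# On a live cluster, create the home directory once, then retry:
#   hdfs dfs -mkdir -p "$(hdfs_home hadoop)"
#   hdfs dfs -put ./in
hdfs_home hadoop
```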
Check the local file:
[hadoop@hadoop01 input]$ file /app/hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0
/app/hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0: ELF 32-bit LSB shared object, Intel 80386, version 1 (SYSV), dynamically linked, not stripped
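The mismatch can be checked mechanically by comparing the machine word size with the library's ELF class. A sketch, assuming the library path from this article; `is_mismatch` is a hypothetical helper:

```shell
# is_mismatch ARCH ELFCLASS: prints "mismatch" when a 64-bit OS is
# paired with a 32-bit native library, "ok" otherwise.
is_mismatch() {
  if [ "$1" = "x86_64" ] && [ "$2" = "32-bit" ]; then
    echo mismatch
  else
    echo ok
  fi
}

# On a live system the two inputs would come from:
#   arch=$(uname -m)
#   elf=$(file -b /app/hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0 \
#         | grep -o '32-bit\|64-bit')
is_mismatch x86_64 32-bit
```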
This looks like a 32-bit vs. 64-bit issue; the same symptom is discussed in a thread on the hadoop.apache.org mailing list.
A 64-bit operating system running 32-bit software. Unfortunate: the freshly installed cluster is unusable.
Solution: recompile Hadoop. The fix is to rebuild the Hadoop software from source:
Download the source code. The machine needs network access; the build itself also downloads dependencies, so a fully offline build is impractical. If the target host is offline, do the work below on a networked machine of the same platform (a VM is fine) and copy the result back.
# svn checkout 'http://svn.apache.org/repos/asf/hadoop/common/tags/release-2.2.0'
Everything is downloaded into this directory:
[hadoop@hadoop01 hadoop]$ ls
BUILDING.txt hadoop-common-project hadoop-maven-plugins hadoop-tools
dev-support hadoop-dist hadoop-minicluster hadoop-yarn-project
hadoop-assemblies hadoop-hdfs-project hadoop-project pom.xml
hadoop-client hadoop-mapreduce-project hadoop-project-dist
Set up the development environment. 1. Required packages:
[root@hadoop01 /]# yum install svn
[root@hadoop01 ~]# yum install autoconf automake libtool cmake
[root@hadoop01 ~]# yum install ncurses-devel
[root@hadoop01 ~]# yum install openssl-devel
root@hadoop01 ~]# yum install gcc*
2. Install Maven. Download it, unpack it, and move it into place:
[root@hadoop01 stable]# mv apache-maven-3.1.1 /usr/local/
Add /usr/local/apache-maven-3.1.1/bin to the PATH environment variable.
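Making that permanent in a login shell can look like this (a sketch; the install prefix matches the step above):

```shell
# Typically appended to ~/.bash_profile or /etc/profile.d/maven.sh
M2_HOME=/usr/local/apache-maven-3.1.1
PATH="$M2_HOME/bin:$PATH"
export M2_HOME PATH

# Afterwards, `mvn -version` should report Apache Maven 3.1.1
echo "$PATH"
```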
3. Install protobuf. Without protobuf installed, the build cannot finish; it fails like this:
[INFO] --- hadoop-maven-plugins:2.2.0:protoc (compile-protoc) @ hadoop-common ---
[WARNING] [protoc, --version] failed: java.io.IOException: Cannot run program "protoc": error=2, No such file or directory
[ERROR] stdout: []
……………………
[INFO] Apache Hadoop Main ................................ SUCCESS [5.672s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [3.682s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [8.921s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.676s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [4.590s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [9.172s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [10.123s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [5.170s]
[INFO] Apache Hadoop Common .............................. FAILURE [1.224s]
[INFO] Apache Hadoop NFS ................................. SKIPPED
[INFO] Apache Hadoop Common Project ...................... SKIPPED
[INFO] Apache Hadoop HDFS ................................ SKIPPED
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
Installing protobuf. Download: https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz
(download listing: https://code.google.com/p/protobuf/downloads/list)
[root@hadoop01 protobuf-2.5.0]# pwd
/soft/protobuf-2.5.0
Run the following commands in order:
./configure
make
make check
make install
[root@hadoop01 protobuf-2.5.0]# protoc --version
libprotoc 2.5.0
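With protoc working, the actual 64-bit rebuild can be run from the source root. The Maven flags below follow BUILDING.txt in the checkout; the ldconfig step is an assumption for protobuf installed under /usr/local/lib:

```shell
# protobuf installs to /usr/local/lib; make sure the dynamic linker
# sees it (run as root), otherwise protoc can fail to start:
#   echo /usr/local/lib > /etc/ld.so.conf.d/protobuf.conf && ldconfig

# Build command per BUILDING.txt (dist + native profiles, tests
# skipped, tarball produced); run it in the release-2.2.0 source root:
BUILD_CMD="mvn package -Pdist,native -DskipTests -Dtar"
echo "$BUILD_CMD"
```

On success, the 64-bit native libraries end up under hadoop-dist/target/hadoop-2.2.0/lib/native/ and can replace the 32-bit lib/native directory of the installed cluster.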
Related reading:
Setting up a Hadoop environment on Ubuntu 13.04