$ cat ~/.bash_profile
# .bash_profile
# Get the aliases and functions
if [ -f ~/.bashrc ]; then
        . ~/.bashrc
fi
JAVA_HOME=/usr/java/default
HADOOP_INSTALL=/home/jiangzy/hadoop/hadoop-1.0.0
PATH=$PATH:$HOME/bin:$JAVA_HOME/bin
PATH=$PATH:$HADOOP_INSTALL/bin
export PATH
export JAVA_HOME
export HADOOP_INSTALL
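After saving ~/.bash_profile, reload it so the variables take effect in the current shell and check that the hadoop command is found on the PATH. A quick sanity check, assuming the paths above match the actual JDK and Hadoop install locations:
$ source ~/.bash_profile
$ hadoop version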
Hadoop Configuration
1. Set JAVA_HOME in conf/hadoop-env.sh (see the line shown right after this list)
2. The three XML configuration files
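For step 1, a single line in conf/hadoop-env.sh is enough; the value below mirrors the JAVA_HOME set in ~/.bash_profile above and should be adjusted if the JDK is installed elsewhere:
# conf/hadoop-env.sh: JDK used by the Hadoop start-up scripts
export JAVA_HOME=/usr/java/default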
[jiangzy@hadoop-2 hadoop-1.0.0]$ cat conf/hdfs-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.data.dir</name>
    <value>/home/jiangzy/hadoop/single/hdfs/data</value>
    <final>true</final>
  </property>
  <property>
    <name>dfs.name.dir</name>
    <value>/home/jiangzy/hadoop/single/hdfs/name</value>
    <final>true</final>
  </property>
</configuration>
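The directories named in dfs.name.dir and dfs.data.dir should be writable by the user running Hadoop; creating them up front avoids permission surprises (paths taken from the configuration above):
$ mkdir -p /home/jiangzy/hadoop/single/hdfs/name /home/jiangzy/hadoop/single/hdfs/data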
[jiangzy@hadoop-2 hadoop-1.0.0]$ cat conf/mapred-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
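The third XML file, conf/core-site.xml, is not reproduced above. For a single-node setup it usually only needs fs.default.name pointing at the local NameNode; the sketch below is an assumption for illustration (the port 9000 is the common example value, not confirmed from this installation):
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <!-- Assumed value: default filesystem URI for the local NameNode -->
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>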
Start and Test
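If HDFS has not been formatted yet, the NameNode needs a one-time format before the first start; skip this step if the name directory already holds data:
$ hadoop namenode -format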
[jiangzy@hadoop-2 hadoop-1.0.0]$ start-dfs.sh
[jiangzy@hadoop-2 hadoop-1.0.0]$ start-mapred.sh
[jiangzy@hadoop-2 hadoop-1.0.0]$ hadoop fs -put conf input
[jiangzy@hadoop-2 hadoop-1.0.0]$ hadoop jar hadoop-examples-1.0.0.jar grep input output 'dfs[a-z.]+'
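Once the example job completes, its result can be read back from HDFS (assuming the job wrote to the output directory as in the command above):
$ hadoop fs -cat output/*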
Status Check
[jiangzy@hadoop-2 hadoop-1.0.0]$ jps
4112 JobTracker
6229 Jps
3867 DataNode
4225 TaskTracker
4014 SecondaryNameNode
3753 NameNode
In addition, the status of HDFS and the MapReduce jobs can also be checked through the web interfaces.
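By default in Hadoop 1.0.x these web interfaces are the NameNode UI at http://localhost:50070/ and the JobTracker UI at http://localhost:50030/; the ports can be overridden in the configuration, so these are only the stock defaults.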