Hadoop: A Detailed Record of Debugging My First MapReduce Program

For setting up the development environment, refer to:
    <Setting up a Hadoop development environment with Eclipse on Windows 7>


1. The program code is as follows:

package wc;

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;

public class W2 {

    // Mapper: splits each input line into words and emits (word, 1).
    public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, one);
            }
        }
    }

    // Reducer (also used as the combiner): sums the counts for each word.
    public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        private IntWritable result = new IntWritable();

        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Point Hadoop at the local installation so winutils.exe can be found on Windows.
        System.setProperty("hadoop.home.dir", "E:/hadoop/hadoop-2.3.0");
        String[] otherArgs = new GenericOptionsParser(conf, args).getRemainingArgs();
        if (otherArgs.length != 2) {
            System.err.println("Usage: wordcount <in> <out>");
            System.exit(2);
        }

        // new Job(conf, ...) is deprecated in Hadoop 2.x but still works;
        // Job.getInstance(conf, ...) is the newer form.
        Job job = new Job(conf, "word count");
        job.setJarByClass(W2.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(otherArgs[0]));
        FileOutputFormat.setOutputPath(job, new Path(otherArgs[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

2. How to run:

In Eclipse, right-click in the W2.java editor area and choose Run As > Run on Hadoop to run the program.
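The job takes exactly two program arguments, the input path and the output path (this is what the otherArgs.length check in main enforces). As a purely hypothetical example, the program arguments in the Eclipse run configuration might look like the following, assuming HDFS is reachable at hdfs://localhost:9000 and the output directory does not exist yet:

    hdfs://localhost:9000/user/test/input hdfs://localhost:9000/user/test/output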

3. Runtime error (1):

Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/base/Preconditions
    at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:314)
    at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:327)
    at org.apache.hadoop.conf.Configuration.<clinit>(Configuration.java:409)
    at wc.WordCount.main(WordCount.java:82)
Caused by: java.lang.ClassNotFoundException: com.google.common.base.Preconditions
    at java.net.URLClassLoader$1.run(Unknown Source)
    at java.net.URLClassLoader$1.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    ... 4 more

The guava-r07.jar library is missing; add it to the project's build path.
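After adding the jar, a minimal sketch like the one below can confirm that the class is actually visible on the project classpath. ClasspathCheck is a hypothetical helper class, and the class name is taken from the error message above; the same check works for the missing classes in the errors that follow.

    // Hypothetical helper: prints which jar a class is loaded from,
    // or reports that it is still missing from the classpath.
    public class ClasspathCheck {
        public static void main(String[] args) {
            String className = "com.google.common.base.Preconditions"; // class named in the error
            try {
                Class<?> c = Class.forName(className);
                System.out.println(className + " loaded from "
                        + c.getProtectionDomain().getCodeSource().getLocation());
            } catch (ClassNotFoundException e) {
                System.out.println(className + " is still not on the classpath");
            }
        }
    }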

4. Runtime error (2):

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/util/PlatformName

The hadoop-auth-2.2.0.jar library is missing; it can be found at ./eclipse/configuration/org.eclipse.osgi/bundles/230/1/.cp/lib/hadoop-auth-2.2.0.jar.

5. Runtime error (3):

Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/LoggerFactory

Two libraries are missing:

/usr/local/eclipse/configuration/org.eclipse.osgi/bundles/230/1/.cp/lib/slf4j-api-1.7.5.jar

/usr/local/eclipse/configuration/org.eclipse.osgi/bundles/230/1/.cp/lib/slf4j-log4j12-1.7.5.jar

6. Runtime error (4):

Running the Hadoop job in Eclipse reports:

2014-12-11 20:12:01,750 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(996)) - fs.default.name is deprecated. Instead, use fs.defaultFS
SLF4J: This version of SLF4J requires log4j version 1.2.12 or later. See also #log4j_version
2014-12-11 20:12:02,760 WARN  [main] util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(62)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2014-12-11 20:12:02,812 ERROR [main] util.Shell (Shell.java:getWinUtilsPath(336)) - Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.

Fix:

Add System.setProperty("hadoop.home.dir", "d:/hadoop"); to the code (with the path pointing at the Hadoop directory on Windows), and check whether winutils.exe exists under that directory's bin folder; if it is not there, download one and copy it in.
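A small pre-flight sketch can verify the layout before submitting the job. WinutilsCheck is a hypothetical helper, and the path is the one used in W2.main above; adjust it to your own installation.

    import java.io.File;

    // Hypothetical pre-flight check: hadoop.home.dir must point at a directory
    // that actually contains bin\winutils.exe on Windows.
    public class WinutilsCheck {
        public static void main(String[] args) {
            String hadoopHome = "E:/hadoop/hadoop-2.3.0"; // same path as in W2.main
            System.setProperty("hadoop.home.dir", hadoopHome);
            File winutils = new File(hadoopHome, "bin/winutils.exe");
            System.out.println("winutils.exe present: " + winutils.exists());
        }
    }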

7. Runtime error (5):

The error is:

Exception in thread "main" java.lang.NoClassDefFoundError: com/google/protobuf/ServiceException
    at org.apache.hadoop.ipc.ProtobufRpcEngine.<clinit>(ProtobufRpcEngine.java:69)
    at java.lang.Class.forName0(Native Method)

Missing protobuf-java-2.4.0a.jar (path: /usr/local/app/apache-tomcat-6.0.37_9090/webapps/solr/WEB-INF/lib/protobuf-java-2.4.0a.jar).
