Hadoop 2.2 + Mahout 0.9 in Practice

shenghaomail 2014-04-09

Versions: Hadoop 2.2.0, Mahout 0.9.

We test using Mahout's org.apache.mahout.cf.taste.hadoop.item.RecommenderJob.

First, a caveat: if you take the official Hadoop 2.2.0 and Mahout 0.9 downloads and call Mahout's algorithms, you will get an error. It typically looks like this:

java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected
 at org.apache.mahout.common.HadoopUtil.getCustomJobName(HadoopUtil.java:174)
 at org.apache.mahout.common.AbstractJob.prepareJob(AbstractJob.java:614)
 at org.apache.mahout.cf.taste.hadoop.preparation.PreparePreferenceMatrixJob.run(PreparePreferenceMatrixJob.java:73)
 at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)

This happens because Mahout, as released, only supports Hadoop 1. A fix can be found at https://issues.apache.org/jira/browse/MAHOUT-1329; essentially, it modifies the pom files to change Mahout's Hadoop dependency.
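The MAHOUT-1329 patch works by giving the build a hadoop2 profile that pulls in Hadoop 2 client artifacts instead of the Hadoop 1 ones. The fragment below is only a rough sketch of the kind of profile involved, not the verbatim patch; see the JIRA issue for the exact diff.

```xml
<!-- Sketch of a Maven profile of the kind MAHOUT-1329 adds: activated by
     -Dhadoop2 and parameterized by -Dhadoop.2.version (exact artifacts and
     exclusions differ in the real patch). -->
<profile>
  <id>hadoop2</id>
  <activation>
    <property>
      <name>hadoop2</name>
    </property>
  </activation>
  <dependencies>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client</artifactId>
      <version>${hadoop.2.version}</version>
    </dependency>
  </dependencies>
</profile>
```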

You can get a working build in one of two ways:

1. Download the patched source package (Mahout 0.9 source with Hadoop 2 support).

2. Build Mahout yourself (mvn clean install -Dhadoop2 -Dhadoop.2.version=2.2.0 -DskipTests), or simply download the pre-built jar package.


Next, set up the Eclipse environment following this article: http://blog.csdn.net/fansy1990/article/details/22896249. Once the environment is configured, add the Mahout jars: download the jar package provided above and import it into your Java project.

Then write the following Java code:

package fz.hadoop2.util;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.yarn.conf.YarnConfiguration;

/**
 * Builds a shared Hadoop 2 (YARN) Configuration pointing at the cluster.
 */
public class Hadoop2Util {
	private static Configuration conf = null;

	private static final String YARN_RESOURCE = "node31:8032";
	private static final String DEFAULT_FS = "hdfs://node31:9000";

	public static Configuration getConf() {
		if (conf == null) {
			conf = new YarnConfiguration();
			conf.set("fs.defaultFS", DEFAULT_FS);                    // HDFS NameNode
			conf.set("mapreduce.framework.name", "yarn");            // run MapReduce on YARN
			conf.set("yarn.resourcemanager.address", YARN_RESOURCE); // ResourceManager
		}
		return conf;
	}
}


package fz.mahout.recommendations;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.util.ToolRunner;
import org.apache.mahout.cf.taste.hadoop.item.RecommenderJob;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;

import fz.hadoop2.util.Hadoop2Util;

/**
 * Tests Mahout's org.apache.mahout.cf.taste.hadoop.item.RecommenderJob.
 * Environment:
 * Mahout 0.9
 * Hadoop 2.2
 * @author fansy
 */
public class RecommenderJobTest {
	Configuration conf = null;

	@Before
	public void setUp() {
		conf = Hadoop2Util.getConf();
		System.out.println("Begin to test...");
	}

	@Test
	public void testMain() throws Exception {
		String[] args = {
			"-i", "hdfs://node31:9000/input/user.csv",  // input preference data
			"-o", "hdfs://node31:9000/output/rec001",   // recommendation output
			"-n", "3",                                  // recommendations per user
			"-b", "false",                              // input has explicit preference values
			"-s", "SIMILARITY_EUCLIDEAN_DISTANCE",      // item-item similarity measure
			"--maxPrefsPerUser", "7", "--minPrefsPerUser", "2",
			"--maxPrefsInItemSimilarity", "7",
			"--outputPathForSimilarityMatrix", "hdfs://node31:9000/output/matrix/rec001",
			"--tempDir", "hdfs://node31:9000/output/temp/rec001"};
		ToolRunner.run(conf, new RecommenderJob(), args);
	}

	@After
	public void cleanUp() {

	}
}
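The -i input above is a CSV of user preferences, one userID,itemID,preference triple per line. A minimal sketch of preparing such a file (the IDs and ratings here are made up for illustration):

```shell
# Hypothetical sample of user.csv in the format RecommenderJob expects:
# userID,itemID,preference, one rating per line.
cat > user.csv <<'EOF'
1,101,5.0
1,102,3.0
2,101,2.0
2,103,4.5
3,102,4.0
3,103,2.5
EOF
# Upload it to the input path used by the test (requires a running cluster):
# hadoop fs -put user.csv hdfs://node31:9000/input/user.csv
```

Because the third column carries explicit ratings, -b is false; for boolean data (views, clicks) you would drop the preference column and pass -b true. After the job finishes, the part files under /output/rec001 should contain one line per user: the user ID followed by a bracketed list of up to n recommended itemID:score pairs.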
