Java MapReduce Programming

  • Author: -逶迤唯一
  • Source: 51数据库
  • 2020-10-03
<modelVersion>4.0.0</modelVersion>

<groupId>com.cdh</groupId>
<artifactId>cdh-test</artifactId>
<version>1.0.0-SNAPSHOT</version>
<packaging>jar</packaging>

<name>cdh-test</name>
<url></url>

<properties>
  <hadoop.version>2.0.0-mr1-cdh4.4.0</hadoop.version>
  <hbase.version>0.94.6-cdh4.4.0</hbase.version>
  <project.build.sourceEncoding>utf-8</project.build.sourceEncoding>
  <maven.compiler.encoding>utf-8</maven.compiler.encoding>
</properties>

<build>
  <pluginManagement>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.1</version>
        <configuration>
          <encoding>utf-8</encoding>
          <source>1.6</source>
          <target>1.6</target>
        </configuration>
      </plugin>
    </plugins>
  </pluginManagement>

  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>2.1</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
        </execution>
      </executions>
    </plugin>

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-eclipse-plugin</artifactId>
      <version>2.9</version>
      <configuration>
        <buildOutputDirectory>eclipse-classes</buildOutputDirectory>
        <downloadSources>true</downloadSources>
        <downloadJavadocs>false</downloadJavadocs>
      </configuration>
    </plugin>
  </plugins>
</build>
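With the shade plugin bound to the package phase, building the runnable jar is a single Maven command. The jar name below assumes the version coordinate is `1.0.0-SNAPSHOT`:

```shell
# Build the project; the shade plugin runs automatically during the package phase
mvn clean package

# The shaded (fat) jar ends up under target/
ls target/cdh-test-1.0.0-SNAPSHOT.jar
```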



  Since the program isn't too complex, I'm just posting the code here; it is commented, and I'm sure everyone can follow it.
Code:
public static void main(String[] args) {
    // Run a MapReduce job from Eclipse on a local Windows machine.
    // Create a remote user and run the job as that user:
    // put the job-submission code inside the run() method body.
    UserGroupInformation ugi = UserGroupInformation.createRemoteUser("hadoop");
    ugi.doAs(new PrivilegedAction<Void>() {
        public Void run() {
            try {
                // Set the separator used for referenced jars.
                // Linux generally uses "," and ":", but Windows uses ";",
                // so override the local separator here.
                System.setProperty("path.separator", ":");

                Configuration conf = new Configuration();
                // Third-party jars can be shipped with the job if needed:
                // conf.set("tmpjars", "/tmp/jars/hbase-0.94.5.jar,/tmp/jars/protobuf-java-2.4.0a.jar,/tmp/jars/zookeeper-3.4.3.jar");
                conf.set("mapred.job.tracker", "172.168.8.54:9001");
                conf.set("fs.default.name", "hdfs://172.168.8.54:9000");
                conf.set("hadoop.job.ugi", "hadoop");
                conf.set("hbase.zookeeper.quorum", "172.168.8.55,172.168.8.56,172.168.8.57");
                conf.set("hbase.zookeeper.property.clientPort", "2181");

                Job job = new Job(conf);
                job.setJobName("readHbaseToHdfsAction");
                job.setJarByClass(ReadHbaseToHdfsAction.class);
                job.setNumReduceTasks(1);
                job.setReducerClass(ReadHbaseToHdfsReduce.class);

                // Remove the output directory if it already exists.
                FileSystem fs = FileSystem.get(conf);
                Path outPath = new Path("/tmp/2/1");
                if (fs.exists(outPath)) {
                    fs.delete(outPath, true);
                }
                FileOutputFormat.setOutputPath(job, outPath);
                job.setOutputKeyClass(Text.class);
                job.setOutputValueClass(Text.class);

                // Read the HBase table "misdn_catetory22" as the job input.
                Scan scan = new Scan();
                TableMapReduceUtil.initTableMapperJob("misdn_catetory22", scan,
                        ReadHbaseToHdfsMapper.class, Text.class, Text.class, job);

                job.waitForCompletion(true);
            } catch (Exception e) {
                e.printStackTrace();
            }
            return null;
        }
    });
}
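The ReadHbaseToHdfsMapper and ReadHbaseToHdfsReduce classes referenced above are not shown in the post. A minimal sketch of what they might look like, assuming a simple pass-through of row keys and values (this is an illustration, not the author's actual code):

```java
import java.io.IOException;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapper;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

// Hypothetical sketch: emits each HBase row key with a string rendering of the row.
public class ReadHbaseToHdfsMapper extends TableMapper<Text, Text> {
    @Override
    protected void map(ImmutableBytesWritable row, Result value, Context context)
            throws IOException, InterruptedException {
        context.write(new Text(Bytes.toString(row.get())), new Text(value.toString()));
    }
}

// Hypothetical sketch: writes values through unchanged; real logic depends on the use case.
class ReadHbaseToHdfsReduce extends Reducer<Text, Text, Text, Text> {
    @Override
    protected void reduce(Text key, Iterable<Text> values, Context context)
            throws IOException, InterruptedException {
        for (Text v : values) {
            context.write(key, v);
        }
    }
}
```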



  Steps for writing a MapReduce program in Eclipse:
1. Install the Hadoop plugin for Eclipse. Note: the plugin version must match your Hadoop version.
Download hadoop-eclipse-plugin-2.5.2.jar:
http://download.csdn.net/detail/tondayong1981/8186269
Copy hadoop-eclipse-plugin-2.5.2.jar into ECLIPSE_HOME/plugins and restart Eclipse.
2. Create a new Hadoop project in Eclipse:
File > New > Other > Map/Reduce Project > Next > enter a project name > Finish

Click the Map/Reduce perspective in the top-right corner.
You can now see the Map/Reduce Locations view; click the purple elephant icon on the right.

Since we are running on YARN, the Map/Reduce (V2) Master settings can be left unconfigured.
The IP and port for DFS Master are the value of fs.defaultFS in hadoop/etc/hadoop/core-site.xml.
You can now browse the remote Hadoop HDFS.
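For reference, the relevant core-site.xml entry looks like the following; the host and port are examples taken from the run arguments used later in this post, so substitute your own NameNode address:

```xml
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://10.6.9.226:9000</value>
  </property>
</configuration>
```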

3. Run a Map/Reduce job in Eclipse, using the WordCount example bundled with Hadoop.
First download the Hadoop source:
http://mirrors.cnnic.cn/apache/hadoop/common/hadoop-2.5.2/
Find WordCount.java under hadoop-2.5.2-src/hadoop-mapreduce-project/hadoop-mapreduce-examples/src/main/java/org/apache/hadoop/examples.
Copy WordCount.java into the src folder of the myhadoop project created earlier.
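WordCount itself just tokenizes each line and sums per-word counts. Stripped of the Hadoop plumbing, its core logic amounts to the following standalone sketch (not the example's actual code):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.StringTokenizer;

public class WordCountCore {
    // Count occurrences of each whitespace-separated token, as WordCount's
    // mapper (emit <word, 1>) and reducer (sum the 1s) do together.
    public static Map<String, Integer> count(String text) {
        Map<String, Integer> counts = new HashMap<>();
        StringTokenizer itr = new StringTokenizer(text);
        while (itr.hasMoreTokens()) {
            counts.merge(itr.nextToken(), 1, Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        System.out.println(count("hello world hello hadoop"));
    }
}
```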

Create the input file word.txt.

Create an input directory on HDFS, then upload word.txt to it:
$hadoop fs -mkdir /tmp/input
$hadoop fs -copyFromLocal /home/hadoop/word.txt /tmp/input/word.txt
Back in Eclipse, refresh the DFS view and you will see the uploaded file.

Run WordCount.java:
(1) In the newly created project, right-click WordCount.java --> Run As --> Run Configurations.
(2) In the Run Configurations dialog, right-click Java Application --> New; this creates a new application named WordCount.
(3) Set the run arguments: on the Arguments tab, in Program arguments, enter the input file and the directory where the results should be written, e.g.:
hdfs://10.6.9.226:9000/tmp/input/word.txt hdfs://10.6.9.226:9000/tmp/output
Click Run; when the job finishes, refresh DFS Locations and the output folder will appear.
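Equivalently, the same example can be run from the command line on the cluster, using the examples jar shipped with Hadoop 2.5.2 (the jar path below assumes a standard distribution layout):

```shell
# Run the bundled WordCount example against the same HDFS paths
hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.5.2.jar wordcount \
    hdfs://10.6.9.226:9000/tmp/input/word.txt hdfs://10.6.9.226:9000/tmp/output

# Inspect the result
hadoop fs -cat /tmp/output/part-r-00000
```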