Pseudo-Distributed Installation of Hadoop 2.6 on CentOS 7

1. Hadoop core configuration files (under the etc/hadoop directory of the Hadoop installation):

# gedit core-site.xml

<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

(Note: fs.default.name is deprecated in Hadoop 2.x in favor of fs.defaultFS, though the old name still works.)

# gedit hdfs-site.xml

<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>

(Note: dfs.replication sets the number of replicas kept for each HDFS block, 3 by default; since this pseudo-distributed setup has only one DataNode, it is reduced to 1.)

Hadoop reads mapred-site.xml, not the template file, so first copy the shipped template and then edit the copy:

# cp mapred-site.xml.template mapred-site.xml

# gedit mapred-site.xml

<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
  <property>
    <name>yarn.app.mapreduce.am.staging-dir</name>
    <value>/data/hadoop/staging</value>
  </property>
</configuration>

(Note: mapreduce.framework.name selects the execution framework for MapReduce jobs; the value yarn submits them to YARN. yarn.app.mapreduce.am.staging-dir is the staging directory used while jobs run.)

# gedit yarn-site.xml

<configuration>
  <!-- Site specific YARN configuration properties -->
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
  <property>
    <name>yarn.nodemanager.aux-services.mapreduce.shuffle.class</name>
    <value>org.apache.hadoop.mapred.ShuffleHandler</value>
  </property>
  <property>
    <name>yarn.resourcemanager.hostname</name>
    <value>Hadoop</value>
  </property>
</configuration>

(Note: YARN configuration. yarn.resourcemanager.hostname should be the hostname of the machine running the ResourceManager; here the machine is assumed to be named Hadoop.)

2. Format the HDFS file system

Before starting Hadoop for the first time, format its HDFS file system. If the Hadoop environment variables are set, the hadoop command can be run directly; otherwise, run the format command from Hadoop's bin directory:

$ hadoop namenode -format

(In Hadoop 2.x this form is deprecated; hdfs namenode -format is the preferred equivalent.)

At this point, the Hadoop installation and configuration are complete.

3. Start Hadoop

Enter Hadoop's sbin directory and start Hadoop to verify that the installation succeeded:

# ./start-all.sh

(start-all.sh is deprecated in Hadoop 2.x; running ./start-dfs.sh followed by ./start-yarn.sh does the same thing.)

Use the JDK's jps command to check that the Hadoop processes are running:

# jps
10197 NameNode
10769 ResourceManager
10579 SecondaryNameNode
11156 Jps
10898 NodeManager
10344 DataNode

If all of the processes above appear, Hadoop was installed successfully.

(Note: Hadoop 2.0 replaced the JobTracker and TaskTracker with YARN, so there is only a ResourceManager process here and no JobTracker or TaskTracker processes.)

Open a browser and go to:

http://localhost:50070/

If the NameNode status page loads, Hadoop is installed and running.

4. Run the WordCount example

Create two test files, file1.txt and file2.txt:

$ vi file1.txt

welcome to hadoop

hello world!

$ vi file2.txt

hadoop hello
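The two files can also be created without an interactive editor; a minimal sketch using here-documents:

```shell
# Create the sample inputs non-interactively (same content as the vi sessions above)
cat > file1.txt <<'EOF'
welcome to hadoop
hello world!
EOF

cat > file2.txt <<'EOF'
hadoop hello
EOF

# Confirm the contents
cat file1.txt file2.txt
```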

Create an input directory on HDFS:

$ hdfs dfs -mkdir /input

Upload file1.txt and file2.txt to the input directory on HDFS:

$ hdfs dfs -put file1.txt /input

$ hdfs dfs -put file2.txt /input

View the two files just uploaded (the NativeCodeLoader warning below is harmless; it only means the native Hadoop library could not be loaded, so built-in Java classes are used instead):

$ hdfs dfs -ls /input
14/10/25 14:43:43 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Found 2 items
-rw-r--r-- 1 hadoop supergroup 31 2014-10-25 14:43 /input/file1.txt
-rw-r--r-- 1 hadoop supergroup 13 2014-10-25 14:43 /input/file2.txt

Run the WordCount program that ships with Hadoop to count the words. Enter /opt/hadoop-2.6.0/share/hadoop/mapreduce and run:

$ hadoop jar hadoop-mapreduce-examples-2.6.0.jar wordcount /input /output

The job ran without errors. View the result (the first ls below failed because the path was mistyped: the / between /output and part-r-00000 was omitted):

$ hdfs dfs -ls /outputpart-r-00000
14/10/25 14:54:04 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
ls: `/outputpart-r-00000': No such file or directory

$ hdfs dfs -cat /output/part-r-00000
14/10/25 14:54:30 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
hadoop 2
hello 2
to 1
welcome 1
world! 1

The word counts are correct!
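As a sanity check, the same counts can be reproduced locally with standard Unix tools; this sketch recreates the two input files so it runs anywhere, independent of the cluster:

```shell
# Recreate the sample inputs so the check is self-contained
printf 'welcome to hadoop\nhello world!\n' > file1.txt
printf 'hadoop hello\n' > file2.txt

# One word per line, then count occurrences -- the same computation
# WordCount performs as a MapReduce job (uniq -c prints the count first)
cat file1.txt file2.txt | tr -s ' ' '\n' | sort | uniq -c
# -> 2 hadoop, 2 hello, 1 to, 1 welcome, 1 world!
```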