Installing and Configuring a Hadoop Environment on RedHat 6 -- Single-Instance Version

On RedHat 6.4 (serving as the Hadoop master, i.e. the NameNode, which records how all data is distributed) -- installing a single-instance machine
1. Modify the hostname
[root@hadoop01 ]# vim /etc/hosts
[root@hadoop01 ]# hostname
hadoop01
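A minimal sketch of what the relevant entries might look like, assuming this machine's IP is 172.19.15.151 (the address that appears later in the walkthrough). On RHEL 6 the persistent hostname lives in /etc/sysconfig/network, while /etc/hosts maps the name to the IP:

# /etc/sysconfig/network -- persistent hostname on RHEL 6
HOSTNAME=hadoop01

# /etc/hosts -- map the hostname to this machine's address
172.19.15.151   hadoop01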
2. Disable the firewall
[root@hadoop01]# service iptables stop
iptables: Setting chains to policy ACCEPT: filter [ OK ]
iptables: Flushing firewall rules: [ OK ]
iptables: Unloading modules: [ OK ]
[root@hadoop01 ]# chkconfig iptables off
[root@hadoop01 ]# vi /etc/selinux/config
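Inside /etc/selinux/config the usual change is to disable SELinux; a reboot (or setenforce 0 for the current session) is needed for it to take effect:

# /etc/selinux/config
SELINUX=disabled

# turn off enforcement immediately for the running system
setenforce 0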
3. Configure the Java environment
[root@hadoop01 ]# vi /etc/profile
export JAVA_HOME=/usr/lib/jvm/jre-1.7.0-openjdk.x86_64/
export JRE_HOME=$JAVA_HOME/jre
export CLASSPATH=$JAVA_HOME/lib:$JRE_HOME/lib:$CLASSPATH
export PATH=$JAVA_HOME/bin:$JRE_HOME/bin:$PATH
[root@hadoop01 ]# source /etc/profile //make the new environment take effect
[root@hadoop01 ]# mv hadoop-2.7.6 /opt
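The walkthrough assumes the Hadoop 2.7.6 binary tarball has already been downloaded and unpacked in the current directory; a typical sequence (the download URL is illustrative) looks like this, followed by a quick check that the OpenJDK runtime is visible:

# download and unpack Hadoop 2.7.6, then move it under /opt
wget https://archive.apache.org/dist/hadoop/common/hadoop-2.7.6/hadoop-2.7.6.tar.gz
tar -xzf hadoop-2.7.6.tar.gz
mv hadoop-2.7.6 /opt
java -version    # should report the 1.7.0 OpenJDK configured above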
4. Create the Hadoop data directories
mkdir /root/hadoop
mkdir /root/hadoop/tmp
mkdir /root/hadoop/var
mkdir /root/hadoop/dfs
mkdir /root/hadoop/dfs/name
mkdir /root/hadoop/dfs/data
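The same layout can be created in a single command with mkdir -p, which also creates any missing parent directories:

mkdir -p /root/hadoop/{tmp,var,dfs/name,dfs/data}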
5. Configure Hadoop
[root@hadoop01 hadoop]# pwd
/opt/hadoop-2.7.6/etc/hadoop
[root@hadoop01 hadoop]# vim hadoop-env.sh //set JAVA_HOME
export HADOOP_OPTS="-Djava.library.path=${HADOOP_HOME}/lib/native"
export JAVA_HOME=/usr/lib/jvm/jre-1.7.0-openjdk.x86_64/

[root@hadoop01 hadoop]# vim yarn-env.sh //set JAVA_HOME
export JAVA_HOME=/usr/lib/jvm/jre-1.7.0-openjdk.x86_64/

[root@hadoop01 hadoop]# vim core-site.xml
<configuration>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/root/hadoop/tmp</value>
        <description>A base for other temporary directories.</description>
    </property>
    <property>
        <name>io.file.buffer.size</name>
        <value>131072</value>
    </property>
    <property>
        <name>fs.default.name</name>
        <value>hdfs://hadoop01:9000</value>
    </property>
</configuration>
[root@hadoop01 hadoop]# vim hdfs-site.xml
<configuration>
    <property>
        <name>dfs.namenode.secondary.http-address</name>
        <value>hdfs://hadoop01:9001</value>
    </property>
    <property>
        <name>dfs.namenode.name.dir</name>
        <value>/root/hadoop/dfs/name</value>
        <description>Path on the local filesystem where the NameNode stores the namespace and transaction logs persistently.</description>
    </property>
    <property>
        <name>dfs.datanode.data.dir</name>
        <value>/root/hadoop/dfs/data</value>
        <description>Comma-separated list of paths on the local filesystem of a DataNode where it should store its blocks.</description>
    </property>
    <property>
        <name>dfs.replication</name>
        <value>3</value>
    </property>
    <property>
        <name>dfs.webhdfs.enabled</name>
        <value>true</value>
    </property>
</configuration>
[root@hadoop01 hadoop]# vim mapred-site.xml
<configuration>
    <property>
        <name>mapred.job.tracker</name>
        <value>hadoop01:9001</value>
    </property>
    <property>
        <name>mapred.local.dir</name>
        <value>/root/hadoop/var</value>
    </property>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
    <property>
        <name>mapreduce.jobhistory.address</name>
        <value>hadoop01:10020</value>
    </property>
    <property>
        <name>mapreduce.jobhistory.webapp.address</name>
        <value>hadoop01:19888</value>
    </property>
</configuration>
[root@hadoop01 hadoop]# vim yarn-site.xml
<configuration>
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
    <property>
        <name>yarn.resourcemanager.hostname</name>
        <value>hadoop01</value>
    </property>
    <property>
        <name>yarn.resourcemanager.webapp.address</name>
        <value>hadoop01:8088</value>
    </property>
    <property>
        <name>yarn.log-aggregation-enable</name>
        <value>true</value>
    </property>
    <property>
        <name>yarn.log-aggregation.retain-seconds</name>
        <value>86400</value>
    </property>
</configuration>
[root@hadoop01 hadoop]# vi /etc/profile
export HADOOP_HOME=/opt/hadoop-2.7.6
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
export PATH=.:${JAVA_HOME}/bin:${HADOOP_HOME}/bin:$PATH:$HADOOP_HOME/sbin
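After re-sourcing /etc/profile, a quick sanity check (assuming the variables above were exported correctly) is to ask the hadoop launcher for its version:

[root@hadoop01 hadoop]# source /etc/profile
[root@hadoop01 hadoop]# hadoop version //should report Hadoop 2.7.6 if HADOOP_HOME and PATH are correct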
6. Initialize and start Hadoop
[root@hadoop01 bin]# ./hadoop namenode -format //format (initialize) the NameNode
[root@hadoop01 current]# pwd
/root/hadoop/dfs/name/current
[root@hadoop01 current]# ls //verify that the format succeeded
fsimage_0000000000000000000 fsimage_0000000000000000000.md5 seen_txid VERSION
[root@hadoop01 sbin]# pwd
/opt/hadoop-2.7.6/sbin
[root@hadoop01 sbin]# ./start-dfs.sh //start HDFS
Starting namenodes on [hadoop01]
The authenticity of host 'hadoop01 (172.19.15.151)' can't be established.
RSA key fingerprint is 00:57:cb:a6:0d:28:05:17:58:0b:5f:8c:9d:2e:c5:1f.
Are you sure you want to continue connecting (yes/no)? yes
hadoop01: Warning: Permanently added 'hadoop01,172.19.15.151' (RSA) to the list of known hosts.
root@hadoop01's password:
hadoop01: starting namenode, logging to /opt/hadoop-2.7.6/logs/hadoop-root-namenode-hadoop01.out
root@localhost's password:
localhost: starting datanode, logging to /opt/hadoop-2.7.6/logs/hadoop-root-datanode-hadoop01.out
Starting secondary namenodes [0.0.0.0]
The authenticity of host '0.0.0.0 (0.0.0.0)' can't be established.
RSA key fingerprint is 00:57:cb:a6:0d:28:05:17:58:0b:5f:8c:9d:2e:c5:1f.
Are you sure you want to continue connecting (yes/no)? yes
0.0.0.0: Warning: Permanently added '0.0.0.0' (RSA) to the list of known hosts.
root@0.0.0.0's password:
0.0.0.0: starting secondarynamenode, logging to /opt/hadoop-2.7.6/logs/hadoop-root-secondarynamenode-hadoop01.out
[root@hadoop01 sbin]# ./start-yarn.sh //start YARN
[root@hadoop01 sbin]# start-yarn.sh
-bash: start-yarn.sh: command not found
[root@hadoop01 sbin]# ./start-yarn.sh
starting yarn daemons
starting resourcemanager, logging to /opt/hadoop-2.7.6/logs/yarn-root-resourcemanager-hadoop01.out
root@localhost's password:
localhost: starting nodemanager, logging to /opt/hadoop-2.7.6/logs/yarn-root-nodemanager-hadoop01.out
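The repeated password prompts above appear because the start scripts log in over SSH even on a single node, and root has no key-based login to itself. A minimal sketch of setting up passwordless SSH with standard OpenSSH commands:

[root@hadoop01 ~]# ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa //generate a key pair with an empty passphrase
[root@hadoop01 ~]# cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
[root@hadoop01 ~]# chmod 600 ~/.ssh/authorized_keys
[root@hadoop01 ~]# ssh hadoop01 hostname //should now connect without prompting for a password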
7. Check whether the daemons started successfully
[root@hadoop01 sbin]# jps
27916 DataNode
28629 Jps
28239 ResourceManager
27779 NameNode
28083 SecondaryNameNode
28531 NodeManager
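With all six daemons visible in jps, a further check from the standard Hadoop CLI is to ask HDFS for a cluster report, which should list one live DataNode:

[root@hadoop01 sbin]# hdfs dfsadmin -report //shows capacity and the single live DataNode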
8. Access the web UIs
http://172.19.15.151:8088/cluster (YARN ResourceManager UI)
http://172.19.15.151:50070 (HDFS NameNode UI)

