You can see that all of the Hadoop daemons have started.
Save and exit, then restart the sshd service:
[root@promote .ssh]# service sshd restart
Redirecting to /bin/systemctl restart sshd.service
[root@promote .ssh]# ps -ef|grep sshd
root      1995     1  0 22:33 ?        00:00:00 sshd: hadoop [priv]
hadoop    2009  1995  0 22:33 ?        00:00:00 sshd: hadoop@pts/0
root      4171     1  0 23:11 ?        00:00:00 /usr/sbin/sshd -D
root      4175  3397  0 23:12 pts/0    00:00:00 grep --color=auto sshd
Then switch back to the hadoop user and append the SSH public key to the /home/hadoop/.ssh/authorized_keys file:
[root@promote .ssh]# su hadoop
[hadoop@promote .ssh]$ cat id_rsa.pub >> authorized_keys
Change the permissions on the authorized_keys file to 644 (this step must not be skipped):
[hadoop@promote .ssh]$ chmod 644 authorized_keys
[hadoop@promote .ssh]$ ssh localhost
The authenticity of host 'localhost (127.0.0.1)' can't be established.
RSA key fingerprint is 25:1f:be:72:7b:83:8e:c7:96:b6:71:35:fc:5d:2e:7d.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added 'localhost' (RSA) to the list of known hosts.
Last login: Thu Feb 13 23:42:43 2014
The first login saves the server's host key to the /home/hadoop/.ssh/known_hosts file; subsequent logins will no longer prompt for a password.
[hadoop@promote .ssh]$ ssh localhost
Last login: Thu Feb 13 23:46:04 2014 from localhost.localdomain
This completes the SSH key configuration.
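Passwordless login fails silently if the permissions are too loose: with sshd's default `StrictModes`, the `.ssh` directory must be owner-only (700) and `authorized_keys` must not be group- or world-writable. A minimal sketch of the expected combination, run against a scratch directory so it is safe to execute anywhere (the real target is /home/hadoop/.ssh):

```shell
# Sketch: the permission combination sshd's StrictModes expects.
# A scratch directory stands in for the real /home/hadoop/.ssh.
demo=$(mktemp -d)
mkdir "$demo/.ssh"
touch "$demo/.ssh/authorized_keys"
chmod 700 "$demo/.ssh"                  # directory: owner-only access
chmod 644 "$demo/.ssh/authorized_keys"  # file: writable by owner only
stat -c '%a %n' "$demo/.ssh" "$demo/.ssh/authorized_keys"
```

If `ssh localhost` still asks for a password after this, `sshd -d` (debug mode) or /var/log/secure usually names the file whose permissions it rejected.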
Step 2: Install the JDK
[hadoop@promote ~]$ java -version
java version "1.7.0_25"
OpenJDK Runtime Environment (fedora-2.3.10.3.fc19-i386)
OpenJDK Client VM (build 23.7-b01, mixed mode)
Replace OpenJDK with Oracle's Java SE. First check the machine architecture:
[hadoop@promote .ssh]$ cd ~
[hadoop@promote ~]$ uname -i
i386
Download jdk-6u45-linux-i586.bin from Oracle's website (matching the i386 architecture reported above), upload it to the server, make it executable, run the installer, and finally delete the installer:
[hadoop@promote ~]$ chmod u+x jdk-6u45-linux-i586.bin
[hadoop@promote ~]$ ./jdk-6u45-linux-i586.bin
[hadoop@promote ~]$ rm -rf jdk-6u45-linux-i586.bin
[hadoop@promote conf]$ export PATH=$PATH:/home/hadoop/jdk1.6.0_45/bin
The following output confirms that the JDK was installed successfully:
[hadoop@promote ~]$ java -version
java version "1.6.0_45"
Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
Java HotSpot(TM) Client VM (build 20.45-b01, mixed mode, sharing)
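Note that an `export PATH=...` typed at the prompt only lasts for the current shell session. To make the setting survive logins, the usual approach is to append it to the user's profile. A sketch, using a scratch file in place of the real ~/.bash_profile (the JDK path matches the one installed above):

```shell
# Sketch: persist JAVA_HOME and PATH across logins.
# A scratch file stands in for the real ~/.bash_profile.
profile=$(mktemp)
cat >> "$profile" <<'EOF'
export JAVA_HOME=/home/hadoop/jdk1.6.0_45
export PATH=$PATH:$JAVA_HOME/bin
EOF
cat "$profile"
```

After appending to the real file, run `source ~/.bash_profile` so the current shell picks up the change immediately.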
Step 3: Install Hadoop
Download hadoop-1.2.1.tar.gz from the Hadoop website and upload it to /home/hadoop on the server, then unpack it and remove the archive:
[hadoop@promote ~]$ tar -xzf hadoop-1.2.1.tar.gz
[hadoop@promote ~]$ rm -rf hadoop-1.2.1.tar.gz
[hadoop@promote ~]$ cd hadoop-1.2.1/conf/
[hadoop@promote conf]$ vi hadoop-env.sh
Point JAVA_HOME at the JDK directory installed in step 2:
# The java implementation to use. Required.
export JAVA_HOME=/home/hadoop/jdk1.6.0_45
Save and exit.
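The same edit can be made without opening an editor, which helps when scripting the whole setup. A sketch with `sed`, run against a scratch file seeded with a commented-out default line similar to the stock hadoop-env.sh (the real target is hadoop-1.2.1/conf/hadoop-env.sh):

```shell
# Sketch: set JAVA_HOME in hadoop-env.sh non-interactively.
# A scratch file stands in for conf/hadoop-env.sh; the first line
# imitates the commented-out default shipped with Hadoop 1.x.
f=$(mktemp)
echo '# export JAVA_HOME=/usr/lib/j2sdk1.5-sun' > "$f"
sed -i 's|^# export JAVA_HOME=.*|export JAVA_HOME=/home/hadoop/jdk1.6.0_45|' "$f"
cat "$f"
```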
Step 4: Modify the Hadoop configuration files
Edit core-site.xml:
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
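The `fs.default.name` property tells Hadoop clients which filesystem to use by default, here the NameNode listening on localhost:9000. When scripting the install, the whole file can also be written in one shot with a heredoc; a sketch, with a scratch path standing in for the real hadoop-1.2.1/conf/core-site.xml:

```shell
# Sketch: write core-site.xml non-interactively.
# mktemp stands in for hadoop-1.2.1/conf/core-site.xml.
conf=$(mktemp)
cat > "$conf" <<'EOF'
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
EOF
grep -c '<value>hdfs://localhost:9000</value>' "$conf"   # sanity check: prints 1
```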