1. On the VM, edit /etc/hosts and comment out the line "127.0.0.1 node1".
2. Add "192.168.1.100 node1" (192.168.1.100 is the host machine's real IP).
3. Restart Hadoop: stop-all.sh; start-all.sh
4. Take HDFS out of safe mode: hadoop dfsadmin -safemode leave
Reference: "Hadoop service ports in a VMware virtual machine cannot be accessed"
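The four steps above can be sketched as shell commands. The hosts edit is demonstrated on a scratch copy rather than the real /etc/hosts; node1 and 192.168.1.100 are the names used in this note:

```shell
# Demo hosts file containing the problematic loopback mapping from step 1
printf '127.0.0.1 localhost\n127.0.0.1 node1\n' > hosts.demo

# Step 1: comment out the "127.0.0.1 node1" line
sed -i 's/^127\.0\.0\.1[[:space:]]*node1/# &/' hosts.demo

# Step 2: map node1 to the host's real IP (192.168.1.100 in this note)
echo '192.168.1.100 node1' >> hosts.demo
cat hosts.demo

# Steps 3-4 run on the cluster itself, so they are shown as comments:
#   stop-all.sh; start-all.sh            # restart Hadoop
#   hadoop dfsadmin -safemode leave      # take HDFS out of safe mode
```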
Map/Reduce (V2) Master:
  Host: node1
  Port: the same port configured in mapred-site.xml
DFS Master:
  Host: node1
  Port: the same port configured in core-site.xml
User name: the same user that runs Hadoop in the VM (ideally change the Windows 7 login name to match it).
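For reference, those ports come from entries like the following in the cluster's config files. This is only a sketch: 9000 and 9001 are commonly seen example ports, and the exact property names depend on your Hadoop version, so check your own files:

```xml
<!-- core-site.xml: the DFS Master port is the one in fs.defaultFS
     (9000 is only a common example, not necessarily your value) -->
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://node1:9000</value>
</property>

<!-- mapred-site.xml: the Map/Reduce Master port comes from here
     (mapred.job.tracker and 9001 are assumptions for illustration) -->
<property>
  <name>mapred.job.tracker</name>
  <value>node1:9001</value>
</property>
```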
Q2: Submitting jobs to a remote Hadoop cluster from Eclipse on Windows:
1. On Windows, set the HADOOP_HOME environment variable and add %HADOOP_HOME%\bin to PATH.
2. Put hadoop.dll into %WINDIR%\SysWOW64 (64-bit Windows) or %WINDIR%\System32 (32-bit);
put winutils.exe into %HADOOP_HOME%\bin.
3. Copy the following files from $HADOOP_HOME/etc/hadoop on the remote Hadoop server into the project's src folder:
core-site.xml, hdfs-site.xml, log4j.properties, mapred-site.xml, yarn-site.xml.
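Step 3 can be scripted. A sketch, assuming node1, the /usr/local/hadoop-2.6.4 install path used later in this note, and an Eclipse project whose source folder is src; the real transfer would use scp, which is mocked with a local directory here so the loop is runnable:

```shell
FILES="core-site.xml hdfs-site.xml log4j.properties mapred-site.xml yarn-site.xml"

# Local stand-in for hadoop@node1:/usr/local/hadoop-2.6.4/etc/hadoop
REMOTE=./mock-etc-hadoop
mkdir -p "$REMOTE" src
for f in $FILES; do touch "$REMOTE/$f"; done   # mock files; real ones live on the cluster

for f in $FILES; do
  # Real command: scp hadoop@node1:/usr/local/hadoop-2.6.4/etc/hadoop/$f src/
  cp "$REMOTE/$f" src/
done
ls src
```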
1) Edit mapred-site.xml and add the following:
<!-- Tell the client that the cluster runs Linux and that the job is
     being submitted across platforms (Windows -> Linux) -->
<property>
  <name>mapred.remote.os</name>
  <value>Linux</value>
</property>
<property>
  <name>mapreduce.app-submission.cross-platform</name>
  <value>true</value>
</property>
<!-- The classpaths must be the cluster's paths, not Windows paths -->
<property>
  <name>mapreduce.application.classpath</name>
  <value>
    /usr/local/hadoop-2.6.4/etc/hadoop,
    /usr/local/hadoop-2.6.4/share/hadoop/common/*,
    /usr/local/hadoop-2.6.4/share/hadoop/common/lib/*,
    /usr/local/hadoop-2.6.4/share/hadoop/hdfs/*,
    /usr/local/hadoop-2.6.4/share/hadoop/hdfs/lib/*,
    /usr/local/hadoop-2.6.4/share/hadoop/mapreduce/*,
    /usr/local/hadoop-2.6.4/share/hadoop/mapreduce/lib/*,
    /usr/local/hadoop-2.6.4/share/hadoop/yarn/*,
    /usr/local/hadoop-2.6.4/share/hadoop/yarn/lib/*
  </value>
</property>
<property>
  <name>yarn.application.classpath</name>
  <value>
    /usr/local/hadoop-2.6.4/etc/hadoop,
    /usr/local/hadoop-2.6.4/share/hadoop/common/*,
    /usr/local/hadoop-2.6.4/share/hadoop/common/lib/*,
    /usr/local/hadoop-2.6.4/share/hadoop/hdfs/*,
    /usr/local/hadoop-2.6.4/share/hadoop/hdfs/lib/*,
    /usr/local/hadoop-2.6.4/share/hadoop/mapreduce/*,
    /usr/local/hadoop-2.6.4/share/hadoop/mapreduce/lib/*,
    /usr/local/hadoop-2.6.4/share/hadoop/yarn/*,
    /usr/local/hadoop-2.6.4/share/hadoop/yarn/lib/*
  </value>
</property>
The above is adapted from: "Submitting MapReduce jobs remotely to a Hadoop cluster from Eclipse"
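The absolute paths above tie the client config to one specific cluster layout. Hadoop's own default for yarn.application.classpath instead uses environment variables that are resolved on the cluster side; a sketch of that form, assuming the standard HADOOP_* variables are set on the cluster:

```xml
<property>
  <name>yarn.application.classpath</name>
  <value>
    $HADOOP_CONF_DIR,
    $HADOOP_COMMON_HOME/share/hadoop/common/*,
    $HADOOP_COMMON_HOME/share/hadoop/common/lib/*,
    $HADOOP_HDFS_HOME/share/hadoop/hdfs/*,
    $HADOOP_HDFS_HOME/share/hadoop/hdfs/lib/*,
    $HADOOP_YARN_HOME/share/hadoop/yarn/*,
    $HADOOP_YARN_HOME/share/hadoop/yarn/lib/*
  </value>
</property>
```

This keeps the copied client config valid even if the cluster's install directory changes.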