I have a single-node cluster running on a Mac. I can create a directory in HDFS and put a file into it:
hadoop fs -mkdir sampledata
hadoop fs -put ~/sampletweets.csv sampledata
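To double-check the upload, listing the directory (just a sanity check I would run, not output I captured) should show the file:

hadoop fs -ls sampledata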
But when I run the simple streaming job below, I get the error shown after the code.
hadoop jar ~/cloudera/cdh5.5/hadoop/share/hadoop/mapreduce1/contrib/streaming/hadoop-streaming-2.6.0-mr1-cdh5.5.1.jar \
  -Dmapred.reduce.tasks=1 \
  -input sampledata \
  -output sampledata/output1 \
  -mapper cat \
  -reducer "wc -l"
The error on screen points to http://localhost:8088/cluster/app/application_1453348020587_0005:
Application application_1453348020587_0005 failed 2 times due to Error launching appattempt_1453348020587_0005_000002. Got exception: java.io.IOException: Failed on local exception: java.net.SocketException: Host is down; Host Details : local host is: "tanna-imac.local/192.168.1.13"; destination host is: "192.168.1.4":58893;
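For what it is worth, the mapper/reducer pair can be simulated locally without the cluster (a sketch, assuming sampletweets.csv is still in my home directory), which would rule out the pipeline itself:

# identity mapper piped into the line-count reducer, same as the streaming job
cat ~/sampletweets.csv | cat | wc -l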
My /etc/hosts:
cat /etc/hosts
127.0.0.1       localhost
255.255.255.255 broadcasthost
192.168.1.13    localhost
::1             localhost
fe80::1%lo0     localhost
I am not sure where it is picking up destination host 192.168.1.4 from, or what service is running on port 58893.
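For reference, these are the checks I know of for tracking down a stale address (the config path below is the usual CDH location, which is an assumption about this install):

# what the OS resolves the local hostname to
hostname
ping -c 1 $(hostname)

# where the ResourceManager/NodeManager addresses are configured
# (/etc/hadoop/conf is the typical CDH config dir; adjust for this install)
grep -B1 -A2 "yarn.resourcemanager" /etc/hadoop/conf/yarn-site.xml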