This article explains how to run jobs from Eclipse on Windows against a Hadoop cluster built on Ubuntu virtual machines in VirtualBox. The walkthrough is fairly detailed, so readers interested in this setup may find it a useful reference.
Running programs on the Ubuntu cluster from Eclipse on Windows
First, copy the Hadoop distribution from the Ubuntu master node to some path on Windows, for example E:\Spring\Hadoop\hadoop\hadoop-2.7.1.
Then install the Eclipse Hadoop plugin matching your Hadoop version, and set the Hadoop installation directory under Window → Preferences → Map/Reduce.
Error 1: NullPointerException
Exception in thread "main" java.lang.NullPointerException
at java.lang.ProcessBuilder.start(ProcessBuilder.java:441)
at org.apache.hadoop.util.Shell.runCommand(Shell.java:445)
at org.apache.hadoop.util.Shell.run(Shell.java:418)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:650)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:739)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:722)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:633)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:421)
at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:281)
at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:125)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:348)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1303)
at WordCount.main(WordCount.java:89)
Source: <http://bbs.csdn.net/topics/390876548>
Cause: the job client needs to read and write files on the Windows side but lacks the native support to do so. Place winutils.exe and hadoop.dll (matching your Hadoop version) in hadoop\bin and in System32, add an environment variable HADOOP_HOME with the value E:\Spring\Hadoop\hadoop\hadoop-2.7.1, append %HADOOP_HOME%\bin to Path, and restart Eclipse (the change does not take effect otherwise). The exception then no longer occurs.
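As an alternative when you cannot (or prefer not to) set a system-wide environment variable, Hadoop's native-binary lookup also honors the hadoop.home.dir JVM system property, which can be set at the top of your driver program. A minimal sketch, using the example path from above:

```java
public class HadoopHomeSetup {
    public static void main(String[] args) {
        // Must be set before any Hadoop class is loaded; equivalent to the
        // HADOOP_HOME environment variable. The path is this article's example
        // location for the copied Hadoop distribution.
        System.setProperty("hadoop.home.dir", "E:\\Spring\\Hadoop\\hadoop\\hadoop-2.7.1");
        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```

This only helps the local client locate winutils.exe; the binaries themselves still have to be present under %HADOOP_HOME%\bin.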
Error 2: Permission denied
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=cuiguangfan, access=WRITE, inode="/tmp/hadoop-yarn/staging/cuiguangfan/.staging":linux1:supergroup:drwxr-xr-x
The following link offers a useful hint:
http://www.huqiwen.com/2013/07/18/hdfs-permission-denied/
The fix is to set the HDFS user at the start of the program: System.setProperty("HADOOP_USER_NAME", "linux1");
Note that setting this as a Windows system environment variable has no effect.
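Concretely, the property has to be set before any Hadoop client call contacts the cluster. A minimal sketch, where linux1 is the user that owns the HDFS staging directory on this cluster:

```java
public class HadoopUserDemo {
    public static void main(String[] args) {
        // Run this before creating the Job or touching FileSystem; the Hadoop
        // client reads HADOOP_USER_NAME when it determines the remote user.
        System.setProperty("HADOOP_USER_NAME", "linux1");
        System.out.println(System.getProperty("HADOOP_USER_NAME"));
    }
}
```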
Error 3: no job control
2014-05-28 17:32:19,761 WARN org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor: Exception from container-launch with container ID: container_1401177251807_0034_01_000001 and exit code: 1
org.apache.hadoop.util.Shell$ExitCodeException: /bin/bash: line 0: fg: no job control
at org.apache.hadoop.util.Shell.runCommand(Shell.java:505)
at org.apache.hadoop.util.Shell.run(Shell.java:418)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:650)
at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:195)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:300)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:81)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:744)
Cause: the Hadoop environment variables in the container launch command are expanded Windows-style, so they are not set correctly on the Linux nodes.
Fix: override YARNRunner.
Reference: http://blog.csdn.net/fansy1990/article/details/27526167
That is, replace every %XX% with $XX, and every \\ with /. My version handles a few more cases; screenshot below (not reproduced here).
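The core of that substitution can be shown in isolation. A sketch of the rewrite the patched YARNRunner applies to the container launch command (the helper name and example input are illustrative, not from Hadoop's source):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class EnvVarFixer {
    // Rewrites Windows-style %VAR% references to Unix-style $VAR and flips
    // backslashes to forward slashes, mirroring the YARNRunner patch.
    public static String toUnix(String cmd) {
        Matcher m = Pattern.compile("%([A-Za-z_][A-Za-z0-9_]*)%").matcher(cmd);
        StringBuffer sb = new StringBuffer();
        while (m.find()) {
            // quoteReplacement: '$' is special in replacement strings
            m.appendReplacement(sb, Matcher.quoteReplacement("$" + m.group(1)));
        }
        m.appendTail(sb);
        return sb.toString().replace("\\", "/");
    }

    public static void main(String[] args) {
        System.out.println(toUnix("%JAVA_HOME%\\bin\\java")); // $JAVA_HOME/bin/java
    }
}
```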
Error 4: Invalid host name: local host is: (unknown); destination host is
Cause: the addresses of the services that run on the remote master are not fully configured on the client side.
Fix:
Set the relevant properties from hdfs-site.xml, mapred-site.xml and yarn-site.xml explicitly in the program, pointing them at the master node by IP.
Screenshot (not reproduced here):
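Since the original screenshot is lost, here is a sketch of the explicit client-side settings it showed. It assumes the hadoop-client 2.7.1 jars are on the classpath; 192.168.99.101 is this cluster's master, the ResourceManager ports are Hadoop's defaults, and the NameNode port 9000 is an assumption that must match the cluster's own config files:

```java
import org.apache.hadoop.conf.Configuration;

public class ClusterConf {
    // Sketch only: every value must agree with the cluster's hdfs-site.xml,
    // mapred-site.xml and yarn-site.xml.
    public static Configuration create() {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://192.168.99.101:9000"); // NameNode; port assumed
        conf.set("mapreduce.framework.name", "yarn");
        conf.set("yarn.resourcemanager.hostname", "192.168.99.101");
        conf.set("yarn.resourcemanager.address", "192.168.99.101:8032");
        conf.set("yarn.resourcemanager.scheduler.address", "192.168.99.101:8030");
        conf.set("yarn.resourcemanager.resource-tracker.address", "192.168.99.101:8031");
        return conf;
    }
}
```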
Error 5: in the web UI at 192.168.99.101:8088 (the view of all Applications), clicking through to a datanode page fails with host not found
Cause: the link uses a hostname of the form linuxX-cloud, which Windows cannot resolve to an IP address. Edit the Windows hosts file and copy in the entries configured on the master and slaves, i.e.:

192.168.99.101 linux0-cloud
192.168.99.100 linux1-cloud
192.168.99.102 linux2-cloud
192.168.99.103 linux3-cloud

With the hosts file updated, the remote address used in the conf settings can also be changed from the IP to linux0-cloud (the master node).
Addendum: after fixing Error 3 this error should disappear. If it does not, add the following to both mapred-site.xml and yarn-site.xml:

<property>
  <name>mapreduce.application.classpath</name>
  <value>
    /home/linux1/hadoop/hadoop-2.7.1/etc/hadoop,
    /home/linux1/hadoop/hadoop-2.7.1/share/hadoop/common/*,
    /home/linux1/hadoop/hadoop-2.7.1/share/hadoop/common/lib/*,
    /home/linux1/hadoop/hadoop-2.7.1/share/hadoop/hdfs/*,
    /home/linux1/hadoop/hadoop-2.7.1/share/hadoop/hdfs/lib/*,
    /home/linux1/hadoop/hadoop-2.7.1/share/hadoop/mapreduce/*,
    /home/linux1/hadoop/hadoop-2.7.1/share/hadoop/mapreduce/lib/*,
    /home/linux1/hadoop/hadoop-2.7.1/share/hadoop/yarn/*,
    /home/linux1/hadoop/hadoop-2.7.1/share/hadoop/yarn/lib/*
  </value>
</property>
Error 6: java.lang.RuntimeException: java.lang.ClassNotFoundException
Cause: this is a consequence of how MapReduce runs jobs. When a job is submitted, the framework copies the resources it needs — the job jar, the configuration files, and the computed input splits — into an HDFS directory named after the job ID, so that the task nodes can execute the map and reduce functions; the job jar is stored with a higher replication factor so every tasktracker can fetch a copy when running tasks. A program launched directly from Eclipse is not running from a jar, so no jar is uploaded to HDFS, nodes other than the client cannot find the map and reduce classes, and the tasks fail.
Fix: generate a temporary jar at runtime and set its path on the job.
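A sketch of that fix using only the JDK: package the compiled classes directory into a temporary jar that the framework can then ship to HDFS. The class and method names here are illustrative, not part of Hadoop:

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.URI;
import java.util.jar.Attributes;
import java.util.jar.JarEntry;
import java.util.jar.JarOutputStream;
import java.util.jar.Manifest;

public class TempJar {
    // Packages every file under classDir into a jar, preserving relative
    // paths, so it can serve as the job jar.
    public static File createTempJar(File classDir) throws IOException {
        File jar = File.createTempFile("job-", ".jar");
        jar.deleteOnExit();
        Manifest mf = new Manifest();
        mf.getMainAttributes().put(Attributes.Name.MANIFEST_VERSION, "1.0");
        try (JarOutputStream out = new JarOutputStream(new FileOutputStream(jar), mf)) {
            addTree(out, classDir, classDir.toURI());
        }
        return jar;
    }

    private static void addTree(JarOutputStream out, File f, URI root) throws IOException {
        if (f.isDirectory()) {
            for (File child : f.listFiles()) addTree(out, child, root);
        } else {
            // Entry name is the path relative to the classes root, e.g. "WordCount.class"
            out.putNextEntry(new JarEntry(root.relativize(f.toURI()).getPath()));
            try (InputStream in = new FileInputStream(f)) {
                in.transferTo(out); // Java 9+
            }
            out.closeEntry();
        }
    }
}
```

Before submitting, point the job at the generated jar, e.g. conf.set("mapreduce.job.jar", jar.getPath()) (this is the Hadoop 2.x property name; the legacy key is mapred.jar).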
That covers running jobs from Eclipse on Windows against a Hadoop cluster built on Ubuntu VMs in VirtualBox. Hopefully the notes above are of some help; if you found the article useful, feel free to share it so more people can see it.