This article covers a few problems you may run into when installing and configuring Hadoop, along with their causes and fixes.
使用環(huán)境:
ubuntu14.04 64位系統(tǒng)
java7
hadoop 2.4.1
A few points to note:
1. The download link on the official Hadoop site was unavailable, so I downloaded it from OSChina instead.
2. Be sure to perform all of the following operations under the same Linux account. Out of habit I used sudo to create configuration files and run commands, which caused problems later:
formatting HDFS
setting up passwordless SSH login for the account
creating and editing the configuration files
starting Hadoop, and so on
3. Watch out for problems caused by version differences in Hadoop, and pick a tutorial that matches your version. For example, compared with Hadoop 1.x, 2.x has no JobTracker or TaskTracker.
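A quick way to spot the kind of permission mismatch that note 2 warns about is to check who owns the files Hadoop will touch. The sketch below demonstrates the check on a scratch directory; the cluster paths mentioned in the comments (/data/server/hadoop, /tmp/hadoop-*) are assumptions based on this article's setup, not fixed locations:

```shell
# Demo of the ownership check, using a scratch dir in place of a real
# Hadoop tree (paths like /data/server/hadoop are assumptions):
DEMO_DIR=$(mktemp -d)
touch "$DEMO_DIR/current"                  # stands in for an HDFS metadata file
OWNER=$(stat -c '%U' "$DEMO_DIR/current")  # the user owning the file
echo "$OWNER"                              # should be the account that runs Hadoop
# On a real cluster, if this prints "root" because sudo was used, reclaim it:
#   sudo chown -R <your-user> /data/server/hadoop /tmp/hadoop-<your-user>*
rm -r "$DEMO_DIR"
```

If the owner printed here differs from the account you start Hadoop with, you will likely hit problems 1 and 3 below.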
幾個(gè)問(wèn)題:
問(wèn)題1、localhost: Error: JAVA_HOME is not set and could not be found.
修改hadoop目錄下/hadoop/etc/hadoop/hadoop-env.sh中的 $JAVA_HOME為絕對(duì)路徑
# The java implementation to use. #export JAVA_HOME=${JAVA_HOME} export JAVA_HOME=/usr/lib/jvm/java7
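If you are not sure what absolute path to put there, a common trick is to resolve the java binary on PATH and strip the /bin/java suffix. The path below is hypothetical and only illustrates the string manipulation; on a live system you would start from `readlink -f "$(which java)"`:

```shell
# Derive a JAVA_HOME candidate from the resolved path of the java binary.
JAVA_BIN=/usr/lib/jvm/java7/bin/java   # hypothetical resolved path
JAVA_HOME_GUESS=${JAVA_BIN%/bin/java}  # strip the trailing /bin/java
echo "$JAVA_HOME_GUESS"                # -> /usr/lib/jvm/java7
```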
問(wèn)題2、出現(xiàn)本地庫(kù)無(wú)法導(dǎo)入,據(jù)說(shuō)是64位機(jī)器會(huì)出現(xiàn)
錯(cuò)誤如下:
This script is Deprecated. Instead use stop-dfs.sh and stop-yarn.sh 14/08/10 07:07:57 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable Stopping namenodes on [Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /home/hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now. It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'. cluster1] sed: -e expression #1, char 6: unknown option to `s' -c: Unknown cipher type 'cd' ^Ccluster1: stopping namenode cluster1: stopping datanode VM: ssh: Could not resolve hostname VM: Name or service not known stack: ssh: Could not resolve hostname stack: Name or service not known
Fix: set the following variables in hadoop-env.sh (I simply appended them at the end of the file):

export HADOOP_HOME=/data/server/hadoop
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
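To confirm the fix took effect, you can check the library's word size with file and run Hadoop 2.x's built-in native-library self-check. The library path below assumes this article's install location; adjust it to yours:

```shell
# Verify the native library matches a 64-bit JVM (path assumes this
# article's install location):
file /data/server/hadoop/lib/native/libhadoop.so.1.0.0
# A healthy 64-bit build reports "ELF 64-bit LSB shared object".

# Hadoop's own self-check; each entry should report "true" after the fix:
hadoop checknative -a
```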
問(wèn)題3、sbin/start-all.sh啟動(dòng)時(shí)只有jps只有一個(gè)NodeManager問(wèn)題?
nob@hadoop0:/data/server/hadoop/sbin$ jps 9922 NodeManager 10236 Jps
原因:我格式化hdfs的時(shí)候使用了超級(jí)管理員sudo命令,啟動(dòng)的時(shí)候用戶權(quán)限不一致
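A recovery sketch for this situation, assuming a fresh single-node setup with the install at /data/server/hadoop, the user account nob, and HDFS data under /tmp (all assumptions from this article's setup; check hadoop.tmp.dir in core-site.xml for the real data location):

```shell
# Stop whatever did start (run everything from here on as the same user):
sbin/stop-all.sh

# Hand any root-owned files back to the normal account, then reformat.
# WARNING: formatting erases HDFS metadata; only safe on a fresh install.
sudo chown -R nob:nob /data/server/hadoop /tmp/hadoop-*
bin/hdfs namenode -format
sbin/start-dfs.sh
sbin/start-yarn.sh
```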
問(wèn)題4、sbin/start-all.sh 啟動(dòng)后使用jps查看沒(méi)有JobTracker和TaskTracker?
nob@hadoop0:/data/server/hadoop/sbin$ jps 9444 DataNode 9922 NodeManager 9633 SecondaryNameNode 9790 ResourceManager 9291 NameNode 10236 Jps
原因是: hadoop-2.x版本中不存在JobTracker和TaskTracker,可以參考博客 http://blog.csdn.net/skywalker_only/article/details/37905463,啟動(dòng)NameNode和DataNode的命令為start-dfs.sh,啟動(dòng)yarn的命令為start-yarn.sh。
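Putting the two commands together, a healthy single-node Hadoop 2.x startup looks like the sketch below; the five daemons named in the comments match the jps output above (PIDs will of course differ):

```shell
sbin/start-dfs.sh    # starts NameNode, DataNode, SecondaryNameNode
sbin/start-yarn.sh   # starts ResourceManager, NodeManager
jps                  # ResourceManager and NodeManager replace the old
                     # JobTracker and TaskTracker from Hadoop 1.x
```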