This article walks through how to recompile the Hadoop source code on CentOS.
Searching online for the cause of the warning "Unable to load native-hadoop library for your platform... using builtin-java classes where applicable", the explanation is that some of Hadoop's native libraries were compiled against C library versions that differ from the ones on the local machine; recompiling Hadoop in the local environment makes the warning go away.
The warning itself has little practical impact on using Hadoop.
Still, being the kind of programmer who can't leave these things alone, and having tried a few other approaches without success, I ended up compiling the source myself.
Switch to the root user.
Download the tar packages for Ant, Maven, Protocol Buffers, FindBugs, and CMake and put them in the /hadoop directory.
The versions I used:
[hadoop@vm1 Downloads]$ ls
apache-ant-1.9.5.tar.gz    findbugs-2.0.2.tar.gz    jdk-8u45-linux-x64.gz
apache-maven-3.0.5.tar.gz  hadoop-2.7.0-src.tar.gz  protobuf-2.5.0
cmake-2.8.6                hadoop-2.7.0.tar.gz      protobuf-2.5.0.tar.gz
cmake-2.8.6.tar.gz         jdk-7u79-linux-x64.gz
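If you still need to fetch them, mirror URLs along these lines should work (the URLs are assumptions based on the usual archive layouts, not taken from the original article; any mirror carrying the same versions is fine):

wget https://archive.apache.org/dist/ant/binaries/apache-ant-1.9.5-bin.tar.gz
wget https://archive.apache.org/dist/maven/maven-3/3.0.5/binaries/apache-maven-3.0.5-bin.tar.gz
wget https://archive.apache.org/dist/hadoop/common/hadoop-2.7.0/hadoop-2.7.0-src.tar.gz
wget https://github.com/protocolbuffers/protobuf/releases/download/v2.5.0/protobuf-2.5.0.tar.gz
wget https://cmake.org/files/v2.8/cmake-2.8.6.tar.gz
# FindBugs and the JDK are easiest to grab from their own download pages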
yum -y install lzo-devel zlib-devel gcc autoconf automake libtool
tar zxf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
./configure
這時(shí)候因?yàn)閜rotobuf需要c++支持,如果機(jī)器沒(méi)裝c++會(huì)出現(xiàn)如下錯(cuò)誤:
checking whether to enable maintainer-specific portions of Makefiles... yes
checking build system type... x86_64-unknown-linux-gnu
checking host system type... x86_64-unknown-linux-gnu
checking target system type... x86_64-unknown-linux-gnu
checking for a BSD-compatible install... /usr/bin/install -c
checking whether build environment is sane... yes
checking for a thread-safe mkdir -p... /bin/mkdir -p
checking for gawk... gawk
checking whether make sets $(MAKE)... yes
checking for gcc... gcc
checking whether the C compiler works... yes
checking for C compiler default output file name... a.out
checking for suffix of executables...
checking whether we are cross compiling... no
checking for suffix of object files... o
checking whether we are using the GNU C compiler... yes
checking whether gcc accepts -g... yes
checking for gcc option to accept ISO C89... none needed
checking for style of include used by make... GNU
checking dependency style of gcc... gcc3
checking for g++... no
checking for c++... no
checking for gpp... no
checking for aCC... no
checking for CC... no
checking for cxx... no
checking for cc++... no
checking for cl.exe... no
checking for FCC... no
checking for KCC... no
checking for RCC... no
checking for xlC_r... no
checking for xlC... no
checking whether we are using the GNU C++ compiler... no
checking whether g++ accepts -g... no
checking dependency style of g++... none
checking how to run the C++ preprocessor... /lib/cpp
configure: error: in `/hadoop/protobuf-2.5.0':
configure: error: C++ preprocessor "/lib/cpp" fails sanity check
See `config.log' for more details
At this point you need to install the glibc headers and the C++ compiler:
yum install glibc-headers
yum install gcc-c++
Then go back to the protobuf directory and run ./configure again.
This time it works. So, on we go:
make
make check
make install

tar zxf apache-ant-1.9.2-bin.tar.gz
mv apache-ant-1.9.2 /hadoop/ant192
tar zxf apache-maven-3.0.5-bin.tar.gz
mv apache-maven-3.0.5 /hadoop/maven305
tar zxf findbugs-2.0.2.tar.gz
mv findbugs-2.0.2 /hadoop/findbugs202
tar zxf cmake-2.8.6.tar.gz
cd cmake-2.8.6
./bootstrap; make; make install
cd ..
tar zxf hadoop-2.7.0-src.tar.gz
mv hadoop-2.7.0-src /hadoop/hadoop270_src
chown -R hadoop:hadoop /hadoop/hadoop270_src

vi /etc/profile    # add the following lines:
export ANT_HOME=/hadoop/ant192
export MAVEN_HOME=/hadoop/maven305
export FINDBUGS_HOME=/hadoop/findbugs202
export PATH=${ANT_HOME}/bin:${MAVEN_HOME}/bin:${FINDBUGS_HOME}/bin:$PATH

source /etc/profile
su - hadoop
cd /hadoop/hadoop270_src
mvn clean package -DskipTests -Pdist,native,docs -Dtar
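If the build immediately complains about a missing tool, it can save time to first confirm that the toolchain set up above is actually visible on the PATH (a quick sanity check, run as the hadoop user):

protoc --version    # expect: libprotoc 2.5.0
ant -version
mvn -version
cmake --version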
If this is the first time Maven has been set up on the machine, this step takes quite a while; it's best to point Maven at a nearby mirror first, e.g. as sketched below.
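A minimal sketch of a mirror configuration, written to ~/.m2/settings.xml (the Aliyun public mirror is shown purely as an example; substitute whichever mirror is close to you):

mkdir -p ~/.m2
cat > ~/.m2/settings.xml <<'EOF'
<settings>
  <mirrors>
    <mirror>
      <id>example-mirror</id>
      <mirrorOf>central</mirrorOf>
      <url>https://maven.aliyun.com/repository/public</url>
    </mirror>
  </mirrors>
</settings>
EOF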
Toward the end, the build may fail with this error:
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-pipes: An Ant BuildException has occured: exec returned: 1
[ERROR] around Ant part ...<exec failonerror="true" dir="/home/hadoop/app/hadoop270_src/hadoop-tools/hadoop-pipes/target/native" executable="cmake">... @ 5:124 in /home/hadoop/app/hadoop270_src/hadoop-tools/hadoop-pipes/target/antrun/build-main.xml
The cause is that the zlib and OpenSSL development headers are not installed (zlib1g-dev and libssl-dev on Debian-style systems; zlib-devel and openssl-devel on CentOS); compiling the native libraries needs both. zlib-devel was already pulled in by the earlier yum line, so only openssl-devel is missing.
The fix:
yum install openssl-devel
Then run the build again:
mvn clean package -DskipTests -Pdist,native,docs -Dtar
Note: under JDK 1.8, this error may appear:
[WARNING] The requested profile "native" could not be activated because it does not exist.
[WARNING] The requested profile "docs" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (dist) on project hadoop-dist: An Ant BuildException has occured: exec returned: 1
[ERROR] around Ant part ...<exec failonerror="true" dir="/home/hadoop/app/hadoop270_src/hadoop-dist/target" executable="sh">... @ 38:100 in /home/hadoop/app/hadoop270_src/hadoop-dist/target/antrun/build-main.xml
The fix: switch from JDK 1.8 to JDK 1.7 and rebuild, for example:
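A minimal sketch of switching the JDK, assuming JDK 1.7 was unpacked to /hadoop/app/jdk1.7.0_79 (the path is an assumption; use wherever your 1.7 JDK actually lives):

export JAVA_HOME=/hadoop/app/jdk1.7.0_79
export PATH=$JAVA_HOME/bin:$PATH
java -version    # should now report 1.7.0_79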
With that, the build succeeds:
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 25:22.002s
[INFO] Finished at: Tue Jul 07 21:20:38 PDT 2015
[INFO] Final Memory: 131M/405M
[INFO] ------------------------------------------------------------------------
[hadoop@vm1 hadoop270_src]$ ls
BUILDING.txt           hadoop-dist               hadoop-project       NOTICE.txt
dev-support            hadoop-hdfs-project       hadoop-project-dist  pom.xml
hadoop-assemblies      hadoop-mapreduce-project  hadoop-tools         README.txt
hadoop-client          hadoop-maven-plugins      hadoop-yarn-project
hadoop-common-project  hadoop-minicluster        LICENSE.txt
[hadoop@vm1 hadoop270_src]$ cd hadoop-dist/
[hadoop@vm1 hadoop-dist]$ ls
pom.xml  target
[hadoop@vm1 hadoop-dist]$ cd target/
[hadoop@vm1 target]$ ls
antrun                    hadoop-2.7.0         hadoop-dist-2.7.0-javadoc.jar  test-dir
dist-layout-stitching.sh  hadoop-2.7.0.tar.gz  javadoc-bundle-options
dist-tar-stitching.sh     hadoop-dist-2.7.0.jar  maven-archiver
[hadoop@vm1 target]$ pwd
/hadoop/app/hadoop270_src/hadoop-dist/target
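The rebuilt distribution, freshly compiled native libraries included, is what sits under hadoop-dist/target/hadoop-2.7.0. If you already have a configured installation and only want the native libraries, copying lib/native across should be enough; this is a sketch of my own, not a step from the original walkthrough, assuming the paths from the output above and an existing install at /home/hadoop/app/hadoop-2.7.0:

# inspect the rebuilt native libraries
ls /hadoop/app/hadoop270_src/hadoop-dist/target/hadoop-2.7.0/lib/native
# optionally drop them into an existing installation instead of redeploying everything
cp -r /hadoop/app/hadoop270_src/hadoop-dist/target/hadoop-2.7.0/lib/native/* \
      /home/hadoop/app/hadoop-2.7.0/lib/native/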
Set up the environment with the freshly built Hadoop package, and starting HDFS no longer shows the "Unable to load native-hadoop library for your platform... using builtin-java classes where applicable" warning:
[hadoop@vm1 hadoop-2.7.0]$ ./sbin/start-dfs.sh
Starting namenodes on [vm1]
vm1: starting namenode, logging to /home/hadoop/app/hadoop-2.7.0/logs/hadoop-hadoop-namenode-vm1.out
vm1: starting datanode, logging to /home/hadoop/app/hadoop-2.7.0/logs/hadoop-hadoop-datanode-vm1.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /home/hadoop/app/hadoop-2.7.0/logs/hadoop-hadoop-secondarynamenode-vm1.out
[hadoop@vm1 hadoop-2.7.0]$ ./sbin/start-yarn.sh
starting yarn daemons
starting resourcemanager, logging to /home/hadoop/app/hadoop-2.7.0/logs/yarn-hadoop-resourcemanager-vm1.out
vm1: starting nodemanager, logging to /home/hadoop/app/hadoop-2.7.0/logs/yarn-hadoop-nodemanager-vm1.out
[hadoop@vm1 hadoop-2.7.0]$ jps
3251 NodeManager
3540 Jps
3145 ResourceManager
2699 NameNode
2828 DataNode
2991 SecondaryNameNode
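To double-check that the native libraries are actually being picked up, Hadoop ships a checknative tool; each native component is reported as true or false along with the library path it found (codecs that weren't compiled in, e.g. snappy, will still show false, which is expected):

[hadoop@vm1 hadoop-2.7.0]$ ./bin/hadoop checknative -a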
That concludes this walkthrough of recompiling the Hadoop source code on CentOS.