This article walks through the pitfalls of compiling the Hadoop 2.9.2 source on Linux and how to work around them. The explanations are kept simple and step by step, so follow along and dig in.
The compilation process begins.
Environment: CentOS 7 (running in VMware Fusion on a Mac).
1. Download the Hadoop source package. (A quick gripe about Baidu: searching for "Hadoop" returns nothing but ads; the lowercase "hadoop" does slightly better, with only the first result being an ad.)
The 2.9.2 source package is the current stable release.
2. Extract.
Do some simple planning and create the directories:
[root@nancycici bin]# mkdir /opt/sourcecode
[root@nancycici bin]# mkdir /opt/software
[root@nancycici bin]# cd /opt/sourcecode
Upload the tarball with the rz shell tool:
[root@nancycici ~]# rz
(Of course the tool has to be installed first: yum -y install lrzsz)
[root@nancycici sourcecode]# ls
hadoop-2.9.2-src  hadoop-2.9.2-src.tar  hadoop-3.1.2-src  hadoop-3.1.2-src.tar
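If rz is inconvenient (for example over a plain SSH session), the source tarball can also be fetched directly on the VM. The mirror path below is an assumption based on the usual Apache archive layout, so verify it before relying on it:

```shell
# Build the download URL for the Hadoop source tarball.
# The archive.apache.org path is an assumption -- verify it before use.
HADOOP_VERSION=2.9.2
SRC_URL="https://archive.apache.org/dist/hadoop/common/hadoop-${HADOOP_VERSION}/hadoop-${HADOOP_VERSION}-src.tar.gz"
echo "$SRC_URL"
# On the build host, this would be followed by:
#   cd /opt/sourcecode && wget "$SRC_URL" && tar -xzf "hadoop-${HADOOP_VERSION}-src.tar.gz"
```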
3. Check BUILDING.txt for the build prerequisites, then install each item on the Requirements list in turn.
[root@nancycici hadoop-2.9.2-src]# vi /opt/sourcecode/hadoop-2.9.2-src/BUILDING.txt

Requirements:
* Unix System
* JDK 1.7 or 1.8
* Maven 3.0 or later
* Findbugs 1.3.9 (if running findbugs)
* ProtocolBuffer 2.5.0
* CMake 2.6 or newer (if compiling native code), must be 3.0 or newer on Mac
* Zlib devel (if compiling native code)
* openssl devel (if compiling native hadoop-pipes and to get the best HDFS encryption performance)
* Linux FUSE (Filesystem in Userspace) version 2.6 or above (if compiling fuse_dfs)
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)
* python (for releasedocs)
* Node.js / bower / Ember-cli (for YARN UI v2 building)
* JDK 1.7 or 1.8
Download JDK 1.8 from the official site, install it, and configure the environment variables:
[root@nancycici ~]# mkdir /usr/java
[root@nancycici ~]# mv jdk-8u201-linux-x64.rpm /usr/java/
[root@nancycici ~]# cd /usr/java/
[root@nancycici java]# ll
total 261376
-r--------. 1 root root 176209195 Mar 27 18:10 jdk-8u201-linux-x64.rpm
[root@nancycici java]# rpm -ivh jdk-8u201-linux-x64.rpm
[root@nancycici java]# mv jdk1.8.0_201-amd64 jdk1.8.0
[root@nancycici java]# vi /etc/profile
export JAVA_HOME=/usr/java/jdk1.8.0
export PATH=$JAVA_HOME/bin:$PATH
Reload the environment variables and verify:
[root@nancycici java]# source /etc/profile
[root@nancycici java]# which java
/usr/java/jdk1.8.0/bin/java
[root@nancycici java]# java -version
java version "1.8.0_201"
Java(TM) SE Runtime Environment (build 1.8.0_201-b09)
Java HotSpot(TM) 64-Bit Server VM (build 25.201-b09, mixed mode)
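Since /etc/profile gets edited several more times during this build, a small helper that appends an export only when it is not already present avoids piling up duplicate lines across retries. This is a sketch demonstrated on a temp file; on the real host you would set PROFILE=/etc/profile:

```shell
# Append export lines to a profile file only when missing, so repeated
# runs don't create duplicates. Demonstrated on a temp file; on the
# build host set PROFILE=/etc/profile instead.
PROFILE="$(mktemp)"
add_export() {
  grep -qxF "$1" "$PROFILE" || echo "$1" >> "$PROFILE"
}
add_export 'export JAVA_HOME=/usr/java/jdk1.8.0'
add_export 'export PATH=$JAVA_HOME/bin:$PATH'
add_export 'export JAVA_HOME=/usr/java/jdk1.8.0'   # exact duplicate, skipped
cat "$PROFILE"
```

The duplicate third call is silently skipped, so the file ends up with exactly two lines.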
Next up, Maven.
* Maven 3.0 or later
[root@nancycici software]# tar -xvf apache-maven-3.6.0-bin.tar
[root@nancycici protobuf]# vi /etc/profile
export MAVEN_HOME=/opt/software/apache-maven-3.6.0
export PATH=$MAVEN_HOME/bin:$JAVA_HOME/bin:$PATH
[root@nancycici protobuf]# source /etc/profile
[root@nancycici ~]# mvn -version
Apache Maven 3.6.0 (97c98ec64a1fdfee7767ce5ffb20918da4f719f3; 2018-10-25T02:41:47+08:00)
Maven home: /opt/software/apache-maven-3.6.0
Java version: 1.8.0_201, vendor: Oracle Corporation, runtime: /usr/java/jdk1.8.0/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "3.10.0-514.el7.x86_64", arch: "amd64", family: "unix"
Next, Findbugs. (I first downloaded Findbugs 3.0.1 from the official site; after the build ran into problems I switched to 1.3.9. Better to stick to what BUILDING.txt says.)
* Findbugs 1.3.9 (if running findbugs)
[root@nancycici software]# tar -xvf findbugs-1.3.9.tar
export FINDBUGS_HOME=/opt/software/findbugs-1.3.9
export PATH=$FINDBUGS_HOME/bin:$MAVEN_HOME/bin:$JAVA_HOME/bin:$PATH
[root@nancycici protobuf]# source /etc/profile
Next, ProtocolBuffer, which has to be built from source:
* ProtocolBuffer 2.5.0
CMake can be installed directly from yum:
* CMake 2.6 or newer (if compiling native code), must be 3.0 or newer on Mac
[root@nancycici software]# tar -xvf protobuf-2.5.0.tar
[root@nancycici protobuf-2.5.0]# yum install -y gcc gcc-c++ make cmake
[root@nancycici protobuf-2.5.0]# ./configure --prefix=/usr/local/protobuf
[root@nancycici protobuf-2.5.0]# make && make install
[root@nancycici protobuf]# vi /etc/profile
export PROTOC_HOME=/usr/local/protobuf
export PATH=$FINDBUGS_HOME/bin:$PROTOC_HOME/bin:$MAVEN_HOME/bin:$JAVA_HOME/bin:$PATH
[root@nancycici protobuf]# source /etc/profile
[root@nancycici protobuf]# which protoc
/usr/local/protobuf/bin/protoc
[root@nancycici software]# cmake -version
cmake version 2.8.12.2
The rest can be installed directly with yum:
[root@nancycici software]# yum install -y openssl openssl-devel svn ncurses-devel zlib-devel libtool
[root@nancycici software]# yum install -y snappy-devel bzip2 bzip2-devel lzo lzo-devel lzop autoconf automake
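Before kicking off a multi-hour compile, it is worth a quick preflight pass to confirm every prerequisite is actually reachable on PATH (a lesson from the gcc/gcc-c++ surprise later in this article). A minimal sketch:

```shell
# Preflight check: report which build prerequisites resolve on PATH.
# A MISSING line means the install or the PATH export didn't take effect.
check_tool() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "OK      $1"
  else
    echo "MISSING $1"
  fi
}
for t in java mvn protoc cmake gcc g++ make findbugs; do
  check_tool "$t"
done
```

Run it after every `source /etc/profile`; anything reported MISSING will fail the build much later and far less clearly.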
4. Now the build can start (a reliable network connection matters):
[root@nancycici software]# cd /opt/sourcecode
[root@nancycici sourcecode]# ls
hadoop-2.9.2-src  hadoop-2.9.2-src.tar  hadoop-3.1.2-src  hadoop-3.1.2-src.tar
[root@nancycici sourcecode]# cd hadoop-2.9.2-src/
[root@nancycici hadoop-2.9.2-src]# mvn clean package -Pdist,native -DskipTests -Dtar
Here come the pitfalls.
Compilation can take anywhere from an hour to several days, because it keeps failing in various ways, for example:
[INFO] Apache Hadoop Auth ................................. FAILURE
[INFO] Apache Hadoop Common ............................... FAILURE
[INFO] Apache Hadoop Common Project ....................... FAILURE
[INFO] Apache Hadoop HDFS Client .......................... FAILURE
I did not record the exact errors, but most were network failures reaching Amazon's repositories; some passed on a simple rebuild. When the build keeps getting stuck at the same spot, something else is probably wrong. The Apache Hadoop Auth and Apache Hadoop Common modules failed repeatedly here, and from the errors I guessed some packages had not installed cleanly. Reinstalling gcc and gcc-c++ with yum confirmed they had indeed not installed successfully the first time. These packages are worth re-checking, and after each install make sure you actually see it succeed.
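When a run dies partway through on a transient network error, Maven does not have to redo the whole reactor: the `-rf` (`--resume-from`) option restarts from a given module, and Maven prints the exact flag to use in its failure output. The module name below is illustrative, taken from the Auth failure above:

```shell
# Resume a failed reactor build from a specific module instead of
# starting over. The module name comes from the hint Maven prints on
# failure, e.g. "[ERROR] ... mvn <args> -rf :hadoop-auth".
RESUME_FROM=":hadoop-auth"   # illustrative; copy the name from your own error output
echo "mvn package -Pdist,native -DskipTests -Dtar -rf ${RESUME_FROM}"
```

Note that `clean` is deliberately dropped when resuming, since the point is to keep the modules that already built.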
Maven-related errors also showed up during this period,
[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:3.1.2:
roughly like the above. Adding this to the environment variables helped:
export MAVEN_OPTS="-Xms256m -Xmx512m"
In the end all of these passed.
[INFO] Apache Hadoop Amazon Web Services support .......... FAILURE
This one was stuck for a very long time, and it was clearly not a network problem:
Failed to collect dependencies at com.amazonaws:DynamoDBLocal:jar:[1.11.86,2.
This error in particular appeared many times.
I consulted the AWS documentation:
https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/DynamoDBLocal.Maven.html
as well as this blog post:
https://blog.csdn.net/galiniur0u/article/details/80669408
To use DynamoDB in your application as a dependency:
Download and install Apache Maven. For more information, see Downloading Apache Maven and Installing Apache Maven .
Add the DynamoDB Maven repository to your application's Project Object Model (POM) file:
<!--Dependency:-->
<dependencies>
  <dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>DynamoDBLocal</artifactId>
    <version>[1.11,2.0)</version>
  </dependency>
</dependencies>
<!--Custom repository:-->
<repositories>
  <repository>
    <id>dynamodb-local-oregon</id>
    <name>DynamoDB Local Release Repository</name>
    <url>https://s3-us-west-2.amazonaws.com/dynamodb-local/release</url>
  </repository>
</repositories>
Add the dependencies and repositories sections to pom.xml:
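Before hand-editing a project POM it pays to keep a backup copy, which is exactly where a pom.xml.bk file comes from. A minimal sketch, demonstrated in a temp directory (on the build host you would cd into hadoop-project and skip creating the dummy file):

```shell
# Back up pom.xml before editing; cp -n refuses to overwrite an
# existing backup, so re-running this never clobbers the original copy.
# Demonstrated in a temp dir with a dummy POM.
workdir="$(mktemp -d)"
cd "$workdir"
printf '<project/>\n' > pom.xml    # stand-in for the real POM
cp -n pom.xml pom.xml.bk
ls
```

If an edit goes wrong, `cp pom.xml.bk pom.xml` restores the pristine file.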
[root@nancycici hadoop-project]# pwd
/opt/sourcecode/hadoop-2.9.2-src/hadoop-project
[root@nancycici hadoop-project]# ls
pom.xml  pom.xml.bk  src  target
[root@nancycici hadoop-project]# vi pom.xml
However, it still failed after that.
At this point I installed a different Java build via yum (java-1.8.0-openjdk-1.8.0.102-4.b14.el7.x86_64) and switched to it.
[root@nancycici jvm]# pwd
/usr/lib/jvm
[root@nancycici jvm]# ls
java-1.7.0-openjdk-1.7.0.211-2.6.17.1.el7_6.x86_64
java-1.8.0-openjdk-1.8.0.102-4.b14.el7.x86_64
jre
jre-1.7.0
jre-1.7.0-openjdk
jre-1.7.0-openjdk-1.7.0.211-2.6.17.1.el7_6.x86_64
jre-1.8.0
jre-1.8.0-openjdk-1.8.0.102-4.b14.el7.x86_64
jre-openjdk
最后設(shè)置的環(huán)境變量
#export JAVA_HOME=/usr/java/jdk1.8.0
export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.102-4.b14.el7.x86_64
export MAVEN_HOME=/opt/software/apache-maven-3.6.0
export PROTOC_HOME=/usr/local/protobuf
export MAVEN_OPTS="-Xms256m -Xmx512m"
export FINDBUGS_HOME=/opt/software/findbugs-1.3.9
export PATH=$FINDBUGS_HOME/bin:$PROTOC_HOME/bin:$MAVEN_HOME/bin:$JAVA_HOME/bin:$JAVA_HOME/jre/bin:$PATH
Then I shut down the firewall:
systemctl stop firewalld.service      # stop firewalld
systemctl disable firewalld.service   # keep firewalld from starting at boot
After five days of assorted ordeals, the build finally succeeded:
[INFO] Reactor Summary for Apache Hadoop Main 2.9.2:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [ 6.019 s]
[INFO] Apache Hadoop Build Tools .......................... SUCCESS [ 4.249 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [ 4.580 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [ 7.920 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [ 0.603 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [ 6.075 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [ 11.859 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [ 10.455 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [ 13.511 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [ 7.333 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [02:09 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [ 13.265 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [ 19.103 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [ 0.250 s]
[INFO] Apache Hadoop HDFS Client .......................... SUCCESS [ 34.898 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [01:30 min]
[INFO] Apache Hadoop HDFS Native Client ................... SUCCESS [ 5.592 s]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 28.792 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [ 13.432 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [ 7.642 s]
[INFO] Apache Hadoop HDFS-RBF ............................. SUCCESS [ 41.155 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [ 0.096 s]
[INFO] Apache Hadoop YARN ................................. SUCCESS [ 0.109 s]
[INFO] Apache Hadoop YARN API ............................. SUCCESS [ 19.196 s]
[INFO] Apache Hadoop YARN Common .......................... SUCCESS [ 50.308 s]
[INFO] Apache Hadoop YARN Registry ........................ SUCCESS [ 9.489 s]
[INFO] Apache Hadoop YARN Server .......................... SUCCESS [ 0.101 s]
[INFO] Apache Hadoop YARN Server Common ................... SUCCESS [ 17.042 s]
[INFO] Apache Hadoop YARN NodeManager ..................... SUCCESS [ 18.634 s]
[INFO] Apache Hadoop YARN Web Proxy ....................... SUCCESS [ 4.028 s]
[INFO] Apache Hadoop YARN ApplicationHistoryService ....... SUCCESS [ 9.632 s]
[INFO] Apache Hadoop YARN Timeline Service ................ SUCCESS [ 6.423 s]
[INFO] Apache Hadoop YARN ResourceManager ................. SUCCESS [ 33.567 s]
[INFO] Apache Hadoop YARN Server Tests .................... SUCCESS [ 2.233 s]
[INFO] Apache Hadoop YARN Client .......................... SUCCESS [ 8.271 s]
[INFO] Apache Hadoop YARN SharedCacheManager .............. SUCCESS [ 5.145 s]
[INFO] Apache Hadoop YARN Timeline Plugin Storage ......... SUCCESS [ 4.856 s]
[INFO] Apache Hadoop YARN Router .......................... SUCCESS [ 6.291 s]
[INFO] Apache Hadoop YARN TimelineService HBase Backend ... SUCCESS [ 10.117 s]
[INFO] Apache Hadoop YARN Timeline Service HBase tests .... SUCCESS [ 4.253 s]
[INFO] Apache Hadoop YARN Applications .................... SUCCESS [ 0.062 s]
[INFO] Apache Hadoop YARN DistributedShell ................ SUCCESS [ 4.325 s]
[INFO] Apache Hadoop YARN Unmanaged Am Launcher ........... SUCCESS [ 2.326 s]
[INFO] Apache Hadoop YARN Site ............................ SUCCESS [ 0.106 s]
[INFO] Apache Hadoop YARN UI .............................. SUCCESS [ 0.049 s]
[INFO] Apache Hadoop YARN Project ......................... SUCCESS [ 7.882 s]
[INFO] Apache Hadoop MapReduce Client ..................... SUCCESS [ 0.271 s]
[INFO] Apache Hadoop MapReduce Core ....................... SUCCESS [ 34.480 s]
[INFO] Apache Hadoop MapReduce Common ..................... SUCCESS [ 24.368 s]
[INFO] Apache Hadoop MapReduce Shuffle .................... SUCCESS [ 6.248 s]
[INFO] Apache Hadoop MapReduce App ........................ SUCCESS [ 11.452 s]
[INFO] Apache Hadoop MapReduce HistoryServer .............. SUCCESS [ 7.556 s]
[INFO] Apache Hadoop MapReduce JobClient .................. SUCCESS [ 5.185 s]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ...... SUCCESS [ 2.128 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [ 7.005 s]
[INFO] Apache Hadoop MapReduce ............................ SUCCESS [ 3.082 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [ 5.188 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 9.639 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [ 2.798 s]
[INFO] Apache Hadoop Archive Logs ......................... SUCCESS [ 2.937 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [ 9.550 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [ 6.652 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [ 3.858 s]
[INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [ 3.717 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [ 4.609 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [ 1.026 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [ 6.787 s]
[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [ 19.538 s]
[INFO] Apache Hadoop Azure support ........................ SUCCESS [ 9.503 s]
[INFO] Apache Hadoop Aliyun OSS support ................... SUCCESS [ 5.122 s]
[INFO] Apache Hadoop Client ............................... SUCCESS [ 10.356 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [ 2.316 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [ 9.908 s]
[INFO] Apache Hadoop Resource Estimator Service ........... SUCCESS [ 7.750 s]
[INFO] Apache Hadoop Azure Data Lake support .............. SUCCESS [ 14.210 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 21.479 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [ 0.103 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [01:10 min]
[INFO] Apache Hadoop Cloud Storage ........................ SUCCESS [ 1.427 s]
[INFO] Apache Hadoop Cloud Storage Project ................ SUCCESS [ 0.078 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 17:00 min
[INFO] Finished at: 2019-04-19T09:15:30+08:00
[INFO] ------------------------------------------------------------------------
[root@nancycici hadoop-2.9.2-src]#
Path of the finished package:
main:
     [exec] $ tar cf hadoop-2.9.2.tar hadoop-2.9.2
     [exec] $ gzip -f hadoop-2.9.2.tar
     [exec]
     [exec] Hadoop dist tar available at: /opt/sourcecode/hadoop-2.9.2-src/hadoop-dist/target/hadoop-2.9.2.tar.gz
     [exec]
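With the distribution tarball in hand, a quick sanity check is to unpack it and ask the new binaries for their version and native-library status. The commands are only echoed here as a sketch; the /opt/software extraction target follows this article's directory layout, and `hadoop checknative` is the standard way to confirm the `-Pnative` profile actually produced working native libraries:

```shell
# Sketch of post-build verification. DIST comes from the build output
# above; /opt/software is this article's chosen install location.
DIST=/opt/sourcecode/hadoop-2.9.2-src/hadoop-dist/target/hadoop-2.9.2.tar.gz
echo "tar -xzf ${DIST} -C /opt/software"
echo "/opt/software/hadoop-2.9.2/bin/hadoop version"
echo "/opt/software/hadoop-2.9.2/bin/hadoop checknative -a"
```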
These fixes will not suit every setup, but if you hit the same errors they are worth a try.