This article explains how to recompile the Hadoop source code on CentOS. Many people run into questions here in day-to-day work, so I have pulled together a simple, workable procedure that I hope clears them up. Let's walk through it.
Searching online for the cause of the warning "Unable to load native-hadoop library for your platform... using builtin-java classes where applicable", the usual explanation is that the C libraries the bundled Hadoop native code was compiled against differ in version from the ones on the local machine; recompiling Hadoop in the local environment fixes it.
In practice the warning has little effect on how Hadoop works.
Still, being a programmer with a touch of perfectionism, I tried a few other remedies without success and ended up compiling the source myself.
Switch to the root user.
Download the tarballs for Ant, Maven, Protocol Buffers, FindBugs, and CMake into the /hadoop directory.
The versions I used:
[hadoop@vm1 Downloads]$ ls
apache-ant-1.9.5.tar.gz    findbugs-2.0.2.tar.gz    jdk-8u45-linux-x64.gz
apache-maven-3.0.5.tar.gz  hadoop-2.7.0-src.tar.gz  protobuf-2.5.0
cmake-2.8.6                hadoop-2.7.0.tar.gz      protobuf-2.5.0.tar.gz
cmake-2.8.6.tar.gz         jdk-7u79-linux-x64.gz
yum -y install lzo-devel zlib-devel gcc autoconf automake libtool
tar zxf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
./configure
protobuf needs C++ support, so if the machine has no C++ compiler installed, configure fails like this:
checking whether to enable maintainer-specific portions of Makefiles... yes
checking build system type... x86_64-unknown-linux-gnu
checking host system type... x86_64-unknown-linux-gnu
checking target system type... x86_64-unknown-linux-gnu
checking for a BSD-compatible install... /usr/bin/install -c
checking whether build environment is sane... yes
checking for a thread-safe mkdir -p... /bin/mkdir -p
checking for gawk... gawk
checking whether make sets $(MAKE)... yes
checking for gcc... gcc
checking whether the C compiler works... yes
checking for C compiler default output file name... a.out
checking for suffix of executables...
checking whether we are cross compiling... no
checking for suffix of object files... o
checking whether we are using the GNU C compiler... yes
checking whether gcc accepts -g... yes
checking for gcc option to accept ISO C89... none needed
checking for style of include used by make... GNU
checking dependency style of gcc... gcc3
checking for g++... no
checking for c++... no
checking for gpp... no
checking for aCC... no
checking for CC... no
checking for cxx... no
checking for cc++... no
checking for cl.exe... no
checking for FCC... no
checking for KCC... no
checking for RCC... no
checking for xlC_r... no
checking for xlC... no
checking whether we are using the GNU C++ compiler... no
checking whether g++ accepts -g... no
checking dependency style of g++... none
checking how to run the C++ preprocessor... /lib/cpp
configure: error: in `/hadoop/protobuf-2.5.0':
configure: error: C++ preprocessor "/lib/cpp" fails sanity check
See `config.log' for more details
Fix it by installing the glibc headers and the C++ compiler:

yum install glibc-headers
yum install gcc-c++
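Before re-running configure, it can save a long wait to confirm the compiler is actually visible now. A minimal sketch (the messages are my own, not output from the protobuf build):

```shell
# Sanity check: is a C++ compiler on PATH now?
# Mirrors the probe that protobuf's ./configure performs.
if command -v g++ >/dev/null 2>&1; then
    echo "g++ available: $(g++ --version | head -n 1)"
else
    echo "g++ still missing - install it with: yum install gcc-c++"
fi
```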
Then run ./configure in the protobuf directory again; this time it succeeds, so carry on:
make
make check
make install
tar zxf apache-ant-1.9.2-bin.tar.gz
mv apache-ant-1.9.2 /hadoop/ant192
tar zxf apache-maven-3.0.5-bin.tar.gz
mv apache-maven-3.0.5 /hadoop/maven305
tar zxf findbugs-2.0.2.tar.gz
mv findbugs-2.0.2 /hadoop/findbugs202
tar zxf cmake-2.8.6.tar.gz
cd cmake-2.8.6
./bootstrap; make; make install
cd ..
tar zxf hadoop-2.7.0-src.tar.gz
mv hadoop-2.7.0-src /hadoop/hadoop270_src
chown -R hadoop:hadoop /hadoop/hadoop270_src
vi /etc/profile
export ANT_HOME=/hadoop/ant192
export MAVEN_HOME=/hadoop/maven305
export FINDBUGS_HOME=/hadoop/findbugs202
export PATH=${ANT_HOME}/bin:${MAVEN_HOME}/bin:${FINDBUGS_HOME}/bin:$PATH
source /etc/profile
su - hadoop
cd /hadoop/hadoop270_src
mvn clean package -DskipTests -Pdist,native,docs -Dtar
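Before kicking off the long Maven build, a quick pre-flight loop over the required tools can catch a missing PATH entry early. A small sketch (the tool list and messages are my own, not part of the Hadoop build):

```shell
# Pre-flight check: report which build tools are visible on PATH.
# Prints exactly one line per tool, found or missing.
for tool in mvn ant protoc cmake findbugs gcc g++; do
    if command -v "$tool" >/dev/null 2>&1; then
        echo "found:   $tool ($(command -v "$tool"))"
    else
        echo "MISSING: $tool"
    fi
done
```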
If this is the first time Maven runs on this machine, this step takes quite a while; configuring a Maven mirror first is recommended.
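The mirror goes in ~/.m2/settings.xml. A minimal sketch, using Aliyun's public mirror of Maven Central as an example (any Central mirror works; note this overwrites any existing settings.xml):

```shell
# Write a minimal ~/.m2/settings.xml that redirects 'central' to a mirror.
# WARNING: overwrites an existing settings.xml.
mkdir -p "$HOME/.m2"
cat > "$HOME/.m2/settings.xml" <<'EOF'
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0">
  <mirrors>
    <mirror>
      <id>example-mirror</id>
      <mirrorOf>central</mirrorOf>
      <url>https://maven.aliyun.com/repository/public</url>
    </mirror>
  </mirrors>
</settings>
EOF
```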
The build may fail near the end with this error:
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-pipes: An Ant BuildException has occured: exec returned: 1
[ERROR] around Ant part ...... @ 5:124 in /home/hadoop/app/hadoop270_src/hadoop-tools/hadoop-pipes/target/antrun/build-main.xml
This happens when the zlib and OpenSSL development headers are missing; building the native libraries needs both. (zlib1g-dev and libssl-dev are the Debian/Ubuntu package names; on CentOS the equivalents are zlib-devel, installed earlier, and openssl-devel.)
Fix:
yum install openssl-devel
Then rerun:
mvn clean package -DskipTests -Pdist,native,docs -Dtar
Note: under JDK 1.8 you may hit this error:
[WARNING] The requested profile "native" could not be activated because it does not exist.
[WARNING] The requested profile "docs" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (dist) on project hadoop-dist: An Ant BuildException has occured: exec returned: 1
[ERROR] around Ant part ...... @ 38:100 in /home/hadoop/app/hadoop270_src/hadoop-dist/target/antrun/build-main.xml
Fix: switch the build JDK from 1.8 to 1.7 (the Hadoop 2.7.0 build is not fully JDK 8 clean).
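Switching means pointing JAVA_HOME and PATH at a JDK 7 install before rerunning Maven. A sketch; the install path below matches the jdk-7u79 tarball from the download list but is an assumption about where you unpacked it:

```shell
# Point the build at JDK 7 (path is hypothetical - adjust to your install).
export JAVA_HOME=/usr/java/jdk1.7.0_79
export PATH="$JAVA_HOME/bin:$PATH"
echo "building with JAVA_HOME=$JAVA_HOME"
```

Then rerun `mvn clean package -DskipTests -Pdist,native,docs -Dtar`.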
With that, the build succeeds:
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 25:22.002s
[INFO] Finished at: Tue Jul 07 21:20:38 PDT 2015
[INFO] Final Memory: 131M/405M
[INFO] ------------------------------------------------------------------------
[hadoop@vm1 hadoop270_src]$ ls
BUILDING.txt           hadoop-dist               hadoop-project       NOTICE.txt
dev-support            hadoop-hdfs-project       hadoop-project-dist  pom.xml
hadoop-assemblies      hadoop-mapreduce-project  hadoop-tools         README.txt
hadoop-client          hadoop-maven-plugins      hadoop-yarn-project
hadoop-common-project  hadoop-minicluster        LICENSE.txt
[hadoop@vm1 hadoop270_src]$ cd hadoop-dist/
[hadoop@vm1 hadoop-dist]$ ls
pom.xml  target
[hadoop@vm1 hadoop-dist]$ cd target/
[hadoop@vm1 target]$ ls
antrun                    hadoop-2.7.0           hadoop-dist-2.7.0-javadoc.jar  test-dir
dist-layout-stitching.sh  hadoop-2.7.0.tar.gz    javadoc-bundle-options
dist-tar-stitching.sh     hadoop-dist-2.7.0.jar  maven-archiver
[hadoop@vm1 target]$ pwd
/hadoop/app/hadoop270_src/hadoop-dist/target
Set up the environment with the freshly built Hadoop package; starting HDFS no longer prints the "Unable to load native-hadoop library for your platform... using builtin-java classes where applicable" warning (you can also inspect the native libraries with `hadoop checknative -a`):
[hadoop@vm1 hadoop-2.7.0]$ ./sbin/start-dfs.sh
Starting namenodes on [vm1]
vm1: starting namenode, logging to /home/hadoop/app/hadoop-2.7.0/logs/hadoop-hadoop-namenode-vm1.out
vm1: starting datanode, logging to /home/hadoop/app/hadoop-2.7.0/logs/hadoop-hadoop-datanode-vm1.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /home/hadoop/app/hadoop-2.7.0/logs/hadoop-hadoop-secondarynamenode-vm1.out
[hadoop@vm1 hadoop-2.7.0]$ ./sbin/start-yarn.sh
starting yarn daemons
starting resourcemanager, logging to /home/hadoop/app/hadoop-2.7.0/logs/yarn-hadoop-resourcemanager-vm1.out
vm1: starting nodemanager, logging to /home/hadoop/app/hadoop-2.7.0/logs/yarn-hadoop-nodemanager-vm1.out
[hadoop@vm1 hadoop-2.7.0]$ jps
3251 NodeManager
3540 Jps
3145 ResourceManager
2699 NameNode
2828 DataNode
2991 SecondaryNameNode
That concludes this walkthrough of recompiling the Hadoop source code on CentOS; I hope it has cleared things up. Pairing the theory with hands-on practice is the best way to learn, so go give it a try.
Article title: How to recompile the Hadoop source code on CentOS
Reprint source: http://weahome.cn/article/goosji.html