
sstutu's blog: https://www.aboutyun.com/?70


Compiling and installing Hadoop 2.3.0-cdh5.0.0

Viewed 1034 times · 2014-8-6 13:42

System version: Ubuntu 12.04
1. Install the JDK
This is a 64-bit machine, so download the matching 64-bit JDK from http://www.oracle.com/technetwork/cn/java/javase/downloads/jdk7-downloads-1880260-zhs.html. Pick the appropriate JDK version, extract it, then configure the environment variables:

vi /etc/profile
export JAVA_HOME=/usr/share/jdk1.7.0_45
export PATH=$PATH:$JAVA_HOME/bin
source /etc/profile


Check that the JDK installed correctly: java -version
java version "1.7.0_45"  
Java(TM) SE Runtime Environment (build 1.7.0_45-b18)  
Java HotSpot(TM) 64-Bit Server VM (build 24.45-b08, mixed mode)  


2. Pre-build preparation (Maven)
apt-get install maven
Verify the installation: mvn --version
Apache Maven 3.0.4 
Maven home: /usr/share/maven  
Java version: 1.7.0_45, vendor: Oracle Corporation  
Java home: /usr/share/jdk1.7.0_45/jre
Default locale: en_US, platform encoding: UTF-8  
OS name: "linux", version: "2.6.32-358.el6.x86_64", arch: "amd64", family: "unix"  

3. Compile Hadoop
Go into /home/hadoop/hadoop-2.3.0-cdh5.0.0/src.
If your machine is 32-bit, you can simply download the official pre-built package; the pre-built package will not run on a 64-bit machine.
Maven's overseas servers may be unreachable, so configure a Chinese mirror first. In the Maven directory, edit conf/settings.xml and add the following inside <mirrors></mirrors>, leaving the existing entries alone:
<mirror>
<id>nexus-osc</id>
<mirrorOf>*</mirrorOf>
<name>Nexusosc</name>
<url>http://maven.oschina.net/content/groups/public/</url>
</mirror>
Likewise, add the following inside <profiles></profiles>:
<profile>
<id>jdk-1.7</id>
<activation>
<jdk>1.7</jdk>
</activation>
<repositories>
<repository>
<id>nexus</id>
<name>local private nexus</name>
<url>http://maven.oschina.net/content/groups/public/</url>
<releases>
<enabled>true</enabled>
</releases>
<snapshots>
<enabled>false</enabled>
</snapshots>
</repository>
</repositories>
<pluginRepositories>
<pluginRepository>
<id>nexus</id>
<name>local private nexus</name>
<url>http://maven.oschina.net/content/groups/public/</url>
<releases>
<enabled>true</enabled>
</releases>
<snapshots>
<enabled>false</enabled>
</snapshots>
</pluginRepository>
</pluginRepositories>
</profile>

Building Hadoop 2.3.0 requires protoc 2.5.0, so download protoc version 2.5.0 from https://code.google.com/p/protobuf/downloads/list.
Before compiling and installing protoc, install a few dependencies: gcc, g++ and make (skip any that are already installed; note that on Ubuntu the C++ compiler package is g++, not the RPM-style gcc-c++):

apt-get install gcc
apt-get install g++
apt-get install make

Install protoc:
tar -xjvf protobuf-2.5.0.tar.bz2
cd protobuf-2.5.0  
./configure --prefix=/opt/protoc/  
make && make install  
After installation, configure its environment variables; the process is the same as for the JDK above.
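For example, the /etc/profile additions can look like this (a sketch; /opt/protoc matches the --prefix passed to ./configure above, and the PROTOC_HOME variable name is my own choice):

```shell
# Sketch: expose the protoc built above. /opt/protoc is the
# --prefix used in the ./configure step; PROTOC_HOME is just a
# convenience variable, not required by protoc itself.
export PROTOC_HOME=/opt/protoc
export PATH=$PATH:$PROTOC_HOME/bin
# Sanity check that the bin directory is now on PATH.
echo "$PATH" | grep -q "$PROTOC_HOME/bin" && echo "protoc dir on PATH"
```

After `source /etc/profile`, `protoc --version` should report 2.5.0.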
Don't rush into the build yet, or you will hit all sorts of errors: the cmake, libssl-dev and libncurses5-dev dependencies are needed first (openssl-devel and ncurses-devel are the RPM names for the latter two):
apt-get install cmake
apt-get install libssl-dev
apt-get install libncurses5-dev
Because we are using a Chinese mirror, the build may fail to find avro-maven-plugin-1.7.5-cdh5.0.0.jar, so we have to download it ourselves:
https://repository.cloudera.com/artifactory/repo/org/apache/avro/avro-tools/1.7.5-cdh5.0.0/avro-maven-plugin-1.7.5-cdh5.0.0.jar
Copy avro-maven-plugin-1.7.5-cdh5.0.0.jar into the /root/.m2/repository/org/apache/avro/avro-maven-plugin/1.7.5-cdh5.0.0 directory.
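The copy step can be scripted like this (a sketch; it assumes the jar was downloaded to the current directory, and uses $HOME so it also works when not building as root):

```shell
# Sketch: place the manually downloaded plugin jar where Maven
# looks for it in the local repository.
JAR=avro-maven-plugin-1.7.5-cdh5.0.0.jar
M2_DIR="$HOME/.m2/repository/org/apache/avro/avro-maven-plugin/1.7.5-cdh5.0.0"
mkdir -p "$M2_DIR"
# Copy only if the jar has actually been downloaded.
[ -f "$JAR" ] && cp "$JAR" "$M2_DIR/"
ls -d "$M2_DIR"
```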
OK, now we can start the build:
mvn clean package -Pdist,native -DskipTests -Dtar  

[INFO] ------------------------------------------------------------------------  
[INFO] Reactor Summary:  
[INFO]  
[INFO] Apache Hadoop Main ................................ SUCCESS [3.709s]  
[INFO] Apache Hadoop Project POM ......................... SUCCESS [2.229s]  
[INFO] Apache Hadoop Annotations ......................... SUCCESS [5.270s]  
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.388s]  
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [3.485s]  
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [8.655s]  
[INFO] Apache Hadoop Auth ................................ SUCCESS [7.782s]  
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [5.731s]  
[INFO] Apache Hadoop Common .............................. SUCCESS [1:52.476s]  
[INFO] Apache Hadoop NFS ................................. SUCCESS [9.935s]  
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.110s]  
[INFO] Apache Hadoop HDFS ................................ SUCCESS [1:58.347s]  
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [26.915s]  
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [17.002s]  
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [5.292s]  
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.073s]  
[INFO] hadoop-yarn ....................................... SUCCESS [0.335s]  
[INFO] hadoop-yarn-api ................................... SUCCESS [54.478s]  
[INFO] hadoop-yarn-common ................................ SUCCESS [39.215s]  
[INFO] hadoop-yarn-server ................................ SUCCESS [0.241s]  
[INFO] hadoop-yarn-server-common ......................... SUCCESS [15.601s]  
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [21.566s]  
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [4.754s]  
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [20.625s]  
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [0.755s]  
[INFO] hadoop-yarn-client ................................ SUCCESS [6.748s]  
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.155s]  
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [4.661s]  
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.160s]  
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [36.090s]  
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [2.753s]  
[INFO] hadoop-yarn-site .................................. SUCCESS [0.151s]  
[INFO] hadoop-yarn-project ............................... SUCCESS [4.771s]  
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [24.870s]  
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [3.812s]  
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [15.759s]  
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [6.831s]  
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [8.126s]  
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [2.320s]  
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [9.596s]  
[INFO] hadoop-mapreduce .................................. SUCCESS [3.905s]  
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [7.118s]  
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [11.651s]  
[INFO] Apache Hadoop Archives ............................ SUCCESS [2.671s]  
[INFO] Apache Hadoop Rumen ............................... SUCCESS [10.038s]  
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [6.062s]  
[INFO] Apache Hadoop Data Join ........................... SUCCESS [4.104s]  
[INFO] Apache Hadoop Extras .............................. SUCCESS [4.210s]  
[INFO] Apache Hadoop Pipes ............................... SUCCESS [9.419s]  
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [2.306s]  
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.037s]  
[INFO] Apache Hadoop Distribution ........................ SUCCESS [21.579s]  
[INFO] Apache Hadoop Client .............................. SUCCESS [7.299s]  
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [7.347s]  
[INFO] ------------------------------------------------------------------------  
[INFO] BUILD SUCCESS  
[INFO] ------------------------------------------------------------------------  
[INFO] Total time: 11:53.144s  
[INFO] Finished at: Fri Nov 22 16:58:32 CST 2013  
[INFO] Final Memory: 70M/239M  
[INFO] ------------------------------------------------------------------------  

When you see the output above, the build has finished.

The build output is in /home/hadoop/hadoop-2.3.0-cdh5.0.0/src/hadoop-dist/target/hadoop-2.3.0-cdh5.0.0
[root@localhost bin]# ./hadoop version  
Hadoop 2.2.0  
Subversion Unknown -r Unknown  
Compiled by root on 2013-11-22T08:47Z  
Compiled with protoc 2.5.0  
From source with checksum 79e53ce7994d1628b240f09af91e1af4  
This command was run using /data/hadoop-2.2.0-src/hadoop-dist/target/hadoop-2.2.0/share/hadoop/common/hadoop-common-2.2.0.jar  

This shows the Hadoop version. Next, check the native libraries:
root@dm-001:/home/hadoop/hadoop-2.3.0-cdh5.0.0/src/hadoop-dist/target/hadoop-2.3.0-cdh5.0.0# file lib/native/*
lib/native/libhadoop.a:        current ar archive
lib/native/libhadooppipes.a:   current ar archive
lib/native/libhadoop.so:       symbolic link to `libhadoop.so.1.0.0'
lib/native/libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped
lib/native/libhadooputils.a:   current ar archive
lib/native/libhdfs.a:          current ar archive
lib/native/libhdfs.so:         symbolic link to `libhdfs.so.0.0.0'
lib/native/libhdfs.so.0.0.0:   ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped
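A quick way to confirm the whole set is 64-bit is to filter the `file` output; this is a sketch with a helper of my own (`check_native`), shown here against sample lines modeled on the output above — in practice you would pipe `file lib/native/*` into it:

```shell
# Sketch: succeed only if every shared-object line from `file`
# carries the "ELF 64-bit" marker (symlink lines are ignored).
check_native() {
  ! grep '\.so' | grep -v 'symbolic link' | grep -qv 'ELF 64-bit'
}

# Sample input modeled on the `file lib/native/*` output above.
printf '%s\n' \
  'lib/native/libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64' \
  'lib/native/libhdfs.so.0.0.0:   ELF 64-bit LSB shared object, x86-64' \
  | check_native && echo "all native libs are 64-bit"
```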

