Contents:
- 1. Where is the Apache ServiceMix 7 source code
- 2. How to compile the Apache Hadoop 2.4.0 source code
- 3. How to set up the website source code after installing Apache on a server
- 4. How to compile the Apache Hadoop 2.2.0 source code
Where is the Apache ServiceMix 7 source code
How to compile the Apache Hadoop 2.4.0 source code
Install the JDK
Hadoop is written in Java, so a JDK must be installed in order to compile it.
Download the JDK from the Oracle website and choose jdk-7u45-linux-x64.tar.gz.
Run the following command to extract the JDK:
tar -zxvf jdk-7u45-linux-x64.tar.gz
This produces a directory named jdk1.7.0_45; next, add it to the environment variables.
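The profile entries below point JAVA_HOME at /usr/java/jdk1.7.0_45, so the extracted directory needs to end up under /usr/java. A minimal sketch, assuming that layout (adjust the target directory to your own setup):
# Unpack the JDK under /usr/java so that it matches JAVA_HOME below
mkdir -p /usr/java
tar -zxvf jdk-7u45-linux-x64.tar.gz -C /usr/java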
Run the command vi /etc/profile and add the following lines to the file:
export JAVA_HOME=/usr/java/jdk1.7.0_45
export JAVA_OPTS="-Xms1024m -Xmx1024m"
export CLASSPATH=.:$JAVA_HOME/lib/tools.jar:$JAVA_HOME/lib/dt.jar:$CLASSPATH
export PATH=$JAVA_HOME/bin:$PATH
After saving and closing the file, run the following commands:
source /etc/profile
Run java -version; if the version information is displayed, the setup is correct.
Install Maven
The Hadoop source tree is organized and built with Maven, so Maven must be installed. From the Maven website, download the apache-maven-3.1.0-bin.tar.gz binary archive.
Run the following command to extract Maven:
tar -zxvf apache-maven-3.1.0-bin.tar.gz
This produces a directory named apache-maven-3.1.0; next, add it to the environment variables.
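The profile entries below set MAVEN_HOME to /usr/maven/apache-maven-3.1.0, so the extracted directory needs to sit under /usr/maven. A minimal sketch, assuming that layout (adjust the target directory if yours differs):
# Unpack Maven under /usr/maven so that it matches MAVEN_HOME below
mkdir -p /usr/maven
tar -zxvf apache-maven-3.1.0-bin.tar.gz -C /usr/maven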
Run the command vi /etc/profile and edit it so that it contains the following:
MAVEN_HOME=/usr/maven/apache-maven-3.1.0
export MAVEN_HOME
export PATH=${PATH}:${MAVEN_HOME}/bin
After saving and closing the file, run the following commands:
source /etc/profile
mvn -version
If the Maven version information is displayed, the configuration is correct.
How to set up the website source code after installing Apache on a server
1. Open a terminal and connect to the server: ssh username@ip, then enter the password.
2. Install Apache: yum install httpd
3. Set Apache to start when the server boots: chkconfig --levels 235 httpd on
4. Configure the domain in the Apache configuration file: run vi /etc/httpd/conf/httpd.conf, find ServerName, add "domain:80", then save and exit.
5. Restart Apache: service httpd restart
6. Visit the domain configured in step 4 in a browser. If the "Apache 2 Test Page powered by CentOS" page appears, the configuration was successful.
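As a concrete illustration of step 4, the edited line in /etc/httpd/conf/httpd.conf would look roughly like this (www.example.com stands in for your own domain):
# /etc/httpd/conf/httpd.conf -- replace the example domain with your own
ServerName www.example.com:80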
How to compile the Apache Hadoop 2.2.0 source code
Download the stable 2.2 release from the Hadoop website; the file to get is hadoop-2.2.0-src.tar.gz.
Run the following command to extract the source archive:
tar -zxvf hadoop-2.2.0-src.tar.gz
This produces a directory named hadoop-2.2.0-src. The source code contains a bug that needs to be fixed here: edit the file pom.xml in the directory /usr/local/hadoop-2.2.0-src/hadoop-common-project/hadoop-auth by running the following command
gedit pom.xml
Add the following content below line 55:
<dependency>
  <groupId>org.mortbay.jetty</groupId>
  <artifactId>jetty-util</artifactId>
  <scope>test</scope>
</dependency>
Then save and exit.
The details of this bug are documented upstream; it has been fixed in Hadoop 3, which is still a long way off for us.
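Before launching the full build, you can optionally confirm that the fix compiles by building just the hadoop-auth module. A minimal sketch using standard Maven options (-pl selects the module, -am also builds the modules it depends on), run from the source root:
# Build only hadoop-auth (plus its dependencies) to confirm the pom.xml fix, skipping tests
cd /usr/local/hadoop-2.2.0-src
mvn package -DskipTests -pl hadoop-common-project/hadoop-auth -am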
Now change into the directory /usr/local/hadoop-2.2.0-src and run the command
mvn package -DskipTests -Pdist,native,docs
If you did not carry out step 4, simply remove docs from the command above and the documentation will not be generated.
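For reference, the variant of the command without documentation generation is simply:
# Same build, but skipping the documentation
mvn package -DskipTests -Pdist,native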
The mvn package command downloads the required jar dependencies from the internet and compiles the Hadoop source code. It takes a long time, so feel free to go have a meal in the meantime.
After a long wait, you should see results like the following:
[INFO] Apache Hadoop Main ................................ SUCCESS [6.936s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [4.928s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [9.399s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.871s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [7.981s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [8.965s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [39.748s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [11.081s]
[INFO] Apache Hadoop Common .............................. SUCCESS [10:41.466s]
[INFO] Apache Hadoop NFS ................................. SUCCESS [26.346s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.061s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [12:49.368s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [41.896s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [41.043s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [9.650s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.051s]
[INFO] hadoop-yarn ....................................... SUCCESS [1:22.693s]
[INFO] hadoop-yarn-api ................................... SUCCESS [1:20.262s]
[INFO] hadoop-yarn-common ................................ SUCCESS [1:30.530s]
[INFO] hadoop-yarn-server ................................ SUCCESS [0.177s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [15.781s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [40.800s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [6.099s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [37.639s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [4.516s]
[INFO] hadoop-yarn-client ................................ SUCCESS [25.594s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.286s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [10.143s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.119s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [55.812s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [8.749s]
[INFO] hadoop-yarn-site .................................. SUCCESS [0.524s]
[INFO] hadoop-yarn-project ............................... SUCCESS [16.641s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [40.796s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [7.628s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [24.066s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [13.243s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [16.670s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [3.787s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [17.012s]
[INFO] hadoop-mapreduce .................................. SUCCESS [6.459s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [12.149s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [15.968s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [5.851s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [18.364s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [14.943s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [9.648s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [5.763s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [16.289s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [3.261s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.043s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [56.188s]
[INFO] Apache Hadoop Client .............................. SUCCESS [10.910s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.321s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 40:00.444s
[INFO] Finished at: Thu Dec 26 12:42:24 CST 2013
[INFO] Final Memory: 109M/362M
[INFO] ------------------------------------------------------------------------
That's it; the build is complete.
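The packaged distribution normally ends up under hadoop-dist/target inside the source tree; treat the exact path as an assumption, since the layout can differ between releases:
# Look for the freshly built distribution (path assumed from the default build layout)
ls /usr/local/hadoop-2.2.0-src/hadoop-dist/target/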