------------------------------ Software versions --------------------------------------
RHEL6.8 hadoop2.8.1 apache-maven-3.3.9 findbugs-1.3.9 protobuf-2.5.0.tar.gz jdk-8u45
------------------------------ Software versions ---------------------------------------
1. Hadoop
Broad sense: the Hadoop-centered ecosystem (hadoop, flume, ...)
Narrow sense: Apache Hadoop (hadoop.apache.org)
Hadoop = storage + computation + resource and job scheduling
hadoop1.x
    HDFS        storage
    MapReduce   computation + resource and job scheduling
hadoop2.x (what enterprises use today)
    HDFS        storage
    MapReduce   computation
    YARN        resource and job scheduling platform; compute components all run on YARN
hadoop3.x ???
    EC: Erasure Coding (EC), a new HDFS feature in Hadoop 3 that reduces storage overhead (see the example after this list)
    YARN: ships YARN Timeline Service v.2 so that users and developers can test it and give feedback
    Reworked Hadoop shell scripts
    Shaded Hadoop client jars (classpath isolation)
    Support for opportunistic containers
    MapReduce task-level native optimization
    Support for more than two NameNodes
    Some default service ports changed
    Support for additional filesystem connectors
    Intra-DataNode balancer
    Reworked daemon and task heap management
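For intuition on the EC point: with classic 3-way replication a 1 GB file occupies 3 GB on disk (200% storage overhead), while under a Reed-Solomon RS-6-3 erasure-coding policy every 6 data blocks are protected by 3 parity blocks, so the same file occupies about 1.5 GB (50% overhead) and can still tolerate the loss of any 3 blocks.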
2. Maven installation
blog
2.1 Extract
[root@hadoop1 softwore]# pwd
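The extract command itself is not shown above; a minimal sketch, assuming the apache-maven-3.3.9-bin.tar.gz tarball is in the current directory and using /opt/software as the install root (which matches the Maven home printed in section 3.5):
tar -zxvf apache-maven-3.3.9-bin.tar.gz -C /opt/software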
2.2 Configure the Maven directory (environment variables)
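A sketch of the Maven entries in /etc/profile, assuming the install root above; the $MVN_HOME referenced by section 4.3's PATH line points at this directory:
export MVN_HOME=/opt/software/apache-maven-3.3.9
export PATH=$MVN_HOME/bin:$PATH
source /etc/profile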
2.3 Check the configuration file and extract the prepared local repository
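A sketch of pointing Maven at the pre-seeded repository; the path below is hypothetical, adjust it to wherever the prepared repository archive was unpacked. In $MVN_HOME/conf/settings.xml:
<localRepository>/opt/software/maven_repo</localRepository>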
3. Compiling Hadoop
3.1 Extract the source
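A minimal sketch of this step (the tarball name is the usual one for the 2.8.1 source release; the resulting hadoop-2.8.1-src directory matches the prompts in the later sections):
tar -zxvf hadoop-2.8.1-src.tar.gz
cd hadoop-2.8.1-src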
3.2 Inspect pom.xml
3.3 Read BUILDING.txt
Requirements (build environment):
* Unix System
* JDK 1.7+
* Maven 3.0 or later
* Findbugs 1.3.9 (if running findbugs)
* ProtocolBuffer 2.5.0
* CMake 2.6 or newer (if compiling native code), must be 3.0 or newer on Mac
* Zlib devel (if compiling native code)
* openssl devel (if compiling native hadoop-pipes and to get the best HDFS encryption performance)
* Linux FUSE (Filesystem in Userspace) version 2.6 or above (if compiling fuse_dfs)
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)
3.4 JDK installation
[root@hadoop1 softwore]# tar -zxvf jdk-8u45-linux-x64.gz -C /usr/java
[root@hadoop1 softwore]# ls -ld /usr/java/*
drwxr-xr-x 8 root root 4096 12月 15 2016 /usr/java/djdk1.7.0_79
drwxr-xr-x 8 uucp 143  4096 4月 11 2015 /usr/java/jdk1.8.0_45
drwxr-xr-x 8 uucp 143  4096 10月 7 2015 /usr/java/jdk1.8.0_65
[root@hadoop1 softwore]# vim /etc/profile
export JAVA_HOME=/usr/java/jdk1.8.0_45
[root@hadoop1 softwore]# source /etc/profile
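A quick check that the intended JDK is now the one in use (if java still resolves to a system JDK, add $JAVA_HOME/bin to PATH as well; the PATH line in section 4.3 already includes it):
which java
java -version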
3.5 MAVEN
[root@hadoop000 hadoop-2.8.1-src]# mvn --version
Apache Maven 3.3.9 (bb52d8502b132ec0a5a3f4c09453c07478323dc5; 2015-11-11T00:41:47+08:00)
Maven home: /opt/software/apache-maven-3.3.9
Java version: 1.8.0_45, vendor: Oracle Corporation
Java home: /usr/java/jdk1.8.0_45/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-431.el6.x86_64", arch: "amd64", family: "unix"
[root@hadoop000 hadoop-2.8.1-src]#
3.6 FINDBUGS
[root@hadoop000 findbugs-1.3.9]# findbugs -version
1.3.9
[root@hadoop000 findbugs-1.3.9]#
3.7 PROTOBUF
[root@hadoop000 local]# protoc --version
libprotoc 2.5.0
[root@hadoop000 local]#
3.8 OTHER
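This step installs the remaining native build dependencies from BUILDING.txt (CMake, zlib devel, openssl devel, plus a compiler toolchain). A sketch for RHEL 6 with yum; the package names are the usual EL6 ones and may differ in other repositories:
yum install -y gcc gcc-c++ make cmake zlib-devel openssl-devel autoconf automake libtool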
3.9 Build
mvn clean package -Pdist,native -DskipTests -Dtar
3.10 Explanation
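In the command above, clean removes previous build output, package builds every module, -Pdist,native activates the dist and native profiles (so native libraries such as libhadoop.so get compiled), -DskipTests skips the unit tests, and -Dtar makes the dist profile produce a tarball. On success the binary distribution should land under hadoop-dist/target; the file name below is the expected one for this version:
ls hadoop-dist/target/hadoop-2.8.1.tar.gz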
4. Hadoop deployment
Standalone deployment          no daemons run
Pseudo-distributed deployment  daemons run on 1 node; development
Cluster deployment             daemons run on n nodes; development/production
Downloaded packages: the src package contains source only (no jars) and is small;
the package without "src" (or with "bin") contains the compiled components and is large.
4.1 Extract
tar -xzvf hadoop-2.8.1.tar.gz
chown -R root:root hadoop-2.8.1
4.2 Layout of the extracted directory
[root@hadoop000 hadoop-2.8.1]# ll
total 148
drwxrwxr-x. 2 root root 4096 Jun 2 2017 bin
drwxrwxr-x. 3 root root 4096 Jun 2 2017 etc
drwxrwxr-x. 2 root root 4096 Jun 2 2017 include
drwxrwxr-x. 3 root root 4096 Jun 2 2017 lib
drwxrwxr-x. 2 root root 4096 Jun 2 2017 libexec
-rw-rw-r--. 1 root root 99253 Jun 2 2017 LICENSE.txt
-rw-rw-r--. 1 root root 15915 Jun 2 2017 NOTICE.txt
-rw-r--r--. 1 root root 1366 Jun 2 2017 README.txt
drwxrwxr-x. 2 root root 4096 Jun 2 2017 sbin
drwxrwxr-x. 4 root root 4096 Jun 2 2017 share
[root@hadoop000 hadoop-2.8.1]#
bin    shell scripts for running commands
etc    configuration files
lib    libraries
sbin   scripts to start and stop Hadoop
share  jars
[root@hadoop000 hadoop-2.8.1]# rm -f bin/*.cmd
[root@hadoop000 hadoop-2.8.1]# rm -f sbin/*.cmd
[root@hadoop000 hadoop-2.8.1]#
[root@hadoop000 hadoop-2.8.1]# ll bin
total 348
-rwxrwxr-x. 1 root root 139387 Jun 2 2017 container-executor
-rwxrwxr-x. 1 root root 6514 Jun 2 2017 hadoop
-rwxrwxr-x. 1 root root 12330 Jun 2 2017 hdfs
-rwxrwxr-x. 1 root root 6237 Jun 2 2017 mapred
-rwxrwxr-x. 1 root root 1776 Jun 2 2017 rcc
-rwxrwxr-x. 1 root root 156812 Jun 2 2017 test-container-executor
-rwxrwxr-x. 1 root root 14416 Jun 2 2017 yarn
[root@hadoop000 hadoop-2.8.1]#
4.3 Configure environment variables
[root@hadoop000 ~]# vi /etc/profile
export HADOOP_HOME=/opt/software/hadoop-2.8.1
export PATH=$HADOOP_HOME/bin:$PROTOC_HOME/bin:$FINDBUGS_HOME/bin:$MVN_HOME/bin:$JAVA_HOME/bin:$PATH
[root@hadoop000 ~]# source /etc/profile
[root@hadoop000 ~]# which hadoop
/opt/software/hadoop-2.8.1/bin/hadoop
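With the PATH in place, hadoop version is a quick sanity check; its first output line should report 2.8.1:
[root@hadoop000 ~]# hadoop version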
4.4 Configure core-site.xml and hdfs-site.xml
etc/hadoop/core-site.xml:
<configuration>
<property>
<name>fs.defaultFS</name>
<value>hdfs://localhost:9000</value>
</property>
</configuration>
etc/hadoop/hdfs-site.xml:
<configuration>
<property>
<name>dfs.replication</name>
<value>1</value>
</property>
</configuration>
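Once both files are edited, the loaded values can be verified with hdfs getconf; they should come back as hdfs://localhost:9000 and 1 respectively:
[root@hadoop000 hadoop-2.8.1]# bin/hdfs getconf -confKey fs.defaultFS
[root@hadoop000 hadoop-2.8.1]# bin/hdfs getconf -confKey dfs.replication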
4.5 Passwordless SSH
[root@hadoop000 ~]# rm -rf .ssh
[root@hadoop000 ~]# ssh-keygen
[root@hadoop000 ~]# cd .ssh
[root@hadoop000 .ssh]# ll
total 8
-rw-------. 1 root root 1671 May 13 21:47 id_rsa
-rw-r--r--. 1 root root 396 May 13 21:47 id_rsa.pub
[root@hadoop000 .ssh]# cat id_rsa.pub >> authorized_keys
[root@hadoop000 .ssh]#
[root@hadoop000 ~]# ssh localhost date
The authenticity of host 'localhost (::1)' can't be established.
RSA key fingerprint is ec:85:86:32:22:94:d1:a9:f2:0b:c5:12:3f:ba:e2:61.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added 'localhost' (RSA) to the list of known hosts.
Sun May 13 21:49:14 CST 2018
[root@hadoop000 ~]#
[root@hadoop000 ~]#
[root@hadoop000 ~]# ssh localhost date
Sun May 13 21:49:17 CST 2018
[root@hadoop000 ~]#
4.6 Format the filesystem:
$ bin/hdfs namenode -format
4.7 Configure JAVA_HOME in hadoop-env.sh
[root@hadoop000 hadoop]# vi hadoop-env.sh
export JAVA_HOME=/usr/java/jdk1.8.0_45
4.8 Start NameNode daemon and DataNode daemon:
[root@hadoop000 hadoop-2.8.1]# sbin/start-dfs.sh
Starting namenodes on [localhost]
localhost: starting namenode, logging to /opt/software/hadoop-2.8.1/logs/hadoop-root-namenode-hadoop000.out
localhost: starting datanode, logging to /opt/software/hadoop-2.8.1/logs/hadoop-root-datanode-hadoop000.out
Starting secondary namenodes [0.0.0.0]
The authenticity of host '0.0.0.0 (0.0.0.0)' can't be established.
RSA key fingerprint is ec:85:86:32:22:94:d1:a9:f2:0b:c5:12:3f:ba:e2:61.
Are you sure you want to continue connecting (yes/no)? yes
0.0.0.0: Warning: Permanently added '0.0.0.0' (RSA) to the list of known hosts.
0.0.0.0: starting secondarynamenode, logging to /opt/software/hadoop-2.8.1/logs/hadoop-root-secondarynamenode-hadoop000.out
[root@hadoop000 hadoop-2.8.1]#
[root@hadoop000 hadoop-2.8.1]#
[root@hadoop000 hadoop-2.8.1]#
[root@hadoop000 hadoop-2.8.1]#
[root@hadoop000 hadoop-2.8.1]# jps
16243 Jps
15943 DataNode
5127 Launcher
16139 SecondaryNameNode
15853 NameNode
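Besides jps, HDFS health can be checked from the command line, and the NameNode web UI listens on port 50070 by default in Hadoop 2.x; the curl line below only confirms that the port answers:
[root@hadoop000 hadoop-2.8.1]# bin/hdfs dfsadmin -report
[root@hadoop000 hadoop-2.8.1]# curl -s -o /dev/null -w "%{http_code}\n" http://localhost:50070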
[root@hadoop000 ~]# hdfs dfs -put jepson.log /
[root@hadoop000 ~]#
[root@hadoop000 ~]#
[root@hadoop000 ~]# hdfs dfs -ls /
Found 1 items
-rw-r--r-- 3 root supergroup 6 2018-05-13 21:57 /jepson.log
[root@hadoop000 ~]#
[root@hadoop000 ~]# hdfs dfs -cat /jepson.log
A
5
6
[root@hadoop000 ~]#
[root@hadoop000 ~]#
[root@hadoop000 ~]# cat jepson.log
A
5
6
[root@hadoop000 ~]#
Homework (05/13):
1. Compile Hadoop
2. Deploy HDFS in pseudo-distributed mode
3. Write blog posts covering the two items above