
Hadoop installation walkthrough

Published: 2020-07-22 16:21:59  Source: Web  Views: 1204  Author: 蛐蛐的熱情  Category: Big Data

Download Hadoop:

wget http://mirror.bit.edu.cn/apache/hadoop/common/hadoop-1.2.1/hadoop-1.2.1.tar.gz
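The post jumps from the download straight to a conf directory, so the tarball presumably gets unpacked first; a minimal sketch of that step (the target directory is an assumption, chosen to match the /opt/hadoop-1.2.1 paths used later):

# Unpack the tarball; /opt is assumed here so the later /opt/hadoop-1.2.1 paths line up
tar -zxvf hadoop-1.2.1.tar.gz -C /opt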

Install the JDK (see http://www.linuxidc.com/Linux/2014-08/105906.htm).

Install Hadoop

Go into the configuration directory:

/root/zby/hadoop/hadoop-1.2.1/conf

Configuring Hadoop mainly means editing three files: core-site.xml, hdfs-site.xml and mapred-site.xml.

In total, four files need editing:

For the first file, you only need to change the JDK installation path:

hadoop-env.sh

export HADOOP_HEAPSIZE=256   # adjusts the amount of memory Hadoop uses

#export JAVA_HOME=/usr/lib/jvm/jdk7   # this is the line to edit: uncomment it and point it at your JDK

If you don't know the path, you can look it up like this:

[root@iZ28c21psoeZ conf]# echo $JAVA_HOME

/usr/lib/jvm/jdk7
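If echo $JAVA_HOME prints nothing, one way to locate the JDK is to trace the java binary itself (a generic sketch, not from the original post):

# Resolve the real location of the java binary; JAVA_HOME is the directory above bin/
which java
readlink -f "$(which java)"
# e.g. /usr/lib/jvm/jdk7/bin/java  ->  JAVA_HOME=/usr/lib/jvm/jdk7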

The second file: open it and replace its contents with the snippet below (strip any inline annotations before pasting):

cd /opt/hadoop-1.2.1/conf 

vim core-site.xml

<?xml version="1.0"?>

<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>

<property>

<name>hadoop.tmp.dir</name>

<value>/hadoop</value>

</property>

<property>

<name>dfs.name.dir</name>

<value>hadoop/name</value>

</property>

</configuration>

The third file (again, strip any annotations before pasting):

vim hdfs-site.xml

<?xml version="1.0"?>

<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>

<property>

<name>dfs.data.dir</name>

<value>/hadoop/data</value>

</property>

</configuration>
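Both snippets point Hadoop at directories under /hadoop. Hadoop normally creates them itself when formatting and starting, but creating them up front does no harm (a small sketch, assuming you run as root as in this walkthrough):

mkdir -p /hadoop /hadoop/data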

The fourth file (same thing: strip annotations, then paste):

vim mapred-site.xml

 

<?xml version="1.0"?>

<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>

<property>

<name>mapred.job.tracker</name>

<value>ldy:9001</value>

</property>

</configuration>
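The value ldy:9001 uses the machine's hostname, so that name has to resolve locally. If it doesn't, a hosts entry like the following works (127.0.0.1 is an assumption for a single-node setup; use the server's real IP if you prefer):

# Make sure the hostname in mapred.job.tracker resolves
echo "127.0.0.1   ldy" >> /etc/hosts
ping -c 1 ldy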

Next, /etc/profile also needs editing:

vim /etc/profile

Append the following at the end of the file. If the first few (JDK-related) lines already took effect when you installed the JDK, you don't need to add them again.

export JAVA_HOME=/usr/lib/jvm/jdk7

export JRE_HOME=${JAVA_HOME}/jre

export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib

export PATH=${JAVA_HOME}/bin:$PATH

export HADOOP_HOME=/opt/hadoop-1.2.1

export PATH=$JAVA_HOME/bin:$JRE_HOME/bin:$HADOOP_HOME/bin:$PATH
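To make the new variables take effect in the current shell and confirm Hadoop is on the PATH, something like this works (a quick check, not part of the original post):

source /etc/profile
echo $HADOOP_HOME    # should print /opt/hadoop-1.2.1
hadoop version       # should report Hadoop 1.2.1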

Next, go into this directory:

/opt/hadoop-1.2.1/bin

and format HDFS (the NameNode):

hadoop -namenode -format

If you hit an error like the following:

Warning: $HADOOP_HOME is deprecated.

/opt/hadoop-1.2.1/bin/hadoop: line 350: /usr/lib/jdk7/bin/java: No such file or directory

/opt/hadoop-1.2.1/bin/hadoop: line 434: /usr/lib/jdk7/bin/java: No such file or directory

/opt/hadoop-1.2.1/bin/hadoop: line 434: exec: /usr/lib/jdk7/bin/java: cannot execute: No such file or directory

check whether the first file (hadoop-env.sh) is set up correctly:

[root@iZ28c21psoeZ conf]# echo $JAVA_HOME

/usr/lib/jvm/jdk7
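So the shell's $JAVA_HOME is fine; the path the script complains about (/usr/lib/jdk7) suggests the JAVA_HOME line inside conf/hadoop-env.sh is still pointing at the wrong place. A sketch of the likely fix (the previous value is an assumption inferred from the error message):

vim /opt/hadoop-1.2.1/conf/hadoop-env.sh
# change the JAVA_HOME line so it matches the real JDK location:
export JAVA_HOME=/usr/lib/jvm/jdk7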

Run it again, and it fails again:

[root@iZ28c21psoeZ bin]# hadoop -namenode -format

Warning: $HADOOP_HOME is deprecated.

Unrecognized option: -namenode

Error: Could not create the Java Virtual Machine.

Error: A fatal exception has occurred. Program will exit.

[root@iZ28c21psoeZ bin]#

There are two places you can change.

The first (minor): /opt/hadoop/conf/hadoop-env.sh

Change the parameter: export HADOOP_HEAPSIZE=256   (the default is 2000 MB; this is the heap size used by the JVM)

The second (main one): look at this block in the /opt/hadoop/bin/hadoop script, near the bottom of the file:
####################################################################
       if [[ $EUID -eq 0 ]]; then
           HADOOP_OPTS="$HADOOP_OPTS -jvm server $HADOOP_DATANODE_OPTS"
       else
           HADOOP_OPTS="$HADOOP_OPTS -server $HADOOP_DATANODE_OPTS"
       fi

       ####################################################################
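For reference, a workaround often quoted for this block (an assumption on my part, not something the original post states) is that the -jvm server option in the root branch is not accepted by a stock JVM, so when running as root it gets replaced with plain -server:

# commonly cited tweak when running Hadoop 1.x as root (assumption; verify against your build)
if [[ $EUID -eq 0 ]]; then
    HADOOP_OPTS="$HADOOP_OPTS -server $HADOOP_DATANODE_OPTS"
else
    HADOOP_OPTS="$HADOOP_OPTS -server $HADOOP_DATANODE_OPTS"
fi

Note, though, that the "Unrecognized option: -namenode" error above really comes from the stray dash in the command; the next attempt drops it and runs ./hadoop namenode -format.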


Run it again and check the result; it looks like there's yet another error:

[root@iZ28c21psoeZ bin]# ./hadoop namenode -format

Warning: $HADOOP_HOME is deprecated.


16/07/04 18:49:04 INFO namenode.NameNode: STARTUP_MSG:

/************************************************************

STARTUP_MSG: Starting NameNode

STARTUP_MSG:   host = iZ28c21psoeZ/10.251.57.77

STARTUP_MSG:   args = [-format]

STARTUP_MSG:   version = 1.2.1

STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.2 -r 1503152; compiled by 'mattf' on Mon Jul 22 15:23:09 PDT 2013

STARTUP_MSG:   java = 1.7.0_60

************************************************************/

[Fatal Error] core-site.xml:11:3: The element type "property" must be terminated by the matching end-tag "</property>".

16/07/04 18:49:04 FATAL conf.Configuration: error parsing conf file: org.xml.sax.SAXParseException; systemId: file:/opt/hadoop-1.2.1/conf/core-site.xml; lineNumber: 11; columnNumber: 3; The element type "property" must be terminated by the matching end-tag "</property>".

16/07/04 18:49:04 ERROR namenode.NameNode: java.lang.RuntimeException: org.xml.sax.SAXParseException; systemId: file:/opt/hadoop-1.2.1/conf/core-site.xml; lineNumber: 11; columnNumber: 3; The element type "property" must be terminated by the matching end-tag "</property>".

        at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:1249)

        at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:1107)

        at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:1053)

        at org.apache.hadoop.conf.Configuration.set(Configuration.java:420)

        at org.apache.hadoop.hdfs.server.namenode.NameNode.setStartupOption(NameNode.java:1374)

        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1463)

        at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1488)

Caused by: org.xml.sax.SAXParseException; systemId: file:/opt/hadoop-1.2.1/conf/core-site.xml; lineNumber: 11; columnNumber: 3; The element type "property" must be terminated by the matching end-tag "</property>".

        at com.sun.org.apache.xerces.internal.parsers.DOMParser.parse(DOMParser.java:257)

        at com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderImpl.parse(DocumentBuilderImpl.java:347)

        at javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:177)

        at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:1156)

        ... 6 more


16/07/04 18:49:04 INFO namenode.NameNode: SHUTDOWN_MSG:

/************************************************************

SHUTDOWN_MSG: Shutting down NameNode at iZ28c21psoeZ/10.251.57.77

************************************************************/

[root@iZ28c21psoeZ bin]#

 

The log says there's a mistake in one of the three main config files.

Sure enough:

</property> had been written as </properry>.

Fix it and run it again:

[root@iZ28c21psoeZ bin]# ./hadoop namenode -format

Warning: $HADOOP_HOME is deprecated.

16/07/04 18:55:26 INFO namenode.NameNode: STARTUP_MSG:

/************************************************************

STARTUP_MSG: Starting NameNode

STARTUP_MSG:   host = iZ28c21psoeZ/10.251.57.77

STARTUP_MSG:   args = [-format]

STARTUP_MSG:   version = 1.2.1

STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.2 -r 1503152; compiled by 'mattf' on Mon Jul 22 15:23:09 PDT 2013

STARTUP_MSG:   java = 1.7.0_60

************************************************************/

16/07/04 18:55:27 INFO util.GSet: Computing capacity for map BlocksMap

16/07/04 18:55:27 INFO util.GSet: VM type       = 64-bit

16/07/04 18:55:27 INFO util.GSet: 2.0% max memory = 259522560

16/07/04 18:55:27 INFO util.GSet: capacity      = 2^19 = 524288 entries

16/07/04 18:55:27 INFO util.GSet: recommended=524288, actual=524288

16/07/04 18:55:32 INFO namenode.FSNamesystem: fsOwner=root

16/07/04 18:55:33 INFO namenode.FSNamesystem: supergroup=supergroup

16/07/04 18:55:33 INFO namenode.FSNamesystem: isPermissionEnabled=true

16/07/04 18:55:42 INFO namenode.FSNamesystem: dfs.block.invalidate.limit=100

16/07/04 18:55:42 INFO namenode.FSNamesystem: isAccessTokenEnabled=false accessKeyUpdateInterval=0 min(s), accessTokenLifetime=0 min(s)

16/07/04 18:55:42 INFO namenode.FSEditLog: dfs.namenode.edits.toleration.length = 0

16/07/04 18:55:42 INFO namenode.NameNode: Caching file names occuring more than 10 times

16/07/04 18:55:45 INFO common.Storage: Image file /hadoop/dfs/name/current/fsimage of size 110 bytes saved in 0 seconds.

16/07/04 18:55:47 INFO namenode.FSEditLog: closing edit log: position=4, editlog=/hadoop/dfs/name/current/edits

16/07/04 18:55:47 INFO namenode.FSEditLog: close success: truncate to 4, editlog=/hadoop/dfs/name/current/edits

16/07/04 18:55:48 INFO common.Storage: Storage directory /hadoop/dfs/name has been successfully formatted.

16/07/04 18:55:48 INFO namenode.NameNode: SHUTDOWN_MSG:

/************************************************************

SHUTDOWN_MSG: Shutting down NameNode at iZ28c21psoeZ/10.251.57.77

************************************************************/

Perfect. Keep going:

cd /opt/hadoop-1.2.1/bin

[root@iZ28c21psoeZ bin]# start-all.sh

Warning: $HADOOP_HOME is deprecated.

starting namenode, logging to /opt/hadoop-1.2.1/libexec/../logs/hadoop-root-namenode-iZ28c21psoeZ.out

localhost: socket: Address family not supported by protocol

localhost: ssh: connect to host localhost port 22: Address family not supported by protocol

localhost: socket: Address family not supported by protocol

localhost: ssh: connect to host localhost port 22: Address family not supported by protocol

starting jobtracker, logging to /opt/hadoop-1.2.1/libexec/../logs/hadoop-root-jobtracker-iZ28c21psoeZ.out

localhost: socket: Address family not supported by protocol

localhost: ssh: connect to host localhost port 22: Address family not supported by protocol

[root@iZ28c21psoeZ bin]#

In plain terms: the NameNode and JobTracker start locally, but every SSH connection to localhost on port 22 fails with "Address family not supported by protocol", so the daemons that are started over SSH (DataNode, SecondaryNameNode, TaskTracker) never come up.

One more change is needed.

The log points at the SSH port being wrong: Hadoop has to use the same SSH port the server actually listens on. Add one line to conf/hadoop-env.sh:

export HADOOP_SSH_OPTS="-p 1234"
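If you are not sure which port sshd is listening on, a quick check plus the matching hadoop-env.sh entry could look like this (the port 1234 is just this particular server's setting):

# find the sshd port, then tell Hadoop's ssh-based start scripts to use it
grep -i '^Port' /etc/ssh/sshd_config
echo 'export HADOOP_SSH_OPTS="-p 1234"' >> /opt/hadoop-1.2.1/conf/hadoop-env.sh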

Run it once more:

[root@ldy bin]# sh start-all.sh

Warning: $HADOOP_HOME is deprecated.

starting namenode, logging to /opt/hadoop-1.2.1/libexec/../logs/hadoop-root-namenode-ldy.out

localhost: starting datanode, logging to /opt/hadoop-1.2.1/libexec/../logs/hadoop-root-datanode-ldy.out

localhost: starting secondarynamenode, logging to /opt/hadoop-1.2.1/libexec/../logs/hadoop-root-secondarynamenode-ldy.out

starting jobtracker, logging to /opt/hadoop-1.2.1/libexec/../logs/hadoop-root-jobtracker-ldy.out

localhost: starting tasktracker, logging to /opt/hadoop-1.2.1/libexec/../logs/hadoop-root-tasktracker-ldy.out

[root@ldy bin]# jps

27054 DataNode

26946 NameNode

27374 TaskTracker

27430 Jps

27250 JobTracker

27165 SecondaryNameNode

OK, all the daemons are now running (NameNode, DataNode, SecondaryNameNode, JobTracker, TaskTracker). Success.
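As a final sanity check, the standard Hadoop 1.x web UIs and a simple HDFS listing confirm everything is really serving (ports 50070/50030 are the stock defaults; this step is not in the original post):

hadoop fs -ls /                              # HDFS answers
curl -s http://localhost:50070/ | head -n 5  # NameNode web UI
curl -s http://localhost:50030/ | head -n 5  # JobTracker web UI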

