
Installing Spark and Scala

Published: 2020-07-14 22:25:16 · Author: 刀刀_高揚 · Category: Big Data

1. Download Spark


http://mirrors.cnnic.cn/apache/spark/spark-1.3.0/spark-1.3.0-bin-hadoop2.3.tgz



2. Download Scala


http://www.scala-lang.org/download/2.10.5.html
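On a headless server, both archives can be fetched straight from the shell. A minimal sketch with wget (the direct Scala tarball URL is an assumption; the download page above links to the actual file):

wget http://mirrors.cnnic.cn/apache/spark/spark-1.3.0/spark-1.3.0-bin-hadoop2.3.tgz
wget http://www.scala-lang.org/files/archive/scala-2.10.5.tgz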



3. Install Scala

mkdir /usr/lib/scala

tar -zxvf scala-2.10.5.tgz

mv scala-2.10.5 /usr/lib/scala
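After these commands the Scala distribution ends up at /usr/lib/scala/scala-2.10.5, which is exactly the path SCALA_HOME points to in the next step.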



4. Set the Scala path

vim /etc/bashrc

export SCALA_HOME=/usr/lib/scala/scala-2.10.5

export PATH=$SCALA_HOME/bin:$PATH


source /etc/bashrc


scala -version
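If the PATH change took effect, this should print something like:

Scala code runner version 2.10.5 -- Copyright 2002-2013, LAMP/EPFL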



5. Distribute Scala to the other nodes

scp -r /usr/lib/scala/ hd2:/usr/lib/scala

scp -r /usr/lib/scala/ hd3:/usr/lib/scala

scp -r /usr/lib/scala/ hd4:/usr/lib/scala

scp -r /usr/lib/scala/ hd5:/usr/lib/scala


scp /etc/bashrc hd2:/etc/bashrc

scp /etc/bashrc hd3:/etc/bashrc

scp /etc/bashrc hd4:/etc/bashrc

scp /etc/bashrc hd5:/etc/bashrc
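The eight scp commands above can also be written as a loop. A sketch, assuming passwordless SSH from hd1 to the other nodes:

for host in hd2 hd3 hd4 hd5; do
    scp -r /usr/lib/scala/ ${host}:/usr/lib/scala
    scp /etc/bashrc ${host}:/etc/bashrc
done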



6. Install Spark

tar -zxvf spark-1.3.0-bin-hadoop2.3.tgz

mkdir /usr/local/spark

mv spark-1.3.0-bin-hadoop2.3 /usr/local/spark



vim /etc/bashrc

export SPARK_HOME=/usr/local/spark/spark-1.3.0-bin-hadoop2.3

export PATH=$SCALA_HOME/bin:$SPARK_HOME/bin:$PATH



source /etc/bashrc


cd /usr/local/spark/spark-1.3.0-bin-hadoop2.3/conf/

cp spark-env.sh.template spark-env.sh



vim spark-env.sh


export JAVA_HOME=/java                                   # JDK install path on these hosts
export SCALA_HOME=/usr/lib/scala/scala-2.10.5            # Scala installed in step 3
export SPARK_HOME=/usr/local/spark/spark-1.3.0-bin-hadoop2.3
export SPARK_MASTER_IP=192.168.137.101                   # address of the master node (hd1)
export SPARK_WORKER_MEMORY=10g                           # memory each worker can hand out to executors
export SPARK_DRIVER_MEMORY=9g                            # default memory for the driver process
export HADOOP_CONF_DIR=/home/hadoop/hadoop/etc/hadoop    # lets Spark pick up the Hadoop/HDFS configs
export SPARK_LIBRARY_PATH=$SPARK_HOME/lib
export SCALA_LIBRARY_PATH=$SPARK_LIBRARY_PATH


cp slaves.template slaves



vim slaves


hd1

hd2

hd3

hd4

hd5
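Every hostname listed in slaves gets a Worker daemon when the cluster starts; because hd1 is listed too, the master node runs a Worker alongside the Master process.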



7. Distribute Spark to the other nodes

scp /etc/bashrc hd2:/etc

scp /etc/bashrc hd3:/etc

scp /etc/bashrc hd4:/etc

scp /etc/bashrc hd5:/etc


scp -r /usr/local/spark/spark-1.3.0-bin-hadoop2.3 hd2:/usr/local/spark/

scp -r /usr/local/spark/spark-1.3.0-bin-hadoop2.3 hd3:/usr/local/spark/

scp -r /usr/local/spark/spark-1.3.0-bin-hadoop2.3 hd4:/usr/local/spark/

scp -r /usr/local/spark/spark-1.3.0-bin-hadoop2.3 hd5:/usr/local/spark/



8. Start the cluster

On hd1, start all the daemons:

cd $SPARK_HOME/sbin

./start-all.sh
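A quick way to confirm the cluster came up. A sketch, assuming the JDK's jps tool is on the PATH and the default standalone ports:

jps           # on hd1: should show a Master (plus a Worker, since hd1 is in slaves)
ssh hd2 jps   # on each slave node: should show a Worker

Then open the master web UI at http://192.168.137.101:8080 and check that all five workers have registered.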

