91超碰碰碰碰久久久久久综合_超碰av人澡人澡人澡人澡人掠_国产黄大片在线观看画质优化_txt小说免费全本

溫馨提示×

溫馨提示×

您好,登錄后才能下訂單哦!

密碼登錄×
登錄注冊×
其他方式登錄
點擊 登錄注冊 即表示同意《億速云用戶服務條款》

Writing an HDFS Program in Java

Published: 2020-07-24 23:19:04  Source: Web  Views: 471  Author: zjy1002261870  Category: Big Data

1. By default, Hadoop keeps its temporary data under the Unix /tmp directory (cd /tmp shows hadoop-root and similar files). If this is left unchanged, Hadoop may misbehave after the Linux system reboots, so the temporary data directory should be moved elsewhere.
2. Edit core-site.xml (vim core-site.xml) as shown below, then restart the Hadoop cluster. Do not reformat the namenode.
If the datanode fails to start, change the clusterID in the VERSION file under /var/hadoop/dfs/data/current on the datanode so it matches the namenode's; the cluster then starts normally.
<property>
    <name>hadoop.tmp.dir</name>
    <value>/var/hadoop</value>
</property>
Formatting the namenode regenerates its clusterID, while the datanode's clusterID stays the same. The resulting mismatch makes the datanode fail to start, so the datanode's value has to be changed by hand to match the namenode's; a quick way to compare the two IDs is sketched below.
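Before editing VERSION by hand, it can help to confirm that the two clusterIDs really differ. The small sketch below reads them with java.util.Properties; the file locations are assumptions based on the default directory layout under hadoop.tmp.dir=/var/hadoop and may need adjusting for other clusters.

import java.io.FileInputStream;
import java.util.Properties;

public class ClusterIdCheck {

    // Assumed default locations under hadoop.tmp.dir=/var/hadoop; adjust to your name/data dirs
    private static final String NAMENODE_VERSION = "/var/hadoop/dfs/name/current/VERSION";
    private static final String DATANODE_VERSION = "/var/hadoop/dfs/data/current/VERSION";

    public static void main(String[] args) throws Exception {
        System.out.println("namenode clusterID: " + readClusterId(NAMENODE_VERSION));
        System.out.println("datanode clusterID: " + readClusterId(DATANODE_VERSION));
    }

    // VERSION is a plain key=value file, so java.util.Properties can parse it
    private static String readClusterId(String path) throws Exception {
        Properties props = new Properties();
        try (FileInputStream in = new FileInputStream(path)) {
            props.load(in);
        }
        return props.getProperty("clusterID");
    }
}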
3. For testing, permission checking can be disabled (otherwise access will be denied). Add the following configuration on the namenode:
vim hdfs-site.xml
<property>
    <name>dfs.permissions.enabled</name>
    <value>false</value>
</property>

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.skcc</groupId>
<artifactId>wordcount</artifactId>
<version>0.0.1-SNAPSHOT</version>
<name>wordcount</name>
<description>count the word</description>

<properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <hadoop.version>2.7.3</hadoop.version>
</properties>
<dependencies>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>4.12</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-hdfs</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
</dependencies>

</project>

package com.skcc.hadoop;

import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import java.text.NumberFormat;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.FsUrlStreamHandlerFactory;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class HelloHDFS {

public HelloHDFS() {
    // TODO Auto-generated constructor stub
}

public static FileSystem getFileSystemInstance() {
    Configuration conf = new Configuration();
    conf.set("fs.defaultFS", "hdfs://172.26.19.40:9000");
    FileSystem fileSystem = null;
    try {
        fileSystem = FileSystem.get(conf);
    } catch (IOException e) {
        // Log the failure; callers will receive a null FileSystem in this case
        e.printStackTrace();
    }
    return fileSystem;
}

public static void getFileFromHDFS() throws Exception {
    // URL handles the http scheme by default; FsUrlStreamHandlerFactory adds support for hdfs (this factory can only be set once per JVM)
    URL.setURLStreamHandlerFactory(new FsUrlStreamHandlerFactory());
    URL url=new URL("hdfs://172.26.19.40:9000/10803060234.txt");
    InputStream inputStream= url.openStream();
    IOUtils.copyBytes(inputStream, System.out, 4096,true);
}

public static void getFileFromBaiDu() throws IOException {

    URL url=new URL("http://skynet.skhynix-cq.com.cn/plusWare/Main.aspx");
    InputStream inputStream= url.openStream();
    IOUtils.copyBytes(inputStream, System.out, 4096,true);
}

public static void testHadoop() throws Exception {
    FileSystem fileSystem = getFileSystemInstance();

    Boolean success = fileSystem.mkdirs(new Path("/skcc"));
    System.out.println("mkdirs is " + success);

    success = fileSystem.exists(new Path("/10803060234.txt"));
    System.out.println("file exists is " + success);

    success = fileSystem.delete(new Path("/test2.data"),true);
    System.out.println("delete dirs is " + success);

    success = fileSystem.exists(new Path("/skcc"));
    System.out.println("dirs exists is "+ success);

}

public static void uploadFileToHDFS() throws Exception {
    FileSystem fileSystem = getFileSystemInstance();
    String filename = "/test2.data";
    // overwrite ==true
    FSDataOutputStream outputStream = fileSystem.create(new Path(filename), true);
    FileInputStream fis = new FileInputStream("D:\\2018\\u001.zip");

// IOUtils.copyBytes(fis, outputStream, 4096, true); would copy everything in one call;
// the manual read/write loop below is used instead so upload progress can be printed

    long totalLen = fis.getChannel().size();
    long tmpSize = 0;
    double readPercent = 0;
    NumberFormat numberFormat = NumberFormat.getInstance();
    numberFormat.setMaximumFractionDigits(0);
    System.out.println("totalLen : " + totalLen + " available : " + fis.available());
    byte[] buf = new byte[4096];
    int len = fis.read(buf);
    while (len != -1) {
        tmpSize = tmpSize + len;
        String result = numberFormat.format((float)tmpSize / (float)totalLen * 100 );

        outputStream.write(buf,0,len);
        System.out.println("Upload Percent : " + result + "%");
        len = fis.read(buf);

    }
    // Close both streams so the final block is flushed and the file is finalized in HDFS
    fis.close();
    outputStream.close();
}

}
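
To exercise the helpers above end to end, a minimal driver along the lines of the sketch below can be used. The driver class and the call order are illustrative additions rather than part of the original code; adjust the HDFS paths and the local file path in HelloHDFS to match your environment.

package com.skcc.hadoop;

public class HelloHDFSDriver {

    public static void main(String[] args) throws Exception {
        // Create /skcc, probe for /10803060234.txt, delete /test2.data and re-check the directory
        HelloHDFS.testHadoop();

        // Upload the local file configured in uploadFileToHDFS(), printing progress as it copies
        HelloHDFS.uploadFileToHDFS();

        // Stream /10803060234.txt to stdout via the hdfs:// URL handler
        // (URL.setURLStreamHandlerFactory may only be called once per JVM)
        HelloHDFS.getFileFromHDFS();
    }
}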
