Reading, Writing, and Deleting Files on HDFS with the FileSystem Java API

May 15, 2020
This article shows how to read, write, and delete files on HDFS using the FileSystem Java API, covering basic usage, practical tips, and the underlying mechanics.
The Hadoop file system
For basic file system operations on the command line, running hadoop fs -help prints detailed help for every command.

The abstract Java class org.apache.hadoop.fs.FileSystem defines Hadoop's file system interface. Since the class is abstract, a FileSystem instance is obtained through one of the following two static factory methods:
public static FileSystem get(Configuration conf) throws IOException
public static FileSystem get(URI uri, Configuration conf) throws IOException
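
A minimal sketch of both factory forms (the namenode URI below is a placeholder, not a value from this article):

    Configuration conf = new Configuration();
    conf.addResource(new Path("/opt/hadoop-0.20.0/conf/core-site.xml"));

    // Use the default file system named by fs.default.name
    FileSystem fs = FileSystem.get(conf);

    // Or address a specific namenode explicitly (requires java.net.URI)
    FileSystem fs2 = FileSystem.get(URI.create("hdfs://namenode:9000"), conf);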
 

The main methods:
1. public boolean mkdirs(Path f) throws IOException
Creates all directories in the path at once (including any missing parents); f is the full directory path.

2. public FSDataOutputStream create(Path f) throws IOException
Creates a file at the given Path and returns an output stream for writing data to it.
create() has several overloads that let you specify whether to overwrite an existing file, the replication factor, the write buffer size, the block size, and the file permissions.
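
For instance, one of the fuller overloads looks like this (a sketch; the path and values shown are arbitrary):

    Path f = new Path("/test/data.txt");
    FSDataOutputStream out = fs.create(
        f,
        true,               // overwrite an existing file
        4096,               // write buffer size in bytes
        (short) 3,          // replication factor
        64L * 1024 * 1024); // block size in bytes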

3. public void copyFromLocalFile(Path src, Path dst) throws IOException
Copies a local file into the file system.

4. public boolean exists(Path f) throws IOException
Checks whether a file or directory exists.

5. public boolean delete(Path f, boolean recursive) throws IOException
Permanently deletes the given file or directory. If f is a file or an empty directory, the value of recursive is ignored; a non-empty directory and its contents are deleted only when recursive is true.
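
A short sketch of both cases (the paths are placeholders):

    // A file or an empty directory: the recursive flag is ignored
    fs.delete(new Path("/test/word.txt"), false);

    // A non-empty directory: deleted only because recursive is true
    fs.delete(new Path("/test"), true);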

6. The FileStatus class encapsulates the metadata of files and directories in the file system, including file length, block size, replication, modification time, owner, and permissions.

Calling FileStatus.getPath() on each entry returned by listStatus() lists every file under a given HDFS directory, as the listFiles method in the code below demonstrates.

package hdfsTest;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class OperatingFiles {
    // initialization
    static Configuration conf = new Configuration();
    static FileSystem hdfs;
    static {
        String path = "/usr/java/hadoop-1.0.3/conf/";
        conf.addResource(new Path(path + "core-site.xml"));
        conf.addResource(new Path(path + "hdfs-site.xml"));
        conf.addResource(new Path(path + "mapred-site.xml"));
        path = "/usr/java/hbase-0.90.3/conf/";
        conf.addResource(new Path(path + "hbase-site.xml"));
        try {
            hdfs = FileSystem.get(conf);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    // create a directory
    public void createDir(String dir) throws IOException {
        Path path = new Path(dir);
        hdfs.mkdirs(path);
        System.out.println("new dir \t" + conf.get("fs.default.name") + dir);
    }

    // copy a local file to HDFS
    public void copyFile(String localSrc, String hdfsDst) throws IOException {
        Path src = new Path(localSrc);
        Path dst = new Path(hdfsDst);
        hdfs.copyFromLocalFile(src, dst);

        // list all the files in the destination directory
        FileStatus files[] = hdfs.listStatus(dst);
        System.out.println("Upload to \t" + conf.get("fs.default.name") + hdfsDst);
        for (FileStatus file : files) {
            System.out.println(file.getPath());
        }
    }

    // create a new file and write a string into it
    public void createFile(String fileName, String fileContent) throws IOException {
        Path dst = new Path(fileName);
        byte[] bytes = fileContent.getBytes();
        FSDataOutputStream output = hdfs.create(dst);
        output.write(bytes);
        output.close();
        System.out.println("new file \t" + conf.get("fs.default.name") + fileName);
    }

    // list all files under a directory
    public void listFiles(String dirName) throws IOException {
        Path f = new Path(dirName);
        FileStatus[] status = hdfs.listStatus(f);
        System.out.println(dirName + " has all files:");
        for (int i = 0; i < status.length; i++) {
            System.out.println(status[i].getPath().toString());
        }
    }

    // check whether a file exists, and delete it if it does
    public void deleteFile(String fileName) throws IOException {
        Path f = new Path(fileName);
        boolean isExists = hdfs.exists(f);
        if (isExists) { // if it exists, delete it
            boolean isDel = hdfs.delete(f, true);
            System.out.println(fileName + "  delete? \t" + isDel);
        } else {
            System.out.println(fileName + "  exist? \t" + isExists);
        }
    }

    public static void main(String[] args) throws IOException {
        OperatingFiles ofs = new OperatingFiles();
        System.out.println("\n=======create dir=======");
        String dir = "/test";
        ofs.createDir(dir);
        System.out.println("\n=======copy file=======");
        String src = "/home/ictclas/Configure.xml";
        ofs.copyFile(src, dir);
        System.out.println("\n=======create a file=======");
        String fileContent = "Hello, world! Just a test.";
        ofs.createFile(dir + "/word.txt", fileContent);
    }
}

Using HDFS in Java (0.20.0)

Below is a code sample showing how to read from and write to HDFS in Java.

1. Creating a configuration object: To be able to read from or write to HDFS, you need to create a Configuration object and pass configuration parameters to it using the Hadoop configuration files.
  
    // The conf object will read the HDFS configuration parameters from these
    // XML files. You may also set the parameters yourself if you want.
 

    Configuration conf = new Configuration(); 
    conf.addResource(new Path("/opt/hadoop-0.20.0/conf/core-site.xml")); 
    conf.addResource(new Path("/opt/hadoop-0.20.0/conf/hdfs-site.xml")); 

    If you do not add the Hadoop XML files to the conf object, your HDFS operations will be performed on the local file system, not on HDFS.
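
    One quick way to check which file system the conf object resolved to is to print the FileSystem URI (a sketch; the exact output depends on your fs.default.name setting):

    FileSystem fs = FileSystem.get(conf);
    // Prints something like hdfs://namenode:9000 when HDFS is configured,
    // or file:/// if operations would fall back to the local file system.
    System.out.println(fs.getUri());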

2. Adding a file to HDFS: Create a FileSystem object and use a file stream to add the file.

    FileSystem fileSystem = FileSystem.get(conf);
    
    // Check if the file already exists

    Path path = new Path("/path/to/file.ext");
    if (fileSystem.exists(path)) {
        System.out.println("File " + dest + " already exists");
        return;
    }

    // Create a new file and write data to it ("source" is the path of the
    // local file being uploaded).
    FSDataOutputStream out = fileSystem.create(path);
    InputStream in = new BufferedInputStream(new FileInputStream(
        new File(source)));


    byte[] b = new byte[1024];
    int numBytes = 0;
    while ((numBytes = in.read(b)) > 0) {
        out.write(b, 0, numBytes);
    }

    // Close all the file descriptors
    in.close();
    out.close();
    fileSystem.close();
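
    As an alternative to the manual copy loop above, Hadoop's org.apache.hadoop.io.IOUtils helper does the same job; with the last argument set to true it also closes both streams for you:

    // Copies in to out with a 4096-byte buffer and closes both streams.
    IOUtils.copyBytes(in, out, 4096, true);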

3. Reading a file from HDFS: Open a file stream to the file in HDFS and read it.

    FileSystem fileSystem = FileSystem.get(conf);

    Path path = new Path("/path/to/file.ext");
 
    if (!fileSystem.exists(path)) { 
        System.out.println("File does not exists"); 
        return; 
    }

    FSDataInputStream in = fileSystem.open(path);
 

    // Derive the local filename from the last component of the HDFS path
    String filename = path.getName();
 

    OutputStream out = new BufferedOutputStream(new FileOutputStream(
        new File(filename)));
 

    byte[] b = new byte[1024]; 
    int numBytes = 0; 
    while ((numBytes = in.read(b)) > 0) { 
        out.write(b, 0, numBytes); 
    } 

    in.close(); 
    out.close(); 
    fileSystem.close(); 

4. Deleting a file from HDFS: Check that the file exists in HDFS, then delete it.

    FileSystem fileSystem = FileSystem.get(conf); 

    Path path = new Path("/path/to/file.ext"); 
    if (!fileSystem.exists(path)) { 
        System.out.println("File does not exists"); 
        return; 
    }

    // Delete the file
    fileSystem.delete(path, true);
 

    fileSystem.close(); 

5. Creating a directory in HDFS: Check whether the directory already exists; if not, create it.

    FileSystem fileSystem = FileSystem.get(conf); 

    Path path = new Path("/path/to/dir");
    if (fileSystem.exists(path)) {
        System.out.println("Dir " + path + " already exists");
        return;
    }

    // Create directories
    fileSystem.mkdirs(path);
 

    fileSystem.close(); 

Code:

import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HDFSClient {
    public HDFSClient() {

    }

    public void addFile(String source, String dest) throws IOException {
        Configuration conf = new Configuration();

        // The conf object will read the HDFS configuration parameters from these
        // XML files.
        conf.addResource(new Path("/opt/hadoop-0.20.0/conf/core-site.xml"));
        conf.addResource(new Path("/opt/hadoop-0.20.0/conf/hdfs-site.xml"));

        FileSystem fileSystem = FileSystem.get(conf);

        // Get the filename out of the file path
        String filename = source.substring(source.lastIndexOf('/') + 1,
            source.length());

        // Create the destination path including the filename.
        if (dest.charAt(dest.length() - 1) != '/') {
            dest = dest + "/" + filename;
        } else {
            dest = dest + filename;
        }

        // System.out.println("Adding file to " + dest);

        // Check if the file already exists
        Path path = new Path(dest);
        if (fileSystem.exists(path)) {
            System.out.println("File " + dest + " already exists");
            return;
        }

        // Create a new file and write data to it.
        FSDataOutputStream out = fileSystem.create(path);
        InputStream in = new BufferedInputStream(new FileInputStream(
            new File(source)));

        byte[] b = new byte[1024];
        int numBytes = 0;
        while ((numBytes = in.read(b)) > 0) {
            out.write(b, 0, numBytes);
        }

        // Close all the file descriptors
        in.close();
        out.close();
        fileSystem.close();
    }

    public void readFile(String file) throws IOException {
        Configuration conf = new Configuration();
        conf.addResource(new Path("/opt/hadoop-0.20.0/conf/core-site.xml"));

        FileSystem fileSystem = FileSystem.get(conf);

        Path path = new Path(file);
        if (!fileSystem.exists(path)) {
            System.out.println("File " + file + " does not exist");
            return;
        }

        FSDataInputStream in = fileSystem.open(path);

        // Copy the HDFS file to the local disk under its own name
        String filename = file.substring(file.lastIndexOf('/') + 1,
            file.length());

        OutputStream out = new BufferedOutputStream(new FileOutputStream(
            new File(filename)));

        byte[] b = new byte[1024];
        int numBytes = 0;
        while ((numBytes = in.read(b)) > 0) {
            out.write(b, 0, numBytes);
        }

        in.close();
        out.close();
        fileSystem.close();
    }

    public void deleteFile(String file) throws IOException {
        Configuration conf = new Configuration();
        conf.addResource(new Path("/opt/hadoop-0.20.0/conf/core-site.xml"));

        FileSystem fileSystem = FileSystem.get(conf);

        Path path = new Path(file);
        if (!fileSystem.exists(path)) {
            System.out.println("File " + file + " does not exist");
            return;
        }

        fileSystem.delete(path, true);

        fileSystem.close();
    }

    public void mkdir(String dir) throws IOException {
        Configuration conf = new Configuration();
        conf.addResource(new Path("/opt/hadoop-0.20.0/conf/core-site.xml"));

        FileSystem fileSystem = FileSystem.get(conf);

        Path path = new Path(dir);
        if (fileSystem.exists(path)) {
            System.out.println("Dir " + dir + " already exists");
            return;
        }

        fileSystem.mkdirs(path);

        fileSystem.close();
    }
}