About云-梭伦科技

Wyy_Ck's personal space https://www.aboutyun.com/?50800


Hadoop Learning 1: Operating HDFS with Java

1015 reads | 2017-6-9 22:24 | Category: BigData

1. Create a directory
package hdfs.operation;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class MakeDir {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://192.168.86.133:9000"); // master

        FileSystem fs = FileSystem.get(conf);
        Path path = new Path("/user/hadoop/");
        // mkdirs creates the directory and any missing parents (like mkdir -p)
        fs.mkdirs(path);
        fs.close();
        System.out.println("end");
    }
}

2. Create a file
fs.create(path); // returns an FSDataOutputStream; write to it and close it
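The single call above returns an FSDataOutputStream that should be written and closed, otherwise the file may be left empty. A complete sketch under the same assumptions as the directory example (the file path and sample content here are illustrative):

```java
package hdfs.operation;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CreateFile {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://192.168.86.133:9000"); // master

        FileSystem fs = FileSystem.get(conf);
        Path path = new Path("/user/hadoop/write.txt"); // illustrative path

        // create() opens an output stream to the new file;
        // closing it flushes the data to HDFS
        FSDataOutputStream out = fs.create(path);
        out.write("hello hdfs".getBytes("UTF-8"));
        out.close();
        fs.close();
        System.out.println("end");
    }
}
```

The same FileSystem API also works against the local filesystem (via FileSystem.getLocal), which is convenient for trying the calls without a running cluster.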

3. Read a file
package hdfs.operation;

import java.io.IOException;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class FileReadFromHdfs {

    public static void main(String[] args) {
        try {
            String dsf = "/user/hadoop/write.txt";
            Configuration conf = new Configuration();
            conf.set("fs.defaultFS", "hdfs://192.168.86.133:9000"); // master

            FileSystem fs = FileSystem.get(URI.create(dsf), conf);
            FSDataInputStream hdfsInStream = fs.open(new Path(dsf));

            // copy the file to stdout in 1 KB chunks until EOF (read returns -1)
            byte[] ioBuffer = new byte[1024];
            int readLen = hdfsInStream.read(ioBuffer);
            while (readLen != -1) {
                System.out.write(ioBuffer, 0, readLen);
                readLen = hdfsInStream.read(ioBuffer);
            }
            hdfsInStream.close();
            fs.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

And so on.
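The remaining operations follow the same pattern. A hedged sketch of a few more common FileSystem calls (exists, listStatus, delete); the cluster address matches the examples above and the paths are illustrative:

```java
package hdfs.operation;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class OtherOps {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://192.168.86.133:9000"); // master

        FileSystem fs = FileSystem.get(conf);
        Path dir = new Path("/user/hadoop/");

        // check existence before listing to avoid a FileNotFoundException
        if (fs.exists(dir)) {
            for (FileStatus status : fs.listStatus(dir)) {
                System.out.println(status.getPath() + "\t" + status.getLen());
            }
        }

        // delete with recursive=true removes non-empty directories
        // fs.delete(new Path("/user/hadoop/old"), true);

        fs.close();
    }
}
```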
