hdfs dfs -ls hdfs://bigdata02:9000/
This can be shortened to hdfs dfs -ls /, because fs.defaultFS already points at hdfs://bigdata02:9000.

Files can be uploaded to HDFS with the put command.
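For example (the local file name and HDFS target directory are only illustrative):
hdfs dfs -put user.txt /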



When running locally from IDEA, the program may throw the exception: HADOOP_HOME and hadoop.home.dir are unset.
Solution: Java运行Hadoop发生异常:HADOOP_HOME and hadoop.home.dir are unset_凌冰_的博客-CSDN博客
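The usual fix on Windows (a minimal sketch of the common approach, not taken from the linked article) is to download a winutils.exe matching the Hadoop version and point hadoop.home.dir, or the HADOOP_HOME environment variable, at the directory whose bin folder contains it, before any Hadoop class is used:

```java
public class HadoopHomeFix {
    public static void main(String[] args) {
        // Assumption: D:\hadoop is a local directory whose bin subfolder holds
        // winutils.exe for the Hadoop version in use (the path is illustrative).
        System.setProperty("hadoop.home.dir", "D:\\hadoop");
        // ...then build the Configuration / FileSystem exactly as in the examples below.
    }
}
```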
```java
import java.io.FileInputStream;
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class HdfsPutDemo { // class name is arbitrary; only the two methods matter
    public static void main(String[] args) {
        try {
            Configuration conf = new Configuration();
            // Point the client at the NameNode
            conf.set("fs.defaultFS", "hdfs://192.168.221.131:9000");
            FileSystem fileSystem = FileSystem.get(conf);
            put(fileSystem);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    // File upload: stream a local file into HDFS
    private static void put(FileSystem fileSystem) throws IOException {
        FSDataOutputStream fos = fileSystem.create(new Path("/user.txt"));
        FileInputStream fis = new FileInputStream("D:\\bigdata\\user.txt");
        // Copy with a 1 KB buffer; the final 'true' closes both streams
        IOUtils.copyBytes(fis, fos, 1024, true);
    }
}
```
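To confirm the upload, list the HDFS root again with hdfs dfs -ls /; the new /user.txt entry should appear.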
Run result:

```java
import java.io.FileOutputStream;
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class HdfsGetDemo { // class name is arbitrary; only the two methods matter
    public static void main(String[] args) {
        try {
            Configuration conf = new Configuration();
            conf.set("fs.defaultFS", "hdfs://192.168.221.131:9000");
            FileSystem fileSystem = FileSystem.get(conf);
            get(fileSystem);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    // File download: stream an HDFS file to the local disk
    private static void get(FileSystem fileSystem) throws IOException {
        FSDataInputStream fis = fileSystem.open(new Path("/user.txt"));
        FileOutputStream fos = new FileOutputStream("D:\\bigdata\\user1.txt");
        // Copy with a 1 KB buffer; the final 'true' closes both streams
        IOUtils.copyBytes(fis, fos, 1024, true);
    }
}
```
Run result:

The issue was ultimately resolved by following this article: Java运行Hadoop发生异常:HADOOP_HOME and hadoop.home.dir are unset_凌冰_的博客-CSDN博客
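One small note on both examples: the FileSystem instance is never explicitly released; calling fileSystem.close() (or using try-with-resources) once the transfer is done is the tidier pattern.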