
[Spark] hadoop.fs.default.name (default file system)

seunggabi 승가비 2020. 12. 13. 00:46
import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.sql.SparkSession;
// fs.default.name is the deprecated alias of fs.defaultFS; setting both points the Hadoop client at the local NameNode
SparkContext context = new SparkContext(new SparkConf()
        .setAppName("spark-ml")
        .setMaster("local[*]")
        .set("spark.hadoop.fs.default.name", "hdfs://localhost:54310")
        .set("spark.hadoop.fs.defaultFS", "hdfs://localhost:54310")
        .set("spark.hadoop.fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName())
        .set("spark.hadoop.fs.hdfs.server", org.apache.hadoop.hdfs.server.namenode.NameNode.class.getName())
        .set("spark.hadoop.conf", org.apache.hadoop.hdfs.HdfsConfiguration.class.getName()));
this.session = SparkSession.builder().sparkContext(context).getOrCreate();
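The same defaults can also be supplied outside the code, for example in spark-defaults.conf. A sketch; the NameNode address hdfs://localhost:54310 matches the example above and should be replaced with your cluster's:

```
# spark-defaults.conf (sketch) — equivalent of the .set(...) calls above
# fs.default.name is the deprecated alias of fs.defaultFS
spark.hadoop.fs.defaultFS      hdfs://localhost:54310
spark.hadoop.fs.default.name   hdfs://localhost:54310
```

Spark strips the spark.hadoop. prefix and forwards the rest to the underlying Hadoop Configuration, so anything valid in core-site.xml can be passed this way.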

https://stackoverflow.com/questions/56612266/how-to-read-files-from-hdfs-using-spark

 

How to read files from HDFS using Spark?

"I have built a recommendation system using Apache Spark with datasets stored locally in my project folder; now I need to access these files from HDFS. How can I read files from HDFS using Spark? ..."

stackoverflow.com
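With fs.defaultFS pointing at HDFS, reading a file reduces to an ordinary session.read() call. A minimal sketch, assuming a running NameNode at hdfs://localhost:54310 and a hypothetical file /data/ratings.csv (neither is from the original post):

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;

// Sketch: read a CSV from HDFS through the session configured above.
// /data/ratings.csv is a hypothetical path; replace it with a real file.
Dataset<Row> df = session.read()
        .option("header", "true")
        .csv("hdfs://localhost:54310/data/ratings.csv");
// Once fs.defaultFS is set, the scheme and authority can be dropped:
// session.read().csv("/data/ratings.csv");
df.show();
```

The fully qualified hdfs:// URI always works; the short form only resolves correctly after fs.defaultFS points at the cluster, which is exactly what the configuration above arranges.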

 
