The environment variable is configured, yet the error persists: Caused by: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset.


Caused by: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset.

java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset.
	at org.apache.hadoop.util.Shell.checkHadoopHomeInner(Shell.java:467)
	at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:438)
	at org.apache.hadoop.util.Shell.<clinit>(Shell.java:515)
	at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:79)
	at org.apache.hadoop.conf.Configuration.getBoolean(Configuration.java:1712)
	at org.apache.hadoop.security.SecurityUtil.setConfigurationInternal(SecurityUtil.java:99)
	at org.apache.hadoop.security.SecurityUtil.<clinit>(SecurityUtil.java:88)
	at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:312)
	at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:300)
	at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:575)
	at org.apache.spark.util.Utils$.$anonfun$getCurrentUserName$1(Utils.scala:2561)
	at scala.Option.getOrElse(Option.scala:201)
	at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2561)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:316)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:157)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:170)
	at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:64)
	at com.example.demo.componet.spark.sparkConnect.<init>(sparkConnect.java:15)
	at com.example.demo.componet.spark.sparkConnect.main(sparkConnect.java:34)
18:11:09.346 [main] WARN org.apache.hadoop.util.Shell - Did not find winutils.exe: {}
java.io.FileNotFoundException: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset. -see https://wiki.apache.org/hadoop/WindowsProblems
	at org.apache.hadoop.util.Shell.fileNotFoundException(Shell.java:547)
	at org.apache.hadoop.util.Shell.getHadoopHomeDir(Shell.java:568)
	at org.apache.hadoop.util.Shell.getQualifiedBin(Shell.java:591)
	at org.apache.hadoop.util.Shell.<clinit>(Shell.java:688)
	at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:79)
	at org.apache.hadoop.conf.Configuration.getBoolean(Configuration.java:1712)
	at org.apache.hadoop.security.SecurityUtil.setConfigurationInternal(SecurityUtil.java:99)
	at org.apache.hadoop.security.SecurityUtil.<clinit>(SecurityUtil.java:88)
	at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:312)
	at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:300)
	at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:575)
	at org.apache.spark.util.Utils$.$anonfun$getCurrentUserName$1(Utils.scala:2561)
	at scala.Option.getOrElse(Option.scala:201)
	at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2561)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:316)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:157)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:170)
	at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:64)
	at com.example.demo.componet.spark.sparkConnect.<init>(sparkConnect.java:15)
	at com.example.demo.componet.spark.sparkConnect.main(sparkConnect.java:34)
Caused by: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset.
	at org.apache.hadoop.util.Shell.checkHadoopHomeInner(Shell.java:467)
	at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:438)
	at org.apache.hadoop.util.Shell.<clinit>(Shell.java:515)
	... 16 common frames omitted
18:11:09.348 [main] DEBUG org.apache.hadoop.util.Shell - Failed to find winutils.exe

Java calls Spark through the interfaces Spark exposes under org.apache.spark.api.java, but this only works smoothly in an ordinary project. For example, an earlier article, "Configuring Hadoop on Windows and connecting to local and remote Spark from IDEA", showed how to use Spark in a plain Maven project. When the same project is migrated to Spring Boot, however, it no longer works and keeps throwing the error in the title.

[org.apache.hadoop.util.Shell] - Failed to detect a valid hadoop home directory
java.io.IOException: HADOOP_HOME or hadoop.home.dir are not set.

The environment variable may already be configured, yet the path still cannot be found.
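A common cause is that the JVM simply does not see the variable: an IDE or terminal launched before the variable was added keeps its old environment until restarted. A minimal diagnostic sketch (the class name HadoopHomeCheck is mine, not from the article) to print what this JVM actually sees:

```java
// Diagnostic sketch: report which of the two lookup paths Hadoop's Shell
// class would find in this JVM. If it prints "neither", restart the IDE or
// terminal so it picks up the newly set environment variable.
public class HadoopHomeCheck {
    static String describe(String envValue, String propValue) {
        // Hadoop checks the hadoop.home.dir system property first,
        // then falls back to the HADOOP_HOME environment variable.
        if (propValue != null) return "hadoop.home.dir system property: " + propValue;
        if (envValue != null)  return "HADOOP_HOME environment variable: " + envValue;
        return "neither HADOOP_HOME nor hadoop.home.dir is visible to this JVM";
    }

    public static void main(String[] args) {
        System.out.println(describe(System.getenv("HADOOP_HOME"),
                                    System.getProperty("hadoop.home.dir")));
    }
}
```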


Set the path manually in code:

System.setProperty("hadoop.home.dir","D:\\SoftWares\\Apache\\spark-3.3.1-bin-hadoop3");
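A minimal sketch of where this line must go: Hadoop's Shell class reads hadoop.home.dir in a static initializer, so the property has to be set before any Spark or Hadoop class is loaded. The class name and install path below follow the article's example; adjust the path to your machine, and note the Spark lines are commented out because they need spark-core on the classpath.

```java
import java.io.File;

public class SparkConnect {
    public static void main(String[] args) {
        // Must run BEFORE the first reference to any Spark/Hadoop class:
        // Shell's static initializer reads the property exactly once.
        // Example install location from the article -- adjust to yours.
        System.setProperty("hadoop.home.dir",
                "D:\\SoftWares\\Apache\\spark-3.3.1-bin-hadoop3");

        // On Windows, Hadoop also expects %HADOOP_HOME%\bin\winutils.exe.
        File winutils = new File(System.getProperty("hadoop.home.dir"),
                "bin" + File.separator + "winutils.exe");
        if (!winutils.exists()) {
            System.err.println("winutils.exe not found at " + winutils
                    + " -- see https://wiki.apache.org/hadoop/WindowsProblems");
        }

        // Only now create the Spark context (requires spark-core):
        // SparkConf conf = new SparkConf().setAppName("demo").setMaster("local[*]");
        // JavaSparkContext sc = new JavaSparkContext(conf);
    }
}
```

In a Spring Boot application the same rule applies: set the property in a static initializer of the main class, before SpringApplication.run triggers any bean that touches Spark.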

Compiled by 极客之音; original link: https://www.bmabk.com/index.php/post/156199.html

Author: 飞熊bm
