Caused by: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset.
java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset.
at org.apache.hadoop.util.Shell.checkHadoopHomeInner(Shell.java:467)
at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:438)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:515)
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:79)
at org.apache.hadoop.conf.Configuration.getBoolean(Configuration.java:1712)
at org.apache.hadoop.security.SecurityUtil.setConfigurationInternal(SecurityUtil.java:99)
at org.apache.hadoop.security.SecurityUtil.<clinit>(SecurityUtil.java:88)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:312)
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:300)
at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:575)
at org.apache.spark.util.Utils$.$anonfun$getCurrentUserName$1(Utils.scala:2561)
at scala.Option.getOrElse(Option.scala:201)
at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2561)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:316)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:157)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:170)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:64)
at com.example.demo.componet.spark.sparkConnect.<init>(sparkConnect.java:15)
at com.example.demo.componet.spark.sparkConnect.main(sparkConnect.java:34)
18:11:09.346 [main] WARN org.apache.hadoop.util.Shell - Did not find winutils.exe: {}
java.io.FileNotFoundException: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset. -see https://wiki.apache.org/hadoop/WindowsProblems
at org.apache.hadoop.util.Shell.fileNotFoundException(Shell.java:547)
at org.apache.hadoop.util.Shell.getHadoopHomeDir(Shell.java:568)
at org.apache.hadoop.util.Shell.getQualifiedBin(Shell.java:591)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:688)
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:79)
at org.apache.hadoop.conf.Configuration.getBoolean(Configuration.java:1712)
at org.apache.hadoop.security.SecurityUtil.setConfigurationInternal(SecurityUtil.java:99)
at org.apache.hadoop.security.SecurityUtil.<clinit>(SecurityUtil.java:88)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:312)
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:300)
at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:575)
at org.apache.spark.util.Utils$.$anonfun$getCurrentUserName$1(Utils.scala:2561)
at scala.Option.getOrElse(Option.scala:201)
at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2561)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:316)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:157)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:170)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:64)
at com.example.demo.componet.spark.sparkConnect.<init>(sparkConnect.java:15)
at com.example.demo.componet.spark.sparkConnect.main(sparkConnect.java:34)
Caused by: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset.
at org.apache.hadoop.util.Shell.checkHadoopHomeInner(Shell.java:467)
at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:438)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:515)
... 16 common frames omitted
18:11:09.348 [main] DEBUG org.apache.hadoop.util.Shell - Failed to find winutils.exe
Java can drive Spark through the org.apache.spark.api.java interfaces that Spark provides, but this only works in an ordinary project. For example, the earlier article "Configuring Hadoop on Windows and connecting IDEA to local and remote Spark" shows how to use Spark in a plain Maven project. When the same project is ported to Spring Boot, however, it stops working and keeps reporting the error from the title.
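For reference, here is a minimal sketch of the kind of plain-Maven-project usage described above; the class name, app name and master URL are illustrative and not taken from the original project:

import java.util.Arrays;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class SparkConnectExample {
    public static void main(String[] args) {
        // Illustrative configuration: run against a local in-process master.
        SparkConf conf = new SparkConf()
                .setAppName("spark-connect-demo")
                .setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Simple smoke test: count a small local collection.
        long count = sc.parallelize(Arrays.asList(1, 2, 3, 4)).count();
        System.out.println("count = " + count);

        sc.close();
    }
}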
[org.apache.hadoop.util.Shell] - Failed to detect a valid hadoop home directory
java.io.IOException: HADOOP_HOME or hadoop.home.dir are not set.
Even with the environment variable already configured, the path still cannot be found.
The workaround is to set the path manually in code:
System.setProperty("hadoop.home.dir","D:\\SoftWares\\Apache\\spark-3.3.1-bin-hadoop3");