Running a Spark program in IDEA fails with: The root scratch dir: /tmp/hive on HDFS should be writable

Symptom

On a personal Windows 10 machine, running a Spark program (with Hive support enabled) from IDEA, with code similar to:

 import java.io.File;
 import org.apache.spark.sql.SparkSession;

 SparkSession spark = SparkSession.builder()
         .appName(APP_NAME)
         .config("spark.master", MASTER)
         .config("spark.sql.warehouse.dir", new File(WAREHOUSE_LOC).getAbsolutePath())
         .enableHiveSupport()
         .getOrCreate();

The following error is thrown:

java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: r--r--r--;
org.apache.spark.sql.AnalysisException: java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: r--r--r--;
	at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:106)
	at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:214)
	at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:116)
	at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:104)
	at org.apache.spark.sql.hive.HiveSessionStateBuilder.org$apache$spark$sql$hive$HiveSessionStateBuilder$$externalCatalog(HiveSessionStateBuilder.scala:39)

Analysis

The error log shows Hive's startup check on its scratch directory failing: /tmp/hive currently has read-only permissions (r--r--r--), so the problem is directory permissions.
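To see what this check amounts to, here is a minimal sketch (not Hive's actual code) of the same "directory must exist and be writable" test, using `java.nio.file`; the class name and helper are hypothetical:

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class ScratchDirCheck {
    // Hypothetical helper mirroring the check Hive performs at startup:
    // the scratch dir must exist as a directory and be writable.
    static boolean isWritableDir(String dir) {
        Path p = Paths.get(dir);
        return Files.isDirectory(p) && Files.isWritable(p);
    }

    public static void main(String[] args) {
        // Demo against the JVM temp dir; when diagnosing the real failure,
        // point this at the scratch dir from the error (e.g. "H:\\tmp\\hive").
        String scratch = System.getProperty("java.io.tmpdir");
        if (!isWritableDir(scratch)) {
            throw new IllegalStateException(
                "The root scratch dir: " + scratch + " should be writable");
        }
        System.out.println(scratch + " is writable");
    }
}
```

If this check fails for your scratch directory, fixing its permissions (as below) is the remedy.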

Solution:

Grant 777 permissions on /tmp/hive with winutils (on Windows the relative path /tmp/hive resolves against the drive the program runs from, H: in this case):

%HADOOP_HOME%\bin\winutils.exe chmod 777 H:\tmp\hive

If the HADOOP_HOME environment variable is not configured yet, set it up first:

  • Download the hadoop winutils bundle hadoop.dll-and-winutils.exe-for-hadoop2.7.3-on-windows_X64-master.zip
  • Extract it to the root of the D drive
  • Set the environment variable HADOOP_HOME = D:\hadoop-lib (the directory that contains bin\winutils.exe)
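As an alternative to a system-wide environment variable (handy for a one-off run from IDEA), Hadoop also honors the `hadoop.home.dir` JVM system property, which it consults before HADOOP_HOME. A minimal sketch, assuming the same D:\hadoop-lib location as in the steps above:

```java
public class HadoopHomeSetup {
    // "D:\\hadoop-lib" matches the extract location above; adjust to yours.
    static void configure() {
        System.setProperty("hadoop.home.dir", "D:\\hadoop-lib");
    }

    public static void main(String[] args) {
        // Must run before the first Spark/Hadoop call in this JVM,
        // i.e. before SparkSession.builder()...getOrCreate().
        configure();
        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```

Call `configure()` at the very top of your driver's main method so the property is in place when Hadoop's shell utilities initialize.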