
How do I fix being unable to connect to the Hive Metastore from IntelliJ on my local machine?


I am asking this question after going through all the existing answers I could find. I am trying to create a view from a DataFrame: I can load the DataFrame, but creating a view on top of it fails because I cannot connect to the local metastore from the latest version of IntelliJ. I am using Hadoop 3.0.0. My POM is below.

<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.12.12</version>
</dependency>

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>3.0.1</version>
</dependency>

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.12</artifactId>
    <version>3.0.1</version>
</dependency>
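
One detail worth noting about the POM as shown: it does not list Spark's Hive module, which enableHiveSupport() needs on the classpath (without it the builder fails fast with "Unable to instantiate SparkSession with Hive support because Hive classes are not found"). Since the log below shows Hive classes loading, the module is presumably being pulled in transitively or from elsewhere, but for completeness the explicit dependency would look like this (version assumed to match the other Spark artifacts):

<!-- Assumed addition, not in the original POM: Hive integration for Spark SQL. -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_2.12</artifactId>
    <version>3.0.1</version>
</dependency>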

My Spark configuration looks like this:

def getSparkSession(): SparkSession = {
  var session: SparkSession = null
  var flag: Boolean = false
  try {
    System.setProperty("HADOOP_HOME", constants.HADOOP_HOME)
    System.setProperty("hadoop.home.dir", constants.HADOOP_HOME_DIR)
    session = SparkSession
      .builder()
      .appName("Validation").master("local[1]")
      .config("hive.metastore.uris", "thrift://localhost:9083")
      .config("fs.s3.impl", constants.S3_IMPL)
      .config("fs.s3.awsAccessKeyId", prop.keys.get(constants.S3_FS_ACCESS_KEY))
      .config("fs.s3.awsSecretAccessKey", prop.keys.get(constants.S3_FS_SECRET_KEY))
      .enableHiveSupport()
      .getOrCreate()

    println("-:Spark session created successfully:-" + session)
    flag = true
  } catch {
    case t: Exception =>
      // printStackTrace() returns Unit and cannot be passed to the logger,
      // so the throwable is logged directly instead.
      log.error("Failed to create Spark session", t)
  }
  session
}
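
Setting hive.metastore.uris tells Spark to talk to a standalone metastore service over Thrift rather than to an embedded one, so something must actually be listening on localhost:9083 before the session is built. A minimal sketch of a pre-flight check (a hypothetical helper, not part of the original code):

import java.net.{InetSocketAddress, Socket}

// Returns true if a TCP listener answers on the metastore host/port.
// Host and port mirror the hive.metastore.uris value used above.
def metastoreReachable(host: String = "localhost", port: Int = 9083): Boolean = {
  val socket = new Socket()
  try {
    socket.connect(new InetSocketAddress(host, port), 2000) // 2-second timeout
    true
  } catch {
    case _: java.io.IOException => false
  } finally {
    socket.close()
  }
}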

Below is the code that reads the DataFrame and creates the view:

val jdbcDF = session.read
  .format("jdbc")
  .option("url", "jdbc:postgresql://hostname/dbname?user=xxxxxxx&password=xxxxxxx")
  .option("dbtable", "offer")
  .option("driver", "org.postgresql.Driver")
  .load().cache()
// show() returns Unit, so it is called directly rather than wrapped in println.
jdbcDF.show(20, false)
jdbcDF.createOrReplaceGlobalTempView("abc")
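
Two things about that last line are easy to trip over. First, a global temp view is registered in the reserved global_temp database, so it is queried as global_temp.abc. Second, the trace below shows that creating one goes through SharedState.globalTempViewManager, which initializes the Hive external catalog and therefore needs the metastore; a session-scoped temp view does not. A hedged sketch of both (view names are illustrative):

// Querying the global temp view: it lives in the reserved global_temp database
// and is shared across SparkSessions within the same application.
session.sql("SELECT * FROM global_temp.abc").show(20, false)

// If cross-session sharing is not needed, a session-scoped temp view avoids
// initializing the global temp view manager (and hence the Hive catalog).
jdbcDF.createOrReplaceTempView("abc_local")
session.sql("SELECT * FROM abc_local").show(20, false)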

Please find the log below:

21/03/26 15:05:27 WARN metastore: Failed to connect to the MetaStore Server...
21/03/26 15:05:27 INFO metastore: Waiting 1 seconds before next connection attempt.
21/03/26 15:05:27 WARN ProcfsMetricsGetter: Exception when trying to compute pagesize, as a result reporting of ProcessTree metrics is stopped
21/03/26 15:05:28 WARN Hive: Failed to register all functions.
java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1709)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:83)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:133)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3600)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3652)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3632)
    at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3894)
    at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:248)
    at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:231)
    at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:388)
    at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:332)
    at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:312)
    at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:288)
    at org.apache.spark.sql.hive.client.HiveClientImpl.client(HiveClientImpl.scala:260)
    at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$withHiveState$1(HiveClientImpl.scala:286)
    at org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:227)
    at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:226)
    at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:276)
    at org.apache.spark.sql.hive.client.HiveClientImpl.databaseExists(HiveClientImpl.scala:389)
    at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$databaseExists$1(HiveExternalCatalog.scala:225)
    at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
    at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:103)
    at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:225)
    at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:137)
    at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:127)
    at org.apache.spark.sql.internal.SharedState.globalTempViewManager$lzycompute(SharedState.scala:157)
    at org.apache.spark.sql.internal.SharedState.globalTempViewManager(SharedState.scala:155)
    at org.apache.spark.sql.hive.HiveSessionStateBuilder.$anonfun$catalog$2(HiveSessionStateBuilder.scala:60)
    at org.apache.spark.sql.catalyst.catalog.SessionCatalog.globalTempViewManager$lzycompute(SessionCatalog.scala:93)
    at org.apache.spark.sql.catalyst.catalog.SessionCatalog.globalTempViewManager(SessionCatalog.scala:93)
    at org.apache.spark.sql.catalyst.catalog.SessionCatalog.getGlobalTempView(SessionCatalog.scala:587)
    at org.apache.spark.sql.execution.command.CreateViewCommand.run(views.scala:120)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:79)
    at org.apache.spark.sql.Dataset.$anonfun$logicalPlan$1(Dataset.scala:229)
    at org.apache.spark.sql.Dataset.$anonfun$withAction$1(Dataset.scala:3618)
    at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:100)
    at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:160)
    at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:87)
    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:764)
    at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
    at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3616)
    at org.apache.spark.sql.Dataset.<init>(Dataset.scala:229)
    at org.apache.spark.sql.Dataset$.$anonfun$ofRows$1(Dataset.scala:92)
    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:764)
    at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:89)
    at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$withPlan(Dataset.scala:3646)
    at org.apache.spark.sql.Dataset.createOrReplaceGlobalTempView(Dataset.scala:3294)
    at com.amp.eolas.aim.validation.checks.DataQualityChecksOperations$.createTempTables(DataQualityCheckOperations.scala:262)
    at com.amp.eolas.aim.validation.checks.DataQualityChecksOperations$.$anonfun$allDqAndTruncateReloadFunctions$1(DataQualityCheckOperations.scala:303)
    at com.amp.eolas.aim.validation.checks.DataQualityChecksOperations$.$anonfun$allDqAndTruncateReloadFunctions$1$adapted(DataQualityCheckOperations.scala:300)
    at scala.collection.Iterator.foreach(Iterator.scala:943)
    at scala.collection.Iterator.foreach$(Iterator.scala:943)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
    at scala.collection.IterableLike.foreach(IterableLike.scala:74)
    at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
    at com.amp.eolas.aim.validation.checks.DataQualityChecksOperations$.allDqAndTruncateReloadFunctions(DataQualityCheckOperations.scala:300)
    at com.amp.eolas.aim.validation.jobs.Jobs$.main(Jobs.scala:10)
    at com.amp.eolas.aim.validation.jobs.Jobs.main(Jobs.scala)
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1707)
    ... 61 more
Caused by: MetaException(message:Could not connect to meta store using any of the URIs provided. Most recent failure: org.apache.thrift.transport.TTransportException: java.net.ConnectException: Connection refused: connect
    at org.apache.thrift.transport.TSocket.open(TSocket.java:226)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:480)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:247)
    at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:70)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1707)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:83)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:133)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3600)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3652)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3632)
    at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3894)
    at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:248)
    at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:231)
    at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:388)
    at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:332)
    at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:312)
    at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:288)
    at org.apache.spark.sql.hive.client.HiveClientImpl.client(HiveClientImpl.scala:260)
    at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$withHiveState$1(HiveClientImpl.scala:286)
    at org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:227)
    at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:226)
    at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:276)
    at org.apache.spark.sql.hive.client.HiveClientImpl.databaseExists(HiveClientImpl.scala:389)
    at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$databaseExists$1(HiveExternalCatalog.scala:225)
    at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
    at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:103)
    at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:225)
    at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:137)
    at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:127)
    at org.apache.spark.sql.internal.SharedState.globalTempViewManager$lzycompute(SharedState.scala:157)
    at org.apache.spark.sql.internal.SharedState.globalTempViewManager(SharedState.scala:155)
    at org.apache.spark.sql.hive.HiveSessionStateBuilder.$anonfun$catalog$2(HiveSessionStateBuilder.scala:60)
    at org.apache.spark.sql.catalyst.catalog.SessionCatalog.globalTempViewManager$lzycompute(SessionCatalog.scala:93)
    at org.apache.spark.sql.catalyst.catalog.SessionCatalog.globalTempViewManager(SessionCatalog.scala:93)
    at org.apache.spark.sql.catalyst.catalog.SessionCatalog.getGlobalTempView(SessionCatalog.scala:587)
    at org.apache.spark.sql.execution.command.CreateViewCommand.run(views.scala:120)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:79)
    at org.apache.spark.sql.Dataset.$anonfun$logicalPlan$1(Dataset.scala:229)
    at org.apache.spark.sql.Dataset.$anonfun$withAction$1(Dataset.scala:3618)
    at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:100)
    at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:160)
    at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:87)
    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:764)
    at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
    at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3616)
    at org.apache.spark.sql.Dataset.<init>(Dataset.scala:229)
    at org.apache.spark.sql.Dataset$.$anonfun$ofRows$1(Dataset.scala:92)
    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:764)
    at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:89)
    at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$withPlan(Dataset.scala:3646)
    at org.apache.spark.sql.Dataset.createOrReplaceGlobalTempView(Dataset.scala:3294)
    at com.amp.eolas.aim.validation.checks.DataQualityChecksOperations$.createTempTables(DataQualityCheckOperations.scala:262)
    at com.amp.eolas.aim.validation.checks.DataQualityChecksOperations$.$anonfun$allDqAndTruncateReloadFunctions$1(DataQualityCheckOperations.scala:303)
    at com.amp.eolas.aim.validation.checks.DataQualityChecksOperations$.$anonfun$allDqAndTruncateReloadFunctions$1$adapted(DataQualityCheckOperations.scala:300)
    at scala.collection.Iterator.foreach(Iterator.scala:943)
    at scala.collection.Iterator.foreach$(Iterator.scala:943)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
    at scala.collection.IterableLike.foreach(IterableLike.scala:74)
    at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
    at com.amp.eolas.aim.validation.checks.DataQualityChecksOperations$.allDqAndTruncateReloadFunctions(DataQualityCheckOperations.scala:300)
    at com.amp.eolas.aim.validation.jobs.Jobs$.main(Jobs.scala:10)
    at com.amp.eolas.aim.validation.jobs.Jobs.main(Jobs.scala)
Caused by: java.net.ConnectException: Connection refused: connect
    at java.net.DualStackPlainSocketImpl.waitForConnect(Native Method)
    at java.net.DualStackPlainSocketImpl.socketConnect(DualStackPlainSocketImpl.java:81)
    at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:476)
    at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:218)
    at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:200)
    at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:162)
    at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:394)
    at java.net.Socket.connect(Socket.java:606)
    at org.apache.thrift.transport.TSocket.open(TSocket.java:221)
    ... 69 more
)

Any help is appreciated.

Solution

No working fix had been recorded for this question at the time it was collected.
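
That said, the trace itself narrows things down: the root cause is java.net.ConnectException: Connection refused when opening the Thrift socket to localhost:9083, which simply means nothing is listening on that port. Two common remedies are to start a standalone metastore service on the machine that should host it (for example with hive --service metastore), or, for purely local development, to drop hive.metastore.uris entirely so Spark falls back to an embedded Derby-backed metastore. A minimal sketch of the latter, with an assumed warehouse path:

import org.apache.spark.sql.SparkSession

// Sketch only: local-only SparkSession using the embedded metastore.
// Without hive.metastore.uris, Hive support creates a Derby metastore in the
// configured warehouse/working directory (the path below is an assumption).
val localSession = SparkSession
  .builder()
  .appName("Validation")
  .master("local[1]")
  .config("spark.sql.warehouse.dir", "file:///tmp/spark-warehouse")
  .enableHiveSupport()
  .getOrCreate()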
