Product: SuperMap iObjects for Spark 10.1.0
OS: CentOS 7.9
JDK: 1.8
Scala: 2.11.8
// HBaseCatalogParam / ZookeeperParam are the GeoMesa HBase data store parameter definitions
import scala.collection.JavaConverters._ // required for .asJava on Scala 2.11

// Output parameters: target catalog table, ZooKeeper quorum, and provider type
val outHbase = Map[String, java.io.Serializable](
  HBaseCatalogParam.getName -> tableName,
  ZookeeperParam.getName -> "slave1:2181,slave2:2181,slave3:2181",
  "providerType" -> "HBase"
).asJava

// Save the feature RDD into the HBase table through the matching provider
FeatureRDDProviderFactory(outHbase).save(resultRDD, outHbase, tableName)
When the save runs, the job fails with:

User class threw exception: java.lang.NoClassDefFoundError: Could not initialize class org.locationtech.geomesa.index.view.RoutedDataStoreViewFactory$
at org.locationtech.geomesa.index.view.RoutedDataStoreViewFactory.getDisplayName(RoutedDataStoreViewFactory.scala:86)
or
User class threw exception: org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.hadoop.hbase.DoNotRetryIOException: Class org.locationtech.geomesa.hbase.coprocessor.GeoMesaCoprocessor cannot be loaded Set hbase.table.sanity.checks to false at conf or table descriptor if you want to bypass sanity checks
I have already installed GeoMesa HBase and put geomesa-hbase-distributed-runtime-hbase2.jar, geomesa-hbase-spark.jar, and geomesa-hbase-spark-runtime-hbase2.jar on the classpath (a connectivity check against this setup is sketched below).
I have already set hbase.table.sanity.checks = false.
I have already read the programming guide and the API doc.
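To narrow down whether the failure is on the Spark (client) side or on the HBase side, a minimal connectivity check through the standard GeoTools DataStoreFinder lookup may help. This is only a sketch: it assumes the GeoMesa HBase data store jars and their GeoTools dependencies are visible on the driver classpath, and it reuses the same HBaseCatalogParam / ZookeeperParam definitions and tableName from the snippet above.

import scala.collection.JavaConverters._
import org.geotools.data.DataStoreFinder

// Same connection parameters as the save() call above
val params = Map[String, java.io.Serializable](
  HBaseCatalogParam.getName -> tableName,
  ZookeeperParam.getName -> "slave1:2181,slave2:2181,slave3:2181"
).asJava

// getDataStore returns null when no registered factory can handle the
// parameters, i.e. the GeoMesa HBase data store is not on this classpath
val ds = DataStoreFinder.getDataStore(params)
if (ds == null) {
  println("GeoMesa HBase data store factory not found on the classpath")
} else {
  println(s"Connected, existing types: ${ds.getTypeNames.mkString(", ")}")
  ds.dispose()
}

Note that the DoNotRetryIOException above is raised by HBase itself, so even if this check succeeds on the driver, the distributed-runtime jar may also need to be visible to the HBase region servers, not only to the Spark job.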