iObjects for Spark: missing dependency library.
2021-03-03 13:08:35 INFO  BlockManagerInfo:54 - Added broadcast_7_piece0 in memory on 192.168.1.218:4823 (size: 253.0 B, free: 2004.6 MB)
2021-03-03 13:08:36 INFO  BlockManagerInfo:54 - Added broadcast_7_piece0 in memory on 192.168.1.217:33173 (size: 253.0 B, free: 2004.6 MB)
2021-03-03 13:08:36 WARN  TaskSetManager:66 - Lost task 3.0 in stage 5.0 (TID 1503, 192.168.1.217, executor 0): java.lang.UnsatisfiedLinkError: com.supermap.data.GeoRegionNative.jni_New()J
        at com.supermap.data.GeoRegionNative.jni_New(Native Method)
        at com.supermap.data.GeoRegion.<init>(GeoRegion.java:28)
        at com.supermap.bdt.cpp.base.WrapJTS$.asSmGeoRegion(WrapJTS.scala:512)
        at com.supermap.bdt.cpp.base.WrapJTS$.asSmGeo(WrapJTS.scala:560)
        at com.supermap.bdt.cpp.base.WrapJTS$.asSmGeo(WrapJTS.scala:529)
        at com.supermap.bdt.cpp.base.WrapJTS$WrapGeometryAsSm.toSuperMap(WrapJTS.scala:157)
        at com.supermap.bdt.analyst.vector.cpp.algorithm.CalRegionRectRelationCpp$.apply(CalRegionRectRelationCpp.scala:14)
        at com.supermap.bdt.analyst.vector.cpp.package$.clipRegionWithRect(package.scala:252)
        at com.supermap.bdt.analyst.vector.cpp.OverlayImpl$$anonfun$overlayRegion$1$$anonfun$apply$3.apply(OverlayImpl.scala:46)
        at com.supermap.bdt.analyst.vector.cpp.OverlayImpl$$anonfun$overlayRegion$1$$anonfun$apply$3.apply(OverlayImpl.scala:45)
        at scala.collection.Iterator$class.foreach(Iterator.scala:893)
        at scala.collection.Iterator$JoinIterator.foreach(Iterator.scala:206)
        at com.supermap.bdt.analyst.vector.cpp.OverlayImpl$$anonfun$overlayRegion$1.apply(OverlayImpl.scala:45)
        at com.supermap.bdt.analyst.vector.cpp.OverlayImpl$$anonfun$overlayRegion$1.apply(OverlayImpl.scala:34)
        at org.apache.spark.rdd.ZippedPartitionsRDD2.compute(ZippedPartitionsRDD.scala:89)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
        at org.apache.spark.scheduler.Task.run(Task.scala:109)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)

I have configured the Bin directory under /opt/SuperMap/iobjects/1010, but the error above still occurs; the same error is also thrown when starting iServer.

iObjects for Spark version: 10.1.0

iServer version: 10.1.0

Bin version: 10.1.0

Linux OS: CentOS 7
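
For reference, this is roughly how I understand the native library path should be passed to Spark, so the executors (not just the driver machine) can resolve com.supermap.data.GeoRegionNative.jni_New. This is a minimal sketch only; I am assuming the Bin package lives at /opt/SuperMap/iobjects/1010/Bin on every node, and I have not confirmed these are the settings SuperMap recommends:

import org.apache.spark.{SparkConf, SparkContext}

// Sketch, not verified: point both the driver and executor JVMs at the
// iObjects native Bin directory so the JNI libraries can be loaded.
// Assumption: the same path exists on every worker node.
val binDir = "/opt/SuperMap/iobjects/1010/Bin"    // assumed location of the Bin package

val conf = new SparkConf()
  .setAppName("iobjects-overlay-test")            // hypothetical app name
  .set("spark.executor.extraLibraryPath", binDir) // library path for executor JVMs
  .set("spark.driver.extraLibraryPath", binDir)   // library path for the driver JVM
  .set("spark.executorEnv.LD_LIBRARY_PATH", binDir) // LD_LIBRARY_PATH for executor processes

val sc = new SparkContext(conf)

As far as I understand, the Bin directory would also need to be present at that same path on the worker nodes themselves (192.168.1.217 / 192.168.1.218 in the log above), since each executor loads the native library locally.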

...