Installing iObjects for Spark on CentOS: trial license problem

0 votes

I am using SuperMap iObjects for Spark on a Spark cluster running CentOS. The current deployment status is as follows:

1. Hadoop and Spark are already installed and working normally.

2. I downloaded SuperMap iObjects for Spark 10.0.1 onto the system and copied bdt-all-runtime-10.0.1-SNAPSHOT.jar into the Spark directory for use when running Spark jobs.

3. I downloaded SuperMap iObjects for Java and installed the license with /usr/local/supermap-java/Support/aksusbd-2.2.1-i386/dinst. The install log contains one error, so I am not sure whether the installation succeeded. The log is as follows:

[root@web-1 aksusbd-2.2.1-i386]# ./dinst 
Copy AKSUSB daemon to /usr/sbin ...
Copy WINEHASP daemon to /usr/sbin ...
Copy HASPLMD daemon to /usr/sbin ...
Copy start-up script to /etc/init.d ...
Link HASP SRM runtime environment startup script to system startup folder
Killing already running daemons...
Stopping aksusbd (via systemctl):  Warning: aksusbd.service changed on disk. Run 'systemctl daemon-reload' to reload units.
                                                           [  OK  ]
Starting HASP SRM runtime environment ... 
Starting aksusbd (via systemctl):  Warning: aksusbd.service changed on disk. Run 'systemctl daemon-reload' to reload units.
                                                           [  OK  ]
copy so ...
install v2c ...
Input/Output error

press ENTER

Done
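One way to follow up on the "changed on disk" warning and the final "Input/Output error" is to reload the systemd units as the warning suggests and then check the daemon state directly. A sketch only: the service name is taken from the log above, and port 1947 is the standard Sentinel/HASP License Manager admin port (an assumption that this package uses the default port).

```shell
# Reload unit files as the systemd warning suggested, then restart
# and inspect the HASP daemon that dinst installed.
systemctl daemon-reload
systemctl restart aksusbd
systemctl status aksusbd --no-pager

# The Sentinel/HASP License Manager serves an admin page on port 1947;
# an HTTP 200 here suggests the license runtime is actually up.
curl -s -o /dev/null -w '%{http_code}\n' http://localhost:1947/
```

If the status check shows the daemon dead or the admin page does not respond, the "Input/Output error" during v2c installation likely matters; otherwise it may be harmless.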

4. I copied the Bin directory of SuperMap iObjects for Java to /opt/SuperMap/iobjects/1001:

[root@web-1 1001]# pwd
/opt/SuperMap/iobjects/1001
[root@web-1 1001]# ll
total 36
drwxr-xr-x 5 root root 32768 Sep 23 09:10 Bin
[root@web-1 1001]# 

5. Running com.supermap.license.jar from the SuperMap iObjects for Java Bin directory did not generate a c2v file, so I cannot proceed to the next step:

[root@web-1 Bin]# java -jar com.supermap.license.jar -create /root/dist.c2v
[root@web-1 Bin]# cd /root/
[root@web-1 ~]# ll
total 0
[root@web-1 ~]# 
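Since the jar exits silently above, capturing its exit code and stderr may reveal why no c2v file appears. A sketch using the paths from this post; the -Djava.library.path flag is an assumption, on the guess that the license jar also needs the native HASP libraries shipped in Bin.

```shell
cd /opt/SuperMap/iobjects/1001/Bin
# Point the JVM at the native libraries in Bin (assumption) and keep stderr.
java -Djava.library.path=. -jar com.supermap.license.jar -create /root/dist.c2v 2>license-error.log
echo "exit code: $?"
cat license-error.log
```

A non-zero exit code or anything in license-error.log would narrow down whether the failure is in the jar itself or in the HASP runtime underneath.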

6. The Spark log reports the following errors:

2020-09-24 09:21:51 INFO  License:311 - lichasp.connect return exception
java.lang.UnsatisfiedLinkError: Aladdin.Hasp.Login(JLjava/lang/String;[I)I
        at Aladdin.Hasp.Login(Native Method)
        at Aladdin.Hasp.login(Hasp.java:91)
        at com.supermap.LicenseHaspServiceImpl.internalLogin(LicenseHaspServiceImpl.java:203)
        at com.supermap.LicenseHaspServiceImpl.connect(LicenseHaspServiceImpl.java:135)
        at com.supermap.License.connect(License.java:309)
        at com.supermap.License.connect(License.java:276)
        at com.supermap.bdt.license.iObjectsLicense$$anonfun$checkSparkObjectsLicense$1.apply(iObjectsLicense.scala:26)
        at com.supermap.bdt.license.iObjectsLicense$$anonfun$checkSparkObjectsLicense$1.apply(iObjectsLicense.scala:25)
        at scala.collection.Iterator$class.foreach(Iterator.scala:893)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
        at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
        at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
        at com.supermap.bdt.license.iObjectsLicense$.checkSparkObjectsLicense(iObjectsLicense.scala:25)
        at com.supermap.bdt.license.iObjectsLicense$.checkCoreLicense(iObjectsLicense.scala:55)
        at com.supermap.bdt.license.BDTLicense$.checkCoreLicense(BDTLicense.scala:59)
        at com.supermap.bdt.rddprovider.gdb.GDBFeatureRDDProvider.rdd(GDBFeatureRDDProvider.scala:52)
        at com.dist.vector.overlayAnalysis.OverlayAnalysisDemo$.main(OverlayAnalysisDemo.scala:41)
        at com.dist.vector.overlayAnalysis.OverlayAnalysisDemo.main(OverlayAnalysisDemo.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:890)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:192)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:217)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Exception in thread "main" java.lang.IllegalStateException: SuperMap_License_Error_Cannot_Find_LicenseInstance
        at com.supermap.bdt.license.iObjectsLicense$.checkSparkObjectsLicense(iObjectsLicense.scala:31)
        at com.supermap.bdt.license.iObjectsLicense$.checkCoreLicense(iObjectsLicense.scala:55)
        at com.supermap.bdt.license.BDTLicense$.checkCoreLicense(BDTLicense.scala:59)
        at com.supermap.bdt.rddprovider.gdb.GDBFeatureRDDProvider.rdd(GDBFeatureRDDProvider.scala:52)
        at com.dist.vector.overlayAnalysis.OverlayAnalysisDemo$.main(OverlayAnalysisDemo.scala:41)
        at com.dist.vector.overlayAnalysis.OverlayAnalysisDemo.main(OverlayAnalysisDemo.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:890)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:192)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:217)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
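The UnsatisfiedLinkError on Aladdin.Hasp.Login usually means the JVM loaded the Aladdin.Hasp class but could not bind the native HASP library behind it. A common remedy is to put the iObjects Bin directory (which holds the .so files) on the native library path of both the driver and the executors at submit time, using Spark's standard extraLibraryPath settings. A sketch only: the runtime jar path and the application jar name are placeholders, not the actual paths on this cluster.

```shell
# Directory holding the SuperMap native libraries (from the post above).
BIN=/opt/SuperMap/iobjects/1001/Bin

spark-submit \
  --class com.dist.vector.overlayAnalysis.OverlayAnalysisDemo \
  --jars /path/to/bdt-all-runtime-10.0.1-SNAPSHOT.jar \
  --conf spark.driver.extraLibraryPath="$BIN" \
  --conf spark.executor.extraLibraryPath="$BIN" \
  your-app.jar
```

Note that on a cluster every worker node needs both the native libraries and a valid license, since license checks run inside the executors as well.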

Product: SuperMap iObjects for Spark. Operating system: CentOS 7.5.

Sep 24, 2020 | 232 views | user: zhaozm (15 points)

1 Answer

1 vote
The "INFO License:311 - lichasp.connect return exception" message is only INFO level; it is an exception thrown while checking the license driver service. If you are using a trial license, this message does not affect use.

First check whether the license status is valid, using the licensetool.sh script.
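The exact location of licensetool.sh varies between package versions, so searching for it is the quickest route. A sketch; the directories below are assumptions based on the paths mentioned in the question, so adjust them to your install locations.

```shell
# Look for the license tool under the install directories from this thread;
# both paths are guesses -- substitute your own if they differ.
find /opt/SuperMap /usr/local/supermap-java -name 'licensetool.sh' 2>/dev/null || true
```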
Sep 24, 2020 | user: 杨兵 (1,040 points)
Hello, where is this script? I could not find it. I am using a trial license.
...