
Option 'schema' not specified when setting up WSO2 AM 1.10.x with DAS 3.1.0

I am trying to set up WSO2 API Manager 1.10.0 with DAS 3.1.0. DAS will use MySQL 5.7.18. I ran mysql5.7.sql from the DAS package to create the database schema in MySQL. I also downloaded mysql-connector-java-5.1.35-bin.jar and copied it into the repository\components\lib directory.
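Concretely, the database preparation step amounted to something like the following sketch; the database name and the script path are placeholders for my actual setup:

    -- Create the database and load the schema script shipped with the DAS package.
    -- 'wso2_das_db' and the path below are placeholders, not the real names.
    CREATE DATABASE IF NOT EXISTS wso2_das_db;
    USE wso2_das_db;
    SOURCE /path/to/das/dbscripts/mysql5.7.sql;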

I enabled Configure Analytics in the API Manager and saved the configuration successfully. I can see that the API Manager talks to DAS without any problem.

But in the DAS carbon log I see exceptions like this:

TID: [-1234] [] [2017-05-26 15:30:00,368] ERROR {org.wso2.carbon.analytics.spark.core.AnalyticsTask} - Error while executing the scheduled task for the script: APIM_STAT_SCRIPT {org.wso2.carbon.analytics.spark.core.AnalyticsTask} 
org.wso2.carbon.analytics.spark.core.exception.AnalyticsExecutionException: Exception in executing query create temporary table APIRequestSummaryData using CarbonJDBC options (dataSource "WSO2AM_STATS_DB", tableName "API_REQUEST_SUMMARY") 
    at org.wso2.carbon.analytics.spark.core.internal.SparkAnalyticsExecutor.executeQueryLocal(SparkAnalyticsExecutor.java:764) 
    at org.wso2.carbon.analytics.spark.core.internal.SparkAnalyticsExecutor.executeQuery(SparkAnalyticsExecutor.java:721) 
    at org.wso2.carbon.analytics.spark.core.CarbonAnalyticsProcessorService.executeQuery(CarbonAnalyticsProcessorService.java:201) 
    at org.wso2.carbon.analytics.spark.core.CarbonAnalyticsProcessorService.executeScript(CarbonAnalyticsProcessorService.java:151) 
    at org.wso2.carbon.analytics.spark.core.AnalyticsTask.execute(AnalyticsTask.java:60) 
    at org.wso2.carbon.ntask.core.impl.TaskQuartzJobAdapter.execute(TaskQuartzJobAdapter.java:67) 
    at org.quartz.core.JobRunShell.run(JobRunShell.java:213) 
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) 
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) 
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) 
    at java.lang.Thread.run(Thread.java:745) 
Caused by: java.lang.RuntimeException: Option 'schema' not specified 
    at scala.sys.package$.error(package.scala:27) 
    at org.apache.spark.sql.jdbc.carbon.AnalyticsJDBCRelationProvider$$anonfun$3.apply(JDBCRelation.scala:113) 
    at org.apache.spark.sql.jdbc.carbon.AnalyticsJDBCRelationProvider$$anonfun$3.apply(JDBCRelation.scala:113) 
    at scala.collection.MapLike$class.getOrElse(MapLike.scala:128) 
    at org.apache.spark.sql.execution.datasources.CaseInsensitiveMap.getOrElse(ddl.scala:150) 
    at org.apache.spark.sql.jdbc.carbon.AnalyticsJDBCRelationProvider.createRelation(JDBCRelation.scala:113) 
    at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:158) 
    at org.apache.spark.sql.execution.datasources.CreateTempTableUsing.run(ddl.scala:92) 
    at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:58) 
    at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:56) 
    at org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:70) 
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132) 
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130) 
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150) 
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130) 
    at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:55) 
    at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:55) 
    at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:145) 
    at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:130) 
    at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:52) 
    at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:817) 
    at org.wso2.carbon.analytics.spark.core.internal.SparkAnalyticsExecutor.executeQueryLocal(SparkAnalyticsExecutor.java:760) 

How can I resolve this? Thank you.

Answers

API Manager 1.10 and DAS 3.1.0 are not compatible with each other. It will not work unless you customize the databases and the CApps.

You can use DAS 3.1.0 with API Manager 2.1, and DAS 3.0.x with API Manager 1.10.
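For reference, the error itself comes from the fact that the CarbonJDBC relation provider in DAS 3.1.0 expects a schema option in the Spark table definition, which the Spark scripts bundled with the API Manager 1.10 analytics CApp do not pass. A DAS 3.1.0-style definition would look roughly like the sketch below; the column list is only illustrative, not the real statistics schema:

    -- In DAS 3.1.0, CarbonJDBC rejects the definition unless 'schema' is given,
    -- which is exactly the "Option 'schema' not specified" error above.
    create temporary table APIRequestSummaryData using CarbonJDBC options (
        dataSource "WSO2AM_STATS_DB",
        tableName "API_REQUEST_SUMMARY",
        schema "api STRING, api_version STRING, userId STRING, context STRING, total_request_count INT, time STRING"
    );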

Thanks. What about API Manager 1.10 with DAS 3.0.x? Do they work together? – laomao

@Laomao: Yes, you can use that combination. –

Hi Abimaran, I have now switched to DAS 3.0.1, but I still get exactly the same error. I am using mysql-connector-java-5.1.39-bin.jar as the connector. – laomao

It turned out that I needed to import the correct schema declaration scripts, from the dbscripts/stat/sql folder, into the DAS statistics database I had set up here.
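In other words, the summary tables that the Spark script maps (such as API_REQUEST_SUMMARY) have to exist in the statistics database before the scheduled script runs. Loading them from the mysql client looked roughly like the sketch below; the database name and the script path are placeholders for my environment:

    -- Load the API Manager statistics schema into the database that the
    -- WSO2AM_STATS_DB datasource points to. 'apim_stats_db' and the path are placeholders.
    CREATE DATABASE IF NOT EXISTS apim_stats_db;
    USE apim_stats_db;
    SOURCE /path/to/dbscripts/stat/sql/mysql.sql;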