
I have an application that needs to access a DynamoDB table. Each worker establishes its own connection to the database. How can I make environment variables accessible from a Spark worker?

I have added both AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY to the spark-env.sh file on the master and on the workers. I also ran that file with sh to make sure the variables were exported.

When the code runs, I always get this error:

Caused by: com.amazonaws.AmazonClientException: Unable to load AWS credentials from any provider in the chain 
    at com.amazonaws.auth.AWSCredentialsProviderChain.getCredentials(AWSCredentialsProviderChain.java:131) 
    at com.amazonaws.http.AmazonHttpClient.getCredentialsFromContext(AmazonHttpClient.java:774) 
    at com.amazonaws.http.AmazonHttpClient.executeOneRequest(AmazonHttpClient.java:800) 
    at com.amazonaws.http.AmazonHttpClient.executeHelper(AmazonHttpClient.java:695) 
    at com.amazonaws.http.AmazonHttpClient.doExecute(AmazonHttpClient.java:447) 
    at com.amazonaws.http.AmazonHttpClient.executeWithTimer(AmazonHttpClient.java:409) 
    at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:358) 
    at com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient.doInvoke(AmazonDynamoDBClient.java:2051) 
    at com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient.invoke(AmazonDynamoDBClient.java:2021) 
    at com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient.describeTable(AmazonDynamoDBClient.java:1299) 
    at com.amazon.titan.diskstorage.dynamodb.DynamoDBDelegate.describeTable(DynamoDBDelegate.java:635) 
    ... 27 more 

It seems the AWS SDK fails to load the credentials even though they are exported. What kind of solution should I try?


Could you try setting it explicitly in code? http://stackoverflow.com/questions/33475931/spark-streaming-checkpoint-to-amazon-s3 – Knight71


@Knight71, I could, but wouldn't that be insecure? –
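
For reference, a minimal sketch of the explicit-credentials approach Knight71 suggests, assuming the AWS SDK v1 classes that appear in the stack trace (the key values here are placeholders and should come from a secure source rather than being hard-coded):

    import com.amazonaws.auth.BasicAWSCredentials
    import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient

    // Sketch: construct the DynamoDB client with explicit credentials instead
    // of relying on the default provider chain. The values are placeholders.
    val accessKey = sys.env.getOrElse("AWS_ACCESS_KEY_ID", "PLACEHOLDER_KEY")
    val secretKey = sys.env.getOrElse("AWS_SECRET_ACCESS_KEY", "PLACEHOLDER_SECRET")

    val credentials  = new BasicAWSCredentials(accessKey, secretKey)
    val dynamoClient = new AmazonDynamoDBClient(credentials)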

Answers


You can use the setExecutorEnv methods on SparkConf. E.g.:

    /**
     * Set an environment variable to be used when launching executors for this application.
     * These variables are stored as properties of the form spark.executorEnv.VAR_NAME
     * (for example spark.executorEnv.PATH) but this method makes them easier to set.
     */
    def setExecutorEnv(variable: String, value: String): SparkConf = {
      set("spark.executorEnv." + variable, value)
    }

And

    /**
     * Set multiple environment variables to be used when launching executors.
     * These variables are stored as properties of the form spark.executorEnv.VAR_NAME
     * (for example spark.executorEnv.PATH) but this method makes them easier to set.
     */
    def setExecutorEnv(variables: Seq[(String, String)]): SparkConf = {
      for ((k, v) <- variables) {
        setExecutorEnv(k, v)
      }
      this
    }
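
For example, a minimal sketch of wiring the AWS variables through when building the SparkConf, assuming the keys are available as environment variables on the driver (the application name is a placeholder):

    import org.apache.spark.SparkConf

    // Sketch: forward the driver's AWS environment variables to the executors.
    // Spark stores these as spark.executorEnv.* properties and exports them
    // into each executor's environment.
    val conf = new SparkConf()
      .setAppName("dynamodb-app")  // placeholder name
      .setExecutorEnv("AWS_ACCESS_KEY_ID", sys.env("AWS_ACCESS_KEY_ID"))
      .setExecutorEnv("AWS_SECRET_ACCESS_KEY", sys.env("AWS_SECRET_ACCESS_KEY"))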

You might also consider other options, such as setting Java system properties: SparkConf will pick them up automatically.
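
A rough sketch of that route, assuming the properties are set before the SparkConf is constructed (any Java system property starting with spark. is loaded by a SparkConf created with its default constructor):

    // Sketch: spark.* system properties are picked up when SparkConf is built
    // with loadDefaults = true (the default constructor). They could equally
    // be passed on the command line, e.g. via --driver-java-options.
    System.setProperty("spark.executorEnv.AWS_ACCESS_KEY_ID", sys.env("AWS_ACCESS_KEY_ID"))
    System.setProperty("spark.executorEnv.AWS_SECRET_ACCESS_KEY", sys.env("AWS_SECRET_ACCESS_KEY"))

    val conf = new SparkConf()  // loads the spark.* system properties set above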


And read the value inside the executor like this: val property_value = System.getenv("property_key") – Suresh
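
Along those lines, a small sketch of reading the forwarded variables inside a task running on an executor (the SparkContext setup and the placeholder RDD are only there to make the snippet self-contained; `conf` is the SparkConf built as shown above):

    import org.apache.spark.SparkContext

    // Sketch: values set via setExecutorEnv("AWS_ACCESS_KEY_ID", ...) appear as
    // ordinary environment variables inside each executor JVM.
    val sc  = new SparkContext(conf)
    val rdd = sc.parallelize(1 to 4)  // placeholder data
    rdd.foreachPartition { _ =>
      val accessKey = System.getenv("AWS_ACCESS_KEY_ID")
      val secretKey = System.getenv("AWS_SECRET_ACCESS_KEY")
      // ... build the per-worker DynamoDB connection here using these values ...
    }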