The curl command below works perfectly for invoking my "jobified" Spark program, passing the arguments to it via an HTTP POST to the Spark Job Server:
curl 'http://someserver:8090/jobs?appName=secondtest&classPath=Works.epJob&context=hiveContext' -d "inputparms=/somepath1 /somepath2"
Here is the Spark program:
override def runJob(hive: HiveContext, config: Config): Any = {
  var inputParms = config.getString("inputparms").split(" ") // comes from node
  var path1 = inputParms.apply(0)
  var path2 = inputParms.apply(1)
Instead of the curl command, I need to do the HTTP POST in Node.js. Here is what I have:
var postData = JSON.stringify({
"inputparms": paths
})
var options = {
hostname: 'someserver',
port: 8090,
path: '/jobs?appName=secondtest&classPath=Works.epJob&context=hiveContext',
method: 'POST',
headers: {
'Content-Type': 'application/json',
'Content-Length': Buffer.byteLength(postData , 'utf8')
}
};
http.request(options, function(response) {...
However, the above script does not work. Am I missing something? Thanks!
Edit 1:
var myreq = http.request(options, function(response) { ...})
myreq.write(postData);
myreq.end();
I get a parse error:
Error: Parse Error
at Error (native)
at Socket.socketOnData (_http_client.js:361:20)
at emitOne (events.js:96:13)
at Socket.emit (events.js:188:7)
at readableAddChunk (_stream_readable.js:177:18)
at Socket.Readable.push (_stream_readable.js:135:10)
at TCP.onread (net.js:542:20) bytesParsed: 2, code: 'HPE_INVALID_CONSTANT' }
I see you are setting two headers, Content-Type and Content-Length. But are you writing the body? – noorul
@noorul - With the update above, the error shown above appears. Meanwhile, here is the error message from SJS: "message": "org.apache.hadoop.mapreduce.lib.input.InvalidInputException: Input path does not exist: file:/home/someid/workspace/sparkjobserver/spark-jobserver/\"inputparms=/path1", – user1384205
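That SJS message is telling: the literal text `"inputparms=/path1` was treated as part of the input path, which suggests the POST body arrived with JSON quoting around it rather than as the bare `key=value` text curl sends. A small illustrative comparison of the two bodies (using the question's placeholder paths):

```javascript
// What the Node code in the question sends vs. what `curl -d` sends.
const paths = '/somepath1 /somepath2';

const jsonBody = JSON.stringify({ inputparms: paths });
// JSON body: the value ends up wrapped in quotes inside a JSON object

const curlBody = 'inputparms=' + paths;
// curl -d body: a bare key=value string, no quotes around the value

console.log(jsonBody); // {"inputparms":"/somepath1 /somepath2"}
console.log(curlBody); // inputparms=/somepath1 /somepath2
```

Sending the second form (with a matching `application/x-www-form-urlencoded` content type) reproduces what the working curl command does.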