How to run a Spark job on specific nodes

For example, my Spark cluster has 100 nodes (workers). When I run a job, I want it to run on only about 10 specific nodes. How can I achieve this? By the way, I am using Spark standalone mode.
Why I need this:
One of my Spark jobs needs to access NFS, but only 10 nodes are permitted
to access it. If the job is distributed across all worker nodes (100 nodes),
access-denied exceptions will occur and the job will fail.
Please add more details (e.g., job setup, configuration, job code, etc.). –
Spark on Mesos lets you set constraints based on attributes and resources, but AFAIK standalone has no such feature. Why do you need this? – zero323
@zero323 Thanks, I have revised my question. – Jack
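For reference, the Mesos constraint mechanism zero323 mentions is exposed through the `spark.mesos.constraints` setting, which is matched against Mesos agent attributes. A minimal sketch, assuming a Mesos cluster where the NFS-capable agents have been tagged with a hypothetical `nfs_access:true` attribute (the master URL and attribute name here are illustrative, not from the question):

```shell
# Tag each of the 10 NFS-capable agents with an attribute when starting mesos-agent
# (hypothetical attribute name; any key:value pair works):
#   mesos-agent ... --attributes="nfs_access:true"

# Then submit the job with a matching constraint, so executors are only
# launched on agents whose attributes satisfy it:
spark-submit \
  --master mesos://zk://master.example.com:2181/mesos \
  --conf spark.mesos.constraints="nfs_access:true" \
  my_job.py
```

In standalone mode there is no equivalent constraint setting; a common workaround is to run a separate standalone cluster whose workers are started only on the permitted nodes and point the job's master URL at it.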