Converting RDD[org.apache.spark.sql.Row] to RDD[org.apache.spark.mllib.linalg.Vector]

I am relatively new to Spark and Scala. I start with the following DataFrame (a single column of DenseVectors of doubles):
scala> val scaledDataOnly_pruned = scaledDataOnly.select("features")
scaledDataOnly_pruned: org.apache.spark.sql.DataFrame = [features: vector]
scala> scaledDataOnly_pruned.show(5)
+--------------------+
| features|
+--------------------+
|[-0.0948337274182...|
|[-0.0948337274182...|
|[-0.0948337274182...|
|[-0.0948337274182...|
|[-0.0948337274182...|
+--------------------+
Converting it directly to an RDD yields an instance of org.apache.spark.rdd.RDD[org.apache.spark.sql.Row]:
scala> val scaledDataOnly_rdd = scaledDataOnly_pruned.rdd
scaledDataOnly_rdd: org.apache.spark.rdd.RDD[org.apache.spark.sql.Row] = MapPartitionsRDD[32] at rdd at <console>:66
Does anyone know how to convert this DataFrame to an instance of org.apache.spark.rdd.RDD[org.apache.spark.mllib.linalg.Vector] instead? My various attempts so far have failed.
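For reference, one approach that might work here is a sketch along the following lines: since the "features" column is stored as a Spark ML vector type, each Row's field can be pulled back out as an mllib Vector via Row.getAs. This assumes a Spark 1.x-era setup where the column actually holds org.apache.spark.mllib.linalg.Vector values (the names scaledDataOnly_pruned and vectorRDD below come from this post and this sketch, respectively):

```scala
import org.apache.spark.mllib.linalg.Vector
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.Row

// Sketch: extract the "features" field of each Row as an mllib Vector.
// Assumes the column was written by an mllib transformer, so the
// underlying value is already an org.apache.spark.mllib.linalg.Vector.
val vectorRDD: RDD[Vector] =
  scaledDataOnly_pruned.rdd.map { (row: Row) =>
    row.getAs[Vector]("features")
  }
```

If the column instead holds the newer org.apache.spark.ml.linalg.Vector type (Spark 2.x ML pipelines), the cast would fail and an explicit conversion between the two vector types would be needed.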
Thanks in advance for any pointers!