author    codlife <1004910847@qq.com>  2016-09-15 09:38:13 +0100
committer Sean Owen <sowen@cloudera.com>  2016-09-15 09:38:13 +0100
commit    647ee05e5815bde361662a9286ac602c44b4d4e6 (patch)
tree      dc4ab9c9ca48b4008a55332ab3258a3cbb028303 /core
parent    f893e262500e2f183de88e984300dd5b085e1f71 (diff)
[SPARK-17521] Error when I use sparkContext.makeRDD(Seq())
## What changes were proposed in this pull request?

When I use `sc.makeRDD` as below:

```
val data3 = sc.makeRDD(Seq())
println(data3.partitions.length)
```

I get an error:

```
Exception in thread "main" java.lang.IllegalArgumentException: Positive number of slices required
```

We can fix this bug by modifying the last line to take `seq.size` into account:

```
def makeRDD[T: ClassTag](seq: Seq[(T, Seq[String])]): RDD[T] = withScope {
  assertNotStopped()
  val indexToPrefs = seq.zipWithIndex.map(t => (t._2, t._1._2)).toMap
  new ParallelCollectionRDD[T](this, seq.map(_._1), math.max(seq.size, defaultParallelism), indexToPrefs)
}
```

## How was this patch tested?

Manual tests.

Author: codlife <1004910847@qq.com>
Author: codlife <wangjianfei15@otcaix.iscas.ac.cn>

Closes #15077 from codlife/master.
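The merged patch (see the diff below) clamps the slice count to at least one, rather than using `defaultParallelism` as first proposed. A minimal sketch of that clamping logic, standalone and without any Spark dependency (the object and method names here are illustrative, not Spark internals):

```scala
// Sketch of the fix: never pass fewer than 1 slice to the RDD constructor,
// so an empty Seq no longer triggers "Positive number of slices required".
object SliceClamp {
  // Mirrors `math.max(seq.size, 1)` from the patch.
  def numSlices(seqSize: Int): Int = math.max(seqSize, 1)

  def main(args: Array[String]): Unit = {
    println(numSlices(0)) // empty input collection -> 1 slice instead of 0
    println(numSlices(5)) // non-empty input keeps its size
  }
}
```

With this guard, `makeRDD(Seq())` yields an RDD with a single empty partition instead of throwing.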
Diffstat (limited to 'core')
-rw-r--r--  core/src/main/scala/org/apache/spark/SparkContext.scala | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/core/src/main/scala/org/apache/spark/SparkContext.scala b/core/src/main/scala/org/apache/spark/SparkContext.scala
index e32e4aa5b8..35b6334832 100644
--- a/core/src/main/scala/org/apache/spark/SparkContext.scala
+++ b/core/src/main/scala/org/apache/spark/SparkContext.scala
@@ -795,7 +795,7 @@ class SparkContext(config: SparkConf) extends Logging with ExecutorAllocationCli
def makeRDD[T: ClassTag](seq: Seq[(T, Seq[String])]): RDD[T] = withScope {
assertNotStopped()
val indexToPrefs = seq.zipWithIndex.map(t => (t._2, t._1._2)).toMap
- new ParallelCollectionRDD[T](this, seq.map(_._1), seq.size, indexToPrefs)
+ new ParallelCollectionRDD[T](this, seq.map(_._1), math.max(seq.size, 1), indexToPrefs)
}
/**
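To illustrate why a zero slice count fails in the first place, here is a simplified, self-contained sketch of collection slicing with a positivity check, in the spirit of what `ParallelCollectionRDD` does (this is not Spark's actual implementation; the names and the exact slicing formula are illustrative):

```scala
// Simplified slicing: split a Seq into numSlices contiguous chunks.
// A non-positive numSlices is rejected, which is the error path
// makeRDD(Seq()) used to hit before the fix clamped the count to 1.
object SliceDemo {
  def slice[T](seq: Seq[T], numSlices: Int): Seq[Seq[T]] = {
    require(numSlices >= 1, "Positive number of slices required")
    (0 until numSlices).map { i =>
      val start = (i * seq.length) / numSlices
      val end = ((i + 1) * seq.length) / numSlices
      seq.slice(start, end)
    }
  }

  def main(args: Array[String]): Unit = {
    println(slice(Seq(1, 2, 3, 4), 2)) // two chunks of two elements
    println(slice(Seq.empty[Int], 1))  // one empty chunk: the post-fix case
  }
}
```

With the clamp in place, an empty input is sliced with `numSlices = 1` and produces a single empty partition, which is well-defined, instead of reaching the `require` failure with `numSlices = 0`.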