commit 647ee05e5815bde361662a9286ac602c44b4d4e6 (patch)
author: codlife <1004910847@qq.com> 2016-09-15 09:38:13 +0100
committer: Sean Owen <sowen@cloudera.com> 2016-09-15 09:38:13 +0100
tree: dc4ab9c9ca48b4008a55332ab3258a3cbb028303 /build
parent: f893e262500e2f183de88e984300dd5b085e1f71 (diff)
[SPARK-17521] Error when I use sparkContext.makeRDD(Seq())
## What changes were proposed in this pull request?
When I use `sc.makeRDD` as below:
```
val data3 = sc.makeRDD(Seq())
println(data3.partitions.length)
```
I got an error:
```
Exception in thread "main" java.lang.IllegalArgumentException: Positive number of slices required
```
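The exception is raised when the requested number of slices is not positive, which is what happens when `seq.size` is 0 and is passed straight through as `numSlices`. A minimal runnable sketch of that guard and the slicing arithmetic (a simplified illustration, not the actual Spark source):

```scala
object SliceSketch {
  // Simplified sketch: reject a non-positive slice count, then split the
  // sequence into numSlices contiguous chunks of near-equal size.
  def slice[T](seq: Seq[T], numSlices: Int): Seq[Seq[T]] = {
    if (numSlices < 1) {
      throw new IllegalArgumentException("Positive number of slices required")
    }
    (0 until numSlices).map { i =>
      val start = ((i.toLong * seq.length) / numSlices).toInt
      val end = (((i + 1).toLong * seq.length) / numSlices).toInt
      seq.slice(start, end)
    }
  }

  def main(args: Array[String]): Unit = {
    println(slice(Seq(1, 2, 3, 4), 2))
    try slice(Seq.empty[Int], 0)
    catch { case e: IllegalArgumentException => println(e.getMessage) }
  }
}
```

With `numSlices = 0` the guard fires immediately, reproducing the message above.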
We can fix this bug by modifying the last line to use `math.max(seq.size, defaultParallelism)` as the number of slices, so an empty `Seq` no longer requests zero slices:
```
def makeRDD[T: ClassTag](seq: Seq[(T, Seq[String])]): RDD[T] = withScope {
  assertNotStopped()
  val indexToPrefs = seq.zipWithIndex.map(t => (t._2, t._1._2)).toMap
  new ParallelCollectionRDD[T](
    this, seq.map(_._1), math.max(seq.size, defaultParallelism), indexToPrefs)
}
```
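The effect of the `math.max` guard can be shown without a Spark context. This hypothetical helper mirrors only the changed expression:

```scala
object SliceGuard {
  // Hypothetical helper mirroring the patched line: an empty Seq falls back
  // to defaultParallelism instead of requesting zero slices, while a
  // non-empty Seq keeps seq.size slices so each element's location
  // preferences still map onto its own partition.
  def numSlices(seqSize: Int, defaultParallelism: Int): Int =
    math.max(seqSize, defaultParallelism)

  def main(args: Array[String]): Unit = {
    println(numSlices(0, 2)) // empty input: defaultParallelism slices, not 0
    println(numSlices(5, 2)) // non-empty input: one slice per element
  }
}
```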
## How was this patch tested?
Manual tests.
Author: codlife <1004910847@qq.com>
Author: codlife <wangjianfei15@otcaix.iscas.ac.cn>
Closes #15077 from codlife/master.
Diffstat (limited to 'build')
0 files changed, 0 insertions, 0 deletions