path: root/sql
author	CodingCat <zhunansjtu@gmail.com>	2014-04-18 10:01:16 -0700
committer	Reynold Xin <rxin@apache.org>	2014-04-18 10:01:16 -0700
commit	e31c8ffca65e0e5cd5f1a6229f3d654a24b7b18c (patch)
tree	b0923d192066b8f44bad5047f0ca03719af5c789 /sql
parent	7863ecca35be9af1eca0dfe5fd8806c5dd710fd6 (diff)
SPARK-1483: Rename minSplits to minPartitions in public APIs
https://issues.apache.org/jira/browse/SPARK-1483

From the original JIRA: "The parameter name is part of the public API in Scala and Python, since you can pass named parameters to a method, so we should name it to this more descriptive term. Everywhere else we refer to 'splits' as partitions." - @mateiz

Author: CodingCat <zhunansjtu@gmail.com>

Closes #430 from CodingCat/SPARK-1483 and squashes the following commits:

4b60541 [CodingCat] deprecate defaultMinSplits
ba2c663 [CodingCat] Rename minSplits to minPartitions in public APIs
Diffstat (limited to 'sql')
-rw-r--r--	sql/hive/src/main/scala/org/apache/spark/sql/hive/TableReader.scala	2
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/sql/hive/src/main/scala/org/apache/spark/sql/hive/TableReader.scala b/sql/hive/src/main/scala/org/apache/spark/sql/hive/TableReader.scala
index 0da5eb754c..8cfde46186 100644
--- a/sql/hive/src/main/scala/org/apache/spark/sql/hive/TableReader.scala
+++ b/sql/hive/src/main/scala/org/apache/spark/sql/hive/TableReader.scala
@@ -52,7 +52,7 @@ class HadoopTableReader(@transient _tableDesc: TableDesc, @transient sc: HiveCon
   // Choose the minimum number of splits. If mapred.map.tasks is set, then use that unless
   // it is smaller than what Spark suggests.
   private val _minSplitsPerRDD = math.max(
-    sc.hiveconf.getInt("mapred.map.tasks", 1), sc.sparkContext.defaultMinSplits)
+    sc.hiveconf.getInt("mapred.map.tasks", 1), sc.sparkContext.defaultMinPartitions)
   // TODO: set aws s3 credentials.
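Per the first commit in the squash list, the rename keeps old callers compiling by leaving `defaultMinSplits` behind as a deprecated alias. A minimal, self-contained sketch of that pattern (simplified: a plain object with a constant stands in for `SparkContext`, and the `math.max` line mirrors the hunk above):

```scala
// Sketch of the deprecation-alias pattern, not the actual SparkContext code.
object MinPartitionsExample {
  // New, preferred name (hypothetical constant in place of scheduler state).
  def defaultMinPartitions: Int = 2

  // Old name kept as a deprecated alias so existing callers still compile,
  // with a compiler warning pointing them at the replacement.
  @deprecated("use defaultMinPartitions", "1.0.0")
  def defaultMinSplits: Int = defaultMinPartitions

  def main(args: Array[String]): Unit = {
    // Mirrors HadoopTableReader: take the Hadoop hint unless it is smaller
    // than what Spark suggests.
    val mapredMapTasks = 1 // stand-in for sc.hiveconf.getInt("mapred.map.tasks", 1)
    val minSplitsPerRDD = math.max(mapredMapTasks, defaultMinPartitions)
    println(minSplitsPerRDD)
  }
}
```

Because the alias simply forwards to the new method, behavior is unchanged; only call sites using the old name see a deprecation warning at compile time.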