author: Zhenglai Zhang <zhenglaizhang@hotmail.com> 2016-08-14 16:10:34 +0100
committer: Sean Owen <sowen@cloudera.com> 2016-08-14 16:10:34 +0100
commit: 2a3d286f3421f6836b71afcbda3084222752e6b1 (patch)
tree: b41e7073298addb27d33e3526303df098df0ea70 /core/src/main
parent: 0ebf7c1bff736cf54ec47957d71394d5b75b47a7 (diff)
download: spark-2a3d286f3421f6836b71afcbda3084222752e6b1.tar.gz spark-2a3d286f3421f6836b71afcbda3084222752e6b1.tar.bz2 spark-2a3d286f3421f6836b71afcbda3084222752e6b1.zip
[WIP][MINOR][TYPO] Fix several trivial typos
## What changes were proposed in this pull request?
* Fixed the typo `"overriden"` to `"overridden"` and verified there are no other occurrences of the same typo.
* Fixed the typo `"lowcase"` to `"lowercase"` and verified there are no other occurrences of the same typo.
## How was this patch tested?
Since the change is very small, I only verified that compilation succeeds.
I am new to the Spark community, so please let me know if any other steps are necessary.
Thanks in advance!
----
Updated: found another occurrence of the `lowcase` typo later and fixed it in the same patch.
Author: Zhenglai Zhang <zhenglaizhang@hotmail.com>
Closes #14622 from zhenglaizhang/fixtypo.
Diffstat (limited to 'core/src/main')
-rw-r--r-- | core/src/main/scala/org/apache/spark/SparkContext.scala | 2
-rw-r--r-- | core/src/main/scala/org/apache/spark/util/Utils.scala | 2
2 files changed, 2 insertions, 2 deletions
diff --git a/core/src/main/scala/org/apache/spark/SparkContext.scala b/core/src/main/scala/org/apache/spark/SparkContext.scala
index 4f3bb1c877..a6853fe398 100644
--- a/core/src/main/scala/org/apache/spark/SparkContext.scala
+++ b/core/src/main/scala/org/apache/spark/SparkContext.scala
@@ -355,7 +355,7 @@ class SparkContext(config: SparkConf) extends Logging with ExecutorAllocationCli
    * Valid log levels include: ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN
    */
   def setLogLevel(logLevel: String) {
-    // let's allow lowcase or mixed case too
+    // let's allow lowercase or mixed case too
     val upperCased = logLevel.toUpperCase(Locale.ENGLISH)
     require(SparkContext.VALID_LOG_LEVELS.contains(upperCased),
       s"Supplied level $logLevel did not match one of:" +
diff --git a/core/src/main/scala/org/apache/spark/util/Utils.scala b/core/src/main/scala/org/apache/spark/util/Utils.scala
index 6ab9e99d89..0ae44a2ed7 100644
--- a/core/src/main/scala/org/apache/spark/util/Utils.scala
+++ b/core/src/main/scala/org/apache/spark/util/Utils.scala
@@ -82,7 +82,7 @@ private[spark] object Utils extends Logging {
 /**
  * The performance overhead of creating and logging strings for wide schemas can be large. To
- * limit the impact, we bound the number of fields to include by default. This can be overriden
+ * limit the impact, we bound the number of fields to include by default. This can be overridden
  * by setting the 'spark.debug.maxToStringFields' conf in SparkEnv.
  */
 val DEFAULT_MAX_TO_STRING_FIELDS = 25
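For context, the `setLogLevel` hunk above touches a common validation pattern: upper-case the user's input with an explicit `Locale` so that "info", "Info", and "INFO" are all accepted, then check it against a whitelist. The sketch below re-creates that pattern in isolation; `LogLevelCheck`, `ValidLogLevels`, and `normalize` are illustrative names, not part of Spark's API.

```scala
import java.util.Locale

object LogLevelCheck {
  // Mirrors SparkContext.VALID_LOG_LEVELS (hypothetical standalone copy).
  val ValidLogLevels: Set[String] =
    Set("ALL", "DEBUG", "ERROR", "FATAL", "INFO", "OFF", "TRACE", "WARN")

  // Allow lowercase or mixed case too: normalize with an explicit Locale
  // (avoids surprises in locales like Turkish, where "i".toUpperCase != "I").
  def normalize(logLevel: String): String = {
    val upperCased = logLevel.toUpperCase(Locale.ENGLISH)
    require(ValidLogLevels.contains(upperCased),
      s"Supplied level $logLevel did not match one of: ${ValidLogLevels.mkString(", ")}")
    upperCased
  }
}
```

With this sketch, `LogLevelCheck.normalize("info")` and `LogLevelCheck.normalize("Warn")` succeed, while an unknown level such as `"verbose"` fails the `require` with an `IllegalArgumentException`.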