path: root/mllib
author Dongjoon Hyun <dongjoon@apache.org> 2016-03-28 12:04:21 +0100
committer Sean Owen <sowen@cloudera.com> 2016-03-28 12:04:21 +0100
commit b66aa900619a86b7acbb7c3f96abc96ea2faa53c (patch)
tree edd23acfb7a0ec84af0a322c8bd1b02a9fe139c4 /mllib
parent 7b841540180e8d1403d6c95b02e93f129267b34f (diff)
download spark-b66aa900619a86b7acbb7c3f96abc96ea2faa53c.tar.gz
spark-b66aa900619a86b7acbb7c3f96abc96ea2faa53c.tar.bz2
spark-b66aa900619a86b7acbb7c3f96abc96ea2faa53c.zip
[SPARK-14102][CORE] Block `reset` command in SparkShell
## What changes were proposed in this pull request?

Spark Shell provides an easy way to use Spark in the Scala environment. This PR adds the `reset` command to a blocked list and also cleans up the code according to the Scala coding style.

Currently, `:reset` destroys the pre-built `SparkContext`:

```scala
scala> sc
res0: org.apache.spark.SparkContext = org.apache.spark.SparkContext@718fad24

scala> :reset

scala> sc
<console>:11: error: not found: value sc
       sc
       ^
```

If we block `reset`, Spark Shell works as follows.

```scala
scala> :reset
reset: no such command.  Type :help for help.

scala> :re
re is ambiguous: did you mean :replay or :require?
```

## How was this patch tested?

Manual. Run `bin/spark-shell` and type `:reset`.

Author: Dongjoon Hyun <dongjoon@apache.org>

Closes #11920 from dongjoon-hyun/SPARK-14102.
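The general technique behind a change like this can be sketched as filtering the REPL's command list by name, so that blocked commands simply cease to exist (which is why `:re` then resolves only to `:replay` or `:require`). The sketch below is illustrative, not Spark's actual internals: `LoopCommand`, `standardCommands`, and `BlockedCommandsSketch` are hypothetical stand-ins for the Scala REPL types this PR touches.

```scala
// A minimal sketch of blocking REPL commands by filtering the command list.
// All names here are illustrative stand-ins, not Spark's real classes.
object BlockedCommandsSketch {
  // Stand-in for the REPL's command descriptor.
  case class LoopCommand(name: String, help: String)

  // Hypothetical full command list, as the stock REPL would provide it.
  val standardCommands: List[LoopCommand] = List(
    LoopCommand("help", "print this summary"),
    LoopCommand("replay", "rerun all commands since the last load"),
    LoopCommand("require", "add a jar to the classpath"),
    LoopCommand("reset", "reset the repl to its initial state")
  )

  // Commands the shell hides because they would destroy shared state
  // (for Spark Shell, the pre-built SparkContext `sc`).
  val blockedCommands: Set[String] = Set("reset")

  // The shell exposes only the commands that are not blocked.
  def commands: List[LoopCommand] =
    standardCommands.filterNot(cmd => blockedCommands.contains(cmd.name))

  def main(args: Array[String]): Unit =
    println(commands.map(_.name).mkString(", "))
}
```

Because the blocked command is removed from the list rather than intercepted, the REPL's own unknown-command and prefix-disambiguation messages apply unchanged.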
Diffstat (limited to 'mllib')
0 files changed, 0 insertions, 0 deletions