author     Matei Zaharia <matei@eecs.berkeley.edu>   2013-10-18 20:24:10 -0700
committer  Matei Zaharia <matei@eecs.berkeley.edu>   2013-10-18 20:24:10 -0700
commit     8d528af829dc989d4701c08fd90d230c15df7f7e (patch)
tree       b1806dc4a32f16d1a4201eeb52b0bd1da1322508 /pyspark2.cmd
parent     fc26e5b8320556b9edb93741391b759813b4079b (diff)
parent     74737264c4a9b2a9a99bf3aa77928f6960bad78c (diff)
download   spark-8d528af829dc989d4701c08fd90d230c15df7f7e.tar.gz
           spark-8d528af829dc989d4701c08fd90d230c15df7f7e.tar.bz2
           spark-8d528af829dc989d4701c08fd90d230c15df7f7e.zip
Merge pull request #71 from aarondav/scdefaults

Spark shell exits if it cannot create SparkContext

Mainly, this occurs if you provide a malformed MASTER URL (one that does not match one of our regexes). Previously, we would default to Mesos, fail, and then start the shell anyway, except that any Spark command would fail. Simply exiting seems clearer.
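The behavior described above can be sketched roughly as follows. This is a hypothetical illustration, not Spark's actual code: the patterns below are simplified stand-ins for the real master URL regexes, and `create_context_or_exit` is an invented helper showing the "exit instead of silently falling back" decision.

```python
import re
import sys

# Illustrative patterns loosely modeled on Spark master URL formats
# (simplified; not the actual regexes from the Spark source).
MASTER_PATTERNS = [
    re.compile(r"local(\[\d+\])?$"),   # local mode, e.g. local or local[4]
    re.compile(r"spark://.+:\d+$"),    # standalone cluster, e.g. spark://host:7077
    re.compile(r"mesos://.+$"),        # Mesos cluster
]

def is_valid_master(master: str) -> bool:
    """Return True if the MASTER URL matches one of the known patterns."""
    return any(p.match(master) for p in MASTER_PATTERNS)

def create_context_or_exit(master: str) -> str:
    # Before this change, an unparseable URL fell through to a default
    # (Mesos), which failed, yet the shell started anyway and every
    # subsequent command failed. Exiting immediately is clearer.
    if not is_valid_master(master):
        sys.stderr.write("Could not parse Master URL: %r\n" % master)
        sys.exit(1)
    return "context(%s)" % master
```

The key design point is failing fast: a URL that matches no pattern aborts shell startup with a nonzero exit code rather than producing a half-working session.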
Diffstat (limited to 'pyspark2.cmd')
0 files changed, 0 insertions, 0 deletions