author     Josh Rosen <joshrosen@eecs.berkeley.edu>  2012-12-27 22:47:37 -0800
committer  Josh Rosen <joshrosen@eecs.berkeley.edu>  2012-12-27 22:47:37 -0800
commit     665466dfff4f89196627a0777eabd3d3894cd296 (patch)
tree       7fa580209756c5fdbb0a52930f30959bbbbc2ba3 /run
parent     ac32447cd38beac8f6bc7a90be9fd24666bb46ad (diff)
Simplify PySpark installation.
- Bundle the Py4J binaries, since Py4J is hard to install.
- Use Spark's `run` script to launch the Py4J gateway, inheriting the settings in spark-env.sh.

With these changes, (hopefully) nothing more than running `sbt/sbt package` will be necessary to run PySpark.
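A rough sketch of the gateway launch this enables (py4j.GatewayServer is Py4J's standard entry point; the --die-on-broken-pipe flag and the port argument shown here are illustrative assumptions, not quoted from this commit):

    # Launch the Py4J gateway JVM through Spark's run script so it inherits
    # SPARK_MEM, SPARK_CLASSPATH, and the rest of spark-env.sh.
    # Port 0 asks the gateway to pick an ephemeral port.
    "$SPARK_HOME/run" py4j.GatewayServer --die-on-broken-pipe 0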
Diffstat (limited to 'run')
-rwxr-xr-x  run  4
1 file changed, 4 insertions(+), 0 deletions(-)
diff --git a/run b/run
index 15db23bbe0..8fa61b086f 100755
--- a/run
+++ b/run
@@ -40,6 +40,7 @@ CORE_DIR="$FWDIR/core"
REPL_DIR="$FWDIR/repl"
EXAMPLES_DIR="$FWDIR/examples"
BAGEL_DIR="$FWDIR/bagel"
+PYSPARK_DIR="$FWDIR/pyspark"

# Build up classpath
CLASSPATH="$SPARK_CLASSPATH"
@@ -61,6 +62,9 @@ for jar in `find $REPL_DIR/lib -name '*jar'`; do
CLASSPATH+=":$jar"
done
CLASSPATH+=":$BAGEL_DIR/target/scala-$SCALA_VERSION/classes"
+for jar in `find $PYSPARK_DIR/lib -name '*jar'`; do
+ CLASSPATH+=":$jar"
+done
export CLASSPATH # Needed for spark-shell

# Figure out whether to run our class with java or with the scala launcher.
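A side note on the added loop: word-splitting the backtick output of find breaks if $FWDIR contains spaces. A whitespace-safe variant is sketched below; it is illustrative only, not part of this commit:

    # Read find's output line by line instead of word-splitting it, so jar
    # paths containing spaces stay intact. Uses bash process substitution.
    while IFS= read -r jar; do
      CLASSPATH+=":$jar"
    done < <(find "$PYSPARK_DIR/lib" -name '*.jar')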