diff options
author | Josh Rosen <joshrosen@eecs.berkeley.edu> | 2012-12-27 22:47:37 -0800
---|---|---
committer | Josh Rosen <joshrosen@eecs.berkeley.edu> | 2012-12-27 22:47:37 -0800
commit | 665466dfff4f89196627a0777eabd3d3894cd296 (patch) |
tree | 7fa580209756c5fdbb0a52930f30959bbbbc2ba3 /run |
parent | ac32447cd38beac8f6bc7a90be9fd24666bb46ad (diff) |
Simplify PySpark installation.

- Bundle Py4J binaries, since it's hard to install
- Uses Spark's `run` script to launch the Py4J gateway, inheriting the settings in `spark-env.sh`

With these changes, (hopefully) nothing more than running `sbt/sbt package` will be necessary to run PySpark.
Diffstat (limited to 'run')
-rwxr-xr-x | run | 4
1 file changed, 4 insertions, 0 deletions
```diff
@@ -40,6 +40,7 @@ CORE_DIR="$FWDIR/core"
 REPL_DIR="$FWDIR/repl"
 EXAMPLES_DIR="$FWDIR/examples"
 BAGEL_DIR="$FWDIR/bagel"
+PYSPARK_DIR="$FWDIR/pyspark"
 
 # Build up classpath
 CLASSPATH="$SPARK_CLASSPATH"
@@ -61,6 +62,9 @@ for jar in `find $REPL_DIR/lib -name '*jar'`; do
   CLASSPATH+=":$jar"
 done
 CLASSPATH+=":$BAGEL_DIR/target/scala-$SCALA_VERSION/classes"
+for jar in `find $PYSPARK_DIR/lib -name '*jar'`; do
+  CLASSPATH+=":$jar"
+done
 export CLASSPATH # Needed for spark-shell
 
 # Figure out whether to run our class with java or with the scala launcher.
```
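The classpath loop the patch adds can be exercised on its own. This is a minimal sketch, not the actual `run` script: the temporary directory and the jar name `example-py4j.jar` are stand-ins created for the demo, whereas the real script derives `PYSPARK_DIR` from `FWDIR` and scans whatever jars are bundled under `pyspark/lib`.

```shell
# Sketch of the classpath-building pattern added to `run`.
# Stand-in directory and jar, created only for this demo:
PYSPARK_DIR="$(mktemp -d)/pyspark"
mkdir -p "$PYSPARK_DIR/lib"
touch "$PYSPARK_DIR/lib/example-py4j.jar"

# Start from any externally supplied classpath (empty if unset).
CLASSPATH="${SPARK_CLASSPATH:-}"

# Append every jar found under pyspark/lib, as the patch does.
for jar in `find $PYSPARK_DIR/lib -name '*jar'`; do
  CLASSPATH+=":$jar"    # bash-style string append, as in the script
done

echo "$CLASSPATH"
```

Note that `CLASSPATH+=":$jar"` is a bash extension; the surrounding `run` script is executed with bash, so the pattern is safe there, but a strictly POSIX shell would need `CLASSPATH="$CLASSPATH:$jar"`.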