| Commit message | Author | Age | Files | Lines |
make-distribution.sh gives a slightly off error message when using --with-hive.
Author: Mark Wagner <mwagner@mwagner-ld.linkedin.biz>
Closes #1489 from wagnermarkd/SPARK-2587 and squashes the following commits:
7b5d3ff [Mark Wagner] SPARK-2587: Fix error message in make-distribution.sh
Right now we have a bunch of parallel logic in make-distribution.sh
that's just extra work to maintain. We should just pass through
Maven profiles in this case and keep the script simple. See
the JIRA for more details.
Author: Patrick Wendell <pwendell@gmail.com>
Closes #1445 from pwendell/make-distribution.sh and squashes the following commits:
f1294ea [Patrick Wendell] Simplify options in make-distribution.sh.
SPARK-2233: make-distribution script now lists the git hash in the RELEASE file
This patch adds the git revision hash (short version) to the RELEASE file. It invokes git rather than simply checking for the existence of .git, so as to make sure that this is a functional repository.
Author: Guillaume Ballet <gballet@gmail.com>
Closes #1216 from gballet/master and squashes the following commits:
eabc50f [Guillaume Ballet] Refactored the script to take comments into account.
d93e5e8 [Guillaume Ballet] [SPARK 2233] make-distribution script now lists the git hash tag in the RELEASE file.
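The approach described above can be sketched like this (the file contents and variable names are illustrative, not the script's exact text): asking git itself for the revision, instead of testing for a `.git` directory, confirms the checkout is a working repository.

```shell
# Sketch only: query git for the short revision; if the command fails
# (not a functional repository), fall back to an empty suffix.
if GITREV=$(git rev-parse --short HEAD 2>/dev/null); then
  GITREVSTRING=" (git revision $GITREV)"
else
  GITREVSTRING=""
fi
echo "Spark distribution$GITREVSTRING" > RELEASE
```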
Author: Matthew Farrellee <matt@redhat.com>
Closes #1185 from mattf/master-1 and squashes the following commits:
42150fc [Matthew Farrellee] Autodetect JAVA_HOME on RPM-based systems
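A hedged sketch of how such autodetection can work on RPM-based systems (the commit's actual logic may differ): resolve the `javac` found on the PATH through its `alternatives` symlinks and take its grandparent directory.

```shell
# Sketch: derive JAVA_HOME from the fully resolved path of javac, e.g.
# /usr/lib/jvm/java-openjdk/bin/javac -> /usr/lib/jvm/java-openjdk
if [ -z "$JAVA_HOME" ] && command -v javac >/dev/null 2>&1; then
  JAVA_HOME=$(dirname "$(dirname "$(readlink -f "$(command -v javac)")")")
fi
```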
When mvn is not detected (not on the executing user's PATH), 'set -e' causes
the detection to terminate the script before the helpful error message can
be displayed.
Author: Matthew Farrellee <matt@redhat.com>
Closes #1181 from mattf/master-0 and squashes the following commits:
506549f [Matthew Farrellee] Fix mvn detection
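The failure mode and fix can be sketched as follows (illustrative; the script's actual messages differ): under `set -e`, an unguarded probe such as `MVN=$(which mvn)` aborts the script the moment mvn is absent, so the probe has to live inside a conditional.

```shell
set -e
# Guarded probe: the `if` keeps a failing check from triggering `set -e`,
# so the helpful message below can actually be printed.
if command -v mvn >/dev/null 2>&1; then
  MVN_FOUND=yes
else
  MVN_FOUND=no
  echo "You need Maven (mvn) on your PATH to build Spark." >&2
fi
```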
This commit requires the user to manually say "yes" when building Spark
without Java 6. The prompt can be bypassed with a flag (e.g. if the user
is scripting around make-distribution).
Author: Patrick Wendell <pwendell@gmail.com>
Closes #859 from pwendell/java6 and squashes the following commits:
4921133 [Patrick Wendell] Adding Pyspark Notice
fee8c9e [Patrick Wendell] SPARK-1911: Emphasize that Spark jars should be built with Java 6.
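A sketch of the prompt-with-bypass pattern described above (the flag and variable names here are hypothetical, not necessarily those of make-distribution.sh):

```shell
# Hypothetical flag: a scripted caller passes --skip-java-test to bypass
# the interactive confirmation.
SKIP_JAVA_TEST=false
for arg in "$@"; do
  if [ "$arg" = "--skip-java-test" ]; then SKIP_JAVA_TEST=true; fi
done
# Only prompt when attached to a terminal, so scripted builds never block.
if [ "$SKIP_JAVA_TEST" = false ] && [ -t 0 ]; then
  echo "Warning: distributions built without Java 6 may not run on Java 6." >&2
  read -r -p "Continue anyway? [y/N] " REPLY
  case "$REPLY" in y|Y) ;; *) exit 1 ;; esac
fi
```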
Author: Patrick Wendell <pwendell@gmail.com>
Closes #818 from pwendell/reamde and squashes the following commits:
4020b11 [Patrick Wendell] SPARK-1873: Add README.md file when making distributions
Gives a nicely formatted message to the user when `run-example` is run to
tell them to use `spark-submit`.
Author: Patrick Wendell <pwendell@gmail.com>
Closes #704 from pwendell/examples and squashes the following commits:
1996ee8 [Patrick Wendell] Feedback form Andrew
3eb7803 [Patrick Wendell] Suggestions from TD
2474668 [Patrick Wendell] SPARK-1565 (Addendum): Replace `run-example` with `spark-submit`.
Author: Andrew Ash <andrew@andrewash.com>
Closes #680 from ash211/patch-3 and squashes the following commits:
9ce3746 [Andrew Ash] Typo fix: fetchting -> fetching
Also moves a few lines of code around in make-distribution.sh.
Author: Patrick Wendell <pwendell@gmail.com>
Closes #669 from pwendell/make-distribution and squashes the following commits:
8bfac49 [Patrick Wendell] Small fix
46918ec [Patrick Wendell] SPARK-1737: Warn rather than fail when Java 7+ is used to create distributions.
73b0cbcc241cca3d318ff74340e80b02f884acbd introduced a few special profiles that are not covered in the `make-distribution.sh`. This affects hadoop versions 2.2.x, 2.3.x, and 2.4.x. Without these special profiles, a java version error for protobufs is thrown at run time.
I took the opportunity to rewrite the way we construct the maven command. Previously, the only hadoop version that triggered the `yarn-alpha` profile was 0.23.x, which was inconsistent with the [docs](https://github.com/apache/spark/blob/master/docs/building-with-maven.md). This is now generalized to hadoop versions from 0.23.x to 2.1.x.
Author: Andrew Or <andrewor14@gmail.com>
Closes #660 from andrewor14/hadoop-distribution and squashes the following commits:
6740126 [Andrew Or] Generalize the yarn profile to hadoop versions 2.2+
88f192d [Andrew Or] Add the required special profiles to make-distribution.sh
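The generalized version-to-profile mapping described above can be sketched as a case statement (variable name and fallback assumed for illustration; the real script's matching may differ):

```shell
# Map the target Hadoop version to the right YARN build profile:
# 0.23.x through 2.1.x use yarn-alpha, 2.2+ use the stable yarn profile.
SPARK_HADOOP_VERSION=${SPARK_HADOOP_VERSION:-2.2.0}
case "$SPARK_HADOOP_VERSION" in
  0.23.*|2.0.*|2.1.*) MVN_PROFILE="yarn-alpha" ;;
  *)                  MVN_PROFILE="yarn" ;;
esac
```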
This copies the datanucleus jars over from `lib_managed` into `dist/lib`, if any. The `CLASSPATH` must also be updated to reflect this change.
Author: Andrew Or <andrewor14@gmail.com>
Closes #610 from andrewor14/hive-distribution and squashes the following commits:
a4bc96f [Andrew Or] Rename search path in jar error check
fa205e1 [Andrew Or] Merge branch 'master' of github.com:apache/spark into hive-distribution
7855f58 [Andrew Or] Have jar command respect JAVA_HOME + check for jar errors both cases
c16bbfd [Andrew Or] Merge branch 'master' of github.com:apache/spark into hive-distribution
32f6826 [Andrew Or] Leave the double colons
940a1bb [Andrew Or] Add back 2>/dev/null
58357cc [Andrew Or] Include datanucleus jars in Spark distribution built with Hive support
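The copy step can be sketched as follows (paths taken from the commit text; the CLASSPATH update mentioned above is not shown):

```shell
# Sketch: copy any datanucleus jars from the managed-libraries directory
# into the distribution, if the Hive build produced them.
mkdir -p dist/lib
for jar in lib_managed/jars/datanucleus-*.jar; do
  if [ -e "$jar" ]; then cp "$jar" dist/lib/; fi
done
```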
This adds some guards and good warning messages if users hit this issue. /cc @aarondav with whom I discussed parts of the design.
Author: Patrick Wendell <pwendell@gmail.com>
Closes #627 from pwendell/jdk6 and squashes the following commits:
a38a958 [Patrick Wendell] Code review feedback
94e9f84 [Patrick Wendell] SPARK-1703 Warn users if Spark is run on JRE6 but compiled with JDK7.
The current test is checking the exit code of "tail" rather than "mvn".
This new check will make sure that mvn is installed and was able to
execute the "version" command.
Author: Rahul Singhal <rahul.singhal@guavus.com>
Closes #580 from rahulsinghaliitd/SPARK-1658 and squashes the following commits:
83c0313 [Rahul Singhal] SPARK-1658: Correctly identify if maven is installed and working
bf821b9 [Rahul Singhal] SPARK-1658: Correctly identify if maven is installed and working
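The pitfall is easy to demonstrate (here `false` stands in for a failing mvn; the commit's actual fix may be structured differently):

```shell
# Pitfall: in `cmd | tail -n 1`, $? reflects tail's exit code, not cmd's.
false | tail -n 1
PIPE_STATUS=$?        # 0: tail succeeded even though `false` failed
# Fix: test the command on its own before using it in a pipeline.
if false >/dev/null 2>&1; then CMD_OK=yes; else CMD_OK=no; fi
```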
1. Fix SPARK-1441: compile spark core error with hadoop 0.23.x
2. Fix SPARK-1491: maven hadoop-provided profile fails to build
3. Fix inconsistent dependency versions for org.scala-lang:* and org.apache.avro:*
4. Reformat sql/catalyst/pom.xml, sql/hive/pom.xml, and sql/core/pom.xml (four-space indentation changed to two spaces)
Author: witgo <witgo@qq.com>
Closes #480 from witgo/format_pom and squashes the following commits:
03f652f [witgo] review commit
b452680 [witgo] Merge branch 'master' of https://github.com/apache/spark into format_pom
bee920d [witgo] revert fix SPARK-1629: Spark Core missing commons-lang dependence
7382a07 [witgo] Merge branch 'master' of https://github.com/apache/spark into format_pom
6902c91 [witgo] fix SPARK-1629: Spark Core missing commons-lang dependence
0da4bc3 [witgo] merge master
d1718ed [witgo] Merge branch 'master' of https://github.com/apache/spark into format_pom
e345919 [witgo] add avro dependency to yarn-alpha
77fad08 [witgo] Merge branch 'master' of https://github.com/apache/spark into format_pom
62d0862 [witgo] Fix org.scala-lang: * inconsistent versions dependency
1a162d7 [witgo] Merge branch 'master' of https://github.com/apache/spark into format_pom
934f24d [witgo] review commit
cf46edc [witgo] exclude jruby
06e7328 [witgo] Merge branch 'SparkBuild' into format_pom
99464d2 [witgo] fix maven hadoop-provided profile fails to build
0c6c1fc [witgo] Fix compile spark core error with hadoop 0.23.x
6851bec [witgo] Maintain consistent SparkBuild.scala, pom.xml
Small bug fix to make sure the "spark contents" are copied to the
deployment directory correctly.
Author: Rahul Singhal <rahul.singhal@guavus.com>
Closes #573 from rahulsinghaliitd/SPARK-1651 and squashes the following commits:
402c999 [Rahul Singhal] SPARK-1651: Delete existing deployment directory
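The fix described above amounts to clearing the target directory before copying, so stale files never linger in a new distribution (directory name illustrative):

```shell
# Sketch: delete any existing deployment directory, then recreate it
# fresh before the spark contents are copied in.
DISTDIR=dist
rm -rf "$DISTDIR"
mkdir -p "$DISTDIR"
```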
Better account for various side-effect outputs while executing
"mvn help:evaluate -Dexpression=project.version"
Author: Rahul Singhal <rahul.singhal@guavus.com>
Closes #572 from rahulsinghaliitd/SPARK-1650 and squashes the following commits:
fd6a611 [Rahul Singhal] SPARK-1650: Correctly identify maven project version
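The problem and a filtering approach can be sketched like this (the version string and filter terms are illustrative; a stand-in function simulates maven's mixed output, since the real command interleaves the answer with log and download lines):

```shell
# `mvn help:evaluate -Dexpression=project.version` interleaves the answer
# with [INFO] and download-progress lines, so the version must be filtered.
fake_mvn_evaluate() {
  printf '[INFO] Scanning for projects...\n'
  printf 'Downloading: some-plugin.jar\n'
  printf '1.1.0-SNAPSHOT\n'
}
VERSION=$(fake_mvn_evaluate | grep -v INFO | grep -v Downloading | tail -n 1)
```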
This simplifies the shell a bunch and passes all arguments through to spark-submit.
There is a tiny incompatibility from 0.9.1: you can no longer pass `-c`, only `--cores`. However, spark-submit will give a good error message in this case, I don't think many people used this, and it's a trivial change for users.
Author: Patrick Wendell <pwendell@gmail.com>
Closes #542 from pwendell/spark-shell and squashes the following commits:
9eb3e6f [Patrick Wendell] Updating Spark docs
b552459 [Patrick Wendell] Andrew's feedback
97720fa [Patrick Wendell] Review feedback
aa2900b [Patrick Wendell] SPARK-1619 Launch spark-shell with spark-submit
1. Makes assembly and examples jar naming consistent in maven/sbt.
2. Updates make-distribution.sh to use Maven and fixes some bugs.
3. Updates the create-release script to call make-distribution script.
Author: Patrick Wendell <pwendell@gmail.com>
Closes #502 from pwendell/make-distribution and squashes the following commits:
1a97f0d [Patrick Wendell] SPARK-1119 and other build improvements
Author: Nick Lanham <nick@afternight.org>
Closes #264 from nicklan/make-distribution-fixes and squashes the following commits:
172b981 [Nick Lanham] fix path for jar, make sed actually work on OSX
I don't have access to an OSX machine, so if someone could test this that would be great.
Author: Nick Lanham <nick@afternight.org>
Closes #258 from nicklan/osx-sed-fix and squashes the following commits:
a6f158f [Nick Lanham] Also make mktemp work on OSX
558fd6e [Nick Lanham] Make sed do -i '' on OSX
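The portability issues above come down to two differences: GNU sed accepts `-i` with no argument while BSD/macOS sed requires `-i ''`, and BSD mktemp wants an explicit template. A common detection pattern (a sketch, not necessarily the script's exact fix):

```shell
# GNU sed (Linux) takes `-i` alone; BSD sed (macOS) needs `-i ''`.
# Probe for GNU's `--version` flag to tell them apart.
TMPFILE=$(mktemp /tmp/dist.XXXXXX)   # explicit template works on both OSes
echo "hello world" > "$TMPFILE"
if sed --version >/dev/null 2>&1; then
  sed -i 's/world/there/' "$TMPFILE"       # GNU sed
else
  sed -i '' 's/world/there/' "$TMPFILE"    # BSD sed
fi
RESULT=$(cat "$TMPFILE")
rm -f "$TMPFILE"
```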
This should all work as expected with the current version of the tachyon tarball (0.4.1)
Author: Nick Lanham <nick@afternight.org>
Closes #137 from nicklan/bundle-tachyon and squashes the following commits:
2eee15b [Nick Lanham] Put back in exec, start tachyon first
738ba23 [Nick Lanham] Move tachyon out of sbin
f2f9bc6 [Nick Lanham] More checks for tachyon script
111e8e1 [Nick Lanham] Only try tachyon operations if tachyon script exists
0561574 [Nick Lanham] Copy over web resources so web interface can run
4dc9809 [Nick Lanham] Update to tachyon 0.4.1
0a1a20c [Nick Lanham] Add scripts using tachyon tarball
Conflicts:
core/src/test/scala/org/apache/spark/DriverSuite.scala
docs/python-programming-guide.md
spark-915-segregate-scripts
Conflicts:
bin/spark-shell
core/pom.xml
core/src/main/scala/org/apache/spark/SparkContext.scala
core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/CoarseMesosSchedulerBackend.scala
core/src/main/scala/org/apache/spark/ui/UIWorkloadGenerator.scala
core/src/test/scala/org/apache/spark/DriverSuite.scala
python/run-tests
sbin/compute-classpath.sh
sbin/spark-class
sbin/stop-slaves.sh
Signed-off-by: shane-huang <shengsheng.huang@intel.com>
Closes #316
This commit makes Spark invocation saner by using an assembly JAR to
find all of Spark's dependencies instead of adding all the JARs in
lib_managed. It also packages the examples into an assembly and uses
that as SPARK_EXAMPLES_JAR. Finally, it replaces the old "run" script
with two better-named scripts: "run-example" for examples, and
"spark-class" for Spark internal classes (e.g. REPL, master, etc). This
is also designed to minimize the confusion people have in trying to use
"run" to run their own classes; it's not meant to do that, but now at
least if they look at it, they can modify run-example to do a decent
job for them.
As part of this, Bagel's examples are also now properly moved to the
examples package instead of bagel.
requiring Spark to be installed. Using 'make-distribution.sh' a user
can put a Spark distribution at a URI supported by Mesos (e.g.,
'hdfs://...') and then set that when launching their job. Also added
SPARK_EXECUTOR_URI for the REPL.
Conflicts:
make-distribution.sh
Conflicts:
bin/compute-classpath.sh