path: root/make-distribution.sh
Commit history (newest first); each entry lists the commit message, author, date, files changed, and lines changed.
* Improved build configuration (witgo, 2014-04-28; 1 file, -3/+12)

    1. Fix SPARK-1441: compile spark core error with hadoop 0.23.x
    2. Fix SPARK-1491: maven hadoop-provided profile fails to build
    3. Fix org.scala-lang:* and org.apache.avro:* inconsistent versions dependency
    4. Reformatted sql/catalyst/pom.xml, sql/hive/pom.xml, sql/core/pom.xml (four-space indentation changed to two spaces)

    Author: witgo <witgo@qq.com>

    Closes #480 from witgo/format_pom and squashes the following commits:
    03f652f [witgo] review commit
    b452680 [witgo] Merge branch 'master' of https://github.com/apache/spark into format_pom
    bee920d [witgo] revert fix SPARK-1629: Spark Core missing commons-lang dependence
    7382a07 [witgo] Merge branch 'master' of https://github.com/apache/spark into format_pom
    6902c91 [witgo] fix SPARK-1629: Spark Core missing commons-lang dependence
    0da4bc3 [witgo] merge master
    d1718ed [witgo] Merge branch 'master' of https://github.com/apache/spark into format_pom
    e345919 [witgo] add avro dependency to yarn-alpha
    77fad08 [witgo] Merge branch 'master' of https://github.com/apache/spark into format_pom
    62d0862 [witgo] Fix org.scala-lang:* inconsistent versions dependency
    1a162d7 [witgo] Merge branch 'master' of https://github.com/apache/spark into format_pom
    934f24d [witgo] review commit
    cf46edc [witgo] exclude jruby
    06e7328 [witgo] Merge branch 'SparkBuild' into format_pom
    99464d2 [witgo] fix maven hadoop-provided profile fails to build
    0c6c1fc [witgo] Fix compile spark core error with hadoop 0.23.x
    6851bec [witgo] Maintain consistent SparkBuild.scala, pom.xml
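A hedged sketch of the hadoop-0.23.x / hadoop-provided build that items 1 and 2 above refer to; the profile and flags follow Spark's Maven build conventions, and the Hadoop version is only an example, not taken from the commit:

    # Build Spark against Hadoop 0.23.x with Hadoop jars marked "provided"
    # (illustrative invocation).
    mvn -Phadoop-provided -Dhadoop.version=0.23.9 -DskipTests clean package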
* SPARK-1651: Delete existing deployment directory (Rahul Singhal, 2014-04-27; 1 file, -0/+1)

    Small bug fix to make sure the "spark contents" are copied to the deployment directory correctly.

    Author: Rahul Singhal <rahul.singhal@guavus.com>

    Closes #573 from rahulsinghaliitd/SPARK-1651 and squashes the following commits:
    402c999 [Rahul Singhal] SPARK-1651: Delete existing deployment directory
* SPARK-1650: Correctly identify maven project version (Rahul Singhal, 2014-04-27; 1 file, -1/+1)

    Better account for various side-effect outputs while executing "mvn help:evaluate -Dexpression=project.version".

    Author: Rahul Singhal <rahul.singhal@guavus.com>

    Closes #572 from rahulsinghaliitd/SPARK-1650 and squashes the following commits:
    fd6a611 [Rahul Singhal] SPARK-1650: Correctly identify maven project version
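One common way to strip those side-effect lines is sketched below; this illustrates the general technique and is not necessarily the exact expression the script uses:

    # mvn help:evaluate prints [INFO] and download-progress lines on stdout,
    # so keep only the last non-INFO line, which is the version itself.
    VERSION=$(mvn help:evaluate -Dexpression=project.version 2>/dev/null \
        | grep -v "INFO" \
        | tail -n 1)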
* SPARK-1619 Launch spark-shell with spark-submit (Patrick Wendell, 2014-04-24; 1 file, -1/+1)

    This simplifies the shell a bunch and passes all arguments through to spark-submit. There is a tiny incompatibility from 0.9.1: the `-c` shorthand for cores is no longer accepted, only `--cores`. However, spark-submit will give a good error message in this case, I don't think many people used this, and it's a trivial change for users.

    Author: Patrick Wendell <pwendell@gmail.com>

    Closes #542 from pwendell/spark-shell and squashes the following commits:
    9eb3e6f [Patrick Wendell] Updating Spark docs
    b552459 [Patrick Wendell] Andrew's feedback
    97720fa [Patrick Wendell] Review feedback
    aa2900b [Patrick Wendell] SPARK-1619 Launch spark-shell with spark-submit
* Small changes to release script (Patrick Wendell, 2014-04-24; 1 file, -0/+1)
* SPARK-1119 and other build improvements (Patrick Wendell, 2014-04-23; 1 file, -23/+47)

    1. Makes assembly and examples jar naming consistent in maven/sbt.
    2. Updates make-distribution.sh to use Maven and fixes some bugs.
    3. Updates the create-release script to call make-distribution script.

    Author: Patrick Wendell <pwendell@gmail.com>

    Closes #502 from pwendell/make-distribution and squashes the following commits:
    1a97f0d [Patrick Wendell] SPARK-1119 and other build improvements
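A hedged example of invoking the updated, Maven-based script; these flags reflect options the script exposed around this time, but the authoritative list is in the script's own usage output:

    # Build a binary distribution (Hadoop version and flags are illustrative).
    ./make-distribution.sh --hadoop 2.2.0 --with-yarn --tgz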
* Fix path for jar, make sed actually work on OSX (Nick Lanham, 2014-03-28; 1 file, -4/+3)

    Author: Nick Lanham <nick@afternight.org>

    Closes #264 from nicklan/make-distribution-fixes and squashes the following commits:
    172b981 [Nick Lanham] fix path for jar, make sed actually work on OSX
* Make sed do -i '' on OSX (Nick Lanham, 2014-03-27; 1 file, -2/+9)

    I don't have access to an OSX machine, so if someone could test this that would be great.

    Author: Nick Lanham <nick@afternight.org>

    Closes #258 from nicklan/osx-sed-fix and squashes the following commits:
    a6f158f [Nick Lanham] Also make mktemp work on OSX
    558fd6e [Nick Lanham] Make sed do -i '' on OSX
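The portability issue behind this fix, as a minimal sketch; the file name and sed expression are placeholders, not taken from the script:

    # GNU sed accepts -i with no argument; BSD/OS X sed requires a backup
    # suffix, which may be the empty string.
    if [[ "$(uname)" == "Darwin" ]]; then
        sed -i '' -e 's/old/new/' target-file   # BSD/OS X sed
    else
        sed -i -e 's/old/new/' target-file      # GNU sed (Linux)
    fi
    # OS X mktemp likewise needs an explicit template, e.g.: mktemp -d /tmp/spark.XXXXX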
* Bundle tachyon: SPARK-1269 (Nick Lanham, 2014-03-18; 1 file, -0/+32)

    This should all work as expected with the current version of the tachyon tarball (0.4.1).

    Author: Nick Lanham <nick@afternight.org>

    Closes #137 from nicklan/bundle-tachyon and squashes the following commits:
    2eee15b [Nick Lanham] Put back in exec, start tachyon first
    738ba23 [Nick Lanham] Move tachyon out of sbin
    f2f9bc6 [Nick Lanham] More checks for tachyon script
    111e8e1 [Nick Lanham] Only try tachyon operations if tachyon script exists
    0561574 [Nick Lanham] Copy over web resources so web interface can run
    4dc9809 [Nick Lanham] Update to tachyon 0.4.1
    0a1a20c [Nick Lanham] Add scripts using tachyon tarball
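The "only try tachyon operations if tachyon script exists" check amounts to a guard along these lines; the directory variable and the invocation are illustrative, not copied from the Spark scripts:

    # Skip Tachyon steps entirely when the bundled script is not present.
    TACHYON_BIN="$DIST_DIR/tachyon/bin/tachyon"
    if [ -x "$TACHYON_BIN" ]; then
        "$TACHYON_BIN" format
    fi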
* Fix make-distribution.sh "show version: command not found" error (liguoqiang, 2014-01-09; 1 file, -1/+1)
* Finish documentation changes (Holden Karau, 2014-01-05; 1 file, -1/+1)
* Code review feedback (Holden Karau, 2014-01-05; 1 file, -2/+5)
* Merge remote-tracking branch 'apache-github/master' into remove-binaries (Patrick Wendell, 2014-01-03; 1 file, -7/+4)

    Conflicts:
        core/src/test/scala/org/apache/spark/DriverSuite.scala
        docs/python-programming-guide.md
* A few leftover document changes (Prashant Sharma, 2014-01-02; 1 file, -2/+2)
* spark-shell -> bin/spark-shell (Prashant Sharma, 2014-01-02; 1 file, -1/+1)
* Merge branch 'scripts-reorg' of github.com:shane-huang/incubator-spark into spark-915-segregate-scripts (Prashant Sharma, 2014-01-02; 1 file, -4/+1)

    Conflicts:
        bin/spark-shell
        core/pom.xml
        core/src/main/scala/org/apache/spark/SparkContext.scala
        core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/CoarseMesosSchedulerBackend.scala
        core/src/main/scala/org/apache/spark/ui/UIWorkloadGenerator.scala
        core/src/test/scala/org/apache/spark/DriverSuite.scala
        python/run-tests
        sbin/compute-classpath.sh
        sbin/spark-class
        sbin/stop-slaves.sh
* Added spark-class and spark-executor to sbin (shane-huang, 2013-09-23; 1 file, -4/+1)

    Signed-off-by: shane-huang <shengsheng.huang@intel.com>
* Changes on top of Prashant's patch (Patrick Wendell, 2014-01-03; 1 file, -1/+2)

    Closes #316
* Removed sbt folder and changed docs accordingly (Prashant Sharma, 2014-01-02; 1 file, -2/+10)
* Fixed a bug of using a wildcard in quotes (Du Li, 2013-10-01; 1 file, -1/+1)
* Fix copy issue in https://github.com/mesos/spark/pull/899 (Matei Zaharia, 2013-09-09; 1 file, -1/+1)
* Fix path to assembly in make-distribution.sh (Matei Zaharia, 2013-08-29; 1 file, -1/+1)
* Fix PySpark for assembly run and include it in dist (Matei Zaharia, 2013-08-29; 1 file, -1/+4)
* Change build and run instructions to use assemblies (Matei Zaharia, 2013-08-29; 1 file, -8/+9)

    This commit makes Spark invocation saner by using an assembly JAR to find all of Spark's dependencies instead of adding all the JARs in lib_managed. It also packages the examples into an assembly and uses that as SPARK_EXAMPLES_JAR. Finally, it replaces the old "run" script with two better-named scripts: "run-examples" for examples, and "spark-class" for Spark internal classes (e.g. REPL, master, etc).

    This is also designed to minimize the confusion people have in trying to use "run" to run their own classes; it's not meant to do that, but now at least if they look at it, they can modify run-examples to do a decent job for them.

    As part of this, Bagel's examples are also now properly moved to the examples package instead of bagel.
* Change default SPARK_HADOOP_VERSION in make-distribution.sh too (Matei Zaharia, 2013-08-21; 1 file, -1/+1)
* Rename YARN build flag to SPARK_WITH_YARN (Jey Kottalam, 2013-08-16; 1 file, -4/+4)
* Allow make-distribution.sh to specify Hadoop version used (Jey Kottalam, 2013-08-16; 1 file, -11/+42)
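A hedged usage sketch of the options added around this point; the flag names follow the script's usage banner of that era and the Hadoop version string is only an example:

    # Build a distribution against a chosen Hadoop version, with YARN support.
    ./make-distribution.sh --hadoop 2.0.5-alpha --with-yarn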
* Added property 'spark.executor.uri' for launching on Mesos without requiring Spark to be installed (Benjamin Hindman, 2013-07-29; 1 file, -0/+1)

    Using 'make-distribution.sh' a user can put a Spark distribution at a URI supported by Mesos (e.g., 'hdfs://...') and then set that when launching their job. Also added SPARK_EXECUTOR_URI for the REPL.
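An illustrative end-to-end use of that property; the HDFS path and archive name are examples, not from the commit:

    # Build a distributable tarball and publish it where Mesos executors can fetch it.
    ./make-distribution.sh --tgz
    hadoop fs -put spark-*.tgz hdfs:///frameworks/spark-dist.tgz
    # Then point executors at it when launching the job:
    #   spark.executor.uri  hdfs:///frameworks/spark-dist.tgz
    # (or export SPARK_EXECUTOR_URI for the REPL)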
* Merge remote-tracking branch 'origin/pr/704' (Matei Zaharia, 2013-07-16; 1 file, -3/+21)

    Conflicts:
        make-distribution.sh
* Adding tgz option to make-distribution.sh (seanm, 2013-07-15; 1 file, -3/+21)
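A hedged sketch of what the new option adds on top of the existing dist/ output; the archive name is illustrative:

    ./make-distribution.sh --tgz
    # Conceptually: build dist/ as before, then package it, e.g.
    #   tar czf spark-<version>-bin.tgz dist/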
* Add Apache license headers and LICENSE and NOTICE files (Matei Zaharia, 2013-07-16; 1 file, -0/+19)
* Merge remote-tracking branch 'origin/pr/662' (Matei Zaharia, 2013-07-13; 1 file, -1/+2)

    Conflicts:
        bin/compute-classpath.sh
* Add deploy/testing procedure (Evan Chan, 2013-06-25; 1 file, -0/+8)
* Script to create binary distribution for Spark (Evan Chan, 2013-06-24; 1 file, -0/+30)