Commit message | Author | Age | Files | Lines | |
---|---|---|---|---|---|
* | Merge pull request #454 from jey/atomic-sbt-download. Closes #454. | Jey Kottalam | 2014-02-08 | 1 | -2/+3 |
| | | | | | | | | | | | | | | | | Make sbt download an atomic operation Modifies the `sbt/sbt` script to gracefully recover when a previous invocation died in the middle of downloading the SBT jar. Author: Jey Kottalam <jey@cs.berkeley.edu> == Merge branch commits == commit 6c600eb434a2f3e7d70b67831aeebde9b5c0f43b Author: Jey Kottalam <jey@cs.berkeley.edu> Date: Fri Jan 17 10:43:54 2014 -0800 Make sbt download an atomic operation | ||||
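The atomic-download idea from the commit above can be sketched as follows; `atomic_download` and the fetch callback are illustrative names for this sketch, not the actual `sbt/sbt` code:

```shell
#!/usr/bin/env bash
# Sketch of an atomic download: write to a ".part" temp file, then
# rename into place. mv within a single directory is an atomic
# rename(2), so a run killed mid-download never leaves a truncated
# jar at the final path -- at worst a stale ".part" file remains.
set -eu

atomic_download() {
  fetch_cmd=$1   # command that writes the payload to stdout
  dest=$2
  "$fetch_cmd" > "$dest.part" && mv "$dest.part" "$dest"
}
```

In the real script the fetch step would be a `curl` or `wget` invocation; a fresh run can also remove any leftover `.part` file before retrying, which is what lets it recover gracefully from a previous interrupted invocation.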
* | Small typo fix | Patrick Wendell | 2014-01-09 | 1 | -1/+1 |
| | |||||
* | Don't delegate to user's `sbt`. | Patrick Wendell | 2014-01-08 | 1 | -31/+20 |
| | | | | | | | | | | | | | | | | | | | | | | | | This changes our `sbt/sbt` script to not delegate to the user's `sbt` even if it is present. If users already have sbt installed and they want to use their own sbt, we'd expect them to just call sbt directly from within Spark. We no longer set any environment variables or anything from this script, so they should just launch sbt directly on their own. There are a number of hard-to-debug issues which can come from the current approach. One is if the user is unaware of an existing sbt installation and now without explanation their build breaks because they haven't configured options correctly (such as permgen size) within their sbt. Another is if the user has a much older version of sbt hanging around, in which case some of the older versions don't actually work well when newer versions of sbt are specified in the build file (reported by @marmbrus). A third is if the user has done some other modification to their sbt script, such as setting it to delegate to sbt/sbt in Spark, and this causes that to break (also reported by @marmbrus). So to keep things simple let's just avoid this path and remove it. Any user who already has sbt and wants to build Spark with it should be able to understand easily how to do it. | ||||
* | Add ASF header to the new sbt script. | Henry Saputra | 2014-01-07 | 1 | -0/+18 |
| | |||||
* | Use awk to extract the version | Holden Karau | 2014-01-06 | 1 | -1/+1 |
| | |||||
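The awk-based extraction above can be sketched roughly like this; the `version := "..."` line format is an assumption about an sbt-style build file, not taken verbatim from the commit:

```shell
#!/usr/bin/env bash
# Hypothetical sketch: pull a version string out of a line like
#   version := "0.9.0-incubating"
# by telling awk to split fields on the double-quote character, so the
# quoted value lands in field $2.
extract_version() {
  awk -F '"' '/version :=/ { print $2 }'
}

echo 'version := "0.9.0-incubating"' | extract_version
```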
* | Put quotes around arguments passed down to system sbt | Holden Karau | 2014-01-06 | 1 | -1/+1 |
| | |||||
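The quoting fix above matters because an unquoted `$@` re-splits arguments on whitespace before forwarding them; a minimal illustration (the function names here are mine, not the script's):

```shell
#!/usr/bin/env bash
# "$@" forwards each original argument as a single word;
# bare $@ word-splits them again, changing the argument count.
count_args()    { echo "$#"; }
pass_quoted()   { count_args "$@"; }
pass_unquoted() { count_args $@; }
```

Calling `pass_quoted "two words" x` reports 2 arguments, while `pass_unquoted "two words" x` reports 3, which is why commands must be passed down to the system sbt via `"$@"`.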
* | CR feedback (sbt -> sbt/sbt and correct JAR path in script) :) | Holden Karau | 2014-01-05 | 1 | -1/+1 |
| | |||||
* | Fix indentation | Holden Karau | 2014-01-05 | 1 | -16/+16 |
| | |||||
* | Code review feedback | Holden Karau | 2014-01-05 | 1 | -9/+4 |
| | |||||
* | reindent | Holden Karau | 2014-01-04 | 1 | -31/+31 |
| | |||||
* | And update docs to match | Holden Karau | 2014-01-04 | 1 | -1/+1 |
| | |||||
* | Make sbt in the sbt directory | Holden Karau | 2014-01-04 | 1 | -0/+0 |
| | |||||
* | Spelling | Holden Karau | 2014-01-04 | 1 | -1/+1 |
| | |||||
* | Pass commands down to system sbt as well | Holden Karau | 2014-01-04 | 1 | -1/+1 |
| | |||||
* | Add a script to download sbt if not present on the system | Holden Karau | 2014-01-04 | 1 | -0/+48 |
| | |||||
* | Removed sbt folder and changed docs accordingly | Prashant Sharma | 2014-01-02 | 3 | -68/+0 |
| | |||||
* | Fix Cygwin support in several scripts. | Josh Rosen | 2013-12-15 | 1 | -3/+18 |
| | | | | | | | | | This allows the spark-shell, spark-class, run-example, make-distribution.sh, and ./bin/start-* scripts to work under Cygwin. Note that this doesn't support PySpark under Cygwin, since that requires many additional `cygpath` calls from within Python and will be non-trivial to implement. This PR was inspired by, and subsumes, #253 (so close #253 after this is merged). | ||||
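The usual shape of such Cygwin fixes is to detect Cygwin from `uname` and convert POSIX paths to Windows form with `cygpath` only there; this is an assumed sketch of the pattern, not the commit's exact code:

```shell
#!/usr/bin/env bash
# Assumed sketch of a Cygwin-compatibility guard: on Cygwin, convert a
# POSIX path to its Windows form with cygpath -w (which a JVM launched
# from the script would need); elsewhere, pass the path through as-is.
to_native_path() {
  case "$(uname -s)" in
    CYGWIN*) cygpath -w "$1" ;;      # e.g. /home/u/x -> C:\cygwin\home\u\x
    *)       printf '%s\n' "$1" ;;
  esac
}
```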
* | Run script fixes for Windows after package & assembly change | Matei Zaharia | 2013-09-01 | 1 | -1/+1 |
| | |||||
* | Fix finding of assembly JAR, as well as some pointers to ./run | Matei Zaharia | 2013-08-29 | 1 | -1/+1 |
| | |||||
* | Pass SBT_OPTS environment through to sbt_launcher | Ian Buss | 2013-08-23 | 1 | -1/+1 |
| | |||||
* | Increase ReservedCodeCacheSize to 256m | Jey Kottalam | 2013-08-21 | 1 | -1/+1 |
| | |||||
* | Revert "Allow build configuration to be set in conf/spark-env.sh" | Jey Kottalam | 2013-08-21 | 1 | -9/+6 |
| | | | | This reverts commit 66e7a38a3229eeb6d980193048ebebcda1522acb. | ||||
* | Allow build configuration to be set in conf/spark-env.sh | Jey Kottalam | 2013-08-21 | 1 | -6/+9 |
| | |||||
* | Merge pull request #714 from adatao/master | Matei Zaharia | 2013-07-18 | 1 | -1/+1 |
|\ | | | | | [BUGFIX] Fix for sbt/sbt script SPARK_HOME setting | ||||
| * | [BUGFIX] Fix for sbt/sbt script SPARK_HOME setting | ctn | 2013-07-17 | 1 | -1/+1 |
| | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | In some environments, this command export SPARK_HOME=$(cd "$(dirname $0)/.."; pwd) echoes two paths, one by the "cd ..", and one by the "pwd". Note the resulting erroneous -jar paths below: ctn@ubuntu:~/src/spark$ sbt/sbt + EXTRA_ARGS= + '[' '' '!=' '' ']' +++ dirname sbt/sbt ++ cd sbt/.. ++ pwd + export 'SPARK_HOME=/home/ctn/src/spark /home/ctn/src/spark' + SPARK_HOME='/home/ctn/src/spark /home/ctn/src/spark' + export SPARK_TESTING=1 + SPARK_TESTING=1 + java -Xmx1200m -XX:MaxPermSize=350m -XX:ReservedCodeCacheSize=128m -jar /home/ctn/src/spark /home/ctn/src/spark/sbt/sbt-launch-0.11.3-2.jar Error: Invalid or corrupt jarfile /home/ctn/src/spark Committer: ctn <ctn@adatao.com> On branch master Changes to be committed: - Send output of the "cd .." part to /dev/null modified: sbt/sbt | ||||
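The double-path behaviour above is reproducible: when `cd` resolves a *relative* path through `CDPATH`, it echoes the resolved directory to stdout, so `$(cd ...; pwd)` captures that echo plus `pwd`'s output. A minimal repro and the style of fix the commit applies (sending `cd`'s output to `/dev/null`); the function names are illustrative:

```shell
#!/usr/bin/env bash
# With CDPATH set, cd on a relative path prints the directory it chose,
# so command substitution captures two lines instead of one.
buggy_home() { (CDPATH=/; cd tmp; pwd); }              # "/tmp" printed twice
fixed_home() { (CDPATH=/; cd tmp > /dev/null; pwd); }  # "/tmp" printed once
```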
* | | Consistently invoke bash with /usr/bin/env bash in scripts to make code more ↵ | Ubuntu | 2013-07-18 | 1 | -1/+1 |
|/ | | | | portable (JIRA Ticket SPARK-817) | ||||
* | Add Apache license headers and LICENSE and NOTICE files | Matei Zaharia | 2013-07-16 | 2 | -0/+40 |
| | |||||
* | Increase PermGen size | Matei Zaharia | 2013-07-13 | 1 | -1/+1 |
| | |||||
* | Increase ReservedCodeCacheSize for sbt | Jey Kottalam | 2013-04-16 | 1 | -1/+1 |
| | |||||
* | Update Windows scripts to launch daemons with less RAM and fix a few | Matei Zaharia | 2013-02-10 | 1 | -1/+1 |
| | | | | | | | other issues Conflicts: run2.cmd | ||||
* | Track workers by executor ID instead of hostname to allow multiple | Matei Zaharia | 2013-01-27 | 1 | -1/+1 |
| | | | | | executors per machine and remove the need for multiple IP addresses in unit tests. | ||||
* | Made run script add test-classes onto the classpath only if SPARK_TESTING is ↵ | root | 2012-10-07 | 1 | -0/+1 |
| | | | | set; fixes #216 | ||||
* | Echo off | Ravi Pandya | 2012-09-24 | 1 | -1/+1 |
| | |||||
* | Windows command scripts for sbt and run | Ravi Pandya | 2012-09-24 | 1 | -0/+5 |
| | |||||
* | Merge branch 'dev' of github.com:mesos/spark into dev | Matei Zaharia | 2012-06-15 | 1 | -1/+1 |
|\ | |||||
| * | Added shutdown for akka to SparkContext.stop(). Helps a little, but many ↵ | Tathagata Das | 2012-06-13 | 1 | -1/+1 |
| | | | | | | | | testsuites still fail. | ||||
* | | Update SBT to version 0.11.3-2. | Matei Zaharia | 2012-06-07 | 2 | -0/+0 |
|/ | |||||
* | Update to SBT 0.11.1 | Matei Zaharia | 2011-11-07 | 1 | -0/+0 |
| | |||||
* | Upgrade to SBT 0.11.0. | Ismael Juma | 2011-09-26 | 2 | -0/+0 |
| | |||||
* | Initial work on converting build to SBT 0.10.1 | Ismael Juma | 2011-07-15 | 2 | -0/+0 |
| | |||||
* | Give SBT a bit more memory so it can do an update / compile / test in one JVM | Matei Zaharia | 2011-05-31 | 1 | -1/+1 |
| | |||||
* | Various minor fixes | Matei Zaharia | 2011-05-19 | 1 | -1/+6 |
| | |||||
* | Upgraded to SBT 0.7.5 | Matei Zaharia | 2011-05-09 | 1 | -0/+0 |
| | |||||
* | Increased SBT mem to 700 MB so that unit tests run more nicely | Matei Zaharia | 2011-02-08 | 1 | -1/+1 |
| | |||||
* | Initial work to get Spark compiling with SBT 0.7.5 RC0 | Matei Zaharia | 2010-11-13 | 2 | -0/+2 |