path: root/sbt
Commit message | Author | Date | Files | Lines
* [SPARK-4312] bash doesn't have "die" | Jey Kottalam | 2014-11-10 | 1 | -1/+2
  sbt-launch-lib.bash includes a `die` command, but it is not a valid command on Linux, Mac OS X, or Windows. Closes #2898
  Author: Jey Kottalam <jey@kottalam.net>
  Closes #3182 from sarutak/SPARK-4312 and squashes the following commits:
  24c6677 [Jey Kottalam] bash doesn't have "die"
  (cherry picked from commit c5db8e2c07e442654f3d368608108e714e080184)
  Signed-off-by: Patrick Wendell <pwendell@gmail.com>
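A minimal sketch of the issue described above; this is an illustration rather than the actual sbt-launch-lib.bash, since `die` is not a shell builtin and has to be defined before it can be used:

```bash
#!/usr/bin/env bash
# Sketch only (not the real sbt-launch-lib.bash): `die` is not a bash builtin,
# so a launcher script must define it before calling it for fatal errors.
die() {
  echo "ERROR: $*" >&2   # print the message to stderr
  exit 1                 # abort the script with a non-zero status
}

# Example use: bail out early if a required tool is missing.
command -v java >/dev/null 2>&1 || die "java is not installed or not on PATH"
```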
* SPARK-3337 Paranoid quoting in shell to allow install dirs with spaces within. | Prashant Sharma | 2014-09-08 | 2 | -16/+16
  ... Tested! TBH, it isn't a great idea to have a directory with spaces in its name, because emacs doesn't like it, then hadoop doesn't like it, and so on...
  Author: Prashant Sharma <prashant.s@imaginea.com>
  Closes #2229 from ScrapCodes/SPARK-3337/quoting-shell-scripts and squashes the following commits:
  d4ad660 [Prashant Sharma] SPARK-3337 Paranoid quoting in shell to allow install dirs with spaces within.
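As a hedged illustration of what "paranoid quoting" means in practice (a generic sketch, not the actual Spark script), every variable and command-substitution expansion is wrapped in double quotes so that paths containing spaces stay single arguments:

```bash
#!/usr/bin/env bash
# Generic sketch, not the real Spark script.
# Unquoted, a path like "/opt/my spark" word-splits into two arguments:
#   SPARK_HOME=$(cd $(dirname $0)/..; pwd); ls $SPARK_HOME   # breaks on spaces
# Quoted, every expansion stays one argument, even with spaces in the path:
SPARK_HOME="$(cd "$(dirname "$0")/.."; pwd)"
ls "$SPARK_HOME"
```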
* [Build] suppress curl/wget progress bars | Nicholas Chammas | 2014-09-05 | 1 | -2/+2
  In the Jenkins console output, `curl` gives us mountains of `#` symbols as it tries to show its download progress. ![noise from curl in Jenkins output](http://i.imgur.com/P2E7yUw.png)
  I don't think this is useful so I've changed things to suppress these progress bars. If there is actually some use to this, feel free to reject this proposal.
  Author: Nicholas Chammas <nicholas.chammas@gmail.com>
  Closes #2279 from nchammas/trim-test-output and squashes the following commits:
  14a720c [Nicholas Chammas] suppress curl/wget progress bars
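For reference, a hedged sketch of the kind of flag change this describes; the URL and file names here are placeholders, not the values used by the build scripts:

```bash
# Placeholders only; not the exact commands in the Spark scripts.
URL="https://example.com/sbt-launch.jar"

# -s silences curl's progress meter, -S still surfaces errors,
# so CI logs stay clean without hiding failures.
curl -sS -o sbt-launch.jar "$URL"

# wget equivalent: -q suppresses the progress output.
wget -q -O sbt-launch.jar "$URL"
```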
* [SPARK-2437] Rename MAVEN_PROFILES to SBT_MAVEN_PROFILES and add SBT_MAVEN_PROPERTIES | Prashant Sharma | 2014-07-11 | 1 | -1/+1
  NOTE: It is not possible to use both the env variable `SBT_MAVEN_PROFILES` and the `-P` flag at the same time; if `-P` is specified, it takes precedence.
  Author: Prashant Sharma <prashant.s@imaginea.com>
  Closes #1374 from ScrapCodes/SPARK-2437/rename-MAVEN_PROFILES and squashes the following commits:
  8694bde [Prashant Sharma] [SPARK-2437] Rename MAVEN_PROFILES to SBT_MAVEN_PROFILES and add SBT_MAVEN_PROPERTIES
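A hedged usage sketch of the precedence rule above, modeled on the `MAVEN_PROFILES` examples in the following commit; the exact profile list and separator are assumptions:

```bash
# Assumed usage, mirroring the MAVEN_PROFILES examples below after the rename.
SBT_MAVEN_PROFILES="yarn, hadoop-2.2" sbt/sbt clean assembly

# If -P flags are also given, they take precedence over SBT_MAVEN_PROFILES.
sbt/sbt -Pyarn -Phadoop-2.2 clean assembly
```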
* [SPARK-1776] Have Spark's SBT build read dependencies from Maven. | Prashant Sharma | 2014-07-10 | 2 | -1/+11
  The patch introduces the new way of working while retaining the existing ways of doing things. For example, the build instruction for yarn in maven is `mvn -Pyarn -PHadoop2.2 clean package -DskipTests`; in sbt it can become `MAVEN_PROFILES="yarn, hadoop-2.2" sbt/sbt clean assembly`. Also supports `sbt/sbt -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 clean assembly`.
  Author: Prashant Sharma <prashant.s@imaginea.com>
  Author: Patrick Wendell <pwendell@gmail.com>
  Closes #772 from ScrapCodes/sbt-maven and squashes the following commits:
  a8ac951 [Prashant Sharma] Updated sbt version.
  62b09bb [Prashant Sharma] Improvements.
  fa6221d [Prashant Sharma] Excluding sql from mima
  4b8875e [Prashant Sharma] Sbt assembly no longer builds tools by default.
  72651ca [Prashant Sharma] Addresses code review comments.
  acab73d [Prashant Sharma] Revert "Small fix to run-examples script."
  ac4312c [Prashant Sharma] Revert "minor fix"
  6af91ac [Prashant Sharma] Ported oldDeps back. + fixes issues with prev commit.
  65cf06c [Prashant Sharma] Servlet API jars mess up with the other servlet jars on the class path.
  446768e [Prashant Sharma] minor fix
  89b9777 [Prashant Sharma] Merge conflicts
  d0a02f2 [Prashant Sharma] Bumped up pom versions. Since the build now depends on pom it is better updated there. + general cleanups.
  dccc8ac [Prashant Sharma] updated mima to check against 1.0
  a49c61b [Prashant Sharma] Fix for tools jar
  a2f5ae1 [Prashant Sharma] Fixes a bug in dependencies.
  cf88758 [Prashant Sharma] cleanup
  9439ea3 [Prashant Sharma] Small fix to run-examples script.
  96cea1f [Prashant Sharma] SPARK-1776 Have Spark's SBT build read dependencies from Maven.
  36efa62 [Patrick Wendell] Set project name in pom files and added eclipse/intellij plugins.
  4973dbd [Patrick Wendell] Example build using pom reader.
* [SQL] Un-ignore a test that is now passing. | Michael Armbrust | 2014-03-26 | 1 | -0/+8
  Add golden answer for aforementioned test. Also, fix golden test generation from sbt/sbt by setting the classpath correctly.
  Author: Michael Armbrust <michael@databricks.com>
  Closes #244 from marmbrus/partTest and squashes the following commits:
  37a33c9 [Michael Armbrust] Un-ignore a test that is now passing, add golden answer for aforementioned test. Fix golden test generation from sbt/sbt.
* Allow sbt to use more than 1G of heap. | Reynold Xin | 2014-03-07 | 1 | -1/+1
  There was a mistake in the sbt build file (introduced by 012bd5fbc97dc40bb61e0e2b9cc97ed0083f37f6) in which we set the default to 2048 and then immediately reset it to 1024. Without this, building Spark can run out of permgen space on my machine.
  Author: Reynold Xin <rxin@apache.org>
  Closes #103 from rxin/sbt and squashes the following commits:
  8829c34 [Reynold Xin] Allow sbt to use more than 1G of heap.
* [java8API] SPARK-964 Investigate the potential for using JDK 8 lambda expressions for the Java/Scala APIs | Prashant Sharma | 2014-03-03 | 1 | -2/+9
  Author: Prashant Sharma <prashant.s@imaginea.com>
  Author: Patrick Wendell <pwendell@gmail.com>
  Closes #17 from ScrapCodes/java8-lambdas and squashes the following commits:
  95850e6 [Patrick Wendell] Some doc improvements and build changes to the Java 8 patch.
  85a954e [Prashant Sharma] Nit. import orderings.
  673f7ac [Prashant Sharma] Added support for -java-home as well
  80a13e8 [Prashant Sharma] Used fake class tag syntax
  26eb3f6 [Prashant Sharma] Patrick's comments on PR.
  35d8d79 [Prashant Sharma] Specified java 8 building in the docs
  31d4cd6 [Prashant Sharma] Maven build to support -Pjava8-tests flag.
  4ab87d3 [Prashant Sharma] Review feedback on the pr
  c33dc2c [Prashant Sharma] SPARK-964, Java 8 API Support.
* Merge the old sbt-launch-lib.bash with the new sbt-launcher jar downloading logic. | Michael Armbrust | 2014-03-02 | 2 | -51/+280
  This allows developers to pass options (such as -D) to sbt. I also modified the SparkBuild to ensure spark specific properties are propagated to forked test JVMs.
  Author: Michael Armbrust <michael@databricks.com>
  Closes #14 from marmbrus/sbtScripts and squashes the following commits:
  c008b18 [Michael Armbrust] Merge the old sbt-launch-lib.bash with the new sbt-launcher jar downloading logic.
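A hedged sketch of what "pass options (such as -D) to sbt" looks like from the command line; the property shown is borrowed from the Maven/SBT examples elsewhere in this log, not from this commit:

```bash
# Illustrative invocation only; the -D property is an example. The merged
# script forwards -D (and other JVM options) to the sbt launcher instead of
# dropping them.
sbt/sbt -Dhadoop.version=2.2.0 clean assembly
```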
* Merge pull request #454 from jey/atomic-sbt-download. Closes #454. | Jey Kottalam | 2014-02-08 | 1 | -2/+3
  Make sbt download an atomic operation. Modifies the `sbt/sbt` script to gracefully recover when a previous invocation died in the middle of downloading the SBT jar.
  Author: Jey Kottalam <jey@cs.berkeley.edu>
  == Merge branch commits ==
  commit 6c600eb434a2f3e7d70b67831aeebde9b5c0f43b
  Author: Jey Kottalam <jey@cs.berkeley.edu>
  Date: Fri Jan 17 10:43:54 2014 -0800
  Make sbt download an atomic operation
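A hedged sketch of the usual way to make a download atomic (file names and URL are placeholders, not the values in the actual script): fetch into a temporary file and rename it into place only on success, so a killed download never leaves a truncated jar behind.

```bash
#!/usr/bin/env bash
# Placeholder names; not the real sbt/sbt contents.
JAR="sbt/sbt-launch.jar"
URL="https://example.com/sbt-launch.jar"

if [ ! -f "$JAR" ]; then
  # Download to a side file, then mv (atomic within one filesystem) into place.
  curl -sS -o "${JAR}.part" "$URL" && mv "${JAR}.part" "$JAR"
fi
```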
* Small typo fix | Patrick Wendell | 2014-01-09 | 1 | -1/+1
* Don't delegate to the user's `sbt`. | Patrick Wendell | 2014-01-08 | 1 | -31/+20
  This changes our `sbt/sbt` script to not delegate to the user's `sbt` even if it is present. If users already have sbt installed and they want to use their own sbt, we'd expect them to just call sbt directly from within Spark. We no longer set any environment variables or anything from this script, so they should just launch sbt directly on their own.
  There are a number of hard-to-debug issues which can come from the current approach. One is if the user is unaware of an existing sbt installation and now, without explanation, their build breaks because they haven't configured options correctly (such as permgen size) within their sbt. Another is if the user has a much older version of sbt hanging around, in which case some of the older versions don't actually work well when newer versions of sbt are specified in the build file (reported by @marmbrus). A third is if the user has done some other modification to their sbt script, such as setting it to delegate to sbt/sbt in Spark, and this causes that to break (also reported by @marmbrus).
  So to keep things simple let's just avoid this path and remove it. Any user who already has sbt and wants to build Spark with it should be able to understand easily how to do it.
* Add ASF header to the new sbt script. | Henry Saputra | 2014-01-07 | 1 | -0/+18
* Use awk to extract the version | Holden Karau | 2014-01-06 | 1 | -1/+1
* Put quotes around arguments passed down to system sbt | Holden Karau | 2014-01-06 | 1 | -1/+1
* CR feedback (sbt -> sbt/sbt and correct JAR path in script) :) | Holden Karau | 2014-01-05 | 1 | -1/+1
* Fix indentation | Holden Karau | 2014-01-05 | 1 | -16/+16
* Code review feedback | Holden Karau | 2014-01-05 | 1 | -9/+4
* Reindent | Holden Karau | 2014-01-04 | 1 | -31/+31
* And update docs to match | Holden Karau | 2014-01-04 | 1 | -1/+1
* Make sbt in the sbt directory | Holden Karau | 2014-01-04 | 1 | -0/+0
* Spelling | Holden Karau | 2014-01-04 | 1 | -1/+1
* Pass commands down to system sbt as well | Holden Karau | 2014-01-04 | 1 | -1/+1
* Add a script to download sbt if not present on the system | Holden Karau | 2014-01-04 | 1 | -0/+48
* Removed sbt folder and changed docs accordingly | Prashant Sharma | 2014-01-02 | 3 | -68/+0
* Fix Cygwin support in several scripts. | Josh Rosen | 2013-12-15 | 1 | -3/+18
  This allows the spark-shell, spark-class, run-example, make-distribution.sh, and ./bin/start-* scripts to work under Cygwin. Note that this doesn't support PySpark under Cygwin, since that requires many additional `cygpath` calls from within Python and will be non-trivial to implement. This PR was inspired by, and subsumes, #253 (so close #253 after this is merged).
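For context, a generic sketch of the kind of Cygwin handling such scripts typically need; this is an assumption about the approach, not the exact Spark change: POSIX-style paths are converted with `cygpath` before being handed to a Windows JVM.

```bash
# Generic Cygwin pattern, shown as an assumption rather than the actual diff.
case "$(uname -s)" in
  CYGWIN*)
    # Convert a colon-separated POSIX classpath into a Windows-style path list.
    CLASSPATH="$(cygpath -w -p "$CLASSPATH")"
    ;;
esac
java -cp "$CLASSPATH" -version
```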
* Run script fixes for Windows after package & assembly change | Matei Zaharia | 2013-09-01 | 1 | -1/+1
* Fix finding of assembly JAR, as well as some pointers to ./run | Matei Zaharia | 2013-08-29 | 1 | -1/+1
* Pass SBT_OPTS environment through to sbt_launcher | Ian Buss | 2013-08-23 | 1 | -1/+1
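A hedged sketch of what passing `SBT_OPTS` through looks like; apart from `SBT_OPTS` itself, the names and jar path are placeholders:

```bash
# Sketch only; jar path is a placeholder. SBT_OPTS is deliberately unquoted
# so that a value like "-Xmx2g -Dsbt.log.noformat=true" splits into separate
# JVM arguments when handed to java.
SBT_LAUNCH_JAR="sbt/sbt-launch.jar"
java $SBT_OPTS -jar "$SBT_LAUNCH_JAR" "$@"
```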
* Increase ReservedCodeCacheSize to 256m | Jey Kottalam | 2013-08-21 | 1 | -1/+1
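For reference, the memory-related commits in this log adjust JVM flags of this general shape; the combination below is illustrative (an older set of values appears verbatim in the shell trace further down this log), not the exact line from any one commit:

```bash
# Illustrative flag set only; the precise values changed across these commits.
java -Xmx1200m -XX:MaxPermSize=350m -XX:ReservedCodeCacheSize=256m \
  -jar sbt/sbt-launch.jar "$@"
```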
* Revert "Allow build configuration to be set in conf/spark-env.sh"Jey Kottalam2013-08-211-9/+6
| | | | This reverts commit 66e7a38a3229eeb6d980193048ebebcda1522acb.
* Allow build configuration to be set in conf/spark-env.shJey Kottalam2013-08-211-6/+9
|
* Merge pull request #714 from adatao/master | Matei Zaharia | 2013-07-18 | 1 | -1/+1
  [BUGFIX] Fix for sbt/sbt script SPARK_HOME setting
  * [BUGFIX] Fix for sbt/sbt script SPARK_HOME setting | ctn | 2013-07-17 | 1 | -1/+1
    In some environments, this command
      export SPARK_HOME=$(cd "$(dirname $0)/.."; pwd)
    echoes two paths, one by the "cd ..", and one by the "pwd". Note the resulting erroneous -jar paths below:
      ctn@ubuntu:~/src/spark$ sbt/sbt
      + EXTRA_ARGS=
      + '[' '' '!=' '' ']'
      +++ dirname sbt/sbt
      ++ cd sbt/..
      ++ pwd
      + export 'SPARK_HOME=/home/ctn/src/spark /home/ctn/src/spark'
      + SPARK_HOME='/home/ctn/src/spark /home/ctn/src/spark'
      + export SPARK_TESTING=1
      + SPARK_TESTING=1
      + java -Xmx1200m -XX:MaxPermSize=350m -XX:ReservedCodeCacheSize=128m -jar /home/ctn/src/spark /home/ctn/src/spark/sbt/sbt-launch-0.11.3-2.jar
      Error: Invalid or corrupt jarfile /home/ctn/src/spark
    Committer: ctn <ctn@adatao.com>
    On branch master. Changes to be committed: send output of the "cd .." part to /dev/null (modified: sbt/sbt)
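A hedged sketch of the fix described in that commit: silence the output of the `cd` step (which can print the directory it changed into, for example when `CDPATH` is set) so the command substitution captures only `pwd`'s output. This illustrates the pattern, not the exact line in sbt/sbt.

```bash
# Illustration only. With CDPATH set (one environment where this bites),
# `cd` echoes the resolved directory, so the $() above captured two paths.
# Redirecting cd's stdout leaves only pwd's single path in SPARK_HOME.
export SPARK_HOME="$(cd "$(dirname "$0")/.." > /dev/null; pwd)"
echo "$SPARK_HOME"
```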
* Consistently invoke bash with /usr/bin/env bash in scripts to make code more portable (JIRA Ticket SPARK-817) | Ubuntu | 2013-07-18 | 1 | -1/+1
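The change amounts to using the portable shebang form, sketched here for completeness:

```bash
#!/usr/bin/env bash
# Resolves bash through PATH instead of hard-coding /bin/bash, which helps on
# systems where bash is installed somewhere else.
echo "running under bash $BASH_VERSION"
```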
* Add Apache license headers and LICENSE and NOTICE files | Matei Zaharia | 2013-07-16 | 2 | -0/+40
* Increase PermGen size | Matei Zaharia | 2013-07-13 | 1 | -1/+1
* Increase ReservedCodeCacheSize for sbt | Jey Kottalam | 2013-04-16 | 1 | -1/+1
* Update Windows scripts to launch daemons with less RAM and fix a few other issues | Matei Zaharia | 2013-02-10 | 1 | -1/+1
  Conflicts: run2.cmd
* Track workers by executor ID instead of hostname to allow multiple executors per machine and remove the need for multiple IP addresses in unit tests. | Matei Zaharia | 2013-01-27 | 1 | -1/+1
* Made run script add test-classes onto the classpath only if SPARK_TESTING is set; fixes #216 | root | 2012-10-07 | 1 | -0/+1
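A hedged sketch of that conditional; the classpath entry shown is a placeholder, not necessarily the path the real script uses:

```bash
# Placeholder path; illustrates gating test classes behind SPARK_TESTING.
if [ -n "$SPARK_TESTING" ]; then
  CLASSPATH="$CLASSPATH:$SPARK_HOME/core/target/test-classes"
fi
```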
* Echo off | Ravi Pandya | 2012-09-24 | 1 | -1/+1
* Windows command scripts for sbt and run | Ravi Pandya | 2012-09-24 | 1 | -0/+5
* Merge branch 'dev' of github.com:mesos/spark into dev | Matei Zaharia | 2012-06-15 | 1 | -1/+1
  * Added shutdown for akka to SparkContext.stop(). Helps a little, but many testsuites still fail. | Tathagata Das | 2012-06-13 | 1 | -1/+1
* Update SBT to version 0.11.3-2. | Matei Zaharia | 2012-06-07 | 2 | -0/+0
* Update to SBT 0.11.1 | Matei Zaharia | 2011-11-07 | 1 | -0/+0
* Upgrade to SBT 0.11.0. | Ismael Juma | 2011-09-26 | 2 | -0/+0
* Initial work on converting build to SBT 0.10.1 | Ismael Juma | 2011-07-15 | 2 | -0/+0
* Give SBT a bit more memory so it can do an update / compile / test in one JVM | Matei Zaharia | 2011-05-31 | 1 | -1/+1