| Commit message | Author | Age | Files | Lines |
There was a mistake in the sbt build file (introduced by 012bd5fbc97dc40bb61e0e2b9cc97ed0083f37f6) in which we set the default to 2048 and then immediately reset it to 1024.
Without this fix, building Spark can run out of permgen space on my machine.
Author: Reynold Xin <rxin@apache.org>
Closes #103 from rxin/sbt and squashes the following commits:
8829c34 [Reynold Xin] Allow sbt to use more than 1G of heap.
expressions for the Java/Scala APIs
Author: Prashant Sharma <prashant.s@imaginea.com>
Author: Patrick Wendell <pwendell@gmail.com>
Closes #17 from ScrapCodes/java8-lambdas and squashes the following commits:
95850e6 [Patrick Wendell] Some doc improvements and build changes to the Java 8 patch.
85a954e [Prashant Sharma] Nit. import orderings.
673f7ac [Prashant Sharma] Added support for -java-home as well
80a13e8 [Prashant Sharma] Used fake class tag syntax
26eb3f6 [Prashant Sharma] Patrick's comments on PR.
35d8d79 [Prashant Sharma] Specified java 8 building in the docs
31d4cd6 [Prashant Sharma] Maven build to support -Pjava8-tests flag.
4ab87d3 [Prashant Sharma] Review feedback on the pr
c33dc2c [Prashant Sharma] SPARK-964, Java 8 API Support.
Merge the old sbt-launch-lib.bash with the new sbt-launcher jar downloading logic.
This allows developers to pass options (such as -D) to sbt. I also modified the SparkBuild to ensure spark specific properties are propagated to forked test JVMs.
Author: Michael Armbrust <michael@databricks.com>
Closes #14 from marmbrus/sbtScripts and squashes the following commits:
c008b18 [Michael Armbrust] Merge the old sbt-launch-lib.bash with the new sbt-launcher jar downloading logic.
Make sbt download an atomic operation
Modifies the `sbt/sbt` script to gracefully recover when a previous invocation died in the middle of downloading the SBT jar.
Author: Jey Kottalam <jey@cs.berkeley.edu>
== Merge branch commits ==
commit 6c600eb434a2f3e7d70b67831aeebde9b5c0f43b
Author: Jey Kottalam <jey@cs.berkeley.edu>
Date: Fri Jan 17 10:43:54 2014 -0800
Make sbt download an atomic operation
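The recovery described above boils down to downloading into a temporary file and renaming it into place only once the transfer completes. A minimal sketch of that idea follows; the function name, jar name, and URL are hypothetical, not taken from the actual sbt/sbt script:

```shell
#!/usr/bin/env bash
# Hedged sketch of the atomic-download idea (acquire_jar and its arguments
# are illustrative names, not from the real script).
acquire_jar() {
  local jar="$1" url="$2"
  [ -f "$jar" ] && return 0            # a finished jar is never re-fetched
  # Download into a .part file: if this invocation dies mid-transfer, only
  # the .part file is left behind, never a truncated "$jar".
  curl -fsSL -o "${jar}.part" "$url" &&
    mv "${jar}.part" "$jar"            # rename within one filesystem is atomic
}
```

A later invocation that finds only the `.part` file simply re-downloads; one that finds the jar can trust it is complete.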
This changes our `sbt/sbt` script to not delegate to the user's `sbt`
even if it is present. If users already have sbt installed and they
want to use their own sbt, we'd expect them to just call sbt directly
from within Spark. We no longer set any environment variables or anything
from this script, so they should just launch sbt directly on their own.
There are a number of hard-to-debug issues which can come from the
current approach. One is if the user is unaware of an existing sbt
installation, and now, without explanation, their build breaks because
they haven't configured options correctly (such as permgen size)
within their sbt. Another is if the user has a much older version
of sbt hanging around, in which case some of the older versions
don't actually work well when newer versions of sbt are specified
in the build file (reported by @marmbrus). A third is if the user
has done some other modification to their sbt script, such as
setting it to delegate to sbt/sbt in Spark, and this causes
that to break (also reported by @marmbrus).
So to keep things simple, let's just avoid this path and
remove it. Any user who already has sbt and wants to build
Spark with it should be able to understand easily how to do it.
This allows the spark-shell, spark-class, run-example, make-distribution.sh,
and ./bin/start-* scripts to work under Cygwin. Note that this doesn't
support PySpark under Cygwin, since that requires many additional `cygpath`
calls from within Python and will be non-trivial to implement.
This PR was inspired by, and subsumes, #253 (so close #253 after this is merged).
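The core of this kind of Cygwin support is path translation: the JVM is a native Windows program, so POSIX-style paths must be converted with `cygpath` before being handed to it. A minimal sketch under that assumption (the helper name is hypothetical; the real patch touches several scripts):

```shell
#!/usr/bin/env bash
# Hypothetical helper, not the actual Spark change: convert a classpath to
# a form the JVM understands when running under Cygwin; pass it through
# untouched everywhere else.
to_jvm_classpath() {
  local cp="$1"
  case "$(uname)" in
    CYGWIN*) cygpath -wp "$cp" ;;   # POSIX path list -> Windows drive-letter form
    *)       printf '%s\n' "$cp" ;;
  esac
}
```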
This reverts commit 66e7a38a3229eeb6d980193048ebebcda1522acb.
[BUGFIX] Fix for sbt/sbt script SPARK_HOME setting
In some environments, this command
export SPARK_HOME=$(cd "$(dirname $0)/.."; pwd)
echoes two paths: one from the "cd ..", and one from the "pwd". Note the resulting
erroneous -jar paths below:
ctn@ubuntu:~/src/spark$ sbt/sbt
+ EXTRA_ARGS=
+ '[' '' '!=' '' ']'
+++ dirname sbt/sbt
++ cd sbt/..
++ pwd
+ export 'SPARK_HOME=/home/ctn/src/spark
/home/ctn/src/spark'
+ SPARK_HOME='/home/ctn/src/spark
/home/ctn/src/spark'
+ export SPARK_TESTING=1
+ SPARK_TESTING=1
+ java -Xmx1200m -XX:MaxPermSize=350m -XX:ReservedCodeCacheSize=128m -jar /home/ctn/src/spark /home/ctn/src/spark/sbt/sbt-launch-0.11.3-2.jar
Error: Invalid or corrupt jarfile /home/ctn/src/spark
Committer: ctn <ctn@adatao.com>
On branch master
Changes to be committed:
- Send output of the "cd .." part to /dev/null
modified: sbt/sbt
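The stray output comes from `cd` itself: when CDPATH is set and `cd` resolves its argument through a CDPATH entry, it prints the resolved directory, and the command substitution captures that line along with pwd's. A small demonstration of the fix (the directory names here are illustrative, not Spark's):

```shell
#!/usr/bin/env bash
# Demonstration of the fix: discard whatever `cd` prints so the
# substitution captures only pwd's output.
mkdir -p /tmp/sparkhome-demo/sbt
cd /tmp/sparkhome-demo
export CDPATH=.                              # a common trigger: cd now echoes its target
SPARK_HOME="$(cd sbt/.. > /dev/null; pwd)"   # the /dev/null redirect is the fix
printf '%s\n' "$SPARK_HOME"
```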
portable (JIRA Ticket SPARK-817)
other issues
Conflicts:
run2.cmd
executors per machine and remove the need for multiple IP addresses in
unit tests.
set; fixes #216
test suites still fail.