path: root/project
Commit message (Author, Age, Files changed, Lines -/+)
* Update Maven build to create assemblies expected by new scripts (Matei Zaharia, 2013-08-29, 1 file, -2/+2)
|   This includes the following changes:
|   - The "assembly" package now builds in Maven by default, and creates an
|     assembly containing both hadoop-client and Spark, unlike the old BigTop
|     distribution assembly that skipped hadoop-client
|   - There is now a bigtop-dist package to build the old BigTop assembly
|   - The repl-bin package is no longer built by default since the scripts
|     don't rely on it; instead it can be enabled with -Prepl-bin
|   - Py4J is now included in the assembly/lib folder as a local Maven repo,
|     so that the Maven package can link to it
|   - run-example now adds the original Spark classpath as well because the
|     Maven examples assembly lists spark-core and such as provided
|   - The various Maven projects add a spark-yarn dependency correctly
* Provide more memory for tests (Matei Zaharia, 2013-08-29, 1 file, -1/+1)
|
* Change build and run instructions to use assemblies (Matei Zaharia, 2013-08-29, 3 files, -22/+34)
|   This commit makes Spark invocation saner by using an assembly JAR to find
|   all of Spark's dependencies instead of adding all the JARs in lib_managed.
|   It also packages the examples into an assembly and uses that as
|   SPARK_EXAMPLES_JAR. Finally, it replaces the old "run" script with two
|   better-named scripts: "run-examples" for examples, and "spark-class" for
|   Spark internal classes (e.g. REPL, master, etc). This is also designed to
|   minimize the confusion people have in trying to use "run" to run their own
|   classes; it's not meant to do that, but now at least if they look at it,
|   they can modify run-examples to do a decent job for them. As part of this,
|   Bagel's examples are also now properly moved to the examples package
|   instead of bagel.
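The assembly-based launch described in the commit above hinges on the build producing a single fat JAR. A minimal sbt-assembly configuration of that shape, using the sbt 0.12-era plugin API, might look like the following; the project name and merge rules are illustrative assumptions, not the actual SparkBuild.scala contents:

```scala
import sbt._
import Keys._
import sbtassembly.Plugin._
import AssemblyKeys._

object ExampleBuild extends Build {
  // Hypothetical project: bundle the app and its dependencies into one JAR
  // so launch scripts can put a single file on the classpath.
  lazy val assemblyProj = Project("assembly", file("assembly"))
    .settings(assemblySettings: _*)
    .settings(
      // Resolve duplicate entries across dependency JARs; these rules are
      // illustrative defaults, not Spark's exact merge strategy.
      mergeStrategy in assembly := {
        case m if m.toLowerCase.endsWith("manifest.mf") => MergeStrategy.discard
        case _                                          => MergeStrategy.first
      }
    )
}
```

Launch scripts can then reference the one assembly JAR instead of enumerating everything under lib_managed.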
* Revert "Merge pull request #841 from rxin/json" (Reynold Xin, 2013-08-26, 1 file, -0/+1)
|   This reverts commit 1fb1b0992838c8cdd57eec45793e67a0490f1a52, reversing
|   changes made to c69c48947d5102c81a9425cb380d861c3903685c.
* Upgrade SBT IDE project generators (Jey Kottalam, 2013-08-23, 1 file, -2/+2)
|
* Fix SBT generation of IDE project files (Jey Kottalam, 2013-08-23, 1 file, -5/+12)
|
* Re-add removed dependency on 'commons-daemon' (Jey Kottalam, 2013-08-22, 1 file, -0/+1)
|   Fixes SBT build under Hadoop 0.23.9 and 2.0.4
* Merge pull request #855 from jey/update-build-docs (Matei Zaharia, 2013-08-22, 1 file, -4/+3)
|\
| |   Update build docs
| * Remove references to unsupported Hadoop versions (Jey Kottalam, 2013-08-21, 1 file, -4/+3)
| |
* | Merge pull request #854 from markhamstra/pomUpdate (Matei Zaharia, 2013-08-22, 1 file, -4/+1)
|\ \
| |/
|/|   Synced sbt and maven builds to use the same dependencies, etc.
| * Synced sbt and maven builds (Mark Hamstra, 2013-08-21, 1 file, -4/+1)
| |
* | Downgraded default build hadoop version to 1.0.4. (Reynold Xin, 2013-08-21, 1 file, -1/+1)
|/
* Merge remote-tracking branch 'jey/hadoop-agnostic' (Matei Zaharia, 2013-08-20, 1 file, -41/+35)
|\
| |   Conflicts:
| |     core/src/main/scala/spark/PairRDDFunctions.scala
| * Update SBT build to use simpler fix for Hadoop 0.23.9 (Jey Kottalam, 2013-08-19, 1 file, -11/+2)
| |
| * Rename YARN build flag to SPARK_WITH_YARN (Jey Kottalam, 2013-08-16, 1 file, -5/+7)
| |
| * Fix SBT build under Hadoop 0.23.x (Jey Kottalam, 2013-08-16, 1 file, -0/+11)
| |
| * Fix repl/assembly when YARN enabled (Jey Kottalam, 2013-08-16, 1 file, -3/+4)
| |
| * Allow make-distribution.sh to specify Hadoop version used (Jey Kottalam, 2013-08-16, 1 file, -6/+22)
| |
| * Update default version of Hadoop to 1.2.1 (Jey Kottalam, 2013-08-15, 1 file, -1/+1)
| |
| * yarn support (Jey Kottalam, 2013-08-15, 1 file, -6/+6)
| |
| * yarn sbt (Jey Kottalam, 2013-08-15, 1 file, -13/+15)
| |
| * dynamically detect hadoop version (Jey Kottalam, 2013-08-15, 1 file, -32/+3)
| |
* | Use the JSON formatter from Scala library and removed dependency on lift-json. (Reynold Xin, 2013-08-15, 1 file, -1/+0)
|/
|   It made the JSON creation slightly more complicated, but reduces one
|   external dependency. The scala library also properly escapes "/" (which
|   lift-json doesn't).
* Update to Mesos 0.12.1 (Matei Zaharia, 2013-08-13, 1 file, -1/+1)
|
* Add MetricsServlet for Spark metrics system (jerryshao, 2013-08-12, 1 file, -0/+1)
|
* Merge pull request #800 from dlyubimov/HBASE_VERSION (Matei Zaharia, 2013-08-09, 1 file, -1/+4)
|\
| |   Pull HBASE_VERSION in the head of sbt build
| * fewer words (Dmitriy Lyubimov, 2013-08-09, 1 file, -1/+1)
| |
| * Pull HBASE_VERSION in the head of sbt build (Dmitriy Lyubimov, 2013-08-09, 1 file, -1/+4)
| |
* | Merge pull request #786 from shivaram/mllib-java (Matei Zaharia, 2013-08-09, 1 file, -1/+1)
|\ \
| |/
|/|   Java fixes, tests and examples for ALS, KMeans
| * Java examples, tests for KMeans and ALS (Shivaram Venkataraman, 2013-08-06, 1 file, -1/+1)
| |   - Changes ALS to accept RDD[Rating] instead of (Int, Int, Double) making
| |     it easier to call from Java
| |   - Renames class methods from `train` to `run` to enable static methods
| |     to be called from Java.
| |   - Add unit tests which check if both static / class methods can be
| |     called.
| |   - Also add examples which port the main() function in ALS, KMeans to the
| |     examples project.
| |   Couple of minor changes to existing code:
| |   - Add a toJavaRDD method in RDD to convert scala RDD to java RDD easily
| |   - Workaround a bug where using double[] from Java leads to class cast
| |     exception in KMeans init
* | Update to Chill 0.3.1 (Matei Zaharia, 2013-08-08, 1 file, -2/+2)
|/
* Revert Mesos version to 0.9 since the 0.12 artifact has target Java 7 (Matei Zaharia, 2013-08-01, 1 file, -1/+1)
|
* Merge pull request #753 from shivaram/glm-refactor (Matei Zaharia, 2013-07-31, 1 file, -2/+4)
|\
| |   Build changes for ML lib
| * Add mllib, bagel to repl dependencies (Shivaram Venkataraman, 2013-07-30, 1 file, -3/+3)
| |   Also don't build an assembly jar for them
| * Add bagel, mllib to SBT assembly. (Shivaram Venkataraman, 2013-07-30, 1 file, -2/+4)
| |   Also add jblas dependency to mllib pom.xml
* | Merge pull request #749 from benh/spark-executor-uri (Matei Zaharia, 2013-07-31, 1 file, -1/+1)
|\ \
| | |   Added property 'spark.executor.uri' for launching on Mesos.
| * | Added property 'spark.executor.uri' for launching on Mesos without (Benjamin Hindman, 2013-07-29, 1 file, -1/+1)
| |/
| |   requiring Spark to be installed. Using 'make_distribution.sh' a user can
| |   put a Spark distribution at a URI supported by Mesos (e.g., 'hdfs://...')
| |   and then set that when launching their job. Also added SPARK_EXECUTOR_URI
| |   for the REPL.
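Spark of this era read configuration from Java system properties, so a driver targeting Mesos could set the property described above before creating its context. A hedged sketch only: the URI, master address, and app name below are illustrative assumptions, not values from the commit.

```scala
import spark.SparkContext  // 0.7/0.8-era package name

object MesosLaunchSketch {
  def main(args: Array[String]) {
    // Illustrative URI: a distribution built by make_distribution.sh and
    // uploaded somewhere Mesos slaves can fetch it (e.g. HDFS).
    System.setProperty("spark.executor.uri",
      "hdfs://namenode/spark/spark-dist.tar.gz")

    val sc = new SparkContext("mesos://master:5050", "ExampleApp")
    // ... run jobs; executors fetch the distribution from the URI above ...
    sc.stop()
  }
}
```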
* | Exclude older version of Snappy in streaming and examples. (Reynold Xin, 2013-07-30, 1 file, -1/+3)
| |
* | Merge branch 'lazy_file_open' of github.com:lyogavin/spark into compression (Reynold Xin, 2013-07-30, 1 file, -0/+1)
|\ \
| |/
|/|
| |   Conflicts:
| |     project/SparkBuild.scala
| * fix dependencies (Gavin Li, 2013-07-03, 1 file, -1/+2)
| |
* | refactor Kryo serializer support to use chill/chill-java (ryanlecompte, 2013-07-24, 1 file, -2/+3)
| |
* | Fix some typos (jerryshao, 2013-07-24, 1 file, -1/+1)
| |
* | Add dependency of Codahale's metrics library (jerryshao, 2013-07-24, 1 file, -0/+2)
| |
* | Add JavaAPICompletenessChecker. (Josh Rosen, 2013-07-22, 1 file, -1/+7)
| |   This is used to find methods in the Scala API that need to be ported to
| |   the Java API. To use it:
| |     ./run spark.tools.JavaAPICompletenessChecker
| |   Conflicts:
| |     project/SparkBuild.scala
| |     run
| |     run2.cmd
* | Also exclude asm for hadoop2; hadoop1 looks like it does not need that either. (Liang-Chi Hsieh, 2013-07-20, 1 file, -2/+2)
| |
* | Fix a bug in the build process that pulls in two versions of ASM. (Liang-Chi Hsieh, 2013-07-19, 1 file, -4/+4)
| |
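Duplicate-dependency problems like the ASM one above are typically resolved in sbt by excluding the transitive copy and pinning one version explicitly. A minimal sketch of that technique, with illustrative coordinates rather than the actual SparkBuild.scala change:

```scala
// build.sbt-style sketch; the organization and version strings below are
// assumptions for illustration, not Spark's real dependency list.
libraryDependencies ++= Seq(
  // Drop the ASM that rides in transitively with the Hadoop client...
  "org.apache.hadoop" % "hadoop-client" % "1.0.4" excludeAll(
    ExclusionRule(organization = "asm")
  ),
  // ...and pin the single copy the build should keep.
  "asm" % "asm" % "3.3.1"
)
```

With only one ASM on the classpath, tools that link against it no longer race on which version gets loaded.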
* | Merge pull request #708 from ScrapCodes/dependencies-upgrade (Matei Zaharia, 2013-07-16, 1 file, -4/+4)
|\ \
| | |   Dependency upgrade Akka 2.0.3 -> 2.0.5
| * | Dependency upgrade Akka 2.0.3 -> 2.0.5 (Prashant Sharma, 2013-07-16, 1 file, -4/+4)
| | |
* | Add Apache license headers and LICENSE and NOTICE files (Matei Zaharia, 2013-07-16, 2 files, -0/+33)
|/ /
* | Merge branch 'master' of github.com:mesos/spark (Matei Zaharia, 2013-07-13, 1 file, -1/+1)
|\ \