path: root/core/src
Commit message | Author | Age | Files | Lines
* Merge pull request #857 from mateiz/assembly | Matei Zaharia | 2013-08-29 | 4 | -4/+5
|\
| |     Change build and run instructions to use assemblies
| * Fix finding of assembly JAR, as well as some pointers to ./run | Matei Zaharia | 2013-08-29 | 3 | -2/+3
| |
| * Change build and run instructions to use assemblies | Matei Zaharia | 2013-08-29 | 2 | -2/+2
| |     This commit makes Spark invocation saner by using an assembly JAR to find all of
| |     Spark's dependencies instead of adding all the JARs in lib_managed. It also packages
| |     the examples into an assembly and uses that as SPARK_EXAMPLES_JAR. Finally, it
| |     replaces the old "run" script with two better-named scripts: "run-examples" for
| |     examples, and "spark-class" for Spark internal classes (e.g. REPL, master, etc).
| |     This is also designed to minimize the confusion people have in trying to use "run"
| |     to run their own classes; it's not meant to do that, but now at least if they look
| |     at it, they can modify run-examples to do a decent job for them. As part of this,
| |     Bagel's examples are also now properly moved to the examples package instead of bagel.
* | Fix removed block zero size log reporting | jerryshao | 2013-08-30 | 1 | -2/+2
|/
* Merge pull request #871 from pwendell/expose-local | Patrick Wendell | 2013-08-28 | 1 | -1/+1
|\
| |     Expose `isLocal` in SparkContext.
| * Make local variable public | Patrick Wendell | 2013-08-28 | 1 | -1/+1
| |
* | Merge pull request #865 from tgravescs/fixtmpdir | Matei Zaharia | 2013-08-28 | 1 | -0/+22
|\ \
| | |   Spark on Yarn should use yarn approved directories for spark.local.dir and tmp
| * | Change Executor to only look at the env variable SPARK_YARN_MODE | Y.CORP.YAHOO.COM\tgraves | 2013-08-28 | 1 | -1/+1
| | |
| * | Updated based on review comments. | Y.CORP.YAHOO.COM\tgraves | 2013-08-27 | 1 | -9/+6
| | |
| * | Allow for Executors to have different directories than the Spark Master for Yarn | Y.CORP.YAHOO.COM\tgraves | 2013-08-27 | 1 | -0/+25
| |/
* | Added worker state to the cluster master JSON ui. | Reynold Xin | 2013-08-26 | 1 | -1/+2
| |
* | Revert "Merge pull request #841 from rxin/json" | Reynold Xin | 2013-08-26 | 5 | -64/+66
| |     This reverts commit 1fb1b0992838c8cdd57eec45793e67a0490f1a52, reversing changes
| |     made to c69c48947d5102c81a9425cb380d861c3903685c.
|/
* Merge pull request #832 from alig/coalesce | Matei Zaharia | 2013-08-22 | 4 | -46/+389
|\
| |     Coalesced RDD with locality
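The merge above brings in a rewritten coalescer that balances the sizes of the resulting partitions and takes the parents' preferred locations into account. A minimal usage sketch in Scala, assuming the pre-Apache-rename `spark` package names of this era; the input data and partition counts are made up for illustration:

    import spark.SparkContext

    object CoalesceSketch {
      def main(args: Array[String]) {
        val sc = new SparkContext("local[4]", "coalesce-sketch")

        // A source RDD with many small partitions.
        val fine = sc.parallelize(1 to 100000, 400)

        // Coalesce into fewer partitions without a shuffle; per the commits listed
        // below, the coalescer tries to balance the output partition sizes and to
        // keep each output partition close to its parents' preferred locations.
        val coarse = fine.coalesce(10)

        println("partitions after coalesce: " + coarse.partitions.length)
        sc.stop()
      }
    }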
| * Merged in from upstream to use TaskLocation instead of strings | Ali Ghodsi | 2013-08-20 | 2 | -8/+11
| |
| * added curly braces to make the code more consistent | Ali Ghodsi | 2013-08-20 | 1 | -1/+2
| |
| * indent | Ali Ghodsi | 2013-08-20 | 1 | -1/+1
| |
| * Bug in test fixed | Ali Ghodsi | 2013-08-20 | 1 | -3/+3
| |
| * Added a test to make sure no locality preferences are ignored | Ali Ghodsi | 2013-08-20 | 1 | -0/+5
| |
| * Simpler code | Ali Ghodsi | 2013-08-20 | 2 | -5/+4
| |
| * simpler code | Ali Ghodsi | 2013-08-20 | 1 | -16/+7
| |
| * Fixed almost all of Matei's feedback | Ali Ghodsi | 2013-08-20 | 2 | -31/+26
| |
| * fixed Matei's comments | Ali Ghodsi | 2013-08-20 | 3 | -73/+99
| |
| * making CoalescedRDDPartition public | Ali Ghodsi | 2013-08-20 | 1 | -2/+1
| |
| * comment in the test to make it more understandable | Ali Ghodsi | 2013-08-20 | 1 | -1/+1
| |
| * Coalescer now uses current preferred locations for derived RDDs. Made run() ↵ | Ali Ghodsi | 2013-08-20 | 4 | -34/+59
| |     in DAGScheduler thread safe and added a method to be able to ask it for preferred
| |     locations. Added a similar method that wraps the former inside SparkContext.
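The commit above says DAGScheduler's run() was made thread safe, that it gained a way to report preferred locations, and that SparkContext wraps that method. A rough sketch of how a driver might query partition locality through such a wrapper; the method name getPreferredLocs, its exact signature, and the TaskLocation import path are assumptions for illustration, not taken from this diff:

    import spark.{RDD, SparkContext}
    import spark.scheduler.TaskLocation

    object LocalitySketch {
      // Hypothetical helper: print the preferred locations of every partition of a
      // derived RDD by asking the SparkContext wrapper (method name assumed here).
      def describeLocality(sc: SparkContext, rdd: RDD[_]) {
        for (p <- rdd.partitions) {
          // Assumed wrapper around the now thread-safe DAGScheduler lookup.
          val locs: Seq[TaskLocation] = sc.getPreferredLocs(rdd, p.index)
          println("partition " + p.index + " prefers " + locs.mkString(", "))
        }
      }
    }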
| * added one test that will test a future functionality | Ali Ghodsi | 2013-08-20 | 1 | -1/+10
| |
| * Added error messages to the tests to make failed tests less cryptic | Ali Ghodsi | 2013-08-20 | 1 | -7/+7
| |
| * fixed matei's comments | Ali Ghodsi | 2013-08-20 | 1 | -15/+16
| |
| * Made a function object that returns the coalesced groups | Ali Ghodsi | 2013-08-20 | 1 | -30/+35
| |
| * several of Reynold's suggestions implemented | Ali Ghodsi | 2013-08-20 | 1 | -15/+14
| |
| * space removed | Ali Ghodsi | 2013-08-20 | 1 | -1/+1
| |
| * use count rather than foreach | Ali Ghodsi | 2013-08-20 | 1 | -2/+1
| |
| * made preferredLocation a val of the surrounding case class | Ali Ghodsi | 2013-08-20 | 1 | -10/+3
| |
| * Fix bug in tests | Ali Ghodsi | 2013-08-20 | 2 | -6/+6
| |
| * Renamed split to partition | Ali Ghodsi | 2013-08-20 | 1 | -11/+11
| |
| * word wrap before 100 chars per line | Ali Ghodsi | 2013-08-20 | 2 | -41/+51
| |
| * added goals inline as comment | Ali Ghodsi | 2013-08-20 | 1 | -0/+21
| |
| * Large scale load and locality tests for the coalesced partitions added | Ali Ghodsi | 2013-08-20 | 2 | -63/+118
| |
| * Bug, should compute slack wrt parent partition size, not number of bins | Ali Ghodsi | 2013-08-20 | 1 | -2/+2
| |
| * load balancing coalescer | Ali Ghodsi | 2013-08-20 | 2 | -11/+218
| |
* | Removed meaningless types | Mark Hamstra | 2013-08-20 | 1 | -1/+1
|/
* Merge remote-tracking branch 'jey/hadoop-agnostic' | Matei Zaharia | 2013-08-20 | 28 | -2075/+166
|\
| |     Conflicts:
| |         core/src/main/scala/spark/PairRDDFunctions.scala
| * Rename HadoopWriter to SparkHadoopWriter since it's outside of our package | Jey Kottalam | 2013-08-15 | 2 | -6/+6
| |
| * Fix newTaskAttemptID to work under YARN | Jey Kottalam | 2013-08-15 | 1 | -1/+19
| |
| * re-enable YARN support | Jey Kottalam | 2013-08-15 | 1 | -1/+13
| |
| * SparkEnv isn't available this early, and not needed anyway | Jey Kottalam | 2013-08-15 | 2 | -25/+0
| |
| * make SparkHadoopUtil a member of SparkEnv | Jey Kottalam | 2013-08-15 | 8 | -26/+31
| |
| * rename HadoopMapRedUtil => SparkHadoopMapRedUtil, HadoopMapReduceUtil => ↵ | Jey Kottalam | 2013-08-15 | 5 | -6/+7
| |     SparkHadoopMapReduceUtil
| * add comment | Jey Kottalam | 2013-08-15 | 1 | -4/+4
| |
| * dynamically detect hadoop version | Jey Kottalam | 2013-08-15 | 2 | -8/+48
| |