path: root/core
Commit log: message (author, date; files changed, lines -deleted/+added)
* Merge pull request #898 from ilikerps/660 (Matei Zaharia, 2013-09-08; 1 file, -3/+3)
        SPARK-660: Add StorageLevel support in Python
| * Export StorageLevel and refactor (Aaron Davidson, 2013-09-07; 1 file, -3/+3)
| * Remove reflection, hard-code StorageLevels (Aaron Davidson, 2013-09-07; 1 file, -11/+0)
        The sc.StorageLevel -> StorageLevel pathway is a bit janky, but otherwise the shell would have to call a private method of SparkContext. Having StorageLevel available in sc also doesn't seem like the end of the world. There may be a better solution, though. As for creating the StorageLevel object itself, this seems to be the best way in Python 2 for creating singleton, enum-like objects: http://stackoverflow.com/questions/36932/how-can-i-represent-an-enum-in-python
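The Python 2 enum-style singleton pattern the commit body links to can be sketched as follows; the class layout and flag names here are illustrative, not the actual PySpark definition:

```python
class StorageLevel(object):
    """Enum-like holder for storage-level flags (illustrative fields only)."""

    def __init__(self, use_disk, use_memory, deserialized, replication=1):
        self.use_disk = use_disk
        self.use_memory = use_memory
        self.deserialized = deserialized
        self.replication = replication

    def __repr__(self):
        return "StorageLevel(%s, %s, %s, %s)" % (
            self.use_disk, self.use_memory, self.deserialized, self.replication)

# Singleton instances attached as class attributes -- the enum-like idiom
# from the Stack Overflow answer linked in the commit message.
StorageLevel.DISK_ONLY = StorageLevel(True, False, False)
StorageLevel.MEMORY_ONLY = StorageLevel(False, True, True)
StorageLevel.MEMORY_AND_DISK = StorageLevel(True, True, True)
```

Each level is a single shared instance, so identity comparison works like an enum member.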
| * Memoize StorageLevels read from JVM (Aaron Davidson, 2013-09-06; 1 file, -1/+1)
| * SPARK-660: Add StorageLevel support in Python (Aaron Davidson, 2013-09-05; 1 file, -0/+11)
        It uses reflection... I am not proud of that fact, but it at least ensures compatibility (sans refactoring of the StorageLevel stuff).
* | Fixed the bug that ResultTask was not properly deserializing outputId. (Reynold Xin, 2013-09-07; 1 file, -2/+2)
* | Hot fix to resolve the compilation error caused by SPARK-821. (Reynold Xin, 2013-09-06; 1 file, -1/+1)
* | Merge pull request #895 from ilikerps/821 (Patrick Wendell, 2013-09-05; 7 files, -7/+102)
        SPARK-821: Don't cache results when action run locally on driver
| * | Reynold's second round of comments (Aaron Davidson, 2013-09-05; 2 files, -17/+19)
| * | Add unit test and address comments (Aaron Davidson, 2013-09-05; 5 files, -6/+98)
| * | SPARK-821: Don't cache results when action run locally on driver (Aaron Davidson, 2013-09-05; 4 files, -4/+5)
        Caching the results of local actions (e.g., rdd.first()) causes the driver to store entire partitions in its own memory, which may be highly constrained. This patch simply makes the CacheManager avoid caching the result of all locally-run computations.
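The caching rule described in that commit body can be sketched in Python; Spark's actual CacheManager is Scala code, and `compute_partition` and its arguments are hypothetical names for illustration only:

```python
def compute_partition(rdd, partition, cache, run_locally):
    """Illustrative cache gate: persist a computed partition only when the
    computation ran on an executor, never when it ran locally on the driver."""
    key = (rdd["id"], partition)
    if key in cache:
        return cache[key]
    result = rdd["compute"](partition)  # materialize the partition
    if not run_locally:
        # Executors may cache; the driver skips caching so that a local
        # action like rdd.first() never pins whole partitions in its heap.
        cache[key] = result
    return result
```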
* | Merge pull request #891 from xiajunluan/SPARK-864 (Matei Zaharia, 2013-09-05; 1 file, -1/+8)
        [SPARK-864] DAGScheduler Exception if we delete Worker and StandaloneExecutorBackend then add Worker
| * | Fix bug SPARK-864 (Andrew xia, 2013-09-05; 1 file, -1/+8)
* | Merge pull request #893 from ilikerps/master (Patrick Wendell, 2013-09-04; 1 file, -0/+92)
        SPARK-884: Add unit test to validate Spark JSON output
| * | Fix line over 100 chars (Aaron Davidson, 2013-09-04; 1 file, -2/+2)
| * | Address Patrick's comments (Aaron Davidson, 2013-09-04; 1 file, -8/+15)
| * | SPARK-884: Add unit test to validate Spark JSON output (Aaron Davidson, 2013-09-04; 1 file, -0/+85)
        This unit test simply validates that the outputs of the JsonProtocol methods are syntactically valid JSON.
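The property being tested, that serialized output parses as syntactically valid JSON, can be checked with a helper like this (a Python sketch of the idea; Spark's actual test is Scala and exercises the JsonProtocol methods):

```python
import json

def is_valid_json(s):
    """Return True when s parses as JSON -- the property the unit test checks."""
    try:
        json.loads(s)
        return True
    except ValueError:
        return False
```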
* | Minor spacing fix (Patrick Wendell, 2013-09-03; 1 file, -2/+4)
* | Merge pull request #878 from tgravescs/yarnUILink (Patrick Wendell, 2013-09-03; 9 files, -26/+36)
        Link the Spark UI up to the Yarn UI
| * | Update based on review comments. Change function to prependBaseUri and fix formatting. (Y.CORP.YAHOO.COM\tgraves, 2013-09-03; 4 files, -24/+23)
| * | Review comment changes and update to org.apache packaging (Y.CORP.YAHOO.COM\tgraves, 2013-09-03; 6 files, -32/+20)
| * | Merge remote-tracking branch 'mesos/master' into yarnUILink (Y.CORP.YAHOO.COM\tgraves, 2013-09-03; 333 files, -1491/+2664)
        Conflicts:
            core/src/main/scala/org/apache/spark/ui/UIUtils.scala
            core/src/main/scala/org/apache/spark/ui/jobs/PoolTable.scala
            core/src/main/scala/org/apache/spark/ui/jobs/StageTable.scala
            docs/running-on-yarn.md
| * | fix up minor things (Y.CORP.YAHOO.COM\tgraves, 2013-08-30; 2 files, -5/+6)
| * | Link the Spark UI to the Yarn UI (Y.CORP.YAHOO.COM\tgraves, 2013-08-30; 6 files, -27/+50)
* | | Merge pull request #889 from alig/master (Matei Zaharia, 2013-09-03; 3 files, -5/+22)
        Return the port the WebUI is bound to (useful if port 0 was used)
| * | Merge branch 'master' of https://github.com/alig/spark (Ali Ghodsi, 2013-09-03; 2 files, -5/+5)
        Conflicts:
            core/src/main/scala/org/apache/spark/deploy/master/Master.scala
| | * | Sort order of imports to match project guidelines (Ali Ghodsi, 2013-09-02; 1 file, -5/+5)
| | * | Reynold's comment fixed (Ali Ghodsi, 2013-09-02; 1 file, -1/+1)
| * | | Using configured akka timeouts (Ali Ghodsi, 2013-09-03; 1 file, -3/+5)
| * | Brushing the code up slightly (Ali Ghodsi, 2013-09-02; 2 files, -4/+4)
| * | Enabling getting the actual WEBUI port (Ali Ghodsi, 2013-09-02; 3 files, -4/+19)
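The port-0 technique behind these WebUI commits can be sketched with a plain socket (an illustrative helper, not Spark code; Spark binds through Jetty but reads back the assigned port the same way):

```python
import socket

def bind_ephemeral(host="127.0.0.1"):
    """Bind to port 0 so the OS picks a free port, then report the actual
    port chosen -- the value the WebUI change returns to its callers."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.bind((host, 0))                 # port 0: OS assigns an unused port
    actual_port = sock.getsockname()[1]  # read back the real port number
    return sock, actual_port
```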
* | | Add missing license headers found with RAT (Matei Zaharia, 2013-09-02; 16 files, -3/+256)
* | Fix test (Matei Zaharia, 2013-09-02; 1 file, -1/+1)
* | Fix spark.io.compression.codec and change default codec to LZF (Matei Zaharia, 2013-09-02; 1 file, -8/+4)
* | Allow PySpark to launch worker.py directly on Windows (Matei Zaharia, 2013-09-01; 1 file, -8/+99)
* | Run script fixes for Windows after package & assembly change (Matei Zaharia, 2013-09-01; 1 file, -11/+17)
* | Move some classes to more appropriate packages: (Matei Zaharia, 2013-09-01; 128 files, -284/+303)
        * RDD, *RDDFunctions -> org.apache.spark.rdd
        * Utils, ClosureCleaner, SizeEstimator -> org.apache.spark.util
        * JavaSerializer, KryoSerializer -> org.apache.spark.serializer
* | Fix some URLs (Matei Zaharia, 2013-09-01; 1 file, -1/+1)
* | Remove shutdown hook to stop jetty; this is unnecessary for releasing ports and creates noisy log messages (Matei Zaharia, 2013-09-01; 1 file, -1/+0)
* | Initial work to rename package to org.apache.spark (Matei Zaharia, 2013-09-01; 326 files, -936/+941)
* | Merge pull request #883 from alig/master (Matei Zaharia, 2013-09-01; 1 file, -2/+1)
        Don't require the spark home environment variable to be set for standalone mode (change needed by SIMR)
| * | Don't require spark home to be set for standalone mode (Ali Ghodsi, 2013-08-31; 1 file, -2/+1)
* | Small tweak (Matei Zaharia, 2013-08-31; 1 file, -1/+1)
* | Print output from spark-daemon only when it fails to launch (Matei Zaharia, 2013-08-31; 1 file, -1/+0)
* | Various web UI improvements: (Matei Zaharia, 2013-08-31; 16 files, -164/+971)
        - Use "fluid" layout that can expand to wide browser windows, instead of the old one's limit of 1200 px
        - Remove unnecessary <hr> elements
        - Switch back to Bootstrap's default theme and tweak progress bar colors
        - Make headers more consistent between deploy and app UIs
        - Replace some inline CSS with stylesheets
* Also add getConf to NewHadoopRDD (Mikhail Bautin, 2013-08-30; 1 file, -0/+3)
* Make HadoopRDD's configuration accessible (Mikhail Bautin, 2013-08-30; 1 file, -1/+3)
* Merge pull request #857 from mateiz/assembly (Matei Zaharia, 2013-08-29; 4 files, -4/+5)
        Change build and run instructions to use assemblies
| * Update Maven build to create assemblies expected by new scripts (Matei Zaharia, 2013-08-29; 3 files, -28/+0)
        This includes the following changes:
        - The "assembly" package now builds in Maven by default, and creates an assembly containing both hadoop-client and Spark, unlike the old BigTop distribution assembly that skipped hadoop-client
        - There is now a bigtop-dist package to build the old BigTop assembly
        - The repl-bin package is no longer built by default since the scripts don't rely on it; instead it can be enabled with -Prepl-bin
        - Py4J is now included in the assembly/lib folder as a local Maven repo, so that the Maven package can link to it
        - run-example now adds the original Spark classpath as well because the Maven examples assembly lists spark-core and such as provided
        - The various Maven projects add a spark-yarn dependency correctly
| * Fix finding of assembly JAR, as well as some pointers to ./run (Matei Zaharia, 2013-08-29; 3 files, -2/+3)