path: root/repl/src
Commit message    Author    Age    Files    Lines
...
* SPARK-1189: Add Security to Spark - Akka, Http, ConnectionManager, UI use servlets (Thomas Graves, 2014-03-06, 3 files, -15/+33)
|   Resubmitted pull request; it was previously https://github.com/apache/incubator-spark/pull/332.
|   Author: Thomas Graves <tgraves@apache.org>
|   Closes #33 from tgravescs/security-branch-0.9-with-client-rebase and squashes the following commits:
|   dfe3918 [Thomas Graves] Fix merge conflict since startUserClass now using runAsUser
|   05eebed [Thomas Graves] Fix dependency lost in upmerge
|   d1040ec [Thomas Graves] Fix up various imports
|   05ff5e0 [Thomas Graves] Fix up imports after upmerging to master
|   ac046b3 [Thomas Graves] Merge remote-tracking branch 'upstream/master' into security-branch-0.9-with-client-rebase
|   13733e1 [Thomas Graves] Pass securityManager and SparkConf around where we can. Switch to use sparkConf for reading config wherever possible. Added ConnectionManagerSuite unit tests.
|   4a57acc [Thomas Graves] Change UI createHandler routines to createServlet since they now return servlets
|   2f77147 [Thomas Graves] Rework from comments
|   50dd9f2 [Thomas Graves] fix header in SecurityManager
|   ecbfb65 [Thomas Graves] Fix spacing and formatting
|   b514bec [Thomas Graves] Fix reference to config
|   ed3d1c1 [Thomas Graves] Add security.md
|   6f7ddf3 [Thomas Graves] Convert SaslClient and SaslServer to scala, change spark.authenticate.ui to spark.ui.acls.enable, and fix up various other things from review comments
|   2d9e23e [Thomas Graves] Merge remote-tracking branch 'upstream/master' into security-branch-0.9-with-client-rebase_rework
|   5721c5a [Thomas Graves] update AkkaUtilsSuite test for the actorSelection changes, fix typos based on comments, and remove extra lines I missed in rebase from AkkaUtils
|   f351763 [Thomas Graves] Add Security to Spark - Akka, Http, ConnectionManager, UI to use servlets
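For illustration, a minimal sketch of how the security options this patch introduces might be enabled through SparkConf (the secret value is a placeholder; spark.ui.acls.enable is the renamed UI key mentioned in commit 6f7ddf3 above):

    import org.apache.spark.SparkConf

    val conf = new SparkConf()
      .set("spark.authenticate", "true")           // require authentication between Spark daemons
      .set("spark.authenticate.secret", "secret")  // shared secret, placeholder value
      .set("spark.ui.acls.enable", "true")         // enforce view ACLs on the web UI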
* [SPARK-1089] fix the regression problem on ADD_JARS in 0.9 (CodingCat, 2014-02-26, 1 file, -2/+7)
|   https://spark-project.atlassian.net/browse/SPARK-1089
|   Copied from JIRA, reported by @ash211:
|   "Using the ADD_JARS environment variable with spark-shell used to add the jar to both the shell and the various workers. Now it only adds to the workers and importing a custom class in the shell is broken. The workaround is to add custom jars to both ADD_JARS and SPARK_CLASSPATH. We should fix ADD_JARS so it works properly again. See various threads on the user list: https://mail-archives.apache.org/mod_mbox/incubator-spark-user/201402.mbox/%3CCAJbo4neMLiTrnm1XbyqomWmp0m+EUcg4yE-txuRGSVKOb5KLeA@mail.gmail.com%3E (another one that doesn't appear in the archives yet titled "ADD_JARS not working on 0.9")"
|   The cause of this bug is two-fold. First, in the current implementation of SparkILoop.scala, settings.classpath is not set properly when the process() method is invoked. Second, due to odd behaviour in Scala 2.10 (arguably a bug): simply assigning the value of a PathSettings object (like settings.classpath) does not mark the setting as user-modified, so PathResolver loads the default CLASSPATH environment variable value to calculate the path (see https://github.com/scala/scala/blob/2.10.x/src/compiler/scala/tools/util/PathResolver.scala#L215). What we have to do is set this flag manually (https://github.com/CodingCat/incubator-spark/blob/e3991d97ddc33e77645e4559b13bf78b9e68239a/repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala#L884).
|   Author: CodingCat <zhunansjtu@gmail.com>
|   Closes #13 from CodingCat/SPARK-1089 and squashes the following commits:
|   8af81e7 [CodingCat] impose non-null settings
|   9aa2125 [CodingCat] code cleaning
|   ce36676 [CodingCat] code cleaning
|   e045582 [CodingCat] fix the regression problem on ADD_JARS in 0.9
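For illustration, a minimal sketch of the Scala 2.10 quirk described above, assuming the MutableSettings API (the jar path is made up). Assigning the value directly leaves the setting's "set by user" flag at its default, so PathResolver ignores it; routing the value through the option parser marks it as user-supplied, which is in effect what the fix does:

    import scala.tools.nsc.Settings

    val settings = new Settings()
    // Direct assignment: the "set by user" flag is not flipped,
    // so PathResolver falls back to the CLASSPATH environment variable.
    settings.classpath.value = "/tmp/added.jar"
    // Going through the argument parser marks the setting as user-supplied.
    settings.processArgumentString("-classpath /tmp/added.jar")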
* [SPARK-1090] improvement on spark_shell (help information, configure memory) (CodingCat, 2014-02-17, 1 file, -1/+1)
|   https://spark-project.atlassian.net/browse/SPARK-1090
|   spark-shell should print help information about its parameters and should allow the user to configure executor memory. There is no documentation about how to set --cores/-c in spark-shell, and users should also be able to set executor memory through command line options. In this PR I also check the format of the options passed by the user.
|   Author: CodingCat <zhunansjtu@gmail.com>
|   Closes #599 from CodingCat/spark_shell_improve and squashes the following commits:
|   de5aa38 [CodingCat] add parameter to set driver memory
|   915cbf8 [CodingCat] improvement on spark_shell (help information, configure memory)
* Merge pull request #557 from ScrapCodes/style. Closes #557. (Patrick Wendell, 2014-02-09, 9 files, -6/+25)
|   SPARK-1058, Fix Style Errors and Add Scala Style to Spark Build.
|   Author: Patrick Wendell <pwendell@gmail.com>
|   Author: Prashant Sharma <scrapcodes@gmail.com>
|   == Merge branch commits ==
|   commit 1a8bd1c059b842cb95cc246aaea74a79fec684f4
|   Author: Prashant Sharma <scrapcodes@gmail.com>
|   Date: Sun Feb 9 17:39:07 2014 +0530
|       scala style fixes
|   commit f91709887a8e0b608c5c2b282db19b8a44d53a43
|   Author: Patrick Wendell <pwendell@gmail.com>
|   Date: Fri Jan 24 11:22:53 2014 -0800
|       Adding scalastyle snapshot
* Merge pull request #542 from markhamstra/versionBump. Closes #542. (Mark Hamstra, 2014-02-08, 1 file, -1/+1)
|   Version number to 1.0.0-SNAPSHOT
|   Since 0.9.0-incubating is done and out the door, we shouldn't be building 0.9.0-incubating-SNAPSHOT anymore. @pwendell
|   Author: Mark Hamstra <markhamstra@gmail.com>
|   == Merge branch commits ==
|   commit 1b00a8a7c1a7f251b4bb3774b84b9e64758eaa71
|   Author: Mark Hamstra <markhamstra@gmail.com>
|   Date: Wed Feb 5 09:30:32 2014 -0800
|       Version number to 1.0.0-SNAPSHOT
* Add missing header files (Patrick Wendell, 2014-01-14, 1 file, -0/+17)
|
* Removing mentions in tests (Patrick Wendell, 2014-01-12, 1 file, -2/+0)
|
* Merge pull request #327 from lucarosellini/master (Matei Zaharia, 2014-01-08, 3 files, -3/+73)
|\
| |   Added ‘-i’ command line option to Spark REPL
| |   We had to create a new implementation of both scala.tools.nsc.CompilerCommand and scala.tools.nsc.Settings, because using scala.tools.nsc.GenericRunnerSettings would bring in other options (-howtorun, -save and -execute) which don’t make sense in Spark. Any new Spark-specific command line option can now be added to the org.apache.spark.repl.SparkRunnerSettings class.
| |   Since the behavior of loading a script from the command line should be the same as loading it using the “:load” command inside the shell, the script should be loaded when the SparkContext is available; that’s why we had to move the call to ‘loadfiles(settings)’ _after_ the call to postInitialization(). This still doesn’t work if ‘isAsync = true’.
| * Added license header and removed @author tag (Luca Rosellini, 2014-01-07, 2 files, -4/+34)
| |
| * Added ‘-i’ command line option to spark REPL. (Luca Rosellini, 2014-01-03, 3 files, -3/+43)
| |   We had to create a new implementation of both scala.tools.nsc.CompilerCommand and scala.tools.nsc.Settings, because using scala.tools.nsc.GenericRunnerSettings would bring in other options (-howtorun, -save and -execute) which don’t make sense in Spark. Any new Spark-specific command line option can now be added to the org.apache.spark.repl.SparkRunnerSettings class.
| |   Since the behavior of loading a script from the command line should be the same as loading it using the “:load” command inside the shell, the script should be loaded when the SparkContext is available; that’s why we had to move the call to ‘loadfiles(settings)’ _after_ the call to postInitialization(). This still doesn’t work if ‘isAsync = true’.
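For illustration, a sketch of what a dedicated settings class along these lines could look like, assuming Scala 2.10's MutableSettings API (the exact definition in org.apache.spark.repl.SparkRunnerSettings may differ):

    import scala.tools.nsc.Settings

    // Extends plain Settings rather than GenericRunnerSettings, so the
    // -howtorun/-save/-execute options never appear; only -i is added.
    class SparkRunnerSettings(error: String => Unit) extends Settings(error) {
      val loadfiles = MultiStringSetting(
        "-i",
        "file",
        "load a file (assumes the code is given interactively)")
    }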
* | fixed review comments (Prashant Sharma, 2014-01-03, 1 file, -1/+3)
|/
* Miscellaneous fixes from code review. (Matei Zaharia, 2014-01-01, 1 file, -1/+1)
|   Also replaced SparkConf.getOrElse with just a "get" that takes a default value, and added getInt, getLong, etc. to make code that uses this simpler later on.
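For illustration, a short sketch of the resulting API shape on SparkConf (the key names and defaults here are just examples):

    import org.apache.spark.SparkConf

    val conf = new SparkConf()
    // "get" with a default value replaces the old getOrElse
    val master = conf.get("spark.master", "local[*]")
    // typed variants added alongside
    val cores  = conf.getInt("spark.cores.max", 2)
    val ttl    = conf.getLong("spark.cleaner.ttl", 3600L)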
* Various fixes to configuration code (Matei Zaharia, 2013-12-28, 2 files, -7/+13)
|   - Got rid of global SparkContext.globalConf
|   - Pass SparkConf to serializers and compression codecs
|   - Made SparkConf public instead of private[spark]
|   - Improved API of SparkContext and SparkConf
|   - Switched executor environment vars to be passed through SparkConf
|   - Fixed some places that were still using system properties
|   - Fixed some tests, though others are still failing
|   This still fails several tests in core, repl and streaming, likely due to properties not being set or cleared correctly (some of the tests run fine in isolation).
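For illustration, a minimal sketch of the configuration flow after this overhaul: settings travel in an explicit SparkConf object rather than global system properties (the values shown are arbitrary):

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setMaster("local[2]")
      .setAppName("repl-demo")
      .set("spark.executor.memory", "1g") // formerly a system property
    val sc = new SparkContext(conf)       // SparkConf is now passed explicitly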
* spark-544, introducing SparkConf and related configuration overhaul. (Prashant Sharma, 2013-12-25, 2 files, -8/+6)
|
* Review comments on the PR for scala 2.10 migration. (Prashant Sharma, 2013-12-13, 1 file, -1/+0)
|
* Fixed compile time warnings and formatting post merge. (Prashant Sharma, 2013-11-26, 1 file, -65/+74)
|
* Various merge corrections (Aaron Davidson, 2013-11-14, 2 files, -12/+4)
|   I've diff'd this patch against my own -- since they were both created independently, this means that two sets of eyes have gone over all the merge conflicts that were created, so I'm feeling significantly more confident in the resulting PR.
|   @rxin has looked at the changes to the repl and is resoundingly confident that they are correct.
* Merge branch 'master' into scala-2.10 (Raymond Liu, 2013-11-14, 1 file, -2/+34)
|\
| * Propagate the SparkContext local property from the thread that calls the spark-repl to the actual execution thread. (Reynold Xin, 2013-11-09, 2 files, -4/+42)
| |
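For illustration, a minimal sketch of what local-property propagation means here (the pool name is arbitrary): a property set on the thread driving the REPL should be visible to jobs submitted from the REPL's separate execution thread:

    import org.apache.spark.{SparkConf, SparkContext}

    val sc = new SparkContext(
      new SparkConf().setMaster("local[2]").setAppName("local-props"))

    // Set on the calling thread; with this commit, jobs launched from the
    // repl's execution thread inherit it as well.
    sc.setLocalProperty("spark.scheduler.pool", "repl-pool")
    sc.parallelize(1 to 10).count()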
* | Merge branch 'master' into scala-2.10 (Raymond Liu, 2013-11-13, 1 file, -7/+29)
|\|
| * Makes Spark SIMR ready. (Ali Ghodsi, 2013-10-24, 1 file, -0/+14)
| |
| * Spark shell exits if it cannot create SparkContext (Aaron Davidson, 2013-10-17, 1 file, -1/+8)
| |   Mainly, this occurs if you provide a messed up MASTER url (one that doesn't match one of our regexes). Previously, we would default to Mesos, fail, and then start the shell anyway, except that any Spark command would fail.
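For illustration, a hedged sketch of the fail-fast behaviour described above (not the actual SparkILoop code; names are illustrative):

    import org.apache.spark.{SparkConf, SparkContext}

    def createSparkContext(master: String): SparkContext =
      try {
        new SparkContext(
          new SparkConf().setMaster(master).setAppName("spark-shell"))
      } catch {
        case e: Exception =>
          // Exit the shell instead of continuing with no usable context.
          System.err.println("Failed to create SparkContext: " + e.getMessage)
          sys.exit(1)
      }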
* | Merge branch 'master' into scala-2.10 (Prashant Sharma, 2013-10-01, 1 file, -1/+1)
|\|   Conflicts:
| |     core/src/main/scala/org/apache/spark/ui/jobs/JobProgressUI.scala
| |     docs/_config.yml
| |     project/SparkBuild.scala
| |     repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala
| * Update build version in master (Patrick Wendell, 2013-09-24, 1 file, -1/+1)
| |
* | ported repl improvements from master (Prashant Sharma, 2013-09-15, 2 files, -2/+11)
| |
* | Fixed repl suite (Prashant Sharma, 2013-09-15, 2 files, -7/+7)
| |
* | Few more fixes to tests broken during merge (Prashant Sharma, 2013-09-10, 1 file, -50/+0)
| |
* | Merged with master (Prashant Sharma, 2013-09-06, 17 files, -201/+300)
|\|
| * Updated LICENSE with third-party licenses (Matei Zaharia, 2013-09-02, 1 file, -0/+17)
| |
| * Move some classes to more appropriate packages: (Matei Zaharia, 2013-09-01, 1 file, -1/+1)
| |   * RDD, *RDDFunctions -> org.apache.spark.rdd
| |   * Utils, ClosureCleaner, SizeEstimator -> org.apache.spark.util
| |   * JavaSerializer, KryoSerializer -> org.apache.spark.serializer
| * Initial work to rename package to org.apache.spark (Matei Zaharia, 2013-09-01, 11 files, -24/+24)
| |
| * Format cleanup. (Benjamin Hindman, 2013-07-30, 1 file, -1/+3)
| |
| * Added property 'spark.executor.uri' for launching on Mesos without requiring Spark to be installed. (Benjamin Hindman, 2013-07-29, 1 file, -0/+2)
| |   Using 'make_distribution.sh' a user can put a Spark distribution at a URI supported by Mesos (e.g., 'hdfs://...') and then set that when launching their job. Also added SPARK_EXECUTOR_URI for the REPL.
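For illustration, a hedged sketch of using the property (the URI is made up; at the time of this commit configuration was passed through Java system properties):

    // Point Mesos executors at a Spark distribution built with
    // make_distribution.sh and uploaded somewhere Mesos can fetch from.
    System.setProperty("spark.executor.uri", "hdfs://namenode:9000/dist/spark.tgz")
    // For the REPL, the same location can be given via the
    // SPARK_EXECUTOR_URI environment variable mentioned above.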
| * Add Apache license headers and LICENSE and NOTICE files (Matei Zaharia, 2013-07-16, 4 files, -1/+69)
| |
* | Added add jars functionality to new repl, which was dropped while merging with old. (Prashant Sharma, 2013-07-12, 3 files, -4/+11)
| |
* | Removed an unnecessary test case (Prashant Sharma, 2013-07-11, 1 file, -101/+0)
| |
* | Merge branch 'master' into master-merge (Prashant Sharma, 2013-07-03, 2 files, -3/+23)
|\|   Conflicts:
| |     core/pom.xml
| |     core/src/main/scala/spark/MapOutputTracker.scala
| |     core/src/main/scala/spark/RDD.scala
| |     core/src/main/scala/spark/RDDCheckpointData.scala
| |     core/src/main/scala/spark/SparkContext.scala
| |     core/src/main/scala/spark/Utils.scala
| |     core/src/main/scala/spark/api/python/PythonRDD.scala
| |     core/src/main/scala/spark/deploy/client/Client.scala
| |     core/src/main/scala/spark/deploy/master/MasterWebUI.scala
| |     core/src/main/scala/spark/deploy/worker/Worker.scala
| |     core/src/main/scala/spark/deploy/worker/WorkerWebUI.scala
| |     core/src/main/scala/spark/rdd/BlockRDD.scala
| |     core/src/main/scala/spark/rdd/ZippedRDD.scala
| |     core/src/main/scala/spark/scheduler/cluster/StandaloneSchedulerBackend.scala
| |     core/src/main/scala/spark/storage/BlockManager.scala
| |     core/src/main/scala/spark/storage/BlockManagerMaster.scala
| |     core/src/main/scala/spark/storage/BlockManagerMasterActor.scala
| |     core/src/main/scala/spark/storage/BlockManagerUI.scala
| |     core/src/main/scala/spark/util/AkkaUtils.scala
| |     core/src/test/scala/spark/SizeEstimatorSuite.scala
| |     pom.xml
| |     project/SparkBuild.scala
| |     repl/src/main/scala/spark/repl/SparkILoop.scala
| |     repl/src/test/scala/spark/repl/ReplSuite.scala
| |     streaming/src/main/scala/spark/streaming/StreamingContext.scala
| |     streaming/src/main/scala/spark/streaming/api/java/JavaStreamingContext.scala
| |     streaming/src/main/scala/spark/streaming/dstream/KafkaInputDStream.scala
| |     streaming/src/main/scala/spark/streaming/util/MasterFailureTest.scala
| * Formatting (Matei Zaharia, 2013-06-25, 1 file, -3/+4)
| |
| * Added a local-cluster mode test to ReplSuite (Matei Zaharia, 2013-06-25, 1 file, -5/+26)
| |
| * Fix search path for REPL class loader to really find added JARs (Matei Zaharia, 2013-06-22, 1 file, -1/+3)
| |
| * ADD_JARS environment variable for spark-shell (Matei Zaharia, 2013-06-22, 1 file, -2/+7)
| |
| * Update ASM to version 4.0 (Matei Zaharia, 2013-06-19, 1 file, -2/+1)
| |
| * Attempt to fix streaming test failures after yarn branch merge (Mridul Muralidharan, 2013-04-28, 1 file, -0/+1)
| |
* | Fix StandaloneClusterReplSuite to allow running multiple tests (Matei Zaharia, 2013-06-08, 2 files, -16/+5)
| |
* | Fixed other warnings (Prashant Sharma, 2013-04-29, 1 file, -1/+1)
| |
* | scala 2.10 and master merge (Prashant Sharma, 2013-04-24, 3 files, -16/+2)
| |
* | Manually merged scala-2.10 and master (Prashant Sharma, 2013-04-22, 2 files, -3/+18)
|\|
| * Merge remote-tracking branch 'jey/bump-development-version-to-0.8.0' (Matei Zaharia, 2013-04-08, 1 file, -1/+1)
| |\    Conflicts:
| | |     docs/_config.yml
| | |     project/SparkBuild.scala
| | * Bump development version to 0.8.0 (Jey Kottalam, 2013-03-28, 1 file, -1/+1)
| |/
| * Change version to 0.7.1-SNAPSHOT for development branch (Matei Zaharia, 2013-02-27, 1 file, -1/+1)
| |