author    Patrick Wendell <pwendell@gmail.com>  2014-01-01 21:29:12 -0800
committer Patrick Wendell <pwendell@gmail.com>  2014-01-01 21:29:12 -0800
commit    3713f8129a618a633a7aca8c944960c3e7ac9d3b (patch)
tree      ff3aa8fa3460078007259a6a6479dc4aec27b50a /python/epydoc.conf
parent    c1d928a897f8daed5d7e74f4af476b67046f348d (diff)
parent    7e8d2e8a5c88d16c771923504c433491b109ab2a (diff)
Merge pull request #309 from mateiz/conf2
SPARK-544. Migrate configuration to a SparkConf class
This is still a work in progress based on Prashant and Evan's code. So far I've done the following:
- Got rid of global SparkContext.globalConf
- Passed SparkConf to serializers and compression codecs
- Made SparkConf public instead of private[spark]
- Improved API of SparkContext and SparkConf
- Switched executor environment vars to be passed through SparkConf
- Fixed some places that were still using system properties
- Fixed some tests, though others are still failing
This still fails several tests in core, repl, and streaming, likely because properties are not being set or cleared correctly (some of the tests run fine in isolation). But the API at least is hopefully ready for review. Unfortunately, there was a lot of global state before, because a "SparkContext.globalConf" method let you set a "default" configuration of sorts, which meant I had to make some pretty big changes.
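The migration the bullets above describe — dropping a global `SparkContext.globalConf` and instead passing a configuration object explicitly to the components that need it (serializers, compression codecs, executor environments) — can be sketched as follows. This is an illustrative toy in Python, not Spark's actual SparkConf implementation; the class names, keys, and defaults are hypothetical.

```python
# Hypothetical sketch of the "explicit conf object" pattern:
# configuration lives in one object that is handed to each component,
# rather than in global mutable state (e.g. system properties).

class Conf:
    """A tiny key-value configuration holder with chainable setters."""
    def __init__(self):
        self._settings = {}

    def set(self, key, value):
        self._settings[key] = value
        return self  # chainable, so callers can build a conf fluently

    def get(self, key, default=None):
        return self._settings.get(key, default)


class Serializer:
    """A component that reads its settings from the conf it is given,
    instead of consulting global state at construction time."""
    def __init__(self, conf):
        self.buffer_kb = int(conf.get("serializer.buffer.kb", "32"))


class Context:
    """Entry point that owns the conf and passes it down explicitly."""
    def __init__(self, conf):
        self.conf = conf
        self.serializer = Serializer(conf)


conf = Conf().set("serializer.buffer.kb", "64")
ctx = Context(conf)
print(ctx.serializer.buffer_kb)  # 64
```

The design payoff is testability: two `Context` instances with different `Conf` objects cannot interfere with each other, which is exactly the isolation problem the commit message attributes to the old global configuration.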
Diffstat (limited to 'python/epydoc.conf')
 python/epydoc.conf | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/python/epydoc.conf b/python/epydoc.conf
index 0b42e729f8..95a6af0974 100644
--- a/python/epydoc.conf
+++ b/python/epydoc.conf
@@ -34,4 +34,4 @@ private: no
 exclude: pyspark.cloudpickle pyspark.worker pyspark.join
          pyspark.java_gateway pyspark.examples pyspark.shell pyspark.test
-         pyspark.rddsampler pyspark.daemon
+         pyspark.rddsampler pyspark.daemon pyspark.mllib._common