| Commit message | Author | Age | Files | Lines |
spark-915-segregate-scripts
Conflicts:
	bin/spark-shell
	core/pom.xml
	core/src/main/scala/org/apache/spark/SparkContext.scala
	core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/CoarseMesosSchedulerBackend.scala
	core/src/main/scala/org/apache/spark/ui/UIWorkloadGenerator.scala
	core/src/test/scala/org/apache/spark/DriverSuite.scala
	python/run-tests
	sbin/compute-classpath.sh
	sbin/spark-class
	sbin/stop-slaves.sh
Signed-off-by: shane-huang <shengsheng.huang@intel.com>
tests so we don't get the test spark.conf on the classpath.
The test in context.py created two different instances of the
SparkContext class by copying "globals", so that some tests can have a
global "sc" object and others can try initializing their own contexts.
This led to two JVM gateways being created since SparkConf also looked
at pyspark.context.SparkContext to get the JVM.
For now, this only adds MarshalSerializer, but it lays the groundwork
for supporting other custom serializers. Many of these mechanisms
can also be used to support deserialization of different data formats
sent by Java, such as data encoded by MsgPack.
This also fixes a bug in SparkContext.union().
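The kind of serializer described above could be sketched as a thin wrapper around Python's standard `marshal` module (the class shape and `dumps`/`loads` method names here are an assumption for illustration, not necessarily PySpark's actual API):

```python
import marshal

class MarshalSerializer:
    # Hypothetical sketch: marshal serializes Python primitives
    # (ints, floats, strings, lists, tuples, dicts) faster than
    # pickle, but cannot handle user-defined classes.
    def dumps(self, obj):
        return marshal.dumps(obj)

    def loads(self, data):
        return marshal.loads(data)

ser = MarshalSerializer()
payload = ser.dumps([1, 2.5, ("a", "b")])
roundtrip = ser.loads(payload)
```

The trade-off is typical for pluggable serializers: a restricted type domain in exchange for speed, with pickle remaining the general-purpose fallback.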
This should avoid exceptions caused by existing
files with different contents.
I also removed some unused code.
Expand the PySpark programming guide.