path: root/core/src/test
author    Xiangrui Meng <meng@databricks.com>    2014-06-24 19:06:07 -0700
committer Patrick Wendell <pwendell@gmail.com>   2014-06-24 19:06:07 -0700
commit    8ca41769fb16115a5a14ac842199d16cb28641ba (patch)
tree      c6782c657a1caf222b58787cd14d271c63a05da0 /core/src/test
parent    a162c9b337d99dd2a6102a80deb2a9707cdd93e9 (diff)
[SPARK-1112, 2156] Bootstrap to fetch the driver's Spark properties.
This is an alternative solution to #1124. Before launching the executor backend, we first fetch the driver's Spark properties and use them to overwrite the executor's Spark properties. This should be better than #1124. @pwendell Are there Spark properties that might differ between the driver and the executors?

Author: Xiangrui Meng <meng@databricks.com>

Closes #1132 from mengxr/akka-bootstrap and squashes the following commits:

77ff32d [Xiangrui Meng] organize imports
68e1dfb [Xiangrui Meng] use timeout from AkkaUtils; remove props from RegisteredExecutor
46d332d [Xiangrui Meng] fix a test
7947c18 [Xiangrui Meng] increase slack size for akka
4ab696a [Xiangrui Meng] bootstrap to retrieve driver spark conf
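The merge described in the commit message (driver properties overwrite executor properties) can be sketched as a plain map merge. This is a hypothetical illustration of the idea only, not the actual CoarseGrainedExecutorBackend code; the object name, `mergeDriverProps`, and the property values below are invented for the example.

```scala
// Sketch: an executor starts with local defaults, fetches the driver's
// Spark properties, and lets the driver's values take precedence.
object BootstrapSketch {
  def mergeDriverProps(
      executorProps: Map[String, String],
      driverProps: Map[String, String]): Map[String, String] =
    // `++` keeps right-hand entries on key collisions, so driver wins.
    executorProps ++ driverProps

  def main(args: Array[String]): Unit = {
    val executorProps = Map(
      "spark.akka.frameSize"  -> "10",   // executor's local default
      "spark.executor.memory" -> "2g")
    val driverProps = Map(
      "spark.akka.frameSize" -> "128")   // value fetched from the driver

    val merged = mergeDriverProps(executorProps, driverProps)
    println(merged("spark.akka.frameSize"))   // driver value wins
    println(merged("spark.executor.memory"))  // executor-only key survives
  }
}
```

Keys set only on the executor survive the merge; any key the driver also sets is overwritten by the driver's value.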
Diffstat (limited to 'core/src/test')
-rw-r--r--  core/src/test/scala/org/apache/spark/scheduler/CoarseGrainedSchedulerBackendSuite.scala | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/core/src/test/scala/org/apache/spark/scheduler/CoarseGrainedSchedulerBackendSuite.scala b/core/src/test/scala/org/apache/spark/scheduler/CoarseGrainedSchedulerBackendSuite.scala
index efef9d26da..f77661ccbd 100644
--- a/core/src/test/scala/org/apache/spark/scheduler/CoarseGrainedSchedulerBackendSuite.scala
+++ b/core/src/test/scala/org/apache/spark/scheduler/CoarseGrainedSchedulerBackendSuite.scala
@@ -35,7 +35,7 @@ class CoarseGrainedSchedulerBackendSuite extends FunSuite with LocalSparkContext
val thrown = intercept[SparkException] {
larger.collect()
}
- assert(thrown.getMessage.contains("Consider using broadcast variables for large values"))
+ assert(thrown.getMessage.contains("using broadcast variables for large values"))
val smaller = sc.parallelize(1 to 4).collect()
assert(smaller.size === 4)
}