| field | value |
|---|---|
| author | Davies Liu <davies.liu@gmail.com>, 2014-09-13 16:22:04 -0700 |
| committer | Josh Rosen <joshrosen@apache.org>, 2014-09-13 16:22:04 -0700 |
| commit | 2aea0da84c58a179917311290083456dfa043db7 (patch) |
| tree | 6cda208e50f24c31883f1fdf2f51b7a6a8399ff1 /python/pyspark/shuffle.py |
| parent | 0f8c4edf4e750e3d11da27cc22c40b0489da7f37 (diff) |
| download | spark-2aea0da84c58a179917311290083456dfa043db7.tar.gz, .tar.bz2, .zip |
[SPARK-3030] [PySpark] Reuse Python worker
Reuse the Python worker to avoid the overhead of forking a Python process for each task. It also tracks the broadcasts already sent to each worker, avoiding re-sending the same broadcast.
This reduces the time for a dummy task from 22ms to 13ms (about -40%), which helps reduce latency for Spark Streaming.
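Outside Spark, the difference between forking a fresh interpreter per task and reusing a worker process can be sketched with the standard library's `multiprocessing.Pool` (an illustrative analogy only, not Spark's actual worker code): `maxtasksperchild=1` forces a new worker per task, while the default pool reuses workers.

```python
from multiprocessing import Pool

def square(x):
    return x * x

if __name__ == "__main__":
    # maxtasksperchild=1: every task runs in a freshly forked worker,
    # analogous to PySpark's old fork-per-task behaviour.
    with Pool(processes=2, maxtasksperchild=1) as fresh:
        no_reuse = fresh.map(square, range(8))

    # Default Pool: worker processes are reused across tasks,
    # analogous to spark.python.worker.reuse=true.
    with Pool(processes=2) as reused:
        reuse = reused.map(square, range(8))

    # Both produce the same results; only the process lifecycle
    # (and hence the per-task fork overhead) differs.
    assert no_reuse == reuse == [x * x for x in range(8)]
```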
For a job with a broadcast (43M after compression):
```python
b = sc.broadcast(set(range(30000000)))
print(sc.parallelize(range(24000), 100).filter(lambda x: x in b.value).count())
```
It finishes in 281s without worker reuse and in 65s with it (4 CPUs). Reusing the worker saves about 9 seconds of broadcast transfer and deserialization per task.
Worker reuse is enabled by default and can be disabled by setting `spark.python.worker.reuse = false`.
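The property can be set through Spark's usual configuration channels; a sketch of two standard ways to disable it (the property name is from this commit, the file/flag placement is the common Spark configuration mechanism):

```shell
# Disable Python worker reuse for a single application via spark-submit
spark-submit --conf spark.python.worker.reuse=false my_app.py

# Or persistently, in conf/spark-defaults.conf:
# spark.python.worker.reuse  false
```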
Author: Davies Liu <davies.liu@gmail.com>
Closes #2259 from davies/reuse-worker and squashes the following commits:
f11f617 [Davies Liu] Merge branch 'master' into reuse-worker
3939f20 [Davies Liu] fix bug in serializer in mllib
cf1c55e [Davies Liu] address comments
3133a60 [Davies Liu] fix accumulator with reused worker
760ab1f [Davies Liu] do not reuse worker if there are any exceptions
7abb224 [Davies Liu] refactor: synchronized with itself
ac3206e [Davies Liu] renaming
8911f44 [Davies Liu] synchronized getWorkerBroadcasts()
6325fc1 [Davies Liu] bugfix: bid >= 0
e0131a2 [Davies Liu] fix name of config
583716e [Davies Liu] only reuse completed and not interrupted worker
ace2917 [Davies Liu] kill python worker after timeout
6123d0f [Davies Liu] track broadcasts for each worker
8d2f08c [Davies Liu] reuse python worker
Diffstat (limited to 'python/pyspark/shuffle.py')
0 files changed, 0 insertions, 0 deletions