This reverts commit 7feeda3d729f9397aa15ee8750c01ef5aa601962.
This reverts commit ea1a455a755f83f46fc8bf242410917d93d0c52c.
SPARK-2034. KafkaInputDStream doesn't close resources and may prevent JVM shutdown
Tobias noted today on the mailing list:
========
I am trying to use Spark Streaming with Kafka, which works like a
charm – except for shutdown. When I run my program with "sbt
run-main", sbt will never exit, because there are two non-daemon
threads left that don't die.
I created a minimal example at
<https://gist.github.com/tgpfeiffer/b1e765064e983449c6b6#file-kafkadoesntshutdown-scala>.
It starts a StreamingContext and does nothing more than connecting to
a Kafka server and printing what it receives. Using the `future { ... }`
construct, I shut down the StreamingContext after some seconds and
then print the difference between the threads at start time and at end
time. The output can be found at
<https://gist.github.com/tgpfeiffer/b1e765064e983449c6b6#file-output1>.
There are a number of threads remaining that will prevent sbt from
exiting.
When I replace `KafkaUtils.createStream(...)` with a call that does
exactly the same, except that it calls `consumerConnector.shutdown()`
in `KafkaReceiver.onStop()` (which it should, IMO), the output is as
shown at <https://gist.github.com/tgpfeiffer/b1e765064e983449c6b6#file-output2>.
Does anyone have any idea what is going on here and why the program
doesn't shut down properly? The behavior is the same with both Kafka
0.8.0 and 0.8.1.1, by the way.
========
Something similar was noted last year:
http://mail-archives.apache.org/mod_mbox/spark-dev/201309.mbox/%3C1380220041.2428.YahooMailNeo@web160804.mail.bf1.yahoo.com%3E
KafkaInputDStream doesn't close `ConsumerConnector` in `onStop()`, and does not close the `Executor` it creates. The latter leaves non-daemon threads and can prevent the JVM from shutting down even if streaming is closed properly.
Author: Sean Owen <sowen@cloudera.com>
Closes #980 from srowen/SPARK-2034 and squashes the following commits:
9f31a8d [Sean Owen] Restore ClassTag to private class because MIMA flags it; is the shadowing intended?
2d579a8 [Sean Owen] Close ConsumerConnector in onStop; shutdown() the local Executor that is created so that its threads stop when done; close the Zookeeper client even on exception; fix a few typos; log exceptions that otherwise vanish
(cherry picked from commit 476581e8c8ca03a5940c404fee8a06361ff94cb5)
Signed-off-by: Patrick Wendell <pwendell@gmail.com>
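The executor-thread leak described above is easy to reproduce outside Spark. The following is an illustrative Java sketch, not Spark's actual receiver code (the class and method names are hypothetical): pools created via `Executors` use non-daemon worker threads by default, so the JVM cannot exit while such a pool is alive, which is why the fix shuts the local executor down in `onStop()`.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Illustrative sketch of the leak class this commit fixes: an
// Executors-created pool holds non-daemon threads, so without an explicit
// shutdown() the process (and tools like sbt) cannot terminate.
public class ReceiverShutdownSketch {
    private final ExecutorService pool = Executors.newFixedThreadPool(2);

    public void onStart() {
        // Stand-in for the message-handling work a receiver would submit.
        pool.submit(() -> System.out.println("consuming..."));
    }

    // Without this shutdown, the two worker threads linger as non-daemon
    // threads and keep the JVM alive after streaming has stopped.
    public boolean onStop() {
        pool.shutdown();                      // stop accepting new tasks
        try {
            return pool.awaitTermination(5, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return false;
        }
    }

    public static void main(String[] args) {
        ReceiverShutdownSketch r = new ReceiverShutdownSketch();
        r.onStart();
        System.out.println("terminated cleanly: " + r.onStop());
    }
}
```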
[SPARK-1998] SparkFlumeEvent with body bigger than 1020 bytes are not read properly

A Flume event sent to Spark will fail if the body is too large and numHeaders is greater than zero.
Author: joyyoj <sunshch@gmail.com>
Closes #951 from joyyoj/master and squashes the following commits:
f4660c5 [joyyoj] [SPARK-1998] SparkFlumeEvent with body bigger than 1020 bytes are not read properly
(cherry picked from commit 29660443077619ee854025b8d0d3d64181724054)
Signed-off-by: Patrick Wendell <pwendell@gmail.com>
The changes could be ported back to 0.9 as well.
This changes `in.read` to `in.readFully` so the whole input stream is read rather than just the first 1020 bytes.
This should be OK considering that Flume caps the body size to 32K by default.
Author: David Lemieux <david.lemieux@radialpoint.com>
Closes #865 from lemieud/SPARK-1916 and squashes the following commits:
a265673 [David Lemieux] Updated SparkFlumeEvent to read the whole stream rather than the first X bytes.
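The one-line fix hinges on a contract difference in `java.io`: a plain `read(byte[], int, int)` may return after delivering fewer bytes than requested, while `DataInputStream.readFully` loops until the buffer is completely filled. A self-contained Java sketch; the `ChunkedStream` class is a hypothetical stand-in that, like the buggy path, yields at most 1020 bytes per call:

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;

// Sketch of the bug class this commit fixes: InputStream.read() is allowed
// to stop short of the requested length; DataInputStream.readFully() is not.
public class ReadFullySketch {
    // An InputStream that never returns more than 1020 bytes per read call,
    // mimicking how the event body was only partially consumed.
    static class ChunkedStream extends InputStream {
        private final ByteArrayInputStream in;
        ChunkedStream(byte[] data) { this.in = new ByteArrayInputStream(data); }
        @Override public int read() { return in.read(); }
        @Override public int read(byte[] b, int off, int len) {
            return in.read(b, off, Math.min(len, 1020));
        }
    }

    public static void main(String[] args) throws IOException {
        byte[] body = new byte[4096];          // a body larger than 1020 bytes

        // Buggy pattern: a single read() may stop short of the full body.
        int n = new ChunkedStream(body).read(new byte[body.length], 0, body.length);
        System.out.println("single read() returned " + n + " bytes");       // 1020

        // Fixed pattern: readFully() keeps reading until the buffer is full.
        byte[] full = new byte[body.length];
        new DataInputStream(new ChunkedStream(body)).readFully(full);
        System.out.println("readFully() filled " + full.length + " bytes"); // 4096
    }
}
```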
This reverts commit 2f1dc868e5714882cf40d2633fb66772baf34789.
This reverts commit 832dc594e7666f1d402334f8015ce29917d9c888.
This reverts commit d807023479ce10aec28ef3c1ab646ddefc2e663c.
This reverts commit 67dd53d2556f03ce292e6889128cf441f1aa48f8.
This reverts commit 920f947eb5a22a679c0c3186cf69ee75f6041c75.
This reverts commit f8e611955096c5c1c7db5764b9d2851b1d295f0d.
This reverts commit 80eea0f111c06260ffaa780d2f3f7facd09c17bc.
This reverts commit e5436b8c1a79ce108f3af402455ac5f6dc5d1eb3.
This reverts commit 9212b3e5bb5545ccfce242da8d89108e6fb1c464.
This reverts commit c4746aa6fe4aaf383e69e34353114d36d1eb9ba6.
This reverts commit 54133abdce0246f6643a1112a5204afb2c4caa82.
This reverts commit e480bcfbd269ae1d7a6a92cfb50466cf192fe1fb.
These are a few changes based on the original patch by @scrapcodes.
Author: Prashant Sharma <prashant.s@imaginea.com>
Author: Patrick Wendell <pwendell@gmail.com>
Closes #785 from pwendell/package-docs and squashes the following commits:
c32b731 [Patrick Wendell] Changes based on Prashant's patch
c0463d3 [Prashant Sharma] added eof new line
ce8bf73 [Prashant Sharma] Added eof new line to all files.
4c35f2e [Prashant Sharma] SPARK-1563 Add package-info.java and package.scala files for all packages that appear in docs
(cherry picked from commit 46324279dae2fa803267d788f7c56b0ed643b4c8)
Signed-off-by: Patrick Wendell <pwendell@gmail.com>
This reverts commit 18f062303303824139998e8fc8f4158217b0dbc3.
This reverts commit d08e9604fc9958b7c768e91715c8152db2ed6fd0.
Pretty self-explanatory
Author: Tathagata Das <tathagata.das1565@gmail.com>
Closes #722 from tdas/example-fix and squashes the following commits:
7839979 [Tathagata Das] Minor changes.
0673441 [Tathagata Das] Fixed java docs of java streaming example
e687123 [Tathagata Das] Fixed scala style errors.
9b8d112 [Tathagata Das] Fixed streaming examples docs to use run-example instead of spark-submit.
This reverts commit 3d0a44833ab50360bf9feccc861cb5e8c44a4866.
This reverts commit 9772d85c6f3893d42044f4bab0e16f8b6287613a.