author    Patrick Wendell <pwendell@apache.org>  2014-05-30 09:09:04 +0000
committer Patrick Wendell <pwendell@apache.org>  2014-05-30 09:09:04 +0000
commit    39693fdedee6970b5aabf491bdfe192bfc4bb86a (patch)
tree      314fa6171b1db25992ea55be5c0a36e4d048c4f4 /site/news/submit-talks-to-spark-summit-2014.html
parent    9d3ca69383db14952c0e22a2d7d61854b8e9944f (diff)
Downloads and docs page updates for 1.0.0
Diffstat (limited to 'site/news/submit-talks-to-spark-summit-2014.html')
-rw-r--r--  site/news/submit-talks-to-spark-summit-2014.html  4
1 file changed, 2 insertions(+), 2 deletions(-)
diff --git a/site/news/submit-talks-to-spark-summit-2014.html b/site/news/submit-talks-to-spark-summit-2014.html
index 12976adfd..a0bc98da2 100644
--- a/site/news/submit-talks-to-spark-summit-2014.html
+++ b/site/news/submit-talks-to-spark-summit-2014.html
@@ -160,12 +160,12 @@
<h2>Submissions and registration open for Spark Summit 2014</h2>
-<p>After last year’s successful <a href="http://spark-summit.org/2013">first Spark Summit</a>, registrations
+<p>After last year&#8217;s successful <a href="http://spark-summit.org/2013">first Spark Summit</a>, registrations
and talk submissions are now open for <a href="http://spark-summit.org/2014">Spark Summit 2014</a>.
This will be a 3-day event in San Francisco organized by multiple companies in the Spark community.
The event will run <strong>June 30th to July 2nd</strong> in San Francisco, CA.</p>
-<p>If you’d like to present at the Summit, <a href="http://spark-summit.org/submit">submit a talk</a>
+<p>If you&#8217;d like to present at the Summit, <a href="http://spark-summit.org/submit">submit a talk</a>
before April 11th, 2014. We welcome talks on use cases, open source development, and applications built
on Spark.</p>