 documentation.md                                  |   2 +-
 downloads.md                                      |  10 +-
 js/downloads.js                                   |   1 +
 news/_posts/2014-11-26-spark-1-1-1-released.md    |  16 ++++
 releases/_posts/2014-11-26-spark-release-1-1-1.md | 109 ++++++++++
 5 files changed, 132 insertions(+), 6 deletions(-)
diff --git a/documentation.md b/documentation.md
index 5f2eb2335..4bf9e0f9a 100644
--- a/documentation.md
+++ b/documentation.md
@@ -12,7 +12,7 @@ navigation:
<p>Setup instructions, programming guides, and other documentation are available for each version of Spark below:</p>
<ul>
- <li><a href="{{site.url}}docs/latest/">Spark 1.1.0 (latest release)</a></li>
+ <li><a href="{{site.url}}docs/latest/">Spark 1.1.1 (latest release)</a></li>
<li><a href="{{site.url}}docs/1.0.2/">Spark 1.0.2</a></li>
<li><a href="{{site.url}}docs/0.9.2/">Spark 0.9.2</a></li>
<li><a href="{{site.url}}docs/0.8.1/">Spark 0.8.1</a></li>
diff --git a/downloads.md b/downloads.md
index 3c9bd1503..874e9269f 100644
--- a/downloads.md
+++ b/downloads.md
@@ -16,9 +16,9 @@ $(document).ready(function() {
## Download Spark
-The latest release of Spark is Spark 1.1.0, released on September 11, 2014
-<a href="{{site.url}}releases/spark-release-1-1-0.html">(release notes)</a>
-<a href="https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=2f9b2bd7844ee8393dc9c319f4fefedf95f5e460">(git tag)</a><br/>
+The latest release of Spark is Spark 1.1.1, released on November 26, 2014
+<a href="{{site.url}}releases/spark-release-1-1-1.html">(release notes)</a>
+<a href="https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=3693ae5d3c01861557e06edbc32a8112683f3d86">(git tag)</a><br/>
1. Choose a Spark release:
<select id="sparkVersionSelect" onChange="javascript:onVersionSelect();"></select><br>
@@ -38,7 +38,7 @@ Spark artifacts are [hosted in Maven Central](http://search.maven.org/#browse%7C
groupId: org.apache.spark
artifactId: spark-core_2.10
- version: 1.1.0
+ version: 1.1.1
### Development and Maintenance Branches
If you are interested in working with the newest under-development code or contributing to Spark development, you can also check out the master branch from Git:
@@ -46,7 +46,7 @@ If you are interested in working with the newest under-development code or contr
# Master development branch
git clone git://github.com/apache/spark.git
- # 1.1 maintenance branch with stability fixes on top of Spark 1.1.0
+ # 1.1 maintenance branch with stability fixes on top of Spark 1.1.1
git clone git://github.com/apache/spark.git -b branch-1.1
Once you've downloaded Spark, you can find instructions for installing and building it on the <a href="{{site.url}}documentation.html">documentation page</a>.
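
The Maven coordinates updated in this hunk map directly onto an sbt dependency. A minimal sketch of a build definition using them (the coordinates come from the diff above; the project name, Scala version, and use of the `%%` cross-version convention, which appends the `_2.10` suffix, are illustrative assumptions):

```scala
// build.sbt -- sketch only; settings other than the Spark coordinates
// are assumptions, not taken from this commit.
name := "spark-example"

scalaVersion := "2.10.4"

// Equivalent to groupId org.apache.spark, artifactId spark-core_2.10,
// version 1.1.1: %% appends the Scala binary version automatically.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.1"
```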
diff --git a/js/downloads.js b/js/downloads.js
index 946ac7ba1..ce40883a6 100644
--- a/js/downloads.js
+++ b/js/downloads.js
@@ -26,6 +26,7 @@ var packagesV3 = packagesV2.concat([mapr3, mapr4]);
// 1.1.0+
var packagesV4 = packagesV1.concat([hadoop2p3, hadoop2p4, mapr3, mapr4]);
+addRelease("1.1.1", new Date("11/26/2014"), packagesV4, true);
addRelease("1.1.0", new Date("9/11/2014"), packagesV4, true);
addRelease("1.0.2", new Date("8/5/2014"), packagesV3, true);
addRelease("1.0.1", new Date("7/11/2014"), packagesV3);
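
For context, `addRelease` is defined elsewhere in `downloads.js`; only its call signature is visible in this hunk. A hypothetical sketch of such a helper — the storage shape, the package-list contents, and the handling of the optional fourth flag are assumptions, not the site's actual implementation:

```javascript
// Hypothetical addRelease helper; the signature mirrors the calls in the
// diff above, but the body is an illustrative assumption.
var releases = {};

function addRelease(version, releaseDate, packages, downloadable) {
  releases[version] = {
    released: releaseDate,
    packages: packages,
    // Older calls in the diff omit the flag, so default it to false.
    downloadable: downloadable === true
  };
}

// Mirrors the calls above (package lists stubbed with placeholder strings):
addRelease("1.1.1", new Date("11/26/2014"), ["hadoop1", "hadoop2"], true);
addRelease("1.0.1", new Date("7/11/2014"), ["hadoop1"]);
```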
diff --git a/news/_posts/2014-11-26-spark-1-1-1-released.md b/news/_posts/2014-11-26-spark-1-1-1-released.md
new file mode 100644
index 000000000..df32f2617
--- /dev/null
+++ b/news/_posts/2014-11-26-spark-1-1-1-released.md
@@ -0,0 +1,16 @@
+---
+layout: post
+title: Spark 1.1.1 released
+categories:
+- News
+tags: []
+status: publish
+type: post
+published: true
+meta:
+ _edit_last: '4'
+ _wpas_done_all: '1'
+---
+We are happy to announce the availability of <a href="{{site.url}}releases/spark-release-1-1-1.html" title="Spark Release 1.1.1">Spark 1.1.1</a>! This is a maintenance release that includes contributions from 55 developers. Spark 1.1.1 includes fixes across several areas of Spark, including the core API, Streaming, PySpark, SQL, GraphX, and MLlib.
+
+Visit the <a href="{{site.url}}releases/spark-release-1-1-1.html" title="Spark Release 1.1.1">release notes</a> to read about this release or <a href="{{site.url}}downloads.html">download</a> the release today.
diff --git a/releases/_posts/2014-11-26-spark-release-1-1-1.md b/releases/_posts/2014-11-26-spark-release-1-1-1.md
new file mode 100644
index 000000000..415394204
--- /dev/null
+++ b/releases/_posts/2014-11-26-spark-release-1-1-1.md
@@ -0,0 +1,109 @@
+---
+layout: post
+title: Spark Release 1.1.1
+categories: []
+tags: []
+status: publish
+type: post
+published: true
+meta:
+ _edit_last: '4'
+ _wpas_done_all: '1'
+---
+
+Spark 1.1.1 is a maintenance release with bug fixes. This release is based on the [branch-1.1](https://github.com/apache/spark/tree/branch-1.1) maintenance branch of Spark. We recommend that all 1.1.0 users upgrade to this stable release. Contributions to this release came from 55 developers.
+
+To download Spark 1.1.1, visit the <a href="{{site.url}}downloads.html">downloads</a> page.
+
+### Fixes
+Spark 1.1.1 contains bug fixes in several components. Some of the more important fixes are highlighted below. You can visit the [Spark issue tracker](http://s.apache.org/z9h) for the full list of fixes.
+
+#### Spark Core
+- Avoid many small spills in external data structures ([SPARK-4480](https://issues.apache.org/jira/browse/SPARK-4480))
+- Memory leak in connection manager timeout thread ([SPARK-4393](https://issues.apache.org/jira/browse/SPARK-4393))
+- Incorrect handling of channel read return value may lead to data truncation ([SPARK-4107](https://issues.apache.org/jira/browse/SPARK-4107))
+- Stream corruption exceptions observed in sort-based shuffle ([SPARK-3948](https://issues.apache.org/jira/browse/SPARK-3948))
+- Integer overflow in sort-based shuffle key comparison ([SPARK-3032](https://issues.apache.org/jira/browse/SPARK-3032))
+- Lack of thread safety in Hadoop configuration usage in Spark ([SPARK-2546](https://issues.apache.org/jira/browse/SPARK-2546))
+
+#### SQL
+- Wrong Parquet filters are created for all inequality predicates with literals on the left hand side ([SPARK-4468](https://issues.apache.org/jira/browse/SPARK-4468))
+- Support backticks in aliases ([SPARK-3708](https://issues.apache.org/jira/browse/SPARK-3708) and [SPARK-3834](https://issues.apache.org/jira/browse/SPARK-3834))
+- ColumnValue types do not match in Spark rows vs Hive rows ([SPARK-3704](https://issues.apache.org/jira/browse/SPARK-3704))
+
+#### PySpark
+- Fix sortByKey on empty RDD ([SPARK-4304](https://issues.apache.org/jira/browse/SPARK-4304))
+- Avoid using the same random seed for all partitions ([SPARK-4148](https://issues.apache.org/jira/browse/SPARK-4148))
+- Avoid OOMs when take() is run on empty partitions ([SPARK-3211](https://issues.apache.org/jira/browse/SPARK-3211))
+
+#### MLlib
+- KryoException caused by ALS.trainImplicit in PySpark ([SPARK-3990](https://issues.apache.org/jira/browse/SPARK-3990))
+
+#### Streaming
+- Block replication continuously fails if target is down ([SPARK-3495](https://issues.apache.org/jira/browse/SPARK-3495))
+- Block replication may choose driver as target ([SPARK-3496](https://issues.apache.org/jira/browse/SPARK-3496))
+
+#### GraphX
+- Ensure VertexRDD.apply uses mergeFunc ([SPARK-2062](https://issues.apache.org/jira/browse/SPARK-2062))
+
+### Contributors
+The following developers contributed to this release:
+
+* Andrew Ash - Documentation and bug fixes in Core
+* Andrew Or - Improvements in Core; bug fixes in Windows, Core, Block Manager, and Shuffle
+* Aniket Bhatnagar - Bug fixes in Core and Streaming
+* Benjamin Piering - Improvements in GraphX
+* Bertrand Bossy - Bug fixes in Core
+* Brenden Matthews - Bug fixes in Mesos
+* Chao Chen - Documentation in Core
* Cheng Hao - Tests in SQL
+* Cheng Lian - Bug fixes in PySpark, MLlib, and SQL
+* Chirag Aggarwal - Bug fixes in SQL
+* Chris Cope - Bug fixes in YARN
+* Davies Liu - Improvements in PySpark; bug fixes in Core, SQL, and PySpark
+* Eric Eijkelenboom - Bug fixes in Core
+* Eric Liang - Bug fixes in Core and SQL
+* Eugen Cepoi - Improvements in Core
+* Fei Wang - Improvements in Core and SQL; bug fixes in Core; documentation in Streaming
+* Grega Kespret - Documentation in Core
+* Guoqiang Li - Bug fixes in Web UI
+* Henry Cook - Documentation in Core
+* Hossein Falaki - Bug fixes in Web UI
+* Ian Hummel - Improvements in Core
+* Jakub Dubovsky - Bug fixes in Core
+* Jerry Shao - Bug fixes in Shuffle
+* Jongyoul Lee - Bug fixes in Core and Mesos
+* Josh Rosen - Improvements in Core; bug fixes in Streaming and Core
+* Kousuke Saruta - Improvements in Core and Web UI; bug fixes in Core, Web UI, and PySpark
+* Larry Xiao - Bug fixes in GraphX
+* Lianhui Wang - Bug fixes in GraphX
+* Liang-Chi Hsieh - Bug fixes in Core
+* Lu Lu - Improvements in GraphX
+* Ma Ji - Bug fixes in Streaming
+* Marcelo Vanzin - Bug fixes in YARN
+* Mark Hamstra - Bug fixes in Core
+* Masayoshi Tsuzuki - Improvements in Core, Shell, and PySpark; bug fixes in Windows and PySpark
+* Michael Armbrust - Documentation in Core
+* Michael Griffiths - Bug fixes in PySpark
+* Min Shen - Bug fixes in YARN
+* Mubarak Seyed - Improvements in Streaming
+* Nicholas Chammas - Documentation in Core
+* Niklas Wilcke - Bug fixes in Core
+* Oded Zimerman - Bug fixes in GraphX
+* Reynold Xin - New features in Core; bug fixes in Core and SQL
+* Rongquan Su - Improvements in Streaming
+* Sandy Ryza - Bug fixes in Core
+* Sean Owen - Bug fixes in Java API, Core, and Streaming
+* Shane Knapp - Bug fixes in Core
+* Shixiong Zhu - Improvements in Web UI; bug fixes in Core and YARN
+* Shuo Xiang - Bug fixes in MLlib
+* Tal Sliwowicz - Bug fixes in Core and Block Manager
+* Tao Wang - Improvements and bug fixes in Core
+* Tathagata Das - Improvements in Streaming; bug fixes in Core, Block Manager, and Streaming
+* Xiangrui Meng - Improvements in Web UI and PySpark; bug fixes in Core, MLlib, and PySpark
+* Yantang Zhai - Bug fixes in Core and Web UI
+* Yash Datta - Improvements in SQL
+* Yin Huai - Documentation in Core
+
+_Thanks to everyone who contributed!_
+