author    Kevin McHale <kevin@premise.com>  2016-08-03 13:15:13 -0700
committer Sean Owen <sowen@cloudera.com>  2016-08-03 13:15:13 -0700
commit    685b08e2611b69f8db60a00c0c94aecd315e2a3e (patch)
tree      093efd9fe794a29653938112b9717f8c39f9242a
parent    e6f226c5670d9f332b49ca40ff7b86b81a218d1b (diff)
[SPARK-14204][SQL] register driverClass rather than user-specified class
This is a pull request that was originally merged against branch-1.6 as #12000, and is now being merged into master as well. srowen zzcclp JoshRosen

This pull request fixes an issue in which cluster-mode executors fail to properly register a JDBC driver when the driver is provided in a jar by the user, but the driver class name is derived from a JDBC URL rather than specified by the user. The consequence is that all JDBC accesses under the described circumstances fail with an IllegalStateException. I reported the issue here: https://issues.apache.org/jira/browse/SPARK-14204

My proposed solution is to have the executors register the JDBC driver class under all circumstances, not only when the driver is specified by the user.

This patch was tested manually: I built an assembly jar, deployed it to a cluster, and confirmed that the problem was fixed.

Author: Kevin McHale <kevin@premise.com>

Closes #14420 from mchalek/mchalek-jdbc_driver_registration.
 sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala b/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala
index 81d38e3699..a33c26d813 100644
--- a/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala
+++ b/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala
@@ -55,7 +55,7 @@ object JdbcUtils extends Logging {
       DriverManager.getDriver(url).getClass.getCanonicalName
     }
     () => {
-      userSpecifiedDriverClass.foreach(DriverRegistry.register)
+      DriverRegistry.register(driverClass)
       val driver: Driver = DriverManager.getDrivers.asScala.collectFirst {
         case d: DriverWrapper if d.wrapped.getClass.getCanonicalName == driverClass => d
         case d if d.getClass.getCanonicalName == driverClass => d
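The effect of the patch can be illustrated with a small, self-contained plain-JDBC sketch. This is not Spark's code: `FakeDriver` and `lookup` are hypothetical stand-ins for the user-provided driver and for the register-then-`collectFirst` logic in `JdbcUtils`. The point it demonstrates is that once the resolved `driverClass` is registered unconditionally, the executor can always find the driver in `DriverManager` by canonical class name.

```java
import java.sql.Connection;
import java.sql.Driver;
import java.sql.DriverManager;
import java.sql.DriverPropertyInfo;
import java.sql.SQLException;
import java.sql.SQLFeatureNotSupportedException;
import java.util.Collections;
import java.util.Properties;
import java.util.logging.Logger;

public class DriverLookupSketch {

    // Hypothetical stand-in for a user-provided JDBC driver shipped in a jar.
    static class FakeDriver implements Driver {
        public Connection connect(String url, Properties info) { return null; }
        public boolean acceptsURL(String url) { return url.startsWith("jdbc:fake:"); }
        public DriverPropertyInfo[] getPropertyInfo(String url, Properties info) {
            return new DriverPropertyInfo[0];
        }
        public int getMajorVersion() { return 1; }
        public int getMinorVersion() { return 0; }
        public boolean jdbcCompliant() { return false; }
        public Logger getParentLogger() throws SQLFeatureNotSupportedException {
            throw new SQLFeatureNotSupportedException();
        }
    }

    // Mirrors the patched lookup: scan registered drivers for the one whose
    // canonical class name matches driverClass, as JdbcUtils does after
    // DriverRegistry.register(driverClass) has run.
    static Driver lookup(String driverClass) {
        for (Driver d : Collections.list(DriverManager.getDrivers())) {
            if (driverClass.equals(d.getClass().getCanonicalName())) {
                return d;
            }
        }
        // This is the failure mode the patch prevents: the driver was never
        // registered, so the lookup comes up empty.
        throw new IllegalStateException("Did not find registered driver " + driverClass);
    }

    public static void main(String[] args) throws SQLException {
        // The "always register" step from the patch; without it, lookup() throws.
        DriverManager.registerDriver(new FakeDriver());
        Driver d = lookup(FakeDriver.class.getCanonicalName());
        System.out.println(d.acceptsURL("jdbc:fake:db"));  // prints "true"
    }
}
```

With the pre-patch behavior, registration only happened when the user set the `driver` property, so a class name derived from the URL was never registered on the executor and the equivalent of `lookup` failed with the IllegalStateException described above.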