author    Joseph K. Bradley <joseph@databricks.com>  2016-09-09 05:35:10 -0700
committer Yanbo Liang <ybliang8@gmail.com>  2016-09-09 05:35:10 -0700
commit    65b814bf50e92e2e9b622d1602f18bacd217181c (patch)
tree      9deecaecdbe330faa9f81e70db2448edefc07937 /R
parent    92ce8d4849a0341c4636e70821b7be57ad3055b1 (diff)
download  spark-65b814bf50e92e2e9b622d1602f18bacd217181c.tar.gz
          spark-65b814bf50e92e2e9b622d1602f18bacd217181c.tar.bz2
          spark-65b814bf50e92e2e9b622d1602f18bacd217181c.zip
[SPARK-17456][CORE] Utility for parsing Spark versions
## What changes were proposed in this pull request?

This patch adds methods for extracting major and minor versions as Int types in Scala from a Spark version string.

Motivation: There are many hacks within Spark's codebase to identify and compare Spark versions. We should add a simple utility to standardize these code paths, especially since there have been mistakes made in the past. This will let us add unit tests as well. Currently, I want this functionality to check Spark versions to provide backwards compatibility for ML model persistence.

## How was this patch tested?

Unit tests

Author: Joseph K. Bradley <joseph@databricks.com>

Closes #15017 from jkbradley/version-parsing.
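A minimal sketch of what such a version-parsing utility could look like, assuming a regex-based approach; the object name `VersionParseSketch` and its exact behavior are illustrative assumptions, not necessarily the code introduced by this patch:

```scala
import scala.util.matching.Regex

// Illustrative sketch (not the patch itself): extract the major and minor
// components of a Spark version string such as "2.0.1" or "2.1.0-SNAPSHOT"
// as Ints, so callers can compare versions without ad-hoc string hacks.
object VersionParseSketch {
  // Capture "major.minor" at the start; anything after the minor component
  // (patch level, "-SNAPSHOT", etc.) is matched but ignored.
  private val versionRegex: Regex = """(\d+)\.(\d+)(.*)""".r

  /** Returns (major, minor), or throws for an unparsable version string. */
  def majorMinorVersion(sparkVersion: String): (Int, Int) = sparkVersion match {
    case versionRegex(major, minor, _) => (major.toInt, minor.toInt)
    case _ =>
      throw new IllegalArgumentException(
        s"Could not parse major and minor version numbers from: $sparkVersion")
  }
}

// Example usage:
//   VersionParseSketch.majorMinorVersion("2.0.1")          // (2, 0)
//   VersionParseSketch.majorMinorVersion("2.1.0-SNAPSHOT") // (2, 1)
```

Centralizing the parsing in one place like this is what makes the code paths easy to cover with unit tests, which is the stated motivation of the change.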
Diffstat (limited to 'R')
0 files changed, 0 insertions, 0 deletions