| author | Prashant Sharma <prashant.s@imaginea.com> | 2014-05-08 10:23:05 -0700 |
|---|---|---|
| committer | Patrick Wendell <pwendell@gmail.com> | 2014-05-08 10:23:05 -0700 |
| commit | 44dd57fb66bb676d753ad8d9757f9f4c03364113 (patch) | |
| tree | 755cdff1c17a29b24837a6405fed5eb46733769e /core/src | |
| parent | 19c8fb02bc2c2f76c3c45bfff4b8d093be9d7c66 (diff) | |
SPARK-1565, update examples to be used with spark-submit script.
Commit for initial feedback. Basically, I am curious whether we should prompt the user to provide args, especially when they are mandatory, and whether we can skip the prompt when they are not.
A few other things also did not work, for example:
`bin/spark-submit examples/target/scala-2.10/spark-examples-1.0.0-SNAPSHOT-hadoop1.0.4.jar --class org.apache.spark.examples.SparkALS --arg 100 500 10 5 2`
Not all of the args get passed through properly; maybe I have messed something up and will try to sort it out.
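For context, `spark-submit` expects launcher options such as `--class` and `--master` to come *before* the application jar, with everything after the jar forwarded verbatim to the application's `main`. A sketch of the intended invocation for the command above (the jar path matches the one quoted; the `--master` value is illustrative):

```shell
# Options first, then the application jar, then the app's own arguments.
# Everything after the jar (100 500 10 5 2) is passed to SparkALS.main as-is.
bin/spark-submit \
  --class org.apache.spark.examples.SparkALS \
  --master "local[*]" \
  examples/target/scala-2.10/spark-examples-1.0.0-SNAPSHOT-hadoop1.0.4.jar \
  100 500 10 5 2
```

Putting `--class` or app arguments before the jar, as in the failing command, leaves them interpreted as (or swallowed by) launcher options rather than forwarded to the application.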
Author: Prashant Sharma <prashant.s@imaginea.com>
Closes #552 from ScrapCodes/SPARK-1565/update-examples and squashes the following commits:
669dd23 [Prashant Sharma] Review comments
2727e70 [Prashant Sharma] SPARK-1565, update examples to be used with spark-submit script.
Diffstat (limited to 'core/src')
-rw-r--r-- | core/src/main/scala/org/apache/spark/SparkContext.scala | 8 |
1 file changed, 4 insertions(+), 4 deletions(-)
diff --git a/core/src/main/scala/org/apache/spark/SparkContext.scala b/core/src/main/scala/org/apache/spark/SparkContext.scala
index eb14d87467..9d7c2c8d3d 100644
--- a/core/src/main/scala/org/apache/spark/SparkContext.scala
+++ b/core/src/main/scala/org/apache/spark/SparkContext.scala
@@ -74,10 +74,10 @@ class SparkContext(config: SparkConf) extends Logging {
    * be generated using [[org.apache.spark.scheduler.InputFormatInfo.computePreferredLocations]]
    * from a list of input files or InputFormats for the application.
    */
-  @DeveloperApi
-  def this(config: SparkConf, preferredNodeLocationData: Map[String, Set[SplitInfo]]) = {
-    this(config)
-    this.preferredNodeLocationData = preferredNodeLocationData
+  @DeveloperApi
+  def this(config: SparkConf, preferredNodeLocationData: Map[String, Set[SplitInfo]]) = {
+    this(config)
+    this.preferredNodeLocationData = preferredNodeLocationData
   }
 
   /**