Diffstat (limited to 'docs')

 docs/programming-guide.md | 2 ++
 1 file changed, 2 insertions(+)
diff --git a/docs/programming-guide.md b/docs/programming-guide.md
index 9de2f914b8..49f319ba77 100644
--- a/docs/programming-guide.md
+++ b/docs/programming-guide.md
@@ -117,6 +117,8 @@ The first thing a Spark program must do is to create a [SparkContext](api/scala/
 how to access a cluster. To create a `SparkContext` you first need to build a [SparkConf](api/scala/index.html#org.apache.spark.SparkConf) object
 that contains information about your application.
 
+Only one SparkContext may be active per JVM. You must `stop()` the active SparkContext before creating a new one.
+
 {% highlight scala %}
 val conf = new SparkConf().setAppName(appName).setMaster(master)
 new SparkContext(conf)
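The rule this patch documents can be sketched in Scala. This is a minimal illustration, not part of the patch: it assumes a local Spark installation, and the `"demo"` app name and `local[*]` master are placeholder values chosen for the example.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical configuration values for illustration only.
val conf = new SparkConf().setAppName("demo").setMaster("local[*]")

val sc = new SparkContext(conf)
// ... run jobs with sc ...

// Only one SparkContext may be active per JVM, so the active context
// must be stopped before a replacement can be constructed.
sc.stop()
val sc2 = new SparkContext(conf)
```

Attempting to construct the second `SparkContext` without the intervening `stop()` is exactly the situation the added documentation warns against.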