 docs/configuration.md         | 2 +-
 docs/monitoring.md            | 2 +-
 docs/sql-programming-guide.md | 6 +++---
 3 files changed, 5 insertions(+), 5 deletions(-)
diff --git a/docs/configuration.md b/docs/configuration.md
index ea99592408..c021a377ba 100644
--- a/docs/configuration.md
+++ b/docs/configuration.md
@@ -1951,7 +1951,7 @@ showDF(properties, numRows = 200, truncate = FALSE)
<td><code>spark.r.heartBeatInterval</code></td>
<td>100</td>
<td>
- Interval for heartbeats sents from SparkR backend to R process to prevent connection timeout.
+ Interval for heartbeats sent from SparkR backend to R process to prevent connection timeout.
</td>
</tr>
diff --git a/docs/monitoring.md b/docs/monitoring.md
index 5bc5e18c4d..2eef4568d0 100644
--- a/docs/monitoring.md
+++ b/docs/monitoring.md
@@ -41,7 +41,7 @@ directory must be supplied in the `spark.history.fs.logDirectory` configuration
and should contain sub-directories that each represents an application's event logs.
The spark jobs themselves must be configured to log events, and to log them to the same shared,
-writeable directory. For example, if the server was configured with a log directory of
+writable directory. For example, if the server was configured with a log directory of
`hdfs://namenode/shared/spark-logs`, then the client-side options would be:
```
diff --git a/docs/sql-programming-guide.md b/docs/sql-programming-guide.md
index b9be7a7545..ba3e55fc06 100644
--- a/docs/sql-programming-guide.md
+++ b/docs/sql-programming-guide.md
@@ -222,9 +222,9 @@ The `sql` function enables applications to run SQL queries programmatically and
## Global Temporary View
-Temporay views in Spark SQL are session-scoped and will disappear if the session that creates it
+Temporary views in Spark SQL are session-scoped and will disappear if the session that creates it
terminates. If you want to have a temporary view that is shared among all sessions and keep alive
-until the Spark application terminiates, you can create a global temporary view. Global temporary
+until the Spark application terminates, you can create a global temporary view. Global temporary
view is tied to a system preserved database `global_temp`, and we must use the qualified name to
refer it, e.g. `SELECT * FROM global_temp.view1`.
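For context, the behaviour the hunk above describes can be sketched in PySpark. This is illustrative only, not part of the commit; the view names and data are hypothetical, and a Spark runtime is assumed to be available.

```python
# Sketch of session-scoped vs. global temporary views
# (hypothetical names; assumes a Spark runtime is installed).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("global-temp-demo").getOrCreate()
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])

# Session-scoped view: disappears when this session terminates.
df.createOrReplaceTempView("view_local")

# Global temporary view: tied to the system-preserved `global_temp`
# database and kept alive until the Spark application terminates.
df.createGlobalTempView("view1")

# Must be referred to by its qualified name.
spark.sql("SELECT * FROM global_temp.view1").show()

# Another session in the same application can still see it.
spark.newSession().sql("SELECT * FROM global_temp.view1").show()
```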
@@ -1029,7 +1029,7 @@ following command:
bin/spark-shell --driver-class-path postgresql-9.4.1207.jar --jars postgresql-9.4.1207.jar
{% endhighlight %}
-Tables from the remote database can be loaded as a DataFrame or Spark SQL Temporary table using
+Tables from the remote database can be loaded as a DataFrame or Spark SQL temporary view using
the Data Sources API. Users can specify the JDBC connection properties in the data source options.
<code>user</code> and <code>password</code> are normally provided as connection properties for
logging into the data sources. In addition to the connection properties, Spark also supports
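The JDBC loading described in the hunk above can be sketched as follows. This is not part of the commit; the connection URL, table name, and credentials are placeholders, and both a Spark runtime and the PostgreSQL JDBC driver on the classpath (as shown in the `spark-shell` command above) are assumed.

```python
# Sketch of loading a remote database table as a DataFrame via the
# Data Sources API (placeholder URL, table, and credentials; assumes
# a Spark runtime with the JDBC driver on the classpath).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("jdbc-demo").getOrCreate()

jdbc_df = (
    spark.read.format("jdbc")
    # JDBC connection properties supplied as data source options.
    .option("url", "jdbc:postgresql://dbserver:5432/mydb")
    .option("dbtable", "schema.tablename")
    .option("user", "username")
    .option("password", "password")
    .load()
)

# The loaded table can also be registered as a temporary view.
jdbc_df.createOrReplaceTempView("remote_table")
```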