
Commit 7b79957

scwf authored and marmbrus committed
[SQL] Minor fix for doc and comment
Author: wangfei <[email protected]>

Closes #3533 from scwf/sql-doc1 and squashes the following commits:

962910b [wangfei] doc and comment fix
1 parent: bc35381

3 files changed, 7 insertions(+), 5 deletions(-)


docs/sql-programming-guide.md

Lines changed: 2 additions & 1 deletion
@@ -1002,7 +1002,7 @@ Several caching related features are not supported yet:
 ## Compatibility with Apache Hive
 
 Spark SQL is designed to be compatible with the Hive Metastore, SerDes and UDFs. Currently Spark
-SQL is based on Hive 0.12.0.
+SQL is based on Hive 0.12.0 and 0.13.1.
 
 #### Deploying in Existing Hive Warehouses
 

@@ -1041,6 +1041,7 @@ Spark SQL supports the vast majority of Hive features, such as:
 * Sampling
 * Explain
 * Partitioned tables
+* View
 * All Hive DDL Functions, including:
   * `CREATE TABLE`
   * `CREATE TABLE AS SELECT`
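
To make the compatibility claim in this hunk concrete, here is a minimal, hypothetical Scala sketch (not part of this commit) exercising the Hive features the guide lists, including the newly documented View support; the table and view names are invented for illustration:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object HiveFeaturesSketch {
  def main(args: Array[String]) {
    val sc = new SparkContext(new SparkConf().setAppName("HiveFeaturesSketch"))
    val hiveContext = new HiveContext(sc)

    // Hive DDL, as listed in the guide: CREATE TABLE and CREATE TABLE AS SELECT.
    hiveContext.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
    hiveContext.sql("CREATE TABLE src_copy AS SELECT key, value FROM src")

    // Views, newly listed as supported by this change.
    hiveContext.sql("CREATE VIEW IF NOT EXISTS src_view AS SELECT key FROM src")

    sc.stop()
  }
}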

examples/src/main/scala/org/apache/spark/examples/sql/hive/HiveFromSpark.scala

Lines changed: 4 additions & 3 deletions
@@ -29,9 +29,10 @@ object HiveFromSpark {
     val sc = new SparkContext(sparkConf)
     val path = s"${System.getenv("SPARK_HOME")}/examples/src/main/resources/kv1.txt"
 
-    // A local hive context creates an instance of the Hive Metastore in process, storing
-    // the warehouse data in the current directory. This location can be overridden by
-    // specifying a second parameter to the constructor.
+    // A hive context adds support for finding tables in the MetaStore and writing queries
+    // using HiveQL. Users who do not have an existing Hive deployment can still create a
+    // HiveContext. When not configured by the hive-site.xml, the context automatically
+    // creates metastore_db and warehouse in the current directory.
     val hiveContext = new HiveContext(sc)
     import hiveContext._
 
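A self-contained sketch of the behavior the rewritten comment describes, assuming only the standard Spark 1.2 Hive module and no hive-site.xml on the classpath (table name hypothetical):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object LocalHiveSketch {
  def main(args: Array[String]) {
    val sc = new SparkContext(new SparkConf().setAppName("LocalHiveSketch"))

    // With no hive-site.xml configured, constructing the context creates
    // metastore_db/ and a warehouse directory under the current working
    // directory, as the comment above describes.
    val hiveContext = new HiveContext(sc)
    import hiveContext._

    sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
    sql("SHOW TABLES").collect().foreach(println)

    sc.stop()
  }
}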
sql/core/src/main/scala/org/apache/spark/sql/parquet/newParquet.scala

Lines changed: 1 addition & 1 deletion
@@ -49,7 +49,7 @@ class DefaultSource extends RelationProvider {
       sqlContext: SQLContext,
       parameters: Map[String, String]): BaseRelation = {
     val path =
-      parameters.getOrElse("path", sys.error("'path' must be specifed for parquet tables."))
+      parameters.getOrElse("path", sys.error("'path' must be specified for parquet tables."))
 
     ParquetRelation2(path)(sqlContext)
   }
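
For context, a sketch of how the 'path' option reaches DefaultSource.createRelation, assuming the Spark 1.2 data sources DDL (CREATE TEMPORARY TABLE ... USING ... OPTIONS); the file path and table name are placeholders:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object ParquetSourceSketch {
  def main(args: Array[String]) {
    val sc = new SparkContext(new SparkConf().setAppName("ParquetSourceSketch"))
    val sqlContext = new SQLContext(sc)

    // The 'path' option below is what createRelation reads from its
    // parameters map; omitting it raises the (now correctly spelled)
    // error "'path' must be specified for parquet tables."
    sqlContext.sql(
      """CREATE TEMPORARY TABLE parquetTable
        |USING org.apache.spark.sql.parquet
        |OPTIONS (path '/path/to/data.parquet')
      """.stripMargin)

    sqlContext.sql("SELECT * FROM parquetTable").collect().foreach(println)

    sc.stop()
  }
}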
