Commit b0ffbf6

[MINOR] remove spark.sql.dialect from doc and test
1 parent 6ca990f commit b0ffbf6

3 files changed (+1, -14 lines)

docs/sql-programming-guide.md

Lines changed: 0 additions & 7 deletions
@@ -122,13 +122,6 @@ Spark build. If these dependencies are not a problem for your application then u
 is recommended for the 1.3 release of Spark. Future releases will focus on bringing `SQLContext` up
 to feature parity with a `HiveContext`.
 
-The specific variant of SQL that is used to parse queries can also be selected using the
-`spark.sql.dialect` option. This parameter can be changed using either the `setConf` method on
-a `SQLContext` or by using a `SET key=value` command in SQL. For a `SQLContext`, the only dialect
-available is "sql" which uses a simple SQL parser provided by Spark SQL. In a `HiveContext`, the
-default is "hiveql", though "sql" is also available. Since the HiveQL parser is much more complete,
-this is recommended for most use cases.
-
 
 ## Creating DataFrames
 
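
For reference, the paragraph removed above described selecting the parser with the `spark.sql.dialect` option, either through `setConf` on a context or through a `SET key=value` statement in SQL. A minimal sketch of that pre-removal usage (Spark 1.x-era API; `sc` is an assumed, pre-existing SparkContext and is not part of this commit):

// Sketch only: illustrates the usage described by the removed documentation.
// Assumes `sc` is an existing org.apache.spark.SparkContext.
import org.apache.spark.sql.hive.HiveContext

val hiveContext = new HiveContext(sc)

// Change the dialect through the configuration API...
hiveContext.setConf("spark.sql.dialect", "sql")

// ...or through a SET command issued as SQL.
hiveContext.sql("SET spark.sql.dialect=hiveql")
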
sql/core/src/main/scala/org/apache/spark/sql/SQLContext.scala

Lines changed: 1 addition & 1 deletion
@@ -79,7 +79,7 @@ class SQLContext private[sql](
   def this(sparkContext: JavaSparkContext) = this(sparkContext.sc)
 
   // If spark.sql.allowMultipleContexts is true, we will throw an exception if a user
-  // wants to create a new root SQLContext (a SLQContext that is not created by newSession).
+  // wants to create a new root SQLContext (a SQLContext that is not created by newSession).
   private val allowMultipleContexts =
     sparkContext.conf.getBoolean(
       SQLConf.ALLOW_MULTIPLE_CONTEXTS.key,
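
The comment corrected in this hunk distinguishes a root SQLContext from a context obtained via `newSession()`. As a rough illustration of that distinction (a sketch against the Spark 1.6-era API; `sc` is an assumed, pre-existing SparkContext, not part of this commit):

// Sketch only: root context vs. session derived with newSession().
// Assumes `sc` is an existing org.apache.spark.SparkContext.
import org.apache.spark.sql.SQLContext

// A "root" SQLContext in the sense of the comment above: constructed directly,
// not obtained through newSession(). This is what the allowMultipleContexts
// check applies to.
val rootContext = new SQLContext(sc)

// A context obtained through newSession() shares the underlying SparkContext
// but carries its own SQL configuration, so it is not a root SQLContext.
val session = rootContext.newSession()
session.setConf("spark.sql.shuffle.partitions", "4")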

sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveQuerySuite.scala

Lines changed: 0 additions & 6 deletions
@@ -270,12 +270,6 @@ class HiveQuerySuite extends HiveComparisonTest with BeforeAndAfter {
     "SELECT 11 % 10, IF((101.1 % 100.0) BETWEEN 1.01 AND 1.11, \"true\", \"false\"), " +
     "(101 / 2) % 10 FROM src LIMIT 1")
 
-  test("Query expressed in SQL") {
-    setConf("spark.sql.dialect", "sql")
-    assert(sql("SELECT 1").collect() === Array(Row(1)))
-    setConf("spark.sql.dialect", "hiveql")
-  }
-
   test("Query expressed in HiveQL") {
     sql("FROM src SELECT key").collect()
   }
