
Commit db0ddce

dongjoon-hyun authored and srowen committed
[SPARK-19775][SQL] Remove an obsolete partitionBy().insertInto() test case
## What changes were proposed in this pull request?

This issue removes [a test case](https://github.com/apache/spark/blame/master/sql/hive/src/test/scala/org/apache/spark/sql/hive/InsertIntoHiveTableSuite.scala#L287-L298) which was introduced by [SPARK-14459](652bbb1) and was superseded by [SPARK-16033](https://github.com/apache/spark/blame/master/sql/hive/src/test/scala/org/apache/spark/sql/hive/InsertIntoHiveTableSuite.scala#L365-L371). Basically, we cannot use `partitionBy` and `insertInto` together.

```scala
test("Reject partitioning that does not match table") {
  withSQLConf(("hive.exec.dynamic.partition.mode", "nonstrict")) {
    sql("CREATE TABLE partitioned (id bigint, data string) PARTITIONED BY (part string)")
    val data = (1 to 10).map(i => (i, s"data-$i", if ((i % 2) == 0) "even" else "odd"))
      .toDF("id", "data", "part")

    intercept[AnalysisException] {
      // cannot partition by 2 fields when there is only one in the table definition
      data.write.partitionBy("part", "data").insertInto("partitioned")
    }
  }
}
```

## How was this patch tested?

This only removes a test case. Pass the existing Jenkins tests.

Author: Dongjoon Hyun <[email protected]>

Closes #17106 from dongjoon-hyun/SPARK-19775.
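The superseding behavior from SPARK-16033 makes `insertInto` reject a preceding `partitionBy` outright, so the removed test's column-mismatch scenario can no longer be reached. Below is a minimal, self-contained sketch of that guard; the class name `SketchDataFrameWriter` is hypothetical, and a plain `IllegalArgumentException` stands in for Spark's `AnalysisException` so the snippet runs without Spark on the classpath.

```scala
// Hedged sketch of the guard SPARK-16033 added to DataFrameWriter.insertInto.
// Names are simplified; real Spark throws AnalysisException with a similar message.
class SketchDataFrameWriter {
  private var partitioningColumns: Option[Seq[String]] = None

  def partitionBy(colNames: String*): this.type = {
    partitioningColumns = Some(colNames)
    this
  }

  def insertInto(tableName: String): Unit = {
    // Partition columns come from the target table's own definition, so
    // combining partitionBy() with insertInto() is rejected before any write.
    if (partitioningColumns.isDefined) {
      throw new IllegalArgumentException(
        "insertInto() can't be used together with partitionBy(). " +
        "Partition columns have already been defined for the table.")
    }
    // actual insertion logic elided
  }
}
```

With such a guard, `writer.partitionBy("part", "data").insertInto("partitioned")` fails regardless of whether the column list matches the table, which is why the test asserting a mismatch-specific failure became obsolete.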
1 parent 2ff1467 commit db0ddce

File tree

1 file changed

+0
-13
lines changed


sql/hive/src/test/scala/org/apache/spark/sql/hive/InsertIntoHiveTableSuite.scala

Lines changed: 0 additions & 13 deletions
```diff
@@ -284,19 +284,6 @@ class InsertIntoHiveTableSuite extends QueryTest with TestHiveSingleton with Bef
     sql("DROP TABLE hiveTableWithStructValue")
   }

-  test("Reject partitioning that does not match table") {
-    withSQLConf(("hive.exec.dynamic.partition.mode", "nonstrict")) {
-      sql("CREATE TABLE partitioned (id bigint, data string) PARTITIONED BY (part string)")
-      val data = (1 to 10).map(i => (i, s"data-$i", if ((i % 2) == 0) "even" else "odd"))
-        .toDF("id", "data", "part")
-
-      intercept[AnalysisException] {
-        // cannot partition by 2 fields when there is only one in the table definition
-        data.write.partitionBy("part", "data").insertInto("partitioned")
-      }
-    }
-  }
-
   test("Test partition mode = strict") {
     withSQLConf(("hive.exec.dynamic.partition.mode", "strict")) {
       sql("CREATE TABLE partitioned (id bigint, data string) PARTITIONED BY (part string)")
```
0 commit comments