Commit a1eea37

sarutak authored and dongjoon-hyun committed
[SPARK-33850][SQL][FOLLOWUP] Improve and cleanup the test code
### What changes were proposed in this pull request?

This PR mainly improves and cleans up the test code introduced in #30855 based on the review comment. The test code was originally taken from another test, `explain formatted - check presence of subquery in case of DPP`, so this PR cleans up that code too (removed an unnecessary `withTable`).

### Why are the changes needed?

To keep the test code clean.

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

`ExplainSuite` passes.

Closes #30861 from sarutak/followup-SPARK-33850.

Authored-by: Kousuke Saruta <[email protected]>
Signed-off-by: Takeshi Yamamuro <[email protected]>
(cherry picked from commit 3c8be39)
Signed-off-by: Dongjoon Hyun <[email protected]>
1 parent 9c946c3 commit a1eea37

File tree: 1 file changed (+9, -16 lines)


sql/core/src/test/scala/org/apache/spark/sql/ExplainSuite.scala

Lines changed: 9 additions & 16 deletions
@@ -249,7 +249,6 @@ class ExplainSuite extends ExplainSuiteHelper with DisableAdaptiveExecutionSuite
     withSQLConf(SQLConf.DYNAMIC_PARTITION_PRUNING_ENABLED.key -> "true",
       SQLConf.DYNAMIC_PARTITION_PRUNING_REUSE_BROADCAST_ONLY.key -> "false",
       SQLConf.EXCHANGE_REUSE_ENABLED.key -> "false") {
-      withTable("df1", "df2") {
       spark.range(1000).select(col("id"), col("id").as("k"))
         .write
         .partitionBy("k")
@@ -289,27 +288,21 @@ class ExplainSuite extends ExplainSuiteHelper with DisableAdaptiveExecutionSuite
           assert(expected_pattern4.r.findAllMatchIn(normalizedOutput).length == 1)
         }
       }
-      }
     }
   }

   test("SPARK-33850: explain formatted - check presence of subquery in case of AQE") {
-    withTable("df1") {
-      withSQLConf(SQLConf.ADAPTIVE_EXECUTION_ENABLED.key -> "true") {
-        withTable("df1") {
-          spark.range(1, 100)
-            .write
-            .format("parquet")
-            .mode("overwrite")
-            .saveAsTable("df1")
+    withSQLConf(SQLConf.ADAPTIVE_EXECUTION_ENABLED.key -> "true") {
+      withTempView("df") {
+        val df = spark.range(1, 100)
+        df.createTempView("df")

-          val sqlText = "EXPLAIN FORMATTED SELECT (SELECT min(id) FROM df1) as v"
-          val expected_pattern1 =
-            "Subquery:1 Hosting operator id = 2 Hosting Expression = Subquery subquery#x"
+        val sqlText = "EXPLAIN FORMATTED SELECT (SELECT min(id) FROM df) as v"
+        val expected_pattern =
+          "Subquery:1 Hosting operator id = 2 Hosting Expression = Subquery subquery#x"

-          withNormalizedExplain(sqlText) { normalizedOutput =>
-            assert(expected_pattern1.r.findAllMatchIn(normalizedOutput).length == 1)
-          }
+        withNormalizedExplain(sqlText) { normalizedOutput =>
+          assert(expected_pattern.r.findAllMatchIn(normalizedOutput).length == 1)
         }
       }
     }
   }
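The cleaned-up test leans on the `withTempView` helper from Spark's `SQLTestUtils`, which guarantees the named view is dropped after the test body runs, even if an assertion fails. A minimal, self-contained sketch of that loan pattern follows; the `registry` map and these helper bodies are illustrative stand-ins, not Spark's actual internals:

```scala
import scala.collection.mutable

object TempViewSketch {
  // Stand-in for the session catalog's temp-view registry (hypothetical).
  val registry = mutable.Map.empty[String, Seq[Long]]

  def createTempView(name: String, data: Seq[Long]): Unit =
    registry(name) = data

  // Loan pattern: run the body, then drop the views even if the body throws.
  def withTempView(names: String*)(body: => Unit): Unit =
    try body
    finally names.foreach(registry.remove)

  def main(args: Array[String]): Unit = {
    withTempView("df") {
      createTempView("df", 1L until 100L)
      // Mirrors what EXPLAIN's subquery computes: SELECT min(id) FROM df
      assert(registry("df").min == 1L)
    }
    // The view is gone once the block exits, so tests stay isolated.
    assert(!registry.contains("df"))
  }
}
```

Compared with the removed `withTable`/`saveAsTable` version, a temp view needs no filesystem write and no cleanup of persisted tables, which is why the follow-up prefers it for this test.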
