
Commit b0f30b5

Greg Owen authored and gatorsmile committed
[SPARK-22120][SQL] TestHiveSparkSession.reset() should clean out Hive warehouse directory
## What changes were proposed in this pull request?

During TestHiveSparkSession.reset(), which is called after each TestHiveSingleton suite, we now delete and recreate the Hive warehouse directory.

## How was this patch tested?

Ran the full suite of tests locally and verified that they pass.

Author: Greg Owen <[email protected]>

Closes #19341 from GregOwen/SPARK-22120.

(cherry picked from commit ce20478)
Signed-off-by: gatorsmile <[email protected]>
1 parent 9836ea1 commit b0f30b5

File tree

1 file changed (+6 / -0 lines)


sql/hive/src/main/scala/org/apache/spark/sql/hive/test/TestHive.scala

Lines changed: 6 additions & 0 deletions
```diff
@@ -18,6 +18,7 @@
 package org.apache.spark.sql.hive.test

 import java.io.File
+import java.net.URI
 import java.util.{Set => JavaSet}

 import scala.collection.JavaConverters._
@@ -486,6 +487,11 @@ private[hive] class TestHiveSparkSession(
       }
     }

+    // Clean out the Hive warehouse between each suite
+    val warehouseDir = new File(new URI(sparkContext.conf.get("spark.sql.warehouse.dir")).getPath)
+    Utils.deleteRecursively(warehouseDir)
+    warehouseDir.mkdir()
+
     sharedState.cacheManager.clearCache()
     loadedTables.clear()
     sessionState.catalog.reset()
```
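The three added lines can be exercised in isolation. Below is a minimal, self-contained sketch of the cleanup step, with a plain recursive delete standing in for Spark's internal `Utils.deleteRecursively`; the names `WarehouseResetSketch`, `resetWarehouse`, and `stale_table` are hypothetical and not part of the patch:

```scala
import java.io.File
import java.net.URI
import java.nio.file.Files

// Hypothetical standalone sketch of the cleanup added to reset():
// parse the configured warehouse location as a URI, delete the
// directory tree, then recreate it empty.
object WarehouseResetSketch {
  // Stand-in for Spark's Utils.deleteRecursively: delete children
  // depth-first, then the directory (or file) itself.
  def deleteRecursively(f: File): Unit = {
    if (f.isDirectory) f.listFiles().foreach(deleteRecursively)
    f.delete()
  }

  def resetWarehouse(warehouseUri: String): File = {
    // Same parsing as the patch: take the filesystem path of the URI.
    val warehouseDir = new File(new URI(warehouseUri).getPath)
    if (warehouseDir.exists()) deleteRecursively(warehouseDir)
    warehouseDir.mkdir()
    warehouseDir
  }

  def main(args: Array[String]): Unit = {
    // Simulate a warehouse left over from a previous test suite.
    val tmp = Files.createTempDirectory("spark-warehouse").toFile
    new File(tmp, "stale_table").mkdir()
    val cleaned = resetWarehouse(tmp.toURI.toString)
    println(cleaned.exists() && cleaned.list().isEmpty)  // true
  }
}
```

Recreating the empty directory after deletion matters: subsequent suites expect `spark.sql.warehouse.dir` to exist so that new managed tables can be written under it.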
