Commit 0b3e7cc

jsnowacki authored and HyukjinKwon committed
[SPARK-18136] Fix SPARK_JARS_DIR for Python pip install on Windows
## What changes were proposed in this pull request?

Fix the setup of `SPARK_JARS_DIR` on Windows: the script checks for the `%SPARK_HOME%\RELEASE` file instead of the `%SPARK_HOME%\jars` directory, as it should. The RELEASE file is not included in the `pip` build of PySpark, so the check fails for pip installs.

## How was this patch tested?

Local install of PySpark on Anaconda 4.4.0 (Python 3.6.1).

Author: Jakub Nowacki <[email protected]>

Closes #19310 from jsnowacki/master.

(cherry picked from commit c11f24a)

Signed-off-by: hyukjinkwon <[email protected]>
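The jars-directory resolution that `spark-class2.cmd` performs after this fix can be sketched in Python. This is an illustrative sketch only, not part of Spark; the function name and the `scala_version` default are assumptions for the example.

```python
import os

def find_spark_jars_dir(spark_home, scala_version="2.11"):
    """Sketch of the post-fix logic in spark-class2.cmd (illustrative only)."""
    # A pip-installed PySpark ships a jars\ directory but no RELEASE file,
    # so the script must test for the jars\ directory directly.
    jars = os.path.join(spark_home, "jars")
    if os.path.isdir(jars):
        return jars
    # Otherwise fall back to the in-tree build layout.
    return os.path.join(spark_home, "assembly", "target",
                        "scala-" + scala_version, "jars")
```

Before the fix, a pip install (which has `jars\` but no `RELEASE`) fell through to the `assembly\target\...` branch and failed to find its jars.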
1 parent 03db721 commit 0b3e7cc

File tree

1 file changed: +1 addition, -1 deletion

bin/spark-class2.cmd

Lines changed: 1 addition & 1 deletion
@@ -29,7 +29,7 @@ if "x%1"=="x" (
 )

 rem Find Spark jars.
-if exist "%SPARK_HOME%\RELEASE" (
+if exist "%SPARK_HOME%\jars" (
   set SPARK_JARS_DIR="%SPARK_HOME%\jars"
 ) else (
   set SPARK_JARS_DIR="%SPARK_HOME%\assembly\target\scala-%SPARK_SCALA_VERSION%\jars"
