[SPARK-26076][Build][Minor] Revise ambiguous error message from load-spark-env.sh #23049
Conversation
bin/load-spark-env.sh (Outdated)

```
@@ -47,8 +47,8 @@ if [ -z "$SPARK_SCALA_VERSION" ]; then
   ASSEMBLY_DIR1="${SPARK_HOME}/assembly/target/scala-2.12"

   if [[ -d "$ASSEMBLY_DIR2" && -d "$ASSEMBLY_DIR1" ]]; then
-    echo -e "Presence of build for multiple Scala versions detected." 1>&2
-    echo -e 'Either clean one of them or, export SPARK_SCALA_VERSION in spark-env.sh.' 1>&2
+    echo -e "Presence of build for both scala versions(SCALA 2.11 and SCALA 2.12) detected." 1>&2
```
How about something like "Multiple Scala versions detected ($ASSEMBLY_DIR1 and $ASSEMBLY_DIR2). Remove one, or export SPARK_SCALA_VERSION=(version) in load-spark-env.sh"
That adds more info, and it may mean less maintenance as we add or drop Scala versions.
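A minimal sketch of how that wording could be dropped into the existing check (an illustration only, not the merged change; `spark-env.sh` is used as the place to set the variable, per the follow-up comments below, and the trailing `exit 1` is assumed from the surrounding script):

```
# Illustration of the suggested wording; not the merged change.
if [[ -d "$ASSEMBLY_DIR1" && -d "$ASSEMBLY_DIR2" ]]; then
  echo "Multiple Scala versions detected ($ASSEMBLY_DIR1 and $ASSEMBLY_DIR2)." 1>&2
  echo "Remove one of them, or export SPARK_SCALA_VERSION=<version> in spark-env.sh." 1>&2
  exit 1
fi
```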
Makes sense. Let me make a quick update.
Test build #98879 has finished for PR 23049 at commit
Test build #98876 has finished for PR 23049 at commit
bin/load-spark-env.sh (Outdated)

```
ASSEMBLY_DIR1="${SPARK_HOME}/assembly/target/scala-${SCALA_VERSION1}"
ASSEMBLY_DIR2="${SPARK_HOME}/assembly/target/scala-${SCALA_VERSION2}"
if [[ -d "$ASSEMBLY_DIR1" && -d "$ASSEMBLY_DIR2" ]]; then
  echo -e "Presence of build for multiple Scala versions detected($ASSEMBLY_DIR1 and $ASSEMBLY_DIR2)." 1>&2
```
Nit: space after 'detected'. I think you can put this in one string if you like.
The reason why it mentioned `spark-env.sh`: users shouldn't modify `load-spark-env.sh`.
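For context, the relationship between the two files looks roughly like this (a simplified paraphrase of `load-spark-env.sh`, not an exact quote of the script in this PR):

```
# Simplified paraphrase of load-spark-env.sh for context; not the exact upstream code.
if [ -z "$SPARK_ENV_LOADED" ]; then
  export SPARK_ENV_LOADED=1
  export SPARK_CONF_DIR="${SPARK_CONF_DIR:-"${SPARK_HOME}"/conf}"
  if [ -f "${SPARK_CONF_DIR}/spark-env.sh" ]; then
    # Users export SPARK_SCALA_VERSION here (conf/spark-env.sh) rather than
    # editing load-spark-env.sh itself, which ships with the distribution.
    set -a
    . "${SPARK_CONF_DIR}/spark-env.sh"
    set +a
  fi
fi
```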
Hi @vanzin,
Test build #98899 has finished for PR 23049 at commit
Test build #98900 has finished for PR 23049 at commit
retest this please.
Test build #98908 has finished for PR 23049 at commit
I'm not sure this is actually helping much or making things less confusing. If it's triggered in a subsequent invocation of this script, where the file has already been loaded, it will just say
@vanzin I see your point. I will add a link to https://spark.apache.org/docs/latest/configuration.html. Thanks for the suggestion. In my case, I didn't know where to find or edit `spark-env.sh`. In addition, the new message shows what the value of `SPARK_SCALA_VERSION` should be.
Test build #98954 has finished for PR 23049 at commit
retest this please.
Test build #98957 has finished for PR 23049 at commit
Merged to master.
[SPARK-26076][Build][Minor] Revise ambiguous error message from load-spark-env.sh

## What changes were proposed in this pull request?

When I try to run scripts (e.g. `start-master.sh`/`start-history-server.sh`) in the latest master, I get an error like this:
```
Presence of build for multiple Scala versions detected.
Either clean one of them or, export SPARK_SCALA_VERSION in spark-env.sh.
```
The error message is quite confusing. Without reading `load-spark-env.sh`, I didn't know which directory to remove, or where to find and edit `spark-env.sh`.

This PR makes the error message clearer. It also changes the script so that less maintenance is needed when we add or drop Scala versions in the future. With apache#22967, we can revise the error message as follows (in my local setup):
```
Presence of build for multiple Scala versions detected (/Users/gengliangwang/IdeaProjects/spark/assembly/target/scala-2.12 and /Users/gengliangwang/IdeaProjects/spark/assembly/target/scala-2.11).
Remove one of them or, export SPARK_SCALA_VERSION=2.12 in /Users/gengliangwang/IdeaProjects/spark/conf/spark-env.sh.
Visit https://spark.apache.org/docs/latest/configuration.html#environment-variables for more details about setting environment variables in spark-env.sh.
```

## How was this patch tested?

Manual test.

Closes apache#23049 from gengliangwang/reviseEnvScript.

Authored-by: Gengliang Wang <[email protected]>
Signed-off-by: Sean Owen <[email protected]>
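For reference, a sketch of what the revised check in `bin/load-spark-env.sh` could look like in order to produce a message of that shape (reconstructed from the quoted output and the review snippets above; it is not necessarily identical to the merged code):

```
# Sketch reconstructed from the quoted output; may differ from the merged code.
SCALA_VERSION1=2.12
SCALA_VERSION2=2.11
ASSEMBLY_DIR1="${SPARK_HOME}/assembly/target/scala-${SCALA_VERSION1}"
ASSEMBLY_DIR2="${SPARK_HOME}/assembly/target/scala-${SCALA_VERSION2}"
ENV_VARIABLE_DOC="https://spark.apache.org/docs/latest/configuration.html#environment-variables"
if [[ -d "$ASSEMBLY_DIR1" && -d "$ASSEMBLY_DIR2" ]]; then
  echo "Presence of build for multiple Scala versions detected ($ASSEMBLY_DIR1 and $ASSEMBLY_DIR2)." 1>&2
  echo "Remove one of them or, export SPARK_SCALA_VERSION=${SCALA_VERSION1} in spark-env.sh." 1>&2
  echo "Visit ${ENV_VARIABLE_DOC} for more details about setting environment variables in spark-env.sh." 1>&2
  exit 1
fi
```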