
Commit bf720ef

jkbradley authored and rxin committed
[docs] Fix outdated comment in tuning guide

When you use the SPARK_JAVA_OPTS env variable, Spark complains:

```
SPARK_JAVA_OPTS was detected (set to ' -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps ').
This is deprecated in Spark 1.0+.
Please instead use:
 - ./spark-submit with conf/spark-defaults.conf to set defaults for an application
 - ./spark-submit with --driver-java-options to set -X options for a driver
 - spark.executor.extraJavaOptions to set -X options for executors
 - SPARK_DAEMON_JAVA_OPTS to set java options for standalone daemons (master or worker)
```

This updates the docs to redirect the user to the relevant part of the configuration docs.

CC: mengxr but please CC someone else as needed

Author: Joseph K. Bradley <[email protected]>

Closes #3592 from jkbradley/tuning-doc and squashes the following commits:

0760ce1 [Joseph K. Bradley] fixed outdated comment in tuning guide

(cherry picked from commit 529439b)

Signed-off-by: Reynold Xin <[email protected]>
1 parent dec838b · commit bf720ef

File tree

1 file changed: +1 addition, −2 deletions

docs/tuning.md

Lines changed: 1 addition & 2 deletions

```diff
@@ -143,8 +143,7 @@ the space allocated to the RDD cache to mitigate this.
 **Measuring the Impact of GC**
 
 The first step in GC tuning is to collect statistics on how frequently garbage collection occurs and the amount of
-time spent GC. This can be done by adding `-verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps` to your
-`SPARK_JAVA_OPTS` environment variable. Next time your Spark job is run, you will see messages printed in the worker's logs
+time spent GC. This can be done by adding `-verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps` to the Java options. (See the [configuration guide](configuration.html#Dynamically-Loading-Spark-Properties) for info on passing Java options to Spark jobs.) Next time your Spark job is run, you will see messages printed in the worker's logs
 each time a garbage collection occurs. Note these logs will be on your cluster's worker nodes (in the `stdout` files in
 their work directories), *not* on your driver program.
```
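The deprecation warning quoted in the commit message lists the replacements for `SPARK_JAVA_OPTS`. A minimal sketch of how those GC-logging flags might be passed using the properties named in that warning (`spark.executor.extraJavaOptions` and `--driver-java-options`); the application class and jar path here are placeholders, not from the source:

```shell
# conf/spark-defaults.conf — set GC logging as an application default:
#   spark.executor.extraJavaOptions  -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps

# Or pass the same flags per submission; MyApp/my-app.jar are hypothetical:
./bin/spark-submit \
  --conf "spark.executor.extraJavaOptions=-verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps" \
  --driver-java-options "-verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps" \
  --class com.example.MyApp \
  my-app.jar
```

With these set, the GC messages land in each executor's `stdout` file under the worker's work directory, as the patched paragraph notes.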
