docs/tuning.md (1 addition, 2 deletions)
@@ -143,8 +143,7 @@ the space allocated to the RDD cache to mitigate this.
 **Measuring the Impact of GC**
 
 The first step in GC tuning is to collect statistics on how frequently garbage collection occurs and the amount of
-time spent GC. This can be done by adding `-verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps` to your
-`SPARK_JAVA_OPTS` environment variable. Next time your Spark job is run, you will see messages printed in the worker's logs
+time spent GC. This can be done by adding `-verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps` to the Java options. (See the [configuration guide](configuration.html#Dynamically-Loading-Spark-Properties) for info on passing Java options to Spark jobs.) Next time your Spark job is run, you will see messages printed in the worker's logs
 each time a garbage collection occurs. Note these logs will be on your cluster's worker nodes (in the `stdout` files in
 their work directories), *not* on your driver program.
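The changed text points readers to the configuration guide for passing Java options to a job. As a rough sketch (not part of this patch), one way to supply the GC logging flags to executors is through the `spark.executor.extraJavaOptions` property on a `SparkConf`; the property name and the Scala setup below are assumptions based on the configuration guide, not something this diff prescribes:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch: forward the GC logging flags to executor JVMs via extraJavaOptions.
// GC logs then appear in each executor's stdout in its work directory.
val conf = new SparkConf()
  .setAppName("GCLoggingExample")
  .set("spark.executor.extraJavaOptions",
       "-verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps")

val sc = new SparkContext(conf)
```

The same string could equally be passed on the command line (for example with `--conf spark.executor.extraJavaOptions=...` to `spark-submit`) rather than hard-coded in the application.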