
Commit aa18bb6

Disable use of KryoSerializer in Thrift Server
1 parent 02ac379 commit aa18bb6

2 files changed: +2 −13 lines changed


docs/configuration.md

Lines changed: 2 additions & 3 deletions
@@ -744,7 +744,7 @@ Apart from these, the following properties are also available, and may be useful
 </tr>
 <tr>
   <td><code>spark.kryo.referenceTracking</code></td>
-  <td>true (false when using Spark SQL Thrift Server)</td>
+  <td>true</td>
   <td>
     Whether to track references to the same object when serializing data with Kryo, which is
     necessary if your object graphs have loops and useful for efficiency if they contain multiple
@@ -807,8 +807,7 @@ Apart from these, the following properties are also available, and may be useful
 <tr>
   <td><code>spark.serializer</code></td>
   <td>
-    org.apache.spark.serializer.<br />JavaSerializer (org.apache.spark.serializer.<br />
-    KryoSerializer when using Spark SQL Thrift Server)
+    org.apache.spark.serializer.<br />JavaSerializer
   </td>
   <td>
     Class to use for serializing objects that will be sent over the network or need to be cached
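
With the Thrift Server exception gone from the documented defaults, applications that still want Kryo have to request it explicitly. A minimal sketch using the standard config keys shown above (the app name is illustrative):

import org.apache.spark.SparkConf

// Opt into Kryo explicitly; after this commit the Thrift Server no longer
// sets these properties on the user's behalf.
val conf = new SparkConf()
  .setAppName("kryo-example") // illustrative
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  // referenceTracking defaults to true; disable it only when serialized
  // object graphs are known to be cycle-free, trading safety for speed.
  .set("spark.kryo.referenceTracking", "false")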

sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLEnv.scala

Lines changed: 0 additions & 10 deletions
@@ -19,8 +19,6 @@ package org.apache.spark.sql.hive.thriftserver
 
 import java.io.PrintStream
 
-import scala.collection.JavaConverters._
-
 import org.apache.spark.{SparkConf, SparkContext}
 import org.apache.spark.internal.Logging
 import org.apache.spark.sql.{SparkSession, SQLContext}
@@ -37,8 +35,6 @@ private[hive] object SparkSQLEnv extends Logging {
   def init() {
     if (sqlContext == null) {
       val sparkConf = new SparkConf(loadDefaults = true)
-      val maybeSerializer = sparkConf.getOption("spark.serializer")
-      val maybeKryoReferenceTracking = sparkConf.getOption("spark.kryo.referenceTracking")
       // If user doesn't specify the appName, we want to get [SparkSQL::localHostName] instead of
       // the default appName [SparkSQLCLIDriver] in cli or beeline.
       val maybeAppName = sparkConf
@@ -47,12 +43,6 @@ private[hive] object SparkSQLEnv extends Logging {
 
       sparkConf
         .setAppName(maybeAppName.getOrElse(s"SparkSQL::${Utils.localHostName()}"))
-        .set(
-          "spark.serializer",
-          maybeSerializer.getOrElse("org.apache.spark.serializer.KryoSerializer"))
-        .set(
-          "spark.kryo.referenceTracking",
-          maybeKryoReferenceTracking.getOrElse("false"))
 
       val sparkSession = SparkSession.builder.config(sparkConf).enableHiveSupport().getOrCreate()
       sparkContext = sparkSession.sparkContext
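
After this change, init() simply passes the loaded configuration through, so the Thrift Server picks up whichever serializer the user configured (JavaSerializer by default). A condensed sketch of the resulting path, with an illustrative app-name fallback in place of Spark's internal Utils.localHostName():

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

// Load spark-defaults / --conf values; spark.serializer and
// spark.kryo.referenceTracking are no longer overridden here.
val sparkConf = new SparkConf(loadDefaults = true)
val appName = sparkConf.getOption("spark.app.name").getOrElse("SparkSQL::example-host") // illustrative fallback
sparkConf.setAppName(appName)

// The session now uses whatever serializer settings came in with the conf.
val sparkSession = SparkSession.builder.config(sparkConf).enableHiveSupport().getOrCreate()
val sparkContext = sparkSession.sparkContext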
