[SPARK-3922] Refactor spark-core to use Utils.UTF_8 #2781
Conversation
/cc @rxin, @JoshRosen
Can one of the admins verify this patch?
QA tests have started for PR 2781 at commit
I vote for
QA tests have finished for PR 2781 at commit
Test FAILed.
Like @srowen, I prefer to use existing constants. Also, it might be worth it to import the constant directly and just use (Edit: just fixed typo.)
QA tests have started for PR 2781 at commit
QA tests have finished for PR 2781 at commit
QA tests have started for PR 2781 at commit
Good point. I updated to use
QA tests have finished for PR 2781 at commit
QA tests have started for PR 2781 at commit
QA tests have finished for PR 2781 at commit
retest this please
Jenkins, retest this please.
Test build #22344 has started for PR 2781 at commit
Test build #22344 has finished for PR 2781 at commit
Test PASSed.
LGTM. @JoshRosen any further comment?
This looks good to me. Thanks!
Merging in master. Thanks.
A global UTF8 constant is very helpful for handling encoding problems when converting between String and bytes. There are several solutions here:

1. Add `val UTF_8 = Charset.forName("UTF-8")` to Utils.scala

IMO, I prefer option 1) because people can find it easily.

This is a PR for option 1) and only fixes Spark Core.
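A minimal sketch of option 1), assuming a standalone `Utils` object for illustration (the real Spark `Utils` object contains far more than this constant):

```scala
import java.nio.charset.Charset

// Sketch of option 1): a single shared charset constant.
// Charset.forName("UTF-8") is guaranteed to succeed, since UTF-8
// is a charset every JVM must support.
object Utils {
  val UTF_8: Charset = Charset.forName("UTF-8")
}

// Callers pass the Charset object instead of the string name "UTF-8",
// avoiding the checked UnsupportedEncodingException and typo-prone
// string literals scattered across the codebase.
val bytes: Array[Byte] = "héllo".getBytes(Utils.UTF_8) // String -> bytes
val back: String = new String(bytes, Utils.UTF_8)      // bytes -> String
assert(back == "héllo")
```

The round trip works because both directions use the same `Charset` instance, which is the encoding bug this refactoring guards against.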