
[SPARK-26030][BUILD] Bump previousSparkVersion in MimaBuild.scala to be 2.4.0 #22977


Closed
wants to merge 2 commits

Conversation

Contributor

@cloud-fan cloud-fan commented Nov 8, 2018

What changes were proposed in this pull request?

Since Spark 2.4.0 is already in the Maven repo, we can bump previousSparkVersion in MimaBuild.scala to be 2.4.0.

Note that we seem to have forgotten to do this for branch-2.4, so this PR also updates MimaExcludes.scala.

How was this patch tested?

N/A

Member

@HyukjinKwon HyukjinKwon left a comment


Looks good if tests pass

Contributor Author

cloud-fan commented Nov 8, 2018

I think one major issue is that there is no documentation about how to update MiMa for new releases. Does anyone know the detailed process? It seems we need to update MimaExcludes.scala with something like `lazy val v30excludes = v24excludes ++ Seq(...)` when we cut the new branch, and update MimaBuild.scala when the new release is published.

And there are 2 remaining issues.

  1. The data source v2 changes broke a lot of MiMa rules, while I would expect interfaces marked as "Evolving" not to be tracked by MiMa.
  2. MLlib broke a lot of MiMa rules.

Also cc @JoshRosen @srowen @gatorsmile @shaneknapp @vanzin @holdenk @felixcheung
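As a rough illustration of the two-step process described above, here is a minimal sketch; `MimaExcludesSketch`, `MimaBuildSketch`, and the exclusion contents are illustrative placeholders, not the actual Spark build files.

```scala
import com.typesafe.tools.mima.core._

object MimaExcludesSketch {
  // Exclusion rules accumulated while branch-2.4 was the development branch.
  lazy val v24excludes = Seq(
    ProblemFilters.exclude[DirectMissingMethodProblem](
      "org.apache.spark.deploy.SparkHadoopUtil.nextCredentialRenewalTime")
  )

  // Step 1, when the new branch is cut: start the next version's list,
  // carrying the previous one forward.
  lazy val v30excludes = v24excludes ++ Seq(
    // rules added during 3.0 development go here
  )
}

object MimaBuildSketch {
  // Step 2, once the new release is published to Maven: bump the version
  // that MiMa compares the current build against (the change in this PR).
  val previousSparkVersion = "2.4.0" // was "2.2.0"
}
```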

@HyukjinKwon
Member

Actually, similar changes were being made in #22967, FYI. cc @dbtsai


SparkQA commented Nov 8, 2018

Test build #98596 has finished for PR 22977 at commit a035778.

  • This patch fails MiMa tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@@ -88,7 +88,7 @@ object MimaBuild {

   def mimaSettings(sparkHome: File, projectRef: ProjectRef) = {
     val organization = "org.apache.spark"
-    val previousSparkVersion = "2.2.0"
+    val previousSparkVersion = "2.4.0"
Member

@dongjoon-hyun dongjoon-hyun Nov 8, 2018


According to https://issues.apache.org/jira/browse/SPARK-23070, 2.3.0 seems to be the correct version in this PR (if this is a part of the 2.4.0 release process).

cc @gatorsmile .

Contributor Author

@cloud-fan cloud-fan Nov 8, 2018


To my understanding, we should have changed it to 2.3.0 when 2.3.0 was released.

Maybe we should send another PR to branch-2.4 and change it to 2.3.0.

Member


I believe @cloud-fan is right here. If master is 3.0, the previous version is 2.4.0 in master, and analogously for branch-2.4. Yes, it sounds like this should be a step in the release process somewhere.

Member


Thanks, @cloud-fan and @srowen . +1 for another PR for branch-2.4.

ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.expressions.Window.rangeBetween"),
ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.expressions.WindowSpec.rangeBetween"),
// [SPARK-23781][CORE] Merge token renewer functionality into HadoopDelegationTokenManager
ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.deploy.SparkHadoopUtil.nextCredentialRenewalTime"),
Contributor Author


This is actually a private method; I'm not sure why MiMa tracks it.

Member


It was a private[spark] method, which has to be compiled as a public method in the Java bytecode; MiMa doesn't know the difference. Yes, it can be ignored for sure.

So these failures are 'new' because the methods were introduced after Spark 2.2 (hence not missing relative to 2.2) and then removed, so bumping the previous version causes them to fail now? OK.
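A minimal sketch of what is going on here, assuming a hypothetical `SparkHadoopUtilSketch` object (the real SparkHadoopUtil is more involved): `private[spark]` restricts access only at the Scala level, but the compiled JVM method is public, which is what MiMa inspects, so removing such a method still needs an explicit exclusion.

```scala
import com.typesafe.tools.mima.core._

package org.apache.spark.deploy {
  object SparkHadoopUtilSketch {
    // Package-private in Scala, yet compiled to a public JVM method,
    // so MiMa flags its removal as a DirectMissingMethodProblem.
    private[spark] def nextCredentialRenewalTime(): Long = 0L
  }
}

// The corresponding filter, matching the entry shown in the diff above.
object ExclusionSketch {
  val rule = ProblemFilters.exclude[DirectMissingMethodProblem](
    "org.apache.spark.deploy.SparkHadoopUtil.nextCredentialRenewalTime")
}
```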


ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.expressions.WindowSpec.rangeBetween"),
// [SPARK-23781][CORE] Merge token renewer functionality into HadoopDelegationTokenManager
ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.deploy.SparkHadoopUtil.nextCredentialRenewalTime"),
// [SPARK-25908][CORE][SQL] Remove old deprecated items in Spark 3
Member


@cloud-fan really these two lines should move to the top, because the first long chunk of exclusions is for 25908.

Member

felixcheung commented Nov 10, 2018 via email

Member

srowen commented Nov 10, 2018

HiveExternalCatalogVersionsSuite? That's already updated.

Member

felixcheung commented Nov 11, 2018

Right, I mean both this and that should be part of the post-release process, regarding what @cloud-fan asks in #22977 (comment).

@cloud-fan cloud-fan changed the title [BUILD] Bump previousSparkVersion in MimaBuild.scala to be 2.4.0 [SPARK-26030][BUILD] Bump previousSparkVersion in MimaBuild.scala to be 2.4.0 Nov 13, 2018
-    ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.ml.classification.LabelConverter")
+    ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.ml.classification.LabelConverter"),

// [SPARK-21842][MESOS] Support Kerberos ticket renewal and creation in Mesos
Contributor Author


these changes are cherry-picked from #23015


SparkQA commented Nov 13, 2018

Test build #98746 has finished for PR 22977 at commit 8b9efe1.

  • This patch fails MiMa tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

…a to be 2.3.0

## What changes were proposed in this pull request?

Although it's a little late, we should still update MiMa for branch-2.4, to avoid future breaking changes.

Note that, when merging, we should forward-port it to the master branch, so that the exclusion rules are still in `v24excludes`.

TODO: update the release process document to mention the MiMa update.

## How was this patch tested?

N/A

Closes apache#23015 from cloud-fan/mima-2.4.

Authored-by: Wenchen Fan <[email protected]>
Signed-off-by: Wenchen Fan <[email protected]>

SparkQA commented Nov 13, 2018

Test build #98750 has finished for PR 22977 at commit 8d9f5c7.

  • This patch fails MiMa tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

Member

dbtsai commented Nov 13, 2018

LGTM. Thanks!

@cloud-fan
Contributor Author

Since this PR only touches MiMa, and Jenkins has already passed the MiMa check, I'm going to merge it to master. Thanks!

@asfgit asfgit closed this in e25bce5 Nov 13, 2018

SparkQA commented Nov 13, 2018

Test build #98753 has finished for PR 22977 at commit 802b521.

  • This patch fails due to an unknown error code, -9.
  • This patch merges cleanly.
  • This patch adds no public classes.

jackylee-ch pushed a commit to jackylee-ch/spark that referenced this pull request Feb 18, 2019
…be 2.4.0

## What changes were proposed in this pull request?

Since Spark 2.4.0 is already in the Maven repo, we can bump previousSparkVersion in MimaBuild.scala to be 2.4.0.

Note that we seem to have forgotten to do this for branch-2.4, so this PR also updates MimaExcludes.scala.

## How was this patch tested?

N/A

Closes apache#22977 from cloud-fan/mima.

Authored-by: Wenchen Fan <[email protected]>
Signed-off-by: Wenchen Fan <[email protected]>