Conversation

@davidradl (Collaborator) commented Aug 20, 2025

Description

First commit for Flink. It changes the pom in line with the standard Flink connector, including:

  • Flink checkstyle
  • Flink spotless
  • standardized the name of the Maven artifact
  • amended the packages to follow Flink standards
  • added the Apache license to pom files and Java files
  • changed the unit tests away from JaCoCo to use Surefire
  • removed JetBrains annotations
  • removed the use of a Google annotation it was complaining about
  • removed the use of ImmutableList, which it did not like
  • separated the pom into a parent, code, and SQL shading pom, in line with Kafka
  • license checker passes
  • notices: added the top-level NOTICE, and added a NOTICE in the SQL project META-INF to declare the dependencies - this is the same approach as the Flink Kafka connector

Resolves HTTP155
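The parent/code/SQL-shading split described above mirrors the Flink Kafka connector layout. As a rough sketch of what the parent pom's module list looks like under that convention (the module names here are illustrative, following the usual flink-connector-* / flink-sql-connector-* naming, and may not match this PR exactly):

```xml
<!-- Hypothetical parent pom module list, mirroring the Flink Kafka connector layout. -->
<!-- Module names are illustrative; the real names in this PR may differ. -->
<modules>
  <!-- the connector code itself -->
  <module>flink-connector-http</module>
  <!-- the shaded "SQL" jar that bundles the connector and its dependencies -->
  <module>flink-sql-connector-http</module>
</modules>
```

The SQL module is the one that carries the shade plugin configuration and the META-INF NOTICE declaring the bundled dependencies, as in the Kafka connector.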
PR Checklist

I am looking to get this code into Flink with the commit history as-is.

There are still things to do:

  • migrate the tests away from Mockito, to avoid the banned dependency
  • test the DataStream API
  • write the docs
  • consider upgrading WireMock, which is very back-level
  • amend the config option names to be Flink-style, with gid aliases
  • amend the prefix to not include gid, and move to Flink
  • the parent connector version is 1.0.0, which is the same as the Kafka connector; 1.1.0 did not work, but we should look into upgrading to it

I think it makes sense to get this initial commit in to push the commit history over, so that all subsequent contributions will count as Flink contributions.

@davidradl (Collaborator Author)

fyi @grzegorz8

@davidradl (Collaborator Author) commented Aug 21, 2025

@grzegorz8 I have split the poms out, so there is now a parent, the code, and SQL (with the shading).

Accumulating a list of tasks I need to do before offering to Flink:

@davidradl davidradl marked this pull request as draft August 21, 2025 15:54
@davidradl (Collaborator Author)

I did not see these build breaks locally. Changing to draft while I address them.

@davidradl (Collaborator Author)

@grzegorz8 thanks for your valuable feedback - I have addressed what you raised. I can now see the build issues again (which I do not see locally), so I can address them before taking the PR out of draft. I am aware that Flink is looking to support Java 17 and 21 as well as 11; Java 21 was failing prior to the pom split.

@davidradl (Collaborator Author) commented Aug 22, 2025

I have fixed the convergence locally.
I am seeing that if I include flink-table-test-utils in the parent and http poms, then the JUnit tests work locally. But I get an unused dependency warning, which I can ignore in the http pom; the CI yml, however, does not allow me to ignore this dependency.
If I remove this dependency, the unit tests fail because they cannot find an ExecutorFactory. I assume that the presence of the plugin means that its content is on the classpath for the test, but this is a runtime dependency, not a compile-time one.

Without the plugin I see errors like:
[ERROR] org.apache.flink.connector.http.internal.sink.HttpSinkConnectionTest.testServerErrorConnection Time elapsed: 0.014 s <<< ERROR!
java.lang.IllegalStateException: No ExecutorFactory found to execute the application.
at org.apache.flink.core.execution.DefaultExecutorServiceLoader.getExecutorFactory(DefaultExecutorServiceLoader.java:88)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.getPipelineExecutor(StreamExecutionEnvironment.java:2995)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.executeAsync(StreamExecutionEnvironment.java:2469)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:2351)
at org.apache.flink.streaming.api.environment.LocalStreamEnvironment.execute(LocalStreamEnvironment.java:68)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:2325)
at org.apache.flink.connector.http.internal.sink.HttpSinkConnectionTest.testServerErrorConnection(HttpSinkConnectionTest.java:268)
at java.base/java.lang.reflect.Method.invoke(Method.java:572)
at java.base/java.util.ArrayList.forEach(ArrayList.java:1541)
at java.base/java.util.ArrayList.forEach(ArrayList.java:1541)

Still investigating - any thoughts?
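For what it's worth, the missing ExecutorFactory is typically the service-loader factory supplied by flink-clients at run time, so dependency:analyze flags the dependency as "unused" even though the tests need it. A common pattern (a sketch under that assumption, not necessarily the right fix here) is to declare it with test scope and tell the dependency plugin to ignore the false positive:

```xml
<!-- Sketch: flink-clients provides the ExecutorFactory via ServiceLoader at run time, -->
<!-- so the dependency analyzer reports it as unused even though the tests require it. -->
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-clients</artifactId>
  <version>${flink.version}</version>
  <scope>test</scope>
</dependency>

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <configuration>
    <!-- suppress the false-positive "unused declared dependency" warning -->
    <ignoredUnusedDeclaredDependencies>
      <ignoredUnusedDeclaredDependency>org.apache.flink:flink-clients</ignoredUnusedDeclaredDependency>
    </ignoredUnusedDeclaredDependencies>
  </configuration>
</plugin>
```

Whether the CI yml allows the ignore list is a separate question, but this keeps the suppression scoped to the one runtime-only dependency rather than disabling the check.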

@davidradl (Collaborator Author) commented Aug 27, 2025

The second commit fixes:

  • the parent pom was not specifying the versions correctly - the change allows flink.version to be used in the child poms
  • copied the pattern used in the Kafka connector for bringing in the Flink clients and planner
  • the planner comes in with an associated Scala version
  • added exclusions for convergence reasons
  • CI is clean locally but fails with an unused import error on the build machine
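A minimal sketch of the version pattern described above, assuming a flink.version property declared once in the parent pom so child poms inherit it (the version number and managed artifacts are placeholders):

```xml
<!-- Parent pom (sketch): declare flink.version once; child poms reference ${flink.version}. -->
<properties>
  <!-- placeholder version, not necessarily the one used in this PR -->
  <flink.version>1.19.0</flink.version>
</properties>
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-streaming-java</artifactId>
      <version>${flink.version}</version>
      <scope>provided</scope>
    </dependency>
    <!-- the planner artifact carries a Scala suffix in its artifactId -->
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-table-planner_2.12</artifactId>
      <version>${flink.version}</version>
      <scope>test</scope>
    </dependency>
  </dependencies>
</dependencyManagement>
```

This is the same shape the Kafka connector uses: Flink runtime artifacts are provided or test scope, and the Scala-suffixed planner is pinned alongside them.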

@davidradl (Collaborator Author) commented Aug 27, 2025

The unused import error occurs at Java 21 and not at Java 8; I have not tested 17. It seems to be related to reflection. There appears to be a workaround - investigating.

@davidradl (Collaborator Author)

Java 21 now compiles on the build machine - the license checker is now failing.

@davidradl davidradl marked this pull request as ready for review August 29, 2025 08:38
@davidradl (Collaborator Author)

@grzegorz8 I have the Flink CI running cleanly now. Could you review please?

I am planning to do the following prior to donating to Flink:

  • clean up commented-out content
  • address any more points you raise
  • directories are being created at the top level of the modules - I need to resolve this
  • raise Jiras for the remaining work
  • more local testing

I hope to be able to donate today or early next week.

@davidradl davidradl force-pushed the flink branch 2 times, most recently from fe2644f to 3818d11 Compare August 29, 2025 10:27
@davidradl davidradl marked this pull request as draft August 29, 2025 11:04
@davidradl (Collaborator Author) commented Aug 29, 2025

Though the tests are green, the SQL jar file does not contain all the expected files. Moving to draft while I sort that out. (This turned out to be a local issue, not a problem with the code.)

@davidradl davidradl marked this pull request as ready for review August 29, 2025 13:41