Commit 3b43768

HyukjinKwon authored and srowen committed
[MINOR][BUILD] Fix javadoc8 break
## What changes were proposed in this pull request?

The errors below seem to be caused by unidoc, which does not understand a double-commented block.

```
[error] .../spark/sql/core/target/java/org/apache/spark/sql/KeyedState.java:69: error: class, interface, or enum expected
[error] * MapGroupsWithStateFunction&lt;String, Integer, Integer, String&gt; mappingFunction =
[error] ^
[error] .../spark/sql/core/target/java/org/apache/spark/sql/KeyedState.java:69: error: class, interface, or enum expected
[error] * MapGroupsWithStateFunction&lt;String, Integer, Integer, String&gt; mappingFunction =
[error] ^
[error] .../spark/sql/core/target/java/org/apache/spark/sql/KeyedState.java:70: error: class, interface, or enum expected
[error] * new MapGroupsWithStateFunction&lt;String, Integer, Integer, String&gt;() {
[error] ^
[error] .../spark/sql/core/target/java/org/apache/spark/sql/KeyedState.java:70: error: class, interface, or enum expected
[error] * new MapGroupsWithStateFunction&lt;String, Integer, Integer, String&gt;() {
[error] ^
[error] .../spark/sql/core/target/java/org/apache/spark/sql/KeyedState.java:72: error: illegal character: '#'
[error] * &#64;Override
[error] ^
[error] .../spark/sql/core/target/java/org/apache/spark/sql/KeyedState.java:72: error: class, interface, or enum expected
[error] * &#64;Override
[error] ^
[error] .../spark/sql/core/target/java/org/apache/spark/sql/KeyedState.java:73: error: class, interface, or enum expected
[error] * public String call(String key, Iterator&lt;Integer&gt; value, KeyedState&lt;Integer&gt; state) {
[error] ^
[error] .../spark/sql/core/target/java/org/apache/spark/sql/KeyedState.java:73: error: class, interface, or enum expected
[error] * public String call(String key, Iterator&lt;Integer&gt; value, KeyedState&lt;Integer&gt; state) {
[error] ^
[error] .../spark/sql/core/target/java/org/apache/spark/sql/KeyedState.java:73: error: class, interface, or enum expected
[error] * public String call(String key, Iterator&lt;Integer&gt; value, KeyedState&lt;Integer&gt; state) {
[error] ^
[error] .../spark/sql/core/target/java/org/apache/spark/sql/KeyedState.java:73: error: class, interface, or enum expected
[error] * public String call(String key, Iterator&lt;Integer&gt; value, KeyedState&lt;Integer&gt; state) {
[error] ^
[error] .../spark/sql/core/target/java/org/apache/spark/sql/KeyedState.java:73: error: class, interface, or enum expected
[error] * public String call(String key, Iterator&lt;Integer&gt; value, KeyedState&lt;Integer&gt; state) {
[error] ^
[error] .../spark/sql/core/target/java/org/apache/spark/sql/KeyedState.java:76: error: class, interface, or enum expected
[error] * boolean shouldRemove = ...; // Decide whether to remove the state
[error] ^
[error] .../spark/sql/core/target/java/org/apache/spark/sql/KeyedState.java:77: error: class, interface, or enum expected
[error] * if (shouldRemove) {
[error] ^
[error] .../spark/sql/core/target/java/org/apache/spark/sql/KeyedState.java:79: error: class, interface, or enum expected
[error] * } else {
[error] ^
[error] .../spark/sql/core/target/java/org/apache/spark/sql/KeyedState.java:81: error: class, interface, or enum expected
[error] * state.update(newState); // Set the new state
[error] ^
[error] .../spark/sql/core/target/java/org/apache/spark/sql/KeyedState.java:82: error: class, interface, or enum expected
[error] * }
[error] ^
[error] .../forked/spark/sql/core/target/java/org/apache/spark/sql/KeyedState.java:85: error: class, interface, or enum expected
[error] * state.update(initialState);
[error] ^
[error] .../forked/spark/sql/core/target/java/org/apache/spark/sql/KeyedState.java:86: error: class, interface, or enum expected
[error] * }
[error] ^
[error] .../spark/sql/core/target/java/org/apache/spark/sql/KeyedState.java:90: error: class, interface, or enum expected
[error] * </code></pre>
[error] ^
[error] .../spark/sql/core/target/java/org/apache/spark/sql/KeyedState.java:92: error: class, interface, or enum expected
[error] * tparam S User-defined type of the state to be stored for each key. Must be encodable into
[error] ^
[error] .../spark/sql/core/target/java/org/apache/spark/sql/KeyedState.java:93: error: class, interface, or enum expected
[error] * Spark SQL types (see {link Encoder} for more details).
[error] ^
[error] .../spark/sql/core/target/java/org/apache/spark/sql/KeyedState.java:94: error: class, interface, or enum expected
[error] * since 2.1.1
[error] ^
```

And another link seems to be unrecognisable:

```
.../spark/sql/core/target/java/org/apache/spark/sql/KeyedState.java:16: error: reference not found
[error] * That is, in every batch of the {link streaming.StreamingQuery StreamingQuery},
[error]
```

Note that this PR does not fix the two breaks below:

```
[error] .../spark/sql/core/target/java/org/apache/spark/sql/DataFrameStatFunctions.java:43: error: unexpected content
[error] * see {link DataFrameStatsFunctions.approxQuantile(col:Str* approxQuantile} for
[error] ^
[error] .../spark/sql/core/target/java/org/apache/spark/sql/DataFrameStatFunctions.java:52: error: bad use of '>'
[error] * param relativeError The relative target precision to achieve (>= 0).
[error] ^
[error]
```

because these will probably be fixed soon in apache#16776, and I intended to avoid potential conflicts.

## How was this patch tested?

Manually, via `jekyll build`.

Author: hyukjinkwon <[email protected]>

Closes apache#16926 from HyukjinKwon/javadoc-break.
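For context, a minimal sketch (hypothetical classes, not taken from the patch) of why the double-commented block breaks javadoc 8: Scala block comments nest, so a `/* ... */` inside a Scaladoc `{{{ }}}` example is legal for scaladoc, but genjavadoc copies the Scaladoc body into a plain Java doc comment, where comments do not nest, so the inner closing marker ends the generated comment early and javadoc reports the remaining lines as stray code ("class, interface, or enum expected").

```scala
// Hypothetical classes for illustration only; not part of the patch.

/**
 * Scala block comments nest, so scaladoc accepts this:
 *
 * {{{
 * /* A mapping function written as a block comment. */
 * def mappingFunction(key: String): String = key
 * }}}
 *
 * However, genjavadoc copies this body into a Java doc comment, where the
 * inner closing marker also terminates the generated comment, leaving the
 * rest of the example as stray tokens for javadoc 8.
 */
class BreaksGeneratedJavadoc

/**
 * The patch rewrites such comments as line comments, which contain no
 * closing marker and survive the translation:
 *
 * {{{
 * // A mapping function written as a line comment.
 * def mappingFunction(key: String): String = key
 * }}}
 */
class SurvivesGeneratedJavadoc
```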
1 parent 0e24054 commit 3b43768

File tree

1 file changed: +3 −5 lines


sql/core/src/main/scala/org/apache/spark/sql/KeyedState.scala

Lines changed: 3 additions & 5 deletions
```diff
@@ -17,8 +17,6 @@
 
 package org.apache.spark.sql
 
-import java.lang.IllegalArgumentException
-
 import org.apache.spark.annotation.{Experimental, InterfaceStability}
 import org.apache.spark.sql.catalyst.plans.logical.LogicalKeyedState
 
@@ -36,7 +34,7 @@ import org.apache.spark.sql.catalyst.plans.logical.LogicalKeyedState
  * `Dataset.groupByKey()`) while maintaining user-defined per-group state between invocations.
  * For a static batch Dataset, the function will be invoked once per group. For a streaming
  * Dataset, the function will be invoked for each group repeatedly in every trigger.
- * That is, in every batch of the [[streaming.StreamingQuery StreamingQuery]],
+ * That is, in every batch of the `streaming.StreamingQuery`,
  * the function will be invoked once for each group that has data in the batch.
  *
  * The function is invoked with following parameters.
@@ -65,7 +63,7 @@ import org.apache.spark.sql.catalyst.plans.logical.LogicalKeyedState
  *
  * Scala example of using KeyedState in `mapGroupsWithState`:
  * {{{
- * /* A mapping function that maintains an integer state for string keys and returns a string. */
+ * // A mapping function that maintains an integer state for string keys and returns a string.
  * def mappingFunction(key: String, value: Iterator[Int], state: KeyedState[Int]): String = {
  *   // Check if state exists
  *   if (state.exists) {
@@ -88,7 +86,7 @@ import org.apache.spark.sql.catalyst.plans.logical.LogicalKeyedState
  *
  * Java example of using `KeyedState`:
  * {{{
- * /* A mapping function that maintains an integer state for string keys and returns a string. */
+ * // A mapping function that maintains an integer state for string keys and returns a string.
  * MapGroupsWithStateFunction<String, Integer, Integer, String> mappingFunction =
  *     new MapGroupsWithStateFunction<String, Integer, Integer, String>() {
  *
```
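As a side note on the first hunk's doc change: the Scaladoc `[[...]]` link becomes a Javadoc `{@link ...}` tag in the generated sources, which javadoc 8 rejects with "reference not found" (see the log above), so the patch falls back to plain monospace text. A small illustrative sketch, with a hypothetical class name:

```scala
// Hypothetical class for illustration only; not part of the patch.

/**
 * Link form: genjavadoc turns the Scaladoc link below into a Javadoc
 * {@link ...} tag that javadoc 8 cannot resolve from the generated file
 * ("error: reference not found"):
 *
 *   in every batch of the [[streaming.StreamingQuery StreamingQuery]], ...
 *
 * Plain-code form used by the patch: rendered as monospace text, leaving
 * nothing for javadoc to resolve:
 *
 *   in every batch of the `streaming.StreamingQuery`, ...
 */
class LinkFormComparison
```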
