 
 package org.apache.spark.sql
 
-import java.lang.IllegalArgumentException
-
 import org.apache.spark.annotation.{Experimental, InterfaceStability}
 import org.apache.spark.sql.catalyst.plans.logical.LogicalKeyedState
 
@@ -36,7 +34,7 @@ import org.apache.spark.sql.catalyst.plans.logical.LogicalKeyedState
  * `Dataset.groupByKey()`) while maintaining user-defined per-group state between invocations.
  * For a static batch Dataset, the function will be invoked once per group. For a streaming
  * Dataset, the function will be invoked for each group repeatedly in every trigger.
- * That is, in every batch of the [[streaming.StreamingQuery StreamingQuery]],
+ * That is, in every batch of the `streaming.StreamingQuery`,
  * the function will be invoked once for each group that has data in the batch.
  *
  * The function is invoked with following parameters.
@@ -65,7 +63,7 @@ import org.apache.spark.sql.catalyst.plans.logical.LogicalKeyedState
  *
  * Scala example of using KeyedState in `mapGroupsWithState`:
  * {{{
- * /* A mapping function that maintains an integer state for string keys and returns a string. */
+ * // A mapping function that maintains an integer state for string keys and returns a string.
  * def mappingFunction(key: String, value: Iterator[Int], state: KeyedState[Int]): String = {
  *   // Check if state exists
  *   if (state.exists) {
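The hunk above only shows the opening lines of the Scala example before it is cut off at the hunk boundary. For reference, a completed sketch of a mapping function in that shape is given below; the running-count logic and the use of `state.get`/`state.update` are illustrative assumptions, not part of this diff.

```scala
import org.apache.spark.sql.KeyedState

// Illustrative sketch only: a mapping function in the shape documented above,
// assuming KeyedState exposes exists, get, and update.
def mappingFunction(key: String, value: Iterator[Int], state: KeyedState[Int]): String = {
  // Check if state exists for this key; start from 0 otherwise.
  val previousCount = if (state.exists) state.get else 0
  // Fold this invocation's values into the running count.
  val newCount = previousCount + value.size
  // Persist the updated state for the next invocation of this group.
  state.update(newCount)
  s"$key: $newCount"
}
```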
@@ -88,7 +86,7 @@ import org.apache.spark.sql.catalyst.plans.logical.LogicalKeyedState
  *
  * Java example of using `KeyedState`:
  * {{{
- * /* A mapping function that maintains an integer state for string keys and returns a string. */
+ * // A mapping function that maintains an integer state for string keys and returns a string.
  * MapGroupsWithStateFunction<String, Integer, Integer, String> mappingFunction =
  *   new MapGroupsWithStateFunction<String, Integer, Integer, String>() {
  *
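Both examples in this scaladoc show only the mapping function itself, not the call site. A hedged sketch of attaching such a function to a grouped Dataset follows; the method name `runningCounts`, the Dataset `events` and its `(String, Int)` schema are assumptions, and the `mapGroupsWithState` overload used is the `(key, values, state)` form that the documentation above describes.

```scala
import org.apache.spark.sql.{Dataset, KeyedState, SparkSession}

// Hypothetical driver-side wiring; `events` is assumed to be a Dataset[(String, Int)].
def runningCounts(spark: SparkSession, events: Dataset[(String, Int)]): Dataset[String] = {
  import spark.implicits._
  events
    .groupByKey(_._1)  // group rows by their string key
    .mapGroupsWithState { (key: String, rows: Iterator[(String, Int)], state: KeyedState[Int]) =>
      // Same pattern as the scaladoc example: read existing state, update it, emit a result.
      val count = (if (state.exists) state.get else 0) + rows.size
      state.update(count)
      s"$key: $count"
    }
}
```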