
Commit 7d0fb76
change tree/master to tree/branch-0.9 in docs
1 parent ea2b205

5 files changed: 11 additions, 11 deletions


docs/bagel-programming-guide.md (1 addition, 1 deletion)

@@ -108,7 +108,7 @@ _Example_
 
 ## Operations
 
-Here are the actions and types in the Bagel API. See [Bagel.scala](https://github.com/apache/spark/blob/master/bagel/src/main/scala/org/apache/spark/bagel/Bagel.scala) for details.
+Here are the actions and types in the Bagel API. See [Bagel.scala](https://github.com/apache/spark/tree/branch-0.9/bagel/src/main/scala/org/apache/spark/bagel/Bagel.scala) for details.
 
 ### Actions
 

docs/index.md (1 addition, 1 deletion)

@@ -117,7 +117,7 @@ Note that on Windows, you need to set the environment variables on separate line
 exercises about Spark, Shark, Mesos, and more. [Videos](http://ampcamp.berkeley.edu/agenda-2012),
 [slides](http://ampcamp.berkeley.edu/agenda-2012) and [exercises](http://ampcamp.berkeley.edu/exercises-2012) are
 available online for free.
-* [Code Examples](http://spark.apache.org/examples.html): more are also available in the [examples subfolder](https://github.com/apache/spark/tree/master/examples/src/main/scala/) of Spark
+* [Code Examples](http://spark.apache.org/examples.html): more are also available in the [examples subfolder](https://github.com/apache/spark/tree/branch-0.9/examples/src/main/scala/) of Spark
 * [Paper Describing Spark](http://www.cs.berkeley.edu/~matei/papers/2012/nsdi_spark.pdf)
 * [Paper Describing Spark Streaming](http://www.eecs.berkeley.edu/Pubs/TechRpts/2012/EECS-2012-259.pdf)
 

docs/java-programming-guide.md (1 addition, 1 deletion)

@@ -189,7 +189,7 @@ We hope to generate documentation with Java-style syntax in the future.
 # Where to Go from Here
 
 Spark includes several sample programs using the Java API in
-[`examples/src/main/java`](https://github.com/apache/spark/tree/master/examples/src/main/java/org/apache/spark/examples). You can run them by passing the class name to the
+[`examples/src/main/java`](https://github.com/apache/spark/tree/branch-0.9/examples/src/main/java/org/apache/spark/examples). You can run them by passing the class name to the
 `bin/run-example` script included in Spark; for example:
 
     ./bin/run-example org.apache.spark.examples.JavaWordCount

docs/python-programming-guide.md (1 addition, 1 deletion)

@@ -157,7 +157,7 @@ some example applications.
 
 # Where to Go from Here
 
-PySpark also includes several sample programs in the [`python/examples` folder](https://github.com/apache/spark/tree/master/python/examples).
+PySpark also includes several sample programs in the [`python/examples` folder](https://github.com/apache/spark/tree/branch-0.9/python/examples).
 You can run them by passing the files to `pyspark`; e.g.:
 
     ./bin/pyspark python/examples/wordcount.py

docs/streaming-programming-guide.md (7 additions, 7 deletions)

@@ -125,7 +125,7 @@ ssc.awaitTermination() // Wait for the computation to terminate
 {% endhighlight %}
 
 The complete code can be found in the Spark Streaming example
-[NetworkWordCount]({{site.SPARK_GITHUB_URL}}/blob/master/examples/src/main/scala/org/apache/spark/streaming/examples/NetworkWordCount.scala).
+[NetworkWordCount]({{site.SPARK_GITHUB_URL}}/tree/branch-0.9/examples/src/main/scala/org/apache/spark/streaming/examples/NetworkWordCount.scala).
 <br>
 
 </div>
@@ -207,7 +207,7 @@ jssc.awaitTermination(); // Wait for the computation to terminate
 {% endhighlight %}
 
 The complete code can be found in the Spark Streaming example
-[JavaNetworkWordCount]({{site.SPARK_GITHUB_URL}}/blob/master/examples/src/main/java/org/apache/spark/streaming/examples/JavaNetworkWordCount.java).
+[JavaNetworkWordCount]({{site.SPARK_GITHUB_URL}}/tree/branch-0.9/examples/src/main/java/org/apache/spark/streaming/examples/JavaNetworkWordCount.java).
 <br>
 
 </div>
@@ -602,7 +602,7 @@ JavaPairDStream<String, Integer> runningCounts = pairs.updateStateByKey(updateFu
 The update function will be called for each word, with `newValues` having a sequence of 1's (from
 the `(word, 1)` pairs) and the `runningCount` having the previous count. For the complete
 Scala code, take a look at the example
-[StatefulNetworkWordCount]({{site.SPARK_GITHUB_URL}}/blob/master/examples/src/main/scala/org/apache/spark/streaming/examples/StatefulNetworkWordCount.scala).
+[StatefulNetworkWordCount]({{site.SPARK_GITHUB_URL}}/tree/branch-0.9/examples/src/main/scala/org/apache/spark/streaming/examples/StatefulNetworkWordCount.scala).
 
 <h4>Transform Operation</h4>
 
@@ -1075,7 +1075,7 @@ If the `checkpointDirectory` exists, then the context will be recreated from the
 If the directory does not exist (i.e., running for the first time),
 then the function `functionToCreateContext` will be called to create a new
 context and set up the DStreams. See the Scala example
-[RecoverableNetworkWordCount]({{site.SPARK_GITHUB_URL}}/tree/master/examples/src/main/scala/org/apache/spark/streaming/examples/RecoverableNetworkWordCount.scala).
+[RecoverableNetworkWordCount]({{site.SPARK_GITHUB_URL}}/tree/branch-0.9/examples/src/main/scala/org/apache/spark/streaming/examples/RecoverableNetworkWordCount.scala).
 This example appends the word counts of network data into a file.
 
 You can also explicitly create a `StreamingContext` from the checkpoint data and start the
@@ -1114,7 +1114,7 @@ If the `checkpointDirectory` exists, then the context will be recreated from the
 If the directory does not exist (i.e., running for the first time),
 then the function `contextFactory` will be called to create a new
 context and set up the DStreams. See the Scala example
-[JavaRecoverableWordCount]({{site.SPARK_GITHUB_URL}}/tree/master/examples/src/main/scala/org/apache/spark/streaming/examples/JavaRecoverableWordCount.scala)
+[JavaRecoverableWordCount]({{site.SPARK_GITHUB_URL}}/tree/branch-0.9/examples/src/main/scala/org/apache/spark/streaming/examples/JavaRecoverableWordCount.scala)
 (note that this example is missing in the 0.9 release, so you can test it using the master branch).
 This example appends the word counts of network data into a file.
 
@@ -1253,6 +1253,6 @@ and output 30 after recovery.
 [ZeroMQ](api/external/zeromq/index.html#org.apache.spark.streaming.zeromq.ZeroMQUtils$), and
 [MQTT](api/external/mqtt/index.html#org.apache.spark.streaming.mqtt.MQTTUtils$)
 
-* More examples in [Scala]({{site.SPARK_GITHUB_URL}}/tree/master/examples/src/main/scala/org/apache/spark/streaming/examples)
-and [Java]({{site.SPARK_GITHUB_URL}}/tree/master/examples/src/main/java/org/apache/spark/streaming/examples)
+* More examples in [Scala]({{site.SPARK_GITHUB_URL}}/tree/branch-0.9/examples/src/main/scala/org/apache/spark/streaming/examples)
+and [Java]({{site.SPARK_GITHUB_URL}}/tree/branch-0.9/examples/src/main/java/org/apache/spark/streaming/examples)
 * [Paper](http://www.eecs.berkeley.edu/Pubs/TechRpts/2012/EECS-2012-259.pdf) describing Spark Streaming
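
A rewrite like this one is mechanical enough to sketch as a one-liner. The snippet below is a hedged sketch (it assumes GNU sed and demonstrates against a throwaway `sample.md` rather than the real docs tree); note it also folds `blob/master` links into `tree/branch-0.9`, matching what this commit does:

```shell
# Sketch: the link rewrite this commit performs, as a GNU sed substitution.
# Both .../blob/master/... and .../tree/master/... become .../tree/branch-0.9/...
printf 'See [Bagel.scala](https://github.com/apache/spark/blob/master/bagel/Bagel.scala)\n' > sample.md
sed -i -E 's#/(blob|tree)/master/#/tree/branch-0.9/#g' sample.md
cat sample.md
```

Against the real repository the same substitution would run over `docs/*.md`; a hand review of the result is still worthwhile, since not every `master` link should be pinned to a release branch.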
