[SPARK-7704] Updating Programming Guides per SPARK-4397 #6234
Conversation
The change in SPARK-4397 lets the compiler find the implicit objects in SparkContext automatically, so we no longer need to import o.a.s.SparkContext._ explicitly and can remove some statements about these "implicit conversions" from the latest Programming Guides (1.3.0 and higher).
Can one of the admins verify this patch?

OK to test

Yes I believe that's correct.

I remember we keep

The docs are versioned though. I can understand that people using old Spark might still look at the latest docs, but then there are a number of problems of that form. I think/hope people would be careful to look at the right version if one version didn't seem to be working or making sense.
-(the built-in tuples in the language, created by simply writing `(a, b)`), as long as you
-import `org.apache.spark.SparkContext._` in your program to enable Spark's implicit
-conversions. The key-value pair operations are available in the
+(the built-in tuples in the language, created by simply writing `(a, b)`). The key-value pair operations are available in the
 [PairRDDFunctions](api/scala/index.html#org.apache.spark.rdd.PairRDDFunctions) class,
 which automatically wraps around an RDD of tuples if you import the conversions.
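For context, the mechanism behind SPARK-4397 can be sketched in plain, self-contained Scala. The names below (`MiniSpark`, `MiniRDD`, `PairFunctions`, `toPairFunctions`) are invented for illustration and are not Spark's actual classes. The point it demonstrates: an implicit conversion declared in the companion object of a type belongs to that type's implicit scope, so the compiler finds it without any import, which is why the explicit `import org.apache.spark.SparkContext._` becomes unnecessary from 1.3.0 on.

```scala
import scala.language.implicitConversions

// Minimal sketch of the SPARK-4397 mechanism. MiniRDD and PairFunctions
// are invented names for illustration only; they are not Spark classes.
object MiniSpark {
  final class MiniRDD[T](val data: Seq[T])

  object MiniRDD {
    // Because this implicit conversion lives in MiniRDD's companion
    // object, it is part of the implicit scope of MiniRDD[(K, V)], so
    // the compiler finds it automatically with no import required.
    // Before SPARK-4397, Spark's equivalent conversions lived in the
    // SparkContext object, which is why user programs needed the
    // explicit `import org.apache.spark.SparkContext._`.
    implicit def toPairFunctions[K, V](rdd: MiniRDD[(K, V)]): PairFunctions[K, V] =
      new PairFunctions(rdd)
  }

  // Extra operations available only on RDDs of key-value pairs,
  // mirroring the role of PairRDDFunctions in Spark.
  final class PairFunctions[K, V](rdd: MiniRDD[(K, V)]) {
    def reduceByKey(f: (V, V) => V): Map[K, V] =
      rdd.data.groupBy(_._1).map { case (k, vs) => k -> vs.map(_._2).reduce(f) }
  }
}
```

With this layout, `new MiniSpark.MiniRDD(Seq(("a", 1), ("a", 3))).reduceByKey(_ + _)` compiles without importing anything from the `MiniRDD` companion object, because the conversion is found through the implicit scope of the receiver's type.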
"if you import the conversions" can be removed.
(I can tap that in on merge, OK.)
Yes we should remove it. But maybe add a note somewhere to say "in Spark version xxx and below, you would need to explicitly ..."
Thanks guys. Then, does adding the following make sense?

That looks good.
 import org.apache.spark.SparkConf
 {% endhighlight %}
+
+(If the Spark version is prior to 1.3.0, user needs to explicitly import org.apache.spark.SparkContext._ to allow the implicit conversions.)
I suggest a slight rewording and putting the import in code font:
Before Spark 1.3.0, you need to explicitly import `org.apache.spark.SparkContext._` to enable essential implicit conversions.
LGTM except the minor comment. But I think @srowen will handle it.
The change per SPARK-4397 makes implicit objects in SparkContext to be found by the compiler automatically. So that we don't need to import the o.a.s.SparkContext._ explicitly any more and can remove some statements around the "implicit conversions" from the latest Programming Guides (1.3.0 and higher)

Author: Dice <[email protected]>

Closes #6234 from daisukebe/patch-1 and squashes the following commits:

b77ecd9 [Dice] fix a typo
45dfcd3 [Dice] rewording per Sean's advice
a094bcf [Dice] Adding a note for users on any previous releases
a29be5f [Dice] Updating Programming Guides per SPARK-4397

(cherry picked from commit 32fa611)
Signed-off-by: Sean Owen <[email protected]>