[SPARK-7704] Updating Programming Guides per SPARK-4397 #6234


Closed
wants to merge 4 commits

Conversation

daisukebe
Contributor

The change in SPARK-4397 lets the compiler find the implicit objects in SparkContext automatically, so we no longer need to import o.a.s.SparkContext._ explicitly and can remove some statements about the "implicit conversions" from the latest Programming Guides (1.3.0 and higher).

@AmplabJenkins

Can one of the admins verify this patch?

@daisukebe daisukebe changed the title Updating Programming Guides per SPARK-4397 [SPARK-7704] Updating Programming Guides per SPARK-4397 May 18, 2015
@srowen
Member

srowen commented May 18, 2015

OK to test

@srowen
Member

srowen commented May 18, 2015

Yes I believe that's correct.

@zsxwing
Member

zsxwing commented May 18, 2015

I remember we kept `import org.apache.spark.SparkContext._` for people who use old Spark versions. They may read this doc even if they don't use the latest Spark.

@srowen
Member

srowen commented May 18, 2015

The docs are versioned though. I can understand that people using old Spark might still look at the latest docs, but then there are a number of problems of that form. I think/hope people would be careful to look at the right version if one version didn't seem to be working or making sense.

- (the built-in tuples in the language, created by simply writing `(a, b)`), as long as you
- import `org.apache.spark.SparkContext._` in your program to enable Spark's implicit
- conversions. The key-value pair operations are available in the
+ (the built-in tuples in the language, created by simply writing `(a, b)`). The key-value pair operations are available in the
  [PairRDDFunctions](api/scala/index.html#org.apache.spark.rdd.PairRDDFunctions) class,
  which automatically wraps around an RDD of tuples if you import the conversions.
Member

`if you import the conversions` can be removed.

Member

(I can tap that in on merge, OK.)
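The import can be dropped because SPARK-4397 moved Spark's RDD implicit conversions (such as `rddToPairRDDFunctions`) into companion-object implicit scope, where the compiler finds them without any import. A minimal plain-Scala sketch of that mechanism, using hypothetical `Box` and `PairOps` names rather than Spark's actual classes:

```scala
import scala.language.implicitConversions

// A toy container standing in for RDD.
class Box[T](val items: Seq[T])

object Box {
  // Analogous to rddToPairRDDFunctions living in RDD's companion object:
  // implicits in a type's companion object are in implicit scope for that
  // type, so no explicit import is needed at the call site.
  implicit def toPairOps[K, V](box: Box[(K, V)]): PairOps[K, V] =
    new PairOps(box)
}

// A toy stand-in for PairRDDFunctions.
class PairOps[K, V](box: Box[(K, V)]) {
  // Merge the values that share a key using the given function.
  def reduceByKey(f: (V, V) => V): Map[K, V] =
    box.items.groupBy(_._1).map { case (k, vs) => k -> vs.map(_._2).reduce(f) }
}

object Demo extends App {
  val pairs = new Box(Seq(("a", 1), ("a", 2), ("b", 3)))
  // No `import Box._` needed: the conversion is found via implicit scope.
  println(pairs.reduceByKey(_ + _))
}
```

Because `toPairOps` lives in `Box`'s companion object, `reduceByKey` is available on any `Box[(K, V)]` automatically, mirroring how `PairRDDFunctions` methods became available on pair RDDs in 1.3.0 without `import SparkContext._`.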

@zsxwing
Member

zsxwing commented May 18, 2015

@srowen Agree.

@rxin what do you think?

@rxin
Contributor

rxin commented May 18, 2015

Yes we should remove it. But maybe add a note somewhere to say "in Spark version xxx and below, you would need to explicitly ..."

@daisukebe
Contributor Author

Thanks guys. Then, does adding the following make sense?

If the Spark version is prior to 1.3.0, user needs to explicitly import org.apache.spark.SparkContext._ to allow the implicit conversions.

@rxin
Contributor

rxin commented May 19, 2015

That looks good.

import org.apache.spark.SparkConf
{% endhighlight %}

(If the Spark version is prior to 1.3.0, user needs to explicitly import org.apache.spark.SparkContext._ to allow the implicit conversions.)
Member

I suggest a slight rewording and putting the import in code font:

Before Spark 1.3.0, you need to explicitly `import org.apache.spark.SparkContext._` to enable essential implicit conversions.

@zsxwing
Member

zsxwing commented May 19, 2015

LGTM except the minor comment. But I think @srowen will handle it.

asfgit pushed a commit that referenced this pull request May 19, 2015
The change per SPARK-4397 makes implicit objects in SparkContext to be found by the compiler automatically. So that we don't need to import the o.a.s.SparkContext._ explicitly any more and can remove some statements around the "implicit conversions" from the latest Programming Guides (1.3.0 and higher)

Author: Dice <[email protected]>

Closes #6234 from daisukebe/patch-1 and squashes the following commits:

b77ecd9 [Dice] fix a typo
45dfcd3 [Dice] rewording per Sean's advice
a094bcf [Dice] Adding a note for users on any previous releases
a29be5f [Dice] Updating Programming Guides per SPARK-4397

(cherry picked from commit 32fa611)
Signed-off-by: Sean Owen <[email protected]>
@asfgit asfgit closed this in 32fa611 May 19, 2015
jeanlyn pushed a commit to jeanlyn/spark that referenced this pull request May 28, 2015
jeanlyn pushed a commit to jeanlyn/spark that referenced this pull request Jun 12, 2015
nemccarthy pushed a commit to nemccarthy/spark that referenced this pull request Jun 19, 2015
5 participants