README.md: 5 additions & 10 deletions
@@ -13,12 +13,6 @@ SparkR requires Scala 2.10 and Spark version >= 0.9.0. Current build by default
 Apache Spark 1.1.0. You can also build SparkR against a
 different Spark version (>= 0.9.0) by modifying `pkg/src/build.sbt`.
 
-SparkR also requires the R package `rJava` to be installed. To install `rJava`,
-you can run the following command in R:
-
-    install.packages("rJava")
-
-
 ### Package installation
 To develop SparkR, you can build the scala package and the R package using
@@ -31,9 +25,9 @@ If you wish to try out the package directly from github, you can use [`install_g
 
 SparkR by default uses Apache Spark 1.1.0. You can switch to a different Spark
 version by setting the environment variable `SPARK_VERSION`. For example, to
-use Apache Spark 1.2.0, you can run
+use Apache Spark 1.3.0, you can run
 
-    SPARK_VERSION=1.2.0 ./install-dev.sh
+    SPARK_VERSION=1.3.0 ./install-dev.sh
 
 SparkR by default links to Hadoop 1.0.4. To use SparkR with other Hadoop
 versions, you will need to rebuild SparkR with the same version that [Spark is
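The `SPARK_VERSION` override shown in this hunk follows the usual pattern of a build script reading an environment variable with a fallback default. A minimal, self-contained sketch of that pattern (the echo body is illustrative, not the real `install-dev.sh`):

```shell
#!/bin/sh
# Illustrative stand-in for install-dev.sh: honor SPARK_VERSION if the
# caller exported it, otherwise fall back to the documented default 1.1.0.
SPARK_VERSION="${SPARK_VERSION:-1.1.0}"
echo "Building SparkR against Apache Spark ${SPARK_VERSION}"
```

Running it as `SPARK_VERSION=1.3.0 ./install-dev.sh` would then select Spark 1.3.0, matching the command in the diff.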
@@ -97,8 +91,9 @@ To run one of them, use `./sparkR <filename> <args>`. For example:
 
     ./sparkR examples/pi.R local[2]
 
-You can also run the unit-tests for SparkR by running
+You can also run the unit-tests for SparkR by running (you need to install the [testthat](http://cran.r-project.org/web/packages/testthat/index.html) package first):
 
+    R -e 'install.packages("testthat", repos="http://cran.us.r-project.org")'
     ./run-tests.sh
 
 ## Running on EC2
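The test-running step amounts to two shell commands. A hedged sketch of a wrapper around them follows; R is deliberately not executed here (whether R is on the PATH is an assumption), so the commands are only printed to make the sequence visible:

```shell
#!/bin/sh
# Sketch of a wrapper around the README's two test commands. The commands
# are built as strings and printed rather than run, since this sketch does
# not assume R or the SparkR checkout are present.
INSTALL_CMD="R -e 'install.packages(\"testthat\", repos=\"http://cran.us.r-project.org\")'"
TEST_CMD="./run-tests.sh"
echo "step 1: $INSTALL_CMD"
echo "step 2: $TEST_CMD"
```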
@@ -110,7 +105,7 @@ Instructions for running SparkR on EC2 can be found in the
 Currently, SparkR supports running on YARN with the `yarn-client` mode. These steps show how to build SparkR with YARN support and run SparkR programs on a YARN cluster:
 
 ```
-# assumes Java, R, rJava, yarn, spark etc. are installed on the whole cluster.
+# assumes Java, R, yarn, spark etc. are installed on the whole cluster.