
Commit fb05432

witgo authored and pwendell committed
The default version of yarn is equal to the hadoop version
This is a part of [PR 590](#590).

Author: witgo <[email protected]>

Closes #626 from witgo/yarn_version and squashes the following commits:

c390631 [witgo] restore the yarn dependency declarations
f8a4ad8 [witgo] revert remove the dependency of avro in yarn-alpha
2df6cf5 [witgo] review commit
a1d876a [witgo] review commit
20e7e3e [witgo] review commit
c76763b [witgo] The default value of yarn.version is equal to hadoop.version
1 parent 92b2902 commit fb05432
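
Because `yarn.version` now defaults to `${hadoop.version}`, passing only `-Dhadoop.version` covers the common case, and an explicit `-Dyarn.version` still wins. A minimal way to see the resolved value is the Maven help plugin; the profile and version numbers below are illustrative examples, not taken from this commit:

    # With only hadoop.version set, yarn.version follows it (the evaluated value is printed).
    $ mvn -Pyarn -Dhadoop.version=2.2.0 help:evaluate -Dexpression=yarn.version

    # An explicit -Dyarn.version still overrides the default.
    $ mvn -Pyarn -Dhadoop.version=2.2.0 -Dyarn.version=2.3.0 help:evaluate -Dexpression=yarn.version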

File tree

4 files changed: +18 / -12 lines changed

bin/compute-classpath.sh
docs/building-with-maven.md
pom.xml
yarn/pom.xml

bin/compute-classpath.sh

Lines changed: 1 addition & 0 deletions

@@ -44,6 +44,7 @@ if [ -f "$ASSEMBLY_DIR"/spark-assembly*hadoop*-deps.jar ]; then
   CLASSPATH="$CLASSPATH:$FWDIR/sql/catalyst/target/scala-$SCALA_VERSION/classes"
   CLASSPATH="$CLASSPATH:$FWDIR/sql/core/target/scala-$SCALA_VERSION/classes"
   CLASSPATH="$CLASSPATH:$FWDIR/sql/hive/target/scala-$SCALA_VERSION/classes"
+  CLASSPATH="$CLASSPATH:$FWDIR/yarn/stable/target/scala-$SCALA_VERSION/classes"
 
   DEPS_ASSEMBLY_JAR=`ls "$ASSEMBLY_DIR"/spark-assembly*hadoop*-deps.jar`
   CLASSPATH="$CLASSPATH:$DEPS_ASSEMBLY_JAR"
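
With the added line, a development checkout that has compiled the stable YARN module should see its classes directory on the computed classpath. A quick sketch of a check, assuming `yarn/stable` has been built:

    # Split the computed classpath and look for the new yarn/stable entry.
    $ ./bin/compute-classpath.sh | tr ':' '\n' | grep yarn/stable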

docs/building-with-maven.md

Lines changed: 6 additions & 3 deletions

@@ -45,17 +45,20 @@ For Apache Hadoop versions 1.x, Cloudera CDH MRv1, and other Hadoop versions wit
 For Apache Hadoop 2.x, 0.23.x, Cloudera CDH MRv2, and other Hadoop versions with YARN, you can enable the "yarn-alpha" or "yarn" profile and set the "hadoop.version", "yarn.version" property. Note that Hadoop 0.23.X requires a special `-Phadoop-0.23` profile:
 
     # Apache Hadoop 2.0.5-alpha
-    $ mvn -Pyarn-alpha -Dhadoop.version=2.0.5-alpha -Dyarn.version=2.0.5-alpha -DskipTests clean package
+    $ mvn -Pyarn-alpha -Dhadoop.version=2.0.5-alpha -DskipTests clean package
 
     # Cloudera CDH 4.2.0 with MapReduce v2
-    $ mvn -Pyarn-alpha -Dhadoop.version=2.0.0-cdh4.2.0 -Dyarn.version=2.0.0-cdh4.2.0 -DskipTests clean package
+    $ mvn -Pyarn-alpha -Dhadoop.version=2.0.0-cdh4.2.0 -DskipTests clean package
 
     # Apache Hadoop 2.2.X (e.g. 2.2.0 as below) and newer
-    $ mvn -Pyarn -Dhadoop.version=2.2.0 -Dyarn.version=2.2.0 -DskipTests clean package
+    $ mvn -Pyarn -Dhadoop.version=2.2.0 -DskipTests clean package
 
     # Apache Hadoop 0.23.x
     $ mvn -Pyarn-alpha -Phadoop-0.23 -Dhadoop.version=0.23.7 -Dyarn.version=0.23.7 -DskipTests clean package
 
+    # Different versions of HDFS and YARN.
+    $ mvn -Pyarn-alpha -Dhadoop.version=2.3.0 -Dyarn.version=0.23.7 -DskipTests clean package
+
 ## Spark Tests in Maven ##
 
 Tests are run by default via the [ScalaTest Maven plugin](http://www.scalatest.org/user_guide/using_the_scalatest_maven_plugin). Some of them require Spark to be packaged first, so always run `mvn package` with `-DskipTests` the first time. You can then run the tests with `mvn -Dhadoop.version=... test`.
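
For the new mixed HDFS/YARN case, one hedged way to confirm that the HDFS-side and YARN-side artifacts resolve to different versions is Maven's dependency tree; depending on the setup this may require the Spark modules to be built or installed first, and the version numbers are the ones used in the example above:

    # List the resolved Hadoop artifacts after a mixed-version build.
    $ mvn -Pyarn-alpha -Dhadoop.version=2.3.0 -Dyarn.version=0.23.7 -DskipTests dependency:tree | grep -E 'hadoop-(client|yarn)'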

pom.xml

Lines changed: 9 additions & 6 deletions

@@ -16,7 +16,8 @@
   ~ limitations under the License.
   -->
 
-<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
   <modelVersion>4.0.0</modelVersion>
   <parent>
     <groupId>org.apache</groupId>
@@ -119,7 +120,7 @@
     <log4j.version>1.2.17</log4j.version>
     <hadoop.version>1.0.4</hadoop.version>
     <protobuf.version>2.4.1</protobuf.version>
-    <yarn.version>0.23.7</yarn.version>
+    <yarn.version>${hadoop.version}</yarn.version>
     <hbase.version>0.94.6</hbase.version>
     <hive.version>0.12.0</hive.version>
     <parquet.version>1.3.2</parquet.version>
@@ -135,7 +136,8 @@
 
   <repositories>
     <repository>
-      <id>maven-repo</id> <!-- This should be at top, it makes maven try the central repo first and then others and hence faster dep resolution -->
+      <id>maven-repo</id>
+      <!-- This should be at top, it makes maven try the central repo first and then others and hence faster dep resolution -->
       <name>Maven Repository</name>
       <!-- HTTPS is unavailable for Maven Central -->
       <url>http://repo.maven.apache.org/maven2</url>
@@ -847,15 +849,16 @@
         <hadoop.version>0.23.7</hadoop.version>
         <!--<hadoop.version>2.0.5-alpha</hadoop.version> -->
       </properties>
-      <modules>
-        <module>yarn</module>
-      </modules>
       <dependencies>
         <dependency>
           <groupId>org.apache.avro</groupId>
           <artifactId>avro</artifactId>
         </dependency>
       </dependencies>
+      <modules>
+        <module>yarn</module>
+      </modules>
+
     </profile>
 
     <!-- Ganglia integration is not included by default due to LGPL-licensed code -->
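
In the yarn-alpha profile the `<modules>` block was only moved below `<dependencies>`; element order in a POM is not significant, so the profile should still add the `yarn` module to the reactor. A hedged spot-check, using the module name as it appears in this commit:

    # The reactor build order / summary should still list the YARN parent module.
    $ mvn -Pyarn-alpha validate | grep "Spark Project YARN"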

yarn/pom.xml

Lines changed: 2 additions & 3 deletions

@@ -28,7 +28,7 @@
   <artifactId>yarn-parent_2.10</artifactId>
   <packaging>pom</packaging>
   <name>Spark Project YARN Parent POM</name>
-
+
   <dependencies>
     <dependency>
       <groupId>org.apache.spark</groupId>
@@ -50,7 +50,6 @@
     <dependency>
       <groupId>org.apache.hadoop</groupId>
       <artifactId>hadoop-client</artifactId>
-      <version>${yarn.version}</version>
     </dependency>
     <dependency>
       <groupId>org.scalatest</groupId>
@@ -128,7 +127,7 @@
               <target>
                 <property name="spark.classpath" refid="maven.test.classpath" />
                 <property environment="env" />
-                <fail message="Please set the SCALA_HOME (or SCALA_LIBRARY_PATH if scala is on the path) environment variables and retry.">
+                <fail message="Please set the SCALA_HOME (or SCALA_LIBRARY_PATH if scala is on the path) environment variables and retry.">
                   <condition>
                     <not>
                       <or>
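
With the explicit `<version>${yarn.version}</version>` removed, `hadoop-client` in the YARN modules takes its version from dependency management in the root pom, which is driven by `hadoop.version`. A hedged spot-check, assuming the root pom manages `hadoop-client` as Spark did at the time (the version number is an example):

    # Check which hadoop-client version the YARN build resolves.
    $ mvn -Pyarn -Dhadoop.version=2.2.0 -DskipTests dependency:list | grep hadoop-client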
