
Commit a2e9220

Author: Marcelo Vanzin (committed)
SHS-NG M2: Store FsHistoryProvider listing data in LevelDB.
The application listing is still generated from event logs, but is now stored in LevelDB. No data (except for the internal LevelDB pages) is kept in memory. The actual app UIs are, as of now, still untouched.

The provider stores things internally using the public REST API types; I believe this is better going forward since it will make it easier to get rid of the internal history server API, which is mostly redundant at this point.

I also added a finalizer to LevelDBIterator, to make sure that resources are eventually released. This helps when code iterates but does not exhaust the iterator, thus not triggering the auto-close code.

HistoryServerSuite was modified to not re-start the history server unnecessarily; this makes the JSON validation tests run more quickly.
1 parent a77f0cb commit a2e9220
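
To make "stored in LevelDB" concrete, here is a minimal sketch of the idea the commit message describes: listing entries live in the on-disk kvstore, keyed by application id, instead of an in-memory map. This is not code from this commit; the AppEntry class, the store path, and the exact LevelDB constructor and annotation usage are illustrative assumptions about the kvstore API.

import java.io.File;

import org.apache.spark.kvstore.KVIndex;
import org.apache.spark.kvstore.LevelDB;

public class ListingSketch {

  // Hypothetical listing entry; the real provider stores the public REST API types.
  public static class AppEntry {
    @KVIndex public String id;   // natural key: the application id
    public String name;
    public boolean completed;
  }

  public static void main(String[] args) throws Exception {
    // A real provider would keep this under the history server's local store directory.
    try (LevelDB store = new LevelDB(new File("/tmp/history-listing.ldb"))) {
      AppEntry entry = new AppEntry();
      entry.id = "app-20170501000000-0001";
      entry.name = "Example app";
      entry.completed = true;

      store.write(entry);                                    // persisted to disk, not cached
      AppEntry back = store.read(AppEntry.class, entry.id);  // looked up by its natural key
      System.out.println(back.name + " completed=" + back.completed);
    }
  }
}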

File tree: 13 files changed (+465, -323 lines)

common/kvstore/src/main/java/org/apache/spark/kvstore/LevelDBIterator.java

Lines changed: 12 additions & 0 deletions
@@ -162,6 +162,18 @@ public synchronized void close() throws IOException {
     }
   }
 
+  /**
+   * Because it's tricky to expose closeable iterators through many internal APIs, especially
+   * when Scala wrappers are used, this makes sure that, hopefully, the JNI resources held by
+   * the iterator will eventually be released.
+   */
+  @Override
+  protected void finalize() throws Throwable {
+    if (db.db() != null) {
+      close();
+    }
+  }
+
   private T loadNext() {
     if (count >= max) {
       return null;
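
The finalizer above is a safety net for call patterns like the hypothetical one below, where the caller only needs the first entry and never drains the iterator, so the existing close-on-exhaustion path never runs. The entry type and store path are the same illustrative assumptions as in the sketch near the top.

import java.io.File;
import java.util.Iterator;

import org.apache.spark.kvstore.KVIndex;
import org.apache.spark.kvstore.LevelDB;

public class AbandonedIteratorSketch {

  public static class AppEntry {
    @KVIndex public String id;
  }

  public static void main(String[] args) throws Exception {
    try (LevelDB store = new LevelDB(new File("/tmp/history-listing.ldb"))) {
      // Take only the first entry; the iterator is never exhausted, so the
      // auto-close-on-exhaustion logic in LevelDBIterator never fires.
      Iterator<AppEntry> it = store.view(AppEntry.class).iterator();
      AppEntry first = it.hasNext() ? it.next() : null;
      System.out.println(first != null ? first.id : "empty listing");
      // 'it' is simply dropped here; the new finalize() is what eventually
      // releases the JNI resources it still holds.
    }
  }
}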

core/pom.xml

Lines changed: 5 additions & 0 deletions
@@ -67,6 +67,11 @@
       <artifactId>spark-launcher_${scala.binary.version}</artifactId>
       <version>${project.version}</version>
     </dependency>
+    <dependency>
+      <groupId>org.apache.spark</groupId>
+      <artifactId>spark-kvstore_${scala.binary.version}</artifactId>
+      <version>${project.version}</version>
+    </dependency>
     <dependency>
       <groupId>org.apache.spark</groupId>
       <artifactId>spark-network-common_${scala.binary.version}</artifactId>

core/src/main/resources/org/apache/spark/ui/static/historypage.js

Lines changed: 1 addition & 1 deletion
@@ -188,7 +188,7 @@ $(document).ready(function() {
       }
 
       $(selector).DataTable(conf);
-      $('#hisotry-summary [data-toggle="tooltip"]').tooltip();
+      $('#history-summary [data-toggle="tooltip"]').tooltip();
     });
   });
 });

core/src/main/scala/org/apache/spark/deploy/history/ApplicationHistoryProvider.scala

Lines changed: 10 additions & 2 deletions
@@ -68,11 +68,19 @@ private[history] abstract class HistoryUpdateProbe {
  * @param ui Spark UI
  * @param updateProbe probe to call to check on the update state of this application attempt
  */
-private[history] case class LoadedAppUI(
+private[spark] case class LoadedAppUI(
     ui: SparkUI,
     updateProbe: () => Boolean)
 
-private[history] abstract class ApplicationHistoryProvider {
+private[spark] abstract class ApplicationHistoryProvider {
+
+  /**
+   * The number of applications available for listing. Separate method in case it's cheaper
+   * to get a count than to calculate the whole listing.
+   *
+   * @return The number of available applications.
+   */
+  def getAppCount(): Int = getListing().size
 
   /**
    * Returns the count of application event logs that the provider is currently still processing.
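
The new getAppCount() default above just sizes the full listing, and its doc comment explains why it is a separate method: a disk-backed provider can usually answer a count far more cheaply than building the whole listing. The sketch below contrasts the two approaches; it is not the commit's code, it reuses the hypothetical AppEntry type from the earlier sketches, and it assumes the kvstore exposes a count operation for a stored type.

import java.io.File;

import org.apache.spark.kvstore.KVIndex;
import org.apache.spark.kvstore.LevelDB;

public class CountVsListingSketch {

  public static class AppEntry {
    @KVIndex public String id;
  }

  public static void main(String[] args) throws Exception {
    try (LevelDB store = new LevelDB(new File("/tmp/history-listing.ldb"))) {
      // Cheap: ask the store how many AppEntry records it holds (assumed API).
      long fast = store.count(AppEntry.class);

      // Expensive: materialize the whole listing just to size it, which is what
      // the default getAppCount() implementation effectively does.
      long slow = 0;
      for (AppEntry ignored : store.view(AppEntry.class)) {
        slow++;
      }

      System.out.println("count(): " + fast + ", full scan: " + slow);
    }
  }
}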
