
Commit 7a3f589

cocoatomo authored and JoshRosen committed
[SPARK-3909][PySpark][Doc] A corrupted format in Sphinx documents and building warnings
Sphinx documents contain a corrupted reST format and produce several build warnings. The purpose of this issue is the same as https://issues.apache.org/jira/browse/SPARK-3773.

Output at commit 0e8203f:

```
$ cd ./python/docs
$ make clean html
rm -rf _build/*
sphinx-build -b html -d _build/doctrees . _build/html
Making output directory...
Running Sphinx v1.2.3
loading pickled environment... not yet created
building [html]: targets for 4 source files that are out of date
updating environment: 4 added, 0 changed, 0 removed
reading sources... [100%] pyspark.sql
/Users/<user>/MyRepos/Scala/spark/python/pyspark/mllib/feature.py:docstring of pyspark.mllib.feature.Word2VecModel.findSynonyms:4: WARNING: Field list ends without a blank line; unexpected unindent.
/Users/<user>/MyRepos/Scala/spark/python/pyspark/mllib/feature.py:docstring of pyspark.mllib.feature.Word2VecModel.transform:3: WARNING: Field list ends without a blank line; unexpected unindent.
/Users/<user>/MyRepos/Scala/spark/python/pyspark/sql.py:docstring of pyspark.sql:4: WARNING: Bullet list ends without a blank line; unexpected unindent.
looking for now-outdated files... none found
pickling environment... done
checking consistency... done
preparing documents... done
writing output... [100%] pyspark.sql
writing additional files... (12 module code pages) _modules/index search
copying static files... WARNING: html_static_path entry u'/Users/<user>/MyRepos/Scala/spark/python/docs/_static' does not exist
done
copying extra files... done
dumping search index... done
dumping object inventory... done
build succeeded, 4 warnings.

Build finished. The HTML pages are in _build/html.
```

Author: cocoatomo <[email protected]>

Closes apache#2766 from cocoatomo/issues/3909-sphinx-build-warnings and squashes the following commits:

2c7faa8 [cocoatomo] [SPARK-3909][PySpark][Doc] A corrupted format in Sphinx documents and building warnings
1 parent 81015a2 · commit 7a3f589

File tree: 4 files changed (+9, -7 lines)

python/docs/conf.py (1 addition & 1 deletion)

@@ -131,7 +131,7 @@
 # Add any paths that contain custom static files (such as style sheets) here,
 # relative to this directory. They are copied after the builtin static files,
 # so a file named "default.css" will overwrite the builtin "default.css".
-html_static_path = ['_static']
+#html_static_path = ['_static']

 # Add any extra paths that contain custom files (such as robots.txt or
 # .htaccess) here, relative to this directory. These files are copied

python/pyspark/mllib/feature.py (2 additions & 0 deletions)

@@ -44,6 +44,7 @@ def transform(self, word):
         """
         :param word: a word
         :return: vector representation of word
+
         Transforms a word to its vector representation

         Note: local use only
@@ -57,6 +58,7 @@ def findSynonyms(self, x, num):
         :param x: a word or a vector representation of word
         :param num: number of synonyms to find
         :return: array of (word, cosineSimilarity)
+
         Find synonyms of a word

         Note: local use only

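These hunks add a blank line after each field list, because reST requires a blank line between a field list and the body text that follows; without it Sphinx emits the "Field list ends without a blank line" warning shown in the build output above. A minimal sketch of the corrected docstring shape (a standalone stand-in, not the actual `Word2VecModel` method):

```python
def find_synonyms(x, num):
    """
    :param x: a word or a vector representation of word
    :param num: number of synonyms to find
    :return: array of (word, cosineSimilarity)

    Find synonyms of a word

    Note: local use only
    """
    # Stub body: this example only illustrates the docstring layout.
    return []


# The blank line after the ":return:" field is what the commit adds:
# it cleanly terminates the field list before the prose resumes.
lines = [line.strip() for line in find_synonyms.__doc__.splitlines()]
idx = lines.index(":return: array of (word, cosineSimilarity)")
field_list_ends_with_blank_line = lines[idx + 1] == ""
```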
python/pyspark/rdd.py (1 addition & 1 deletion)

@@ -2009,7 +2009,7 @@ def countApproxDistinct(self, relativeSD=0.05):
         of The Art Cardinality Estimation Algorithm", available
         <a href="http://dx.doi.org/10.1145/2452376.2452456">here</a>.

-        :param relativeSD Relative accuracy. Smaller values create
+        :param relativeSD: Relative accuracy. Smaller values create
               counters that require more space.
               It must be greater than 0.000017.

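The rdd.py hunk adds the colon that Sphinx's info-field syntax requires after the parameter name (`:param name:`); without it, the line is not parsed as a field and reST reports an unexpected unindent at the continuation lines. A hedged sketch with a hypothetical stand-in function (not the real `countApproxDistinct`):

```python
def count_approx_distinct(relative_sd=0.05):
    """Return an approximate count of distinct elements.

    :param relative_sd: Relative accuracy. Smaller values create
        counters that require more space.
        It must be greater than 0.000017.
    """
    # Stub body: this example only illustrates the ":param name:" syntax.
    return 0


# Sphinx recognizes the field because the parameter name is closed
# with a colon; continuation lines are simply indented beneath it.
has_valid_field = ":param relative_sd:" in count_approx_distinct.__doc__
```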
python/pyspark/sql.py (5 additions & 5 deletions)

(Note: these hunks change leading whitespace only; this text rendering does not preserve indentation, so the removed and added lines appear identical. The change re-indents each bullet's description so the list parses correctly.)

@@ -19,14 +19,14 @@
 public classes of Spark SQL:

     - L{SQLContext}
-      Main entry point for SQL functionality.
+      Main entry point for SQL functionality.
     - L{SchemaRDD}
-      A Resilient Distributed Dataset (RDD) with Schema information for the data contained. In
-      addition to normal RDD operations, SchemaRDDs also support SQL.
+      A Resilient Distributed Dataset (RDD) with Schema information for the data contained. In
+      addition to normal RDD operations, SchemaRDDs also support SQL.
     - L{Row}
-      A Row of data returned by a Spark SQL query.
+      A Row of data returned by a Spark SQL query.
     - L{HiveContext}
-      Main entry point for accessing data stored in Apache Hive..
+      Main entry point for accessing data stored in Apache Hive..
 """

 import itertools
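The sql.py warning is the bullet-list analogue of the same rule: a continuation line must be indented to align with the bullet's text, and the list must be closed by a blank line before unindented prose resumes. A minimal illustration of the well-formed shape (an illustrative string, not the actual module docstring):

```python
# A well-formed reST bullet list: each description aligns under the
# item's text, and a blank line closes the list before further prose.
WELL_FORMED = """public classes of Spark SQL:

    - L{SQLContext}
      Main entry point for SQL functionality.
    - L{Row}
      A Row of data returned by a Spark SQL query.

Prose resumes after the blank line that closes the list.
"""

lines = WELL_FORMED.splitlines()
bullet = lines[2]       # "    - L{SQLContext}"
description = lines[3]  # "      Main entry point for SQL functionality."
# The description's first character lines up with the bullet's text,
# which is what keeps reST from reporting an "unexpected unindent".
aligned = len(description) - len(description.lstrip()) == bullet.index("L{")
```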

0 commit comments