[SPARK-3580] add 'partitions' property to PySpark RDD
`rdd.partitions` is available in Scala and Java, primarily used for its `size` method to get the number of partitions. PySpark instead offers only a `getNumPartitions()` call and provides no access to `partitions`.

This change adds a `partitions` property to PySpark's RDD, allowing `len(rdd.partitions)` to return the number of partitions in a way familiar to Python developers.
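A minimal, self-contained sketch of the idea (not the actual PySpark implementation; the `Partition` stand-in and internals here are illustrative assumptions):

```python
class Partition:
    """Stand-in for a partition descriptor (hypothetical, for illustration)."""
    def __init__(self, index):
        self.index = index


class RDD:
    """Simplified mock of PySpark's RDD showing the proposed property."""
    def __init__(self, num_partitions):
        self._num_partitions = num_partitions

    def getNumPartitions(self):
        # Existing PySpark API: an explicit method call.
        return self._num_partitions

    @property
    def partitions(self):
        # Proposed addition: expose a list of partitions so that
        # len(rdd.partitions) works, mirroring rdd.partitions.size
        # in the Scala/Java API.
        return [Partition(i) for i in range(self._num_partitions)]


rdd = RDD(4)
print(len(rdd.partitions))        # → 4
print(rdd.getNumPartitions())     # → 4
```

With the property in place, `len(rdd.partitions)` and `rdd.getNumPartitions()` return the same value, so existing code keeps working while Python developers gain the idiomatic `len()` form.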