path: root/python/test_support/hello.txt
Commit log:
* [SPARK-17585][PYSPARK][CORE] PySpark SparkContext.addFile supports adding files recursively
  Author: Yanbo Liang   Date: 2016-09-21   Files: 1   Lines: -1/+0

  ## What changes were proposed in this pull request?
  In some cases users want to add a whole directory as a dependency. In Scala they can call
  ```SparkContext.addFile``` with the argument ```recursive=true``` to recursively add every file
  under a directory, but Python users could previously add only single files. This change brings
  the same recursive support to PySpark.

  ## How was this patch tested?
  Unit test.

  Author: Yanbo Liang <ybliang8@gmail.com>
  Closes #15140 from yanboliang/spark-17585.
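  A minimal sketch of the feature this commit describes, not the patch's own test: the directory
  name ```deps``` and the file ```deps/hello.txt``` are assumptions and must exist locally before
  running it.

  ```python
  # Minimal sketch: recursively distribute a local directory with PySpark.
  # Assumes a local directory "deps/" containing "hello.txt" (hypothetical names).
  from pyspark import SparkContext, SparkFiles

  sc = SparkContext("local", "addfile-recursive-demo")

  # With recursive=True (the capability added by SPARK-17585), every file under
  # the directory is distributed, mirroring the existing Scala behavior.
  sc.addFile("deps", recursive=True)

  # Files added this way are resolved relative to SparkFiles' root directory.
  print(SparkFiles.get("deps/hello.txt"))

  sc.stop()
  ```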
* Allow PySpark's SparkFiles to be used from driver
  Author: Josh Rosen   Date: 2013-01-23   Files: 1   Lines: -0/+1

  Fix minor documentation formatting issues.
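  A minimal sketch of what this earlier commit enables, assuming a local ```hello.txt``` in the
  current working directory (a stand-in for ```python/test_support/hello.txt```): after
  ```addFile```, ```SparkFiles.get``` can be called on the driver, not only inside executor tasks.

  ```python
  # Minimal sketch: resolve a file added via addFile directly in the driver process.
  # Assumes "hello.txt" exists in the current working directory (hypothetical setup).
  from pyspark import SparkContext, SparkFiles

  sc = SparkContext("local", "sparkfiles-driver-demo")
  sc.addFile("hello.txt")

  # Read the distributed copy from the driver itself.
  with open(SparkFiles.get("hello.txt")) as f:
      print(f.read())

  sc.stop()
  ```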