path: root/ec2/spark-ec2
Commit message | Author | Date | Files | Lines
* [SPARK-5434] [EC2] Preserve spaces in EC2 path | Nicholas Chammas | 2015-01-28 | 1 | -1/+1
    Fixes [SPARK-5434](https://issues.apache.org/jira/browse/SPARK-5434).

    Simple demonstration of the problem and the fix:

    ```
    $ spacey_path="/path/with some/spaces"
    $ dirname $spacey_path
    usage: dirname path
    $ echo $?
    1
    $ dirname "$spacey_path"
    /path/with some
    $ echo $?
    0
    ```

    Author: Nicholas Chammas <nicholas.chammas@gmail.com>

    Closes #4224 from nchammas/patch-1 and squashes the following commits:

    960711a [Nicholas Chammas] [EC2] Preserve spaces in EC2 path
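The fix amounts to quoting the path wherever the wrapper derives a directory from its own location. A minimal shell sketch of the quoted pattern, assuming the script resolves its directory with dirname (the variable name is illustrative, not taken from the patch):

```sh
#!/usr/bin/env bash
# Unquoted, dirname would receive several arguments if the checkout path
# contains spaces and fail, as demonstrated above:
#   SPARK_EC2_DIR=$(dirname $0)
# Quoted, the whole path reaches dirname as a single argument:
SPARK_EC2_DIR="$(dirname "$0")"
echo "Using spark-ec2 from: $SPARK_EC2_DIR"
```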
* [SPARK-4890] Upgrade Boto to 2.34.0; automatically download Boto from PyPI instead of packaging it | Josh Rosen | 2014-12-19 | 1 | -2/+1
    This patch upgrades `spark-ec2`'s Boto version to 2.34.0, since this is blocking several features.

    Newer versions of Boto don't work properly when they're loaded from a zipfile, since they try to read a JSON file from a path relative to the Boto library sources. Therefore, this patch also changes spark-ec2 to automatically download Boto from PyPI if it's not present in `SPARK_EC2_DIR/lib`, similar to what we do in the `sbt/sbt` script. This shouldn't be an issue for users, since they already need an internet connection to launch an EC2 cluster. By performing the download in spark_ec2.py instead of the Bash script, this should also work for Windows users. I've tested this with Python 2.6, too.

    Author: Josh Rosen <joshrosen@databricks.com>

    Closes #3737 from JoshRosen/update-boto and squashes the following commits:

    0aa43cc [Josh Rosen] Remove unused setup_standalone_cluster() method.
    f02935d [Josh Rosen] Enable Python deprecation warnings and fix one Boto warning.
    587ae89 [Josh Rosen] [SPARK-4890] Upgrade Boto to 2.34.0; automatically download Boto from PyPI instead of packaging it
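The download-if-missing pattern referred to here (the same idea the `sbt/sbt` script uses) looks roughly like the shell sketch below; the actual change implements the equivalent logic in `spark_ec2.py` so that it also works on Windows. The download URL and directory layout are assumptions for illustration, not taken from the patch.

```sh
#!/usr/bin/env bash
# Fetch-on-first-use sketch: download Boto only if it is not already present
# under the script's lib/ directory. The real patch does this in Python.
SPARK_EC2_DIR="$(dirname "$0")"
BOTO_VERSION="2.34.0"
BOTO_LIB_DIR="$SPARK_EC2_DIR/lib/boto-$BOTO_VERSION"

if [ ! -d "$BOTO_LIB_DIR" ]; then
  echo "Boto $BOTO_VERSION not found in $SPARK_EC2_DIR/lib; downloading from PyPI..."
  mkdir -p "$SPARK_EC2_DIR/lib"
  # Hypothetical download location; spark_ec2.py resolves the real one.
  curl -L -o /tmp/boto.tar.gz \
    "https://pypi.python.org/packages/source/b/boto/boto-$BOTO_VERSION.tar.gz"
  tar -xzf /tmp/boto.tar.gz -C "$SPARK_EC2_DIR/lib"
fi
```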
* [SPARK-4137] [EC2] Don't change working dir on user | Nicholas Chammas | 2014-11-05 | 1 | -2/+6
    This issue was uncovered after [this discussion](https://issues.apache.org/jira/browse/SPARK-3398?focusedCommentId=14187471&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-14187471).

    Don't change the working directory on the user. This breaks relative paths the user may pass in, e.g., for the SSH identity file:

    ```
    ./ec2/spark-ec2 -i ../my.pem
    ```

    This patch will preserve the user's current working directory and allow calls like the one above to work.

    Author: Nicholas Chammas <nicholas.chammas@gmail.com>

    Closes #2988 from nchammas/spark-ec2-cwd and squashes the following commits:

    f3850b5 [Nicholas Chammas] pep8 fix
    fbc20c7 [Nicholas Chammas] revert to old commenting style
    752f958 [Nicholas Chammas] specify deploy.generic path absolutely
    bcdf6a5 [Nicholas Chammas] fix typo
    77871a2 [Nicholas Chammas] add clarifying comment
    ce071fc [Nicholas Chammas] don't change working dir
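The underlying idea is to resolve the script's own location once and reference bundled files through that path, rather than cd-ing into the script's directory and silently rewriting the user's relative arguments. A minimal sketch of that approach, assuming the deploy.generic template directory named in the squashed commits (variable names are illustrative):

```sh
#!/usr/bin/env bash
# Resolve the script's own directory without cd-ing into it, so relative
# paths supplied by the user (e.g. -i ../my.pem) keep their meaning.
SPARK_EC2_DIR="$(dirname "$0")"

# Refer to bundled files through the resolved directory instead of cd-ing
# first; deploy.generic is the template directory mentioned above.
DEPLOY_GENERIC_DIR="$SPARK_EC2_DIR/deploy.generic"

echo "Caller's working directory is untouched: $(pwd)"
echo "Templates resolved at: $DEPLOY_GENERIC_DIR"
```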
* SPARK-2241: quote command line args in ec2 script | Ori Kremer | 2014-06-22 | 1 | -1/+1
    To preserve quoted command line args (in case options have spaces in them).

    Author: Ori Kremer <ori.kremer@gmail.com>

    Closes #1169 from orikremer/quote_cmd_line_args and squashes the following commits:

    67e2aa1 [Ori Kremer] quote command line args
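The class of bug here is forwarding arguments with an unquoted $@: the shell re-splits them on whitespace before they reach the Python launcher, whereas "$@" passes each original argument through intact. A minimal sketch of the quoted forwarding, assuming the wrapper simply hands everything to spark_ec2.py:

```sh
#!/usr/bin/env bash
# Plain $@ would re-split forwarded arguments on whitespace; "$@" preserves
# each original argument, spaces and all, on the way into spark_ec2.py.
SPARK_EC2_DIR="$(dirname "$0")"
python "$SPARK_EC2_DIR/spark_ec2.py" "$@"
```

Invoked as, say, ./ec2/spark-ec2 -i "/path/with spaces/key.pem" launch my-cluster, the quoted "$@" keeps the identity-file value as a single argument.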
* Add Apache license headers and LICENSE and NOTICE files | Matei Zaharia | 2013-07-16 | 1 | -0/+2
* Added script for launching Spark on EC2 from Mesos, to make it easier for new users to get up and running on EC2. | Matei Zaharia | 2012-06-10 | 1 | -0/+20
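Taken together, the history above describes a thin shell wrapper: it makes the bundled Python dependencies importable and hands every argument to the Python launcher that does the real EC2 work. A rough sketch of that shape, with the paths and names hedged as assumptions rather than the original file's contents:

```sh
#!/usr/bin/env bash
# Thin wrapper: expose the bundled Python libraries, then delegate the actual
# launch/management logic, and all arguments, to spark_ec2.py.
SPARK_EC2_DIR="$(dirname "$0")"
export PYTHONPATH="$SPARK_EC2_DIR/lib:$PYTHONPATH"
python "$SPARK_EC2_DIR/spark_ec2.py" "$@"
```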