From d12c0711faa3d4333513fcbbbee4868bcb784a26 Mon Sep 17 00:00:00 2001
From: Mike Jennings
Date: Tue, 16 Dec 2014 12:13:21 -0800
Subject: [SPARK-3405] add subnet-id and vpc-id options to spark_ec2.py

Based on this gist:
https://gist.github.com/amar-analytx/0b62543621e1f246c0a2

We use security group IDs instead of security group names to get around
this issue: https://github.com/boto/boto/issues/350

Author: Mike Jennings

Closes #2872 from mvj101/SPARK-3405 and squashes the following commits:

be9cb43 [Mike Jennings] `pep8 spark_ec2.py` runs cleanly.
4dc6756 [Mike Jennings] Remove duplicate comment
731d94c [Mike Jennings] Update for code review.
ad90a36 [Mike Jennings] Merge branch 'master' of https://github.com/apache/spark into SPARK-3405
1ebffa1 [Mike Jennings] Merge branch 'master' into SPARK-3405
52aaeec [Mike Jennings] [SPARK-3405] add subnet-id and vpc-id options to spark_ec2.py
---
 docs/ec2-scripts.md | 19 +++++++++++++++++++
 1 file changed, 19 insertions(+)

diff --git a/docs/ec2-scripts.md b/docs/ec2-scripts.md
index ed51d0abb3..d50f445d7e 100644
--- a/docs/ec2-scripts.md
+++ b/docs/ec2-scripts.md
@@ -94,6 +94,25 @@ another.
     permissions on your private key file, you can run `launch` with the
     `--resume` option to restart the setup process on an existing cluster.
 
+# Launching a Cluster in a VPC
+
+- Run
+    `./spark-ec2 -k <keypair> -i <key-file> -s <num-slaves> --vpc-id=<vpc-id> --subnet-id=<subnet-id> launch <cluster-name>`,
+    where `<keypair>` is the name of your EC2 key pair (the name you gave
+    it when you created it), `<key-file>` is the private key file for your
+    key pair, `<num-slaves>` is the number of slave nodes to launch (try
+    1 at first), `<vpc-id>` is the ID of the VPC to launch into,
+    `<subnet-id>` is the ID of the subnet within that VPC, and
+    `<cluster-name>` is the name to give to your cluster.
+
+    For example:
+
+    ```bash
+    export AWS_SECRET_ACCESS_KEY=AaBbCcDdEeFGgHhIiJjKkLlMmNnOoPpQqRrSsTtU
+    export AWS_ACCESS_KEY_ID=ABCDEFG1234567890123
+    ./spark-ec2 --key-pair=awskey --identity-file=awskey.pem --region=us-west-1 --zone=us-west-1a --vpc-id=vpc-a28d24c7 --subnet-id=subnet-4eb27b39 --spark-version=1.1.0 launch my-spark-cluster
+    ```
+
 # Running Applications
 
 - Go into the `ec2` directory in the release of Spark you downloaded.
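A note on the workaround the commit message cites: when a subnet is specified, the EC2 API rejects security group *names* and requires group *IDs*, which is what boto issue #350 runs into. The boto 2 sketch below illustrates that technique only; it is not code lifted from `spark_ec2.py`, and the region, AMI, instance type, and group names are hypothetical placeholders.

```python
# Minimal boto 2 sketch of launching into a VPC subnet using security
# group IDs instead of names (the workaround for boto issue #350).
# All concrete values are illustrative placeholders, not spark_ec2.py code.
import boto.ec2

conn = boto.ec2.connect_to_region("us-west-1")

vpc_id = "vpc-a28d24c7"        # example VPC ID from the doc above
subnet_id = "subnet-4eb27b39"  # example subnet ID from the doc above

# Security group names are only unique per VPC, so scope the lookup to the
# target VPC before mapping names to IDs.
groups = conn.get_all_security_groups(filters={"vpc-id": vpc_id})
wanted = {"my-spark-cluster-master", "my-spark-cluster-slaves"}  # hypothetical
group_ids = [g.id for g in groups if g.name in wanted]

# Pass security_group_ids (not security_groups): the name-based parameter
# fails once subnet_id is set, the ID-based one works inside a VPC.
reservation = conn.run_instances(
    image_id="ami-1234abcd",   # placeholder AMI
    key_name="awskey",
    instance_type="m3.large",
    subnet_id=subnet_id,
    security_group_ids=group_ids,
)
print(reservation.instances)
```

The design point is that group names can collide across VPCs while group IDs are globally unique, so resolving names to IDs once and handing the IDs to `run_instances` is unambiguous; this is the substitution the commit message describes making in `spark_ec2.py`.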