Merge pull request #76 from islinwb/deux

Update staging/spark/README.md
sebgoa 2017-08-20 14:43:05 +02:00 committed by GitHub
commit c7ac7cb65d
1 changed file with 2 additions and 2 deletions


@@ -23,7 +23,7 @@ The PySpark examples are taken from http://stackoverflow.com/questions/4114167/c
This example assumes
- You have a Kubernetes cluster installed and running.
- That you have installed the ```kubectl``` command line tool installed in your path and configured to talk to your Kubernetes cluster
- That you have the ```kubectl``` command line tool installed in your path and configured to talk to your Kubernetes cluster
- That your Kubernetes cluster is running [kube-dns](https://github.com/kubernetes/dns) or an equivalent integration.
Optionally, your Kubernetes cluster should be configured with a Loadbalancer integration (automatically configured via kube-up or GKE)
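If you want to sanity-check these prerequisites before continuing, something along the following lines should work (a sketch; the `k8s-app=kube-dns` label is an assumption and may differ if your cluster uses another DNS integration):

```
# Verify kubectl is on your path and can reach the cluster
kubectl cluster-info

# Verify the cluster DNS add-on is running (the label may vary, e.g. with CoreDNS)
kubectl get pods --namespace=kube-system -l k8s-app=kube-dns
```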
@@ -111,7 +111,7 @@ Spark Command: /usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java -cp /opt/spark-1.5
15/10/27 21:25:07 INFO Master: I have been elected leader! New state: ALIVE
```
Once the master is started, we'll want to check the Spark WebUI. In order to access the Spark WebUI, we will deploy a [specialized proxy](https://github.com/aseigneurin/spark-ui-proxy). This proxy is neccessary to access worker logs from the Spark UI.
Once the master is started, we'll want to check the Spark WebUI. In order to access the Spark WebUI, we will deploy a [specialized proxy](https://github.com/aseigneurin/spark-ui-proxy). This proxy is necessary to access worker logs from the Spark UI.
Deploy the proxy controller with [`examples/staging/spark/spark-ui-proxy-controller.yaml`](spark-ui-proxy-controller.yaml):
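Assuming you are working from the root of the examples repository, this is typically a single `kubectl create` call (the path below simply mirrors the one referenced above):

```
kubectl create -f examples/staging/spark/spark-ui-proxy-controller.yaml
```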