
Commit ea3f722

Update examples
1 parent 9f229ac commit ea3f722

14 files changed: +37 -42 lines changed

docs/cluster-management/install.md

Lines changed: 6 additions & 6 deletions
@@ -7,9 +7,9 @@ _WARNING: you are on the master branch, please refer to the docs on the branch t
 1. [Docker](https://docs.docker.com/install)
 2. [AWS credentials](aws-credentials.md)
 
-## Installation
+## Spin up a cluster
 
-See [cluster configuration](config.md) to learn how you can customize your installation and [EC2 instances](ec2-instances.md) for an overview of how to pick an appropriate EC2 instance type for your cluster.
+See [cluster configuration](config.md) to learn how you can customize your cluster and [EC2 instances](ec2-instances.md) for an overview of several EC2 instance types.
 
 <!-- CORTEX_VERSION_MINOR -->
 ```bash
@@ -30,8 +30,8 @@ Note: This will create resources in your AWS account which aren't included in th
 # clone the Cortex repository
 git clone -b master https://github.com/cortexlabs/cortex.git
 
-# navigate to the iris classifier example
-cd cortex/examples/sklearn/iris-classifier
+# navigate to the TensorFlow iris classification example
+cd cortex/examples/tensorflow/iris-classifier
 
 # deploy the model to the cluster
 cortex deploy
@@ -42,7 +42,7 @@ cortex get --watch
 # stream logs from the api
 cortex logs iris-classifier
 
-# get the API's endpoint
+# get the api's endpoint
 cortex get iris-classifier
 
 # classify a sample
@@ -58,4 +58,4 @@ curl -X POST -H "Content-Type: application/json" \
 cortex delete iris-classifier
 ```
 
-See [uninstall](uninstall.md) if you'd like to uninstall Cortex.
+See [uninstall](uninstall.md) if you'd like to spin down your cluster.
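
The request body for the "classify a sample" step is not shown in this hunk. For reference only, a minimal Python sketch of the same call is below; the endpoint comes from `cortex get iris-classifier`, and the payload field names are hypothetical placeholders rather than the example's actual schema (use the sample payload shipped with the example).

```python
# Hypothetical sketch: POST a sample to the deployed API from Python instead of curl.
# The field names below are placeholders; the real schema is defined by the example's
# sample payload file.
import requests

endpoint = "http://***.amazonaws.com/iris-classifier"  # printed by `cortex get iris-classifier`
sample = {
    "sepal_length": 5.2,  # placeholder feature names/values
    "sepal_width": 3.6,
    "petal_length": 1.4,
    "petal_width": 0.3,
}

response = requests.post(endpoint, json=sample)
print(response.status_code, response.text)
```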

examples/README.md

Lines changed: 1 addition & 1 deletion
@@ -30,7 +30,7 @@
 
 - [Image classification](pytorch/image-classifier): deploy an AlexNet model from TorchVision to classify images.
 
-- [Object Detection](pytorch/object-detection): deploy a Faster R-CNN model from TorchVision to detect objects in images.
+- [Object detection](pytorch/object-detector): deploy a Faster R-CNN model from TorchVision to detect objects in images.
 
 ## XGBoost
 
File renamed without changes.

examples/pytorch/object-detection/cortex.yaml renamed to examples/pytorch/object-detector/cortex.yaml

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 # WARNING: you are on the master branch, please refer to the examples on the branch that matches your `cortex version`
 
-- name: image-detector
+- name: object-detector
   predictor:
     type: python
     path: predictor.py

examples/pytorch/object-detection/predictor.py renamed to examples/pytorch/object-detector/predictor.py

Lines changed: 9 additions & 11 deletions
@@ -25,22 +25,20 @@ def predict(self, payload):
         threshold = float(payload["threshold"])
         image = requests.get(payload["url"]).content
         img_pil = Image.open(BytesIO(image))
-
         img_tensor = self.preprocess(img_pil)
-
         img_tensor.unsqueeze_(0)
 
         with torch.no_grad():
             pred = self.model(img_tensor)
 
-        pred_class = [self.coco_labels[i] for i in list(pred[0]["labels"].numpy())]
-        pred_boxes = [[(i[0], i[1]), (i[2], i[3])] for i in list(pred[0]["boxes"].detach().numpy())]
-        pred_score = list(pred[0]["scores"].detach().numpy())
-        pred_t = [pred_score.index(x) for x in pred_score if x > threshold]
-        if len(pred_t) == 0:
+        predicted_class = [self.coco_labels[i] for i in list(pred[0]["labels"].numpy())]
+        predicted_boxes = [[(i[0], i[1]), (i[2], i[3])] for i in list(pred[0]["boxes"].detach().numpy())]
+        predicted_score = list(pred[0]["scores"].detach().numpy())
+        predicted_t = [predicted_score.index(x) for x in predicted_score if x > threshold]
+        if len(predicted_t) == 0:
             return [], []
 
-        pred_t = pred_t[-1]
-        pred_boxes = pred_boxes[: pred_t + 1]
-        pred_class = pred_class[: pred_t + 1]
-        return pred_boxes, pred_class
+        predicted_t = predicted_t[-1]
+        predicted_boxes = predicted_boxes[: predicted_t + 1]
+        predicted_class = predicted_class[: predicted_t + 1]
+        return predicted_boxes, predicted_class
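
For readability, the renamed `predict()` body reads as follows after this change. This is a consolidated view of the lines above only; the imports (`requests`, `torch`, `PIL.Image`, `BytesIO`) and the `self.model`, `self.preprocess`, and `self.coco_labels` attributes are assumed to be set up elsewhere in `predictor.py`.

```python
def predict(self, payload):
    threshold = float(payload["threshold"])
    image = requests.get(payload["url"]).content  # fetch the image by URL
    img_pil = Image.open(BytesIO(image))
    img_tensor = self.preprocess(img_pil)
    img_tensor.unsqueeze_(0)  # add a batch dimension

    with torch.no_grad():
        pred = self.model(img_tensor)

    # map label indices to COCO class names, and collect boxes and scores
    predicted_class = [self.coco_labels[i] for i in list(pred[0]["labels"].numpy())]
    predicted_boxes = [[(i[0], i[1]), (i[2], i[3])] for i in list(pred[0]["boxes"].detach().numpy())]
    predicted_score = list(pred[0]["scores"].detach().numpy())

    # keep only detections whose score exceeds the requested threshold
    predicted_t = [predicted_score.index(x) for x in predicted_score if x > threshold]
    if len(predicted_t) == 0:
        return [], []

    predicted_t = predicted_t[-1]
    predicted_boxes = predicted_boxes[: predicted_t + 1]
    predicted_class = predicted_class[: predicted_t + 1]
    return predicted_boxes, predicted_class
```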

examples/sklearn/iris-classifier/README.md

Lines changed: 20 additions & 17 deletions
@@ -1,4 +1,4 @@
-# Deploy a scikit-learn model as a web service
+# Deploy a model as a web service
 
 _WARNING: you are on the master branch, please refer to the examples on the branch that matches your `cortex version`_
 
@@ -50,7 +50,7 @@ $ python3 trainer.py
 
 <br>
 
-## Implement a predictor
+## Implement your predictor
 
 1. Create another Python file `predictor.py`.
 2. Define a Predictor class with a constructor that loads and initializes your pickled model.
@@ -69,7 +69,6 @@ class PythonPredictor:
     def __init__(self, config):
         s3 = boto3.client("s3")
         s3.download_file(config["bucket"], config["key"], "model.pkl")
-
         self.model = pickle.load(open("model.pkl", "rb"))
 
     def predict(self, payload):
@@ -86,7 +85,7 @@
 
 <br>
 
-## Specify Python dependencies
+## Specify your Python dependencies
 
 Create a `requirements.txt` file to specify the dependencies needed by `predictor.py`. Cortex will automatically install them into your runtime once you deploy:
 
@@ -100,9 +99,9 @@ You can skip dependencies that are [pre-installed](../../../docs/deployments/pyt
 
 <br>
 
-## Configure an API
+## Configure your API
 
-Create a `cortex.yaml` file and add the configuration below. An `api` provides a runtime for inference and makes our `predictor.py` implementation available as a web service that can serve real-time predictions:
+Create a `cortex.yaml` file and add the configuration below. An `api` provides a runtime for inference and makes your `predictor.py` implementation available as a web service that can serve real-time predictions:
 
 ```yaml
 # cortex.yaml
@@ -120,7 +119,7 @@ Create a `cortex.yaml` file and add the configuration below. An `api` provides a
 
 ## Deploy to AWS
 
-`cortex deploy` takes the declarative configuration from `cortex.yaml` and creates it on your Cortex cluster:
+`cortex deploy` takes the configuration from `cortex.yaml` and creates it on your cluster:
 
 ```bash
 $ cortex deploy
@@ -139,7 +138,7 @@ live 1 1 8s -
 endpoint: http://***.amazonaws.com/iris-classifier
 ```
 
-The output above indicates that one replica of the API was requested and is available to serve predictions. Cortex will automatically launch more replicas if the load increases and spin down replicas if there is unused capacity.
+The output above indicates that one replica of your API was requested and is available to serve predictions. Cortex will automatically launch more replicas if the load increases and spin down replicas if there is unused capacity.
 
 You can also stream logs from your API:
 
@@ -315,7 +314,6 @@ class PythonPredictor:
     def __init__(self, config):
         s3 = boto3.client("s3")
         s3.download_file(config["bucket"], config["key"], "model.pkl")
-
         self.model = pickle.load(open("model.pkl", "rb"))
 
     def predict(self, payload):
@@ -362,7 +360,6 @@ Next, add the `api` to `cortex.yaml`:
     cpu: 0.2
     mem: 100M
 
-
 - name: batch-iris-classifier
   predictor:
     type: python
375372
mem: 100M
376373
```
377374

378-
Run `cortex deploy` to create the batch API:
375+
Run `cortex deploy` to create your batch API:
379376

380377
```bash
381378
$ cortex deploy
382379
383380
creating batch-iris-classifier api
384381
```
385382

386-
`cortex get` should show all three APIs now:
383+
`cortex get` should show all 3 APIs now:
387384

388385
```bash
389386
$ cortex get --watch
390387
391-
api status up-to-date requested last update
388+
api status up-to-date requested last update
392389
iris-classifier live 1 1 10m
393390
another-iris-classifier live 1 1 5m
394391
batch-iris-classifier live 1 1 8s
@@ -427,16 +424,22 @@ $ curl http://***.amazonaws.com/batch-iris-classifier \
 
 <br>
 
-## Clean up
+## Cleanup
 
-Run `cortex delete` to spin down your API:
+Run `cortex delete` to delete each API:
 
 ```bash
 $ cortex delete iris-classifier
 
 deleting iris-classifier api
+
+$ cortex delete another-iris-classifier
+
+deleting another-iris-classifier api
+
+$ cortex delete batch-iris-classifier
+
+deleting batch-iris-classifier api
 ```
 
 Running `cortex delete` will free up cluster resources and allow Cortex to scale down to the minimum number of instances you specified during cluster installation. It will not spin down your cluster.
-
-Any questions? [chat with us](https://gitter.im/cortexlabs/cortex).
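
Pulling the pieces of this README together, a minimal sketch of the `predictor.py` shape it walks through is below. The `__init__` body matches the hunks above; the `predict()` body and the feature names are illustrative assumptions, not the example's actual implementation.

```python
# Minimal sketch of the predictor interface described in this README.
# __init__ mirrors the diff above; predict() is illustrative only — feature names,
# ordering, and the returned label format are assumptions.
import pickle

import boto3


class PythonPredictor:
    def __init__(self, config):
        s3 = boto3.client("s3")
        s3.download_file(config["bucket"], config["key"], "model.pkl")
        self.model = pickle.load(open("model.pkl", "rb"))

    def predict(self, payload):
        measurements = [
            payload["sepal_length"],  # placeholder field names
            payload["sepal_width"],
            payload["petal_length"],
            payload["petal_width"],
        ]
        label_id = self.model.predict([measurements])[0]
        return int(label_id)
```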

examples/sklearn/iris-classifier/batch-predictor.py

Lines changed: 0 additions & 1 deletion
@@ -10,7 +10,6 @@ class PythonPredictor:
     def __init__(self, config):
         s3 = boto3.client("s3")
         s3.download_file(config["bucket"], config["key"], "model.pkl")
-
         self.model = pickle.load(open("model.pkl", "rb"))
 
     def predict(self, payload):

0 commit comments
