
Commit bbd5642

ospillinger and deliahu authored and committed
Update docs (#617)
(cherry picked from commit b6bbcd2)
1 parent d25f33f

File tree

20 files changed: +88 −72 lines changed


README.md

Lines changed: 35 additions & 19 deletions
@@ -11,25 +11,35 @@ Cortex is an open source platform for deploying machine learning models—traine
## Key features

* **Autoscaling:** Cortex automatically scales APIs to handle production workloads.
* **Multi framework:** Cortex supports TensorFlow, PyTorch, scikit-learn, XGBoost, and more.
* **CPU / GPU support:** Cortex can run inference on CPU or GPU infrastructure.
* **Spot instances:** Cortex supports EC2 spot instances.
* **Rolling updates:** Cortex updates deployed APIs without any downtime.
* **Log streaming:** Cortex streams logs from deployed models to your CLI.
* **Prediction monitoring:** Cortex monitors network metrics and tracks predictions.
* **Minimal configuration:** Deployments are defined in a single `cortex.yaml` file.

<br>
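For illustration, the single-file configuration mentioned above might look like the following sketch. The key names (`kind`, `name`, `predictor`, `compute`) are illustrative assumptions, not taken from this commit; consult the Cortex API configuration docs for the actual schema.

```yaml
# Hypothetical cortex.yaml sketch -- key names are assumptions,
# not the documented schema.
- kind: deployment
  name: iris

- kind: api
  name: classifier
  predictor:
    path: predictor.py
  compute:
    cpu: 1
```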
## Spinning up a Cortex cluster

Cortex is designed to be self-hosted on any AWS account. You can spin up a Cortex cluster with a single command:

```bash
$ cortex cluster up

aws region: us-west-2
aws instance type: p2.xlarge
min instances: 0
max instances: 10
spot instances: yes

○ spinning up your cluster ...
your cluster is ready!
```

## Deploying a model
### Implement your predictor
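A predictor could be sketched as a Python module exposing a `predict` function. The `predict(sample, metadata)` signature and the trivial rule-based "model" below are illustrative assumptions based on Cortex 0.11-era examples, not content from this diff.

```python
# Hypothetical predictor sketch: a Cortex-style predictor exposes a
# predict() function that receives a JSON-decoded sample and returns
# a prediction. The signature is an assumption, not the documented API.

labels = ["setosa", "versicolor", "virginica"]

def predict(sample, metadata=None):
    # Trivial stand-in for a real model: classify an iris flower
    # from its petal length alone.
    petal_length = sample["petal_length"]
    if petal_length < 2.5:
        return labels[0]
    elif petal_length < 4.9:
        return labels[1]
    return labels[2]
```

A real predictor would load a trained model once at import time and call it inside `predict`.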

@@ -94,17 +104,23 @@ negative 4
<br>

## What is Cortex an alternative to?

Cortex is an open source alternative to serving models with SageMaker, or to building your own model deployment platform on top of AWS services like Elastic Kubernetes Service (EKS), Elastic Container Service (ECS), Lambda, Fargate, and Elastic Compute Cloud (EC2), or open source projects like Docker, Kubernetes, and TensorFlow Serving.

<br>

## How does Cortex work?

The CLI sends configuration and code to the cluster every time you run `cortex deploy`. Each model is loaded into a Docker container, along with any Python packages and request handling code. The model is exposed as a web service using Elastic Load Balancing (ELB), TensorFlow Serving, and ONNX Runtime. The containers are orchestrated on Elastic Kubernetes Service (EKS) while logs and metrics are streamed to CloudWatch.

<br>

## Examples of Cortex deployments

<!-- CORTEX_VERSION_README_MINOR x5 -->
* [Sentiment analysis](https://github.com/cortexlabs/cortex/tree/0.11/examples/tensorflow/sentiment-analyzer): deploy a BERT model for sentiment analysis.
* [Image classification](https://github.com/cortexlabs/cortex/tree/0.11/examples/tensorflow/image-classifier): deploy an Inception model to classify images.
* [Search completion](https://github.com/cortexlabs/cortex/tree/0.11/examples/tensorflow/search-completer): deploy Facebook's RoBERTa model to complete search terms.
* [Text generation](https://github.com/cortexlabs/cortex/tree/0.11/examples/pytorch/text-generator): deploy Hugging Face's DistilGPT2 model to generate text.
* [Iris classification](https://github.com/cortexlabs/cortex/tree/0.11/examples/sklearn/iris-classifier): deploy a scikit-learn model to classify iris flowers.
File renamed without changes.
File renamed without changes.
File renamed without changes.

docs/cluster/install.md renamed to docs/cluster-management/install.md

Lines changed: 2 additions & 2 deletions
@@ -42,8 +42,8 @@ cortex get classifier
# classify a sample
curl -X POST -H "Content-Type: application/json" \
  -d '{ "sepal_length": 5.2, "sepal_width": 3.6, "petal_length": 1.4, "petal_width": 0.3 }' \
  <API endpoint>
```
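Equivalently, the sample above can be built and sent from Python. This is a sketch: the endpoint placeholder must be replaced with the URL returned by `cortex get classifier`, and the commented `requests` call is one common way to issue the POST, not something prescribed by this diff.

```python
import json

# Sample payload matching the curl example above.
payload = {
    "sepal_length": 5.2,
    "sepal_width": 3.6,
    "petal_length": 1.4,
    "petal_width": 0.3,
}

# Serialize to the JSON body the API expects.
body = json.dumps(payload)

# To send it, POST to your API endpoint (placeholder kept as-is):
# import requests
# response = requests.post("<API endpoint>", data=body,
#                          headers={"Content-Type": "application/json"})
# print(response.text)
```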
## Cleanup
File renamed without changes.
File renamed without changes.
File renamed without changes.

docs/development.md renamed to docs/contributing/development.md

Lines changed: 17 additions & 17 deletions
@@ -1,13 +1,13 @@
# Development

## Prerequisites

1. Go (>=1.12.9)
2. Docker
3. eksctl
4. kubectl

## Cortex dev environment

Clone the project:

@@ -109,7 +109,7 @@ make cli # The binary will be placed in path/to/cortex/bin/cortex
path/to/cortex/bin/cortex configure
```

### Cortex cluster

Start Cortex:

@@ -123,35 +123,35 @@ Tear down the Cortex cluster:
make cortex-down
```

### Deploy an example

```bash
cd examples/iris-classifier
path/to/cortex/bin/cortex deploy
```

## Off-cluster operator

If you're making changes to the operator and want faster iterations, you can run an off-cluster operator.

1. `make operator-stop` to stop the in-cluster operator
2. `make devstart` to run the off-cluster operator (which rebuilds the CLI and restarts the operator when files change)
3. `path/to/cortex/bin/cortex configure` (in a separate terminal) to configure your Cortex CLI to use the off-cluster operator. When prompted for the operator URL, use `http://localhost:8888`

Note: `make cortex-up-dev` will start Cortex without installing the operator.

If you want to switch back to the in-cluster operator:

1. `<ctrl+C>` to stop your off-cluster operator
2. `make operator-start` to install the operator in your cluster
3. `path/to/cortex/bin/cortex configure` to configure your Cortex CLI to use the in-cluster operator. When prompted for the operator URL, use the URL shown when running `make cortex-info`

## Dev workflow

1. `make cortex-up-dev`
2. `make devstart`
3. Make changes
4. `make registry-dev`
5. Test your changes with projects in `examples` or your own

See `Makefile` for additional dev commands.
File renamed without changes.
