## TensorFlow

Export your trained model and zip the model directory. An example is shown below (here is the [complete example](https://github.com/cortexlabs/cortex/blob/master/examples/iris/models/tensorflow_model.py)):
```Python
import tensorflow as tf
import shutil
import os

...

classifier = tf.estimator.Estimator(
    model_fn=my_model, model_dir="iris", params={"hidden_units": [10, 10], "n_classes": 3}
)

exporter = tf.estimator.FinalExporter("estimator", serving_input_fn, as_text=False)
train_spec = tf.estimator.TrainSpec(train_input_fn, max_steps=1000)
eval_spec = tf.estimator.EvalSpec(eval_input_fn, exporters=[exporter], name="estimator-eval")

tf.estimator.train_and_evaluate(classifier, train_spec, eval_spec)

# zip the estimator export dir (the exported path looks like iris/export/estimator/1562353043/);
# naming the archive "model" yields model.zip, matching the upload command below
shutil.make_archive("model", "zip", os.path.join("iris", "export", "estimator"))
```
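
The `shutil.make_archive` call above zips the *contents* of the directory passed as `root_dir`, so the timestamped version directory ends up at the top level of the archive. A quick stdlib-only sketch illustrates this (the throwaway directory layout here is a stand-in for the real export path):

```python
import os
import shutil
import tempfile
import zipfile

# Build a throwaway directory that mimics the exported estimator layout:
# a timestamped version directory containing saved_model.pb.
workdir = tempfile.mkdtemp()
export_dir = os.path.join(workdir, "export", "estimator", "1562353043")
os.makedirs(export_dir)
open(os.path.join(export_dir, "saved_model.pb"), "wb").close()

# make_archive zips the contents of root_dir, so the timestamped
# directory sits at the top level of the resulting model.zip.
archive = shutil.make_archive(
    os.path.join(workdir, "model"), "zip",
    os.path.join(workdir, "export", "estimator"),
)

with zipfile.ZipFile(archive) as zf:
    names = zf.namelist()
print(archive, names)
```

This is why you zip `export/estimator` rather than the timestamped directory itself: serving infrastructure typically expects the version directory at the root of the archive.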

Upload the zipped file to Amazon S3 using the AWS web console or CLI:

```text
$ aws s3 cp model.zip s3://my-bucket/model.zip
```

Reference your model in an `api`:

```yaml
- kind: api
  ...
```

## ONNX

```Python
...

with open("sklearn.onnx", "wb") as f:
    f.write(onnx_model.SerializeToString())
```
Here are complete examples of converting models from some of the common ML frameworks to ONNX:

* [PyTorch](https://github.com/cortexlabs/cortex/blob/master/examples/iris/models/pytorch_model.py)
* [Sklearn](https://github.com/cortexlabs/cortex/blob/master/examples/iris/models/sklearn_model.py)
* [XGBoost](https://github.com/cortexlabs/cortex/blob/master/examples/iris/models/xgboost_model.py)
* [Keras](https://github.com/cortexlabs/cortex/blob/master/examples/iris/models/keras_model.py)

Upload your trained model in ONNX format to Amazon S3 using the AWS web console or CLI:

```text
$ aws s3 cp model.onnx s3://my-bucket/model.onnx
```

Reference your model in an `api`:

```yaml
- kind: api
  ...
```