# OpenVINO model optimizer pipeline

This is an example of a one-step pipeline that runs model optimization with the OpenVINO toolkit.

It performs graph optimization and generates the Intermediate Representation (IR) model format, which can later be used by the Inference Engine.

Learn more about the OpenVINO Model Optimizer.

Note: executing this pipeline requires building the Docker image according to the guidelines in the OpenVINO model converter doc. The name of the image pushed to the Docker registry should be configured in the pipeline script `convert_model_pipeline.py`.
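For orientation, below is a minimal sketch of what such a one-step pipeline can look like with the Kubeflow Pipelines DSL. It is not the actual `convert_model_pipeline.py`; the image name, component command, and argument names are placeholders that you would replace with the image you built and pushed to your own registry.

```python
# Hypothetical sketch of a one-step model conversion pipeline.
# The image name, command, and argument names below are placeholders,
# not the actual values used in convert_model_pipeline.py.
import kfp.dsl as dsl


@dsl.pipeline(
    name='OpenVINO model optimizer',
    description='Converts a TF model to OpenVINO Intermediate Representation'
)
def openvino_convert_pipeline(
        input_path='gs://<bucket>/model/1/saved_model.pb',
        mo_options='--saved_model_dir .',
        output_path='gs://<bucket>/model/1'):
    # Single pipeline step: run the model converter container with the
    # pipeline parameters passed through as command-line arguments.
    dsl.ContainerOp(
        name='convert-model',
        image='<your-registry>/openvino-model-converter:latest',  # image built per the doc above
        command=['convert_model.py'],  # hypothetical entrypoint
        arguments=[
            '--input_path', input_path,
            '--mo_options', mo_options,
            '--output_path', output_path,
        ],
    )
```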

Examples of the parameters (a sketch showing how to submit these values follows the list):

- `input_path` - `gs://tensorflow_model_path/resnet/1/saved_model.pb`
- `mo_options` - `--saved_model_dir .`
- `output_path` - `gs://tensorflow_model_path/resnet/1`
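As a rough illustration, the compiled pipeline can be submitted with these values through the Kubeflow Pipelines SDK. The pipeline function name below is hypothetical, and the call assumes a reasonably recent `kfp` client; adjust both to match your setup.

```python
import kfp
import kfp.compiler as compiler

# Hypothetical pipeline function name; use whatever convert_model_pipeline.py defines.
from convert_model_pipeline import openvino_convert_pipeline

# Compile the pipeline into a package that can be uploaded or submitted directly.
compiler.Compiler().compile(openvino_convert_pipeline, 'convert_model_pipeline.tar.gz')

# Submit a run with the example parameter values above.
client = kfp.Client()
client.create_run_from_pipeline_package(
    'convert_model_pipeline.tar.gz',
    arguments={
        'input_path': 'gs://tensorflow_model_path/resnet/1/saved_model.pb',
        'mo_options': '--saved_model_dir .',
        'output_path': 'gs://tensorflow_model_path/resnet/1',
    },
)
```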

All model optimizer options accepted by the component are described in the component doc.

The model conversion component copies the content of the input path to the current directory in the container. The input path can point to a single file or to a complete folder. In the model optimizer options, reference files using paths relative to the input path folder; this way you can also pass any configuration file needed by the model optimizer.
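For example, the run arguments could look like the following when the input path is a folder; the bucket, file names, and model optimizer flags here are purely illustrative.

```python
# Illustrative run arguments (hypothetical bucket and file names).
# input_path points to a folder, so everything in it is copied into the
# container; mo_options then refers to those files by relative path.
arguments = {
    'input_path': 'gs://my_bucket/my_model/',
    'mo_options': '--input_model frozen_graph.pb --input_shape [1,224,224,3]',
    'output_path': 'gs://my_bucket/my_model/IR',
}
```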