Using Model Optimizer for TensorFlow Slim models

I am aiming to run inference on a TensorFlow Slim model with the Intel OpenVINO Model Optimizer. I am using the OpenVINO docs and slides for inference, and the TF-Slim docs for training the model.



It's a multi-class classification problem. I have trained a TF-Slim mobilenet_v2 model from scratch (using the script train_image_classifier.py). Evaluation of the trained model on the test set gives relatively good results to begin with (using the script eval_image_classifier.py):



eval/Accuracy[0.8017] eval/Recall_5[0.9993]



However, a single .ckpt file is not saved (even though at the end of the train_image_classifier.py run there is a message like "model.ckpt is saved to checkpoint_dir"); instead there are 3 files: .ckpt-180000.data-00000-of-00001, .ckpt-180000.index, and .ckpt-180000.meta.



The OpenVINO Model Optimizer requires a single checkpoint file.



According to the docs, I call mo_tf.py with the following parameters:



python mo_tf.py --input_model D:/model/mobilenet_v2_224.pb --input_checkpoint D:/model/model.ckpt-180000 -b 1


It gives this error (the same if I pass --input_checkpoint D:/model/model.ckpt):



[ ERROR ]  The value for command line parameter "input_checkpoint" must be existing file/directory,  but "D:/model/model.ckpt-180000" does not exist.


The error message is clear: there are no such files on disk. But as far as I know, most TF utilities resolve the full checkpoint (the .meta/.index/.data files) from the .ckpt-???? prefix under the hood.
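For example, this is how I would expect the prefix to be resolved (a minimal TF 1.x sketch; the paths are mine):

import tensorflow as tf

# The prefix "model.ckpt-180000" addresses the .index/.meta/.data triplet at once;
# no standalone .ckpt file is ever written in the TF 1.x V2 checkpoint format.
reader = tf.train.NewCheckpointReader('D:/model/model.ckpt-180000')
print(reader.get_variable_to_shape_map())  # lists the saved variable names and shapes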



Trying to call:



python mo_tf.py --input_model D:/model/mobilenet_v2_224.pb --input_meta_graph D:/model/model.ckpt-180000.meta -b 1


Causes:



[ ERROR ]  Unknown configuration of input model parameters


It doesn't matter to me how I transfer the graph to the OpenVINO intermediate representation; I just need to reach that result.



Thanks a lot.



EDIT



I managed to run the OpenVINO Model Optimizer on a frozen graph of the TF-Slim model. However, I still have no idea why my previous attempts (based on the docs) failed.
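For reference, a sketch of the frozen-graph route (paths are mine; the output node name MobilenetV2/Predictions/Reshape_1 is an assumption based on the standard TF-Slim MobilenetV2 export, and the dataset-specific flags of export_inference_graph.py are omitted):

python export_inference_graph.py --model_name=mobilenet_v2 --image_size=224 --output_file=D:/model/mobilenet_v2_224.pb
python freeze_graph.py --input_graph=D:/model/mobilenet_v2_224.pb --input_binary=true --input_checkpoint=D:/model/model.ckpt-180000 --output_node_names=MobilenetV2/Predictions/Reshape_1 --output_graph=D:/model/frozen.pb
python mo_tf.py --input_model D:/model/frozen.pb -b 1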










      tensorflow tf-slim tensorflow-slim inference-engine openvino






asked Jan 3 at 8:28 by f4f, edited Jan 4 at 8:55
























1 Answer














You can try converting the model to frozen format (.pb) and then converting the model using OpenVINO.

The .ckpt .meta file holds the metagraph: the computation graph structure without variable values (the one you can observe in TensorBoard).

The .ckpt .data file holds the variable values, without the graph skeleton or structure. To restore a model we need both the meta and data files.

A frozen .pb file saves the whole graph (meta + data).

As per the documentation of OpenVINO:

When a network is defined in Python* code, you have to create an inference graph file. Usually, graphs are built in a form that allows model training. That means that all trainable parameters are represented as variables in the graph. To use the graph with the Model Optimizer, it should be frozen.
https://software.intel.com/en-us/articles/OpenVINO-Using-TensorFlow

OpenVINO optimizes the model by converting the graph, together with its weights, passed in frozen form.
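As an illustration, freezing can be done directly in Python with the TF 1.x API (a minimal sketch; the output node name for MobilenetV2 is an assumption and should be verified, e.g. in TensorBoard, and the paths are placeholders):

import tensorflow as tf
from tensorflow.python.framework import graph_util

# Rebuild the graph from the .meta file and restore weights from the checkpoint prefix
saver = tf.train.import_meta_graph('D:/model/model.ckpt-180000.meta')
with tf.Session() as sess:
    saver.restore(sess, 'D:/model/model.ckpt-180000')
    # Bake the variable values into the graph as constants
    frozen = graph_util.convert_variables_to_constants(
        sess, sess.graph.as_graph_def(), ['MobilenetV2/Predictions/Reshape_1'])
with tf.gfile.GFile('D:/model/frozen.pb', 'wb') as f:
    f.write(frozen.SerializeToString())

The resulting frozen.pb can then be passed to mo_tf.py via --input_model.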






answered Jan 16 at 3:58 by Ansif_Muhammed - intel, edited Mar 15 at 5:44 by ChileAddict

• Yes, I managed to run the Model Optimizer on a frozen model, just as I mentioned in the edit section of my question. Now I am only confused about why OpenVINO's tutorial steps didn't succeed. – f4f, Jan 17 at 11:48












