Freezing fine-tuned graph for TensorFlowSharp with TF 1.4
I've fine-tuned a model (using TF 1.9) from the Object Detection Model Zoo, and now I'm trying to freeze the graph for TensorFlowSharp, also using TF 1.9.



import tensorflow as tf
import os
from tensorflow.python.tools import freeze_graph
from tensorflow.core.protobuf import saver_pb2

# print("current tensorflow version: ", tf.__version__)
sess = tf.Session()

model_path = 'latest_cp/'
saver = tf.train.import_meta_graph('model.ckpt.meta')
saver.restore(sess, tf.train.latest_checkpoint('.'))  # current dir of the checkpoint file
tf.train.write_graph(sess.graph_def, '.', 'test.pbtxt')  # output in pbtxt format

freeze_graph.freeze_graph(input_graph='test.pbtxt',
                          input_binary=False,
                          input_checkpoint=model_path + 'model.ckpt',
                          output_node_names="num_detections,detection_boxes,detection_scores,detection_classes",
                          output_graph='test.bytes',
                          clear_devices=True, initializer_nodes="", input_saver="",
                          restore_op_name="save/restore_all", filename_tensor_name="save/Const:0")


Freezing worked, but after I imported the graph into Unity it returned the following error:



TFException: Op type not registered 'NonMaxSuppressionV3' in binary running on AK38713. Make sure the Op and Kernel are registered in the binary running in this process.



I found out that TensorFlowSharp works with TensorFlow 1.4, but when I tried to freeze the graph with 1.4 it returned the same NonMaxSuppressionV3 error.
Do you know any way to solve this issue? Thank you so much for the support.
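
In case it helps with diagnosing this, here is a rough way to check which op types the frozen graph actually contains (so you can see everything the TensorFlowSharp binary would need to support, not just NonMaxSuppressionV3). This is only a diagnostic sketch that assumes the output file is the test.bytes produced above:

import tensorflow as tf

# Load the frozen GraphDef written by freeze_graph above.
graph_def = tf.GraphDef()
with tf.gfile.GFile('test.bytes', 'rb') as f:
    graph_def.ParseFromString(f.read())

# Print the distinct op types; anything not registered in the binary that
# loads the graph (e.g. NonMaxSuppressionV3 in TensorFlowSharp) shows up here.
for op_type in sorted({node.op for node in graph_def.node}):
    print(op_type)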










python unity3d tensorflow object-detection tensorflowsharp

asked Nov 19 '18 at 15:39 by o.O

  • "when I tried to freeze graph with 1.4" you mean you installed TF 1.4 in Python and tried to export it? That should work, but you need to create the model with TF 1.4, not just restore it and export it. You don't necessarily have to retrain, though, you could run the original code to create the graph in 1.4, restore only the variables (not the whole metagraph) and export it.
    – jdehesa
    Nov 19 '18 at 17:13










  • If that is absolutely not an option (e.g. no code available), it could technically be possible to replace the NonMaxSuppressionV3 ops with an earlier version. It seems NonMaxSuppressionV2 has been there since 1.3.0, and it looks like it has the same interface. However, it's a bit of a messy path, and there could still be other incompatible ops in the graph.
    – jdehesa
    Nov 19 '18 at 17:13
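
A minimal sketch of the first suggestion, assuming the original graph-construction code is available: run it under TF 1.4, restore only the variable values from the existing checkpoint, and freeze from there. Here build_model is a hypothetical stand-in for whatever code defines the detection graph; it must recreate the same variable names as the checkpoint, and the output node names are assumed to be the same as above.

import tensorflow as tf
from tensorflow.python.tools import freeze_graph

# Hypothetical: build_model() stands in for the code that originally defines
# the detection graph (same variable names as the checkpoint).
from my_model import build_model

with tf.Graph().as_default() as graph:
    build_model()                       # graph is now made of TF 1.4 ops only
    saver = tf.train.Saver()            # also adds the save/restore nodes freeze_graph expects
    with tf.Session(graph=graph) as sess:
        # Restore just the variable values from the TF 1.9 checkpoint.
        saver.restore(sess, tf.train.latest_checkpoint('latest_cp/'))
        tf.train.write_graph(sess.graph_def, '.', 'graph_14.pbtxt')
        saver.save(sess, 'latest_cp/model_14.ckpt')  # re-save the variables under TF 1.4

freeze_graph.freeze_graph(input_graph='graph_14.pbtxt',
                          input_binary=False,
                          input_checkpoint='latest_cp/model_14.ckpt',
                          output_node_names="num_detections,detection_boxes,detection_scores,detection_classes",
                          output_graph='frozen_14.bytes',
                          clear_devices=True, initializer_nodes="", input_saver="",
                          restore_op_name="save/restore_all", filename_tensor_name="save/Const:0")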

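If rebuilding under 1.4 is really not an option, here is a rough sketch of the op-replacement idea from the second comment: load the frozen GraphDef, downgrade every NonMaxSuppressionV3 node to NonMaxSuppressionV2, and write it back out. Note that V3 takes an extra score_threshold input that V2 does not, so this drops that input (the op then no longer filters by score), and other 1.9-only ops may still remain in the graph.

import tensorflow as tf

# Load the frozen graph produced with TF 1.9.
graph_def = tf.GraphDef()
with tf.gfile.GFile('test.bytes', 'rb') as f:
    graph_def.ParseFromString(f.read())

for node in graph_def.node:
    if node.op == 'NonMaxSuppressionV3':
        node.op = 'NonMaxSuppressionV2'
        # NonMaxSuppressionV3 takes a 5th input (score_threshold) that V2 does
        # not accept, so keep only the first four inputs.
        inputs = list(node.input)[:4]
        del node.input[:]
        node.input.extend(inputs)

# Write the rewritten graph; point TensorFlowSharp at this file instead.
with tf.gfile.GFile('test_v2.bytes', 'wb') as f:
    f.write(graph_def.SerializeToString())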














