Failed Tensorflow Lite conversion: Unsupported data type in placeholder op
System information:
OS: Mac OS Mojave
TensorFlow installed from (source or binary): pip install tensorflow
TensorFlow version (or github SHA if from source): 1.12
I am trying to convert a simple convolutional TensorFlow model to TensorFlow Lite. I already have the model in SavedModel format, but when I run the converter utility on it, I get:
RuntimeError: TOCO failed see console for info.
b"2018-12-30 15:40:54.449737: I tensorflow/contrib/lite/toco/import_tensorflow.cc:189] Unsupported data type in placeholder op: 2\n2018-12-30 15:40:54.450020: F tensorflow/contrib/lite/toco/import_tensorflow.cc:2137] Check failed: status.ok() Unexpected value for attribute 'T'. Expected 'DT_FLOAT'\n"
To save the model, I have:

import numpy as np
import tensorflow as tf

# model is an Estimator instance
def export(model):
    model.export_saved_model("tmp/export", serving_input_receiver_fn)

and:

def serving_input_receiver_fn():
    features = {'x': tf.placeholder(shape=[1, 100, 100, 1], dtype=tf.as_dtype(np.int32))}
    return tf.estimator.export.ServingInputReceiver(features, features)
The input dtype is np.int32, so I map it to the corresponding TensorFlow dtype with tf.as_dtype in the placeholder.
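(As a quick illustrative check, not part of the original post: tf.as_dtype simply resolves a NumPy dtype to its TensorFlow equivalent.)

import numpy as np
import tensorflow as tf

# tf.as_dtype maps a NumPy dtype to the matching tf.DType.
assert tf.as_dtype(np.int32) == tf.int32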
I can attach the full model definition on request.
Thanks.
tensorflow tensorflow-lite
asked Jan 2 at 5:44 by twalk4821
1 Answer
The problem was not in the placeholder op itself but in the model definition: the model's input type was float64 (the "2" in the log is TensorFlow's DataType enum value for DT_DOUBLE). Switching the model input to float32, and setting dtype=tf.float32 in the placeholder, solved my issue.
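A minimal sketch of the corrected serving input receiver, assuming the same 1x100x100x1 input shape from the question:

import tensorflow as tf

def serving_input_receiver_fn():
    # Declare the serving input as float32 so the exported placeholder
    # carries DT_FLOAT, which is what TOCO expects here.
    features = {'x': tf.placeholder(shape=[1, 100, 100, 1], dtype=tf.float32)}
    return tf.estimator.export.ServingInputReceiver(features, features)

As noted above, the model definition itself also has to use float32; changing only the placeholder is not enough.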
answered Jan 6 at 4:15 by twalk4821