Friday, July 20, 2018

Error Trying to Convert TensorFlow Saved Model to TensorFlow.js Model


I have successfully trained a DNNClassifier to classify texts (posts from an online discussion board). I've created and saved my model using this code:

import tensorflow as tf
import tensorflow_hub as hub

# Text feature column backed by a German NNLM embedding module from TF Hub.
embedded_text_feature_column = hub.text_embedding_column(
    key="sentence",
    module_spec="https://tfhub.dev/google/nnlm-de-dim128/1")

feature_columns = [embedded_text_feature_column]

# Two-class DNN classifier on top of the embedding.
estimator = tf.estimator.DNNClassifier(
    hidden_units=[500, 100],
    feature_columns=feature_columns,
    n_classes=2,
    optimizer=tf.train.AdagradOptimizer(learning_rate=0.003))

# Export the trained estimator as a SavedModel.
feature_spec = tf.feature_column.make_parse_example_spec(feature_columns)
serving_input_receiver_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(feature_spec)
estimator.export_savedmodel(export_dir_base="/my/dir/base",
                            serving_input_receiver_fn=serving_input_receiver_fn)
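To double-check what the export actually contains before converting, it can help to print the output tensors of the SavedModel's serving signature, since those are the names that have to line up with --output_node_names. The following is a minimal inspection sketch assuming the TensorFlow 1.x API; the timestamped export directory is a placeholder for the folder that export_savedmodel creates under /my/dir/base.

# Minimal sketch (TF 1.x): load the exported SavedModel and list the
# output tensor names of the default serving signature.
import tensorflow as tf

export_dir = "/my/dir/base/1234567890"  # hypothetical timestamped export folder

with tf.Session(graph=tf.Graph()) as sess:
    meta_graph = tf.saved_model.loader.load(sess, ["serve"], export_dir)
    signature = meta_graph.signature_def["serving_default"]
    for output_name, tensor_info in signature.outputs.items():
        print(output_name, "->", tensor_info.name)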

Now I want to convert the saved model with the tfjs-converter so that I can use it with TensorFlow.js, the JavaScript version of TensorFlow.

When I issue the following command:

tensorflowjs_converter --input_format=tf_saved_model --output_node_names='dnn/head/predictions/str_classes,dnn/head/predictions/probabilities' --saved_model_tags=serve /my/dir/base /my/export/dir 

…I get this error message:

ValueError: Node 'dnn/input_from_feature_columns/input_layer/sentence_hub_module_embedding/module_apply_default/embedding_lookup_sparse/embedding_lookup' expects to be colocated with unknown node 'dnn/input_from_feature_columns/input_layer/sentence_hub_module_embedding'

I assume I'm doing something wrong when saving the model.

What is the correct way to save an estimator model so that it can be converted with tfjs-converter?

The source code of my project can be found on GitHub.
