Loading a trained TensorFlow Estimator model for evaluation

Say I have set up an Estimator:

estimator = tf.contrib.learn.Estimator(
  model_fn=model_fn,
  model_dir=MODEL_DIR,
  config=some_config)

And trained it on some data:

estimator.fit(input_fn=input_fn_train, steps=None)

The idea is that the trained model is saved under MODEL_DIR. This folder contains a checkpoint and several .meta and .index files.
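
As a side note, the variables stored in those checkpoint files can be inspected without knowing model_fn. A minimal sketch, assuming MODEL_DIR is the same directory as above:

import tensorflow as tf

# Find the newest checkpoint in MODEL_DIR and list the variables it stores.
ckpt = tf.train.latest_checkpoint(MODEL_DIR)
for name, shape in tf.train.list_variables(ckpt):
  print(name, shape)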

This works great. Later, I want to make some predictions by rebuilding the estimator with the same functions:

estimator = tf.contrib.learn.Estimator(
  model_fn=model_fn,
  model_dir=MODEL_DIR,
  config=some_config)

predictions = estimator.predict(input_fn=input_fn_test)
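
For reference, in the contrib.learn API the call above returns an iterable of per-example results by default (controlled by as_iterable), so consuming it might look like this:

# Iterate over the generator to get one prediction per input example.
for i, pred in enumerate(predictions):
  print(i, pred)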

This solution works fine, but there is one big drawback: you need to know model_fn, i.e. the model definition in Python code. If I change the model, for example by adding a dense layer, this model_fn no longer matches the data stored in MODEL_DIR, and restoring fails:

NotFoundError (see above for traceback): Key xxxx/dense/kernel not found in checkpoint

Is there a way to load the model and run predictions directly from MODEL_DIR, without having to define model_fn in Python?
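
One direction I have seen mentioned (not verified in this exact setup) is to export a SavedModel while model_fn is still available, and later reload it with tf.contrib.predictor, which needs only the export directory. A rough sketch, assuming the newer tf.estimator.Estimator API and a made-up single feature 'x':

import tensorflow as tf

# Assumed input signature: one float feature 'x' with 4 values per example.
# This is a placeholder spec, not the real one from my model_fn.
features = {'x': tf.placeholder(tf.float32, shape=[None, 4], name='x')}
serving_input_fn = tf.estimator.export.build_raw_serving_input_receiver_fn(features)

# One-time export while model_fn is still known; writes a self-contained SavedModel.
export_dir = estimator.export_savedmodel('export_base', serving_input_fn)

# Later: prediction needs only the exported directory, not model_fn.
predict_fn = tf.contrib.predictor.from_saved_model(export_dir)
print(predict_fn({'x': [[1.0, 2.0, 3.0, 4.0]]}))

But it would still be nice to do this from MODEL_DIR and the checkpoint alone.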


Source: https://habr.com/ru/post/1691989/

