I am new to TensorFlow and am currently experimenting with models of varying complexity. I have a problem with the save and restore functionality. As far as I understand the tutorials, I should be able to restore a trained graph at some later point and run it on new input. However, when I try to do this, I get the following error:
InvalidArgumentError (see above for traceback): Shape [-1,10] has negative dimensions
[[Node: Placeholder = Placeholder[dtype=DT_FLOAT, shape=[?,10], _device="/job:localhost/replica:0/task:0/cpu:0"]]]
My understanding of the message is that the restored graph does not like one dimension being left arbitrary, which, in turn, is necessary in practice, because I do not know in advance how large my input will be. A minimal code snippet that reproduces the error can be found below. I know how to restore each tensor individually (roughly the sketch after the minimal example), but that quickly becomes impractical as models grow in complexity. I am grateful for any help and apologize if my question is stupid.
import numpy as np
import tensorflow as tf
def generate_random_input():
    # 100 samples with 10 features each -> shape (100, 10)
    alist = []
    for _ in range(10):
        alist.append(np.random.uniform(-1, 1, 100))
    return np.array(alist).T
def generate_random_target():
    return np.random.uniform(-1, 1, 100)
x = tf.placeholder('float', [None, 10])
y = tf.placeholder('float')
w1 = tf.get_variable('w1', [10, 1], dtype=tf.float32, initializer=tf.contrib.layers.xavier_initializer(seed=1))
b1 = tf.get_variable('b1', [1], dtype=tf.float32, initializer=tf.contrib.layers.xavier_initializer(seed=1))
result = tf.add(tf.matmul(x, w1), b1, name='result')
loss = tf.reduce_mean(tf.losses.mean_squared_error(predictions=result, labels=y))
optimizer = tf.train.AdamOptimizer(0.03).minimize(loss)
saver = tf.train.Saver()
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # train for a single step and save the checkpoint
    sess.run([optimizer, loss], feed_dict={x: generate_random_input(), y: generate_random_target()})
    saver.save(sess, 'file_name')

# restore the saved graph in a second session and run it on new input
sess2 = tf.Session()
saver = tf.train.import_meta_graph('file_name.meta')
saver.restore(sess2, tf.train.latest_checkpoint('./'))

graph = tf.get_default_graph()
pred = graph.get_operation_by_name('result')
test_result = sess2.run(pred, feed_dict={x: generate_random_input()})
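For reference, this is roughly what I mean by restoring each tensor individually: in a fresh script I pull every tensor I need out of the restored graph by name instead of reusing the Python variables. The names 'Placeholder:0' and 'result:0' are only my assumption of what TensorFlow called the nodes in this toy graph; with a more complex model I would have to look up and wire every placeholder and output by hand, which is exactly what I want to avoid.

import numpy as np
import tensorflow as tf

with tf.Session() as sess:
    # rebuild the graph structure from the meta file and load the weights
    saver = tf.train.import_meta_graph('file_name.meta')
    saver.restore(sess, tf.train.latest_checkpoint('./'))

    graph = tf.get_default_graph()
    # fetch each tensor by name instead of reusing the Python variables
    x_restored = graph.get_tensor_by_name('Placeholder:0')   # assumed name of x
    result_restored = graph.get_tensor_by_name('result:0')

    new_input = np.random.uniform(-1, 1, (5, 10))             # 5 new samples
    prediction = sess.run(result_restored, feed_dict={x_restored: new_input})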