Continuing to train a pre-trained TensorLayer model with a different optimizer

I want to load a pre-trained model (trained with AdadeltaOptimizer) and continue training it with SGD (GradientDescentOptimizer). The model is saved and loaded with the TensorLayer API:

Saving the model:

import tensorlayer as tl
tl.files.save_npz(network.all_params,
                  name=model_dir + "model-%d.npz" % global_step)

Loading the model:

load_params = tl.files.load_npz(path=resume_dir + '/', name=model_name)
tl.files.assign_params(sess, load_params, network)

If I continue training with Adadelta, the training loss (cross-entropy) looks fine: it starts close to the value the loaded model finished at. However, if I switch the optimizer to SGD, the training loss starts as high as for a freshly initialized model.

I looked at the model-xxx.npz file written by tl.files.save_npz. It stores only the model parameters as ndarrays; I don't see how the optimizer or the learning rate is involved.
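
You can verify this by opening the archive with NumPy. A minimal sketch; the file name is a placeholder, and the key layout may differ between TensorLayer versions:

import numpy as np

# Inspect what tl.files.save_npz actually wrote (hypothetical file name)
data = np.load('model-1000.npz', allow_pickle=True)
print(list(data.keys()))  # just the parameter arrays: no Adadelta accumulators, no learning rate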


You can restore the whole graph from a TensorFlow checkpoint/meta file, fetch the loss tensor by name, and then attach a new optimizer such as SGD to it:

saver = tf.train.import_meta_graph('filename.meta')
saver.restore(sess, tf.train.latest_checkpoint('./'))
graph = tf.get_default_graph()
cross_entropy = graph.get_tensor_by_name("entropy:0")  # tensor to reuse after restoring

optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(cross_entropy)

where entropy is the name that was given to the cross-entropy tensor when the graph was first built:

tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(labels=y_, logits=y_conv), name = 'entropy')

As far as I know (I am not completely sure), such a checkpoint stores the optimizer state together with the graph. I haven't used TensorLayer myself, so I can't say how its layers behave here, but since TensorLayer is built on top of TensorFlow, the same approach should work.
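
For this to work, the original training run has to write a TensorFlow checkpoint with tf.train.Saver rather than save_npz. A minimal sketch of the saving side, reusing the sess from the training script (the path is a placeholder):

import tensorflow as tf

# During the original Adadelta training run:
saver = tf.train.Saver()        # by default captures all variables, including optimizer slots
saver.save(sess, './filename')  # writes filename.meta, filename.index and filename.data-* files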


Look at the signature of save_npz:

save_npz(save_list=None, name='model.npz', sess=None)

save_list is the list of tensors that get saved; by default that is just the network parameters, so neither the optimizer state nor the learning rate ends up in the file.

If you want to persist the learning rate as well (for example), you can append it to save_list yourself:

# list.extend() returns None, so concatenate instead of extending in place
save_npz(network.all_params + [learning_rate])

(Note that all_params is a plain Python list, so the learning rate simply becomes its last entry; remember that position when you load the file back.)

Keep in mind that the learning rate is the only state a plain SGD optimizer carries, while adaptive optimizers such as Adadelta also maintain per-variable accumulators (e.g. moving averages of squared gradients). To resume training exactly where it stopped, those would have to be saved and restored too, which save_npz does not do on its own.
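
Following that idea, the optimizer's own variables could be pushed into the same archive. A hedged sketch, reusing network, sess and the cost tensor from the question, and assuming TF 1.x, where tf.train.Optimizer.variables() (available since TF 1.4) returns the slot variables the optimizer created:

import tensorflow as tf
import tensorlayer as tl

opt = tf.train.AdadeltaOptimizer(learning_rate=1.0)
train_op = opt.minimize(cost)  # 'cost' is the cross-entropy tensor from the model

# Save the network weights together with Adadelta's accumulators
tl.files.save_npz(network.all_params + opt.variables(),
                  name="model-with-opt.npz", sess=sess)

Restoring the extra arrays is manual, though: assign_params only targets network.all_params, so the accumulator values at the end of the list would have to be assigned back to the optimizer variables yourself.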


See the pre-trained CNN section of the TensorLayer documentation: https://tensorlayer.readthedocs.io/en/latest/user/get_start_advance.html#pre-trained-cnn

import numpy as np
import tensorlayer as tl

vgg = tl.models.vgg16(pretrained=True)
img = tl.vis.read_image('data/tiger.jpeg')
img = tl.prepro.imresize(img, (224, 224)).astype(np.float32) / 255
output = vgg(img, is_train=False)

This uses the TensorLayer 2.0 API.
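
In the 2.x API the model object persists its own weights, and the optimizer state lives in a separate tf.optimizers object, so switching optimizers cannot silently reset the weights. A sketch continuing the vgg example above, with method names as in the TensorLayer 2.x docs and a placeholder file name:

# Save and later restore the weights (format inferred from the extension)
vgg.save_weights('vgg.h5')
vgg.load_weights('vgg.h5')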
