You need to copy it into a TensorFlow variable. There is a good answer to this question on StackOverflow: Using a pre-trained word embedding (word2vec or Glove) in TensorFlow.
Here is how I did it:
import tensorflow as tf

# Non-trainable variable that will hold the pre-trained embedding matrix
embedding_weights = tf.Variable(
    tf.constant(0.0, shape=[embedding_vocab_size, EMBEDDING_DIM]),
    trainable=False,
    name="embedding_weights")

# Placeholder through which the pre-trained matrix is fed into the variable
embedding_placeholder = tf.placeholder(tf.float32,
                                       [embedding_vocab_size, EMBEDDING_DIM])
embedding_init = embedding_weights.assign(embedding_placeholder)

sess = tf.Session(config=tf.ConfigProto(log_device_placement=True))
sess.run(embedding_init, feed_dict={embedding_placeholder: embedding_matrix})
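Here, embedding_matrix, embedding_vocab_size, and EMBEDDING_DIM are assumed to exist before the snippet above runs. As a rough sketch (not part of the original answer), the matrix could be built from a GloVe text file like this; the file name and vocabulary below are hypothetical placeholders:

import numpy as np

# Hypothetical word -> index mapping; in practice this comes from your
# dataset's vocabulary.
vocab = {"the": 0, "cat": 1}
embedding_vocab_size = len(vocab)
EMBEDDING_DIM = 100
embedding_matrix = np.zeros((embedding_vocab_size, EMBEDDING_DIM),
                            dtype=np.float32)

# Each GloVe line is: word v1 v2 ... vN (assumed file name)
with open("glove.6B.100d.txt", encoding="utf-8") as f:
    for line in f:
        parts = line.rstrip().split(" ")
        word, vector = parts[0], np.asarray(parts[1:], dtype=np.float32)
        if word in vocab:
            embedding_matrix[vocab[word]] = vector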
Then you can use the embedding_weights variable to perform lookups (don't forget to keep the word-to-index mapping around).
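For example, a lookup could look like this (a minimal sketch; the word_ids placeholder and the indices fed in are my own illustration, not from the original answer):

# Sketch: fetch embedding vectors for a batch of word indices.
word_ids = tf.placeholder(tf.int32, shape=[None])
embedded = tf.nn.embedding_lookup(embedding_weights, word_ids)

# The indices must come from the same word-to-index mapping
# that was used to build embedding_matrix.
vectors = sess.run(embedded, feed_dict={word_ids: [0, 1]})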
Update: using a variable is not required, but it lets you save the weights for later so you don't have to do all this again (loading very large embeddings takes a while on my laptop). If that doesn't matter to you, you can simply use a placeholder directly, as Nicklas Schnelle suggested; see the sketch below.
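A minimal sketch of that placeholder-only variant, assuming the same names as above; the matrix is then fed on every run instead of being stored in the graph:

# Sketch: no variable; the matrix is fed directly at lookup time.
embedding_ph = tf.placeholder(tf.float32,
                              [embedding_vocab_size, EMBEDDING_DIM])
embedded_direct = tf.nn.embedding_lookup(embedding_ph, word_ids)

vectors = sess.run(embedded_direct,
                   feed_dict={embedding_ph: embedding_matrix,
                              word_ids: [0, 1]})

The trade-off is that the matrix is copied into the session on every run, which is why the variable approach can be worth it for very large embeddings.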