
tensorboard embeddings hang with "Computing PCA"

I am trying to display my embeddings in TensorBoard. When I open the embeddings tab, I get "Computing PCA..." and TensorBoard hangs there indefinitely.

Before that, it does load my tensor of shape 200x128, and it finds the metadata file as well.

I tried TF versions 0.12 and 1.1 and got the same result with both. Here is my code:

import os

import numpy as np
import tensorflow as tf
from tensorflow.contrib.tensorboard.plugins import projector

features = np.zeros(shape=(num_batches*batch_size, 128), dtype=float)
embedding_var = tf.Variable(features, name='feature_embedding')
config = projector.ProjectorConfig()
embedding = config.embeddings.add()
embedding.tensor_name = 'feature_embedding'
metadata_path = os.path.join(self.log_dir, 'metadata.tsv')
embedding.metadata_path = metadata_path

with tf.Session(config=self.config) as sess:
    tf.global_variables_initializer().run()
    restorer = tf.train.Saver()
    restorer.restore(sess, self.pretrained_model_path)

    with open(metadata_path, 'w') as f:
        for step in range(num_batches):
            batch_images, batch_labels = data.next()

            # one metadata line per embedded row
            for label in batch_labels:
                f.write('%s\n' % label)

            feed_dict = {model.images: batch_images}
            features[step*batch_size : (step+1)*batch_size, :] = \
                sess.run(model.features, feed_dict)

    sess.run(embedding_var.initializer)
    projector.visualize_embeddings(tf.summary.FileWriter(self.log_dir), config)
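
For reference, projector.visualize_embeddings writes a projector_config.pbtxt into log_dir; given the config built above, it should look roughly like this (absolute metadata path shortened):

embeddings {
  tensor_name: "feature_embedding"
  metadata_path: ".../metadata.tsv"
}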

Maybe report it as a bug? (With more details, such as your OS, etc.) This site is meant for Q&A. – MaxB


I am not sure whether I am missing something, though. – etoropov

Answer


I don't know what is wrong with the code above, but I rewrote it in a different way (below) and it works. The difference is in when and how embedding_var is initialized.

I also made a gist to copy-paste the code from.

import os

import numpy as np
import tensorflow as tf
from tensorflow.contrib.tensorboard.plugins import projector

# a numpy array for embeddings and a list for labels
features = np.zeros(shape=(num_batches*self.batch_size, 128), dtype=float)
labels = []


# compute embeddings batch by batch
with tf.Session(config=self.config) as sess:
    tf.global_variables_initializer().run()
    restorer = tf.train.Saver()
    restorer.restore(sess, self.pretrained_model)

    for step in range(num_batches):
        batch_images, batch_labels = data.next()

        labels += batch_labels

        feed_dict = {model.images: batch_images}
        features[step*self.batch_size : (step+1)*self.batch_size, :] = \
            sess.run(model.features, feed_dict)


# write labels
metadata_path = os.path.join(self.log_dir, 'metadata.tsv')
with open(metadata_path, 'w') as f:
    for label in labels:
        f.write('%s\n' % label)


# write embeddings
with tf.Session(config=self.config) as sess:

    config = projector.ProjectorConfig()
    embedding = config.embeddings.add()
    embedding.tensor_name = 'feature_embedding'
    embedding.metadata_path = metadata_path

    embedding_var = tf.Variable(features, name='feature_embedding')
    sess.run(embedding_var.initializer)
    projector.visualize_embeddings(tf.summary.FileWriter(self.log_dir), config)

    # save a checkpoint that actually contains the embedding values
    saver = tf.train.Saver({"feature_embedding": embedding_var})
    saver.save(sess, os.path.join(self.log_dir, 'model_features'))
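
One thing worth noting: unlike the original code, this version explicitly saves a checkpoint that contains feature_embedding. As far as I understand, the projector reads the actual tensor values from the latest checkpoint in log_dir, so without such a save there is nothing for PCA to run on.

If you want to rule out the model and data pipeline entirely, here is a minimal self-contained sketch of the same recipe (TF 1.x contrib API; the dummy random features and the /tmp path are placeholders, not from the code above):

import os

import numpy as np
import tensorflow as tf
from tensorflow.contrib.tensorboard.plugins import projector

log_dir = '/tmp/projector_test'  # hypothetical location
if not os.path.exists(log_dir):
    os.makedirs(log_dir)

# dummy 200x128 embedding matrix, one label per row
features = np.random.rand(200, 128).astype(np.float32)
metadata_path = os.path.join(log_dir, 'metadata.tsv')
with open(metadata_path, 'w') as f:
    for i in range(features.shape[0]):
        f.write('label_%d\n' % i)

embedding_var = tf.Variable(features, name='feature_embedding')

config = projector.ProjectorConfig()
embedding = config.embeddings.add()
embedding.tensor_name = 'feature_embedding'
embedding.metadata_path = metadata_path

with tf.Session() as sess:
    sess.run(embedding_var.initializer)
    projector.visualize_embeddings(tf.summary.FileWriter(log_dir), config)
    # the checkpoint is what the projector actually reads the values from
    saver = tf.train.Saver({'feature_embedding': embedding_var})
    saver.save(sess, os.path.join(log_dir, 'model_features'))

If this renders in TensorBoard (tensorboard --logdir=/tmp/projector_test) but your real data still hangs on "Computing PCA...", the problem is in the feature computation rather than in the projector setup.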