tensorflow - Efficient way to extract pool_3 for large number of images?


I'd like to use the pool_3 features extracted from a set of images. Right now I have a loop over each image to extract its pool_3 features:

import numpy as np
import tensorflow as tf

# x_input.shape = (40000, 32, 32, 3)
# The Inception graph is assumed to have been imported into the default graph already.
def batch_pool3_features(x_input):
    sess = tf.InteractiveSession()
    n_train = x_input.shape[0]
    print 'extracting features for %i rows' % n_train
    pool3 = sess.graph.get_tensor_by_name('pool_3:0')
    x_pool3 = []
    for i in range(n_train):
        print 'iteration %i' % i
        pool3_features = sess.run(pool3, {'DecodeJpeg:0': x_input[i, :]})
        x_pool3.append(np.squeeze(pool3_features))
    return np.array(x_pool3)

This is quite slow, though. Is there a faster batched implementation for this?

Thanks.

It doesn't - yet. I've opened a feature request ticket on GitHub in response to this question.
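In the meantime, one workaround people use is to re-import the Inception graph definition with a batched placeholder mapped in via input_map, and then run pool_3 on whole batches instead of one image at a time. The sketch below is just a rough illustration of that idea, not an official API: the graph file path, the 'ResizeBilinear:0' tensor name, the 299x299 input size, and the batch_size are all assumptions that may need adjusting for your graph version, and the 32x32 images would need to be resized to 299x299 before being fed in.

# A sketch of batched pool_3 extraction (TF 1.x-era API), under the
# assumptions stated above.
import numpy as np
import tensorflow as tf

GRAPH_PATH = 'classify_image_graph_def.pb'  # assumed path to the Inception graph

def build_batched_pool3():
    # Load the frozen Inception graph definition.
    with tf.gfile.FastGFile(GRAPH_PATH, 'rb') as f:
        graph_def = tf.GraphDef()
        graph_def.ParseFromString(f.read())
    # Replace the single-image resize output with a batched placeholder
    # (assumes the pre-normalization tensor is named 'ResizeBilinear:0').
    images = tf.placeholder(tf.float32, shape=[None, 299, 299, 3])
    pool3, = tf.import_graph_def(
        graph_def,
        input_map={'ResizeBilinear:0': images},
        return_elements=['pool_3:0'])
    return images, pool3

def batch_pool3_features(x_input, batch_size=64):
    # x_input is assumed to already be resized to (N, 299, 299, 3) float pixels.
    images, pool3 = build_batched_pool3()
    feats = []
    with tf.Session() as sess:
        for start in range(0, x_input.shape[0], batch_size):
            batch = x_input[start:start + batch_size].astype(np.float32)
            feats.append(np.squeeze(sess.run(pool3, {images: batch})))
    return np.concatenate(feats, axis=0)

The pool_3 layer itself is built from convolutions and pooling, which are batch-size agnostic, so feeding it a batch should work even though later layers in that graph assume a single image; treat this as an experiment to verify against the per-image loop above rather than a drop-in replacement.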

