tensorflow - TensorBoard summary on multiple GPUs
Just wondering if anyone can help me with plotting scalars in a multi-GPU setting.

What I have at the moment is:
    for device_index in xrange(args.num_gpus):
        with tf.device('/gpu:%d' % device_index), tf.name_scope('tower_%d' % device_index) as scope:
            loss, grads = get_loss_grads()
            all_losses.append(loss)
            all_grads.append(grads)
            summaries = tf.get_collection(tf.GraphKeys.SUMMARIES, scope)

    r_loss = tf.reduce_mean(all_losses)

    # ...later...
    summaries = tf.merge_summary(summaries)
    sess = tf.Session(config=tf.ConfigProto(allow_soft_placement=True))
    while training:
        summary, loss_value, _ = sess.run(fetches, feed_dict=feed)
        writer.add_summary(summary, step)
however, "code" can save last tower. basically, i'd have losses each tower , r_loss displayed in tensorboard.
Thanks,
Edit:

I can plot each tower now:
    all_summaries = []
    for device_index in xrange(args.num_gpus):
        with tf.device('/gpu:%d' % device_index), tf.name_scope('tower_%d' % device_index) as scope:
            loss, grads = get_loss_grads()
            all_losses.append(loss)
            all_grads.append(grads)
            summaries = tf.get_collection(tf.GraphKeys.SUMMARIES, scope)
            all_summaries.append(summaries)

    r_loss = tf.reduce_mean(all_losses)

    # ...later...
    summaries = tf.merge_summary(all_summaries)
    sess = tf.Session(config=tf.ConfigProto(allow_soft_placement=True))
    while training:
        summary, loss_value, _ = sess.run(fetches, feed_dict=feed)
        writer.add_summary(summary, step)
The question is now: how can I save/plot r_loss?
Edit:

I think I have it now:
    for device_index in xrange(args.num_gpus):
        with tf.device('/gpu:%d' % device_index), tf.name_scope('tower_%d' % device_index) as scope:
            loss, grads = get_loss_grads()
            all_losses.append(loss)
            all_grads.append(grads)
            summaries = tf.get_collection(tf.GraphKeys.SUMMARIES, scope)

    r_loss = tf.reduce_mean(all_losses)
    tf.summary.scalar("reduce_mean_losses", r_loss)

    # ...later...
    summaries = tf.summary.merge_all()
    sess = tf.Session(config=tf.ConfigProto(allow_soft_placement=True))
    while training:
        summary, loss_value, _ = sess.run(fetches, feed_dict=feed)
        writer.add_summary(summary, step)
I think tf.summary.merge_all() is "magically" collecting the summaries.
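A small self-contained check of where that "magic" comes from (TF 1.x-style API, matching the code above): tf.summary.scalar registers its op in the tf.GraphKeys.SUMMARIES collection, and tf.summary.merge_all() simply merges everything in that collection, so the reduce_mean_losses scalar gets picked up without being passed explicitly.

    import tensorflow as tf

    x = tf.constant(1.0)
    tf.summary.scalar('x', x)          # registered in tf.GraphKeys.SUMMARIES
    tf.summary.scalar('2x', 2.0 * x)   # registered in tf.GraphKeys.SUMMARIES

    print(tf.get_collection(tf.GraphKeys.SUMMARIES))   # both scalar summary ops
    merged = tf.summary.merge_all()                     # one op serializing them all

    with tf.Session() as sess:
        summary_str = sess.run(merged)                  # bytes ready for writer.add_summary()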