This repository has been archived by the owner on May 27, 2024. It is now read-only.

Commit
Merge pull request #7 from ashutosh1919/comments
Replaced deprecated tf.contrib.deprecated.scalar_summary with tf.compat.v1.summary.scalar
roadjiang authored Jul 27, 2019
2 parents e7c8eb1 + 8827968 commit 76d6be2
Showing 2 changed files with 3 additions and 2 deletions.
code/cifar_train_baseline.py (2 additions, 1 deletion)

@@ -247,7 +247,8 @@ def train_inception_baseline(max_step_run):
         labels=one_hot_labels, logits=logits)
     total_loss = tf.reduce_mean(total_loss)

-    tf.contrib.deprecated.scalar_summary('Total Loss', total_loss)
+    # Using latest tensorflow ProtoBuf.
+    tf.compat.v1.summary.scalar('Total Loss', total_loss)

     decay_steps = int(
         num_samples_per_epoch / FLAGS.batch_size * FLAGS.num_epochs_per_decay)
code/resnet_model.py (1 addition, 1 deletion)

@@ -124,7 +124,7 @@ def build_model(self):
   def _build_train_op(self):
     """Build training specific ops for the graph."""
     self.lrn_rate = tf.constant(self.hps.lrn_rate, tf.float32)
-    tf.contrib.deprecated.scalar_summary('learning rate', self.lrn_rate)
+    tf.compat.v1.summary.scalar('learning rate', self.lrn_rate)

     trainable_variables = tf.trainable_variables()
     grads = tf.gradients(self.cost, trainable_variables)
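The migration in both hunks is mechanical because the removed call, tf.contrib.deprecated.scalar_summary(tag, values), and its replacement, tf.compat.v1.summary.scalar(name, tensor), take the same two leading positional arguments: a string tag followed by the value to record. A minimal sketch of that alias pattern, using hypothetical stand-in functions (no TensorFlow required), not the real TensorFlow API:

```python
def summary_scalar(name, tensor):
    """Stand-in for the new-style call, tf.compat.v1.summary.scalar:
    records a named scalar value (here, just returned as a tuple)."""
    return ("scalar_summary", name, tensor)

def scalar_summary(tag, values):
    """Legacy-style alias with the tf.contrib.deprecated signature.
    Forwarding the same two positional arguments is the entire change
    this commit makes at each call site."""
    return summary_scalar(tag, values)

# Old-style call sites keep working unchanged through the forwarder.
print(scalar_summary("Total Loss", 0.42))
# -> ('scalar_summary', 'Total Loss', 0.42)
```

In the actual commit no shim is needed: each call site is edited in place, since tf.contrib was removed entirely in TensorFlow 2.x and tf.compat.v1.summary.scalar is the supported spelling.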

0 comments on commit 76d6be2
