author     Martin Wicke <wicke@google.com>  2017-01-04 21:25:34 -0800
committer  TensorFlower Gardener <gardener@tensorflow.org>  2017-01-04 21:46:08 -0800
commit     333dc32ff79af21484695157f3d141dc776f7c02 (patch)
tree       b379bcaa56bfa54d12ea839fb7e62ab163490743 /tensorflow/examples/image_retraining
parent     d9541696b068cfcc1fab66b03d0b8d605b64f14d (diff)
Change arg order for {softmax,sparse_softmax,sigmoid}_cross_entropy_with_logits to be (labels, predictions), and force use of named args to avoid accidents.
Change: 143629623
Diffstat (limited to 'tensorflow/examples/image_retraining')
-rw-r--r--  tensorflow/examples/image_retraining/retrain.py | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/tensorflow/examples/image_retraining/retrain.py b/tensorflow/examples/image_retraining/retrain.py
index 0d5ba84c2d..c5518e2603 100644
--- a/tensorflow/examples/image_retraining/retrain.py
+++ b/tensorflow/examples/image_retraining/retrain.py
@@ -723,7 +723,7 @@ def add_final_training_ops(class_count, final_tensor_name, bottleneck_tensor):
   with tf.name_scope('cross_entropy'):
     cross_entropy = tf.nn.softmax_cross_entropy_with_logits(
-        logits, ground_truth_input)
+        labels=ground_truth_input, logits=logits)
     with tf.name_scope('total'):
       cross_entropy_mean = tf.reduce_mean(cross_entropy)
   tf.summary.scalar('cross_entropy', cross_entropy_mean)
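
For context, a minimal sketch (not part of the commit) of the calling convention this change enforces: labels and logits must now be passed as named arguments. The placeholder shapes and tensor names below are illustrative assumptions, not code from retrain.py.

import tensorflow as tf  # TensorFlow 1.x API, as used by retrain.py at the time

# Hypothetical stand-ins for the final layer's logits and the one-hot
# ground-truth labels produced elsewhere in retrain.py.
logits = tf.placeholder(tf.float32, [None, 5], name='logits')
ground_truth_input = tf.placeholder(tf.float32, [None, 5], name='ground_truth')

# Old positional form, removed by this change:
#   tf.nn.softmax_cross_entropy_with_logits(logits, ground_truth_input)
# New form: keyword arguments make the (labels, logits) ordering explicit.
cross_entropy = tf.nn.softmax_cross_entropy_with_logits(
    labels=ground_truth_input, logits=logits)
cross_entropy_mean = tf.reduce_mean(cross_entropy)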