path: root/tensorflow/g3doc/api_docs/python/functions_and_classes/shard9/tf.gradients.md
author    A. Unique TensorFlower <nobody@tensorflow.org>  2016-05-26 01:36:58 -0800
committer TensorFlower Gardener <gardener@tensorflow.org>  2016-05-26 02:47:07 -0700
commit    017175783ec4eead6c6c98a0e0de1e5031d45845 (patch)
tree      b2d8e34cb87bc86a2405d8182b4e28779e724c1e /tensorflow/g3doc/api_docs/python/functions_and_classes/shard9/tf.gradients.md
parent    c963048b536b8905729ee617dfb78f21a0abd4d7 (diff)
Update generated Python Op docs.
Change: 123300300
Diffstat (limited to 'tensorflow/g3doc/api_docs/python/functions_and_classes/shard9/tf.gradients.md')
-rw-r--r--  tensorflow/g3doc/api_docs/python/functions_and_classes/shard9/tf.gradients.md  48
1 file changed, 48 insertions, 0 deletions
diff --git a/tensorflow/g3doc/api_docs/python/functions_and_classes/shard9/tf.gradients.md b/tensorflow/g3doc/api_docs/python/functions_and_classes/shard9/tf.gradients.md
new file mode 100644
index 0000000000..ea710b2a15
--- /dev/null
+++ b/tensorflow/g3doc/api_docs/python/functions_and_classes/shard9/tf.gradients.md
@@ -0,0 +1,48 @@
+### `tf.gradients(ys, xs, grad_ys=None, name='gradients', colocate_gradients_with_ops=False, gate_gradients=False, aggregation_method=None)` {#gradients}
+
+Constructs symbolic partial derivatives of the sum of `ys` w.r.t. each `x` in `xs`.
+
+`ys` and `xs` are each a `Tensor` or a list of tensors. `grad_ys`
+is a list of `Tensor`, holding the gradients received by the
+`ys`. The list must be the same length as `ys`.
+
+`gradients()` adds ops to the graph to output the partial
+derivatives of `ys` with respect to `xs`. It returns a list of
+`Tensor` of length `len(xs)` where each tensor is the `sum(dy/dx)`
+for y in `ys`.
+
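+A minimal usage sketch (not part of the original doc), assuming the
+graph-mode `tf.Session` API of this TensorFlow release:
+
+```python
+import tensorflow as tf
+
+# A tiny graph: z = x * y + y, so dz/dx = y and dz/dy = x + 1.
+x = tf.constant(3.0)
+y = tf.constant(2.0)
+z = x * y + y
+
+# One tensor is returned per entry of `xs`, holding sum(dz/d.) for that entry.
+grads = tf.gradients(ys=z, xs=[x, y])
+
+with tf.Session() as sess:
+    print(sess.run(grads))  # [2.0, 4.0]
+```
+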
+`grad_ys` is a list of tensors of the same length as `ys` that holds
+the initial gradients for each y in `ys`. When `grad_ys` is None,
+we fill in a tensor of '1's of the shape of y for each y in `ys`. A
+user can provide their own initial `grad_ys` to compute the
+derivatives using a different initial gradient for each y (e.g., if
+one wanted to weight the gradient differently for each value in
+each y).
+
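+As a hedged sketch (not from the original doc), weighting the incoming
+gradient per element via `grad_ys` might look like this:
+
+```python
+import tensorflow as tf
+
+x = tf.constant([1.0, 2.0, 3.0])
+y = 2.0 * x  # dy/dx is 2 for every element
+
+# Instead of the default all-ones grad_ys, weight each element's contribution.
+weights = tf.constant([1.0, 0.5, 0.0])
+grads = tf.gradients(ys=[y], xs=[x], grad_ys=[weights])
+
+with tf.Session() as sess:
+    print(sess.run(grads[0]))  # [2.0, 1.0, 0.0]
+```
+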
+##### Args:
+
+
+* <b>`ys`</b>: A `Tensor` or list of tensors to be differentiated.
+* <b>`xs`</b>: A `Tensor` or list of tensors to be used for differentiation.
+* <b>`grad_ys`</b>: Optional. A `Tensor` or list of tensors the same size as
+ `ys` and holding the gradients computed for each y in `ys`.
+* <b>`name`</b>: Optional name to use for grouping all the gradient ops together.
+  Defaults to 'gradients'.
+* <b>`colocate_gradients_with_ops`</b>: If True, try colocating gradients with
+ the corresponding op.
+* <b>`gate_gradients`</b>: If True, add a tuple around the gradients returned
+  for each operation. This avoids some race conditions; a usage sketch
+  follows this list.
+* <b>`aggregation_method`</b>: Specifies the method used to combine gradient terms.
+ Accepted values are constants defined in the class `AggregationMethod`.
+
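+The sketch below (not part of the original doc) passes the optional knobs
+explicitly; `tf.AggregationMethod.ADD_N` is assumed to be one of the
+constants mentioned above:
+
+```python
+import tensorflow as tf
+
+x = tf.constant(1.0)
+y1 = x * x
+y2 = 3.0 * x
+
+# d(y1 + y2)/dx = 2*x + 3, which is 5.0 at x = 1.0.
+grads = tf.gradients(
+    ys=[y1, y2],
+    xs=[x],
+    gate_gradients=True,
+    aggregation_method=tf.AggregationMethod.ADD_N)
+
+with tf.Session() as sess:
+    print(sess.run(grads[0]))  # 5.0
+```
+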
+##### Returns:
+
+ A list of `sum(dy/dx)` for each x in `xs`.
+
+##### Raises:
+
+
+* <b>`LookupError`</b>: if one of the operations between `x` and `y` does not
+ have a registered gradient function.
+* <b>`ValueError`</b>: if the arguments are invalid.
+