# Overview


## MNIST For ML Beginners

If you're new to machine learning, we recommend starting here.  You'll learn
about a classic problem, handwritten digit classification (MNIST), and get a
gentle introduction to multiclass classification.

[View Tutorial](mnist/beginners/index.md)
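
As a taste of what the tutorial covers, here is a minimal sketch of multiclass
(softmax) classification for MNIST; the shapes and variable names are
illustrative assumptions, not the tutorial's exact code.

```python
import tensorflow as tf

# Sketch: softmax regression over 28x28 = 784-pixel images,
# producing probabilities for the 10 digit classes.
x = tf.placeholder(tf.float32, [None, 784])   # a batch of flattened images
W = tf.Variable(tf.zeros([784, 10]))          # weights
b = tf.Variable(tf.zeros([10]))               # biases
y = tf.nn.softmax(tf.matmul(x, W) + b)        # per-class probabilities
```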


## Deep MNIST for Experts

If you're already familiar with other deep learning software packages and with
MNIST, this tutorial will give you a very brief primer on TensorFlow.

[View Tutorial](mnist/pros/index.md)
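
If you just want the gist of TensorFlow's programming model before diving in,
this sketch (not part of the tutorial) shows the build-a-graph-then-run-it
pattern:

```python
import tensorflow as tf

# Ops only describe a computation graph; nothing runs until a Session executes it.
a = tf.constant(2.0)
b = tf.constant(3.0)
c = a * b                      # a graph node, not the number 6.0

with tf.Session() as sess:
    print(sess.run(c))         # executes the graph and prints 6.0
```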


## TensorFlow Mechanics 101

This is a technical tutorial, where we walk you through the details of using
TensorFlow infrastructure to train models at scale.  We again use MNIST as the
example.

[View Tutorial](mnist/tf/index.md)


## Convolutional Neural Networks

An introduction to convolutional neural networks using the CIFAR-10 data set.
Convolutional neural nets are particularly tailored to images, since they
exploit translation invariance to yield more compact and effective
representations of visual content.

[View Tutorial](deep_cnn/index.md)
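
For a rough idea of what exploiting translation invariance looks like in code,
here is a hedged sketch of a single convolutional layer; the image and filter
shapes are assumptions, not the tutorial's model.

```python
import tensorflow as tf

# One convolutional layer: the same 5x5 filters are applied at every spatial
# position, so a feature learned in one part of the image transfers to all others.
images = tf.placeholder(tf.float32, [None, 24, 24, 3])    # assumed CIFAR-10-style crops
filters = tf.Variable(tf.truncated_normal([5, 5, 3, 64], stddev=0.05))
conv = tf.nn.conv2d(images, filters, strides=[1, 1, 1, 1], padding='SAME')
features = tf.nn.relu(conv)                                # 64 feature maps per image
```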


## Vector Representations of Words

This tutorial motivates why it is useful to learn to represent words as vectors
(called *word embeddings*). It introduces the word2vec model as an efficient
method for learning embeddings. It also covers the high-level details behind
noise-contrastive training methods (the biggest recent advance in training
embeddings).

[View Tutorial](word2vec/index.md)
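
The core idea, sketched below with assumed sizes (vocabulary, embedding width,
number of noise samples), is to learn an embedding table by contrasting true
context words against sampled noise words:

```python
import tensorflow as tf

# Assumed hyperparameters, for illustration only.
vocabulary_size, embedding_size, num_sampled = 50000, 128, 64

embeddings = tf.Variable(
    tf.random_uniform([vocabulary_size, embedding_size], -1.0, 1.0))
nce_weights = tf.Variable(
    tf.truncated_normal([vocabulary_size, embedding_size], stddev=0.1))
nce_biases = tf.Variable(tf.zeros([vocabulary_size]))

train_inputs = tf.placeholder(tf.int32, [None])      # ids of center words
train_labels = tf.placeholder(tf.int32, [None, 1])   # ids of true context words

embed = tf.nn.embedding_lookup(embeddings, train_inputs)
# Noise-contrastive estimation: score true context words against sampled noise words.
loss = tf.reduce_mean(
    tf.nn.nce_loss(weights=nce_weights, biases=nce_biases,
                   labels=train_labels, inputs=embed,
                   num_sampled=num_sampled, num_classes=vocabulary_size))
```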


## Recurrent Neural Networks

An introduction to RNNs, wherein we train an LSTM network to predict the next
word in an English sentence (a task sometimes called language modeling).

[View Tutorial](recurrent/index.md)
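
In outline, the model embeds the current word, feeds it through an LSTM cell,
and scores every word in the vocabulary as a candidate next word. The sketch
below assumes the `rnn_cell` API available in this release and uses made-up
sizes:

```python
import tensorflow as tf

# Assumed sizes, for illustration only.
vocab_size, hidden_size, batch_size = 10000, 200, 20

lstm = tf.nn.rnn_cell.BasicLSTMCell(hidden_size)
state = tf.zeros([batch_size, lstm.state_size])        # initial LSTM state

current_words = tf.placeholder(tf.float32, [batch_size, hidden_size])  # embedded current words
output, state = lstm(current_words, state)             # one time step of the recurrence

softmax_w = tf.Variable(tf.zeros([hidden_size, vocab_size]))
softmax_b = tf.Variable(tf.zeros([vocab_size]))
logits = tf.matmul(output, softmax_w) + softmax_b      # scores for the next word
```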


## Sequence-to-Sequence Models

A follow-on to the RNN tutorial, where we assemble a sequence-to-sequence model
for machine translation.  You will learn to build your own English-to-French
translator, entirely machine learned, end-to-end.

[View Tutorial](seq2seq/index.md)


## Mandelbrot Set

TensorFlow can be used for computation that has nothing to do with machine
learning.  Here's a naive implementation of Mandelbrot set visualization.

[View Tutorial](mandelbrot/index.md)
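
The computation is just the classic iteration z ← z² + c applied elementwise to
a grid of complex numbers. A hedged sketch in the spirit of the tutorial, with
grid, names, and iteration count chosen arbitrarily:

```python
import numpy as np
import tensorflow as tf

# A grid of complex points c, plus running values z and a divergence counter.
Y, X = np.mgrid[-1.3:1.3:0.005, -2.0:1.0:0.005]
c = tf.constant((X + 1j * Y).astype(np.complex64))
z = tf.Variable(c)
steps = tf.Variable(tf.zeros_like(c, tf.float32))      # how long each point stays bounded

z_next = z * z + c
not_diverged = tf.complex_abs(z_next) < 4
update = tf.group(z.assign(z_next),
                  steps.assign_add(tf.cast(not_diverged, tf.float32)))

with tf.Session() as sess:
    sess.run(tf.initialize_all_variables())
    for _ in range(200):
        sess.run(update)
    image = sess.run(steps)    # visualize: higher counts = slower divergence
```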


## Partial Differential Equations

As another example of computation outside machine learning, we offer a naive
PDE simulation of raindrops landing on a pond.

[View Tutorial](pdes/index.md)
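
The simulation boils down to a discrete wave equation: repeatedly add the
Laplacian of the pond's height (approximated by a small convolution) to its
velocity. A rough sketch under assumed kernel, pond size, and constants:

```python
import tensorflow as tf

# Assumed pond size, time step, and damping, for illustration only.
N, eps, damping = 500, 0.03, 0.04

U = tf.Variable(tf.zeros([N, N]))     # wave height
Ut = tf.Variable(tf.zeros([N, N]))    # rate of change of height

# 3x3 kernel approximating the Laplacian operator.
laplace_k = tf.constant([0.5, 1.0, 0.5,
                         1.0, -6.0, 1.0,
                         0.5, 1.0, 0.5], shape=[3, 3, 1, 1])

def laplace(x):
    # Convolve the height field with the Laplacian kernel.
    x4 = tf.reshape(x, [1, N, N, 1])
    y = tf.nn.conv2d(x4, laplace_k, strides=[1, 1, 1, 1], padding='SAME')
    return tf.reshape(y, [N, N])

# One explicit time step of the damped wave equation.
U_next = U + eps * Ut
Ut_next = Ut + eps * (laplace(U) - damping * Ut)
step = tf.group(U.assign(U_next), Ut.assign(Ut_next))
```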


## MNIST Data Download

Details about downloading the MNIST handwritten digits data set.  Exciting
stuff.

[View Tutorial](mnist/download/index.md)


## Visual Object Recognition

We will be releasing our state-of-the-art Inception object recognition model,
complete and already trained.

COMING SOON


## Deep Dream Visual Hallucinations

Building on the Inception recognition model, we will release a TensorFlow
version of the [Deep Dream](https://github.com/google/deepdream) neural network
visual hallucination software.

COMING SOON


<div class='sections-order' style="display: none;">
<!--
<!-- mnist/beginners/index.md -->
<!-- mnist/pros/index.md -->
<!-- mnist/tf/index.md -->
<!-- deep_cnn/index.md -->
<!-- word2vec/index.md -->
<!-- recurrent/index.md -->
<!-- seq2seq/index.md -->
<!-- mandelbrot/index.md -->
<!-- pdes/index.md -->
<!-- mnist/download/index.md -->
-->
</div>