# Tutorials


## MNIST For ML Beginners

If you're new to machine learning, we recommend starting here.  You'll learn
about a classic problem, handwritten digit classification (MNIST), and get a
gentle introduction to multiclass classification.

[View Tutorial](../tutorials/mnist/beginners/index.md)
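The model at the heart of that tutorial is softmax (multinomial logistic) regression: a flattened 28x28 image is mapped to ten class scores, which softmax turns into probabilities. A minimal NumPy sketch of the idea (not the tutorial's TensorFlow code):

```python
import numpy as np

def softmax(logits):
    # Subtract the max for numerical stability before exponentiating.
    shifted = logits - logits.max(axis=-1, keepdims=True)
    exp = np.exp(shifted)
    return exp / exp.sum(axis=-1, keepdims=True)

# y = softmax(xW + b): 784 pixels in, 10 digit-class probabilities out.
rng = np.random.default_rng(0)
x = rng.random((1, 784))   # one flattened 28x28 image
W = np.zeros((784, 10))    # weights, learned by gradient descent in the tutorial
b = np.zeros(10)           # biases
probs = softmax(x @ W + b) # uniform (0.1 each) before any training
```

The tutorial trains `W` and `b` by minimizing cross-entropy between `probs` and the true digit labels.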


## Deep MNIST for Experts

If you're already familiar with other deep learning software packages and with
MNIST, this tutorial will give you a very brief primer on TensorFlow.

[View Tutorial](../tutorials/mnist/pros/index.md)


## TensorFlow Mechanics 101

This is a technical tutorial, where we walk you through the details of using
TensorFlow infrastructure to train models at scale.  We again use MNIST as the
example.

[View Tutorial](../tutorials/mnist/tf/index.md)


## Convolutional Neural Networks

An introduction to convolutional neural networks using the CIFAR-10 data set.
Convolutional neural nets are particularly tailored to images, since they
exploit translation invariance to yield more compact and effective
representations of visual content.

[View Tutorial](../tutorials/deep_cnn/index.md)
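The translation-invariance point can be seen directly: a convolutional layer slides one small filter over every position of the image (weight sharing), so shifting the input simply shifts the feature map. A naive NumPy illustration, not the tutorial's TensorFlow code:

```python
import numpy as np

def conv2d_valid(image, kernel):
    # Naive "valid" 2-D correlation: slide one shared kernel over the image.
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = (image[i:i + kh, j:j + kw] * kernel).sum()
    return out

image = np.zeros((8, 8))
image[2:5, 2:5] = 1.0                     # a small bright square
kernel = np.array([[1., 0., -1.]] * 3)    # crude vertical-edge detector

fmap = conv2d_valid(image, kernel)
shifted = np.roll(image, 2, axis=1)       # move the square 2 pixels right
fmap_shifted = conv2d_valid(shifted, kernel)
# Away from the border, fmap_shifted is just fmap moved 2 pixels right:
# the same filter responds to the same pattern wherever it appears.
```

Real convolutional layers stack many such filters and learn their weights; the sliding-window structure is what makes the representation compact.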


## Vector Representations of Words

This tutorial motivates why it is useful to learn to represent words as vectors
(called *word embeddings*). It introduces the word2vec model as an efficient
method for learning embeddings. It also covers the high-level details behind
noise-contrastive training methods (the biggest recent advance in training
embeddings).

[View Tutorial](../tutorials/word2vec/index.md)
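Concretely, the skip-gram variant of word2vec trains on (center word, context word) pairs drawn from a sliding window over the text; noise-contrastive training then distinguishes true pairs from randomly sampled "noise" words. A small sketch of how those pairs are generated, using a hypothetical toy sentence:

```python
def skipgram_pairs(words, window=1):
    # Each word predicts its neighbours within `window` positions.
    pairs = []
    for i, center in enumerate(words):
        lo = max(0, i - window)
        hi = min(len(words), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((center, words[j]))
    return pairs

sentence = "the quick brown fox".split()
pairs = skipgram_pairs(sentence, window=1)
# e.g. ("quick", "the"), ("quick", "brown"), ...
```

The tutorial feeds pairs like these through an embedding lookup and a sampled (noise-contrastive) loss instead of a full softmax over the vocabulary.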


## Recurrent Neural Networks

An introduction to RNNs, wherein we train an LSTM network to predict the next
word in an English sentence, a task sometimes called language modeling.

[View Tutorial](../tutorials/recurrent/index.md)
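The essential mechanic is a hidden state that carries sentence context from one word to the next. A single step of a plain tanh recurrent cell (a simpler cousin of the LSTM the tutorial actually uses), sketched in NumPy with illustrative sizes:

```python
import numpy as np

rng = np.random.default_rng(0)
vocab, hidden = 10, 4                                # toy sizes
Wxh = rng.normal(scale=0.1, size=(vocab, hidden))    # input  -> hidden
Whh = rng.normal(scale=0.1, size=(hidden, hidden))   # hidden -> hidden
Why = rng.normal(scale=0.1, size=(hidden, vocab))    # hidden -> logits

def step(word_id, h):
    x = np.zeros(vocab)
    x[word_id] = 1.0                  # one-hot encoding of the input word
    h = np.tanh(x @ Wxh + h @ Whh)    # new hidden state: input + prior context
    logits = h @ Why                  # unnormalized scores for the next word
    return h, logits

h = np.zeros(hidden)
for w in [3, 1, 4]:                   # feed a short word-id sequence
    h, logits = step(w, h)
```

An LSTM replaces the bare `tanh` update with gated memory cells so gradients survive over longer sentences; that is what the tutorial builds and trains.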


## Sequence-to-Sequence Models

A follow-on to the RNN tutorial, where we assemble a sequence-to-sequence model
for machine translation.  You will learn to build your own English-to-French
translator, entirely machine learned, end-to-end.

[View Tutorial](../tutorials/seq2seq/index.md)


## Mandelbrot Set

TensorFlow can be used for computation that has nothing to do with machine
learning.  Here's a naive implementation of Mandelbrot set visualization.

[View Tutorial](../tutorials/mandelbrot/index.md)
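The computation itself is tiny: iterate z &larr; z&sup2; + c over a grid of complex points and count how long each point stays bounded. A naive NumPy version of the same update (the tutorial expresses it with TensorFlow ops instead):

```python
import numpy as np

# Grid of complex starting points c covering the classic viewing window.
y, x = np.mgrid[-1.3:1.3:0.01, -2:1:0.01]
c = x + 1j * y
z = np.zeros_like(c)
steps = np.zeros(c.shape, dtype=int)   # bounded-iteration count per point

for _ in range(50):
    z = np.where(np.abs(z) < 2, z * z + c, z)   # freeze diverged points
    steps += (np.abs(z) < 2)                    # count points still bounded

# `steps` is the image: high counts inside the set, low counts outside.
```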


## Partial Differential Equations

As another example of non-machine-learning computation, we offer an example of
a naive PDE simulation of raindrops landing on a pond.

[View Tutorial](../tutorials/pdes/index.md)
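The scheme is a discretized damped wave equation: the surface height accelerates in proportion to its Laplacian, and a "raindrop" is just an initial bump. A NumPy sketch of one such update loop (the tutorial runs the equivalent updates as TensorFlow ops):

```python
import numpy as np

def laplace(u):
    # 5-point Laplacian; np.roll wraps at the edges, which is acceptable
    # for this sketch as long as the ripple stays away from the boundary.
    return (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
            np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u)

n = 64
u = np.zeros((n, n))        # pond surface height
ut = np.zeros((n, n))       # surface velocity
u[n // 2, n // 2] = 1.0     # one raindrop hits the centre

eps, damping = 0.03, 0.04   # time step and damping coefficient
for _ in range(100):
    ut += eps * (laplace(u) - damping * ut)   # acceleration from curvature
    u += eps * ut                             # integrate height forward
```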


## MNIST Data Download

Details about downloading the MNIST handwritten digits data set.  Exciting
stuff.

[View Tutorial](../tutorials/mnist/download/index.md)
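The downloaded files are in the IDX format: a big-endian header (a magic number, 2051 for image files, then image count, rows, and columns) followed by raw unsigned pixel bytes. A sketch of parsing that header with the standard library, using a synthetic two-image buffer in place of a real download:

```python
import struct

def parse_idx_images(data):
    # Big-endian header: magic, image count, rows, cols (4 x uint32).
    magic, count, rows, cols = struct.unpack(">IIII", data[:16])
    assert magic == 2051, "not an IDX image file"
    pixels = data[16:]
    size = rows * cols
    # Slice the flat pixel stream into one bytes object per image.
    return [pixels[i * size:(i + 1) * size] for i in range(count)]

# Synthetic stand-in: a valid header plus two all-zero 28x28 images.
fake = struct.pack(">IIII", 2051, 2, 28, 28) + bytes(2 * 28 * 28)
images = parse_idx_images(fake)
```

The tutorial's `input_data.py` does this for you (including the gzip decompression this sketch omits).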


## Image Recognition

How to run object recognition using a convolutional neural network
trained on the ImageNet Challenge data and label set.

[View Tutorial](../tutorials/image_recognition/index.md)

We will soon be releasing code for training a state-of-the-art Inception model.


## Deep Dream Visual Hallucinations

Building on the Inception recognition model, we will release a TensorFlow
version of the [Deep Dream](https://github.com/google/deepdream) neural network
visual hallucination software.

COMING SOON