===========================
Announcing Theano 0.4.0
===========================
This is a major release, with lots of new features, bug fixes, and some
interface changes (deprecated or potentially misleading features were
removed). The upgrade is recommended for everybody, unless you rely on
deprecated features that have been removed.

For those using the bleeding-edge version in the Mercurial repository,
we encourage you to update to the `0.4.0` tag.
Deleting old cache
------------------

The caching mechanism for compiled C modules has been updated.
In some cases, using previously-compiled modules with the new version of
Theano can lead to high memory usage and slower code. If you experience
these symptoms, we encourage you to clear your cache.

The easiest way to do that is to execute:

    theano-cache clear

(The theano-cache executable is in Theano/bin.)
What's New
----------

[Include the content of NEWS.txt here]
Download
--------

You can download Theano from http://pypi.python.org/pypi/Theano.
Description
-----------

Theano is a Python library that allows you to define, optimize, and
efficiently evaluate mathematical expressions involving
multi-dimensional arrays. It is built on top of NumPy. Theano
features:
 * tight integration with NumPy: Theano's interface is similar to
   NumPy's, and numpy.ndarray objects are used internally in
   Theano-compiled functions.
 * transparent use of a GPU: perform data-intensive computations up to
   140x faster than on a CPU (support for float32 only).
 * efficient symbolic differentiation: Theano can compute derivatives
   for functions of one or many inputs.
 * speed and stability optimizations: avoid nasty bugs when computing
   expressions such as log(1 + exp(x)) for large values of x.
 * dynamic C code generation: evaluate expressions faster.
 * extensive unit-testing and self-verification: includes tools for
   detecting and diagnosing bugs and/or potential problems.
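To illustrate the stability point above, here is a small sketch in plain
NumPy (not Theano code) of why naively evaluating log(1 + exp(x)) breaks
down for large x, and what a stabilized rewrite computes instead:

```python
import numpy as np

x = 1000.0

# Naive evaluation: exp(1000) overflows to inf, so log(1 + exp(x))
# comes out as inf even though the true value is ~1000.
with np.errstate(over="ignore"):
    naive = np.log(1.0 + np.exp(x))

# Stable evaluation: log(1 + exp(x)) == logaddexp(0, x), which NumPy
# computes without ever forming exp(x) explicitly.
stable = np.logaddexp(0.0, x)

print(naive)   # inf
print(stable)  # 1000.0
```

Theano's graph optimizer performs rewrites of this kind automatically,
so user code can keep the readable formula.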
Theano has been powering large-scale computationally intensive
scientific research since 2007, but it is also approachable
enough to be used in the classroom (IFT6266 at the University of Montreal).
Resources
---------

About Theano:
    http://deeplearning.net/software/theano/

About NumPy:
    http://numpy.scipy.org/

About SciPy:
    http://www.scipy.org/

Machine Learning Tutorial with Theano on Deep Architectures:
    http://deeplearning.net/tutorial/
Acknowledgments
---------------

I would like to thank all contributors to Theano. For this particular
release, many people have helped during the release sprint: (in
alphabetical order) Frederic Bastien, Arnaud Bergeron, James Bergstra,
Nicolas Boulanger-Lewandowski, Raul Chandias Ferrari, Olivier
Delalleau, Guillaume Desjardins, Philippe Hamel, Pascal Lamblin,
Razvan Pascanu and David Warde-Farley.

Also, thank you to all NumPy and SciPy developers, as Theano builds on
their strengths.
All questions and comments are always welcome on the Theano
mailing lists (http://deeplearning.net/software/theano/).