├── Python神经网络编程.pdf
├── Readme.txt
└── code
    ├── LICENSE
    ├── README.md
    ├── mnist_dataset
    │   ├── mnist_readme.txt
    │   ├── mnist_test_10.csv
    │   └── mnist_train_100.csv
    ├── my_own_images
    │   ├── 2828_my_own_2.png
    │   ├── 2828_my_own_3.png
    │   ├── 2828_my_own_4.png
    │   ├── 2828_my_own_5.png
    │   ├── 2828_my_own_6.png
    │   ├── 2828_my_own_image.png
    │   ├── 2828_my_own_noisy_6.png
    │   └── readme.txt
    ├── part2_mnist_data_set.ipynb
    ├── part2_neural_network.ipynb
    ├── part2_neural_network_mnist_data.ipynb
    ├── part3_load_own_images.ipynb
    ├── part3_mnist_data_set_with_rotations.ipynb
    ├── part3_neural_network_mnist_and_own_data.ipynb
    ├── part3_neural_network_mnist_and_own_single_image.ipynb
    ├── part3_neural_network_mnist_backquery.ipynb
    └── part3_neural_network_mnist_data_with_rotations.ipynb
/Python神经网络编程.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/jash-git/Book-Python-Neural-Network/25d09c171230a606d52886d4bedb379889b37920/Python神经网络编程.pdf
--------------------------------------------------------------------------------
/Readme.txt:
--------------------------------------------------------------------------------
1 | Python神经网络编程 / Python神經網絡編程 [Make Your Own Neural Network]
2 |
3 | Source:
4 | https://github.com/ZhiqiangHo/awesome-machine-learning
5 | Download: https://box.lenovo.com/l/muUDSf (extraction code: 6954)
6 |
7 | GITHUB: https://github.com/jash-git/Book-Python-Neural-Network
8 |
9 | Chapter 1: How Neural Networks Work
10 | 1.1 Easy for Me, Hard for You
11 | 1.2 A Simple Predicting Machine
12 | 1.3 Classifying Is Not Very Different from Predicting
13 | 1.4 Training a Simple Classifier
14 | 1.5 Sometimes One Classifier Is Not Enough
15 | 1.6 Neurons, Nature's Computing Machines
16 | 1.7 Following Signals Through a Neural Network
17 | 1.8 Matrix Multiplication Is Genuinely Useful
18 | 1.9 A Three-Layer Example with Matrix Multiplication
19 | 1.10 Learning Weights from More Than One Node
20 | 1.11 Backpropagating Errors from More Output Nodes
21 | 1.12 Backpropagating Errors to More Layers
22 | 1.13 Backpropagating Errors with Matrix Multiplication
23 | 1.14 How Do We Actually Update Weights?
24 | 1.15 Weight Update Worked Example
25 | 1.16 Preparing Data
26 | Chapter 2: DIY with Python
27 | 2.1 Python
28 | 2.2 Interactive Python = IPython
29 | 2.3 A Gentle Start with Python
30 | 2.4 Making a Neural Network with Python
31 | 2.5 The MNIST Dataset of Handwritten Digits
32 | Chapter 3: Even More Fun
33 | 3.1 Your Own Handwriting
34 | 3.2 Inside the Mind of a Neural Network
35 | 3.3 Creating New Training Data: Rotations
36 | 3.4 Epilogue
37 | Appendix A: A Gentle Introduction to Calculus
38 | A.1 A Flat Line
39 | A.2 A Sloped Straight Line
40 | A.3 A Curved Line
41 | A.4 Calculus by Hand
42 | A.5 Calculus Not by Hand
43 | A.6 Calculus Without Plotting Graphs
44 | A.7 Patterns
45 | A.8 Functions of Functions
46 | Appendix B: Doing It with a Raspberry Pi
47 | B.1 Installing IPython
48 | B.2 Making Sure Things Work
49 | B.3 Training and Testing a Neural Network
50 | B.4 Raspberry Pi Success
51 |
52 |
--------------------------------------------------------------------------------
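The table of contents above (sections 1.7-1.9) describes feeding signals through a three-layer network with one matrix multiplication per layer. A minimal sketch of that forward pass, assuming illustrative layer sizes and randomly initialized weights (not values from the book):

```python
import numpy as np

def sigmoid(x):
    # the squashing activation function used throughout the book
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
n_input, n_hidden, n_output = 3, 3, 3

# weight matrices: rows index next-layer nodes, columns index current-layer nodes
w_input_hidden = rng.normal(0.0, 0.5, (n_hidden, n_input))
w_hidden_output = rng.normal(0.0, 0.5, (n_output, n_hidden))

inputs = np.array([[0.9], [0.1], [0.8]])           # column vector of input signals
hidden_outputs = sigmoid(w_input_hidden @ inputs)   # one matrix multiply per layer
final_outputs = sigmoid(w_hidden_output @ hidden_outputs)
print(final_outputs)
```

Because the sigmoid output is bounded, every node's signal stays strictly between 0 and 1, which is why the book later scales training targets away from the unreachable extremes.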
/code/LICENSE:
--------------------------------------------------------------------------------
1 | GNU GENERAL PUBLIC LICENSE
2 | Version 2, June 1991
3 |
4 | Copyright (C) 1989, 1991 Free Software Foundation, Inc.,
5 | 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
6 | Everyone is permitted to copy and distribute verbatim copies
7 | of this license document, but changing it is not allowed.
8 |
9 | Preamble
10 |
11 | The licenses for most software are designed to take away your
12 | freedom to share and change it. By contrast, the GNU General Public
13 | License is intended to guarantee your freedom to share and change free
14 | software--to make sure the software is free for all its users. This
15 | General Public License applies to most of the Free Software
16 | Foundation's software and to any other program whose authors commit to
17 | using it. (Some other Free Software Foundation software is covered by
18 | the GNU Lesser General Public License instead.) You can apply it to
19 | your programs, too.
20 |
21 | When we speak of free software, we are referring to freedom, not
22 | price. Our General Public Licenses are designed to make sure that you
23 | have the freedom to distribute copies of free software (and charge for
24 | this service if you wish), that you receive source code or can get it
25 | if you want it, that you can change the software or use pieces of it
26 | in new free programs; and that you know you can do these things.
27 |
28 | To protect your rights, we need to make restrictions that forbid
29 | anyone to deny you these rights or to ask you to surrender the rights.
30 | These restrictions translate to certain responsibilities for you if you
31 | distribute copies of the software, or if you modify it.
32 |
33 | For example, if you distribute copies of such a program, whether
34 | gratis or for a fee, you must give the recipients all the rights that
35 | you have. You must make sure that they, too, receive or can get the
36 | source code. And you must show them these terms so they know their
37 | rights.
38 |
39 | We protect your rights with two steps: (1) copyright the software, and
40 | (2) offer you this license which gives you legal permission to copy,
41 | distribute and/or modify the software.
42 |
43 | Also, for each author's protection and ours, we want to make certain
44 | that everyone understands that there is no warranty for this free
45 | software. If the software is modified by someone else and passed on, we
46 | want its recipients to know that what they have is not the original, so
47 | that any problems introduced by others will not reflect on the original
48 | authors' reputations.
49 |
50 | Finally, any free program is threatened constantly by software
51 | patents. We wish to avoid the danger that redistributors of a free
52 | program will individually obtain patent licenses, in effect making the
53 | program proprietary. To prevent this, we have made it clear that any
54 | patent must be licensed for everyone's free use or not licensed at all.
55 |
56 | The precise terms and conditions for copying, distribution and
57 | modification follow.
58 |
59 | GNU GENERAL PUBLIC LICENSE
60 | TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION
61 |
62 | 0. This License applies to any program or other work which contains
63 | a notice placed by the copyright holder saying it may be distributed
64 | under the terms of this General Public License. The "Program", below,
65 | refers to any such program or work, and a "work based on the Program"
66 | means either the Program or any derivative work under copyright law:
67 | that is to say, a work containing the Program or a portion of it,
68 | either verbatim or with modifications and/or translated into another
69 | language. (Hereinafter, translation is included without limitation in
70 | the term "modification".) Each licensee is addressed as "you".
71 |
72 | Activities other than copying, distribution and modification are not
73 | covered by this License; they are outside its scope. The act of
74 | running the Program is not restricted, and the output from the Program
75 | is covered only if its contents constitute a work based on the
76 | Program (independent of having been made by running the Program).
77 | Whether that is true depends on what the Program does.
78 |
79 | 1. You may copy and distribute verbatim copies of the Program's
80 | source code as you receive it, in any medium, provided that you
81 | conspicuously and appropriately publish on each copy an appropriate
82 | copyright notice and disclaimer of warranty; keep intact all the
83 | notices that refer to this License and to the absence of any warranty;
84 | and give any other recipients of the Program a copy of this License
85 | along with the Program.
86 |
87 | You may charge a fee for the physical act of transferring a copy, and
88 | you may at your option offer warranty protection in exchange for a fee.
89 |
90 | 2. You may modify your copy or copies of the Program or any portion
91 | of it, thus forming a work based on the Program, and copy and
92 | distribute such modifications or work under the terms of Section 1
93 | above, provided that you also meet all of these conditions:
94 |
95 | a) You must cause the modified files to carry prominent notices
96 | stating that you changed the files and the date of any change.
97 |
98 | b) You must cause any work that you distribute or publish, that in
99 | whole or in part contains or is derived from the Program or any
100 | part thereof, to be licensed as a whole at no charge to all third
101 | parties under the terms of this License.
102 |
103 | c) If the modified program normally reads commands interactively
104 | when run, you must cause it, when started running for such
105 | interactive use in the most ordinary way, to print or display an
106 | announcement including an appropriate copyright notice and a
107 | notice that there is no warranty (or else, saying that you provide
108 | a warranty) and that users may redistribute the program under
109 | these conditions, and telling the user how to view a copy of this
110 | License. (Exception: if the Program itself is interactive but
111 | does not normally print such an announcement, your work based on
112 | the Program is not required to print an announcement.)
113 |
114 | These requirements apply to the modified work as a whole. If
115 | identifiable sections of that work are not derived from the Program,
116 | and can be reasonably considered independent and separate works in
117 | themselves, then this License, and its terms, do not apply to those
118 | sections when you distribute them as separate works. But when you
119 | distribute the same sections as part of a whole which is a work based
120 | on the Program, the distribution of the whole must be on the terms of
121 | this License, whose permissions for other licensees extend to the
122 | entire whole, and thus to each and every part regardless of who wrote it.
123 |
124 | Thus, it is not the intent of this section to claim rights or contest
125 | your rights to work written entirely by you; rather, the intent is to
126 | exercise the right to control the distribution of derivative or
127 | collective works based on the Program.
128 |
129 | In addition, mere aggregation of another work not based on the Program
130 | with the Program (or with a work based on the Program) on a volume of
131 | a storage or distribution medium does not bring the other work under
132 | the scope of this License.
133 |
134 | 3. You may copy and distribute the Program (or a work based on it,
135 | under Section 2) in object code or executable form under the terms of
136 | Sections 1 and 2 above provided that you also do one of the following:
137 |
138 | a) Accompany it with the complete corresponding machine-readable
139 | source code, which must be distributed under the terms of Sections
140 | 1 and 2 above on a medium customarily used for software interchange; or,
141 |
142 | b) Accompany it with a written offer, valid for at least three
143 | years, to give any third party, for a charge no more than your
144 | cost of physically performing source distribution, a complete
145 | machine-readable copy of the corresponding source code, to be
146 | distributed under the terms of Sections 1 and 2 above on a medium
147 | customarily used for software interchange; or,
148 |
149 | c) Accompany it with the information you received as to the offer
150 | to distribute corresponding source code. (This alternative is
151 | allowed only for noncommercial distribution and only if you
152 | received the program in object code or executable form with such
153 | an offer, in accord with Subsection b above.)
154 |
155 | The source code for a work means the preferred form of the work for
156 | making modifications to it. For an executable work, complete source
157 | code means all the source code for all modules it contains, plus any
158 | associated interface definition files, plus the scripts used to
159 | control compilation and installation of the executable. However, as a
160 | special exception, the source code distributed need not include
161 | anything that is normally distributed (in either source or binary
162 | form) with the major components (compiler, kernel, and so on) of the
163 | operating system on which the executable runs, unless that component
164 | itself accompanies the executable.
165 |
166 | If distribution of executable or object code is made by offering
167 | access to copy from a designated place, then offering equivalent
168 | access to copy the source code from the same place counts as
169 | distribution of the source code, even though third parties are not
170 | compelled to copy the source along with the object code.
171 |
172 | 4. You may not copy, modify, sublicense, or distribute the Program
173 | except as expressly provided under this License. Any attempt
174 | otherwise to copy, modify, sublicense or distribute the Program is
175 | void, and will automatically terminate your rights under this License.
176 | However, parties who have received copies, or rights, from you under
177 | this License will not have their licenses terminated so long as such
178 | parties remain in full compliance.
179 |
180 | 5. You are not required to accept this License, since you have not
181 | signed it. However, nothing else grants you permission to modify or
182 | distribute the Program or its derivative works. These actions are
183 | prohibited by law if you do not accept this License. Therefore, by
184 | modifying or distributing the Program (or any work based on the
185 | Program), you indicate your acceptance of this License to do so, and
186 | all its terms and conditions for copying, distributing or modifying
187 | the Program or works based on it.
188 |
189 | 6. Each time you redistribute the Program (or any work based on the
190 | Program), the recipient automatically receives a license from the
191 | original licensor to copy, distribute or modify the Program subject to
192 | these terms and conditions. You may not impose any further
193 | restrictions on the recipients' exercise of the rights granted herein.
194 | You are not responsible for enforcing compliance by third parties to
195 | this License.
196 |
197 | 7. If, as a consequence of a court judgment or allegation of patent
198 | infringement or for any other reason (not limited to patent issues),
199 | conditions are imposed on you (whether by court order, agreement or
200 | otherwise) that contradict the conditions of this License, they do not
201 | excuse you from the conditions of this License. If you cannot
202 | distribute so as to satisfy simultaneously your obligations under this
203 | License and any other pertinent obligations, then as a consequence you
204 | may not distribute the Program at all. For example, if a patent
205 | license would not permit royalty-free redistribution of the Program by
206 | all those who receive copies directly or indirectly through you, then
207 | the only way you could satisfy both it and this License would be to
208 | refrain entirely from distribution of the Program.
209 |
210 | If any portion of this section is held invalid or unenforceable under
211 | any particular circumstance, the balance of the section is intended to
212 | apply and the section as a whole is intended to apply in other
213 | circumstances.
214 |
215 | It is not the purpose of this section to induce you to infringe any
216 | patents or other property right claims or to contest validity of any
217 | such claims; this section has the sole purpose of protecting the
218 | integrity of the free software distribution system, which is
219 | implemented by public license practices. Many people have made
220 | generous contributions to the wide range of software distributed
221 | through that system in reliance on consistent application of that
222 | system; it is up to the author/donor to decide if he or she is willing
223 | to distribute software through any other system and a licensee cannot
224 | impose that choice.
225 |
226 | This section is intended to make thoroughly clear what is believed to
227 | be a consequence of the rest of this License.
228 |
229 | 8. If the distribution and/or use of the Program is restricted in
230 | certain countries either by patents or by copyrighted interfaces, the
231 | original copyright holder who places the Program under this License
232 | may add an explicit geographical distribution limitation excluding
233 | those countries, so that distribution is permitted only in or among
234 | countries not thus excluded. In such case, this License incorporates
235 | the limitation as if written in the body of this License.
236 |
237 | 9. The Free Software Foundation may publish revised and/or new versions
238 | of the General Public License from time to time. Such new versions will
239 | be similar in spirit to the present version, but may differ in detail to
240 | address new problems or concerns.
241 |
242 | Each version is given a distinguishing version number. If the Program
243 | specifies a version number of this License which applies to it and "any
244 | later version", you have the option of following the terms and conditions
245 | either of that version or of any later version published by the Free
246 | Software Foundation. If the Program does not specify a version number of
247 | this License, you may choose any version ever published by the Free Software
248 | Foundation.
249 |
250 | 10. If you wish to incorporate parts of the Program into other free
251 | programs whose distribution conditions are different, write to the author
252 | to ask for permission. For software which is copyrighted by the Free
253 | Software Foundation, write to the Free Software Foundation; we sometimes
254 | make exceptions for this. Our decision will be guided by the two goals
255 | of preserving the free status of all derivatives of our free software and
256 | of promoting the sharing and reuse of software generally.
257 |
258 | NO WARRANTY
259 |
260 | 11. BECAUSE THE PROGRAM IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY
261 | FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN
262 | OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES
263 | PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED
264 | OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
265 | MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS
266 | TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE
267 | PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING,
268 | REPAIR OR CORRECTION.
269 |
270 | 12. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
271 | WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR
272 | REDISTRIBUTE THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES,
273 | INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING
274 | OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED
275 | TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY
276 | YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER
277 | PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE
278 | POSSIBILITY OF SUCH DAMAGES.
279 |
280 | END OF TERMS AND CONDITIONS
281 |
282 | How to Apply These Terms to Your New Programs
283 |
284 | If you develop a new program, and you want it to be of the greatest
285 | possible use to the public, the best way to achieve this is to make it
286 | free software which everyone can redistribute and change under these terms.
287 |
288 | To do so, attach the following notices to the program. It is safest
289 | to attach them to the start of each source file to most effectively
290 | convey the exclusion of warranty; and each file should have at least
291 | the "copyright" line and a pointer to where the full notice is found.
292 |
293 | {description}
294 | Copyright (C) {year} {fullname}
295 |
296 | This program is free software; you can redistribute it and/or modify
297 | it under the terms of the GNU General Public License as published by
298 | the Free Software Foundation; either version 2 of the License, or
299 | (at your option) any later version.
300 |
301 | This program is distributed in the hope that it will be useful,
302 | but WITHOUT ANY WARRANTY; without even the implied warranty of
303 | MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
304 | GNU General Public License for more details.
305 |
306 | You should have received a copy of the GNU General Public License along
307 | with this program; if not, write to the Free Software Foundation, Inc.,
308 | 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
309 |
310 | Also add information on how to contact you by electronic and paper mail.
311 |
312 | If the program is interactive, make it output a short notice like this
313 | when it starts in an interactive mode:
314 |
315 | Gnomovision version 69, Copyright (C) year name of author
316 | Gnomovision comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
317 | This is free software, and you are welcome to redistribute it
318 | under certain conditions; type `show c' for details.
319 |
320 | The hypothetical commands `show w' and `show c' should show the appropriate
321 | parts of the General Public License. Of course, the commands you use may
322 | be called something other than `show w' and `show c'; they could even be
323 | mouse-clicks or menu items--whatever suits your program.
324 |
325 | You should also get your employer (if you work as a programmer) or your
326 | school, if any, to sign a "copyright disclaimer" for the program, if
327 | necessary. Here is a sample; alter the names:
328 |
329 | Yoyodyne, Inc., hereby disclaims all copyright interest in the program
330 | `Gnomovision' (which makes passes at compilers) written by James Hacker.
331 |
332 | {signature of Ty Coon}, 1 April 1989
333 | Ty Coon, President of Vice
334 |
335 | This General Public License does not permit incorporating your program into
336 | proprietary programs. If your program is a subroutine library, you may
337 | consider it more useful to permit linking proprietary applications with the
338 | library. If this is what you want to do, use the GNU Lesser General
339 | Public License instead of this License.
340 |
--------------------------------------------------------------------------------
/code/README.md:
--------------------------------------------------------------------------------
1 | # makeyourownneuralnetwork
2 | Code for the Make Your Own Neural Network book
3 |
--------------------------------------------------------------------------------
/code/mnist_dataset/mnist_readme.txt:
--------------------------------------------------------------------------------
1 | These are small subsets of the MNIST data set, transformed into CSV, and made available for easy testing as your code develops.
2 |
3 | The full dataset in CSV format is available at: http://pjreddie.com/projects/mnist-in-csv/
4 |
--------------------------------------------------------------------------------
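The readme above describes the CSV subsets: each record is a label followed by 784 comma-separated pixel values (a flattened 28x28 grayscale image, 0-255). A minimal parsing sketch, assuming the book's usual 0.01-1.00 input scaling; the helper name `parse_record` and the blank sample line are illustrative, not from the repo:

```python
import numpy as np

def parse_record(line):
    """Split one CSV record into (label, 28x28 scaled pixel array)."""
    values = line.strip().split(',')
    label = int(values[0])
    pixels = np.asarray(values[1:], dtype=float).reshape(28, 28)
    # scale 0..255 into 0.01..1.00 so no input is exactly zero
    scaled = (pixels / 255.0 * 0.99) + 0.01
    return label, scaled

# a synthetic all-black image labelled 7; real lines come from
# e.g. open('mnist_test_10.csv').readlines()
sample = '7,' + ','.join(['0'] * 784)
label, image = parse_record(sample)
print(label, image.shape)
```

The same function works unchanged on both `mnist_test_10.csv` and `mnist_train_100.csv`, since the full MNIST-in-CSV format linked above uses an identical layout.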
/code/mnist_dataset/mnist_test_10.csv:
--------------------------------------------------------------------------------
1 | 7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,84,185,159,151,60,36,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,222,254,254,254,254,241,198,198,198,198,198,198,198,198,170,52,0,0,0,0,0,0,0,0,0,0,0,0,67,114,72,114,163,227,254,225,254,254,254,250,229,254,254,140,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,17,66,14,67,67,67,59,21,236,254,106,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,83,253,209,18,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,22,233,255,83,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,129,254,238,44,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,59,249,254,62,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,133,254,187,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,9,205,248,58,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,126,254,182,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,75,251,240,57,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,19,221,254,166,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,3,203,254,219,35,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,38,254,254,77,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,31,224,254,115,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,133,254,254,52,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,61,242,254,254,52,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,121,254,254,219,40,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,121,254,207,18,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
2 | 2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,116,125,171,255,255,150,93,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,169,253,253,253,253,253,253,218,30,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,169,253,253,253,213,142,176,253,253,122,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,52,250,253,210,32,12,0,6,206,253,140,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,77,251,210,25,0,0,0,122,248,253,65,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,31,18,0,0,0,0,209,253,253,65,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,117,247,253,198,10,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,76,247,253,231,63,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,128,253,253,144,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,176,246,253,159,12,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,25,234,253,233,35,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,198,253,253,141,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,78,248,253,189,12,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,19,200,253,253,141,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,134,253,253,173,12,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,248,253,253,25,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,248,253,253,43,20,20,20,20,5,0,5,20,20,37,150,150,150,147,10,0,0,0,0,0,0,0,0,0,248,253,253,253,253,253,253,253,168,143,166,253,253,253,253,253,253,253,123,0,0,0,0,0,0,0,0,0,174,253,253,253,253,253,253,253,253,253,253,253,249,247,247,169,117,117,57,0,0,0,0,0,0,0,0,0,0,118,123,123,123,166,253,253,253,155,123,123,41,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
3 | 1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,38,254,109,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,87,252,82,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,135,241,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,45,244,150,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,84,254,63,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,202,223,11,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,32,254,216,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,95,254,195,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,140,254,77,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,57,237,205,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,124,255,165,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,171,254,81,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,24,232,215,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,120,254,159,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,151,254,142,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,228,254,66,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,61,251,254,66,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,141,254,205,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,10,215,254,121,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,5,198,176,10,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
4 | 0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,11,150,253,202,31,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,37,251,251,253,107,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,21,197,251,251,253,107,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,110,190,251,251,251,253,169,109,62,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,253,251,251,251,251,253,251,251,220,51,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,182,255,253,253,253,253,234,222,253,253,253,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,63,221,253,251,251,251,147,77,62,128,251,251,105,0,0,0,0,0,0,0,0,0,0,0,0,0,0,32,231,251,253,251,220,137,10,0,0,31,230,251,243,113,5,0,0,0,0,0,0,0,0,0,0,0,0,37,251,251,253,188,20,0,0,0,0,0,109,251,253,251,35,0,0,0,0,0,0,0,0,0,0,0,0,37,251,251,201,30,0,0,0,0,0,0,31,200,253,251,35,0,0,0,0,0,0,0,0,0,0,0,0,37,253,253,0,0,0,0,0,0,0,0,32,202,255,253,164,0,0,0,0,0,0,0,0,0,0,0,0,140,251,251,0,0,0,0,0,0,0,0,109,251,253,251,35,0,0,0,0,0,0,0,0,0,0,0,0,217,251,251,0,0,0,0,0,0,21,63,231,251,253,230,30,0,0,0,0,0,0,0,0,0,0,0,0,217,251,251,0,0,0,0,0,0,144,251,251,251,221,61,0,0,0,0,0,0,0,0,0,0,0,0,0,217,251,251,0,0,0,0,0,182,221,251,251,251,180,0,0,0,0,0,0,0,0,0,0,0,0,0,0,218,253,253,73,73,228,253,253,255,253,253,253,253,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,113,251,251,253,251,251,251,251,253,251,251,251,147,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,31,230,251,253,251,251,251,251,253,230,189,35,10,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,62,142,253,251,251,251,251,253,107,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,72,174,251,173,71,72,30,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
5 | 4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,50,224,0,0,0,0,0,0,0,70,29,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,121,231,0,0,0,0,0,0,0,148,168,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,4,195,231,0,0,0,0,0,0,0,96,210,11,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,69,252,134,0,0,0,0,0,0,0,114,252,21,0,0,0,0,0,0,0,0,0,0,0,0,0,0,45,236,217,12,0,0,0,0,0,0,0,192,252,21,0,0,0,0,0,0,0,0,0,0,0,0,0,0,168,247,53,0,0,0,0,0,0,0,18,255,253,21,0,0,0,0,0,0,0,0,0,0,0,0,0,84,242,211,0,0,0,0,0,0,0,0,141,253,189,5,0,0,0,0,0,0,0,0,0,0,0,0,0,169,252,106,0,0,0,0,0,0,0,32,232,250,66,0,0,0,0,0,0,0,0,0,0,0,0,0,15,225,252,0,0,0,0,0,0,0,0,134,252,211,0,0,0,0,0,0,0,0,0,0,0,0,0,0,22,252,164,0,0,0,0,0,0,0,0,169,252,167,0,0,0,0,0,0,0,0,0,0,0,0,0,0,9,204,209,18,0,0,0,0,0,0,22,253,253,107,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,169,252,199,85,85,85,85,129,164,195,252,252,106,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,41,170,245,252,252,252,252,232,231,251,252,252,9,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,49,84,84,84,84,0,0,161,252,252,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,127,252,252,45,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,128,253,253,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,127,252,252,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,135,252,244,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,232,236,111,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,179,66,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
6 | 1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,77,254,107,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,19,227,254,254,9,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,81,254,254,165,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,7,203,254,254,73,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,53,254,254,250,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,134,254,254,180,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,196,254,248,48,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,58,254,254,237,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,111,254,254,132,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,163,254,238,28,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,60,252,254,223,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,79,254,254,154,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,163,254,238,53,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,28,252,254,210,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,86,254,254,131,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,105,254,234,20,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,175,254,204,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,5,211,254,196,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,3,158,254,160,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,26,157,107,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
7 | 4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,22,192,134,32,0,0,0,0,0,0,0,0,15,77,5,0,0,0,0,0,0,0,0,0,0,0,0,17,235,250,169,0,0,0,0,0,0,0,0,15,220,241,37,0,0,0,0,0,0,0,0,0,0,0,20,189,253,147,0,0,0,0,0,0,0,0,0,139,253,100,0,0,0,0,0,0,0,0,0,0,0,0,70,253,253,21,0,0,0,0,0,0,0,0,43,254,173,13,0,0,0,0,0,0,0,0,0,0,0,22,153,253,96,0,0,0,0,0,0,0,0,43,231,254,92,0,0,0,0,0,0,0,0,0,0,0,0,163,255,204,11,0,0,0,0,0,0,0,0,104,254,158,0,0,0,0,0,0,0,0,0,0,0,0,0,162,253,178,5,0,0,0,0,0,0,9,131,237,253,0,0,0,0,0,0,0,0,0,0,0,0,0,0,162,253,253,191,175,70,70,70,70,133,197,253,253,169,0,0,0,0,0,0,0,0,0,0,0,0,0,0,51,228,253,253,254,253,253,253,253,254,253,253,219,35,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,17,65,137,254,232,137,137,137,44,253,253,161,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,34,254,206,21,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,160,253,69,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,85,254,241,50,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,158,254,165,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,231,244,50,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,104,254,232,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,208,253,157,0,13,30,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,208,253,154,91,204,161,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,208,253,254,253,154,29,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,61,190,128,23,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
8 | 9,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,14,149,193,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,91,224,253,253,19,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,28,235,254,253,253,166,18,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,144,253,254,253,253,253,238,115,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,31,241,253,208,185,253,253,253,231,24,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,79,254,193,0,8,98,219,254,255,201,18,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,86,253,80,0,0,0,182,253,254,191,12,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,175,253,155,0,0,0,234,253,254,135,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,86,253,208,40,85,166,251,237,254,236,42,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,18,238,253,254,253,253,185,36,216,253,152,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,68,240,255,254,145,8,0,134,254,223,35,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,68,158,142,12,0,0,9,175,253,161,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,88,253,226,18,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,2,166,253,126,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,48,245,253,38,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,115,254,172,9,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,21,218,254,46,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,30,254,165,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,186,244,42,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,14,223,78,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
9 | 5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,17,47,47,47,16,129,85,47,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,75,153,217,253,253,253,215,246,253,253,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,35,142,244,252,253,253,253,253,253,253,253,253,253,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,63,253,253,253,253,253,253,253,213,170,170,170,170,0,0,0,0,0,0,0,0,0,0,0,20,132,72,0,57,238,227,238,168,124,69,20,11,0,0,0,0,0,0,0,0,0,0,0,0,0,0,11,206,253,78,0,0,32,0,30,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,6,177,253,132,10,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,12,133,253,233,15,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,92,253,223,28,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,150,253,174,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,234,253,246,127,49,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,255,253,253,253,251,147,91,121,85,42,42,85,28,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,139,253,253,253,253,253,253,253,253,253,253,253,232,168,0,0,0,0,0,0,0,0,0,0,0,0,0,0,3,53,218,222,251,253,253,253,253,253,253,253,253,252,124,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,67,72,200,253,253,253,253,253,253,253,175,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,120,253,249,152,51,164,253,253,175,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,50,253,253,253,188,252,253,253,148,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,9,167,253,253,253,253,250,175,11,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,23,180,231,253,221,128,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,93,149,22,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
10 | 9,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,36,56,137,201,199,95,37,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,45,152,234,254,254,254,254,254,250,211,151,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,46,153,240,254,254,227,166,133,251,200,254,229,225,104,0,0,0,0,0,0,0,0,0,0,0,0,0,153,234,254,254,187,142,8,0,0,191,40,198,246,223,253,21,0,0,0,0,0,0,0,0,0,0,8,126,253,254,233,128,11,0,0,0,0,210,43,70,254,254,254,21,0,0,0,0,0,0,0,0,0,0,72,243,254,228,54,0,0,0,0,3,32,116,225,242,254,255,162,5,0,0,0,0,0,0,0,0,0,0,75,240,254,223,109,138,178,178,169,210,251,231,254,254,254,232,38,0,0,0,0,0,0,0,0,0,0,0,9,175,244,253,255,254,254,251,254,254,254,254,254,252,171,25,0,0,0,0,0,0,0,0,0,0,0,0,0,0,16,136,195,176,146,153,200,254,254,254,254,150,16,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,162,254,254,241,99,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,118,250,254,254,90,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,100,242,254,254,211,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,54,241,254,254,242,59,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,131,254,254,244,64,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,13,249,254,254,152,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,12,228,254,254,208,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,78,255,254,254,66,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,209,254,254,137,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,227,255,233,25,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,113,255,108,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
11 |
--------------------------------------------------------------------------------
/code/my_own_images/2828_my_own_2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/jash-git/Book-Python-Neural-Network/25d09c171230a606d52886d4bedb379889b37920/code/my_own_images/2828_my_own_2.png
--------------------------------------------------------------------------------
/code/my_own_images/2828_my_own_3.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/jash-git/Book-Python-Neural-Network/25d09c171230a606d52886d4bedb379889b37920/code/my_own_images/2828_my_own_3.png
--------------------------------------------------------------------------------
/code/my_own_images/2828_my_own_4.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/jash-git/Book-Python-Neural-Network/25d09c171230a606d52886d4bedb379889b37920/code/my_own_images/2828_my_own_4.png
--------------------------------------------------------------------------------
/code/my_own_images/2828_my_own_5.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/jash-git/Book-Python-Neural-Network/25d09c171230a606d52886d4bedb379889b37920/code/my_own_images/2828_my_own_5.png
--------------------------------------------------------------------------------
/code/my_own_images/2828_my_own_6.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/jash-git/Book-Python-Neural-Network/25d09c171230a606d52886d4bedb379889b37920/code/my_own_images/2828_my_own_6.png
--------------------------------------------------------------------------------
/code/my_own_images/2828_my_own_image.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/jash-git/Book-Python-Neural-Network/25d09c171230a606d52886d4bedb379889b37920/code/my_own_images/2828_my_own_image.png
--------------------------------------------------------------------------------
/code/my_own_images/2828_my_own_noisy_6.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/jash-git/Book-Python-Neural-Network/25d09c171230a606d52886d4bedb379889b37920/code/my_own_images/2828_my_own_noisy_6.png
--------------------------------------------------------------------------------
/code/my_own_images/readme.txt:
--------------------------------------------------------------------------------
1 | These files are PNG images of size 28 x 28
2 |
3 | The file name pattern is 2828_my_own_...N.png where N is the actual label (e.g. 3) and the ... can be any short text
4 |
--------------------------------------------------------------------------------
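The 28x28 images described above are flattened by the part3 notebooks into 784 greyscale values, inverted (MNIST digits are white-on-black, while these PNGs have a white background) and rescaled into the 0.01 to 1.00 range the network expects. A minimal sketch of that conversion, using a synthetic pixel array in place of actual PNG loading (the function name is illustrative, not from the notebooks):

```python
import numpy

def image_to_input(pixels):
    # pixels: 28x28 array of greyscale values 0-255, white background.
    # Invert so digits are bright on dark, matching the MNIST convention,
    # then scale from 0-255 into the range 0.01 to 1.00.
    img_data = 255.0 - numpy.asarray(pixels, dtype=float).reshape(784)
    return (img_data / 255.0 * 0.99) + 0.01

# a blank white image maps every input to the minimum value 0.01
blank = numpy.full((28, 28), 255)
record = image_to_input(blank)
print(record.shape, record.min(), record.max())  # → (784,) 0.01 0.01
```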
/code/part2_mnist_data_set.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": 2,
6 | "metadata": {
7 | "collapsed": true
8 | },
9 | "outputs": [],
10 | "source": [
11 | "# python notebook for Make Your Own Neural Network\n",
12 | "# working with the MNIST data set\n",
13 | "#\n",
14 | "# (c) Tariq Rashid, 2016\n",
15 | "# license is GPLv2"
16 | ]
17 | },
18 | {
19 | "cell_type": "code",
20 | "execution_count": 3,
21 | "metadata": {},
22 | "outputs": [],
23 | "source": [
24 | "import numpy\n",
25 | "import matplotlib.pyplot\n",
26 | "%matplotlib inline"
27 | ]
28 | },
29 | {
30 | "cell_type": "code",
31 | "execution_count": 4,
32 | "metadata": {},
33 | "outputs": [],
34 | "source": [
35 | "# open the CSV file and read its contents into a list\n",
36 | "data_file = open(\"mnist_dataset/mnist_train_100.csv\", 'r')\n",
37 | "data_list = data_file.readlines()\n",
38 | "data_file.close()"
39 | ]
40 | },
41 | {
42 | "cell_type": "code",
43 | "execution_count": 5,
44 | "metadata": {},
45 | "outputs": [
46 | {
47 | "data": {
48 | "text/plain": [
49 | "100"
50 | ]
51 | },
52 | "execution_count": 5,
53 | "metadata": {},
54 | "output_type": "execute_result"
55 | }
56 | ],
57 | "source": [
58 | "# check the number of data records (examples)\n",
59 | "len(data_list)"
60 | ]
61 | },
62 | {
63 | "cell_type": "code",
64 | "execution_count": 6,
65 | "metadata": {},
66 | "outputs": [
67 | {
68 | "data": {
69 | "text/plain": [
70 | "'0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,51,159,253,159,50,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,48,238,252,252,252,237,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,54,227,253,252,239,233,252,57,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,10,60,224,252,253,252,202,84,252,253,122,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,163,252,252,252,253,252,252,96,189,253,167,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,51,238,253,253,190,114,253,228,47,79,255,168,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,48,238,252,252,179,12,75,121,21,0,0,253,243,50,0,0,0,0,0,0,0,0,0,0,0,0,0,38,165,253,233,208,84,0,0,0,0,0,0,253,252,165,0,0,0,0,0,0,0,0,0,0,0,0,7,178,252,240,71,19,28,0,0,0,0,0,0,253,252,195,0,0,0,0,0,0,0,0,0,0,0,0,57,252,252,63,0,0,0,0,0,0,0,0,0,253,252,195,0,0,0,0,0,0,0,0,0,0,0,0,198,253,190,0,0,0,0,0,0,0,0,0,0,255,253,196,0,0,0,0,0,0,0,0,0,0,0,76,246,252,112,0,0,0,0,0,0,0,0,0,0,253,252,148,0,0,0,0,0,0,0,0,0,0,0,85,252,230,25,0,0,0,0,0,0,0,0,7,135,253,186,12,0,0,0,0,0,0,0,0,0,0,0,85,252,223,0,0,0,0,0,0,0,0,7,131,252,225,71,0,0,0,0,0,0,0,0,0,0,0,0,85,252,145,0,0,0,0,0,0,0,48,165,252,173,0,0,0,0,0,0,0,0,0,0,0,0,0,0,86,253,225,0,0,0,0,0,0,114,238,253,162,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,85,252,249,146,48,29,85,178,225,253,223,167,56,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,85,252,252,252,229,215,252,252,252,196,130,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,28,199,252,252,253,252,252,233,145,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,25,128,252,253,252,141,37,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0\\n'"
71 | ]
72 | },
73 | "execution_count": 6,
74 | "metadata": {},
75 | "output_type": "execute_result"
76 | }
77 | ],
78 | "source": [
79 | "# show a dataset record\n",
80 | "# the first number is the label, the rest are pixel colour values (greyscale 0-255)\n",
81 | "data_list[1]"
82 | ]
83 | },
84 | {
85 | "cell_type": "code",
86 | "execution_count": 7,
87 | "metadata": {},
88 | "outputs": [
89 | {
90 | "data": {
91 | "text/plain": [
92 | ""
93 | ]
94 | },
95 | "execution_count": 7,
96 | "metadata": {},
97 | "output_type": "execute_result"
98 | },
99 | {
100 | "data": {
101 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAP4AAAD8CAYAAABXXhlaAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAADxZJREFUeJzt3X+MVfWZx/HPg4pKiT8gyxCBAtXYNZsY0rUmG1e9xN2C\npgnSKMtq/MESUpKiuE0UITEMsn+0xBAhBiOWGqjVtlvDYv9xbTXXH5GuZIVdEJQaMzBOZcANKGg0\nsDz7x1zYO+Pc77nMuT/O8LxfycQ75zlnzjMHP3POvd9z79fcXQBiGdHuBgC0HsEHAiL4QEAEHwiI\n4AMBEXwgoFzBN7OZZvaeme01syWNagpAc9lQx/HNbISkvZJukvRnSdskzXX39wasx40CQJu4uw22\nPM8Z/1pJf3L3fe5+XNKvJM2qsfPTX8uXL+/3fdG+6O/s7a/IvTWjv5Q8wZ8gqbvq+48qywAUHC/u\nAQGdm2PbHknfrPp+YmXZ13R2dp5+fMkll+TYZfOVSqV2t5BEf0NX5N6k/P2Vy2WVy+W61s3z4t45\nkt5X34t7H0t6W9I/uvueAev5UPcBYOjMTF7jxb0hn/Hd/X/NbJGkl9X3lGHDwNADKKYhn/Hr3gFn\nfKAtUmd8XtwDAiL4QEAEHwiI4AMBEXwgIIIPBETwgYAIPhAQwQcCIvhAQAQfCIjgAwERfCAggg8E\nRPCBgAg+EBDBBwIi+EBABB8IiOADARF8ICCCDwRE8IGA8kyhhQCy5kTo7u6uWVu7dm1y28cffzxZ\nf+CBB5L1xYsXJ+uTJk1K1iPjjA8ERPCBgAg+EBDBBwIi+EBABB8IiOADAVmeuevNrEvSp5JOSjru\n7tcOso7n2QeaK+vfpqenJ1mfNm1azdqRI0dy7dts0KndTxszZkyyfvDgwWT9bGdmcvdBD2LeG3hO\nSiq5++GcPwdAC+W91LcG/AwALZY3tC7p92a2zcwWNKIhAM2X91L/Onf/2Mz+Qn1/APa4+5sDV+rs\n7Dz9uFQqqVQq5dwtgIHK5bLK5XJd6+Z6ca/fDzJbLumou68esJwX9wqMF/fOXqkX94Z8qW9mo8xs\ndOXxNyR9T9Kuof48AK2T51K/Q9JmM/PKz/mlu7/cmLYANFPDLvVr7oBL/bbKOvb79u1L1qdPn56s\n79+/v2Yt61L9oosuStbPP//8ZP3QoUPJ+t69e5P1yZMnJ+vnnHNOsl50TbnUBzB8EXwgIIIPBETw\ngYAIPhAQwQcCIvhAQIzjF1zWsTt+/HiynjVOf/PNNyfrXV1dyXqqv6xx/BtvvDFZX7lyZbJ+/fXX\nJ+tZ+3/qqaeS9fnz5yfrRcc4PoB+CD4QEMEHAiL4QEAEHwiI4AMBEXwgoLyfuYc2e+ihh5L1J554\nItfPb+Y9GK+99lqy/sUXXyTrs2fPTtY3b96crG/fvj1ZP5txxgcCIvhAQAQfCIjgAwERfCAggg8E\nRPCBgBjHb7OscfLu7u5k/dlnn83187NkjZWn6nfffXdy20mTJiXrV111VbKedQ/DCy+8kKxH/pwI\nzvhAQAQfCIjgAwERfCAggg8ERPCBgAg+EFDm5+qb2QZJ35fU6+5XV5ZdKunXkiZL6pI0x90/rbF9\n6M/Vz/rde3p6kvVp06Yl60eOHDnjnqrdeeedyfr69euT9d27d9esvfPOO8lt586dm6yPGjUqWc86\ntuedd16yPnr06GR9165dyXrWfQjtlvdz9Z+RNGPAsocl/cHdvy3pVUlL87UIoJUyg+/ub0o6PGDx\nLEkbK483Srq1wX0BaKKhPscf5+69kuTuBySNa1xLAJqtUffqJ59sdXZ2nn5cKpVUKpUatFsAp5TL\nZZXL5brWHWrwe82sw917zWy8pIOplauDD6A5Bp5UV6xYUXPdei/1rfJ1youS7q08vkfSljNpEEB7\nZQbfzJ6T9JakK81sv5nNk/QTSX9vZu
9LuqnyPYBhIvNS393vqFH6uwb3MixljSV/8sknyfqqVauS\n9cOHBw6o9NfR0ZGsT506NVlfuHBhsj5y5MhkPXWfQdY9CFmy5rfP6/PPP0/WH3vssWR9zZo1jWyn\npbhzDwiI4AMBEXwgIIIPBETwgYAIPhAQwQcC4nP1M2SN0584cSJZf/DBB5P1rM/Fv/jii5P1l156\nKVm/4oorkvXjx48n680eSy+yDz/8sN0tNA1nfCAggg8ERPCBgAg+EBDBBwIi+EBABB8IiHH8nPbv\n35+sZ43TZ9m6dWuyfuWVV+b6+RdeeGGu7TE8ccYHAiL4QEAEHwiI4AMBEXwgIIIPBETwgYAYx89p\n0aJFyXrW+/lnz56drGeN00d+v3yWkydPJusjRqTPe1n/dsMZZ3wgIIIPBETwgYAIPhAQwQcCIvhA\nQAQfCChzHN/MNkj6vqRed7+6smy5pAWSDlZWW+bu6Q94L6issdrt27cn66+//nqynjXOfvvtt+fa\nHrVlHbus+jXXXNPIdgqlnjP+M5JmDLJ8tbt/p/I1LEMPRJUZfHd/U9LhQUqcioBhKs9z/EVmtsPM\nfmZm6XmeABTKUO/VXyfpUXd3M/sXSaslza+1cmdn5+nHpVJJpVJpiLsFUEu5XFa5XK5r3SEF390P\nVX37tKTfpdavDj6A5hh4Ul2xYkXNdeu91DdVPac3s/FVtR9I2nVGHQJoq3qG856TVJI01sz2S1ou\nabqZTZN0UlKXpB82sUcADZYZfHe/Y5DFzzShl0L68ssvk/WvvvoqWb/sssuS9VtuueWMezpbZN1D\nceLEiWR97dq1yXrWOP1tt92WrC9dujRZH864cw8IiOADARF8ICCCDwRE8IGACD4QEMEHAuJz9Zvs\nggsuSNZHjx7dok5aL+84/ZNPPpmsL1myJFmfMmVKsr5s2bJkfeTIkcn6cMYZHwiI4AMBEXwgIIIP\nBETwgYAIPhAQwQcCYhy/ye66665kfbh/bn5qrL6npye57apVq5L1devWJevz5s1L1tevX5+sZxnu\n/zYpnPGBgAg+EBDBBwIi+EBABB8IiOADARF8ICDLes907h2YebP3kUdWb2+99VayfsMNNyTrU6dO\nTdY/+OCDZL3dso7P888/X7N2//33J7c9fHiwSZjr33716tXJ+tk8Dl8PM5O7D3oQOOMDARF8ICCC\nDwRE8IGACD4QEMEHAiL4QECZ78c3s4mSNknqkHRS0tPuvtbMLpX0a0mTJXVJmuPunzax17bIGgvO\nqnd3dyfrjz76aLI+f/78ZD3rc/nffffdZD3rPetvvPFGst7V1VWzdvnllye3nTt3brJ+3333JevR\nx+nzqOeMf0LSj939ryT9jaQfmdlfSnpY0h/c/duSXpW0tHltAmikzOC7+wF331F5fEzSHkkTJc2S\ntLGy2kZJtzarSQCNdUbP8c1siqRpkv4oqcPde6W+Pw6SxjW6OQDNUfdn7pnZaEm/lbTY3Y+Z2cCb\nuGve1N3Z2Xn6calUUqlUOrMuAWQql8sql8t1rVtX8M3sXPWF/hfuvqWyuNfMOty918zGSzpYa/vq\n4ANojoEn1RUrVtRct95L/Z9L2u3ua6qWvSjp3srjeyRtGbgRgGKqZzjvOkl3StppZtvVd0m/TNJP\nJf3GzP5J0j5Jc5rZKIDG4f34Gb1t3bo1Wc96P35eEyZMSNbHjBmTrO/cubOR7XzNjBkzatZmzpyZ\n3HbRokW59s04fhrvxwfQD8EHAiL4QEAEHwiI4AMBEXwgIIIPBMQ4fkZvn332WbI+Z076vqVXXnkl\n1/6z5B3LHjcu/d6qhQsXJuuPPPLIkPfNOHxzMY4PoB+CDwRE8IGACD4QEMEHAiL4QEAEHwgo/Dh+\nlqzejx49mqxv2rQpWV+8ePEZ91Qtayx85cqVyfqCBQuS9bFjx+baP9qHcXwA/RB8ICCCDwRE8IGA\nCD
4QEMEHAiL4QECM4+d0Nv9uEuP0wxnj+AD6IfhAQAQfCIjgAwERfCAggg8ElBl8M5toZq+a2btm\nttPM7qssX25mH5nZO5Wv9JzIAAojcxzfzMZLGu/uO8xstKT/lDRL0j9IOuruqzO2P6vH8YGiSo3j\nn5u1sbsfkHSg8viYme2RNOHUz25YlwBa5oye45vZFEnTJP1HZdEiM9thZj8zs4sb3BuAJqk7+JXL\n/N9KWuzuxyStk/Qtd5+mviuC5CU/gOLIvNSXJDM7V32h/4W7b5Ekdz9UtcrTkn5Xa/vOzs7Tj0ul\nkkql0hBaBZBSLpdVLpfrWreuN+mY2SZJn7j7j6uWja88/5eZ/bOk77r7HYNsy4t7QBukXtyr51X9\n6yS9LmmnJK98LZN0h/qe75+U1CXph+7eO8j2BB9og1zBb8DOCT7QBrwtF0A/BB8IiOADARF8ICCC\nDwRE8IGACD4QEMEHAiL4QEAEHwiI4AMBEXwgoJYHv973C7cL/eVT5P6K3JvU2v4I/gD0l0+R+yty\nb9JZHnwA7UfwgYBa8kEcTd0BgJra9gk8AIqHS30gIIIPBNSy4JvZTDN7z8z2mtmSVu23XmbWZWb/\nZWbbzeztAvSzwcx6zey/q5ZdamYvm9n7Zvbv7Zy9qEZ/hZlIdZDJXu+vLC/EMWz3ZLQteY5vZiMk\n7ZV0k6Q/S9omaa67v9f0ndfJzD6U9NfufrjdvUiSmf2tpGOSNrn71ZVlP5X0P+6+qvLH81J3f7hA\n/S1XHROptkJistd5KsAxzDsZbV6tOuNfK+lP7r7P3Y9L+pX6fskiMRXoqY+7vylp4B+hWZI2Vh5v\nlHRrS5uqUqM/qSATqbr7AXffUXl8TNIeSRNVkGNYo7+WTUbbqv/RJ0jqrvr+I/3/L1kULun3ZrbN\nzBa0u5kaxp2atKQyi9G4NvczmMJNpFo12esfJXUU7Ri2YzLawpzhCuA6d/+OpFsk/ahyKVt0RRuL\nLdxEqoNM9jrwmLX1GLZrMtpWBb9H0jervp9YWVYY7v5x5b+HJG1W39OTouk1sw7p9HPEg23upx93\nP1Q1bdLTkr7bzn4Gm+xVBTqGtSajbcUxbFXwt0m6wswmm9lISXMlvdiifWcys1GVv7wys29I+p6k\nXe3tSlLfc73q53svSrq38vgeSVsGbtBi/fqrBOmUH6j9x/Dnkna7+5qqZUU6hl/rr1XHsGV37lWG\nJdao74/NBnf/SUt2XAczm6q+s7yrb+rwX7a7PzN7TlJJ0lhJvZKWS/o3Sf8qaZKkfZLmuPuRAvU3\nXXVMpNqi/mpN9vq2pN+ozccw72S0uffPLbtAPLy4BwRE8IGACD4QEMEHAiL4QEAEHwiI4AMBEXwg\noP8DyWBnmMFJ3d4AAAAASUVORK5CYII=\n",
102 | "text/plain": [
103 | ""
104 | ]
105 | },
106 | "metadata": {},
107 | "output_type": "display_data"
108 | }
109 | ],
110 | "source": [
111 | "# take the data from a record, rearrange it into a 28*28 array and plot it as an image\n",
112 | "all_values = data_list[1].split(',')\n",
113 | "image_array = numpy.asfarray(all_values[1:]).reshape((28,28))\n",
114 | "matplotlib.pyplot.imshow(image_array, cmap='Greys', interpolation='None')"
115 | ]
116 | },
117 | {
118 | "cell_type": "code",
119 | "execution_count": 8,
120 | "metadata": {},
121 | "outputs": [
122 | {
123 | "name": "stdout",
124 | "output_type": "stream",
125 | "text": [
126 | "[ 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
127 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
128 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
129 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
130 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
131 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
132 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
133 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
134 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
135 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
136 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
137 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
138 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
139 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
140 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
141 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
142 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
143 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
144 | " 0.01 0.208 0.62729412 0.99223529 0.62729412 0.20411765\n",
145 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
146 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
147 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
148 | " 0.01 0.19635294 0.934 0.98835294 0.98835294 0.98835294\n",
149 | " 0.93011765 0.01 0.01 0.01 0.01 0.01 0.01\n",
150 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
151 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
152 | " 0.01 0.21964706 0.89129412 0.99223529 0.98835294 0.93788235\n",
153 | " 0.91458824 0.98835294 0.23129412 0.03329412 0.01 0.01 0.01\n",
154 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
155 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
156 | " 0.04882353 0.24294118 0.87964706 0.98835294 0.99223529 0.98835294\n",
157 | " 0.79423529 0.33611765 0.98835294 0.99223529 0.48364706 0.01 0.01\n",
158 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
159 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
160 | " 0.01 0.64282353 0.98835294 0.98835294 0.98835294 0.99223529\n",
161 | " 0.98835294 0.98835294 0.38270588 0.74376471 0.99223529 0.65835294\n",
162 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
163 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
164 | " 0.01 0.01 0.208 0.934 0.99223529 0.99223529\n",
165 | " 0.74764706 0.45258824 0.99223529 0.89517647 0.19247059 0.31670588\n",
166 | " 1. 0.66223529 0.01 0.01 0.01 0.01 0.01\n",
167 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
168 | " 0.01 0.01 0.01 0.19635294 0.934 0.98835294\n",
169 | " 0.98835294 0.70494118 0.05658824 0.30117647 0.47976471 0.09152941\n",
170 | " 0.01 0.01 0.99223529 0.95341176 0.20411765 0.01 0.01\n",
171 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
172 | " 0.01 0.01 0.01 0.01 0.15752941 0.65058824\n",
173 | " 0.99223529 0.91458824 0.81752941 0.33611765 0.01 0.01 0.01\n",
174 | " 0.01 0.01 0.01 0.99223529 0.98835294 0.65058824\n",
175 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
176 | " 0.01 0.01 0.01 0.01 0.01 0.03717647\n",
177 | " 0.70105882 0.98835294 0.94176471 0.28564706 0.08376471 0.11870588\n",
178 | " 0.01 0.01 0.01 0.01 0.01 0.01\n",
179 | " 0.99223529 0.98835294 0.76705882 0.01 0.01 0.01 0.01\n",
180 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
181 | " 0.01 0.23129412 0.98835294 0.98835294 0.25458824 0.01 0.01\n",
182 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
183 | " 0.99223529 0.98835294 0.76705882 0.01 0.01 0.01 0.01\n",
184 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
185 | " 0.01 0.77870588 0.99223529 0.74764706 0.01 0.01 0.01\n",
186 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
187 | " 1. 0.99223529 0.77094118 0.01 0.01 0.01 0.01\n",
188 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
189 | " 0.30505882 0.96505882 0.98835294 0.44482353 0.01 0.01 0.01\n",
190 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
191 | " 0.99223529 0.98835294 0.58458824 0.01 0.01 0.01 0.01\n",
192 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
193 | " 0.34 0.98835294 0.90294118 0.10705882 0.01 0.01 0.01\n",
194 | " 0.01 0.01 0.01 0.01 0.01 0.03717647\n",
195 | " 0.53411765 0.99223529 0.73211765 0.05658824 0.01 0.01 0.01\n",
196 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
197 | " 0.01 0.34 0.98835294 0.87576471 0.01 0.01 0.01\n",
198 | " 0.01 0.01 0.01 0.01 0.01 0.03717647\n",
199 | " 0.51858824 0.98835294 0.88352941 0.28564706 0.01 0.01 0.01\n",
200 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
201 | " 0.01 0.01 0.34 0.98835294 0.57294118 0.01 0.01\n",
202 | " 0.01 0.01 0.01 0.01 0.01 0.19635294\n",
203 | " 0.65058824 0.98835294 0.68164706 0.01 0.01 0.01 0.01\n",
204 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
205 | " 0.01 0.01 0.01 0.34388235 0.99223529 0.88352941\n",
206 | " 0.01 0.01 0.01 0.01 0.01 0.01\n",
207 | " 0.45258824 0.934 0.99223529 0.63894118 0.01 0.01 0.01\n",
208 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
209 | " 0.01 0.01 0.01 0.01 0.01 0.34\n",
210 | " 0.98835294 0.97670588 0.57682353 0.19635294 0.12258824 0.34\n",
211 | " 0.70105882 0.88352941 0.99223529 0.87576471 0.65835294 0.22741176\n",
212 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
213 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
214 | " 0.01 0.34 0.98835294 0.98835294 0.98835294 0.89905882\n",
215 | " 0.84470588 0.98835294 0.98835294 0.98835294 0.77094118 0.51470588\n",
216 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
217 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
218 | " 0.01 0.01 0.01 0.11870588 0.78258824 0.98835294\n",
219 | " 0.98835294 0.99223529 0.98835294 0.98835294 0.91458824 0.57294118\n",
220 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
221 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
222 | " 0.01 0.01 0.01 0.01 0.01 0.01\n",
223 | " 0.10705882 0.50694118 0.98835294 0.99223529 0.98835294 0.55741176\n",
224 | " 0.15364706 0.01 0.01 0.01 0.01 0.01 0.01\n",
225 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
226 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
227 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
228 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
229 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
230 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
231 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
232 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
233 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
234 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
235 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
236 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
237 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
238 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
239 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
240 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
241 | " 0.01 0.01 0.01 0.01 0.01 0.01 0.01\n",
242 | " 0.01 ]\n"
243 | ]
244 | }
245 | ],
246 | "source": [
247 | "# scale input to range 0.01 to 1.00\n",
248 | "scaled_input = (numpy.asfarray(all_values[1:]) / 255.0 * 0.99) + 0.01\n",
249 | "print(scaled_input)"
250 | ]
251 | },
252 | {
253 | "cell_type": "code",
254 | "execution_count": 12,
255 | "metadata": {},
256 | "outputs": [],
257 | "source": [
258 | "#output nodes is 10 (example)\n",
259 | "onodes = 10\n",
260 | "targets = numpy.zeros(onodes) + 0.01\n",
261 | "targets[int(all_values[0])] = 0.99"
262 | ]
263 | },
264 | {
265 | "cell_type": "code",
266 | "execution_count": 13,
267 | "metadata": {},
268 | "outputs": [
269 | {
270 | "name": "stdout",
271 | "output_type": "stream",
272 | "text": [
273 | "[ 0.99 0.01 0.01 0.01 0.01 0.01 0.01 0.01 0.01 0.01]\n"
274 | ]
275 | }
276 | ],
277 | "source": [
278 | "print(targets)"
279 | ]
280 | },
281 | {
282 | "cell_type": "code",
283 | "execution_count": null,
284 | "metadata": {
285 | "collapsed": true
286 | },
287 | "outputs": [],
288 | "source": []
289 | }
290 | ],
291 | "metadata": {
292 | "kernelspec": {
293 | "display_name": "Python 3",
294 | "language": "python",
295 | "name": "python3"
296 | },
297 | "language_info": {
298 | "codemirror_mode": {
299 | "name": "ipython",
300 | "version": 3
301 | },
302 | "file_extension": ".py",
303 | "mimetype": "text/x-python",
304 | "name": "python",
305 | "nbconvert_exporter": "python",
306 | "pygments_lexer": "ipython3",
307 | "version": "3.6.1"
308 | }
309 | },
310 | "nbformat": 4,
311 | "nbformat_minor": 1
312 | }
313 |
--------------------------------------------------------------------------------
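The preprocessing steps walked through in part2_mnist_data_set.ipynb can be condensed into a short standalone sketch: split a CSV record into label and pixels, scale the pixels into 0.01 to 1.00, and build the 10-element target vector. The fake all-zero record here is purely illustrative, and numpy.asarray(..., dtype=float) stands in for the notebook's numpy.asfarray (removed in NumPy 2.0):

```python
import numpy

# one fake record in the MNIST CSV layout: label first, then 784 pixel values
record = "5," + ",".join(["0"] * 784)

all_values = record.split(',')
# scale pixel values from 0-255 into the range 0.01 to 1.00
scaled_input = (numpy.asarray(all_values[1:], dtype=float) / 255.0 * 0.99) + 0.01
# target vector: 0.01 everywhere except 0.99 at the label's index
targets = numpy.zeros(10) + 0.01
targets[int(all_values[0])] = 0.99
print(scaled_input.shape, targets[5])  # → (784,) 0.99
```

Avoiding exact 0.0 inputs and 1.0 targets keeps the sigmoid out of its saturated region, where weight updates would vanish.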
/code/part2_neural_network.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": 1,
6 | "metadata": {
7 | "collapsed": true
8 | },
9 | "outputs": [],
10 | "source": [
11 | "# python notebook for Make Your Own Neural Network\n",
12 | "# (c) Tariq Rashid, 2016\n",
13 | "# license is GPLv2"
14 | ]
15 | },
16 | {
17 | "cell_type": "code",
18 | "execution_count": 2,
19 | "metadata": {
20 | "collapsed": true
21 | },
22 | "outputs": [],
23 | "source": [
24 | "import numpy\n",
25 | "# scipy.special for the sigmoid function expit()\n",
26 | "import scipy.special"
27 | ]
28 | },
29 | {
30 | "cell_type": "code",
31 | "execution_count": 3,
32 | "metadata": {},
33 | "outputs": [],
34 | "source": [
35 | "# neural network class definition\n",
36 | "class neuralNetwork:\n",
37 | " \n",
38 | " \n",
39 | " # initialise the neural network\n",
40 | " def __init__(self, inputnodes, hiddennodes, outputnodes, learningrate):\n",
41 | " # set number of nodes in each input, hidden, output layer\n",
42 | " self.inodes = inputnodes\n",
43 | " self.hnodes = hiddennodes\n",
44 | " self.onodes = outputnodes\n",
45 | " \n",
46 | " # link weight matrices, wih and who\n",
47 | " # weights inside the arrays are w_i_j, where link is from node i to node j in the next layer\n",
48 | " # w11 w21\n",
49 | " # w12 w22 etc \n",
50 | " self.wih = numpy.random.normal(0.0, pow(self.inodes, -0.5), (self.hnodes, self.inodes))\n",
51 | " self.who = numpy.random.normal(0.0, pow(self.hnodes, -0.5), (self.onodes, self.hnodes))\n",
52 | "\n",
53 | " # learning rate\n",
54 | " self.lr = learningrate\n",
55 | " \n",
56 | " # activation function is the sigmoid function\n",
57 | " self.activation_function = lambda x: scipy.special.expit(x)\n",
58 | " \n",
59 | " pass\n",
60 | "\n",
61 | " \n",
62 | " # train the neural network\n",
63 | " def train(self, inputs_list, targets_list):\n",
64 | " # convert inputs list to 2d array\n",
65 | " inputs = numpy.array(inputs_list, ndmin=2).T\n",
66 | " targets = numpy.array(targets_list, ndmin=2).T\n",
67 | " \n",
68 | " # calculate signals into hidden layer\n",
69 | " hidden_inputs = numpy.dot(self.wih, inputs)\n",
70 | " # calculate the signals emerging from hidden layer\n",
71 | " hidden_outputs = self.activation_function(hidden_inputs)\n",
72 | " \n",
73 | " # calculate signals into final output layer\n",
74 | " final_inputs = numpy.dot(self.who, hidden_outputs)\n",
75 | " # calculate the signals emerging from final output layer\n",
76 | " final_outputs = self.activation_function(final_inputs)\n",
77 | " \n",
78 | " # output layer error is the (target - actual)\n",
79 | " output_errors = targets - final_outputs\n",
80 | " # hidden layer error is the output_errors, split by weights, recombined at hidden nodes\n",
81 | " hidden_errors = numpy.dot(self.who.T, output_errors) \n",
82 | " \n",
83 | " # update the weights for the links between the hidden and output layers\n",
84 | " self.who += self.lr * numpy.dot((output_errors * final_outputs * (1.0 - final_outputs)), numpy.transpose(hidden_outputs))\n",
85 | " \n",
86 | " # update the weights for the links between the input and hidden layers\n",
87 | " self.wih += self.lr * numpy.dot((hidden_errors * hidden_outputs * (1.0 - hidden_outputs)), numpy.transpose(inputs))\n",
88 | " \n",
89 | " pass\n",
90 | "\n",
91 | " \n",
92 | " # query the neural network\n",
93 | " def query(self, inputs_list):\n",
94 | " # convert inputs list to 2d array\n",
95 | " inputs = numpy.array(inputs_list, ndmin=2).T\n",
96 | " \n",
97 | " # calculate signals into hidden layer\n",
98 | " hidden_inputs = numpy.dot(self.wih, inputs)\n",
99 | " # calculate the signals emerging from hidden layer\n",
100 | " hidden_outputs = self.activation_function(hidden_inputs)\n",
101 | " \n",
102 | " # calculate signals into final output layer\n",
103 | " final_inputs = numpy.dot(self.who, hidden_outputs)\n",
104 | " # calculate the signals emerging from final output layer\n",
105 | " final_outputs = self.activation_function(final_inputs)\n",
106 | " \n",
107 | " return final_outputs"
108 | ]
109 | },
110 | {
111 | "cell_type": "code",
112 | "execution_count": 4,
113 | "metadata": {},
114 | "outputs": [],
115 | "source": [
116 | "# number of input, hidden and output nodes\n",
117 | "input_nodes = 3\n",
118 | "hidden_nodes = 3\n",
119 | "output_nodes = 3\n",
120 | "\n",
121 | "# learning rate is 0.3\n",
122 | "learning_rate = 0.3\n",
123 | "\n",
124 | "# create instance of neural network\n",
125 | "n = neuralNetwork(input_nodes, hidden_nodes, output_nodes, learning_rate)"
126 | ]
127 | },
128 | {
129 | "cell_type": "code",
130 | "execution_count": 5,
131 | "metadata": {},
132 | "outputs": [
133 | {
134 | "data": {
135 | "text/plain": [
136 | "array([[ 0.43461026],\n",
137 | " [ 0.40331273],\n",
138 | " [ 0.56675401]])"
139 | ]
140 | },
141 | "execution_count": 5,
142 | "metadata": {},
143 | "output_type": "execute_result"
144 | }
145 | ],
146 | "source": [
147 | "# test query (doesn't mean anything useful yet)\n",
148 | "n.query([1.0, 0.5, -1.5])"
149 | ]
150 | },
151 | {
152 | "cell_type": "code",
153 | "execution_count": null,
154 | "metadata": {
155 | "collapsed": true
156 | },
157 | "outputs": [],
158 | "source": []
159 | }
160 | ],
161 | "metadata": {
162 | "kernelspec": {
163 | "display_name": "Python 3",
164 | "language": "python",
165 | "name": "python3"
166 | },
167 | "language_info": {
168 | "codemirror_mode": {
169 | "name": "ipython",
170 | "version": 3
171 | },
172 | "file_extension": ".py",
173 | "mimetype": "text/x-python",
174 | "name": "python",
175 | "nbconvert_exporter": "python",
176 | "pygments_lexer": "ipython3",
177 | "version": "3.6.1"
178 | }
179 | },
180 | "nbformat": 4,
181 | "nbformat_minor": 1
182 | }
183 |
--------------------------------------------------------------------------------
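The train() method above applies the book's two weight-update lines, dW = lr * (errors * outputs * (1 - outputs)) @ previous_outputs.T, with hidden errors formed by who.T @ output_errors. A minimal standalone sketch (not part of the repo; it re-implements the same 3-3-3 network with plain numpy and a hand-written sigmoid) checks that repeatedly applying those updates actually shrinks the output error:

```python
import numpy

# Minimal sketch of the notebook's update rule on a tiny 3-3-3 network.
# The seed, inputs and targets here are arbitrary choices for illustration.
numpy.random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + numpy.exp(-x))

lr = 0.3
# same initialisation as the notebook: normal, std = 1/sqrt(incoming nodes)
wih = numpy.random.normal(0.0, 3 ** -0.5, (3, 3))
who = numpy.random.normal(0.0, 3 ** -0.5, (3, 3))

inputs = numpy.array([[1.0], [0.5], [-1.5]])
targets = numpy.array([[0.99], [0.01], [0.01]])

def forward():
    h = sigmoid(wih @ inputs)      # signals emerging from the hidden layer
    o = sigmoid(who @ h)           # signals emerging from the output layer
    return h, o

_, o0 = forward()
err_before = numpy.sum((targets - o0) ** 2)

for _ in range(100):
    h, o = forward()
    output_errors = targets - o
    # hidden error: output error split back through the who weights
    hidden_errors = who.T @ output_errors
    who += lr * (output_errors * o * (1.0 - o)) @ h.T
    wih += lr * (hidden_errors * h * (1.0 - h)) @ inputs.T

_, o1 = forward()
err_after = numpy.sum((targets - o1) ** 2)
print(err_before, err_after)   # squared error should shrink after training
```

The untrained query in the notebook returns values near 0.5 for the same reason the check starts there: small random weights fed through a sigmoid.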
/code/part2_neural_network_mnist_data.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": 1,
6 | "metadata": {
7 | "collapsed": true
8 | },
9 | "outputs": [],
10 | "source": [
11 | "# python notebook for Make Your Own Neural Network\n",
12 | "# code for a 3-layer neural network, and code for learning the MNIST dataset\n",
13 | "# (c) Tariq Rashid, 2016\n",
14 | "# license is GPLv2"
15 | ]
16 | },
17 | {
18 | "cell_type": "code",
19 | "execution_count": 2,
20 | "metadata": {
21 | "collapsed": true
22 | },
23 | "outputs": [],
24 | "source": [
25 | "import numpy\n",
26 | "# scipy.special for the sigmoid function expit()\n",
27 | "import scipy.special\n",
28 | "# library for plotting arrays\n",
29 | "import matplotlib.pyplot\n",
30 | "# ensure the plots are inside this notebook, not an external window\n",
31 | "%matplotlib inline"
32 | ]
33 | },
34 | {
35 | "cell_type": "code",
36 | "execution_count": 3,
37 | "metadata": {},
38 | "outputs": [],
39 | "source": [
40 | "# neural network class definition\n",
41 | "class neuralNetwork:\n",
42 | " \n",
43 | " \n",
44 | " # initialise the neural network\n",
45 | " def __init__(self, inputnodes, hiddennodes, outputnodes, learningrate):\n",
46 | " # set number of nodes in each input, hidden, output layer\n",
47 | " self.inodes = inputnodes\n",
48 | " self.hnodes = hiddennodes\n",
49 | " self.onodes = outputnodes\n",
50 | " \n",
51 | " # link weight matrices, wih and who\n",
52 | " # weights inside the arrays are w_i_j, where link is from node i to node j in the next layer\n",
53 | " # w11 w21\n",
54 | " # w12 w22 etc \n",
55 | " self.wih = numpy.random.normal(0.0, pow(self.inodes, -0.5), (self.hnodes, self.inodes))\n",
56 | " self.who = numpy.random.normal(0.0, pow(self.hnodes, -0.5), (self.onodes, self.hnodes))\n",
57 | "\n",
58 | " # learning rate\n",
59 | " self.lr = learningrate\n",
60 | " \n",
61 | " # activation function is the sigmoid function\n",
62 | " self.activation_function = lambda x: scipy.special.expit(x)\n",
63 | " \n",
64 | " pass\n",
65 | "\n",
66 | " \n",
67 | " # train the neural network\n",
68 | " def train(self, inputs_list, targets_list):\n",
69 | " # convert inputs list to 2d array\n",
70 | " inputs = numpy.array(inputs_list, ndmin=2).T\n",
71 | " targets = numpy.array(targets_list, ndmin=2).T\n",
72 | " \n",
73 | " # calculate signals into hidden layer\n",
74 | " hidden_inputs = numpy.dot(self.wih, inputs)\n",
75 | " # calculate the signals emerging from hidden layer\n",
76 | " hidden_outputs = self.activation_function(hidden_inputs)\n",
77 | " \n",
78 | " # calculate signals into final output layer\n",
79 | " final_inputs = numpy.dot(self.who, hidden_outputs)\n",
80 | " # calculate the signals emerging from final output layer\n",
81 | " final_outputs = self.activation_function(final_inputs)\n",
82 | " \n",
83 | " # output layer error is the (target - actual)\n",
84 | " output_errors = targets - final_outputs\n",
85 | " # hidden layer error is the output_errors, split by weights, recombined at hidden nodes\n",
86 | " hidden_errors = numpy.dot(self.who.T, output_errors) \n",
87 | " \n",
88 | " # update the weights for the links between the hidden and output layers\n",
89 | " self.who += self.lr * numpy.dot((output_errors * final_outputs * (1.0 - final_outputs)), numpy.transpose(hidden_outputs))\n",
90 | " \n",
91 | " # update the weights for the links between the input and hidden layers\n",
92 | " self.wih += self.lr * numpy.dot((hidden_errors * hidden_outputs * (1.0 - hidden_outputs)), numpy.transpose(inputs))\n",
93 | " \n",
94 | " pass\n",
95 | "\n",
96 | " \n",
97 | " # query the neural network\n",
98 | " def query(self, inputs_list):\n",
99 | " # convert inputs list to 2d array\n",
100 | " inputs = numpy.array(inputs_list, ndmin=2).T\n",
101 | " \n",
102 | " # calculate signals into hidden layer\n",
103 | " hidden_inputs = numpy.dot(self.wih, inputs)\n",
104 | " # calculate the signals emerging from hidden layer\n",
105 | " hidden_outputs = self.activation_function(hidden_inputs)\n",
106 | " \n",
107 | " # calculate signals into final output layer\n",
108 | " final_inputs = numpy.dot(self.who, hidden_outputs)\n",
109 | " # calculate the signals emerging from final output layer\n",
110 | " final_outputs = self.activation_function(final_inputs)\n",
111 | " \n",
112 | " return final_outputs"
113 | ]
114 | },
115 | {
116 | "cell_type": "code",
117 | "execution_count": 4,
118 | "metadata": {},
119 | "outputs": [],
120 | "source": [
121 | "# number of input, hidden and output nodes\n",
122 | "input_nodes = 784\n",
123 | "hidden_nodes = 200\n",
124 | "output_nodes = 10\n",
125 | "\n",
126 | "# learning rate\n",
127 | "learning_rate = 0.1\n",
128 | "\n",
129 | "# create instance of neural network\n",
130 | "n = neuralNetwork(input_nodes, hidden_nodes, output_nodes, learning_rate)"
131 | ]
132 | },
133 | {
134 | "cell_type": "code",
135 | "execution_count": 5,
136 | "metadata": {},
137 | "outputs": [],
138 | "source": [
139 | "# load the mnist training data CSV file into a list\n",
140 | "training_data_file = open(\"mnist_dataset/mnist_train.csv\", 'r')\n",
141 | "training_data_list = training_data_file.readlines()\n",
142 | "training_data_file.close()"
143 | ]
144 | },
145 | {
146 | "cell_type": "code",
147 | "execution_count": 6,
148 | "metadata": {},
149 | "outputs": [],
150 | "source": [
151 | "# train the neural network\n",
152 | "\n",
153 | "# epochs is the number of times the training data set is used for training\n",
154 | "epochs = 5\n",
155 | "\n",
156 | "for e in range(epochs):\n",
157 | " # go through all records in the training data set\n",
158 | " for record in training_data_list:\n",
159 | " # split the record by the ',' commas\n",
160 | " all_values = record.split(',')\n",
161 | " # scale and shift the inputs\n",
162 | " inputs = (numpy.asfarray(all_values[1:]) / 255.0 * 0.99) + 0.01\n",
163 | " # create the target output values (all 0.01, except the desired label which is 0.99)\n",
164 | " targets = numpy.zeros(output_nodes) + 0.01\n",
165 | " # all_values[0] is the target label for this record\n",
166 | " targets[int(all_values[0])] = 0.99\n",
167 | " n.train(inputs, targets)\n",
168 | " pass\n",
169 | " pass"
170 | ]
171 | },
172 | {
173 | "cell_type": "code",
174 | "execution_count": 7,
175 | "metadata": {
176 | "collapsed": true
177 | },
178 | "outputs": [],
179 | "source": [
180 | "# load the mnist test data CSV file into a list\n",
181 | "test_data_file = open(\"mnist_dataset/mnist_test.csv\", 'r')\n",
182 | "test_data_list = test_data_file.readlines()\n",
183 | "test_data_file.close()"
184 | ]
185 | },
186 | {
187 | "cell_type": "code",
188 | "execution_count": 8,
189 | "metadata": {},
190 | "outputs": [],
191 | "source": [
192 | "# test the neural network\n",
193 | "\n",
194 | "# scorecard for how well the network performs, initially empty\n",
195 | "scorecard = []\n",
196 | "\n",
197 | "# go through all the records in the test data set\n",
198 | "for record in test_data_list:\n",
199 | " # split the record by the ',' commas\n",
200 | " all_values = record.split(',')\n",
201 | " # correct answer is first value\n",
202 | " correct_label = int(all_values[0])\n",
203 | " # scale and shift the inputs\n",
204 | " inputs = (numpy.asfarray(all_values[1:]) / 255.0 * 0.99) + 0.01\n",
205 | " # query the network\n",
206 | " outputs = n.query(inputs)\n",
207 | " # the index of the highest value corresponds to the label\n",
208 | " label = numpy.argmax(outputs)\n",
209 | " # append correct or incorrect to list\n",
210 | " if (label == correct_label):\n",
211 | " # network's answer matches correct answer, add 1 to scorecard\n",
212 | " scorecard.append(1)\n",
213 | " else:\n",
214 | " # network's answer doesn't match correct answer, add 0 to scorecard\n",
215 | " scorecard.append(0)\n",
216 | " pass\n",
217 | " \n",
218 | " pass"
219 | ]
220 | },
221 | {
222 | "cell_type": "code",
223 | "execution_count": 9,
224 | "metadata": {},
225 | "outputs": [
226 | {
227 | "name": "stdout",
228 | "output_type": "stream",
229 | "text": [
230 | "performance = 0.9712\n"
231 | ]
232 | }
233 | ],
234 | "source": [
235 | "# calculate the performance score, the fraction of correct answers\n",
236 | "scorecard_array = numpy.asarray(scorecard)\n",
237 | "print(\"performance = \", scorecard_array.sum() / scorecard_array.size)"
238 | ]
239 | },
240 | {
241 | "cell_type": "code",
242 | "execution_count": null,
243 | "metadata": {
244 | "collapsed": true
245 | },
246 | "outputs": [],
247 | "source": []
248 | }
249 | ],
250 | "metadata": {
251 | "kernelspec": {
252 | "display_name": "Python 3",
253 | "language": "python",
254 | "name": "python3"
255 | },
256 | "language_info": {
257 | "codemirror_mode": {
258 | "name": "ipython",
259 | "version": 3
260 | },
261 | "file_extension": ".py",
262 | "mimetype": "text/x-python",
263 | "name": "python",
264 | "nbconvert_exporter": "python",
265 | "pygments_lexer": "ipython3",
266 | "version": "3.6.1"
267 | }
268 | },
269 | "nbformat": 4,
270 | "nbformat_minor": 1
271 | }
272 |
--------------------------------------------------------------------------------
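The MNIST training loop above does two preprocessing steps per CSV record: it rescales raw pixel values 0..255 into 0.01..1.0 (so no input is exactly zero, which would zero out the wih update), and builds a target vector that is 0.01 everywhere except 0.99 at the label's index. A small sketch with a made-up four-value record (label 5, pixels 0/128/255; `numpy.asarray(..., dtype=float)` stands in for the notebook's `numpy.asfarray`, which newer NumPy releases removed):

```python
import numpy

# hypothetical mini-record: label first, then pixel values, as in the CSV
record = "5,0,128,255"
all_values = record.split(',')

# scale and shift the inputs exactly as the notebook does
inputs = (numpy.asarray(all_values[1:], dtype=float) / 255.0 * 0.99) + 0.01

# targets: all 0.01, except 0.99 at the label index
output_nodes = 10
targets = numpy.zeros(output_nodes) + 0.01
targets[int(all_values[0])] = 0.99

print(inputs)       # smallest pixel maps to 0.01, largest to 1.0
print(targets)      # 0.99 only at index 5
```

The test loop then reverses the encoding with numpy.argmax: the output node with the highest activation is taken as the network's answer.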
/code/part3_mnist_data_set_with_rotations.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": 1,
6 | "metadata": {
7 | "collapsed": true
8 | },
9 | "outputs": [],
10 | "source": [
11 | "# python notebook for Make Your Own Neural Network\n",
12 | "# working with the MNIST data set\n",
13 | "# this code demonstrates rotating the training images to create more examples\n",
14 | "#\n",
15 | "# (c) Tariq Rashid, 2016\n",
16 | "# license is GPLv2"
17 | ]
18 | },
19 | {
20 | "cell_type": "code",
21 | "execution_count": 2,
22 | "metadata": {},
23 | "outputs": [],
24 | "source": [
25 | "import numpy\n",
26 | "import matplotlib.pyplot\n",
27 | "%matplotlib inline"
28 | ]
29 | },
30 | {
31 | "cell_type": "code",
32 | "execution_count": 3,
33 | "metadata": {
34 | "collapsed": true
35 | },
36 | "outputs": [],
37 | "source": [
38 | "# scipy.ndimage for rotating image arrays\n",
39 | "import scipy.ndimage"
40 | ]
41 | },
42 | {
43 | "cell_type": "code",
44 | "execution_count": 4,
45 | "metadata": {},
46 | "outputs": [],
47 | "source": [
48 | "# open the CSV file and read its contents into a list\n",
49 | "data_file = open(\"mnist_dataset/mnist_train_100.csv\", 'r')\n",
50 | "data_list = data_file.readlines()\n",
51 | "data_file.close()"
52 | ]
53 | },
54 | {
55 | "cell_type": "code",
56 | "execution_count": 5,
57 | "metadata": {
58 | "collapsed": true
59 | },
60 | "outputs": [],
61 | "source": [
62 | "# which record will be used\n",
63 | "record = 6"
64 | ]
65 | },
66 | {
67 | "cell_type": "code",
68 | "execution_count": 6,
69 | "metadata": {
70 | "collapsed": true
71 | },
72 | "outputs": [],
73 | "source": [
74 | "# scale input to range 0.01 to 1.00\n",
75 | "all_values = data_list[record].split(',')\n",
76 | "scaled_input = ((numpy.asfarray(all_values[1:]) / 255.0 * 0.99) + 0.01).reshape(28,28)"
77 | ]
78 | },
79 | {
80 | "cell_type": "code",
81 | "execution_count": 7,
82 | "metadata": {},
83 | "outputs": [
84 | {
85 | "name": "stdout",
86 | "output_type": "stream",
87 | "text": [
88 | "0.01\n",
89 | "1.0\n"
90 | ]
91 | }
92 | ],
93 | "source": [
94 | "print(numpy.min(scaled_input))\n",
95 | "print(numpy.max(scaled_input))"
96 | ]
97 | },
98 | {
99 | "cell_type": "code",
100 | "execution_count": 8,
101 | "metadata": {},
102 | "outputs": [
103 | {
104 | "data": {
105 | "text/plain": [
106 | ""
107 | ]
108 | },
109 | "execution_count": 8,
110 | "metadata": {},
111 | "output_type": "execute_result"
112 | },
113 | {
114 | "data": {
115 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAP8AAAD8CAYAAAC4nHJkAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAADDJJREFUeJzt3V2IXPUdxvHniW0u1LAYM12CL90IUhBjIw5LQSkW22Kl\nGM2FNBclBWFFrLbiRcWKDYggpVUUipDUkG2xtoUq5kJaYyhIoYqzmmpM2mplqwkx2fiCeiG+5NeL\nPZGt7pyZzJyZM5vf9wPDzJz/mTkPo8+emXMm83dECEA+y+oOAKAelB9IivIDSVF+ICnKDyRF+YGk\nKD+QFOUHkqL8QFJfGObGVq1aFRMTE8PcJJDK7Oysjhw54m7W7av8ti+XdJ+kkyT9OiLuLlt/YmJC\nrVarn00CKNFsNrtet+e3/bZPkvQrSd+RdJ6kjbbP6/X5AAxXP5/5JyW9EhGvRsSHkn4vaX01sQAM\nWj/lP0PS6wvu7y+W/R/bU7Zbtltzc3N9bA5AlQZ+tD8itkREMyKajUZj0JsD0KV+yn9A0lkL7p9Z\nLAOwBPRT/mclnWt7je3lkr4naUc1sQAMWs+n+iLiY9s/lPQXzZ/q2xYRL1WWDMBA9XWePyIel/R4\nRVkADBFf7wWSovxAUpQfSIryA0lRfiApyg8kRfmBpCg/kBTlB5Ki/EBSlB9IivIDSVF+ICnKDyRF\n+YGkKD+QFOUHkqL8QFKUH0iK8gNJUX4gqaFO0Y3hu/POO0vH77jjjtLxycnJ0vEnnniidHxsbKx0\nHPVhzw8kRfmBpCg/kBTlB5Ki/EBSlB9IivIDSfV1nt/2rKT3JH0i6eOIaFYRCsfnnXfeaTt2//33\nlz522bLyv/8zMzOl46+99lrp+Nq1a0vHUZ8qvuTzjYg4UsHzABgi3vYDSfVb/pD0pO0Z21NVBAIw\nHP2+7b8kIg7Y/pKknbb/GRFPLVyh+KMwJUlnn312n5sDUJW+9vwRcaC4PizpUUmf+1cgEbElIpoR\n0Ww0Gv1sDkCFei6/7VNsrzh2W9K3Je2pKhiAwernbf+4pEdtH3ue30XEnytJBWDgei5/RLwq6asV\nZkGPTj755LZjV155Zeljt2/fXnEaLBWc6gOSovxAUpQfSIryA0lRfiApyg8kxU93nwCWL1/edmzN\nmjVDTIKlhD0/kBTlB5Ki/EBSlB9IivIDSVF+ICnKDyTFef4TwAcffNB27Pnnnx9iEiwl7PmBpCg/\nkBTlB5Ki/EBSlB9IivIDSVF+ICnO858APvroo7Zje/fuHei2n3766dLxsinaxsbGqo6D48CeH0iK\n8gNJUX4gKcoPJEX5gaQoP5AU5QeS6nie3/Y2Sd+VdDgizi+WrZT0B0kTkmYlXRMRbw8uJsqsWLGi\n7djNN99c+tjrr7++r213evzpp5/edmzDhg19bRv96WbPv13S5Z9ZdqukXRFxrqRdxX0AS0jH8kfE\nU5Le+szi9ZKmi9vTkq6qOBeAAev1M/94RBwsbr8habyiPACGpO8DfhERkqLduO0p2y3brbm5uX43\nB6AivZb/kO3VklRcH263YkRsiYhmRDQbjUaPmwNQtV7Lv0PSpuL2JkmPVRMHwLB0LL/thyX9XdJX\nbO+3fa2kuyV9y/bLkr5Z3AewhHQ8zx8RG9sMXVZxFgzA1NRU6Xi/5/mxdPENPyApyg8kRfmBpCg/\nkBTlB5Ki/EBS/HR3ckePHi0dX7aM/cOJiv+yQFKUH0iK8gNJUX4gKcoPJEX5gaQoP5AU5/mT63Qe\n3/aQkmDY2PMDSVF+ICnKDyRF+YGkKD+QFOUHkqL8QFKUH0iK8gNJUX4gKcoPJEX5gaQoP5AU5QeS\novxAUh3Lb3ub7cO29yxYttn2Adu7i8sVg40JoGrd7Pm3S7p8keX3RsS64vJ4tbEADFrH8kfEU5Le\nGkIWAEPUz2f+G22/UHwsOK2yRACGot
fyPyDpHEnrJB2U9Mt2K9qest2y3Zqbm+txcwCq1lP5I+JQ\nRHwSEUclbZU0WbLulohoRkSz0Wj0mhNAxXoqv+3VC+5eLWlPu3UBjKaOP91t+2FJl0paZXu/pJ9J\nutT2OkkhaVbSdQPMCGAAOpY/IjYusvjBAWRBDY4ePVo63ul3/TvZuXNn27ENGzb09dzoD9/wA5Ki\n/EBSlB9IivIDSVF+ICnKDyTFFN3JDXqK7q1bt7Yd27x5c+ljx8fH+9o2yrHnB5Ki/EBSlB9IivID\nSVF+ICnKDyRF+YGkOM+f3O233146ftdddw1s22XfAZA6Z0N/2PMDSVF+ICnKDyRF+YGkKD+QFOUH\nkqL8QFKc50/uggsuqDsCasKeH0iK8gNJUX4gKcoPJEX5gaQoP5AU5QeSckSUr2CfJek3ksYlhaQt\nEXGf7ZWS/iBpQtKspGsi4u2y52o2m9FqtSqIjWFZu3Zt6fjevXt7fu5O04O/+eabpeMrV67sedsn\nqmazqVar1dVkC93s+T+WdEtEnCfpa5JusH2epFsl7YqIcyXtKu4DWCI6lj8iDkbEc8Xt9yTtk3SG\npPWSpovVpiVdNaiQAKp3XJ/5bU9IulDSM5LGI+JgMfSG5j8WAFgiui6/7VMl/UnSjyPi3YVjMX/g\nYNGDB7anbLdst+bm5voKC6A6XZXf9hc1X/yHIuKRYvEh26uL8dWSDi/22IjYEhHNiGg2Go0qMgOo\nQMfye36a1gcl7YuIexYM7ZC0qbi9SdJj1ccDMCjd/JPeiyV9X9KLtncXy26TdLekP9q+VtJ/JV0z\nmIio0+TkZOn4vn37en7uTtODY7A6lj8i/iap3XnDy6qNA2BY+NMLJEX5gaQoP5AU5QeSovxAUpQf\nSIqf7kapm266qXR8enq6dByjiz0/kBTlB5Ki/EBSlB9IivIDSVF+ICnKDyTFeX6UmpiYKB2/6KKL\nSsdnZmYqTIMqsecHkqL8QFKUH0iK8gNJUX4gKcoPJEX5gaQ4z49SY2NjpePPPPPMkJKgauz5gaQo\nP5AU5QeSovxAUpQfSIryA0lRfiCpjuW3fZbtv9rea/sl2z8qlm+2fcD27uJyxeDjAqhKN1/y+VjS\nLRHxnO0VkmZs7yzG7o2IXwwuHoBB6Vj+iDgo6WBx+z3b+ySdMehgAAbruD7z256QdKGkY9/pvNH2\nC7a32T6tzWOmbLdst+bm5voKC6A6XZff9qmS/iTpxxHxrqQHJJ0jaZ3m3xn8crHHRcSWiGhGRLPR\naFQQGUAVuiq/7S9qvvgPRcQjkhQRhyLik4g4KmmrpMnBxQRQtW6O9lvSg5L2RcQ9C5avXrDa1ZL2\nVB8PwKB0c7T/Yknfl/Si7d3FstskbbS9TlJImpV03UASAhiIbo72/02SFxl6vPo4AIaFb/gBSVF+\nICnKDyRF+YGkKD+QFOUHkqL8QFKUH0iK8gNJUX4gKcoPJEX5gaQoP5AU5QeSckQMb2P2nKT/Lli0\nStKRoQU4PqOabVRzSWTrVZXZvhwRXf1e3lDL/7mN262IaNYWoMSoZhvVXBLZelVXNt72A0lRfiCp\nusu/pebtlxnVbKOaSyJbr2rJVutnfgD1qXvPD6AmtZTf9uW2/2X7Fdu31pGhHduztl8sZh5u1Zxl\nm+3DtvcsWLbS9k7bLxfXi06TVlO2kZi5uWRm6Vpfu1Gb8Xrob/ttnyTp35K+JWm/pGclbYyIvUMN\n0obtWUnNiKj9nLDtr0t6X9JvIuL8YtnPJb0VEXcXfzhPi4ifjEi2zZLer3vm5mJCmdULZ5aWdJWk\nH6jG164k1zWq4XWrY88/KemViHg1Ij6U9HtJ62vIMfIi4ilJb31m8XpJ08Xtac3/zzN0bbKNhIg4\nGBHPFbffk3RsZulaX7uSXLWoo/xnSHp9wf39Gq0pv0PSk7ZnbE/VHWYR48W06ZL0hqTxOsMsouPM\nzc
P0mZmlR+a162XG66pxwO/zLomIdZK+I+mG4u3tSIr5z2yjdLqmq5mbh2WRmaU/Vedr1+uM11Wr\no/wHJJ214P6ZxbKREBEHiuvDkh7V6M0+fOjYJKnF9eGa83xqlGZuXmxmaY3AazdKM17XUf5nJZ1r\ne43t5ZK+J2lHDTk+x/YpxYEY2T5F0rc1erMP75C0qbi9SdJjNWb5P6Myc3O7maVV82s3cjNeR8TQ\nL5Ku0PwR//9I+mkdGdrkOkfSP4rLS3Vnk/Sw5t8GfqT5YyPXSjpd0i5JL0t6UtLKEcr2W0kvSnpB\n80VbXVO2SzT/lv4FSbuLyxV1v3YluWp53fiGH5AUB/yApCg/kBTlB5Ki/EBSlB9IivIDSVF+ICnK\nDyT1PxaBvvZVxMOzAAAAAElFTkSuQmCC\n",
116 | "text/plain": [
117 | ""
118 | ]
119 | },
120 | "metadata": {},
121 | "output_type": "display_data"
122 | }
123 | ],
124 | "source": [
125 | "# plot the original image\n",
126 | "matplotlib.pyplot.imshow(scaled_input, cmap='Greys', interpolation='None')"
127 | ]
128 | },
129 | {
130 | "cell_type": "code",
131 | "execution_count": 9,
132 | "metadata": {},
133 | "outputs": [],
134 | "source": [
135 | "# create rotated variations\n",
136 | "# rotated anticlockwise by 10 degrees\n",
137 | "inputs_plus10_img = scipy.ndimage.rotate(scaled_input, 10.0, cval=0.01, order=1, reshape=False)\n",
138 | "# rotated clockwise by 10 degrees\n",
139 | "inputs_minus10_img = scipy.ndimage.rotate(scaled_input, -10.0, cval=0.01, order=1, reshape=False)"
140 | ]
141 | },
142 | {
143 | "cell_type": "code",
144 | "execution_count": 10,
145 | "metadata": {},
146 | "outputs": [
147 | {
148 | "name": "stdout",
149 | "output_type": "stream",
150 | "text": [
151 | "0.01\n",
152 | "0.99748795356\n"
153 | ]
154 | }
155 | ],
156 | "source": [
157 | "print(numpy.min(inputs_plus10_img))\n",
158 | "print(numpy.max(inputs_plus10_img))"
159 | ]
160 | },
161 | {
162 | "cell_type": "code",
163 | "execution_count": 11,
164 | "metadata": {},
165 | "outputs": [
166 | {
167 | "data": {
168 | "text/plain": [
169 | ""
170 | ]
171 | },
172 | "execution_count": 11,
173 | "metadata": {},
174 | "output_type": "execute_result"
175 | },
176 | {
177 | "data": {
178 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAP8AAAD8CAYAAAC4nHJkAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAADZRJREFUeJzt3V+MVOUZx/Hf0wUkAokK42ZV7GJCahADJhOswVRMa/0T\nDXhDIMbQhBQvrNHEixK9qJfGVI0X1YhKpGppm1AiF2qjpIkxaYyjWRaEtmzJKmyAHSIGCYJleXqx\nB7PqzjvjzJk5sz7fT7LZmfOcs+fJ6I8zc94z5zV3F4B4flR0AwCKQfiBoAg/EBThB4Ii/EBQhB8I\nivADQRF+ICjCDwQ1rZM7mzdvnvf393dyl0Aow8PDOnbsmDWybkvhN7PbJD0jqUfSi+7+eGr9/v5+\nVSqVVnYJIKFcLje8btNv+82sR9IfJN0uaZGktWa2qNm/B6CzWvnMv0zSkLsfcPevJP1Z0sp82gLQ\nbq2E/3JJByc8P5Qt+wYz22BmFTOrVKvVFnYHIE9tP9vv7pvcvezu5VKp1O7dAWhQK+EfkTR/wvMr\nsmUApoBWwv+BpIVmtsDMZkhaI2lHPm0BaLemh/rc/ayZ/UbS3zU+1LfZ3T/OrTMAbdXSOL+7vyHp\njZx6AdBBXN4LBEX4gaAIPxAU4QeCIvxAUIQfCIrwA0ERfiAowg8ERfiBoAg/EBThB4Ii/EBQHb11\nN7rPuXPnknWz9F2g69XRvTjyA0ERfiAowg8ERfiBoAg/EBThB4Ii/EBQjPNPAcPDw8n6s88+W7M2\nMDCQ3Hbjxo3J+vLly5P1Cy64IFlH9+LIDwRF+IGgCD8QFOEHgiL8QFCEHwiK8ANBtTTOb2bDkr6Q\nNCbprLuX82gqmrGxsWS9Uqkk66+++mrNWrVaTW579dVXJ+tLlixJ1hnnn7ryuMjnZnc/lsPfAdBB\nvO0Hgmo1/C7pHTP70Mw25NEQgM5o9W3/je4+YmaXSnrbzP7l7u9OXCH7R2GDJF155ZUt7g5AXlo6\n8rv7SPZ7VNJ2ScsmWWeTu5fdvVwqlVrZHYAcNR1+M5tlZnPOP5b0S0l78moMQHu18ra/V9L27NbN\n0yT9yd3fyqUrAG3XdPjd/YCk9CAwGlLv3vl9fX3J+qWXXlqzduTIkeS2Q0NDyfqpU6eS9blz5ybr\n6F4M9QFBEX4gKMIPBEX4gaAIPxAU4QeC4tbdXWDatPR/hmuuuSZZv/7662vWBgcHk9vu3bs3WT99\n+nSyjqmLIz8QFOEHgiL8QFCEHwiK8ANBEX4gKMIPBMU4fxfI7olQ0+zZs5P11K2/6/3tgwcPJuv1\nvhI8f/78ZD11a+96vaG9OPIDQRF+ICjCDwRF+IGgCD8QFOEHgiL8QFCM808BZ8+eTdZT4/z1ptA+\nc+ZMsr5169ZkfcaMGcl6aorvmTNnJrdFe3HkB4Ii/EBQhB8IivADQRF+ICjCDwRF+IGg6o7zm9lm\nSXdKGnX3xdmySyT9RVK/pGFJq939ePvajK2npydZv+uuu2rWdu3aldx2YGAgWX/++eeT9XrfyX/0\n0Udr1i677LLktmivRo78L0u67VvLNkra6e4LJe3MngOYQuqG393flfTZtxavlLQle7xF0qqc+wLQ\nZs1+5u9198PZ4yOSenPqB0CHtHzCz91dkteqm9kGM6uYWaVarba6OwA5aTb8R82sT5Ky36O1VnT3\nTe5edvdyqVRqcncA8tZs+HdIWpc9Xifp9XzaAdApdcNvZlsl/VPST8zskJmtl/S4pFvMbL+kX2TP\nAUwhdcf53X1tjdLPc+4FNUyfPj1Zv/XWW2vWRkZGkts++OCDTfV03qFDh5L1EydO1Kwxzl8srvAD\ngiL8QFCEHwiK8ANBEX4gKMIPBMWtu38AZs2aVbN20003Jbcdvzq7eUNDQ8n6559/3tLfR/tw5AeC\nIvxAUIQfCIrwA0ERfiAowg8ERfiBoB
jn/4FbsGBBsl7v1tv16p9++mmyfuzYsWQdxeHIDwRF+IGg\nCD8QFOEHgiL8QFCEHwiK8ANBMc7/Azdnzpxkvbc3Pc3i6GjNyZgkSadOnUrWU7f2Pn36dHLbmTNn\nJutoDUd+ICjCDwRF+IGgCD8QFOEHgiL8QFCEHwiq7ji/mW2WdKekUXdfnC17TNKvJVWz1R5x9zfa\n1STaZ/ny5cn69u3bW/r7u3btqlk7fvx4ctu+vr6W9o20Ro78L0u6bZLlT7v70uyH4ANTTN3wu/u7\nkj7rQC8AOqiVz/wPmNmgmW02s4tz6whARzQb/uckXSVpqaTDkp6staKZbTCziplVqtVqrdUAdFhT\n4Xf3o+4+5u7nJL0gaVli3U3uXnb3cqlUarZPADlrKvxmNvE07N2S9uTTDoBOaWSob6ukFZLmmdkh\nSb+TtMLMlkpyScOS7mtjjwDaoG743X3tJItfakMvKMC1116brG/bti1Zr3df/7feeqtm7YYbbkhu\nu3r16mSd7/u3hiv8gKAIPxAU4QeCIvxAUIQfCIrwA0Fx6+7gli2reXGmJKmnpydZd/dkPTWF9yuv\nvJLcdtGiRcl6uVxO1pHGkR8IivADQRF+ICjCDwRF+IGgCD8QFOEHgmKcP7h64/zr169P1l988cWm\n97179+5kfWBgIFlnnL81HPmBoAg/EBThB4Ii/EBQhB8IivADQRF+ICjG+YO76KKLkvUVK1Yk6/W+\nk3/mzJmatdHR0eS2TzzxRLK+atWqZH3u3Lk1a/VuOR4BR34gKMIPBEX4gaAIPxAU4QeCIvxAUIQf\nCKruOL+ZzZf0R0m9klzSJnd/xswukfQXSf2ShiWtdvfj7WsV7VDvvvxr1qxJ1t98881kPTXF95df\nfpncdv/+/cn6gQMHkvXUNQzTpnGJSyNH/rOSHnb3RZJ+Kul+M1skaaOkne6+UNLO7DmAKaJu+N39\nsLt/lD3+QtI+SZdLWilpS7baFknpy60AdJXv9ZnfzPolXSfpfUm97n44Kx3R+McCAFNEw+E3s9mS\ntkl6yN1PTKz5+IRtk07aZmYbzKxiZpVqtdpSswDy01D4zWy6xoP/mrv/LVt81Mz6snqfpEm/peHu\nm9y97O7lUqmUR88AclA3/Db+9aeXJO1z96cmlHZIWpc9Xifp9fzbA9AujYx3LJd0r6TdZnb+XsqP\nSHpc0l/NbL2kTyStbk+LKFK9r77W+1rt4OBgzVq9W3fXs2/fvmR9yZIlNWsM9TUQfnd/T1Kt/wN+\nnm87ADqFK/yAoAg/EBThB4Ii/EBQhB8IivADQTHYiZbcfPPNyXpqnL/eV3JPnjyZrO/ZsydZHxsb\nS9aj48gPBEX4gaAIPxAU4QeCIvxAUIQfCIrwA0Exzo+WzJ49O1m/5557atb6+/uT2y5YsCBZX7x4\ncbJ+4YUXJuvRceQHgiL8QFCEHwiK8ANBEX4gKMIPBEX4gaAY50dLpk+fnqwvXLiwqRrajyM/EBTh\nB4Ii/EBQhB8IivADQRF+ICjCDwRVN/xmNt/M/mFme83sYzN7MFv+mJmNmNlA9nNH+9sFkJdGLvI5\nK+lhd//IzOZI+tDM3s5qT7v779vXHoB2qRt+dz8s6XD2+Asz2yfp8nY3BqC9vtdnfjPrl3SdpPez\nRQ+Y2aCZbTazi2tss8HMKmZWqVarLTULID8Nh9/MZkvaJukhdz8h6TlJV0laqvF3Bk9Otp27b3L3\nsruXS6VSDi0DyEND4Tez6RoP/mvu/jdJcvej7j7m7uckvSBpWfvaBJC3Rs72m6SXJO1z96cmLO+b\nsNrdktJTpgLoKo2c7V8u6V5Ju81sIFv2iKS1ZrZUkksalnRfWzoE0BaNnO1/T5JNUnoj/3YAdApX\n+AFBEX4gKMIPBEX4gaAIPxAU4QeCIvxAUIQfCIrwA0ERfiAowg8ERfiBoAg/EBThB4Iyd+/czsyq\nkj
6ZsGiepGMda+D76dbeurUvid6alWdvP3b3hu6X19Hwf2fnZhV3LxfWQEK39tatfUn01qyieuNt\nPxAU4QeCKjr8mwref0q39tatfUn01qxCeiv0Mz+A4hR95AdQkELCb2a3mdm/zWzIzDYW0UMtZjZs\nZruzmYcrBfey2cxGzWzPhGWXmNnbZrY/+z3pNGkF9dYVMzcnZpYu9LXrthmvO/6238x6JP1H0i2S\nDkn6QNJad9/b0UZqMLNhSWV3L3xM2Mx+JumkpD+6++Js2ROSPnP3x7N/OC929992SW+PSTpZ9MzN\n2YQyfRNnlpa0StKvVOBrl+hrtQp43Yo48i+TNOTuB9z9K0l/lrSygD66nru/K+mzby1eKWlL9niL\nxv/n6bgavXUFdz/s7h9lj7+QdH5m6UJfu0RfhSgi/JdLOjjh+SF115TfLukdM/vQzDYU3cwkerNp\n0yXpiKTeIpuZRN2ZmzvpWzNLd81r18yM13njhN933ejuSyXdLun+7O1tV/Lxz2zdNFzT0MzNnTLJ\nzNJfK/K1a3bG67wVEf4RSfMnPL8iW9YV3H0k+z0qabu6b/bho+cnSc1+jxbcz9e6aebmyWaWVhe8\ndt0043UR4f9A0kIzW2BmMyStkbSjgD6+w8xmZSdiZGazJP1S3Tf78A5J67LH6yS9XmAv39AtMzfX\nmllaBb92XTfjtbt3/EfSHRo/4/9fSY8W0UONvq6StCv7+bjo3iRt1fjbwP9p/NzIeklzJe2UtF/S\nO5Iu6aLeXpG0W9KgxoPWV1BvN2r8Lf2gpIHs546iX7tEX4W8blzhBwTFCT8gKMIPBEX4gaAIPxAU\n4QeCIvxAUIQfCIrwA0H9H9woLPnasvM+AAAAAElFTkSuQmCC\n",
179 | "text/plain": [
180 | ""
181 | ]
182 | },
183 | "metadata": {},
184 | "output_type": "display_data"
185 | }
186 | ],
187 | "source": [
188 | "# plot the +10 degree rotated variation\n",
189 | "matplotlib.pyplot.imshow(inputs_plus10_img, cmap='Greys', interpolation='None')"
190 | ]
191 | },
192 | {
193 | "cell_type": "code",
194 | "execution_count": 12,
195 | "metadata": {},
196 | "outputs": [
197 | {
198 | "data": {
199 | "text/plain": [
200 | ""
201 | ]
202 | },
203 | "execution_count": 12,
204 | "metadata": {},
205 | "output_type": "execute_result"
206 | },
207 | {
208 | "data": {
209 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAP8AAAD8CAYAAAC4nHJkAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAADV1JREFUeJzt3V+IXOUZx/Hfk5jEmD+Q7U6TsEm6EaOggaYwhIJSUvpH\nK8XYG2kuSgRpvLClhV5UUrReSmkVL0phW0NjaW0LrRpQFI0FKZbiKom6tW22cUuzrNlJomaDuDGb\npxd70q5x551xzjlzJvt8P7DszHnOmfMw7G/PzLxnzmvuLgDxLKq6AQDVIPxAUIQfCIrwA0ERfiAo\nwg8ERfiBoAg/EBThB4K6rJs76+/v98HBwW7uEghlbGxMJ06csHbWzRV+M7tJ0kOSFkv6hbvfn1p/\ncHBQw8PDeXYJIKFer7e9bscv+81ssaSfSvqKpGsl7TKzazt9PADdlec9/3ZJo+5+1N3PSvqtpJ3F\ntAWgbHnCPyDpP3PuH8uWfYiZ7TGzYTMbbjQaOXYHoEilf9rv7kPuXnf3eq1WK3t3ANqUJ/zjkjbO\nub8hWwbgEpAn/C9J2mJmm81sqaSvSzpQTFsAytbxUJ+7nzOzb0l6RrNDffvcfaSwzgCUKtc4v7s/\nJempgnoB0EWc3gsERfiBoAg/EBThB4Ii/EBQhB8IivADQRF+ICjCDwRF+IGgCD8QFOEHgiL8QFBd\nvXQ3es/MzEyu+tKlS4tsB13EkR8IivADQRF+ICjCDwRF+IGgCD8QFOEHgmKcf4E7f/58sj46Opqs\n33vvvcn6kSNHkvW9e/c2rd1yyy3JbTmHoFwc+YGgCD8QFOEHgiL8QFCEHwiK8ANBEX4gqFzj/GY2\nJmlK0oykc+5eL6IpFOeDDz5I1p955plk/cknn0zWp6enk/VDhw41rd14443JbRnnL1cRJ/l83t1P\nFPA4ALqIl/1AUHnD75KeM7OXzWxPEQ0B6I68L/tvcPdxM/ukpGfN7O/u/sLcFbJ/CnskadOmTTl3\nB6AouY787j6e/Z6U9Jik7fOsM+TudXev12q1PLsDUKCOw29mK8xs1YXbkr4s6fWiGgNQrjwv+9dK\neszMLjzOb9z96UK6AlC6jsPv7kclfbrAXlCCJUuWJOsrVqxI1vv6+pL1Y8eOJev9/f1Na9mBAxVh\nqA8IivADQRF+ICjCDwRF+IGgCD8QFJfuXuDcPVl/9913k/UzZ87k2v8111zTtNZqGBLl4sgPBEX4\ngaAIPxAU4QeCIvxAUIQfCIrwA0Exzr8ApMbyp6amkts+/XT6EgzvvPNOst5qrH7Dhg1Na8uWLUtu\ni3Jx5AeCIvxAUIQfCIrwA0ERfiAowg8ERfiBoBjnXwBS4/wnT55MbjsxMZFr3wMDA8l66jyBs2fP\nJrdliu5yceQHgiL8QFCEHwiK8ANBEX4gKMIPBEX4gaBajvOb2T5JX5U06e5bs2V9kn4naVDSmKTb\n3P3t8tpEyqJFzf+Hr1mzJrntjh07kvWRkZFkvdFoJOuPP/5409qWLVuS265bty5ZRz7tHPl/Kemm\ni5bdLemgu2+RdDC7D+AS0jL87v6CpFMXLd4paX92e7+kWwvuC0DJOn3Pv9bdL5wX+paktQX1A6BL\ncn/g57Mnljc9udzM9pjZsJkNt3p/CKB7Og3/cTNbL0nZ78lmK7r7kLvX3b1eq9U63B2AonUa/gOS\ndme3d0t6oph2AHRLy/Cb2aOS/iLpGjM7ZmZ3SLpf0pfM7IikL2b3AVxCWo7zu/uuJqUvFNwLStDX\n15es530r9t577yXrp05dPFD0f9PT07n2jXw4ww8IivADQRF+ICjCDwRF+IGgCD8QFJfuDu7w4cPJ\neuqy4O1IXdp79erVuR4b+XDkB4Ii/EBQhB8IivADQRF+ICjCDwRF+IGgGOcP7sUXX0zWzSzX42/Y\nsKFpbfny5bkeG/lw5AeCIvxAUIQfCI
rwA0ERfiAowg8ERfiBoBjnX+CmpqaS9ePHjyfrrcb5r7ji\nimQ9Nc5/+eWXJ7dFuTjyA0ERfiAowg8ERfiBoAg/EBThB4Ii/EBQLcf5zWyfpK9KmnT3rdmy+yR9\nU1IjW22vuz9VVpPo3Jtvvlnq42/atClZ7+/vL3X/6Fw7R/5fSrppnuUPuvu27IfgA5eYluF39xck\nnepCLwC6KM97/m+b2atmts/M1hTWEYCu6DT8P5N0paRtkiYk/aTZima2x8yGzWy40Wg0Ww1Al3UU\nfnc/7u4z7n5e0s8lbU+sO+TudXev12q1TvsEULCOwm9m6+fc/Zqk14tpB0C3tDPU96ikHZL6zeyY\npB9K2mFm2yS5pDFJd5bYI4AStAy/u++aZ/HDJfSCEhw+fDhZd/dcj9/q+/x5Hx/l4Qw/ICjCDwRF\n+IGgCD8QFOEHgiL8QFBcunuBO3r0aLK+ePHiZP38+fPJequvDJ84caJpbWZmJrltq96QD0d+ICjC\nDwRF+IGgCD8QFOEHgiL8QFCEHwiKcf4F7uqrr07W161bl6xPTEwk62+//XayPjo62rQ2PT2d3LbV\n14WRD0d+ICjCDwRF+IGgCD8QFOEHgiL8QFCEHwiKcf4FbuPGjcn6yZMnS93/5ORk0xrf168WR34g\nKMIPBEX4gaAIPxAU4QeCIvxAUIQfCKrlOL+ZbZT0iKS1klzSkLs/ZGZ9kn4naVDSmKTb3D395W50\n3datW5P1gYGBZL3Vdf8vuyz9JzQ+Pt60tmgRx54qtfPsn5P0PXe/VtJnJd1lZtdKulvSQXffIulg\ndh/AJaJl+N19wt1fyW5PSXpD0oCknZL2Z6vtl3RrWU0CKN7Het1lZoOSPiPpr5LWuvuFazy9pdm3\nBQAuEW2H38xWSvqDpO+6++m5NXd3zX4eMN92e8xs2MyGG41GrmYBFKet8JvZEs0G/9fu/sds8XEz\nW5/V10ua9xsc7j7k7nV3r9dqtSJ6BlCAluE3M5P0sKQ33P2BOaUDknZnt3dLeqL49gCUpZ2v9F4v\n6RuSXjOzQ9myvZLul/R7M7tD0r8l3VZOi8jj9OnTyfrmzZuT9VZDfefOnUvWU1N0v//++8ltlyxZ\nkqwjn5bhd/c/S7Im5S8U2w6AbuEsCyAowg8ERfiBoAg/EBThB4Ii/EBQXLp7gVu9enWyft111yXr\nzz//fLI+e2Z3Z4/PpburxZEfCIrwA0ERfiAowg8ERfiBoAg/EBThB4JinH+BW7ZsWbK+cuXKZH3V\nqlXJeqvrBTDO37s48gNBEX4gKMIPBEX4gaAIPxAU4QeCIvxAUIzzL3DLly9P1u+5555k/fbbb0/W\nR0ZGkvV6vd601mp6b5SLIz8QFOEHgiL8QFCEHwiK8ANBEX4gKMIPBNVyoNXMNkp6RNJaSS5pyN0f\nMrP7JH1TUiNbda+7P1VWoyhHq+/7X3XVVbnq6F3tnGVxTtL33P0VM1sl6WUzezarPejuPy6vPQBl\naRl+d5+QNJHdnjKzNyQNlN0YgHJ9rPf8ZjYo6TOS/pot+raZvWpm+8xsTZNt9pjZsJkNNxqN+VYB\nUIG2w29mKyX9QdJ33f20pJ9JulLSNs2+MvjJfNu5+5C71929XqvVCmgZQBHaCr+ZLdFs8H/t7n+U\nJHc/7u4z7n5e0s8lbS+vTQBFaxl+MzNJD0t6w90fmLN8/ZzVvibp9eLbA1CWdj7tv17SNyS9ZmaH\nsmV7Je0ys22aHf4bk3RnKR0CKEU7n/b/WZLNU2JMH7iEcYYfEBThB4Ii/EBQhB8IivADQRF+ICjC\nDwRF+IGgCD8QFOEHgiL8QFCEHwiK8ANBEX4gKHP37u3MrCHp33MW9Us60bUGPp5e7a1X+5LorVNF\n9vYpd2/renldDf9Hdm427O7NJ3CvUK/21qt9SfTWqap642U/EBThB4KqOvxDFe8/pVd769W+JHrr\nVC
W9VfqeH0B1qj7yA6hIJeE3s5vM7B9mNmpmd1fRQzNmNmZmr5nZITMbrriXfWY2aWavz1nWZ2bP\nmtmR7Pe806RV1Nt9ZjaePXeHzOzminrbaGZ/MrO/mdmImX0nW17pc5foq5Lnresv+81ssaR/SvqS\npGOSXpK0y93/1tVGmjCzMUl1d698TNjMPifpjKRH3H1rtuxHkk65+/3ZP8417v79HuntPklnqp65\nOZtQZv3cmaUl3SrpdlX43CX6uk0VPG9VHPm3Sxp196PuflbSbyXtrKCPnufuL0g6ddHinZL2Z7f3\na/aPp+ua9NYT3H3C3V/Jbk9JujCzdKXPXaKvSlQR/gFJ/5lz/5h6a8pvl/Scmb1sZnuqbmYea7Np\n0yXpLUlrq2xmHi1nbu6mi2aW7pnnrpMZr4vGB34fdYO7b5P0FUl3ZS9ve5LPvmfrpeGatmZu7pZ5\nZpb+nyqfu05nvC5aFeEfl7Rxzv0N2bKe4O7j2e9JSY+p92YfPn5hktTs92TF/fxPL83cPN/M0uqB\n566XZryuIvwvSdpiZpvNbKmkr0s6UEEfH2FmK7IPYmRmKyR9Wb03+/ABSbuz27slPVFhLx/SKzM3\nN5tZWhU/dz0347W7d/1H0s2a/cT/X5J+UEUPTfq6UtLh7Gek6t4kParZl4EfaPazkTskfULSQUlH\nJD0nqa+HevuVpNckvarZoK2vqLcbNPuS/lVJh7Kfm6t+7hJ9VfK8cYYfEBQf+AFBEX4gKMIPBEX4\ngaAIPxAU4QeCIvxAUIQfCOq/JLAmNYBsjLcAAAAASUVORK5CYII=\n",
210 | "text/plain": [
211 | ""
212 | ]
213 | },
214 | "metadata": {},
215 | "output_type": "display_data"
216 | }
217 | ],
218 | "source": [
219 | "# plot the -10 degree rotated variation\n",
220 | "matplotlib.pyplot.imshow(inputs_minus10_img, cmap='Greys', interpolation='None')"
221 | ]
222 | },
223 | {
224 | "cell_type": "code",
225 | "execution_count": null,
226 | "metadata": {
227 | "collapsed": true
228 | },
229 | "outputs": [],
230 | "source": []
231 | }
232 | ],
233 | "metadata": {
234 | "kernelspec": {
235 | "display_name": "Python 3",
236 | "language": "python",
237 | "name": "python3"
238 | },
239 | "language_info": {
240 | "codemirror_mode": {
241 | "name": "ipython",
242 | "version": 3
243 | },
244 | "file_extension": ".py",
245 | "mimetype": "text/x-python",
246 | "name": "python",
247 | "nbconvert_exporter": "python",
248 | "pygments_lexer": "ipython3",
249 | "version": "3.6.1"
250 | }
251 | },
252 | "nbformat": 4,
253 | "nbformat_minor": 1
254 | }
255 |
--------------------------------------------------------------------------------
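Note: the rotated-variation cells above (the `inputs_minus10_img` plot) come from building +10 and -10 degree copies of each training image with `scipy.ndimage` rotation, as in the book. A minimal standalone sketch of that step, assuming a dummy 28x28 input in the notebooks' 0.01..0.99 range:

```python
import numpy
import scipy.ndimage

# a dummy 28x28 "image" in the notebooks' scaled range (background 0.01, ink 0.99)
img = numpy.full((28, 28), 0.01)
img[10:18, 10:18] = 0.99

# rotate by +10 and -10 degrees; reshape=False keeps the 28x28 frame,
# cval=0.01 fills the uncovered corners with the background value,
# order=1 uses linear interpolation so values stay inside the input range
inputs_plus10_img = scipy.ndimage.rotate(img, 10.0, cval=0.01, order=1, reshape=False)
inputs_minus10_img = scipy.ndimage.rotate(img, -10.0, cval=0.01, order=1, reshape=False)
```

Both rotated arrays can be flattened back to 784 values and fed to `n.train()` as extra records.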
/code/part3_neural_network_mnist_and_own_data.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": 1,
6 | "metadata": {
7 | "collapsed": true
8 | },
9 | "outputs": [],
10 | "source": [
11 | "# python notebook for Make Your Own Neural Network\n",
12 | "# code for a 3-layer neural network, and code for learning the MNIST dataset\n",
13 | "# this version trains using the MNIST dataset, then tests on our own images\n",
14 | "# (c) Tariq Rashid, 2016\n",
15 | "# license is GPLv2"
16 | ]
17 | },
18 | {
19 | "cell_type": "code",
20 | "execution_count": 2,
21 | "metadata": {
22 | "collapsed": true
23 | },
24 | "outputs": [],
25 | "source": [
26 | "import numpy\n",
27 | "# scipy.special for the sigmoid function expit()\n",
28 | "import scipy.special\n",
29 | "# library for plotting arrays\n",
30 | "import matplotlib.pyplot\n",
31 | "# ensure the plots are inside this notebook, not an external window\n",
32 | "%matplotlib inline"
33 | ]
34 | },
35 | {
36 | "cell_type": "code",
37 | "execution_count": 3,
38 | "metadata": {
39 | "collapsed": true
40 | },
41 | "outputs": [],
42 | "source": [
43 | "# helper to load data from PNG image files\n",
44 | "import scipy.misc\n",
45 | "# glob helps select multiple files using patterns\n",
46 | "import glob"
47 | ]
48 | },
49 | {
50 | "cell_type": "code",
51 | "execution_count": 4,
52 | "metadata": {
53 | "collapsed": true
54 | },
55 | "outputs": [],
56 | "source": [
57 | "# neural network class definition\n",
58 | "class neuralNetwork:\n",
59 | " \n",
60 | " \n",
61 | " # initialise the neural network\n",
62 | " def __init__(self, inputnodes, hiddennodes, outputnodes, learningrate):\n",
63 | " # set number of nodes in each input, hidden, output layer\n",
64 | " self.inodes = inputnodes\n",
65 | " self.hnodes = hiddennodes\n",
66 | " self.onodes = outputnodes\n",
67 | " \n",
68 | " # link weight matrices, wih and who\n",
69 | " # weights inside the arrays are w_i_j, where link is from node i to node j in the next layer\n",
70 | " # w11 w21\n",
71 | " # w12 w22 etc \n",
72 | " self.wih = numpy.random.normal(0.0, pow(self.inodes, -0.5), (self.hnodes, self.inodes))\n",
73 | " self.who = numpy.random.normal(0.0, pow(self.hnodes, -0.5), (self.onodes, self.hnodes))\n",
74 | "\n",
75 | " # learning rate\n",
76 | " self.lr = learningrate\n",
77 | " \n",
78 | " # activation function is the sigmoid function\n",
79 | " self.activation_function = lambda x: scipy.special.expit(x)\n",
80 | " \n",
81 | " pass\n",
82 | "\n",
83 | " \n",
84 | " # train the neural network\n",
85 | " def train(self, inputs_list, targets_list):\n",
86 | " # convert inputs list to 2d array\n",
87 | " inputs = numpy.array(inputs_list, ndmin=2).T\n",
88 | " targets = numpy.array(targets_list, ndmin=2).T\n",
89 | " \n",
90 | " # calculate signals into hidden layer\n",
91 | " hidden_inputs = numpy.dot(self.wih, inputs)\n",
92 | " # calculate the signals emerging from hidden layer\n",
93 | " hidden_outputs = self.activation_function(hidden_inputs)\n",
94 | " \n",
95 | " # calculate signals into final output layer\n",
96 | " final_inputs = numpy.dot(self.who, hidden_outputs)\n",
97 | " # calculate the signals emerging from final output layer\n",
98 | " final_outputs = self.activation_function(final_inputs)\n",
99 | " \n",
100 | " # output layer error is the (target - actual)\n",
101 | " output_errors = targets - final_outputs\n",
102 | " # hidden layer error is the output_errors, split by weights, recombined at hidden nodes\n",
103 | " hidden_errors = numpy.dot(self.who.T, output_errors) \n",
104 | " \n",
105 | " # update the weights for the links between the hidden and output layers\n",
106 | " self.who += self.lr * numpy.dot((output_errors * final_outputs * (1.0 - final_outputs)), numpy.transpose(hidden_outputs))\n",
107 | " \n",
108 | " # update the weights for the links between the input and hidden layers\n",
109 | " self.wih += self.lr * numpy.dot((hidden_errors * hidden_outputs * (1.0 - hidden_outputs)), numpy.transpose(inputs))\n",
110 | " \n",
111 | " pass\n",
112 | "\n",
113 | " \n",
114 | " # query the neural network\n",
115 | " def query(self, inputs_list):\n",
116 | " # convert inputs list to 2d array\n",
117 | " inputs = numpy.array(inputs_list, ndmin=2).T\n",
118 | " \n",
119 | " # calculate signals into hidden layer\n",
120 | " hidden_inputs = numpy.dot(self.wih, inputs)\n",
121 | " # calculate the signals emerging from hidden layer\n",
122 | " hidden_outputs = self.activation_function(hidden_inputs)\n",
123 | " \n",
124 | " # calculate signals into final output layer\n",
125 | " final_inputs = numpy.dot(self.who, hidden_outputs)\n",
126 | " # calculate the signals emerging from final output layer\n",
127 | " final_outputs = self.activation_function(final_inputs)\n",
128 | " \n",
129 | " return final_outputs"
130 | ]
131 | },
132 | {
133 | "cell_type": "code",
134 | "execution_count": 5,
135 | "metadata": {
136 | "collapsed": true
137 | },
138 | "outputs": [],
139 | "source": [
140 | "# number of input, hidden and output nodes\n",
141 | "input_nodes = 784\n",
142 | "hidden_nodes = 200\n",
143 | "output_nodes = 10\n",
144 | "\n",
145 | "# learning rate\n",
146 | "learning_rate = 0.1\n",
147 | "\n",
148 | "# create instance of neural network\n",
149 | "n = neuralNetwork(input_nodes,hidden_nodes,output_nodes, learning_rate)"
150 | ]
151 | },
152 | {
153 | "cell_type": "code",
154 | "execution_count": 6,
155 | "metadata": {
156 | "collapsed": true
157 | },
158 | "outputs": [],
159 | "source": [
160 | "# load the mnist training data CSV file into a list\n",
161 | "training_data_file = open(\"mnist_dataset/mnist_train.csv\", 'r')\n",
162 | "training_data_list = training_data_file.readlines()\n",
163 | "training_data_file.close()"
164 | ]
165 | },
166 | {
167 | "cell_type": "code",
168 | "execution_count": 11,
169 | "metadata": {
170 | "collapsed": true
171 | },
172 | "outputs": [],
173 | "source": [
174 | "# train the neural network\n",
175 | "\n",
176 | "# epochs is the number of times the training data set is used for training\n",
177 | "epochs = 10\n",
178 | "\n",
179 | "for e in range(epochs):\n",
180 | " # go through all records in the training data set\n",
181 | " for record in training_data_list:\n",
182 | " # split the record by the ',' commas\n",
183 | " all_values = record.split(',')\n",
184 | " # scale and shift the inputs\n",
185 | " inputs = (numpy.asfarray(all_values[1:]) / 255.0 * 0.99) + 0.01\n",
186 | " # create the target output values (all 0.01, except the desired label which is 0.99)\n",
187 | " targets = numpy.zeros(output_nodes) + 0.01\n",
188 | " # all_values[0] is the target label for this record\n",
189 | " targets[int(all_values[0])] = 0.99\n",
190 | " n.train(inputs, targets)\n",
191 | " pass\n",
192 | " pass"
193 | ]
194 | },
195 | {
196 | "cell_type": "code",
197 | "execution_count": 12,
198 | "metadata": {},
199 | "outputs": [
200 | {
201 | "name": "stdout",
202 | "output_type": "stream",
203 | "text": [
204 | "loading ... my_own_images/2828_my_own_6.png\n",
205 | "0.01\n",
206 | "1.0\n",
207 | "loading ... my_own_images/2828_my_own_5.png\n",
208 | "0.01\n",
209 | "0.868\n",
210 | "loading ... my_own_images/2828_my_own_3.png\n",
211 | "0.01\n",
212 | "1.0\n",
213 | "loading ... my_own_images/2828_my_own_4.png\n",
214 | "0.01\n",
215 | "0.930118\n",
216 | "loading ... my_own_images/2828_my_own_2.png\n",
217 | "0.01\n",
218 | "1.0\n"
219 | ]
220 | }
221 | ],
222 | "source": [
223 | "# our own image test data set\n",
224 | "our_own_dataset = []\n",
225 | "\n",
226 | "# load the png image data as test data set\n",
227 | "for image_file_name in glob.glob('my_own_images/2828_my_own_?.png'):\n",
228 | " \n",
229 | " # use the filename to set the correct label\n",
230 | " label = int(image_file_name[-5:-4])\n",
231 | " \n",
232 | " # load image data from png files into an array\n",
233 | " print (\"loading ... \", image_file_name)\n",
234 | " img_array = scipy.misc.imread(image_file_name, flatten=True)\n",
235 | " \n",
236 | " # reshape from 28x28 to list of 784 values, invert values\n",
237 | " img_data = 255.0 - img_array.reshape(784)\n",
238 | " \n",
239 | " # then scale data to range from 0.01 to 1.0\n",
240 | " img_data = (img_data / 255.0 * 0.99) + 0.01\n",
241 | " print(numpy.min(img_data))\n",
242 | " print(numpy.max(img_data))\n",
243 | " \n",
244 | " # append label and image data to test data set\n",
245 | " record = numpy.append(label,img_data)\n",
246 | " our_own_dataset.append(record)\n",
247 | " \n",
248 | " pass"
249 | ]
250 | },
251 | {
252 | "cell_type": "code",
253 | "execution_count": 17,
254 | "metadata": {},
255 | "outputs": [
256 | {
257 | "name": "stdout",
258 | "output_type": "stream",
259 | "text": [
260 | "[[ 1.04770075e-02]\n",
261 | " [ 3.10563984e-03]\n",
262 | " [ 2.89956440e-03]\n",
263 | " [ 6.71413536e-01]\n",
264 | " [ 5.94502331e-03]\n",
265 | " [ 3.01325959e-02]\n",
266 | " [ 1.70101838e-03]\n",
267 | " [ 1.11106700e-03]\n",
268 | " [ 3.90449549e-04]\n",
269 | " [ 1.34192975e-03]]\n",
270 | "network says 3\n",
271 | "match!\n"
272 | ]
273 | },
274 | {
275 | "data": {
276 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAP8AAAD8CAYAAAC4nHJkAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAADW9JREFUeJzt3X2IXOUVx/HfiRpijAlJM4bVqptIqIrYFMdQiZQ2tdWK\nGPuPNmjYgrgVa3wLUrHi2x8Sa1UKKYVU16YlNRZ8iyAtMRYkoUQ3kvqStMaGDU2ySSYoGhOlzfb0\nj73KqjvPTGbuzJ3d8/3AMDP33Lv3cNnf3jvzzM5j7i4A8UwougEAxSD8QFCEHwiK8ANBEX4gKMIP\nBEX4gaAIPxAU4QeCOradO5s5c6Z3d3e3c5dAKAMDAzpw4IDVs25T4TezSyT9StIxkh5z9+Wp9bu7\nu9Xf39/MLgEklMvlutdt+LLfzI6R9GtJP5B0tqTFZnZ2oz8PQHs185p/vqR33X2Hu/9H0hpJi/Jp\nC0CrNRP+UyT9e8TzXdmyzzGzXjPrN7P+SqXSxO4A5Knl7/a7+0p3L7t7uVQqtXp3AOrUTPh3Szp1\nxPOvZssAjAHNhP81SXPNbLaZTZT0I0lr82kLQKs1PNTn7kfM7EZJf9HwUF+fu7+dW2cAWqqpcX53\nf1HSizn1AqCN+HgvEBThB4Ii/EBQhB8IivADQRF+ICjCDwRF+IGgCD8QFOEHgiL8QFCEHwiK8ANB\ntfWru9GYoaGhZP3QoUNVa1OnTk1uu2bNmmR9y5Ytyfq0adOS9auuuqpqbc6cOclt0Vqc+YGgCD8Q\nFOEHgiL8QFCEHwiK8ANBEX4gKMb522Dr1q3J+urVq5P1DRs2JOu9vb1Va1dffXVy2zPOOCNZT32G\nQJIGBweT9bvuuqtq7aSTTkpuu3x5ctJnTZo0KVlHGmd+ICjCDwRF+IGgCD8QFOEHgiL8QFCEHwiq\nqXF+MxuQdFDSkKQj7l7Oo6nxZsKE9N/Yrq6uZP2JJ55I1ru7u4+2pc+cf/75TdWbsWDBgmT9scce\nS9ZvvPHGPNsJJ48P+XzH3Q/k8HMAtBGX/UBQzYbfJb1kZpvNrPpnTAF0nGYv+y90991mdpKkdWb2\nD3d/ZeQK2R+FXkk67bTTmtwdgLw0deZ3993Z/X5Jz0qaP8o6K9297O7lUqnUzO4A5Kjh8JvZCWZ2\n4qePJX1f0lt5NQagtZq57J8l6Vkz+/Tn/NHd/5xLVwBaruHwu/sOSV/PsZdx68wzz2yqPl5t3Lgx\nWd+5c2eyfvjw4WR98uTJR91TJAz1AUERfiAowg8ERfiBoAg/EBThB4Liq7vRsVasWJGsz507N1lP\nfaU5OPMDYRF+ICjCDwRF+IGgCD8QFOEHgiL8QFCM86Nj7dixI1m/+OKL29TJ+MSZHwiK8ANBEX4g\nKMIPBEX4gaAIPxAU4QeCYpwfhbnnnnuS9WOPTf96Lly4MM92wuHMDwRF+IGgCD8QFOEHgiL8QFCE\nHwiK8ANB1RznN7M+SZdJ2u/u52TLZkh6SlK3pAFJV7r7+61rE63y8ccfJ+vr1q1L1jds2JCsX3fd\ndVVrS5YsSW57+umnJ+sTJnDuakY9R+93ki75wrI7JK1397mS1mfPAYwhNcPv7q9Ieu8LixdJWpU9\nXiXpipz7AtBijV43zXL3wezxXkmzcuoHQJs0/aLJ3V2SV6ubWa+Z9ZtZf6VSaXZ3AHLSaPj3mVmX\nJGX3+6ut6O4r3b3s7uVSqdTg7gDkrdHwr5XUkz3ukfR8Pu0AaJea4TezJyX9TdLXzGyXmV0rabmk\n75nZdkkXZc8BjCE1x/ndfXGV0ndz7gUFWLFiRbJ+3333JeuHDh1K1qdMmVK1dvfddye3RWvxKQkg\nKMIPBEX4gaAIPxAU4QeCIvxAUHx1d3BLly5N1m+++eZkfc+ePcn6NddcU7V25MiR5Lb3339/so7m\ncOYHgiL8QFCEHwiK8ANBEX4gKMIPBE
X4gaAY5w9u0qRJTW3f3d2drL/88stVa7W+9nvTpk3J+rnn\nnpusH3/88cl6dJz5gaAIPxAU4QeCIvxAUIQfCIrwA0ERfiAoxvnRUhMnTqxaW7hwYXLbCy64IFm/\n/vrrk/Wenp5kPTrO/EBQhB8IivADQRF+ICjCDwRF+IGgCD8QVM1xfjPrk3SZpP3ufk627F5J10mq\nZKvd6e4vtqpJxPTQQw8l6xs3bmxTJ+NTPWf+30m6ZJTlj7r7vOxG8IExpmb43f0VSe+1oRcAbdTM\na/6lZvaGmfWZ2fTcOgLQFo2G/zeS5kiaJ2lQ0sPVVjSzXjPrN7P+SqVSbTUAbdZQ+N19n7sPufv/\nJP1W0vzEuivdvezu5VKp1GifAHLWUPjNrGvE0x9KeiufdgC0Sz1DfU9K+rakmWa2S9I9kr5tZvMk\nuaQBST9pYY8AWqBm+N198SiLH29BL8DnzJ9f9dWkJOnkk09uUyfjE5/wA4Ii/EBQhB8IivADQRF+\nICjCDwTFV3ejY23evDlZf+CBB5L1F154Ic92xh3O/EBQhB8IivADQRF+ICjCDwRF+IGgCD8QFOP8\nbbB3795kferUqcn65MmT82ynYwwNDSXrS5YsSdZrTdGNNM78QFCEHwiK8ANBEX4gKMIPBEX4gaAI\nPxAU4/xt8Oqrrybrt912W7L+zjvvJOsTJozNv+EHDx5M1h9+uOoscJKkRYsW5dlOOGPztwZA0wg/\nEBThB4Ii/EBQhB8IivADQRF+IKia4/xmdqqk30uaJcklrXT3X5nZDElPSeqWNCDpSnd/v3Wtjl2X\nX355sr59+/Zk/aKLLkrWn3vuuaq1Wt8VUMvhw4eT9W3btiXrN9xwQ9VaX19fclvG8VurnjP/EUnL\n3P1sSd+U9FMzO1vSHZLWu/tcSeuz5wDGiJrhd/dBd389e3xQ0jZJp0haJGlVttoqSVe0qkkA+Tuq\n1/xm1i3pG5I2SZrl7oNZaa+GXxYAGCPqDr+ZTZH0tKRb3P3DkTV3dw2/HzDadr1m1m9m/ZVKpalm\nAeSnrvCb2XEaDv5qd38mW7zPzLqyepek/aNt6+4r3b3s7uVSqZRHzwByUDP8ZmaSHpe0zd0fGVFa\nK6kne9wj6fn82wPQKvX8S+8CSUskvWlmW7Jld0paLulPZnatpJ2SrmxNi+PfsmXLkvVa/7K7Z8+e\nqrVPPvkkue3tt9+erO/bty9Znz59erL+4IMPVq2dddZZyW3RWjXD7+4bJFmV8nfzbQdAu/AJPyAo\nwg8ERfiBoAg/EBThB4Ii/EBQfHX3GHDrrbc2vO0HH3yQrN90003J+uzZs5P1GTNmHHVP6Ayc+YGg\nCD8QFOEHgiL8QFCEHwiK8ANBEX4gKMb5x7lp06Yl6+edd16bOkGn4cwPBEX4gaAIPxAU4QeCIvxA\nUIQfCIrwA0ERfiAowg8ERfiBoAg/EBThB4Ii/EBQhB8IivADQdUMv5mdamZ/NbOtZva2md2cLb/X\nzHab2Zbsdmnr2wWQl3q+zOOIpGXu/rqZnShps5mty2qPuvsvW9cegFapGX53H5Q0mD0+aGbbJJ3S\n6sYAtNZRveY3s25J35C0KVu01MzeMLM+M5teZZteM+s3s/5KpdJUswDyU3f4zWyKpKcl3eLuH0r6\njaQ5kuZp+Mrg4dG2c/eV7l5293KpVMqhZQB5qCv8ZnachoO/2t2fkSR33+fuQ+7+P0m/lTS/dW0C\nyFs97/abpMclbXP3R0Ys7xqx2g8lvZV/ewBapZ53+xdIWiLpTTPbki27U9JiM5snySUNSPpJSzoE\n0BL1vNu/QZKNUnox/3YAtAuf8AOCIvxAUIQfCIrwA0ERfiAowg8ERfiBoAg/EBThB4Ii/EBQhB8I\nivADQRF+ICjCDwRl7t6+nZlVJO0csWimpANta+DodGpvndqXRG+NyrO30929ru/La2v4v7Rzs353\nLx
fWQEKn9tapfUn01qiieuOyHwiK8ANBFR3+lQXvP6VTe+vUviR6a1QhvRX6mh9AcYo+8wMoSCHh\nN7NLzOyfZvaumd1RRA/VmNmAmb2ZzTzcX3AvfWa238zeGrFshpmtM7Pt2f2o06QV1FtHzNycmFm6\n0GPXaTNet/2y38yOkfSOpO9J2iXpNUmL3X1rWxupwswGJJXdvfAxYTP7lqSPJP3e3c/Jlv1C0nvu\nvjz7wznd3X/WIb3dK+mjomduziaU6Ro5s7SkKyT9WAUeu0RfV6qA41bEmX++pHfdfYe7/0fSGkmL\nCuij47n7K5Le+8LiRZJWZY9XafiXp+2q9NYR3H3Q3V/PHh+U9OnM0oUeu0RfhSgi/KdI+veI57vU\nWVN+u6SXzGyzmfUW3cwoZmXTpkvSXkmzimxmFDVnbm6nL8ws3THHrpEZr/PGG35fdqG7z5P0A0k/\nzS5vO5IPv2brpOGaumZubpdRZpb+TJHHrtEZr/NWRPh3Szp1xPOvZss6grvvzu73S3pWnTf78L5P\nJ0nN7vcX3M9nOmnm5tFmllYHHLtOmvG6iPC/Jmmumc02s4mSfiRpbQF9fImZnZC9ESMzO0HS99V5\nsw+vldSTPe6R9HyBvXxOp8zcXG1maRV87Dpuxmt3b/tN0qUafsf/X5J+XkQPVfqaI+nv2e3tonuT\n9KSGLwP/q+H3Rq6V9BVJ6yVtl/SSpBkd1NsfJL0p6Q0NB62roN4u1PAl/RuStmS3S4s+dom+Cjlu\nfMIPCIo3/ICgCD8QFOEHgiL8QFCEHwiK8ANBEX4gKMIPBPV/OtAMJKwFHisAAAAASUVORK5CYII=\n",
277 | "text/plain": [
278 | ""
279 | ]
280 | },
281 | "metadata": {},
282 | "output_type": "display_data"
283 | }
284 | ],
285 | "source": [
286 | "# test the neural network with our own images\n",
287 | "\n",
288 | "# record to test\n",
289 | "item = 2\n",
290 | "\n",
291 | "# plot image\n",
292 | "matplotlib.pyplot.imshow(our_own_dataset[item][1:].reshape(28,28), cmap='Greys', interpolation='None')\n",
293 | "\n",
294 | "# correct answer is first value\n",
295 | "correct_label = our_own_dataset[item][0]\n",
296 | "# data is remaining values\n",
297 | "inputs = our_own_dataset[item][1:]\n",
298 | "\n",
299 | "# query the network\n",
300 | "outputs = n.query(inputs)\n",
301 | "print (outputs)\n",
302 | "\n",
303 | "# the index of the highest value corresponds to the label\n",
304 | "label = numpy.argmax(outputs)\n",
305 | "print(\"network says \", label)\n",
306 | "# append correct or incorrect to list\n",
307 | "if (label == correct_label):\n",
308 | " print (\"match!\")\n",
309 | "else:\n",
310 | " print (\"no match!\")\n",
311 | " pass\n"
312 | ]
313 | },
314 | {
315 | "cell_type": "code",
316 | "execution_count": null,
317 | "metadata": {
318 | "collapsed": true
319 | },
320 | "outputs": [],
321 | "source": []
322 | }
323 | ],
324 | "metadata": {
325 | "kernelspec": {
326 | "display_name": "Python 3",
327 | "language": "python",
328 | "name": "python3"
329 | },
330 | "language_info": {
331 | "codemirror_mode": {
332 | "name": "ipython",
333 | "version": 3
334 | },
335 | "file_extension": ".py",
336 | "mimetype": "text/x-python",
337 | "name": "python",
338 | "nbconvert_exporter": "python",
339 | "pygments_lexer": "ipython3",
340 | "version": "3.6.1"
341 | }
342 | },
343 | "nbformat": 4,
344 | "nbformat_minor": 1
345 | }
346 |
--------------------------------------------------------------------------------
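Note: the notebooks here load PNG files with `scipy.misc.imread(..., flatten=True)`, which was deprecated in SciPy 1.0 and removed in SciPy 1.2. A minimal replacement sketch using Pillow (the function name `load_png_as_mnist_input` is illustrative, not part of the original code):

```python
import numpy
from PIL import Image

def load_png_as_mnist_input(path):
    # open the PNG and convert to 8-bit greyscale, like imread(..., flatten=True)
    img_array = numpy.asarray(Image.open(path).convert("L"), dtype=numpy.float64)
    # reshape from 28x28 to a flat list of 784 values, invert values
    img_data = 255.0 - img_array.reshape(784)
    # then scale data to the range 0.01 to 1.0, matching the notebooks
    return (img_data / 255.0 * 0.99) + 0.01
```

The returned array can be passed straight to `n.query()` in the notebooks.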
/code/part3_neural_network_mnist_and_own_single_image.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": 1,
6 | "metadata": {
7 | "collapsed": true
8 | },
9 | "outputs": [],
10 | "source": [
11 | "# python notebook for Make Your Own Neural Network\n",
12 | "# code for a 3-layer neural network, and code for learning the MNIST dataset\n",
13 | "# this version trains using the MNIST dataset, then tests on our own images\n",
14 | "# (c) Tariq Rashid, 2016\n",
15 | "# license is GPLv2"
16 | ]
17 | },
18 | {
19 | "cell_type": "code",
20 | "execution_count": 2,
21 | "metadata": {
22 | "collapsed": true
23 | },
24 | "outputs": [],
25 | "source": [
26 | "import numpy\n",
27 | "# scipy.special for the sigmoid function expit()\n",
28 | "import scipy.special\n",
29 | "# library for plotting arrays\n",
30 | "import matplotlib.pyplot\n",
31 | "# ensure the plots are inside this notebook, not an external window\n",
32 | "%matplotlib inline"
33 | ]
34 | },
35 | {
36 | "cell_type": "code",
37 | "execution_count": 3,
38 | "metadata": {
39 | "collapsed": true
40 | },
41 | "outputs": [],
42 | "source": [
43 | "# helper to load data from PNG image files\n",
44 | "import scipy.misc"
45 | ]
46 | },
47 | {
48 | "cell_type": "code",
49 | "execution_count": 4,
50 | "metadata": {},
51 | "outputs": [],
52 | "source": [
53 | "# neural network class definition\n",
54 | "class neuralNetwork:\n",
55 | " \n",
56 | " \n",
57 | " # initialise the neural network\n",
58 | " def __init__(self, inputnodes, hiddennodes, outputnodes, learningrate):\n",
59 | " # set number of nodes in each input, hidden, output layer\n",
60 | " self.inodes = inputnodes\n",
61 | " self.hnodes = hiddennodes\n",
62 | " self.onodes = outputnodes\n",
63 | " \n",
64 | " # link weight matrices, wih and who\n",
65 | " # weights inside the arrays are w_i_j, where link is from node i to node j in the next layer\n",
66 | " # w11 w21\n",
67 | " # w12 w22 etc \n",
68 | " self.wih = numpy.random.normal(0.0, pow(self.inodes, -0.5), (self.hnodes, self.inodes))\n",
69 | " self.who = numpy.random.normal(0.0, pow(self.hnodes, -0.5), (self.onodes, self.hnodes))\n",
70 | "\n",
71 | " # learning rate\n",
72 | " self.lr = learningrate\n",
73 | " \n",
74 | " # activation function is the sigmoid function\n",
75 | " self.activation_function = lambda x: scipy.special.expit(x)\n",
76 | " \n",
77 | " pass\n",
78 | "\n",
79 | " \n",
80 | " # train the neural network\n",
81 | " def train(self, inputs_list, targets_list):\n",
82 | " # convert inputs list to 2d array\n",
83 | " inputs = numpy.array(inputs_list, ndmin=2).T\n",
84 | " targets = numpy.array(targets_list, ndmin=2).T\n",
85 | " \n",
86 | " # calculate signals into hidden layer\n",
87 | " hidden_inputs = numpy.dot(self.wih, inputs)\n",
88 | " # calculate the signals emerging from hidden layer\n",
89 | " hidden_outputs = self.activation_function(hidden_inputs)\n",
90 | " \n",
91 | " # calculate signals into final output layer\n",
92 | " final_inputs = numpy.dot(self.who, hidden_outputs)\n",
93 | " # calculate the signals emerging from final output layer\n",
94 | " final_outputs = self.activation_function(final_inputs)\n",
95 | " \n",
96 | " # output layer error is the (target - actual)\n",
97 | " output_errors = targets - final_outputs\n",
98 | " # hidden layer error is the output_errors, split by weights, recombined at hidden nodes\n",
99 | " hidden_errors = numpy.dot(self.who.T, output_errors) \n",
100 | " \n",
101 | " # update the weights for the links between the hidden and output layers\n",
102 | " self.who += self.lr * numpy.dot((output_errors * final_outputs * (1.0 - final_outputs)), numpy.transpose(hidden_outputs))\n",
103 | " \n",
104 | " # update the weights for the links between the input and hidden layers\n",
105 | " self.wih += self.lr * numpy.dot((hidden_errors * hidden_outputs * (1.0 - hidden_outputs)), numpy.transpose(inputs))\n",
106 | " \n",
107 | " pass\n",
108 | "\n",
109 | " \n",
110 | " # query the neural network\n",
111 | " def query(self, inputs_list):\n",
112 | " # convert inputs list to 2d array\n",
113 | " inputs = numpy.array(inputs_list, ndmin=2).T\n",
114 | " \n",
115 | " # calculate signals into hidden layer\n",
116 | " hidden_inputs = numpy.dot(self.wih, inputs)\n",
117 | " # calculate the signals emerging from hidden layer\n",
118 | " hidden_outputs = self.activation_function(hidden_inputs)\n",
119 | " \n",
120 | " # calculate signals into final output layer\n",
121 | " final_inputs = numpy.dot(self.who, hidden_outputs)\n",
122 | " # calculate the signals emerging from final output layer\n",
123 | " final_outputs = self.activation_function(final_inputs)\n",
124 | " \n",
125 | " return final_outputs"
126 | ]
127 | },
128 | {
129 | "cell_type": "code",
130 | "execution_count": 5,
131 | "metadata": {},
132 | "outputs": [],
133 | "source": [
134 | "# number of input, hidden and output nodes\n",
135 | "input_nodes = 784\n",
136 | "hidden_nodes = 200\n",
137 | "output_nodes = 10\n",
138 | "\n",
139 | "# learning rate\n",
140 | "learning_rate = 0.1\n",
141 | "\n",
142 | "# create instance of neural network\n",
143 | "n = neuralNetwork(input_nodes,hidden_nodes,output_nodes, learning_rate)"
144 | ]
145 | },
146 | {
147 | "cell_type": "code",
148 | "execution_count": 6,
149 | "metadata": {},
150 | "outputs": [],
151 | "source": [
152 | "# load the mnist training data CSV file into a list\n",
153 | "training_data_file = open(\"mnist_dataset/mnist_train.csv\", 'r')\n",
154 | "training_data_list = training_data_file.readlines()\n",
155 | "training_data_file.close()"
156 | ]
157 | },
158 | {
159 | "cell_type": "code",
160 | "execution_count": 7,
161 | "metadata": {},
162 | "outputs": [],
163 | "source": [
164 | "# train the neural network\n",
165 | "\n",
166 | "# epochs is the number of times the training data set is used for training\n",
167 | "epochs = 10\n",
168 | "\n",
169 | "for e in range(epochs):\n",
170 | " # go through all records in the training data set\n",
171 | " for record in training_data_list:\n",
172 | " # split the record by the ',' commas\n",
173 | " all_values = record.split(',')\n",
174 | " # scale and shift the inputs\n",
175 | " inputs = (numpy.asfarray(all_values[1:]) / 255.0 * 0.99) + 0.01\n",
176 | " # create the target output values (all 0.01, except the desired label which is 0.99)\n",
177 | " targets = numpy.zeros(output_nodes) + 0.01\n",
178 | " # all_values[0] is the target label for this record\n",
179 | " targets[int(all_values[0])] = 0.99\n",
180 | " n.train(inputs, targets)\n",
181 | " pass\n",
182 | " pass"
183 | ]
184 | },
185 | {
186 | "cell_type": "markdown",
187 | "metadata": {},
188 | "source": [
189 | "test with our own image "
190 | ]
191 | },
192 | {
193 | "cell_type": "code",
194 | "execution_count": 9,
195 | "metadata": {},
196 | "outputs": [
197 | {
198 | "name": "stdout",
199 | "output_type": "stream",
200 | "text": [
201 | "loading ... my_own_images/2828_my_own_image.png\n",
202 | "min = 0.01\n",
203 | "max = 1.0\n",
204 | "[[ 0.0027999 ]\n",
205 | " [ 0.0037432 ]\n",
206 | " [ 0.01817265]\n",
207 | " [ 0.92297039]\n",
208 | " [ 0.00246505]\n",
209 | " [ 0.00704918]\n",
210 | " [ 0.17021171]\n",
211 | " [ 0.02702109]\n",
212 | " [ 0.00896418]\n",
213 | " [ 0.01368427]]\n",
214 | "network says 3\n"
215 | ]
216 | },
217 | {
218 | "data": {
219 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAP8AAAD8CAYAAAC4nHJkAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAD1VJREFUeJzt3V+MVGWax/Hfs8gEUYiy9iBRkMGIUUyEpGKM/zLrMOIo\nCYJ/AhcGEp0eDTPuJF6sgYvVRAkSYeLFxohKBjYsg4YxoqgbJas4ZjEWxhUcdFVsAgjShonDXEg3\n8OxFH2db7PNWWXW6TnU/30/S6arznLfrseKPU1VvnfOauwtAPP9QdgMAykH4gaAIPxAU4QeCIvxA\nUIQfCIrwA0ERfiAowg8EdVorH+ycc87xyZMnt/IhgVC6urr01VdfWT37NhV+M7tR0uOSRkh62t2X\np/afPHmyqtVqMw8JIKFSqdS9b8Mv+81shKR/k/QLSZdKWmBmlzb69wC0VjPv+a+Q9Km773H3Hkl/\nkDSnmLYADLZmwn+epH397u/Ptn2HmXWaWdXMqt3d3U08HIAiDfqn/e6+2t0r7l7p6OgY7IcDUKdm\nwn9A0sR+98/PtgEYApoJ/7uSLjKzn5jZjyTNl7S5mLYADLaGp/rc/biZ/VrSf6pvqm+Nu39YWGcA\nBlVT8/zu/rKklwvqBUAL8fVeICjCDwRF+IGgCD8QFOEHgiL8QFAtPZ8fAzt06FCy/tRTTyXru3bt\nyq0dPHgwOfbYsWPJuln61PBa9dSKUBdffHFy7A033JCsz507N1kfPXp0sh4dR34gKMIPBEX4gaAI\nPxAU4QeCIvxAUEz1tcA777yTrF933XXJek9PT5HttI1az8u6deuS9TFjxiTrr776am7tqquuSo6N\ngCM/EBThB4Ii/EBQhB8IivADQRF+ICjCDwTFPH8Bap2Se+211ybrx48fT9ZXrFiRrM+fPz+3Nnbs\n2OTYESNGJOu1TtmtJXVK7969e5Nja83zr1y5MlmfOXNmbm3//v3JsePGjUvWhwOO/EBQhB8IivAD\nQRF+ICjCDwRF+IGgCD8QVFPz/GbWJemopBOSjrt7pYim2tHJkydza/fee29ybG9vb7Je69Lcd999\nd7I+VE2bNi1Zf/TRR5P1Cy64IFlfvHhxbu3+++9Pjl2zZk2y3uz3H9pBEV/y+Sd3/6qAvwOghXjZ\nDwTVbPhd0utmtsPMOotoCEBrNPuy/xp3P2BmP5b0mpl95O7b+u+Q/aPQKUmTJk1q8uEAFKWpI7+7\nH8h+H5b0vKQrBthntbtX3L3S0dHRzMMBKFDD4TezM8xszLe3Jd0gKX/FSABtpZmX/eMlPZ9NeZwm\n6T/cPf9ayQDaiqXOty5apVLxarXasscr0tKlS3Nry5YtS4698sork/W33norWT/tNC67MJBa35+4\n8MILc2v79u1Ljv3444+T9alTpybrZalUKqpWq3V9CYGpPiAowg8ERfiBoAg/EBThB4Ii/EBQzCHV\n6Y033sitjR8/Pjl2y5YtyTpTeY0ZOXJksr5q1arc2u23354cW+uy4Q8//HCyPhRw5AeCIvxAUIQf\nCIrwA0ERfiAowg8ERfiBoJhgrtPWrVtza6nLekvS6NGji24HdWjmsnHd3d0FdtKeOPIDQRF+ICjC\nDwRF+IGgCD8QFOEHgiL8QFDM89dp1KhRZbeAU3zzzTfJejNLm8+aNavhsUMFR34gKMIPBEX4gaAI\nPxAU4QeCIvxAUIQfCKrmPL+ZrZE0W9Jhd78s2zZO0kZJkyV1SbrD3f8yeG0iop6enmR93rx5yfrO\nnTtza4sWLUqOnTt3brI+HNRz5P+9pBtP2faApK3ufpGkrdl9AENIzfC7+zZJR07ZPEfS2uz2Wkm3\nFNwXgEHW6Hv+8e5+MLt9SFJ6vSoAbafpD/zc3SV5Xt3MOs2sambVCNdFA4aKRsP/pZlNkKTs9+G8\nHd19tbtX3L3S0dHR4MMBKFqj4d8saW
F2e6GkF4ppB0Cr1Ay/mW2Q9N+SLjaz/WZ2l6Tlkn5uZp9I\nmpndBzCE1Jznd/cFOaWfFdwLhqBaaxYcO3Yst7Zr167k2Pvuuy9Z3759e7I+c+bM3NoTTzyRHGtm\nyfpwwDf8gKAIPxAU4QeCIvxAUIQfCIrwA0Fx6e5hoLe3N7e2adOm5NgNGzYk659//nmy/sUXXyTr\nR46cek7Y/+v7Znjjbr311mR93bp1uTUuxc6RHwiL8ANBEX4gKMIPBEX4gaAIPxAU4QeCYp5/GHjk\nkUdyaw899FALOynWWWedlaxff/31yXpqCe/Ro0c31NNwwpEfCIrwA0ERfiAowg8ERfiBoAg/EBTh\nB4Jinn8YmDNnTm5t9+7dybHNXqK6mXPyt23blqwfOnQoWV+8eHGyvnTp0tzajh07kmOnTJmSrA8H\nHPmBoAg/EBThB4Ii/EBQhB8IivADQRF+IKia8/xmtkbSbEmH3f2ybNuDkn4pqTvbbYm7vzxYTSJt\nxowZubWNGze2sJMf5sSJE8n6Z599lqw/88wzyfqKFStya5VKJTl2z549yXqtaw0MBfUc+X8v6cYB\ntv/O3adnPwQfGGJqht/dt0nKX3YFwJDUzHv+35jZB2a2xszOLqwjAC3RaPifkDRF0nRJByWtzNvR\nzDrNrGpm1e7u7rzdALRYQ+F39y/d/YS7n5T0lKQrEvuudveKu1c6Ojoa7RNAwRoKv5lN6Hd3rqRd\nxbQDoFXqmerbIOmnks4xs/2S/lXST81suiSX1CXpV4PYI4BBUDP87r5ggM3pCVagDiNGjEjWp06d\nmqwvX748WT9+/HhubdWqVcmxL774YrJ+5513JutDAd/wA4Ii/EBQhB8IivADQRF+ICjCDwTFpbsx\nZNW67Pjs2bNza7Wm+rq6uhppaUjhyA8ERfiBoAg/EBThB4Ii/EBQhB8IivADQTHPj2Gr1inDKSdP\nniywk/bEkR8IivADQRF+ICjCDwRF+IGgCD8QFOEHgmKeH8PW22+/3fDYiRMnFthJe+LIDwRF+IGg\nCD8QFOEHgiL8QFCEHwiK8ANB1ZznN7OJktZJGi/JJa1298fNbJykjZImS+qSdIe7/2XwWgW+q6en\nJ1l/7LHHcmu1zvW/7bbbGuppKKnnyH9c0v3ufqmkKyUtNrNLJT0gaau7XyRpa3YfwBBRM/zuftDd\n38tuH5W0W9J5kuZIWpvttlbSLYPVJIDi/aD3/GY2WdIMSe9IGu/uB7PSIfW9LQAwRNQdfjM7U9Im\nSb9197/2r7m7q+/zgIHGdZpZ1cyq3d3dTTULoDh1hd/MRqov+Ovd/Y/Z5i/NbEJWnyDp8EBj3X21\nu1fcvdLR0VFEzwAKUDP81rcU6jOSdrt7/6VNN0tamN1eKOmF4tsDMFjqOaX3akl3StppZu9n25ZI\nWi7pWTO7S9JeSXcMTout0dvbm6xv3rw5tzZr1qzk2DPPPLOhnqLrezeZb8mSJcn6kSNHcmv33HNP\ncuzYsWOT9eGgZvjd/U+S8hZC/1mx7QBoFb7hBwRF+IGgCD8QFOEHgiL8QFCEHwiKS3dnqtVqsp46\nxXP27NnJsc8991yyPmrUqGQ9qmeffTZZX7lyZbJ+7rnn5tZWrFjRUE/DCUd+ICjCDwRF+IGgCD8Q\nFOEHgiL8QFCEHwiKef7MjBkzkvVLLrkkt/bSSy8lx86bNy9Zf/LJJ5P1888/P1nvu95Ke0pdJ2H9\n+vXJsZ2dncl6rf/uN998M7c2ZsyY5NgIOPIDQRF+ICjCDwRF+IGgCD8QFOEHgiL8QFDM82dqnVO/\nY8eO3Fqt6/a/8soryfqkSZOS9dNPPz1Zv/nmm3NrV199dXJsraWqa/n666+T9aeffjq3tnfv3uTY\nkSNHJutbtmxJ1qdOnZqsR8eRHwiK8ANBEX4gKMIPBEX4gaAIPxAU4QeCslproJvZREnrJI2X5JJW\nu/
vjZvagpF9K6s52XeLuL6f+VqVS8VrXxx+Kenp6kvW1a9cm67Wu6799+/Zk/ejRo8l6mVLn3C9a\ntCg5dtmyZcl66rr8UVUqFVWr1bou8FDPl3yOS7rf3d8zszGSdpjZa1ntd+7+WKONAihPzfC7+0FJ\nB7PbR81st6TzBrsxAIPrB73nN7PJkmZIeifb9Bsz+8DM1pjZ2TljOs2sambV7u7ugXYBUIK6w29m\nZ0raJOm37v5XSU9ImiJpuvpeGQy4cJq7r3b3irtXOjo6CmgZQBHqCr+ZjVRf8Ne7+x8lyd2/dPcT\n7n5S0lOSrhi8NgEUrWb4re/j2mck7Xb3Vf22T+i321xJu4pvD8BgqWeq7xpJb0naKelktnmJpAXq\ne8nvkrok/Sr7cDDXcJ3qG2wnTpxI1j/66KPc2r59+5Jjm73sd61Tgi+//PLcGm8Di1foVJ+7/0nS\nQH8sOacPoL3xDT8gKMIPBEX4gaAIPxAU4QeCIvxAUFy6ewioNZc+bdq0hmqIjSM/EBThB4Ii/EBQ\nhB8IivADQRF+ICjCDwRV83z+Qh/MrFtS/3WZz5H0Vcsa+GHatbd27Uuit0YV2dsF7l7XhRJaGv7v\nPbhZ1d0rpTWQ0K69tWtfEr01qqzeeNkPBEX4gaDKDv/qkh8/pV17a9e+JHprVCm9lfqeH0B5yj7y\nAyhJKeE3sxvN7GMz+9TMHiijhzxm1mVmO83sfTMr9Trj2TJoh81sV79t48zsNTP7JPs94DJpJfX2\noJkdyJ67983sppJ6m2hm/2VmfzazD83sn7PtpT53ib5Ked5a/rLfzEZI+l9JP5e0X9K7kha4+59b\n2kgOM+uSVHH30ueEzew6SX+TtM7dL8u2rZB0xN2XZ/9wnu3u/9ImvT0o6W9lr9ycLSgzof/K0pJu\nkbRIJT53ib7uUAnPWxlH/iskferue9y9R9IfJM0poY+25+7bJB05ZfMcSWuz22vV9z9Py+X01hbc\n/aC7v5fdPirp25WlS33uEn2Voozwnyep/zIy+9VeS367pNfNbIeZdZbdzADG91sZ6ZCk8WU2M4Ca\nKze30ikrS7fNc9fIitdF4wO/77vG3adL+oWkxdnL27bkfe/Z2mm6pq6Vm1tlgJWl/67M567RFa+L\nVkb4D0ia2O/++dm2tuDuB7LfhyU9r/ZbffjLbxdJzX4fLrmfv2unlZsHWllabfDctdOK12WE/11J\nF5nZT8zsR5LmS9pcQh/fY2ZnZB/EyMzOkHSD2m/14c2SFma3F0p6ocRevqNdVm7OW1laJT93bbfi\ntbu3/EfSTer7xP8zSUvL6CGnrymS/if7+bDs3iRtUN/LwF71fTZyl6R/lLRV0ieSXpc0ro16+3f1\nreb8gfqCNqGk3q5R30v6DyS9n/3cVPZzl+irlOeNb/gBQfGBHxAU4QeCIvxAUIQfCIrwA0ERfiAo\nwg8ERfiBoP4PYja7cJBT5mIAAAAASUVORK5CYII=\n",
220 | "text/plain": [
221 | ""
222 | ]
223 | },
224 | "metadata": {},
225 | "output_type": "display_data"
226 | }
227 | ],
228 | "source": [
229 | "# test the neural network with our own images\n",
230 | "\n",
231 | "# load image data from png files into an array\n",
232 | "print (\"loading ... my_own_images/2828_my_own_image.png\")\n",
233 | "img_array = scipy.misc.imread('my_own_images/2828_my_own_image.png', flatten=True)\n",
234 | " \n",
235 | "# reshape from 28x28 to list of 784 values, invert values\n",
236 | "img_data = 255.0 - img_array.reshape(784)\n",
237 | " \n",
238 | "# then scale data to range from 0.01 to 1.0\n",
239 | "img_data = (img_data / 255.0 * 0.99) + 0.01\n",
240 | "print(\"min = \", numpy.min(img_data))\n",
241 | "print(\"max = \", numpy.max(img_data))\n",
242 | "\n",
243 | "# plot image\n",
244 | "matplotlib.pyplot.imshow(img_data.reshape(28,28), cmap='Greys', interpolation='None')\n",
245 | "\n",
246 | "# query the network\n",
247 | "outputs = n.query(img_data)\n",
248 | "print (outputs)\n",
249 | "\n",
250 | "# the index of the highest value corresponds to the label\n",
251 | "label = numpy.argmax(outputs)\n",
252 | "print(\"network says \", label)\n"
253 | ]
254 | },
255 | {
256 | "cell_type": "code",
257 | "execution_count": null,
258 | "metadata": {
259 | "collapsed": true
260 | },
261 | "outputs": [],
262 | "source": []
263 | }
264 | ],
265 | "metadata": {
266 | "kernelspec": {
267 | "display_name": "Python 3",
268 | "language": "python",
269 | "name": "python3"
270 | },
271 | "language_info": {
272 | "codemirror_mode": {
273 | "name": "ipython",
274 | "version": 3
275 | },
276 | "file_extension": ".py",
277 | "mimetype": "text/x-python",
278 | "name": "python",
279 | "nbconvert_exporter": "python",
280 | "pygments_lexer": "ipython3",
281 | "version": "3.6.1"
282 | }
283 | },
284 | "nbformat": 4,
285 | "nbformat_minor": 1
286 | }
287 |
--------------------------------------------------------------------------------
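The backquery notebook that follows relies on `scipy.special.logit` as the inverse of the sigmoid `scipy.special.expit`, so a label can be pushed backwards through the network. A quick sketch checking that inverse relationship numerically:

```python
import numpy
import scipy.special

x = numpy.linspace(-5.0, 5.0, 11)
# expit squashes activations into the open interval (0, 1); logit maps them back
y = scipy.special.expit(x)
x_back = scipy.special.logit(y)
assert numpy.allclose(x, x_back)
```

This is why the backquery code rescales signals into (0, 1) before applying the inverse activation: logit is only defined on that open interval.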
/code/part3_neural_network_mnist_backquery.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": 1,
6 | "metadata": {
7 | "collapsed": true
8 | },
9 | "outputs": [],
10 | "source": [
11 | "# python notebook for Make Your Own Neural Network\n",
12 | "# code for a 3-layer neural network, and code for learning the MNIST dataset\n",
13 | "# this version asks the network what the image should be, given a label\n",
14 | "# (c) Tariq Rashid, 2016\n",
15 | "# license is GPLv2"
16 | ]
17 | },
18 | {
19 | "cell_type": "code",
20 | "execution_count": 2,
21 | "metadata": {
22 | "collapsed": true
23 | },
24 | "outputs": [],
25 | "source": [
26 | "import numpy\n",
27 | "# scipy.special for the sigmoid function expit(), and its inverse logit()\n",
28 | "import scipy.special\n",
29 | "# library for plotting arrays\n",
30 | "import matplotlib.pyplot\n",
31 | "# ensure the plots are inside this notebook, not an external window\n",
32 | "%matplotlib inline"
33 | ]
34 | },
35 | {
36 | "cell_type": "code",
37 | "execution_count": 3,
38 | "metadata": {},
39 | "outputs": [],
40 | "source": [
41 | "# neural network class definition\n",
42 | "class neuralNetwork:\n",
43 | " \n",
44 | " \n",
45 | " # initialise the neural network\n",
46 | " def __init__(self, inputnodes, hiddennodes, outputnodes, learningrate):\n",
47 | " # set number of nodes in each input, hidden, output layer\n",
48 | " self.inodes = inputnodes\n",
49 | " self.hnodes = hiddennodes\n",
50 | " self.onodes = outputnodes\n",
51 | " \n",
52 | " # link weight matrices, wih and who\n",
53 | " # weights inside the arrays are w_i_j, where link is from node i to node j in the next layer\n",
54 | " # w11 w21\n",
55 | " # w12 w22 etc \n",
56 | " self.wih = numpy.random.normal(0.0, pow(self.inodes, -0.5), (self.hnodes, self.inodes))\n",
57 | " self.who = numpy.random.normal(0.0, pow(self.hnodes, -0.5), (self.onodes, self.hnodes))\n",
58 | "\n",
59 | " # learning rate\n",
60 | " self.lr = learningrate\n",
61 | " \n",
62 | " # activation function is the sigmoid function\n",
63 | " self.activation_function = lambda x: scipy.special.expit(x)\n",
64 | " self.inverse_activation_function = lambda x: scipy.special.logit(x)\n",
65 | " \n",
66 | " pass\n",
67 | "\n",
68 | " \n",
69 | " # train the neural network\n",
70 | " def train(self, inputs_list, targets_list):\n",
71 | " # convert inputs list to 2d array\n",
72 | " inputs = numpy.array(inputs_list, ndmin=2).T\n",
73 | " targets = numpy.array(targets_list, ndmin=2).T\n",
74 | " \n",
75 | " # calculate signals into hidden layer\n",
76 | " hidden_inputs = numpy.dot(self.wih, inputs)\n",
77 | " # calculate the signals emerging from hidden layer\n",
78 | " hidden_outputs = self.activation_function(hidden_inputs)\n",
79 | " \n",
80 | " # calculate signals into final output layer\n",
81 | " final_inputs = numpy.dot(self.who, hidden_outputs)\n",
82 | " # calculate the signals emerging from final output layer\n",
83 | " final_outputs = self.activation_function(final_inputs)\n",
84 | " \n",
85 | " # output layer error is the (target - actual)\n",
86 | " output_errors = targets - final_outputs\n",
87 | " # hidden layer error is the output_errors, split by weights, recombined at hidden nodes\n",
88 | " hidden_errors = numpy.dot(self.who.T, output_errors) \n",
89 | " \n",
90 | " # update the weights for the links between the hidden and output layers\n",
91 | " self.who += self.lr * numpy.dot((output_errors * final_outputs * (1.0 - final_outputs)), numpy.transpose(hidden_outputs))\n",
92 | " \n",
93 | " # update the weights for the links between the input and hidden layers\n",
94 | " self.wih += self.lr * numpy.dot((hidden_errors * hidden_outputs * (1.0 - hidden_outputs)), numpy.transpose(inputs))\n",
95 | " \n",
96 | " pass\n",
97 | "\n",
98 | " \n",
99 | " # query the neural network\n",
100 | " def query(self, inputs_list):\n",
101 | " # convert inputs list to 2d array\n",
102 | " inputs = numpy.array(inputs_list, ndmin=2).T\n",
103 | " \n",
104 | " # calculate signals into hidden layer\n",
105 | " hidden_inputs = numpy.dot(self.wih, inputs)\n",
106 | " # calculate the signals emerging from hidden layer\n",
107 | " hidden_outputs = self.activation_function(hidden_inputs)\n",
108 | " \n",
109 | " # calculate signals into final output layer\n",
110 | " final_inputs = numpy.dot(self.who, hidden_outputs)\n",
111 | " # calculate the signals emerging from final output layer\n",
112 | " final_outputs = self.activation_function(final_inputs)\n",
113 | " \n",
114 | " return final_outputs\n",
115 | " \n",
116 | " \n",
117 | " # backquery the neural network\n",
118 |     "    # we'll use the same terminology for each item, \n",
119 | " # eg target are the values at the right of the network, albeit used as input\n",
120 | " # eg hidden_output is the signal to the right of the middle nodes\n",
121 | " def backquery(self, targets_list):\n",
122 | " # transpose the targets list to a vertical array\n",
123 | " final_outputs = numpy.array(targets_list, ndmin=2).T\n",
124 | " \n",
125 | " # calculate the signal into the final output layer\n",
126 | " final_inputs = self.inverse_activation_function(final_outputs)\n",
127 | "\n",
128 | " # calculate the signal out of the hidden layer\n",
129 | " hidden_outputs = numpy.dot(self.who.T, final_inputs)\n",
130 |     "        # scale them back to the range 0.01 to 0.99\n",
131 | " hidden_outputs -= numpy.min(hidden_outputs)\n",
132 | " hidden_outputs /= numpy.max(hidden_outputs)\n",
133 | " hidden_outputs *= 0.98\n",
134 | " hidden_outputs += 0.01\n",
135 | " \n",
136 | " # calculate the signal into the hidden layer\n",
137 | " hidden_inputs = self.inverse_activation_function(hidden_outputs)\n",
138 | " \n",
139 | " # calculate the signal out of the input layer\n",
140 | " inputs = numpy.dot(self.wih.T, hidden_inputs)\n",
141 |     "        # scale them back to the range 0.01 to 0.99\n",
142 | " inputs -= numpy.min(inputs)\n",
143 | " inputs /= numpy.max(inputs)\n",
144 | " inputs *= 0.98\n",
145 | " inputs += 0.01\n",
146 | " \n",
147 | " return inputs"
148 | ]
149 | },
150 | {
151 | "cell_type": "code",
152 | "execution_count": 4,
153 | "metadata": {},
154 | "outputs": [],
155 | "source": [
156 | "# number of input, hidden and output nodes\n",
157 | "input_nodes = 784\n",
158 | "hidden_nodes = 200\n",
159 | "output_nodes = 10\n",
160 | "\n",
161 | "# learning rate\n",
162 | "learning_rate = 0.1\n",
163 | "\n",
164 | "# create instance of neural network\n",
165 | "n = neuralNetwork(input_nodes,hidden_nodes,output_nodes, learning_rate)"
166 | ]
167 | },
168 | {
169 | "cell_type": "code",
170 | "execution_count": 5,
171 | "metadata": {},
172 | "outputs": [],
173 | "source": [
174 | "# load the mnist training data CSV file into a list\n",
175 | "training_data_file = open(\"mnist_dataset/mnist_train.csv\", 'r')\n",
176 | "training_data_list = training_data_file.readlines()\n",
177 | "training_data_file.close()"
178 | ]
179 | },
180 | {
181 | "cell_type": "code",
182 | "execution_count": 6,
183 | "metadata": {},
184 | "outputs": [],
185 | "source": [
186 | "# train the neural network\n",
187 | "\n",
188 | "# epochs is the number of times the training data set is used for training\n",
189 | "epochs = 5\n",
190 | "\n",
191 | "for e in range(epochs):\n",
192 | " # go through all records in the training data set\n",
193 | " for record in training_data_list:\n",
194 | " # split the record by the ',' commas\n",
195 | " all_values = record.split(',')\n",
196 | " # scale and shift the inputs\n",
197 |     "        inputs = (numpy.asarray(all_values[1:], dtype=float) / 255.0 * 0.99) + 0.01\n",
198 | " # create the target output values (all 0.01, except the desired label which is 0.99)\n",
199 | " targets = numpy.zeros(output_nodes) + 0.01\n",
200 | " # all_values[0] is the target label for this record\n",
201 | " targets[int(all_values[0])] = 0.99\n",
202 | " n.train(inputs, targets)\n",
203 | " pass\n",
204 | " pass"
205 | ]
206 | },
207 | {
208 | "cell_type": "code",
209 | "execution_count": 7,
210 | "metadata": {
211 | "collapsed": true
212 | },
213 | "outputs": [],
214 | "source": [
215 | "# load the mnist test data CSV file into a list\n",
216 | "test_data_file = open(\"mnist_dataset/mnist_test.csv\", 'r')\n",
217 | "test_data_list = test_data_file.readlines()\n",
218 | "test_data_file.close()"
219 | ]
220 | },
221 | {
222 | "cell_type": "code",
223 | "execution_count": 8,
224 | "metadata": {},
225 | "outputs": [],
226 | "source": [
227 | "# test the neural network\n",
228 | "\n",
229 | "# scorecard for how well the network performs, initially empty\n",
230 | "scorecard = []\n",
231 | "\n",
232 | "# go through all the records in the test data set\n",
233 | "for record in test_data_list:\n",
234 | " # split the record by the ',' commas\n",
235 | " all_values = record.split(',')\n",
236 | " # correct answer is first value\n",
237 | " correct_label = int(all_values[0])\n",
238 | " # scale and shift the inputs\n",
239 |     "    inputs = (numpy.asarray(all_values[1:], dtype=float) / 255.0 * 0.99) + 0.01\n",
240 | " # query the network\n",
241 | " outputs = n.query(inputs)\n",
242 | " # the index of the highest value corresponds to the label\n",
243 | " label = numpy.argmax(outputs)\n",
244 | " # append correct or incorrect to list\n",
245 | " if (label == correct_label):\n",
246 | " # network's answer matches correct answer, add 1 to scorecard\n",
247 | " scorecard.append(1)\n",
248 | " else:\n",
249 | " # network's answer doesn't match correct answer, add 0 to scorecard\n",
250 | " scorecard.append(0)\n",
251 | " pass\n",
252 | " \n",
253 | " pass"
254 | ]
255 | },
256 | {
257 | "cell_type": "code",
258 | "execution_count": 9,
259 | "metadata": {},
260 | "outputs": [
261 | {
262 | "name": "stdout",
263 | "output_type": "stream",
264 | "text": [
265 | "performance = 0.9733\n"
266 | ]
267 | }
268 | ],
269 | "source": [
270 | "# calculate the performance score, the fraction of correct answers\n",
271 | "scorecard_array = numpy.asarray(scorecard)\n",
272 | "print (\"performance = \", scorecard_array.sum() / scorecard_array.size)"
273 | ]
274 | },
275 | {
276 | "cell_type": "code",
277 | "execution_count": 10,
278 | "metadata": {
279 | "scrolled": true
280 | },
281 | "outputs": [
282 | {
283 | "name": "stdout",
284 | "output_type": "stream",
285 | "text": [
286 | "[ 0.99 0.01 0.01 0.01 0.01 0.01 0.01 0.01 0.01 0.01]\n"
287 | ]
288 | },
289 | {
290 | "data": {
291 | "text/plain": [
292 | ""
293 | ]
294 | },
295 | "execution_count": 10,
296 | "metadata": {},
297 | "output_type": "execute_result"
298 | },
299 | {
300 | "data": {
301 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAP8AAAD8CAYAAAC4nHJkAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAFU5JREFUeJzt3VtsndWVB/D/IiE320mcm0mCQwgJ4RKSNLIipIZRR00r\nCEVQHqIiUTISavrQqaZSHwYxD8MjGk1b8TCqlA5Rw1BoR1AgD6gjLiMBEmpwIpM7JISY2HHs3Bw7\nCQm5rHnwR2XA3/ofn+/4nJPZ/58UxTnL+5zt75yVY3vttbe5O0QkPdfVegIiUhtKfpFEKflFEqXk\nF0mUkl8kUUp+kUQp+UUSpeQXSZSSXyRR46v5YNOnT/d58+blxousNjSzssfW2nXXxf8HX716dcwe\nm1039pzU8jkby+ecXXP2nBW5Lmxs9HUfPXoU/f39JV2YQslvZvcCeAbAOAD/6e5PR58/b948PP/8\n87nxixcvho8XXfCJEyeGY69cuVL2fQPxE8LGMmzuFy5cCOPR1zZ+fPwUF03+Is9Z0bmxePTYLLm/\n+OKLMD5p0qQwzl5v0XW9fPlyOHbcuHG5sUcffTQcO1zZr1ozGwfgPwDcB+AOAI+Y2R3l3p+IVFeR\nt6zVAA66+yF3/wLAHwE8WJlpichYK5L88wEcGfbvruy2rzCzjWbWbmbtp0+fLvBwIlJJY/7bfnff\n5O5t7t7W3Nw81g8nIiUqkvzdAFqH/fvG7DYRuQYUSf4PACwxs5vNbAKAHwHYWplpichYK7vU5+6X\nzewfAfwPhkp9m919TzTGzMIyxeTJk8PHjMozrGzESjfXX399GL906VLZj81E913K/UfXhd03+7qZ\nmTNnhvETJ07kxli5jZXT2NdW5PXCrstYrr2o1u5ahV617v46gNcrNBcRqSIt7xVJlJJfJFFKfpFE\nKflFEqXkF0mUkl8kUVXt53f3sF2xSJska3udMGECnVskWoPAWkvPnz8fxtnXzdpmo7UTRdtm2WOz\n9tOpU6fmxljbK2t1LtKGXXRdCJs7i0fPOfu6o/UNo9njQO/8IolS8oskSskvkiglv0iilPwiiVLy\niySqrkp9RXZUZaU8dt+sPXTKlCm5MVbKi0pxAC/PsLJTd3f+Hipsbp988kkYZ+W0pUuXhvFDhw7l\nxlpaWsKxK1asCOPsOY/Keay8yuKDg4NhvMjOwqxsHY0dTTuw3vlFEqXkF0mUkl8kUUp+kUQp+UUS\npeQXSZSSXyRRVa3zs627WU052k656HHNrMXz3LlzuTHWvhmtEQCA48ePh3HWNhvV6j///PNwbG9v\nbxiPjlQHgJ07d4bxqJb/0UcfhWNZvfvWW28N42fPns2NzZgxIxzL1hBMmzYtjLO23P7+/rIfO3ot\nqs4vIpSSXyRRSn6RRCn5RRKl5BdJlJJfJFFKfpFEFarzm9lhAIMArgC47O5tbAyr5UeKHJPNeurZ\n8eCnTp3KjbE6/KeffhrGz5w5E8Y7OzvD+MDAQG6Mbb3d19cXxlnNuampKYxHX1tDQ0M4lu01wNYw\nROtCdu/eHY696667yr5vgL/eolo9ey1HawzY437lcUr+zHx/7+75h7CLSF3St/0iiSqa/A7gTTPb\nbmYbKzEhEamOot/2r3H3bjObA+ANM9vv7u8M/4TsP4WNAHDDDTcUfDgRqZRC7/zu3p393QfgFQCr\nR/icTe7e5u5tzc3NRR5ORCqo7OQ3swYza/ryYwDfBxD/ClVE6kaRb/tbALyStdKOB/CCu/+lIrMS\nkTFXdvK7+yEA8cbqI4j67lm9PKp/svomu++oVg7Eawx27doVjmU91qzezdZGtLa25sbYvv1sH4Sb\nbropjN9+++1hPDpz4MiRI+FYtkaB9fvPnDkzN8aes8bGxjDO9jlgZy1Ezzl7vUSvRfXziwil5BdJ\nlJJfJFFKfpFEKflFEqXkF0lUVbfuBu
JSBGuTjEo/rMRR5JhrADh69GhujJWcWFvstm3bwjgrY0Zb\nVLOSFCv1sW3HoyO4gfjabN++PRz79ttvh3G2YnTt2rW5Mba1NrvmHR0dYXzRokVhPJo7ey1HW8Wr\n1CcilJJfJFFKfpFEKflFEqXkF0mUkl8kUUp+kURVvc4fYTXKKM5aV6M2SCCulbP7Z/f98ccfh/Gr\nV6+GcVZzXrx4cW7s4MGD4Vi2/uGFF14I41HbLAC8+uqrubElS5aEY1kdf86cOWE8es7Y+ofp06eH\ncYZtvx2t/WCtzNF9j+aoer3ziyRKyS+SKCW/SKKU/CKJUvKLJErJL5IoJb9Ioqpe54/qkKyeHdXT\nWb2aHcHNaqsRdlQ0u292zDVbBzB16tTcGOvHZ/sY3HfffWH89OnTYfyxxx7Lje3Zsycc29XVVeix\no9fEsmXLwrFsD4aenp4wHm2nDsT7CbBa/Wh69iN65xdJlJJfJFFKfpFEKflFEqXkF0mUkl8kUUp+\nkUTROr+ZbQbwAwB97r4su20GgD8BWAjgMID17h4XXYfGhb3I/f394fhJkyblxlgtPdrrHOBHdEdn\nCrA1BqxmzOZ+9913lz1+/vz54diHH344jLPrwvrW+/r6xmQsANxyyy1hPKrlR2sjAL6GgB2rzkTr\nTtjrga37KFUp7/y/B3Dv1257AsBb7r4EwFvZv0XkGkKT393fAXDqazc/CGBL9vEWAA9VeF4iMsbK\n/Zm/xd2/XN94DEBLheYjIlVS+Bd+PrTQOHexsZltNLN2M2tnP0eJSPWUm/y9ZjYXALK/c38z4+6b\n3L3N3dvYhowiUj3lJv9WABuyjzcAeK0y0xGRaqHJb2YvAngfwFIz6zKzxwE8DeB7ZnYAwNrs3yJy\nDaF1fnd/JCf03dE+mLuH9faojg/EtXa2zzqrta9YsSKMv/fee7kx1jPP6rJsrwHWO758+fLc2OrV\nq8OxM2bMCOOsd5ytA4jOqT927Fg4dt26dWGcrQOIzJ49O4xfvnw5jLNzHqLXKgBcd13++y5bkxLd\nt/btFxFKyS+SKCW/SKKU/CKJUvKLJErJL5Koqm7dbWZhmeLChQvh+KgUyLb9ZqU+tv32yZMnc2Mt\nLXFrw7Rp08I4K6f19vaG8Qg7Hpy1xbKWYPa1RVtUr1q1KhzLvu5Zs2aV/djsOWtvbw/ju3fvDuPn\nzp0L49Hx4qzVOdrCfjTbeuudXyRRSn6RRCn5RRKl5BdJlJJfJFFKfpFEKflFElXVOr+7h9sSs3bE\nqM2StcUODg6G8ePHj4fxaG5nzpwJxzY2NoZxVtdl20xHtd0bb7wxHMvajdl1YesnonUd0doJAPjw\nww/DOFujEH3tbE0JWx/BWnrZdY0en7W2R1vFq84vIpSSXyRRSn6RRCn5RRKl5BdJlJJfJFFKfpFE\nVb2fP6ppsy2Lozo/22qZ3ffhw4fDeLROgG1fzdYgsNrsnXfeGcajevm7774bjmXbZ69ZsyaMszr/\n0qVLc2MHDx4Mx7KeedaT39HRkRtjayvYluZs/4hoa24gfr2yNQjRY2vrbhGhlPwiiVLyiyRKyS+S\nKCW/SKKU/CKJUvKLJIrW+c1sM4AfAOhz92XZbU8B+AmAL5u9n3T310t5wKimXeRYY1bnZ/vys/ro\n+vXrc2Os550dJc3Gb9u2LYxHR3SzffXZ0easL531vUeOHj0axk+dOhXG2TqBqJ+/tbU1HFvkDAmA\n78EQvZbZnv/RY1e6n//3AO4d4fbfuPvK7E9JiS8i9YMmv7u/AyD+L1hErjlFfub/uZntNLPNZtZc\nsRmJSFWUm/y/BbAIwEoAPQB+lfeJZrbRzNrNrP306dNlPpyIVFpZye/uve5+xd2vAvgdgNXB525y\n9zZ3b2tu1jcIIvWirOQ3s7nD/vlDAHH7lYjUnVJKfS8C+A6AWWbWBeBfAXzHzFYCcACHAfx0DOco\nIm
OAJr+7PzLCzc+OwVxoTTmq5Tc0NIRjo73OSxkf1cNZPz/7cWfx4sVhnPW1L1iwIDfGet5ZvXr/\n/v1hvKenJ4xH1/XIkSPh2FmzZoXx2bNnh/Fob322rmPJkiVhPDp/AuB1/ui1zNa7qJ9fRApR8osk\nSskvkiglv0iilPwiiVLyiySq6kd0Ry2HrB0x2iaabaXMjlRmx2xHS5O7u7vDsWzrbVaGZKXC6Lo0\nNTWFY1nJ6rbbbgvjrO02KlNOnDgxHHv//feHcdbSG103tnX3+fPnwzjbspy9HqOt5Nl1iV6rbIv6\n4fTOL5IoJb9IopT8IolS8oskSskvkiglv0iilPwiiapqnR+Ia/nsWGNWD4+wNklWD9+7d29ubNGi\nReHYEydOhHFW12WtrdF1++yzz8KxbGvv6OsGgJdeeimMz507Nze2atWqcCzb9o1tOx4dD97Z2RmO\nZUeX33zzzWGcvd6ill+2RiA6PpytXxhO7/wiiVLyiyRKyS+SKCW/SKKU/CKJUvKLJErJL5Koqtb5\nzSysQ7Jth6OtvdkR3ax2ykTHaLN9CFiP9bJly8I465mPjsnet29fOJYdVf3++++H8XvuuSeMR/Vs\ntocCOyab7ZNw4MCBsmJAXEsH4uO/Ab51d4Qd0d3Y2Jgbq/QR3SLy/5CSXyRRSn6RRCn5RRKl5BdJ\nlJJfJFFKfpFE0Tq/mbUCeA5ACwAHsMndnzGzGQD+BGAhgMMA1rt72IDt7mHNm9Uoo3UAbF9+tg97\nVDsF4v5tVktnPfM7duwI4+y6RPVy1hP/8ssvh/GHHnoojDPRXgNRrz/Ae+LZ1xZdtzlz5oRj2dwm\nT55c9mMD8dqPKVOmhGOruW//ZQC/dPc7ANwN4GdmdgeAJwC85e5LALyV/VtErhE0+d29x913ZB8P\nAtgHYD6ABwFsyT5tC4BibxEiUlWj+pnfzBYC+BaAvwJocfeeLHQMQz8WiMg1ouTkN7NGAC8D+IW7\nDwyP+dAPOCP+kGNmG82s3cza+/v7C01WRCqnpOQ3s+sxlPh/cPc/Zzf3mtncLD4XQN9IY919k7u3\nuXsb23BRRKqHJr8N/Yr9WQD73P3Xw0JbAWzIPt4A4LXKT09ExkopLb3fBvBjALvMrCO77UkATwP4\nbzN7HEAngPVFJ3Pp0qUwHrX0sm2/Z8+eHcZZqS8qKzU0NIRjT548GcZZOzIrFUbto6zkxI7gPnTo\nUBhnW55H7anLly8Px7JyXG9vbxiPvjZWJhwYGAjjbGvvefPmhfHoeWGvh+i1OprWdZr87v4egLwC\n+3dLfiQRqSta4SeSKCW/SKKU/CKJUvKLJErJL5IoJb9Ioqp+RHeE1aSjFs9oDQDA69FsjUFf34gL\nGAEAg4OD4Vh2tHhzc3MY7+npCeNRuzK7LmyNQtE1DNHXzrZqf+CBB8L4hAkTwniRbcNZvZy9nlit\nPnots+sSvVa1dbeIUEp+kUQp+UUSpeQXSZSSXyRRSn6RRCn5RRJVV3V+1lMf9Yaz45xZT/ykSZPC\neLSVMzvuecGCBWG8pSXe/nDixIlhfP/+/bkx1q/P6sKdnZ1hnD1n7HmJ7N27N4yz9RFRzz7b4prd\nNxvPXk9RnD0nUZ2frREYTu/8IolS8oskSskvkiglv0iilPwiiVLyiyRKyS+SqKrW+c0srEOynvqo\nf5vVVdk+7exY5KhezY5znj9/fhhnxz2z+29ra8uNRWsAAH50+cKFC8M4u+7RXgdsX352RDd77Og5\nZes+2HVhj832cIjWrLAzKEazN39E7/wiiVLyiyRKyS+SKCW/SKKU/CKJUvKLJErJL5IoWuc3s1YA\nzwFoAeAANrn7M2b2FICfADiefeqT7v56dF/uHvYqjx8fTyfqoWY90Gwf9ajuCsRrDFasWBGOnTlz\nZhhne+v39/eXPX7t2rXh2Og8AoD3rV+8eDGMR9eNfd3sOWO19qge
zvZIYGcCnD17NowXWTfCrnlk\nNPv2l7LI5zKAX7r7DjNrArDdzN7IYr9x938vY44iUmM0+d29B0BP9vGgme0DEC9ZE5G6N6qf+c1s\nIYBvAfhrdtPPzWynmW02sxH3PTKzjWbWbmbt7NtXEamekpPfzBoBvAzgF+4+AOC3ABYBWImh7wx+\nNdI4d9/k7m3u3jZ9+vQKTFlEKqGk5Dez6zGU+H9w9z8DgLv3uvsVd78K4HcAVo/dNEWk0mjy21Ab\n3rMA9rn7r4fdPrzV7IcAdld+eiIyVkr5bf+3AfwYwC4z68huexLAI2a2EkPlv8MAfsruyMzCch4r\nt0XlGVYeYW2zrJ042qKatZ4ODAyEcTaetXBGZaOurq5wLLvmRY/wjrBrHh2xDfC226gMyVpuWamP\nYde1SMk7avkdzdbdpfy2/z0AI91jWNMXkfqmFX4iiVLyiyRKyS+SKCW/SKKU/CKJUvKLJKqqW3e7\ne1iTZlsWRy2erDbKsLpu1H7K6vSsdZWtUWCtrdH6B1b3ZdtnF1l7wcaztle2DoAdox3V0tl1Ya8n\nFmdrO5qamnJjbA3CaNp2I3rnF0mUkl8kUUp+kUQp+UUSpeQXSZSSXyRRSn6RRFmlaoYlPZjZcQCd\nw26aBeBE1SYwOvU6t3qdF6C5lauSc7vJ3WeX8olVTf5vPLhZu7vnHy5fQ/U6t3qdF6C5latWc9O3\n/SKJUvKLJKrWyb+pxo8fqde51eu8AM2tXDWZW01/5heR2qn1O7+I1EhNkt/M7jWzj8zsoJk9UYs5\n5DGzw2a2y8w6zKy9xnPZbGZ9ZrZ72G0zzOwNMzuQ/R33tVZ3bk+ZWXd27TrMbF2N5tZqZv9rZnvN\nbI+Z/VN2e02vXTCvmly3qn/bb2bjAHwM4HsAugB8AOARd99b1YnkMLPDANrcveY1YTP7OwBnATzn\n7suy2/4NwCl3fzr7j7PZ3f+5Tub2FICztT65OTtQZu7wk6UBPATgH1DDaxfMaz1qcN1q8c6/GsBB\ndz/k7l8A+COAB2swj7rn7u8AOPW1mx8EsCX7eAuGXjxVlzO3uuDuPe6+I/t4EMCXJ0vX9NoF86qJ\nWiT/fABHhv27C/V15LcDeNPMtpvZxlpPZgQt2bHpAHAMQEstJzMCenJzNX3tZOm6uXblnHhdafqF\n3zetcfeVAO4D8LPs29u65EM/s9VTuaakk5urZYSTpf+mlteu3BOvK60Wyd8NoHXYv2/MbqsL7t6d\n/d0H4BXU3+nDvV8ekpr93Vfj+fxNPZ3cPNLJ0qiDa1dPJ17XIvk/ALDEzG42swkAfgRgaw3m8Q1m\n1pD9IgZm1gDg+6i/04e3AtiQfbwBwGs1nMtX1MvJzXknS6PG167uTrx296r/AbAOQ7/x/wTAv9Ri\nDjnzWgTgw+zPnlrPDcCLGPo28BKGfjfyOICZAN4CcADAmwBm1NHc/gvALgA7MZRoc2s0tzUY+pZ+\nJ4CO7M+6Wl+7YF41uW5a4SeSKP3CTyRRSn6RRCn5RRKl5BdJlJJfJFFKfpFEKflFEqXkF0nU/wE5\naCgMI17JnQAAAABJRU5ErkJggg==\n",
302 | "text/plain": [
303 | ""
304 | ]
305 | },
306 | "metadata": {},
307 | "output_type": "display_data"
308 | }
309 | ],
310 | "source": [
311 | "# run the network backwards, given a label, see what image it produces\n",
312 | "\n",
313 | "# label to test\n",
314 | "label = 0\n",
315 | "# create the output signals for this label\n",
316 | "targets = numpy.zeros(output_nodes) + 0.01\n",
317 | "# all_values[0] is the target label for this record\n",
318 | "targets[label] = 0.99\n",
319 | "print(targets)\n",
320 | "\n",
321 | "# get image data\n",
322 | "image_data = n.backquery(targets)\n",
323 | "\n",
324 | "# plot image data\n",
325 | "matplotlib.pyplot.imshow(image_data.reshape(28,28), cmap='Greys', interpolation='None')"
326 | ]
327 | },
328 | {
329 | "cell_type": "code",
330 | "execution_count": null,
331 | "metadata": {
332 | "collapsed": true
333 | },
334 | "outputs": [],
335 | "source": []
336 | }
337 | ],
338 | "metadata": {
339 | "kernelspec": {
340 | "display_name": "Python 3",
341 | "language": "python",
342 | "name": "python3"
343 | },
344 | "language_info": {
345 | "codemirror_mode": {
346 | "name": "ipython",
347 | "version": 3
348 | },
349 | "file_extension": ".py",
350 | "mimetype": "text/x-python",
351 | "name": "python",
352 | "nbconvert_exporter": "python",
353 | "pygments_lexer": "ipython3",
354 | "version": "3.6.1"
355 | }
356 | },
357 | "nbformat": 4,
358 | "nbformat_minor": 1
359 | }
360 |
--------------------------------------------------------------------------------
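The `backquery()` method in the notebook above runs the network in reverse: apply `logit()` (the inverse of the sigmoid) to the layer outputs, push the signal back through the transposed weight matrix, then rescale into (0.01, 0.99) so the next `logit()` stays finite. A minimal sketch of that inversion with untrained random weights (so the resulting "image" is noise, but the mechanics match the notebook's method):

```python
import numpy
import scipy.special

# random, untrained weights with the notebook's shapes and initialisation
rng = numpy.random.default_rng(0)
inodes, hnodes, onodes = 784, 200, 10
wih = rng.normal(0.0, pow(inodes, -0.5), (hnodes, inodes))
who = rng.normal(0.0, pow(hnodes, -0.5), (onodes, hnodes))

def rescale(signal):
    # squeeze values into (0.01, 0.99) so logit() never sees 0 or 1
    signal = signal - numpy.min(signal)
    signal = signal / numpy.max(signal)
    return signal * 0.98 + 0.01

def backquery(targets):
    final_outputs = numpy.array(targets, ndmin=2).T
    final_inputs = scipy.special.logit(final_outputs)         # invert output activation
    hidden_outputs = rescale(numpy.dot(who.T, final_inputs))  # back through output weights
    hidden_inputs = scipy.special.logit(hidden_outputs)       # invert hidden activation
    return rescale(numpy.dot(wih.T, hidden_inputs))           # back through input weights

targets = numpy.zeros(onodes) + 0.01
targets[0] = 0.99          # ask: "what does a 0 look like?"
image = backquery(targets)
print(image.shape)         # (784, 1): one value per input pixel
```

With the trained `wih`/`who` from the notebook in place of the random matrices, reshaping the result to 28x28 produces the ghostly digit images shown in the output cell.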
/code/part3_neural_network_mnist_data_with_rotations.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": 1,
6 | "metadata": {
7 | "collapsed": true
8 | },
9 | "outputs": [],
10 | "source": [
11 | "# python notebook for Make Your Own Neural Network\n",
12 | "# code for a 3-layer neural network, and code for learning the MNIST dataset\n",
13 | "# this version creates additional training examples by rotating each original by +/- 10 degrees\n",
14 | "# (c) Tariq Rashid, 2016\n",
15 | "# license is GPLv2"
16 | ]
17 | },
18 | {
19 | "cell_type": "code",
20 | "execution_count": 2,
21 | "metadata": {
22 | "collapsed": true
23 | },
24 | "outputs": [],
25 | "source": [
26 | "# numpy provides arrays and useful functions for working with them\n",
27 | "import numpy\n",
28 | "# scipy.special for the sigmoid function expit()\n",
29 | "import scipy.special\n",
30 | "# scipy.ndimage for rotating image arrays\n",
31 | "import scipy.ndimage"
32 | ]
33 | },
34 | {
35 | "cell_type": "code",
36 | "execution_count": 3,
37 | "metadata": {},
38 | "outputs": [],
39 | "source": [
40 | "# neural network class definition\n",
41 | "class neuralNetwork:\n",
42 | " \n",
43 | " \n",
44 | " # initialise the neural network\n",
45 | " def __init__(self, inputnodes, hiddennodes, outputnodes, learningrate):\n",
46 | " # set number of nodes in each input, hidden, output layer\n",
47 | " self.inodes = inputnodes\n",
48 | " self.hnodes = hiddennodes\n",
49 | " self.onodes = outputnodes\n",
50 | " \n",
51 | " # link weight matrices, wih and who\n",
52 | " # weights inside the arrays are w_i_j, where link is from node i to node j in the next layer\n",
53 | " # w11 w21\n",
54 | " # w12 w22 etc \n",
55 | " self.wih = numpy.random.normal(0.0, pow(self.inodes, -0.5), (self.hnodes, self.inodes))\n",
56 | " self.who = numpy.random.normal(0.0, pow(self.hnodes, -0.5), (self.onodes, self.hnodes))\n",
57 | "\n",
58 | " # learning rate\n",
59 | " self.lr = learningrate\n",
60 | " \n",
61 | " # activation function is the sigmoid function\n",
62 | " self.activation_function = lambda x: scipy.special.expit(x)\n",
63 | " \n",
64 | " pass\n",
65 | "\n",
66 | " \n",
67 | " # train the neural network\n",
68 | " def train(self, inputs_list, targets_list):\n",
69 | " # convert inputs list to 2d array\n",
70 | " inputs = numpy.array(inputs_list, ndmin=2).T\n",
71 | " targets = numpy.array(targets_list, ndmin=2).T\n",
72 | " \n",
73 | " # calculate signals into hidden layer\n",
74 | " hidden_inputs = numpy.dot(self.wih, inputs)\n",
75 | " # calculate the signals emerging from hidden layer\n",
76 | " hidden_outputs = self.activation_function(hidden_inputs)\n",
77 | " \n",
78 | " # calculate signals into final output layer\n",
79 | " final_inputs = numpy.dot(self.who, hidden_outputs)\n",
80 | " # calculate the signals emerging from final output layer\n",
81 | " final_outputs = self.activation_function(final_inputs)\n",
82 | " \n",
83 | " # output layer error is the (target - actual)\n",
84 | " output_errors = targets - final_outputs\n",
85 | " # hidden layer error is the output_errors, split by weights, recombined at hidden nodes\n",
86 | " hidden_errors = numpy.dot(self.who.T, output_errors)\n",
87 | " \n",
88 | " # update the weights for the links between the hidden and output layers\n",
89 | " self.who += self.lr * numpy.dot((output_errors * final_outputs * (1.0 - final_outputs)), numpy.transpose(hidden_outputs))\n",
90 | " \n",
91 | " # update the weights for the links between the input and hidden layers\n",
92 | " self.wih += self.lr * numpy.dot((hidden_errors * hidden_outputs * (1.0 - hidden_outputs)), numpy.transpose(inputs))\n",
93 | " \n",
94 | " pass\n",
95 | "\n",
96 | " \n",
97 | " # query the neural network\n",
98 | " def query(self, inputs_list):\n",
99 | " # convert inputs list to 2d array\n",
100 | " inputs = numpy.array(inputs_list, ndmin=2).T\n",
101 | " \n",
102 | " # calculate signals into hidden layer\n",
103 | " hidden_inputs = numpy.dot(self.wih, inputs)\n",
104 | " # calculate the signals emerging from hidden layer\n",
105 | " hidden_outputs = self.activation_function(hidden_inputs)\n",
106 | " \n",
107 | " # calculate signals into final output layer\n",
108 | " final_inputs = numpy.dot(self.who, hidden_outputs)\n",
109 | " # calculate the signals emerging from final output layer\n",
110 | " final_outputs = self.activation_function(final_inputs)\n",
111 | " \n",
112 | " return final_outputs"
113 | ]
114 | },
115 | {
116 | "cell_type": "code",
117 | "execution_count": 4,
118 | "metadata": {},
119 | "outputs": [],
120 | "source": [
121 | "# number of input, hidden and output nodes\n",
122 | "input_nodes = 784\n",
123 | "hidden_nodes = 200\n",
124 | "output_nodes = 10\n",
125 | "\n",
126 | "# learning rate\n",
127 | "learning_rate = 0.01\n",
128 | "\n",
129 | "# create instance of neural network\n",
130 | "n = neuralNetwork(input_nodes,hidden_nodes,output_nodes, learning_rate)"
131 | ]
132 | },
133 | {
134 | "cell_type": "code",
135 | "execution_count": 5,
136 | "metadata": {},
137 | "outputs": [],
138 | "source": [
139 | "# load the mnist training data CSV file into a list\n",
140 | "training_data_file = open(\"mnist_dataset/mnist_train.csv\", 'r')\n",
141 | "training_data_list = training_data_file.readlines()\n",
142 | "training_data_file.close()"
143 | ]
144 | },
145 | {
146 | "cell_type": "code",
147 | "execution_count": 6,
148 | "metadata": {},
149 | "outputs": [],
150 | "source": [
151 | "# train the neural network\n",
152 | "\n",
153 | "# epochs is the number of times the training data set is used for training\n",
154 | "epochs = 10\n",
155 | "\n",
156 | "for e in range(epochs):\n",
157 | " # go through all records in the training data set\n",
158 | " for record in training_data_list:\n",
159 | " # split the record by the ',' commas\n",
160 | " all_values = record.split(',')\n",
161 | " # scale and shift the inputs\n",
162 |     "        inputs = (numpy.asarray(all_values[1:], dtype=float) / 255.0 * 0.99) + 0.01\n",
163 | " # create the target output values (all 0.01, except the desired label which is 0.99)\n",
164 | " targets = numpy.zeros(output_nodes) + 0.01\n",
165 | " # all_values[0] is the target label for this record\n",
166 | " targets[int(all_values[0])] = 0.99\n",
167 | " n.train(inputs, targets)\n",
168 | " \n",
169 | " ## create rotated variations\n",
170 | " # rotated anticlockwise by x degrees\n",
171 |     "        inputs_plusx_img = scipy.ndimage.rotate(inputs.reshape(28,28), 10, cval=0.01, order=1, reshape=False)\n",
172 | " n.train(inputs_plusx_img.reshape(784), targets)\n",
173 | " # rotated clockwise by x degrees\n",
174 |     "        inputs_minusx_img = scipy.ndimage.rotate(inputs.reshape(28,28), -10, cval=0.01, order=1, reshape=False)\n",
175 | " n.train(inputs_minusx_img.reshape(784), targets)\n",
176 | " \n",
177 | " # rotated anticlockwise by 10 degrees\n",
178 | " #inputs_plus10_img = scipy.ndimage.interpolation.rotate(inputs.reshape(28,28), 10, cval=0.01, order=1, reshape=False)\n",
179 | " #n.train(inputs_plus10_img.reshape(784), targets)\n",
180 | " # rotated clockwise by 10 degrees\n",
181 | " #inputs_minus10_img = scipy.ndimage.interpolation.rotate(inputs.reshape(28,28), -10, cval=0.01, order=1, reshape=False)\n",
182 | " #n.train(inputs_minus10_img.reshape(784), targets)\n",
183 | " \n",
184 | " pass\n",
185 | " pass"
186 | ]
187 | },
188 | {
189 | "cell_type": "code",
190 | "execution_count": 7,
191 | "metadata": {
192 | "collapsed": true
193 | },
194 | "outputs": [],
195 | "source": [
196 | "# load the mnist test data CSV file into a list\n",
197 | "test_data_file = open(\"mnist_dataset/mnist_test.csv\", 'r')\n",
198 | "test_data_list = test_data_file.readlines()\n",
199 | "test_data_file.close()"
200 | ]
201 | },
202 | {
203 | "cell_type": "code",
204 | "execution_count": 8,
205 | "metadata": {},
206 | "outputs": [],
207 | "source": [
208 | "# test the neural network\n",
209 | "\n",
210 | "# scorecard for how well the network performs, initially empty\n",
211 | "scorecard = []\n",
212 | "\n",
213 | "# go through all the records in the test data set\n",
214 | "for record in test_data_list:\n",
215 | " # split the record by the ',' commas\n",
216 | " all_values = record.split(',')\n",
217 | " # correct answer is first value\n",
218 | " correct_label = int(all_values[0])\n",
219 | " # scale and shift the inputs\n",
220 |     "    inputs = (numpy.asarray(all_values[1:], dtype=float) / 255.0 * 0.99) + 0.01\n",
221 | " # query the network\n",
222 | " outputs = n.query(inputs)\n",
223 | " # the index of the highest value corresponds to the label\n",
224 | " label = numpy.argmax(outputs)\n",
225 | " # append correct or incorrect to list\n",
226 | " if (label == correct_label):\n",
227 | " # network's answer matches correct answer, add 1 to scorecard\n",
228 | " scorecard.append(1)\n",
229 | " else:\n",
230 | " # network's answer doesn't match correct answer, add 0 to scorecard\n",
231 | " scorecard.append(0)\n",
232 | " pass\n",
233 | " \n",
234 | " pass"
235 | ]
236 | },
237 | {
238 | "cell_type": "code",
239 | "execution_count": 9,
240 | "metadata": {},
241 | "outputs": [
242 | {
243 | "name": "stdout",
244 | "output_type": "stream",
245 | "text": [
246 | "performance = 0.9754\n"
247 | ]
248 | }
249 | ],
250 | "source": [
251 | "# calculate the performance score, the fraction of correct answers\n",
252 | "scorecard_array = numpy.asarray(scorecard)\n",
253 | "print (\"performance = \", scorecard_array.sum() / scorecard_array.size)"
254 | ]
255 | },
256 | {
257 | "cell_type": "code",
258 | "execution_count": null,
259 | "metadata": {
260 | "collapsed": true
261 | },
262 | "outputs": [],
263 | "source": []
264 | }
265 | ],
266 | "metadata": {
267 | "kernelspec": {
268 | "display_name": "Python 3",
269 | "language": "python",
270 | "name": "python3"
271 | },
272 | "language_info": {
273 | "codemirror_mode": {
274 | "name": "ipython",
275 | "version": 3
276 | },
277 | "file_extension": ".py",
278 | "mimetype": "text/x-python",
279 | "name": "python",
280 | "nbconvert_exporter": "python",
281 | "pygments_lexer": "ipython3",
282 | "version": "3.6.1"
283 | }
284 | },
285 | "nbformat": 4,
286 | "nbformat_minor": 1
287 | }
288 |
--------------------------------------------------------------------------------
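The augmentation step in the training loop above rotates each 28x28 training image by +/-10 degrees, tripling the effective training set. A minimal sketch of just that step on a synthetic image (`reshape=False` keeps the output 28x28; `cval=0.01` fills the corners exposed by rotation with the background value; `order=1` uses fast bilinear interpolation):

```python
import numpy
import scipy.ndimage

# synthetic "digit": a vertical bar, scaled into 0.01..1.0 like the MNIST inputs
img = numpy.full((28, 28), 0.01)
img[4:24, 13:15] = 0.99

# rotate 10 degrees anticlockwise and 10 degrees clockwise
plus10 = scipy.ndimage.rotate(img, 10.0, cval=0.01, order=1, reshape=False)
minus10 = scipy.ndimage.rotate(img, -10.0, cval=0.01, order=1, reshape=False)

print(plus10.shape, minus10.shape)   # both stay (28, 28)
# the flattened variants are what gets passed to n.train() alongside the original
print(plus10.reshape(784).shape)
```

Small rotations work because they produce plausible handwriting variations; much larger angles would create images unlike anything in the test set and tend to hurt performance.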