├── .gitignore ├── LICENSE ├── README.md ├── __init__.py ├── core ├── analysis_utils.py ├── computation_layer.py ├── connection_layer.py ├── graph_functions.py ├── learning │ ├── learning_rule.py │ ├── stdp.py │ └── weight_bounds.py ├── model.py ├── neuron_layer.py ├── spike_process.py └── utils.py ├── drawing_utils ├── connection_layer_drawing.py ├── neuron_layer_drawing.py └── trace_renderers.py ├── jupyter_examples ├── README.md ├── example_0_single_neuron.ipynb ├── example_10_synaptic_delay.ipynb ├── example_11_composite_model.ipynb ├── example_12_spike_processes.ipynb ├── example_13_stdp_basics.ipynb ├── example_14_poisson_neurons.ipynb ├── example_15_stdp_learning_rule.ipynb ├── example_16_stdp_in_situ.ipynb ├── example_1_three_identity_neurons.ipynb ├── example_2_three_izhikevich_neurons.ipynb ├── example_3_three_LIF_neurons.ipynb ├── example_4_recurrence.ipynb ├── example_5_two_layers.ipynb ├── example_6_draw_neuron_distributions.ipynb ├── example_7_synaptic_failure.ipynb ├── example_8_synaptic_reset.ipynb └── example_9_synaptic_modification_out_of_graph.ipynb └── jupyter_explorations └── README.md /.gitignore: -------------------------------------------------------------------------------- 1 | *.pyc 2 | jupyter_examples/.ipynb_checkpoints 3 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2019 Colin Prepscius 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright 
notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 22 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # spikeflow 2 | 3 | Spiking neural networks in tensorflow. 4 | 5 | ***Installation:*** pip install coming soon. I'll wait until it achieves just a little bit of stability... some feedback'd help here... 6 | 7 | 8 | 9 | **Hypothesis:** Biological plausibility in neural networks is not an obstacle, but a benefit, for machine learning applications. 10 | 11 | **Proof:** None. Yet. 12 | 13 | The purpose of this library is to explore the connection between computational neuroscience and machine learning, and to practice implementing efficient and fast models in tensorflow. 14 | 15 | Spikeflow makes it easy to assemble arbitrarily connected layers of spiking neurons into a tensorflow graph, and then to 'step time'. 16 | 17 | Spikeflow will concentrate on facilitating the exploration of spiking neuron models, local-only learning rules, dynamic network construction, the role of inhibition in the control of attractor dynamics, etc. 18 | 19 | The library will implement: 20 | - multiple models of spiking neurons, including 21 | - [x] Leaky Integrate-and-fire 22 | - [x] Izhikevich 23 | - [x] Poisson (can convert things like images to spike trains) 24 | - [ ] Hodgkin-Huxley? Taking requests...
25 | - arbitrary connections between layers of similar neurons, including 26 | - [x] feed-forward 27 | - [x] recurrent 28 | - [x] inhibitory 29 | - [x] layer compositing (layers can have sub-layers with a single input and output) 30 | - multiple models of synapses, including 31 | - [x] simple weights 32 | - [x] synapses with decay 33 | - [x] synapses with failure 34 | - [x] synapses with post-synaptic reset 35 | - [x] synapses with delay 36 | - out-of-graph and in-graph learning rules, including 37 | - [x] out-of-graph weight modification 38 | - [x] out-of-graph STDP, compatible with model.step_time's end timestep callback 39 | - [x] in-graph STDP with weight bounds 40 | - [x] symmetric STDP 41 | - [ ] three-factor STDP 42 | - [ ] others? taking requests! 43 | - forms of dynamic neural network construction, including 44 | - [ ] Synaptic pruning and synaptogenesis 45 | - [ ] Neuron pruning and neurogenesis 46 | - [ ] Other structure modification 47 | 48 | #### The basic modeling idea is this: 49 | 50 | ``` 51 | model = BPNNModel.compiled_model(input_shape, 52 | [ neuronlayer1, neuronlayer2, ...], 53 | [ connections1, connections2, ...]) 54 | 55 | model.run_time(data_generator, end_time_step_callback) 56 | ``` 57 | 58 | See the examples in the `jupyter_examples` directory. 59 | 60 | Feedback and collaborators welcome!
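The connection scheme above can be sketched in plain numpy. This is a standalone illustration of how spikeflow represents individual synapses and turns them into a dense weight matrix (mirroring `Synapse` and `weights_from_synapses` in `core/connection_layer.py`); the helper name `build_weights` and the explicit layer sizes here are hypothetical, chosen just for the example:

```python
# Standalone sketch, not the library API: the real weights_from_synapses
# takes layer objects and reads their sizes; build_weights is a hypothetical
# free-standing variant taking explicit neuron counts.
from collections import namedtuple
import numpy as np

Synapse = namedtuple('Synapse', ['from_neuron_index', 'to_neuron_index', 'v'])

def build_weights(n_from, n_to, synapses):
    # zero everywhere except at the listed synapses
    w = np.zeros((n_from, n_to), dtype=np.float32)
    for s in synapses:
        w[s.from_neuron_index, s.to_neuron_index] = s.v
    return w

# 3 pre-synaptic neurons, 2 post-synaptic neurons, 2 synapses
# (a negative value makes an inhibitory connection)
w = build_weights(3, 2, [Synapse(0, 0, 0.5), Synapse(2, 1, -0.25)])
print(w)
```

The resulting matrix is what a `SynapseLayer` multiplies the pre-synaptic spike vector by at each time step.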
61 | -------------------------------------------------------------------------------- /__init__.py: -------------------------------------------------------------------------------- 1 | from spikeflow.core.utils import * 2 | from spikeflow.core.computation_layer import * 3 | from spikeflow.core.neuron_layer import * 4 | from spikeflow.core.connection_layer import * 5 | from spikeflow.core.model import * 6 | from spikeflow.core.spike_process import * 7 | from spikeflow.core.analysis_utils import * 8 | from spikeflow.core.learning.stdp import * 9 | from spikeflow.core.learning.weight_bounds import * 10 | -------------------------------------------------------------------------------- /core/analysis_utils.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | from spikeflow.core.spike_process import * 3 | 4 | 5 | """ 6 | Some useful tools for analysis - get spike counts, firing rates, etc. 7 | """ 8 | 9 | def spike_counts(spike_processes): 10 | """ Gets the number of times each neuron fired. 11 | Args: 12 | spike_processes: [ np.array([indexes of spikes]) ] 13 | Returns: 14 | [ ints: counts of firings ] 15 | """ 16 | return np.array([ p.shape[0] for p in spike_processes ]) 17 | 18 | 19 | def num_inactive_processes(spike_processes): 20 | """ Gets the number of neurons that didn't fire at all. 21 | Args: 22 | spike_processes: [ np.array([indexes of spikes]) ] 23 | Returns: 24 | int 25 | """ 26 | return sum([1 if len(fp) == 0 else 0 for fp in spike_processes]) 27 | 28 | 29 | def num_zero_firing(firings): 30 | """ Gets the number of neurons that didn't fire at all.
31 | Args: 32 | firings: np.array((timesteps, # neurons)) of bools; True = spike 33 | Returns: 34 | int 35 | """ 36 | return num_inactive_processes(firings_to_processes(firings)) 37 | 38 | 39 | def spiking_rates(spike_processes, n_timesteps): 40 | """ Gets the spiking rate of each neuron. 41 | Args: 42 | spike_processes: [ np.array([indexes of spikes]) ] 43 | n_timesteps: int 44 | Returns: 45 | [ float ] 46 | """ 47 | return np.array([ c / n_timesteps for c in spike_counts(spike_processes) ]) 48 | 49 | 50 | def firing_rates(firings): 51 | """ Gets the firing rates of all neurons. 52 | Args: 53 | firings: np.array((timesteps, # neurons)) of bools; True = spike 54 | Returns: 55 | [ float ] 56 | """ 57 | return spiking_rates(firings_to_processes(firings), firings.shape[0]) 58 | 59 | 60 | def average_firing_rates(rates): 61 | """ Gets the total average firing rate of all neurons. 62 | Args: 63 | rates: [ float ] firing rates, one per neuron 64 | Returns: 65 | float 66 | """ 67 | return np.sum(rates) / len(rates) 68 | -------------------------------------------------------------------------------- /core/computation_layer.py: -------------------------------------------------------------------------------- 1 | class ComputationLayer: 2 | 3 | """ Base class for computation graph layers. 4 | Inherited by NeuronLayer and ConnectionLayer. 5 | 6 | Basically, creates a computation graph kernel (or subset thereof). 7 | 8 | Defines: 9 | - computation graph nodes the subclasses must have (and possibly create): input and output 10 | - abstract methods the subclasses must implement in order to compile to a computation graph 11 | 12 | Variables: 13 | name: name of the layer. Must be unique within sibling layers.
14 | input: tf.Tensor operation, such as tf.Variable 15 | output: tf.Tensor operation, such as tf.Variable 16 | """ 17 | 18 | def __init__(self, name): 19 | self.name = name 20 | self.input = None 21 | self.output = None 22 | 23 | def _ops(self): 24 | """ Subclasses need to return [tf.Tensor operation] or similar 25 | Returns an array of tensor operations that session.run should evaluate. 26 | Or: a dictionary of names to ops values - basically anything that 27 | can be fed to tensorflow's session.run method's 'fetches' parameter. 28 | """ 29 | raise NotImplementedError 30 | 31 | def _compile(self): 32 | """ Compile the tensorflow graph operations. Herein may construct 33 | output and input. 34 | """ 35 | raise NotImplementedError 36 | 37 | def ops_format(self): 38 | """ Gets a description of what ops returns. """ 39 | raise NotImplementedError 40 | -------------------------------------------------------------------------------- /core/connection_layer.py: -------------------------------------------------------------------------------- 1 | import tensorflow as tf 2 | import numpy as np 3 | from collections import namedtuple 4 | from spikeflow.core.utils import floats_uniform, floats_normal, identical_ints_sampler 5 | from spikeflow.core.computation_layer import ComputationLayer 6 | 7 | 8 | 9 | Synapse = namedtuple('Synapse', ['from_neuron_index', 'to_neuron_index', 'v']) 10 | 11 | 12 | def weights_from_synapses(from_layer, to_layer, synapses): 13 | """ Constructs a weight matrix from the given synapses. 
14 | 15 | Args: 16 | from_layer: the pre-synaptic neuron layer 17 | to_layer: the post-synaptic neuron layer 18 | synapses: iterable of Synapse 19 | Return: 20 | weights: np.array((from_layer.output_n, to_layer.input_n), float32) 21 | """ 22 | w = np.zeros((from_layer.output_n, to_layer.input_n), dtype=np.float32) 23 | for s in synapses: 24 | w[s.from_neuron_index, s.to_neuron_index] = s.v 25 | return w 26 | 27 | 28 | def weights_connecting_from_to(from_layer, to_layer, connectivity, v_sampler, from_range=None, to_range=None, exclude=None): 29 | """ Constructs a weight matrix by connecting each neuron in from_layer to 30 | a random set of neurons in to_layer. 31 | 32 | Args: 33 | from_layer: the pre-synaptic neuron layer 34 | to_layer: the post-synaptic neuron layer 35 | connectivity: float [0,1], each pre-synaptic neuron should connect to 36 | this fraction of post-synaptic neurons 37 | v_sampler: weight distribution 38 | from_range: tuple of (from: int index, to: int index), or None. Use this 39 | to control connectivity from a subset of pre-synaptic neurons 40 | to_range: tuple of (from: int index, to: int index), or None.
Use this 41 | to control connectivity to a subset of post-synaptic neurons 42 | exclude: [(x,y)] - those weights are set to 0 43 | Return: 44 | weights: np.array((from_layer.output_n, to_layer.input_n), float32) 45 | """ 46 | w = np.zeros((from_layer.output_n, to_layer.input_n), dtype=np.float32) 47 | fr = (0, from_layer.output_n) if from_range is None else from_range 48 | tr = (0, to_layer.input_n) if to_range is None else to_range 49 | n = int(connectivity * (tr[1] - tr[0])) 50 | for from_i in range(fr[0], fr[1]): 51 | v_arr = v_sampler(n) 52 | to_neuron_indexes = np.random.choice(tr[1]-tr[0], n, replace=False) + tr[0] 53 | for to_i, v in zip(to_neuron_indexes, v_arr): 54 | w[from_i, to_i] = v 55 | if exclude: 56 | for x, y in exclude: 57 | w[x, y] = 0.0 58 | return w 59 | 60 | 61 | def samples_for_weights(weights, sampler): 62 | """ Creates a matrix of the same shape as weights, 63 | with values drawn from sampler everywhere weights is non-zero, 64 | and zero otherwise. Used, for instance, for creating a delay 65 | matrix to match a weight matrix.
66 | Args: 67 | weights: a 2d weight matrix 68 | sampler: a function that returns n random values, or a scalar 69 | Return: 70 | a matrix of values drawn from sampler, same shape and 71 | topology as weights 72 | """ 73 | if np.isscalar(sampler): 74 | return np.float32(sampler) 75 | else: 76 | indexes = np.nonzero(weights) 77 | delays = np.zeros(weights.shape) 78 | delays[indexes] = sampler(len(indexes[0])) 79 | return delays 80 | 81 | def delays_for_weights(weights, delay_sampler): 82 | """ Creates a delay matrix for a corresponding weight matrix, using a sampler """ 83 | return np.maximum(samples_for_weights(weights, delay_sampler), np.zeros(weights.shape)) 84 | 85 | def decays_for_weights(weights, decay_sampler): 86 | """ Creates a decay matrix for a corresponding weight matrix, using a sampler """ 87 | return np.minimum(np.maximum(samples_for_weights(weights, decay_sampler), weights * 0), np.ones(weights.shape)) 88 | 89 | 90 | class ConnectionLayer(ComputationLayer): 91 | """ Base class for all connection layers. 92 | """ 93 | 94 | def __init__(self, name, from_layer, to_layer): 95 | """ Constructs a connection layer. 96 | Args: 97 | from_layer: NeuronLayer 98 | to_layer: NeuronLayer 99 | """ 100 | super().__init__(name) 101 | self.from_layer = from_layer 102 | self.to_layer = to_layer 103 | 104 | 105 | def _compile_output_node(self): 106 | """ Child classes must implement this. Will be called in the context 107 | of a graph during network construction phase. This method will be called 108 | before compile(), and must create an output tensor, which will be 109 | connected to to_layer's input. 110 | """ 111 | raise NotImplementedError 112 | 113 | 114 | class SynapseLayer(ConnectionLayer): 115 | """ A connection layer of simple synapses (just a weight matrix that multiplies 116 | the input). 117 | """ 118 | 119 | def __init__(self, name, from_layer, to_layer, weights): 120 | """ Constructs a (simple) SynapseLayer.
121 | Args: 122 | from_layer: from neuron layer 123 | to_layer: to neuron layer 124 | weights: np.array((from_layer.n, to_layer.n), float32) 125 | """ 126 | super().__init__(name, from_layer, to_layer) 127 | self.w = weights 128 | self.output_op = None 129 | 130 | def _compile_output_node(self): 131 | self.output = tf.Variable(np.zeros((self.w.shape[1],), dtype=np.float32), name='Synapse_Layer_Output') 132 | 133 | def _ops(self): 134 | return [self.input, self.output_op] 135 | 136 | def ops_format(self): 137 | # should we change to { key: node }? 138 | return ['input', 'output'] 139 | 140 | def _compile(self): 141 | # create variables 142 | self.weights = tf.Variable(self.w) 143 | 144 | # compute output: input * weights 145 | input_f = tf.to_float(self.input) 146 | o = tf.matmul(tf.expand_dims(input_f, 0), self.weights) 147 | o_reshaped = tf.reshape(o, [-1]) 148 | self.output_op = self.output.assign(o_reshaped) 149 | 150 | def set_weights_in_session(self, w, sess): 151 | self.w = w 152 | sess.run(self.weights.assign(w)) 153 | 154 | def set_weights_in_graph(self, w, graph): 155 | with tf.Session(graph=graph) as sess: 156 | self.set_weights_in_session(w, sess) 157 | 158 | 159 | 160 | class ComplexSynapseLayer(SynapseLayer): 161 | """ A connection layer of synapses with additional capabilities: 162 | 163 | decay: synaptic sum values will decay in time: single layer-wide float 164 | failure_prob: some synapses might fail to transmit: single layer-wide float 165 | post_synaptic_reset_factor: when post synaptic neuron fires, scale synaptic 166 | sum values immediately: single layer-wide float 167 | delay: synaptic-level delay matrix. Each value is int: number of timesteps 168 | of delay to apply to that synapse. Must match shape and topology of weights. 169 | NOTE: using this feature will create a tensor of shape n x t, where n is the 170 | number of synapses, and t is the maximum delay, in number of timesteps. 171 | max_delay: maximum value for delay. 
If not none, the above-mentioned delay 172 | tensor will be of shape n x max_delay. 173 | """ 174 | 175 | def __init__(self, name, from_layer, to_layer, weights, decay=None, failure_prob=None, post_synaptic_reset_factor=None, delay=None, max_delay=None): 176 | """ Constructs a ComplexSynapseLayer. 177 | Args: 178 | from_layer: from neuron layer 179 | to_layer: to neuron layer 180 | weights: np.array((from_layer.n, to_layer.n), float32) 181 | decay: float [0,1], decay to apply to post-synaptic summation over time 182 | failure_prob: float [0,1], probability of synaptic failure 183 | post_synaptic_reset_factor: float (likely [0,1] but not necessarily), 184 | amount to scale post synaptic summation in the event post-synaptic 185 | neuron fires 186 | delay: int (single layer-wide delay) or np.array(ints), which 187 | must match shape and topology of weights (same indexes of non-zero 188 | weights). 189 | max_delay: int, maximum possible value for delay 190 | """ 191 | 192 | super().__init__(name, from_layer, to_layer, weights) 193 | self.decay = decay 194 | self.failure_prob = failure_prob 195 | self.post_synaptic_reset_factor = post_synaptic_reset_factor 196 | self.delay = delay if not np.isscalar(delay) else delays_for_weights(weights, identical_ints_sampler(delay)) 197 | self.max_delay = max_delay 198 | 199 | def _ops(self): 200 | return [self.input, self.output_op] 201 | 202 | def ops_format(self): 203 | return ['input', 'output'] 204 | 205 | 206 | def _compile(self): 207 | 208 | # define object variable for weights 209 | self.weights = tf.Variable(self.w) 210 | 211 | input_f = tf.to_float(self.input) 212 | 213 | if self.post_synaptic_reset_factor is None: 214 | pstf_inv = None 215 | else: 216 | pstf_inv = tf.to_float(1.0 - self.post_synaptic_reset_factor) 217 | 218 | # make some synapses fail (just bring their weights to 0) 219 | if self.failure_prob is None: 220 | succeeding_weights = self.weights 221 | else: 222 | random_noise = 
tf.random_uniform(self.weights.shape, 0, 1.0) 223 | succeeding_weights = self.weights * tf.to_float(tf.greater(random_noise, self.failure_prob)) 224 | 225 | # synaptic-level delay 226 | inputs_flat = tf.expand_dims(input_f, 0) 227 | if self.delay is None: 228 | o = tf.matmul(inputs_flat, succeeding_weights) 229 | o_delayed = tf.reshape(o, [-1]) 230 | else: 231 | # Create a tensor variable to hold the delays (so we can change them) 232 | self.delays = tf.Variable(self.delay) 233 | 234 | n = self.w.shape[0]*self.w.shape[1] 235 | d_max = self.max_delay if self.max_delay is not None else int(np.max(self.delay)) 236 | 237 | # Create a variable: n (input*output neurons) by d_max tensor 238 | # to hold propagating synaptic values 239 | propagating_values = tf.Variable(np.zeros((n, d_max+1)), dtype=tf.float32) 240 | 241 | # get 2d tensor of scatter indices from delays 242 | delays_flat = tf.cast(tf.reshape(self.delays, [-1]), tf.int32) 243 | indices = tf.range(0, n) 244 | delay_scatter_indices = tf.transpose(tf.reshape(tf.concat([indices, delays_flat], axis=0), (2, n))) 245 | 246 | # multiply inputs by weights element-wise, and flatten 247 | multed_inputs = tf.transpose(tf.multiply(tf.transpose(succeeding_weights), input_f)) 248 | flat_inputs = tf.reshape(multed_inputs, (-1,)) 249 | 250 | # roll the propagation tensor by 1 timestep down 251 | propagation_rolled = tf.roll(propagating_values, shift=-1, axis=1) 252 | roll_op = propagating_values.assign(propagation_rolled) 253 | 254 | # scatter input into the propagation tensor 255 | k = tf.scatter_nd_update(roll_op, delay_scatter_indices, flat_inputs) 256 | prop_op = propagating_values.assign(k) 257 | 258 | # get the 0th timestep's values 259 | final_out = prop_op[:,0] 260 | 261 | # reshape back into same shape as weight matrix 262 | r_out = tf.reshape(final_out, self.weights.shape) 263 | 264 | # sum per output neuron 265 | o_delayed = tf.reduce_sum(r_out, axis=0) 266 | 267 | 268 | # apply decay 269 | if self.decay is None: 
270 | o_decayed = o_delayed 271 | else: 272 | o_decayed = o_delayed + self.output * self.decay 273 | 274 | # if the post-synaptic layer fired, scale my output by post-synaptic reset factor 275 | if pstf_inv is None: 276 | o_reset = o_decayed 277 | else: 278 | o_reset = o_decayed - tf.to_float(self.to_layer.output) * pstf_inv * o_decayed 279 | 280 | # top of graph 281 | self.output_op = self.output.assign(o_reset) 282 | 283 | 284 | 285 | 286 | # def compiled_stdp_graph_w1(w, input, output, stdp_params): 287 | # dW = tf.Variable(np.zeros(w.shape), dtype=tf.float32) 288 | # 289 | # pre_synaptic_traces = tf.Variable(np.zeros((w.shape[0],)), dtype=tf.float32) 290 | # post_synaptic_traces = tf.Variable(np.zeros((w.shape[1],)), dtype=tf.float32) 291 | # #self.post_synaptic_triplet_traces = tf.Variable(np.zeros((self.w.shape[1],)), dtype=tf.float32) 292 | # 293 | # pre_synaptic_traces_decayed = pre_synaptic_traces / stdp_params.TauPlus 294 | # post_synaptic_traces_decayed = post_synaptic_traces / stdp_params.TauMinus 295 | # 296 | # pre_synaptics_activated = input * stdp_params.APlus 297 | # post_synaptics_activated = output * stdp_params.AMinus * -1.0 298 | # 299 | # if stdp_params.all_to_all: 300 | # new_pre_synaptic_traces = pre_synaptic_traces_decayed + pre_synaptics_activated 301 | # new_post_synaptic_traces = post_synaptic_traces_decayed + post_synaptics_activated 302 | # else: 303 | # new_pre_synaptic_traces = tf.min(pre_synaptic_traces_decayed + pre_synaptics_activated, stdp_params.APlus) 304 | # new_post_synaptic_traces = tf.min(pre_synaptic_traces_decayed + post_synaptics_activated, stdp_params.AMinus) 305 | # 306 | # pre_synaptic_traces = tf.assign(pre_synaptic_traces, new_pre_synaptic_traces) 307 | # post_synaptic_traces = tf.assign(post_synaptic_traces, new_post_synaptic_traces) 308 | # 309 | # pre_dw = output * pre_synaptic_traces 310 | # post_dw = input * post_synaptic_traces 311 | # 312 | # tdw = tf.add(pre_dw, post_dw) 313 | # ndw = dW + tdw 314 | # dW = 
dW.assign(ndw) 315 | # 316 | # return dW 317 | # 318 | # 319 | # def compiled_stdp_graph(w, input, output, stdp_params): 320 | # 321 | # # --- define variables --- 322 | # 323 | # # dW is the change-in-weight matrix: what we want to actually calculate 324 | # dW = tf.Variable(np.zeros(w.shape), dtype=tf.float32) 325 | # 326 | # # pre- and post-synaptic neuron firings will leave traces used to compute dW 327 | # pre_synaptic_traces = tf.Variable(np.zeros((w.shape[0],)), dtype=tf.float32) 328 | # post_synaptic_traces = tf.Variable(np.zeros((w.shape[1],)), dtype=tf.float32) 329 | # if stdp_params.uses_triplet_rule: 330 | # post_synaptic_triplet_traces = tf.Variable(np.zeros((w.shape[1],)), dtype=tf.float32) 331 | # 332 | # 333 | # # --- decay traces, raise by A+ or A- for pre- and post-synaptic spikes --- 334 | # 335 | # # decay all traces exponentially 336 | # pre_synaptic_traces_decayed = pre_synaptic_traces / stdp_params.TauPlus 337 | # post_synaptic_traces_decayed = post_synaptic_traces / stdp_params.TauMinus 338 | # if stdp_params.uses_triplet_rule: 339 | # post_synaptic_triplet_traces_decayed = post_synaptic_triplet_traces / stdp_params.TauMinusTriplet 340 | # 341 | # # for each pre- or post-synaptic spike, change the appropriate trace by a factor 342 | # pre_synaptics_activated = input * stdp_params.APlus 343 | # post_synaptics_activated = output * stdp_params.AMinus * -1.0 344 | # if stdp_params.uses_triplet_rule: 345 | # post_synaptics_triplet_activated = output * stdp_params.AMinusTriplet * -1.0 346 | # 347 | # # update the traces 348 | # new_pre_synaptic_traces = pre_synaptic_traces_decayed + pre_synaptics_activated 349 | # new_post_synaptic_traces = post_synaptic_traces_decayed + post_synaptics_activated 350 | # if stdp_params.uses_triplet_rule: 351 | # new_post_synaptic_triplet_traces = post_synaptic_triplet_traces_decayed + post_synaptics_triplet_activated 352 | # 353 | # # if it's not 'all-to-all', then cap traces at the appropriate level 354 | # if not 
stdp_params.all_to_all: 355 | # new_pre_synaptic_traces = tf.minimum(new_pre_synaptic_traces, stdp_params.APlus) 356 | # new_post_synaptic_traces = tf.minimum(new_post_synaptic_traces, stdp_params.AMinus) 357 | # if stdp_params.uses_triplet_rule: 358 | # new_post_synaptic_triplet_traces = tf.minimum(new_post_synaptic_triplet_traces, stdp_params.AMinusTriplet) 359 | # 360 | # # assign new traces 361 | # pre_synaptic_traces = tf.assign(pre_synaptic_traces, new_pre_synaptic_traces) 362 | # post_synaptic_traces = tf.assign(post_synaptic_traces, new_post_synaptic_traces) 363 | # if stdp_params.uses_triplet_rule: 364 | # post_synaptic_triplet_traces = tf.assign(post_synaptic_triplet_traces, new_post_synaptic_triplet_traces) 365 | # 366 | # # contributions to dW are given by post-synaptic spikes * pre-synaptic traces, 367 | # # and pre-synaptic spikes * post-synaptic traces 368 | # pre_dw = output * pre_synaptic_traces 369 | # post_dw = input * post_synaptic_traces 370 | # if stdp_params.uses_triplet_rule: 371 | # post_dw = post_dw + input * post_synaptic_triplet_traces 372 | # 373 | # # total dW is sum of both 374 | # tdw = tf.add(pre_dw, post_dw) 375 | # ndw = dW + tdw 376 | # dW = dW.assign(ndw) 377 | # 378 | # # filter dW by W? Don't even bother? 
379 | # if stdp_params.uses_triplet_rule: 380 | # return dW, pre_synaptic_traces, post_synaptic_traces, post_synaptic_triplet_traces 381 | # else: 382 | # return dW, pre_synaptic_traces, post_synaptic_traces 383 | # 384 | # 385 | # class STDP_Tester: 386 | # 387 | # def __init__(self, w, stdp_params): 388 | # self.w = w 389 | # self.stdp_params = stdp_params 390 | # 391 | # def compile_stdp_graph(self): 392 | # self.input = tf.placeholder(tf.float32, shape=(self.w.shape[0],), name='input') 393 | # self.output = tf.placeholder(tf.float32, shape=(self.w.shape[1],), name='output') 394 | # if self.stdp_params.uses_triplet_rule: 395 | # self.dW, self.pre_synaptic_traces, self.post_synaptic_traces, self.post_synaptic_triplet_traces = compiled_stdp_graph(self.w, self.input, self.output, self.stdp_params) 396 | # else: 397 | # self.dW, self.pre_synaptic_traces, self.post_synaptic_traces = compiled_stdp_graph(self.w, self.input, self.output, self.stdp_params) 398 | # 399 | # def test(self, input_firings, output_firings): 400 | # graph = tf.Graph() 401 | # with graph.as_default(): 402 | # self.compile_stdp_graph() 403 | # 404 | # pre_traces = [] 405 | # post_traces = [] 406 | # post_triplet_traces = [] 407 | # last_dw = None 408 | # with tf.Session(graph=graph) as sess: 409 | # tf.global_variables_initializer().run() 410 | # 411 | # runnables = [self.dW, self.pre_synaptic_traces, self.post_synaptic_traces, self.post_synaptic_triplet_traces] if self.stdp_params.uses_triplet_rule else [self.dW, self.pre_synaptic_traces, self.post_synaptic_traces] 412 | # 413 | # for input_v, output_v in zip(input_firings, output_firings): 414 | # 415 | # results = sess.run(runnables, feed_dict={self.input: input_v, self.output: output_v}) 416 | # last_dw = results[0] 417 | # pre_traces.append(results[1]) 418 | # post_traces.append(results[2]) 419 | # 420 | # if self.stdp_params.uses_triplet_rule: 421 | # post_triplet_traces.append(results[3]) 422 | # 423 | # if 
self.stdp_params.uses_triplet_rule: 424 | # return last_dw, np.array(pre_traces), np.array(post_traces), np.array(post_triplet_traces) 425 | # else: 426 | # return last_dw, np.array(pre_traces), np.array(post_traces) 427 | -------------------------------------------------------------------------------- /core/graph_functions.py: -------------------------------------------------------------------------------- 1 | import tensorflow as tf 2 | import numpy as np 3 | 4 | def logsafe(v): 5 | return tf.where(tf.equal(v, 0.0), tf.zeros_like(v), tf.log(v)) 6 | 7 | def ricker_wavelet(v, sigma): 8 | """ Returns the ricker wavelet, or mexican hat wavelet, like this: 9 | https://en.wikipedia.org/wiki/Mexican_hat_wavelet 10 | """ 11 | 12 | sigma_cast = tf.cast(sigma, tf.float32) 13 | sigmasquared = tf.pow(sigma_cast, 2.0) 14 | f = np.sqrt(3.0) * np.power(np.pi, 0.25) 15 | p1 = 2.0 / (tf.sqrt(sigma_cast) * f) 16 | p2 = 1.0 - tf.pow((v / sigma_cast), 2.0) 17 | p3 = tf.exp(-1.0 * v*v / (2.0 * sigmasquared)) 18 | return p1 * p2 * p3 19 | -------------------------------------------------------------------------------- /core/learning/learning_rule.py: -------------------------------------------------------------------------------- 1 | import tensorflow as tf 2 | 3 | class LearningRule: 4 | 5 | """ The base class for all online-style learning rules, including STDP. 6 | 7 | Premature abstraction, you say? Hmph. Right now there's just a single child: 8 | STDPLearningRule. 9 | 10 | Learning rules are applied to a connection layer, and should accumulate some 11 | value over time, given the connection layer's input and output, in order to 12 | apply this value to adjusting something (like weights) after a batch of input. 13 | 14 | Learning rules need to compile two computation sub-graphs: the 'accumulator' 15 | sub-graph, and the 'learning' sub-graph.
The 'accumulator' sub-graph will be 16 | executed within the full time-stepping graph computation; the method 'accumulation_ops' 17 | needs to return the accumulator graph nodes. 18 | 19 | Any time during the main time step callback (at the end of a batch?), clients 20 | may call 'model.learn(session)', which will execute the learning sub-graph of 21 | all LearningRules, whose nodes must be returned by 'learning_ops'. 22 | 23 | Can be set to use a 'teaching signal': in this case, the 'output' fed to the 24 | learning rule will come from the teaching signal, rather than the output of 25 | the output neurons of the connection layer. This feature can be toggled. 26 | """ 27 | 28 | def __init__(self, name, connection_layer, uses_teaching_signal=False): 29 | """ Constructs a LearningRule 30 | 31 | Args: 32 | name: String, name of the learning rule. Must be globally unique. 33 | connection_layer: the ConnectionLayer to apply to 34 | uses_teaching_signal: whether to use the teaching signal. MUST be 35 | set to True here if you ever want to use this feature. Can be 36 | subsequently toggled off and on. 37 | """ 38 | self.name = name 39 | self.connection_layer = connection_layer 40 | self._uses_teaching_signal = uses_teaching_signal 41 | 42 | @property 43 | def teaching_signal_key(self): 44 | """ Returns the tensorflow node name of the teaching signal """ 45 | return self.name + '_teaching_signal' 46 | 47 | @property 48 | def uses_teaching_signal(self): 49 | """ Returns whether the teaching signal is used or not. """ 50 | return self._uses_teaching_signal 51 | 52 | def set_uses_teaching_signal(self, uses, session): 53 | """ Sets whether to use the teaching signal. Only works if 54 | _uses_teaching_signal was set to True during compilation! 
55 | 56 | Args: 57 | uses: boolean, whether to use the teaching signal 58 | session: session to execute in 59 | """ 60 | self._uses_teaching_signal = uses 61 | if getattr(self, 'uses_teaching_signal_var', None) is not None: 62 | session.run(self.uses_teaching_signal_var.assign(uses)) 63 | 64 | def compile(self, W, input, output): 65 | """ Compiles all computation graph nodes. Creates a teaching-signal bypass 66 | if necessary: output can be set to the teaching signal variable. 67 | 68 | Then calls _compile: child classes must implement to actually compile! 69 | 70 | Args: 71 | W: weight tensor variable to apply learning to 72 | input: input tensor (input Neurons output) 73 | output: output tensor (output Neurons output) 74 | """ 75 | if self._uses_teaching_signal: 76 | self.uses_teaching_signal_var = tf.Variable(self._uses_teaching_signal, dtype=tf.bool) 77 | self.teaching_signal = tf.placeholder(tf.float32, shape=(W.shape[1],), name=self.teaching_signal_key) 78 | final_output = tf.cond(self.uses_teaching_signal_var, lambda: self.teaching_signal, lambda: output) 79 | else: 80 | final_output = output 81 | 82 | self._compile(W, input, final_output) 83 | 84 | 85 | # called by the model: 86 | 87 | def _compile_into_model(self): 88 | """ Called by the model; compiles this learning rule fully. """ 89 | self.compile(self.connection_layer.weights, self.connection_layer.input, self.connection_layer.output) 90 | 91 | def _ops(self): 92 | """ Convenience function for the model; the _ops to run with every timestep 93 | are the 'accumulation' nodes. """ 94 | return self.accumulation_ops() 95 | 96 | def ops_format(self): 97 | """ Get the names of the accumulation ops """ 98 | return self.accumulation_ops_format() 99 | 100 | 101 | # subclasses must override: 102 | 103 | def _compile(self, W, input, output): 104 | """ Subclasses must implement to compile all accumulation step tensorflow 105 | nodes, as well as all learning step tensorflow nodes.
""" 106 | raise NotImplementedError 107 | 108 | def accumulation_ops(self): 109 | """ Gets a list of tensorflow nodes for the accumulation step. """ 110 | raise NotImplementedError 111 | 112 | def accumulation_ops_format(self): 113 | """ Gets a list of tensorflow node names for the accumulation step. """ 114 | raise NotImplementedError 115 | 116 | def learning_ops(self): 117 | """ Gets a list of tensorflow nodes for the learning step. """ 118 | raise NotImplementedError 119 | 120 | def learning_ops_format(self): 121 | """ Gets a list of tensorflow node names for the learning step. """ 122 | raise NotImplementedError 123 | -------------------------------------------------------------------------------- /core/learning/stdp.py: -------------------------------------------------------------------------------- 1 | import tensorflow as tf 2 | import numpy as np 3 | from tensorflow.python.ops import control_flow_ops 4 | from spikeflow.core.learning.learning_rule import LearningRule 5 | from spikeflow.core.learning.weight_bounds import WeightBounds, WeightBounds_Enforcer 6 | from spikeflow.core.spike_process import * 7 | from spikeflow.core.graph_functions import * 8 | 9 | 10 | class STDPParams: 11 | 12 | """ Parameters governing stdp, as described here: 13 | http://www.scholarpedia.org/article/Spike-timing_dependent_plasticity 14 | 15 | Can perform 'associative' or 'symmetric' STDP as well: if AMinus is None and 16 | Sigma is a number: uses mexican-hat wavelet to compute STDP. 
17 | """ 18 | 19 | def __init__(self, APlus=1.0, TauPlus=10.0, AMinus=1.0, TauMinus=10.0, all_to_all=True, Sigma=1.0): 20 | self.APlus = APlus 21 | self.TauPlus = TauPlus 22 | self.AMinus = AMinus 23 | self.TauMinus = TauMinus 24 | self.all_to_all = all_to_all 25 | self.Sigma = Sigma 26 | 27 | @property 28 | def is_associative(self): 29 | return self.AMinus is None 30 | 31 | def __str__(self): 32 | a_minus_string = '' if self.is_associative else ' A-:{0:1.2f}'.format(self.AMinus,) 33 | sigma_string = '' if not self.is_associative else ' σ:{0:1.2f}'.format(self.Sigma,) 34 | associative_string = ' Associative!' if self.is_associative else '' 35 | return 'STDPParams A+:{0:1.2f}{1} T+:{2:1.2f} T-:{3:1.2f}{4}{5}'.format(self.APlus, a_minus_string, self.TauPlus, self.TauMinus, sigma_string, associative_string) 36 | 37 | 38 | class STDP_Tracer: 39 | 40 | """ Computes pre- and post-synaptic trace contributions, according to STDP. 41 | 42 | Should we inherit from ComputationLayer? This is basically a computation kernel. 43 | But who compiles it? Probably not the top model layer - instead, the LearningRule, 44 | or some higher layer. 
45 | 46 | Implements the online trace model of STDP described here: 47 | http://www.scholarpedia.org/article/Spike-timing_dependent_plasticity 48 | """ 49 | 50 | def __init__(self, stdp_params): 51 | self.stdp_params = stdp_params 52 | 53 | def _compile(self, W, input, output, pre_synaptic_traces, post_synaptic_traces): 54 | 55 | stdp_params = self.stdp_params 56 | 57 | # --- decay traces, raise by A+ or A- for pre- and post-synaptic spikes --- 58 | 59 | # decay all traces exponentially 60 | pre_synaptic_traces_decayed = pre_synaptic_traces / stdp_params.TauPlus 61 | post_synaptic_traces_decayed = post_synaptic_traces / stdp_params.TauMinus 62 | 63 | # for each pre- or post-synaptic spike, change the appropriate trace by a factor 64 | pre_synaptics_activated = input * stdp_params.APlus 65 | if self.stdp_params.is_associative: 66 | post_synaptics_activated = output * stdp_params.APlus 67 | else: 68 | post_synaptics_activated = output * stdp_params.AMinus * -1.0 69 | 70 | # update the traces 71 | new_pre_synaptic_traces = pre_synaptic_traces_decayed + pre_synaptics_activated 72 | new_post_synaptic_traces = post_synaptic_traces_decayed + post_synaptics_activated 73 | 74 | # if it's not 'all-to-all', then cap traces at the appropriate level 75 | if not stdp_params.all_to_all: 76 | new_pre_synaptic_traces = tf.minimum(new_pre_synaptic_traces, stdp_params.APlus) 77 | if self.stdp_params.is_associative: 78 | new_post_synaptic_traces = tf.minimum(new_post_synaptic_traces, stdp_params.APlus) 79 | else: 80 | new_post_synaptic_traces = tf.minimum(new_post_synaptic_traces, stdp_params.AMinus) 81 | 82 | # assign new traces 83 | pre_synaptic_traces = tf.assign(pre_synaptic_traces, new_pre_synaptic_traces) 84 | post_synaptic_traces = tf.assign(post_synaptic_traces, new_post_synaptic_traces) 85 | 86 | return pre_synaptic_traces, post_synaptic_traces 87 | 88 | 89 | def test(self, W, input_firings, output_firings): 90 | """ Compiles a computation graph, runs it through input and 
output firings 91 | (which should represent pre- and post-synaptic neuron firings), returns 92 | recordings of pre and post trace contributions. 93 | """ 94 | 95 | graph = tf.Graph() 96 | with graph.as_default(): 97 | 98 | pre_synaptic_traces = tf.Variable(np.zeros((W.shape[0],)), dtype=tf.float32) 99 | post_synaptic_traces = tf.Variable(np.zeros((W.shape[1],)), dtype=tf.float32) 100 | 101 | input = tf.placeholder(tf.float32, shape=(W.shape[0],), name='input') 102 | output = tf.placeholder(tf.float32, shape=(W.shape[1],), name='output') 103 | 104 | trace_contributions = self._compile(W, input, output, pre_synaptic_traces, post_synaptic_traces) 105 | pre_synaptic_trace_contributions, post_synaptic_trace_contributions = trace_contributions 106 | 107 | pre_trace_contributions = [] 108 | post_trace_contributions = [] 109 | 110 | with tf.Session(graph=graph) as sess: 111 | tf.global_variables_initializer().run() 112 | 113 | runnables = [pre_synaptic_trace_contributions, post_synaptic_trace_contributions] 114 | 115 | for input_v, output_v in zip(input_firings, output_firings): 116 | 117 | results = sess.run(runnables, feed_dict={input: input_v, output: output_v}) 118 | 119 | pre_trace_contributions.append(results[0]) 120 | post_trace_contributions.append(results[1]) 121 | 122 | return np.array(pre_trace_contributions), np.array(post_trace_contributions) 123 | 124 | 125 | class STDPLearningRule(LearningRule): 126 | 127 | """ The first (and, currently, only) concrete LearningRule. 128 | Implements online stdp, with optional weight-bounds, as described here: 129 | 130 | http://www.scholarpedia.org/article/Spike-timing_dependent_plasticity 131 | 132 | Like all LearningRules, compiles two computation sub-graphs; one for 'accumulation', 133 | which is executed by the model with every time-step, and one for 'learning', 134 | which is executed at any time by the model (in turn by the client).
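The reshape/transpose code in the compilation fragments below computes outer products. A NumPy sketch of the two per-timestep accumulation terms (illustrative names, not library API):

```python
import numpy as np

def stdp_accumulation_step(pre_trace, post_trace, pre_spikes, post_spikes):
    """dW contributions for one timestep, shape (n_in, n_out):
    a post spike reads the pre-synaptic trace (potentiation); a pre spike
    reads the (negative) post-synaptic trace (depression)."""
    dw_pre = np.outer(pre_trace, post_spikes)
    dw_post = np.outer(pre_spikes, post_trace)
    return dw_pre, dw_post
```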
135 | """ 136 | 137 | def __init__(self, name, stdp_params, weight_bounds=None, connection_layer=None, uses_teaching_signal=False): 138 | """ Creates an stdp learning rule applied to some connection layer, as described here: 139 | http://www.scholarpedia.org/article/Spike-timing_dependent_plasticity 140 | 141 | Args: 142 | name: String, name of the learning rule. Must be globally unique. 143 | stdp_params: STDPParams 144 | weight_bounds: WeightBounds, optional 145 | connection_layer: the ConnectionLayer to apply to, optional (maybe don't set for testing) 146 | uses_teaching_signal: whether to use the teaching signal. MUST be 147 | set to True here if you ever want to use this feature. Can be 148 | subsequently toggled off and on. 149 | """ 150 | super().__init__(name, connection_layer, uses_teaching_signal) 151 | 152 | self.stdp_params = stdp_params 153 | self.stdp_tracer = STDP_Tracer(stdp_params) 154 | self.weight_bounds_enforcer = WeightBounds_Enforcer(weight_bounds) if weight_bounds else None 155 | 156 | # Tensorflow graph nodes we'll need to store 157 | 158 | # variables 159 | self.pre_dw = None 160 | self.post_dw = None 161 | 162 | # trace contributions 163 | self.accumulate_pre_dw = None 164 | self.accumulate_post_dw = None 165 | self.assignments = None 166 | 167 | # modify W according to weight bounds, and zero out 168 | self.assign_W = None 169 | self.zero_out = None 170 | 171 | # compilation fragments: 172 | 173 | def __compile_step_pre_dw(self, output, pre_synaptic_trace_conts): 174 | output_reshaped = tf.reshape(output, [output.shape[0], 1]) 175 | pre_reshaped = tf.reshape(pre_synaptic_trace_conts, [pre_synaptic_trace_conts.shape[0], 1]) 176 | return tf.transpose(output_reshaped * tf.transpose(pre_reshaped)) 177 | 178 | def __compile_step_post_dw(self, input, post_synaptic_trace_conts): 179 | input_reshaped = tf.reshape(input, [input.shape[0], 1]) 180 | post_reshaped = tf.reshape(post_synaptic_trace_conts, [post_synaptic_trace_conts.shape[0], 1]) 181 | 
return input_reshaped * tf.transpose(post_reshaped) 182 | 183 | def __compile_accumulate_associative(self, W, input, output): 184 | 185 | # -- define variables -- 186 | self.pre_dw = tf.Variable(np.zeros(W.shape), dtype=tf.float32) 187 | self.post_dw = tf.Variable(np.zeros(W.shape), dtype=tf.float32) 188 | 189 | pre_synaptic_trace_conts = tf.Variable(np.zeros((W.shape[0],)), dtype=tf.float32) 190 | post_synaptic_trace_conts = tf.Variable(np.zeros((W.shape[1],)), dtype=tf.float32) 191 | 192 | # instantiate partial graph for stdp traces 193 | pre_conts, post_conts = self.stdp_tracer._compile(W, input, output, pre_synaptic_trace_conts, post_synaptic_trace_conts) 194 | 195 | # assign new trace contributions 196 | a1 = pre_synaptic_trace_conts.assign(pre_conts) 197 | a2 = post_synaptic_trace_conts.assign(post_conts) 198 | self.assignments = tf.group(a1, a2) 199 | 200 | with tf.control_dependencies([self.assignments]): 201 | 202 | step_pre_dw = self.__compile_step_pre_dw(output, pre_synaptic_trace_conts) 203 | step_post_dw = self.__compile_step_post_dw(input, post_synaptic_trace_conts) 204 | 205 | step_pre_dw = tf.where(tf.math.logical_and(step_pre_dw > 0, step_post_dw > 0), step_pre_dw * 0.5, step_pre_dw) 206 | step_post_dw = tf.where(tf.math.logical_and(step_pre_dw > 0, step_post_dw > 0), step_post_dw * 0.5, step_post_dw) 207 | 208 | acc_pre_dw = self.pre_dw + step_pre_dw 209 | acc_post_dw = self.post_dw + step_post_dw 210 | 211 | self.accumulate_pre_dw = self.pre_dw.assign(acc_pre_dw) 212 | self.accumulate_post_dw = self.post_dw.assign(acc_post_dw) 213 | 214 | def __compile_accumulate_standard(self, W, input, output): 215 | 216 | # -- define variables -- 217 | self.pre_dw = tf.Variable(np.zeros(W.shape), dtype=tf.float32) 218 | self.post_dw = tf.Variable(np.zeros(W.shape), dtype=tf.float32) 219 | 220 | pre_synaptic_trace_conts = tf.Variable(np.zeros((W.shape[0],)), dtype=tf.float32) 221 | post_synaptic_trace_conts = tf.Variable(np.zeros((W.shape[1],)), 
dtype=tf.float32) 222 | 223 | # compute trace inputs 224 | step_pre_dw = self.__compile_step_pre_dw(output, pre_synaptic_trace_conts) 225 | step_post_dw = self.__compile_step_post_dw(input, post_synaptic_trace_conts) 226 | 227 | acc_pre_dw = self.pre_dw + step_pre_dw 228 | acc_post_dw = self.post_dw + step_post_dw 229 | 230 | self.accumulate_pre_dw = self.pre_dw.assign(acc_pre_dw) 231 | self.accumulate_post_dw = self.post_dw.assign(acc_post_dw) 232 | 233 | with tf.control_dependencies([self.accumulate_pre_dw, self.accumulate_post_dw]): 234 | 235 | # instantiate partial graph for stdp traces 236 | pre_conts, post_conts = self.stdp_tracer._compile(W, input, output, pre_synaptic_trace_conts, post_synaptic_trace_conts) 237 | 238 | # assign new trace contributions 239 | a1 = pre_synaptic_trace_conts.assign(pre_conts) 240 | a2 = post_synaptic_trace_conts.assign(post_conts) 241 | self.assignments = tf.group(a1, a2) 242 | 243 | 244 | def __compile_accumulate(self, W, input, output): 245 | input_cast = tf.cast(input, tf.float32) 246 | output_cast = tf.cast(output, tf.float32) 247 | if self.stdp_params.is_associative: 248 | self.__compile_accumulate_associative(W, input_cast, output_cast) 249 | else: 250 | self.__compile_accumulate_standard(W, input_cast, output_cast) 251 | 252 | def __compile_apply(self, W): 253 | 254 | # add the pre and post contributions 255 | new_W_step = tf.add(self.pre_dw, self.post_dw) 256 | 257 | # if associative ('symmetric') apply the ricker wavelet 258 | if self.stdp_params.is_associative: 259 | new_W_step = ricker_wavelet(logsafe(new_W_step), self.stdp_params.Sigma) 260 | 261 | if self.weight_bounds_enforcer is not None: 262 | new_W = self.weight_bounds_enforcer._compile(W, new_W_step) 263 | else: 264 | new_W = W + new_W_step 265 | 266 | # assign new weights 267 | self.assign_W = W.assign(new_W) 268 | 269 | # zero out the pre and post dw accumulations 270 | with tf.control_dependencies([self.assign_W]): 271 | zero_pre_dw = 
self.pre_dw.assign(np.zeros(W.shape)) 272 | zero_post_dw = self.post_dw.assign(np.zeros(W.shape)) 273 | self.zero_out = tf.group(zero_pre_dw, zero_post_dw) 274 | 275 | 276 | # --- overridden from LearningRule --- 277 | 278 | def _compile(self, W, input, output): 279 | self.__compile_accumulate(W, input, output) 280 | self.__compile_apply(W) 281 | 282 | def accumulation_ops(self): 283 | return [self.accumulate_pre_dw, self.accumulate_post_dw, self.assignments] 284 | 285 | def accumulation_ops_format(self): 286 | return ['accumulate_pre_dw', 'accumulate_post_dw', 'assignments'] 287 | 288 | def learning_ops(self): 289 | return [self.assign_W, self.zero_out] 290 | 291 | def learning_ops_format(self): 292 | return ['assign_W', 'zero_out'] 293 | 294 | 295 | 296 | # custom testing method! 297 | 298 | def apply_to_W(self, session): 299 | r = session.run(self.learning_ops()) 300 | return r[0] 301 | 302 | def test(self, Wi, input_firings, output_firings, reps=1): 303 | graph = tf.Graph() 304 | with graph.as_default(): 305 | W = tf.Variable(Wi, dtype=tf.float32) 306 | input = tf.placeholder(tf.float32, shape=(W.shape[0],), name='input') 307 | output = tf.placeholder(tf.float32, shape=(W.shape[1],), name='output') 308 | 309 | self.compile(W, input, output) 310 | 311 | with tf.Session(graph=graph) as sess: 312 | tf.global_variables_initializer().run() 313 | 314 | # show data to network, accumulate pre and post dw 315 | runnables1 = self.accumulation_ops() 316 | 317 | acc_pre, acc_post, last_w = (None, None, W.eval()) 318 | for i in range(reps): 319 | 320 | # run batch 321 | for input_v, output_v in zip(input_firings, output_firings): 322 | acc_pre, acc_post, assi = sess.run(runnables1, feed_dict={input: input_v, output: output_v}) 323 | 324 | # apply learning rule to weights after every batch 325 | last_w = self.apply_to_W(sess) 326 | 327 | # return the last accumulated pre, last accumulated post, and final W 328 | return acc_pre, acc_post, last_w 329 | 330 | 331 | 332 | # --- 
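Outside the graph, the accumulate-then-apply contract looks like this toy, pure-Python sketch (none of these names are library API):

```python
import numpy as np

class ToySTDPAccumulator:
    """Accumulates weight changes on every timestep; applies them to W and
    zeroes the accumulator when learn() is called, mirroring the split between
    accumulation_ops and learning_ops."""
    def __init__(self, n_in, n_out):
        self.W = np.zeros((n_in, n_out))
        self.dW = np.zeros((n_in, n_out))

    def accumulate(self, step_dw):
        self.dW += step_dw          # runs every timestep ('accumulation ops')

    def learn(self):
        self.W += self.dW           # runs on demand ('learning ops')
        self.dW[:] = 0.0
        return self.W
```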
333 | # Methods to calculate STDP 'offline': will be significantly slower than 334 | # the above, 'online' (or 'in-graph') version. Not yet benchmarked! 335 | 336 | 337 | def stdp_offline_dw(delta_times, stdp_params): 338 | """ Calculates the changes in weights due to deltas of firing times, 339 | according to the stdp learning rule: 340 | 341 | dW := 0 for dt == 0 342 | APlus * exp(-1.0 * delta_times / TauPlus) for dt > 0 343 | -AMinus * exp(1.0 * delta_times / TauMinus) for dt < 0 344 | 345 | Args: 346 | delta_times: np.array(int) 347 | stdp_params: STDPParams 348 | Returns: 349 | np.array(delta_times.shape, float32) 350 | """ 351 | return np.where(delta_times == 0, 352 | 0, 353 | np.where(delta_times > 0, 354 | stdp_params.APlus * np.exp(-1.0 * delta_times / stdp_params.TauPlus), 355 | -1.0 * stdp_params.AMinus * np.exp(delta_times / stdp_params.TauMinus))) 356 | 357 | 358 | def stdp_offline_dw_process(input_spike_process, output_spike_process, stdp_params): 359 | """ Calculates the total change in weight from input and output spike processes, subject to the stdp learning rule 'stdp_dw'. 360 | 361 | Args: 362 | input_spike_process: np.array(int) 363 | output_spike_process: np.array(int) 364 | stdp_params: STDPParams 365 | 366 | Returns: 367 | Change in weight (float), subject to the stdp learning rule, 'stdp_dw'. 368 | """ 369 | delta_times = spike_process_delta_times(input_spike_process, output_spike_process) 370 | all_dws = stdp_offline_dw(delta_times, stdp_params) 371 | return np.sum(all_dws) 372 | 373 | 374 | def stdp_offline_dw_processes(w, input_spike_processes, output_spike_processes, stdp_params): 375 | """ Calculates change in weights from spike processes, subject to the stdp learning rule, 'stdp_dw'. Returns a dw matrix that corresponds to the weight matrix w - where there is a non-zero weight, there will be a dw. 376 | 377 | NOTE: This might be slow!
Does not use a tensorflow computation graph (yet). 378 | 379 | Args: 380 | w: weight matrix (np.array(input_n, output_n) of floats32) 381 | input_spike_processes: [ np.array(int) ] 382 | output_spike_processes: [ np.array(int) ] 383 | stdp_params: STDPParams 384 | Returns: 385 | Change in weights (np.array(w.shape), float32), subject to the stdp learning rule, 'stdp_dw'. 386 | """ 387 | dW = np.zeros(w.shape, dtype=np.float32) 388 | 389 | w_indices_i, w_indices_o = np.nonzero(w) 390 | 391 | for i, o in zip(w_indices_i, w_indices_o): 392 | dW[i, o] = stdp_offline_dw_process(input_spike_processes[i], 393 | output_spike_processes[o], 394 | stdp_params) 395 | 396 | return dW 397 | 398 | 399 | def stdp_offline_dw_firings(w, input_firings, output_firings, stdp_params): 400 | """ Calculates change in weights from firing matrices as subject to the stdp learning rule, 'stdp_dw'. 401 | 402 | Args: 403 | w: weight matrix (np.array(input_n, output_n) of floats32) 404 | input_firings: firings 405 | output_firings: firings 406 | stdp_params: STDPParams 407 | Returns: 408 | Change in weights, subject to the stdp learning rule, 'stdp_dw'. 
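Assuming `spike_process_delta_times` yields all pairwise post-minus-pre spike-time differences, the per-synapse offline computation reduces to this NumPy sketch (illustrative names):

```python
import numpy as np

def offline_dw_for_pair(pre_times, post_times, a_plus=1.0, tau_plus=10.0,
                        a_minus=1.0, tau_minus=10.0):
    """Sum the STDP window over all pairwise post-minus-pre time differences
    for one synapse: positive dt potentiates, negative dt depresses."""
    dts = np.subtract.outer(post_times, pre_times).ravel()
    dws = np.where(dts == 0, 0.0,
                   np.where(dts > 0,
                            a_plus * np.exp(-dts / tau_plus),
                            -a_minus * np.exp(dts / tau_minus)))
    return dws.sum()
```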
409 | """ 410 | return stdp_offline_dw_processes(w, 411 | firings_to_spike_processes(input_firings), 412 | firings_to_spike_processes(output_firings), 413 | stdp_params) 414 | -------------------------------------------------------------------------------- /core/learning/weight_bounds.py: -------------------------------------------------------------------------------- 1 | import tensorflow as tf 2 | 3 | class WeightBounds: 4 | 5 | """ Contains weight-bounding parameters, as described here: 6 | http://www.scholarpedia.org/article/Spike-timing_dependent_plasticity 7 | """ 8 | 9 | def __init__(self, WMax, EtaPlus, EtaMinus, Soft=True): 10 | self.WMax = WMax 11 | self.EtaPlus = EtaPlus 12 | self.EtaMinus = EtaMinus 13 | self.Soft = Soft 14 | self.Hard = not self.Soft 15 | 16 | def __str__(self): 17 | return 'WeightBounds Wmax:{0:1.2f} η+:{1:1.2f} η-:{2:1.2f} Soft:{3}'.format(self.WMax, self.EtaPlus, self.EtaMinus, self.Soft) 18 | 19 | 20 | def _Heaviside(v): 21 | fv = tf.cast(v, tf.float32) 22 | return tf.sign(fv) * 0.5 + 0.5 23 | 24 | def _Constrain(v, min, max): 25 | return tf.minimum(tf.maximum(v, min), max) 26 | 27 | class WeightBounds_Enforcer: 28 | 29 | """ Compiles weight-dependent weight-bounding, as described here: 30 | http://www.scholarpedia.org/article/Spike-timing_dependent_plasticity 31 | """ 32 | 33 | def __init__(self, weight_bounds): 34 | """ Constructs a WeightBounds_Enforcer. 
35 | 36 | Args: 37 | weight_bounds: WeightBounds 38 | """ 39 | self.weight_bounds = weight_bounds 40 | 41 | def _APlus(self, W): 42 | wd = self.weight_bounds.WMax - W 43 | bwd = wd if self.weight_bounds.Soft else _Heaviside(wd) 44 | return bwd * self.weight_bounds.EtaPlus 45 | 46 | def _AMinus(self, W): 47 | # seems to be an error in the scholarpedia article: NOT '-W'; use positive instead 48 | bw = W if self.weight_bounds.Soft else _Heaviside(W) 49 | return bw * self.weight_bounds.EtaMinus 50 | 51 | 52 | def _compile(self, W, trace_activation_totals): 53 | 54 | # OK: technically we need to do this: 55 | # a_plus_pre = self._APlus(W) * pre_synaptic_trace_activations 56 | # a_minus_post = self._AMinus(W) * post_synaptic_trace_activations # posts are already negative! 57 | # new_W = W + a_plus_pre + a_minus_post 58 | # 59 | # BUT: we use a trick. All the negative values in the sum will be treated as 60 | # depression, and all positive as potentiation. This is basically the same 61 | # as above. 62 | 63 | step_W = tf.where( 64 | trace_activation_totals > 0, 65 | self._APlus(W) * trace_activation_totals, 66 | self._AMinus(W) * trace_activation_totals) 67 | 68 | new_W = W + step_W 69 | 70 | # a minor nit: 71 | # the scholarpedia article implies that this step (the 'hard' limiting 72 | # of the resultant weight to [0,WMax]) should only be done if 'hard bounds' 73 | # is used. But it should ALWAYS be done: with soft bounds, the a_plus_pre 74 | # and a_minus_post terms can grow arbitrarily large (or small), and the 75 | # dW algorithm can make a single jump even with soft bounds that will 76 | # surpass [0,WMax]. So: always perform hard bounding. 77 | # Is this because this is not a continuous system, but uses discrete 78 | # timesteps?
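A NumPy sketch of the bounding scheme (illustrative names; the hard branch here uses a strict gate, whereas `_Heaviside` above returns 0.5 at exactly zero):

```python
import numpy as np

def bounded_update(W, dw, w_max=1.0, eta_plus=0.1, eta_minus=0.1, soft=True):
    """Weight-dependent bounding: soft bounds scale the update by the distance
    to the bound; hard bounds only gate it. Always hard-clip at the end, since
    a discrete-time step can overshoot [0, w_max] even with soft bounds."""
    if soft:
        step = np.where(dw > 0, (w_max - W) * eta_plus * dw, W * eta_minus * dw)
    else:
        step = np.where(dw > 0, (W < w_max) * eta_plus * dw, (W > 0) * eta_minus * dw)
    return np.clip(W + step, 0.0, w_max)
```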
79 | return _Constrain(new_W, 0.0, self.weight_bounds.WMax) 80 | -------------------------------------------------------------------------------- /core/model.py: -------------------------------------------------------------------------------- 1 | import tensorflow as tf 2 | import collections 3 | from spikeflow.core.neuron_layer import NeuronLayer 4 | 5 | 6 | class CompositeLayer(NeuronLayer): 7 | 8 | """ A layer that looks like a neuron layer, but that can have any arbitrary 9 | internal structure, containing layers and connections. There is a single 10 | input, which is the input to the first neuron layer, and a single output, 11 | which is the output of the last neuron layer. In between: anything can happen. 12 | """ 13 | 14 | def __init__(self, name, neuron_layers, connections, learning_rules): 15 | """ Creates a composite layer. 16 | Args: 17 | neuron_layers: list of NeuronLayer; connections: list of ConnectionLayer 18 | learning_rules: list of LearningRule 19 | """ 20 | super().__init__(name) 21 | self.neuron_layers = neuron_layers 22 | self.connections = connections 23 | self.learning_rules = learning_rules 24 | 25 | @property 26 | def input_n(self): 27 | """ The number of inputs is the number of inputs in the first neuron layer. """ 28 | return self.neuron_layers[0].input_n 29 | 30 | @property 31 | def output_n(self): 32 | """ The number of outputs is the number of outputs in the last neuron layer.
""" 33 | return self.neuron_layers[-1].output_n 34 | 35 | def _ops(self): 36 | computation_layers = self.neuron_layers + self.connections + self.learning_rules 37 | return { clayer.name: clayer._ops() for clayer in computation_layers } 38 | 39 | def ops_format(self): 40 | computation_layers = self.neuron_layers + self.connections + self.learning_rules 41 | return { clayer.name: clayer.ops_format() for clayer in computation_layers } 42 | 43 | def _compile(self): 44 | 45 | if len(self.neuron_layers) == 0: 46 | raise ValueError("Composites and models must contain at least one neuron layer.") 47 | 48 | # compile and get connection outputs 49 | connection_tos = {} 50 | for connection in self.connections: 51 | connection._compile_output_node() 52 | to_i = self.neuron_layers.index(connection.to_layer) 53 | connection_tos.setdefault(to_i, []).append(connection.output) 54 | 55 | # first neuron layer gets model inputs 56 | self.neuron_layers[0].add_input(self.input) 57 | 58 | # all neuron layers can get synaptic inputs 59 | for i, neuron_layer in enumerate(self.neuron_layers): 60 | for connection_input in connection_tos.get(i, []): 61 | neuron_layer.add_input(connection_input) 62 | 63 | # compile neuron layers 64 | for neuron_layer in self.neuron_layers: 65 | neuron_layer._compile() 66 | 67 | # hook up synapse layer inputs and compile connections 68 | all_neuron_inputs = [layer.input for layer in self.neuron_layers] 69 | with tf.control_dependencies(all_neuron_inputs): 70 | for connection in self.connections: 71 | connection.input = connection.from_layer.output 72 | connection._compile() 73 | 74 | # compile all learning rules 75 | for learning_rule in self.learning_rules: 76 | learning_rule._compile_into_model() 77 | 78 | # finally, my output is the last neuron layer output 79 | self.output = self.neuron_layers[-1].output 80 | 81 | 82 | class BPNNModel(CompositeLayer): 83 | """ Top-level biologically plausible neural network model runner. 
84 | Contains neuron and connection layers. Can compile to tensorflow graph, and 85 | then run through time, feeding input. 86 | """ 87 | 88 | def __init__(self, input_shape): 89 | super().__init__('top', [], [], []) 90 | self.input_shape = input_shape 91 | self.input = None 92 | self.graph = None 93 | 94 | @classmethod 95 | def compiled_model(cls, input_shape, neuron_layers = [], connections = [], learning_rules = []): 96 | """ Convenience creation method. Creates and returns a compiled model. 97 | Args: 98 | input_shape: tuple of int; shape of input 99 | neuron_layers: [ neuron_layer.NeuronLayer ] 100 | connections: [ connection_layer.ConnectionLayer ] 101 | learning_rules: [ learning_rule.LearningRule ] 102 | """ 103 | model = cls(input_shape) 104 | for nl in neuron_layers: 105 | model.add_neuron_layer(nl) 106 | for conn in connections: 107 | model.add_connection_layer(conn) 108 | for learning_rule in learning_rules: 109 | model.add_learning_rule(learning_rule) 110 | model.compile() 111 | return model 112 | 113 | def add_neuron_layer(self, neuron_layer): 114 | self.neuron_layers.append(neuron_layer) 115 | 116 | def add_connection_layer(self, connection_layer): 117 | self.connections.append(connection_layer) 118 | 119 | def add_learning_rule(self, learning_rule): 120 | self.learning_rules.append(learning_rule) 121 | 122 | def compile(self): 123 | """ Creates and connects the tensorflow graph and graph node operations, 124 | including neuron and connection computation layers. 125 | """ 126 | 127 | self.graph = tf.Graph() 128 | with self.graph.as_default(): 129 | 130 | # create input tensor 131 | self.input = tf.placeholder(tf.float32, shape=self.input_shape, name='I') 132 | 133 | # compile neuron layers and connections in current graph 134 | self._compile() 135 | 136 | 137 | def run_time(self, data_generator, post_timestep_callback): 138 | """ Runs the model through time as long as the data_generator produces data.
139 | 140 | After each time step, calls post_timestep_callback, allowing data collection 141 | and graph modification (to an extent). 142 | 143 | NOTE: The callback method will necessarily drop back into python after each 144 | timestep. This will obviously slow it down. Solutions being considered. 145 | NOTE: It's called a post_timestep_callback, but for now, batch size must be 1. 146 | This will be optimized in the future. Sort of the point of this library. 147 | 148 | Args: 149 | data_generator: a generator that must produce data with shape self.input_shape 150 | post_timestep_callback: function (i: incrementing integer index 151 | graph: the tensorflow graph 152 | sess: the tensorflow session 153 | results: { layer_name: layer _ops output}) 154 | 155 | results: Dictionary of keys to results, where keys are the names of 156 | computation layers (in order neuron layers, then connection layers, 157 | by addition), and results are outputs of the layer _ops 158 | method calls, which were just run in session.run. 159 | Called after every step of data generation. 160 | """ 161 | 162 | runnables = self._ops() 163 | 164 | with tf.Session(graph=self.graph) as sess: 165 | tf.global_variables_initializer().run() 166 | for i, data in enumerate(data_generator): 167 | 168 | # run one timestep 169 | if isinstance(data, collections.Mapping): 170 | # it's already a dictionary, so feed it directly 171 | results = sess.run(runnables, feed_dict=data) 172 | else: 173 | results = sess.run(runnables, feed_dict={self.input: data}) 174 | 175 | # call the callback 176 | post_timestep_callback(i, self.graph, sess, results) 177 | 178 | 179 | def learn(self, session): 180 | """ Applies all learning rules to weights they operate on. 181 | 182 | Can/should be called from within the post_timestep_callback passed to run_time. 183 | I suggest calling this after every batch, but do as you wish. 184 | 185 | Args: 186 | session: The session to execute in.
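Stripped of tensorflow, the `run_time` loop is just a generator loop with a callback hook. A toy sketch, where `step_fn` stands in for `session.run`:

```python
def run_time_sketch(data_generator, step_fn, post_timestep_callback):
    """Step once per generated datum, then hand the results to the callback,
    which may decide to trigger learning (e.g. at batch boundaries)."""
    for i, data in enumerate(data_generator):
        results = step_fn(data)
        post_timestep_callback(i, results)
```

A real callback would receive the graph and session as well, and might invoke `model.learn(session)` every N steps to apply accumulated weight changes.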
187 | """ 188 | runnables = { learning_rule.name: learning_rule.learning_ops() 189 | for learning_rule in self.learning_rules } 190 | return session.run(runnables) 191 | 192 | # deprecate!? 193 | def continue_run_time(self, data_generator, post_timestep_callback): 194 | runnables = self._ops() 195 | with tf.Session(graph=self.graph) as sess: 196 | tf.global_variables_initializer().run() 197 | for i, data in enumerate(data_generator): 198 | results = sess.run(runnables, feed_dict={self.input: data}) 199 | post_timestep_callback(i, self.graph, sess, results) 200 | -------------------------------------------------------------------------------- /core/neuron_layer.py: -------------------------------------------------------------------------------- 1 | import tensorflow as tf 2 | import numpy as np 3 | import pandas as pd 4 | from collections import namedtuple 5 | from spikeflow.core.computation_layer import ComputationLayer 6 | from spikeflow.core.utils import * 7 | 8 | 9 | class NeuronLayer(ComputationLayer): 10 | 11 | """ A computation layer that contains neurons. 12 | Really, just adds a single operation: add_input, 13 | and defines number of inputs and outputs. 14 | """ 15 | 16 | def __init__(self, name): 17 | super().__init__(name) 18 | 19 | @property 20 | def input_n(self): 21 | """ Subclasses must implement; returns number of inputs 22 | """ 23 | raise NotImplementedError 24 | 25 | @property 26 | def output_n(self): 27 | """ Subclasses must implement; returns number of outputs. 28 | For normal neuron layers, should be equal to input_n. Composite 29 | layers might be different. 30 | """ 31 | raise NotImplementedError 32 | 33 | def add_input(self, new_input): 34 | """ Adds the new_input to the summation operation tree. 
35 | Args: 36 | new_input: tensor with same shape as self.input 37 | """ 38 | self.input = new_input if self.input is None else tf.add(self.input, new_input) 39 | 40 | 41 | class IdentityNeuronLayer(NeuronLayer): 42 | """ Implements a layer of pass-through neurons. Output = Input """ 43 | 44 | def __init__(self, name, n, bound=None): 45 | super().__init__(name) 46 | self.n = n 47 | self.bound = bound 48 | 49 | if n <= 0: 50 | raise ValueError('IdentityNeuronLayer must have at least one neuron.') 51 | 52 | @property 53 | def input_n(self): 54 | return self.n 55 | 56 | @property 57 | def output_n(self): 58 | return self.n 59 | 60 | def _ops(self): 61 | return [self.input, self.assign_op] 62 | 63 | def ops_format(self): 64 | return ['input', 'output'] 65 | 66 | def _compile(self): 67 | self.output = tf.Variable(np.zeros(self.input.shape, dtype=np.float32), dtype=np.float32, name='output') 68 | if self.bound is None: 69 | self.assign_op = self.output.assign(self.input) 70 | else: 71 | self.assign_op = self.output.assign(tf.minimum(self.input, np.ones(self.input.shape, dtype=np.float32)*self.bound)) 72 | 73 | 74 | class PoissonNeuronLayer(NeuronLayer): 75 | 76 | """ Implements a layer of Poisson-sampling neurons. Output shape = input shape. 77 | 78 | Given inputs as probabilities BETWEEN 0 AND 1, output is 1 if a uniformly 79 | drawn random number is < input. 80 | 81 | Useful for doing things like converting images to poisson spike trains. 
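The sampling rule can be sketched offline in NumPy (a sketch; `poisson_spikes` is not part of the library):

```python
import numpy as np

def poisson_spikes(rates, n_steps, seed=0):
    """Bernoulli sampling per timestep: emit a spike where a uniform draw
    falls below the input treated as a probability in [0, 1]."""
    rng = np.random.default_rng(seed)
    rates = np.clip(np.asarray(rates, dtype=float), 0.0, 1.0)
    return (rng.uniform(size=(n_steps,) + rates.shape) < rates).astype(np.float32)
```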
82 | """ 83 | 84 | def __init__(self, name, n): 85 | super().__init__(name) 86 | self.n = n 87 | if n <= 0: 88 | raise ValueError('PoissonNeuronLayer must have at least one neuron.') 89 | 90 | @property 91 | def input_n(self): 92 | return self.n 93 | 94 | @property 95 | def output_n(self): 96 | return self.n 97 | 98 | def _ops(self): 99 | return [self.input, self.assign_op] 100 | 101 | def ops_format(self): 102 | return ['input', 'output'] 103 | 104 | def _compile(self): 105 | zeros = np.zeros(self.input.shape, dtype=np.float32) 106 | ones = np.ones(self.input.shape, dtype=np.float32) 107 | 108 | self.output = tf.Variable(zeros, dtype=np.float32, name='output') 109 | 110 | samples = tf.random.uniform(self.input.shape, 0.0, 1.0, np.float32) 111 | sample_firings = tf.where(samples < self.input, ones, zeros) 112 | 113 | self.assign_op = self.output.assign(sample_firings) 114 | 115 | 116 | 117 | class LIFNeuronLayer(NeuronLayer): 118 | 119 | """ Implements a layer of Leaky Integrate-and-fire Neurons. 120 | Contains multiple convenience creation functions. 121 | 122 | Implements this timestep function: 123 | v(t+1) = v(t) + (input * resistance - v(t)) / tau * dt 124 | 125 | Or in more traditional form: 126 | tau * dV/dt = IR - V 127 | 128 | (subject to threshholding and refractory period) 129 | 130 | Configuration values can be different for each neuron: 131 | resistance: np.array(floats) 132 | tau: time constant, np.array(floats) 133 | treshhold: threshhold, np.array(floats) 134 | n_refrac: refactory periods in number of timesteps, np.array(ints) 135 | """ 136 | 137 | C = namedtuple('LIFNeuronConfig', ['resistance', 'tau', 'threshhold', 'n_refrac']) 138 | 139 | def __init__(self, name, neuron_configuration, dt=0.5): 140 | """ LIFNeuronLayer constructor 141 | Args: 142 | neuron_configuration: 2d numpy array; columns are 143 | resistance, tau, threshhold, n_refrac 144 | dt: single timestep dt value. 
145 | """ 146 | super().__init__(name) 147 | 148 | if len(neuron_configuration.shape) != 2: 149 | raise ValueError('LIFNeuronLayer must be initialized with configuration with 2 dimensions; n rows by 4 columns') 150 | if neuron_configuration.shape[0] <= 0: 151 | raise ValueError('LIFNeuronLayer must have at least one neuron.') 152 | if neuron_configuration.shape[1] != 4: 153 | raise ValueError('LIFNeuronLayer configuration must have 4 columns; resistance, tau, threshhold, n_refrac.') 154 | 155 | self.n = neuron_configuration.shape[0] 156 | self.resistance = neuron_configuration[:,0] 157 | self.tau = neuron_configuration[:,1] 158 | self.threshhold = neuron_configuration[:,2] 159 | self.n_refrac = neuron_configuration[:,3].astype(np.int32) 160 | self.dt = np.float32(dt) 161 | 162 | @property 163 | def input_n(self): 164 | return self.n 165 | 166 | @property 167 | def output_n(self): 168 | return self.n 169 | 170 | @classmethod 171 | def layer_from_tuples(cls, name, neuron_configuration_tuples, dt=0.5): 172 | """ Creates a layer from the given configuration tuples. 173 | Args: 174 | neuron_configuration_tuples: [LIFNeuronLayer.C] 175 | dt: state update timestep, float 176 | """ 177 | nct = neuron_configuration_tuples 178 | res = np.array([nn.resistance for nn in nct], dtype=np.float32) 179 | tau = np.array([nn.tau for nn in nct], dtype=np.float32) 180 | thresh = np.array([nn.threshhold for nn in nct], dtype=np.float32) 181 | refrac = np.array([nn.n_refrac for nn in nct], dtype=np.float32) 182 | configuration = np.array([res, tau, thresh, refrac]).T 183 | return cls(name, configuration, dt) 184 | 185 | @classmethod 186 | def layer_with_n_identical_neurons(cls, name, n, resistance, tau, threshhold, n_refrac, dt=0.5): 187 | """ Creates a layer of n identical neurons. 
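The configuration this builds is just one parameter row tiled n times, e.g. (values illustrative):

```python
import numpy as np

# rows are (resistance, tau, threshhold, n_refrac), one per neuron
configuration = np.tile(np.array([1.0, 4.0, 3.0, 2.0], dtype=np.float32), (3, 1))
```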
188 | Args: 189 | resistance, tau, threshhold, n_refrac: floats 190 | dt: state update timestep, float 191 | """ 192 | configuration = np.tile(np.array([resistance, tau, threshhold, n_refrac], dtype=np.float32), (n,1)) 193 | return cls(name, configuration, dt) 194 | 195 | @classmethod 196 | def layer_with_n_distributions(cls, name, n, resistance_dist, tau_dist, threshhold_dist, n_refrac_dist, dt=0.5): 197 | """ Creates a layer of n neurons, whose configuration values are pulled from 198 | distribution generation functions. 199 | Args: 200 | *_dist: functions that generate n float values according to any distribution. 201 | dt: state update timestep, float 202 | """ 203 | configuration = np.array([resistance_dist(n), tau_dist(n), threshhold_dist(n), n_refrac_dist(n)]).T 204 | return cls(name, configuration, dt) 205 | 206 | def to_dataframe(self): 207 | """ Returns a pandas Dataframe containing the neuron configurations. 208 | """ 209 | return pd.DataFrame(np.array([self.resistance, self.tau, self.threshhold, self.n_refrac]).T, 210 | index = range(len(self.resistance)), 211 | columns = ['resistance', 'tau', 'threshhold', 'n_refrac']) 212 | 213 | 214 | def _ops(self): 215 | return [self.input, self.recovery_op, self.fired_op, self.v_op] 216 | 217 | def ops_format(self): 218 | return ['input', 'recovery', 'fired', 'v'] 219 | 220 | def _compile(self): 221 | nintzeros = np.zeros((self.n,), dtype=np.int32) 222 | nfloatzeros = np.zeros((self.n,), dtype=np.float32) 223 | 224 | # internal tensor variables for the neuron states 225 | v = tf.Variable(nfloatzeros, name='v') 226 | recovery_times = tf.Variable(nintzeros, dtype=tf.int32, name='recoverytimes') 227 | 228 | # define output variable 229 | self.output = tf.Variable(np.zeros(nfloatzeros.shape, dtype=bool), dtype=bool, name='fired') 230 | 231 | # Reset any neurons that spiked last timestep 232 | # NOTE: Does the WARNING: tf.Variable from the tensorflow Variable 233 | # documentation apply here? 
Is tf.where safe here, and below? 234 | v_1 = tf.where(self.output, nfloatzeros, v) 235 | 236 | # Reset recovery durations for neurons that spiked last timestep. 237 | recovery_decremented = tf.maximum(nintzeros, recovery_times - 1) 238 | recovery_new = tf.where(self.output, self.n_refrac, recovery_decremented) 239 | self.recovery_op = recovery_times.assign(recovery_new) 240 | 241 | # State update equations: update the 'voltage', but 242 | # only do it if the neuron is not in its recovery period. 243 | v_2 = v_1 + (self.input * self.resistance - v_1) / self.tau * self.dt 244 | recovering = tf.greater(self.recovery_op, nintzeros) 245 | v_3 = tf.where(recovering, nfloatzeros, v_2) 246 | 247 | # Compute spikes that fired, assign to output 248 | self.fired_op = self.output.assign(tf.greater_equal(v_3, self.threshhold)) 249 | 250 | # Update the state: where spiked, set to threshhold 251 | v_f = tf.where(self.fired_op, self.threshhold, v_3) 252 | self.v_op = v.assign(v_f) 253 | 254 | 255 | class IzhikevichNeuronLayer(NeuronLayer): 256 | 257 | """ Implements a layer of Izhikevich Neurons. 258 | Contains multiple convenience creation functions. 259 | """ 260 | 261 | C = namedtuple('IzhikevichNeuronConfig', ['a', 'b', 'c', 'd', 't', 'v0']) 262 | 263 | def __init__(self, name, neuron_configuration, dt=0.5): 264 | """Creates an IzhikevichNeuronLayer with the given neuron configurations. 265 | 266 | Args: 267 | neuron_configuration: np.array of size n by 6. Each row represents 268 | one neuron. Columns 0 through 5 represent values of a, b, c, d, t, and v0. 
269 | dt: state update timestep, float 270 | """ 271 | super().__init__(name) 272 | 273 | if len(neuron_configuration.shape) != 2: 274 | raise ValueError('IzhikevichNeuronLayer must be initialized with configuration with 2 dimensions; n rows by 6 columns') 275 | if neuron_configuration.shape[0] <= 0: 276 | raise ValueError('IzhikevichNeuronLayer must have at least one neuron.') 277 | if neuron_configuration.shape[1] != 6: 278 | raise ValueError('IzhikevichNeuronLayer configuration must have 6 columns; a, b, c, d, t, v0') 279 | 280 | self.n = neuron_configuration.shape[0] 281 | self.a = neuron_configuration[:,0] 282 | self.b = neuron_configuration[:,1] 283 | self.c = neuron_configuration[:,2] 284 | self.d = neuron_configuration[:,3] 285 | self.t = neuron_configuration[:,4] 286 | self.v0 = neuron_configuration[:,5] 287 | self.dt = np.float32(dt) 288 | 289 | self.v_op = None 290 | self.u_op = None 291 | self.fired_op = None 292 | 293 | @property 294 | def input_n(self): 295 | return self.n 296 | 297 | @property 298 | def output_n(self): 299 | return self.n 300 | 301 | @classmethod 302 | def layer_from_tuples(cls, name, neuron_configuration_tuples, dt=0.5): 303 | """ Creates a layer from the given configuration tuples.
304 | Args: 305 | neuron_configuration_tuples: [IzhikevichNeuronLayer.C] 306 | dt: state update timestep, float 307 | """ 308 | a = np.array([nn.a for nn in neuron_configuration_tuples], dtype=np.float32) 309 | b = np.array([nn.b for nn in neuron_configuration_tuples], dtype=np.float32) 310 | c = np.array([nn.c for nn in neuron_configuration_tuples], dtype=np.float32) 311 | d = np.array([nn.d for nn in neuron_configuration_tuples], dtype=np.float32) 312 | t = np.array([nn.t for nn in neuron_configuration_tuples], dtype=np.float32) 313 | v0 = np.array([nn.v0 for nn in neuron_configuration_tuples], dtype=np.float32) 314 | configuration = np.array([a, b, c, d, t, v0]).T 315 | return cls(name, configuration, dt) 316 | 317 | @classmethod 318 | def configuration_with_n_identical_neurons(cls, n, a, b, c, d, t, v0): 319 | """ Creates a configuration for the constructor for n identical Izhikevich neurons 320 | Args: 321 | a, b, c, d, t, v0: floats 322 | """ 323 | return np.tile(np.array([a, b, c, d, t, v0]), (n,1)) 324 | 325 | @classmethod 326 | def configuration_with_n_distributions(cls, n, a_dist, b_dist, c_dist, d_dist, t_dist, v0_dist): 327 | """ Creates a configuration for the constructor for n neurons, whose configuration 328 | values are pulled from distribution generation functions. 329 | Args: 330 | a_dist, b_dist, c_dist, d_dist, t_dist, v0_dist: functions that 331 | generate n float values according to any distribution.
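A sketch of the same construction with numpy-based distribution functions (any callables returning n floats work; these particular ranges are only illustrative):

```python
import numpy as np

n = 4
a_dist = lambda n: np.random.uniform(0.015, 0.025, n).astype(np.float32)
b_dist = lambda n: np.random.normal(0.2, 0.01, n).astype(np.float32)
c_dist = lambda n: np.full(n, -65.0, dtype=np.float32)
d_dist = lambda n: np.full(n, 8.0, dtype=np.float32)
t_dist = lambda n: np.full(n, 30.0, dtype=np.float32)
v0_dist = lambda n: np.full(n, -65.0, dtype=np.float32)

# same shape as the method body builds: one column per parameter, transposed
configuration = np.array([a_dist(n), b_dist(n), c_dist(n), d_dist(n), t_dist(n), v0_dist(n)]).T
```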
332 | """ 333 | return np.array([a_dist(n), b_dist(n), c_dist(n), d_dist(n), t_dist(n), v0_dist(n)]).T 334 | 335 | @classmethod 336 | def layer_with_configuration(cls, name, configuration, dt=0.5): 337 | """ Creates a layer from a configuration 338 | Args: 339 | configuration: numpy configuration (as returned from configuration_with_*) 340 | dt: state update timestep, float 341 | """ 342 | return cls(name, configuration, dt) 343 | 344 | @classmethod 345 | def layer_with_configurations(cls, name, configurations, dt=0.5): 346 | """ Creates a layer from a list of configurations 347 | Args: 348 | configurations: list of numpy configurations (concatenated along axis 0) 349 | dt: state update timestep, float 350 | """ 351 | return cls(name, np.concatenate(configurations), dt) 352 | 353 | @classmethod 354 | def layer_with_n_identical_neurons(cls, name, n, a, b, c, d, t, v0, dt=0.5): 355 | """ Creates a layer of n identical neurons. 356 | Args: 357 | a, b, c, d, t, v0: floats 358 | dt: state update timestep, float 359 | """ 360 | return cls(name, cls.configuration_with_n_identical_neurons(n, a, b, c, d, t, v0), dt) 361 | 362 | @classmethod 363 | def layer_with_n_distributions(cls, name, n, a_dist, b_dist, c_dist, d_dist, t_dist, v0_dist, dt=0.5): 364 | """ Creates a layer of n neurons, whose configuration values are pulled from 365 | distribution generation functions. 366 | Args: 367 | a_dist, b_dist, c_dist, d_dist, t_dist, v0_dist: functions that 368 | generate n float values according to any distribution. 369 | dt: state update timestep, float 370 | """ 371 | configuration = cls.configuration_with_n_distributions(n, a_dist, b_dist, c_dist, d_dist, t_dist, v0_dist) 372 | return cls(name, configuration, dt) 373 | 374 | def to_dataframe(self): 375 | """ Returns a pandas Dataframe containing the neuron configurations.
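For reference, the state update that `_compile` below turns into tensorflow ops can be sketched in plain numpy for a single timestep (`izhikevich_step` is an illustrative name; the parameter values are Izhikevich's standard regular-spiking constants):

```python
import numpy as np

def izhikevich_step(v, u, a, b, i_in, dt=0.5):
    # v' = 0.04*v^2 + 5*v + 140 - u + I ;  u' = a*(b*v - u)
    v_next = v + (0.04 * v * v + 5.0 * v + 140.0 - u + i_in) * dt
    u_next = u + a * (b * v - u) * dt
    return v_next, u_next

v, u = izhikevich_step(v=-65.0, u=0.0, a=0.02, b=0.2, i_in=10.0)  # -> (-68.0, -0.13)
```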
376 | """ 377 | return pd.DataFrame(np.array([self.a, self.b, self.c, self.d, self.t, self.v0]).T, 378 | index = range(len(self.a)), 379 | columns = ['a', 'b', 'c', 'd', 't', 'v0']) 380 | 381 | 382 | def _ops(self): 383 | return [self.input, self.v_op, self.u_op, self.fired_op] 384 | 385 | def ops_format(self): 386 | return ['input', 'v', 'u', 'fired'] 387 | 388 | def _compile(self): 389 | 390 | n = self.n 391 | a = self.a 392 | b = self.b 393 | c = self.c 394 | d = self.d 395 | t = self.t 396 | 397 | # Create tensor variables for the neuron states 398 | v = tf.Variable(np.ones((n,), dtype=np.float32) * c, name='v') 399 | u = tf.Variable(np.zeros((n,), dtype=np.float32), name='u') 400 | self.output = tf.Variable(np.zeros(v.shape, dtype=bool), dtype=bool, name='fired') 401 | 402 | # Reset any neurons that spiked last timestep 403 | # NOTE: Does the WARNING: tf.Variable from the tensorflow Variable 404 | # documentation apply here? Is tf.where safe here, and below? 405 | v_1 = tf.where(self.output, c, v) 406 | u_1 = tf.where(self.output, tf.add(u, d), u) 407 | 408 | # State update equations 409 | v_2 = v_1 + (0.04 * v_1 * v_1 + 5.0 * v_1 + 140.0 - u_1 + self.input) * self.dt 410 | u_f = u_1 + a * (b * v_1 - u_1) * self.dt 411 | 412 | # Spikes: 413 | # Limit anything above threshold to threshold value 414 | # We are saving which fired to use again in the next iteration 415 | threshhold = tf.constant(t, tf.float32, v.shape) 416 | self.fired_op = self.output.assign(tf.greater_equal(v_2, threshhold)) 417 | v_f = tf.where(self.fired_op, threshhold, v_2) 418 | 419 | # Operations to update the state 420 | self.v_op = v.assign(v_f) 421 | self.u_op = u.assign(u_f) 422 | -------------------------------------------------------------------------------- /core/spike_process.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | 3 | """ 4 | Defines types for firings and spike processes, and functions to convert 5 | between 
them. 6 | 7 | 8 | firing: np.array((num_timesteps,), dtype=bool) 9 | A 'firing' is the firing record for a single neuron: a numpy array of bools, one 10 | for each timestep. 11 | 12 | firings: np.array((num_timesteps, num_neurons), dtype=bool) 13 | A 'firings' is a record of many neuron firings: a boolean array in which each column 14 | is a neuron and each row is a timestep. True means a neuron fired at that timestep. 15 | 16 | spike_process: np.array(int) 17 | A 'spike_process' is a numpy array of integer indexes at which a neuron spiked. 18 | 19 | spike_processes: [ spike_process ] 20 | A list of spike_process, one for each neuron 21 | 22 | 23 | This is where typed-python might make sense. 24 | """ 25 | 26 | def firing_to_spike_process(firing): 27 | """ Converts a firing record for a single neuron to a spike process: 28 | a list of time step indexes at which the neuron fired. 29 | Args: 30 | firing: firing record for a single neuron 31 | Returns: 32 | spike_process for a single neuron 33 | """ 34 | # return np.argwhere(firing).ravel() 35 | return np.reshape(np.argwhere(firing), (-1,)) 36 | 37 | 38 | def firings_to_spike_processes(firings): 39 | """ Converts firings numpy tensor to array of spike processes. These are 40 | simply numpy arrays of timestep indexes for which each neuron fired. 41 | Args: 42 | firings: firing records for multiple neurons 43 | Returns: 44 | spike_processes, one for each neuron 45 | """ 46 | return [ firing_to_spike_process(firings[:,i]) for i in range(firings.shape[1]) ] 47 | 48 | 49 | def spike_process_to_firing(spike_process, max_length=None): 50 | """ Converts a single neuron's spike process to a firing record.
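For example, mirroring the function body below, a spike process of [1, 4] expands to a five-timestep boolean firing record:

```python
import numpy as np

spike_process = np.array([1, 4])
firing = np.zeros((np.max(spike_process) + 1,), dtype=bool)
firing[spike_process] = True   # [False, True, False, False, True]
```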
51 | Args: 52 | spike_process: spike_process for a single neuron 53 | max_length: if highest timestep is < max_length, pads to max_length with 0s 54 | Returns: 55 | firing record for one neuron 56 | """ 57 | length = max(np.max(spike_process, initial=-1)+1, 0 if max_length is None else max_length) # initial=-1 guards the empty case 58 | firing = np.zeros((length,), dtype=bool) 59 | firing[spike_process] = True 60 | return firing 61 | 62 | 63 | def spike_processes_to_firings(spike_processes, max_length=None): 64 | """ Converts spike processes to firing records. 65 | Args: 66 | spike_processes: list of spike processes for multiple neurons 67 | max_length: if highest timestep is < max_length, pads to max_length with 0s 68 | Returns: 69 | firing record for multiple neurons 70 | """ 71 | length = max(max((np.max(p)+1 for p in spike_processes if len(p) > 0), default=0), 0 if max_length is None else max_length) 72 | firings = np.zeros((length, len(spike_processes)), dtype=bool) 73 | for i, spike_process in enumerate(spike_processes): 74 | firings[spike_process,i] = True 75 | return firings 76 | 77 | 78 | def spike_process_delta_times(pre_spike_process, post_spike_process): 79 | """ Calculates the delta times for all combinations of pre and post 80 | spike times, in one big array.
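For example, two pre spikes against two post spikes produce four deltas:

```python
import numpy as np

pre = np.array([1, 3])
post = np.array([2, 5])
# post_spike_time - pre_spike_time for every (pre, post) pair, flattened
deltas = np.array([post - pre_time for pre_time in pre]).ravel()  # [1, 4, -1, 2]
```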
81 | Args: 82 | pre_spike_process: spike_process of the presynaptic neuron 83 | post_spike_process: spike_process of the postsynaptic neuron 84 | Returns: 85 | np.array(float32): one flattened array of time differences, 86 | post_spike_time - pre_spike_time for every combination of 87 | pre and post spike times 88 | """ 89 | return np.array([ post_spike_process - pre_spike_time for pre_spike_time in pre_spike_process ]).ravel() 90 | -------------------------------------------------------------------------------- /core/utils.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | from collections import namedtuple 3 | 4 | 5 | UniformDistribution = namedtuple('UniformDistribution', ['low', 'high']) 6 | NormalDistribution = namedtuple('NormalDistribution', ['mean', 'stddev']) 7 | 8 | 9 | def typed_arr(arr, dtype): 10 | return arr.astype(dtype) 11 | 12 | 13 | def floats_arr(arr): 14 | return typed_arr(arr, np.float32) 15 | 16 | def floats_uniform(low, high, n): 17 | return floats_arr(np.random.uniform(low, high, (n))) 18 | 19 | def floats_normal(mean, stddev, n): 20 | return floats_arr(np.random.normal(mean, stddev, [n])) 21 | 22 | 23 | def ints_arr(arr): 24 | return typed_arr(arr, np.int32) 25 | 26 | def ints_uniform(low, high, n): 27 | return ints_arr(np.random.randint(low, high, (n))) 28 | 29 | def ints_normal(mean, stddev, n): 30 | return ints_arr(np.random.normal(mean, stddev, [n])) 31 | 32 | 33 | """ 34 | Samplers: sometimes closures can be more concise than classes. 35 | Are they as clear? These oughtta be, I hope...
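For example, a closure-based sampler is just a function of n, so it plugs straight into the layer_with_n_distributions constructors (minimal sketch mirroring identical_sampler below):

```python
import numpy as np

def identical_sampler(v):
    # returns a function of n, like every sampler in this module
    return lambda n: (np.ones((n,)) * v).astype(np.float32)

tau_dist = identical_sampler(4.0)
taus = tau_dist(3)  # array([4., 4., 4.], dtype=float32)
```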
36 | """ 37 | 38 | def list_sampler(l, dtype=np.float32): 39 | def ls(n): 40 | return typed_arr(np.array(l[:n]), dtype) 41 | return ls 42 | 43 | 44 | def identical_sampler(v): 45 | return lambda n: floats_arr(np.ones((n,)) * v) 46 | 47 | def uniform_sampler(low, high): 48 | return lambda n: floats_uniform(low, high, n) 49 | 50 | def normal_sampler(mean, stddev): 51 | return lambda n: floats_normal(mean, stddev, n) 52 | 53 | 54 | def identical_ints_sampler(v): 55 | return lambda n: ints_arr(np.ones((n,)) * v) 56 | 57 | def uniform_ints_sampler(low, high): 58 | return lambda n: ints_uniform(low, high, n) 59 | 60 | def normal_ints_sampler(mean, stddev): 61 | return lambda n: ints_normal(mean, stddev, n) 62 | -------------------------------------------------------------------------------- /drawing_utils/connection_layer_drawing.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | import matplotlib.pyplot as plt 3 | import matplotlib.gridspec as gridspec 4 | from spikeflow import ConnectionLayer, SynapseLayer, ComplexSynapseLayer 5 | 6 | """ 7 | The point of this library is not really these convenience rendering functions. 8 | But maybe they'll be quick and dirty help. 
9 | """ 10 | 11 | def draw_synapse_layer(synapse_layer, dpi=100): 12 | """ Draws a synapse layer as an image of weights 13 | """ 14 | w_min = np.min(synapse_layer.w) 15 | cmap = 'gnuplot2' if w_min == 0 else 'RdBu' 16 | plt.imshow(synapse_layer.w * -1 + 1, cmap, aspect='auto') 17 | plt.ylabel('from neuron') 18 | plt.xlabel('to neuron') 19 | plt.box(on=False) 20 | plt.show() 21 | -------------------------------------------------------------------------------- /drawing_utils/neuron_layer_drawing.py: -------------------------------------------------------------------------------- 1 | import matplotlib.pyplot as plt 2 | import matplotlib.gridspec as gridspec 3 | from spikeflow import IzhikevichNeuronLayer 4 | 5 | """ 6 | The point of this library is not really these convenience rendering functions. 7 | But maybe they'll be quick and dirty help. 8 | """ 9 | 10 | def draw_izhikevich_neuron_layer_abcdt_distributions(neuron_layer, dpi=100): 11 | """ Draws distributions of all a, b, c, d, and t values. 12 | """ 13 | 14 | df = neuron_layer.to_dataframe() 15 | 16 | fig = plt.figure(num=None, figsize=(10, 0.6), dpi=dpi) 17 | fig.text(0, 1, '{} neurons'.format(df.shape[0])) 18 | 19 | gs = gridspec.GridSpec(1, 5, hspace=0.2) 20 | for i, p in enumerate(['a', 'b', 'c', 'd', 't']): 21 | ax = fig.add_subplot(gs[i]) 22 | ax.set_title(p) 23 | plt.box(on=None) 24 | bins = max(9, df[p].nunique() // 7) 25 | df[p].hist(bins=bins, grid=False, xlabelsize=7, ylabelsize=7, ax=ax) 26 | 27 | plt.show() 28 | 29 | 30 | def draw_izhikevich_neuron_layer_ab_cd_distributions(neuron_layer, dpi=100, 31 | ab_ranges=((-0.2, 1.2), (-1.1, 1.1)), 32 | cd_ranges=((-70.0, -40.0), (-22.0, 10.0))): 33 | 34 | """ Draws a vs b and c vs d distributions of neuron layer values.
35 | """ 36 | 37 | df = neuron_layer.to_dataframe() 38 | 39 | fig = plt.figure(num=None, figsize=(10, 2.0), dpi=dpi) 40 | fig.text(0 , 1, '{} neurons'.format(df.shape[0],)) 41 | gs = gridspec.GridSpec(1, 5, hspace=0.3, width_ratios=[1, 3, 1, 3, 2]) 42 | 43 | for (i, xname, yname, r) in [ (1, 'a', 'b', ab_ranges), (3, 'c', 'd', cd_ranges)]: 44 | ax = fig.add_subplot(gs[i]) 45 | ax.set_title(xname+' vs '+yname) 46 | #plt.box(on=None) 47 | plt.xlabel(xname) 48 | plt.ylabel(yname) 49 | plt.xticks(r[0]) 50 | plt.yticks(r[1]) 51 | ax.set_xlim(r[0]) 52 | ax.set_ylim(r[1]) 53 | plt.plot(df[xname], df[yname], 'o', alpha=0.1) 54 | 55 | plt.show() 56 | -------------------------------------------------------------------------------- /drawing_utils/trace_renderers.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | import pandas as pd 3 | import matplotlib.pyplot as plt 4 | import matplotlib.gridspec as gridspec 5 | from spikeflow.core.analysis_utils import * 6 | 7 | 8 | """ 9 | The point of this library is not really these convenience rendering functions. 10 | But maybe they'll be quick and dirty help. 
11 | """ 12 | 13 | def render_signal(ax, name, signal, colorCode, show_yticks=True, alpha=1.0, linewidth=1.0, marker=None, linestyle='-'): 14 | plt.box(on=None) 15 | ax.set_ylabel(name, color=colorCode) 16 | ax.tick_params('y', colors=colorCode) 17 | plt.xlim(xmax=signal.shape[0]) 18 | plt.xticks([]) 19 | if not show_yticks: 20 | plt.yticks([]) 21 | plt.plot(signal, colorCode, alpha=alpha, linestyle=linestyle, linewidth=linewidth, marker=marker) 22 | 23 | 24 | class TraceRenderer: 25 | 26 | def __init__(self, traces, name): 27 | self.traces = traces 28 | self.name = name 29 | 30 | def subfigure_height_ratios(self, dpi): 31 | return [] 32 | 33 | def height(self, dpi): 34 | pass 35 | 36 | def render(self, fig, gs, gs_start, start_time, end_time): 37 | pass 38 | 39 | 40 | class IdentityNeuronTraceRenderer(TraceRenderer): 41 | 42 | def height(self, dpi): 43 | return 0.8 * self.traces.shape[2] 44 | 45 | def subfigure_height_ratios(self, dpi): 46 | return [1, 1] + [3, 1, 1] * (self.traces.shape[2]-1) 47 | 48 | def render(self, fig, gs, gs_start, start_time, end_time): 49 | for i in range(self.traces.shape[2]): 50 | for t in range(2): 51 | ax = fig.add_subplot(gs[gs_start+(i*3)+t]) 52 | if t == 0: 53 | ax.set_title(self.name + ' ' + str(i)) 54 | color = { 0: 'r', 1: 'b'}[t] 55 | name = { 0: 'in', 1: 'out'}[t] 56 | render_signal(ax, name, self.traces[start_time:end_time,t,i], color, show_yticks=t<1) 57 | 58 | 59 | class LIFNeuronTraceRenderer(TraceRenderer): 60 | 61 | def height(self, dpi): 62 | return 1.0 * self.traces.shape[2] 63 | 64 | def subfigure_height_ratios(self, dpi): 65 | return [1, 2, 1] + [3, 1, 2, 1] * (self.traces.shape[2]-1) 66 | 67 | def render(self, fig, gs, gs_start, start_time, end_time): 68 | for i in range(self.traces.shape[2]): 69 | for j, t in enumerate([0, 3, 2]): 70 | ax = fig.add_subplot(gs[gs_start+(i*4)+j]) 71 | if t == 0: 72 | ax.set_title(self.name + ' ' + str(i)) 73 | color = { 0: 'r', 1: 'g', 2:'b'}[j] 74 | name = { 0: 'in', 1: 'v', 2: 
'out'}[j] 75 | render_signal(ax, name, self.traces[start_time:end_time,t,i], color, show_yticks=t<2) 76 | 77 | 78 | class IzhikevichNeuronTraceRenderer(TraceRenderer): 79 | 80 | def height(self, dpi): 81 | return 1.4 * self.traces.shape[2] 82 | 83 | def subfigure_height_ratios(self, dpi): 84 | return [1, 2, 1, 1] + [3, 1, 2, 1, 1] * (self.traces.shape[2]-1) 85 | 86 | def render(self, fig, gs, gs_start, start_time, end_time): 87 | for i in range(self.traces.shape[2]): 88 | for t in range(4): 89 | ax = fig.add_subplot(gs[gs_start+(i*5)+t]) 90 | if t == 0: 91 | ax.set_title(self.name + ' ' + str(i)) 92 | color = { 0: 'r', 1: 'g', 2:'y', 3:'b'}[t] 93 | name = { 0: 'in', 1: 'v', 2: 'u', 3: 'out'}[t] 94 | render_signal(ax, name, self.traces[start_time:end_time,t,i], color, show_yticks=t<3) 95 | 96 | 97 | class NeuronFiringsRenderer(TraceRenderer): 98 | 99 | def __init__(self, traces, firing_height, name, groups=None): 100 | super().__init__(traces, name) 101 | self.firing_height = firing_height 102 | self.groups = groups 103 | 104 | def height(self, dpi): 105 | return (2.0 / dpi) * self.traces.shape[1] * self.firing_height + 1.0 106 | 107 | def subfigure_height_ratios(self, dpi): 108 | return [(2.0 / dpi) * self.traces.shape[1] * self.firing_height + 1.0] 109 | 110 | def render(self, fig, gs, gs_start, start_time, end_time): 111 | data = self.traces[start_time:end_time,:].T 112 | data2 = np.repeat(data, 3).reshape(data.shape[0], data.shape[1], 3) 113 | data3 = data2 * (np.array([0.0, 0.0, 1.0]) * -1.0 + 0.95) * -1.0 + 0.95 114 | if self.groups is not None: 115 | for (fromn, ton, color) in self.groups: 116 | k = data2[fromn:ton,:,:] * (np.array(color) * -1.0 + 0.95) * -1.0 + 0.95 117 | data3[fromn:ton,:,:] = k 118 | data_to_render = np.flipud(np.repeat(data3, self.firing_height, axis=0)) 119 | ax = fig.add_subplot(gs[gs_start]) 120 | ax.set_title(self.name) 121 | plt.box(on=None) 122 | ax.set_ylabel('neuron i') 123 | ax.set_xlabel('timestep t') 124 | ax.set_ylim((-5, 
data_to_render.shape[0]+5)) 125 | ax.set_xlim((0, end_time-start_time)) 126 | plt.yticks([]) 127 | plt.xticks([]) 128 | plt.imshow(data_to_render, aspect='auto', interpolation='nearest') 129 | #plt.imshow(data_to_render, 'Blues', aspect='auto', interpolation='nearest') 130 | #plt.imshow(data_to_render, 'RdBu', aspect='auto', interpolation='nearest') 131 | 132 | 133 | class STDPTracesRenderer(TraceRenderer): 134 | 135 | def __init__(self, traces, name): 136 | super().__init__(traces, name) 137 | self.pre_synaptic_traces = traces[0] 138 | self.post_synaptic_traces = traces[1] 139 | self.post_synaptic_triplet_traces = traces[2] if len(traces) > 2 else None 140 | 141 | def _height_of(self, trc): 142 | return 0.6 * trc.shape[1] if trc is not None else 0 143 | 144 | def height(self, dpi): 145 | return self._height_of(self.pre_synaptic_traces) + \ 146 | self._height_of(self.post_synaptic_traces) + \ 147 | self._height_of(self.post_synaptic_triplet_traces) + \ 148 | 2.0 149 | 150 | def _height_ratios_of(self, trc): 151 | if trc is None: return [] 152 | return [2] + [1] * trc.shape[1] 153 | 154 | def subfigure_height_ratios(self, dpi): 155 | return self._height_ratios_of(self.pre_synaptic_traces) + \ 156 | self._height_ratios_of(self.post_synaptic_traces) + \ 157 | self._height_ratios_of(self.post_synaptic_triplet_traces) 158 | 159 | def render(self, fig, gs, gs_start, start_time, end_time): 160 | for i in range(self.pre_synaptic_traces.shape[1]): 161 | ax = fig.add_subplot(gs[gs_start+i+1]) 162 | if i == 0: 163 | ax.set_title('pre-synaptic traces x') 164 | render_signal(ax, r'$x_{' + str(i) + '}$', self.pre_synaptic_traces[start_time:end_time,i], 'b', show_yticks=True, marker='.', linestyle='--') 165 | 166 | for j in range(self.post_synaptic_traces.shape[1]): 167 | ax = fig.add_subplot(gs[gs_start+i+j+3]) 168 | if j == 0: 169 | ax.set_title('post-synaptic traces y') 170 | render_signal(ax, r'$y_{' + str(j) + '}$', self.post_synaptic_traces[start_time:end_time,j], 'b', 
show_yticks=True, marker='.', linestyle='--') 171 | 172 | if self.post_synaptic_triplet_traces is not None: 173 | for k in range(self.post_synaptic_triplet_traces.shape[1]): 174 | ax = fig.add_subplot(gs[gs_start+i+k+j+5]) 175 | if k == 0: 176 | ax.set_title('post-synaptic triplet traces yt') 177 | render_signal(ax, r'$yt_{' + str(k) + '}$', self.post_synaptic_triplet_traces[start_time:end_time,k], 'b', show_yticks=True, marker='.', linestyle='--') 178 | 179 | 180 | 181 | def render_figure(renderers, start_time, end_time, dpi=100): 182 | height = sum([r.height(dpi) for r in renderers]) + (len(renderers)-1)*0.2 183 | 184 | height_ratios = [] 185 | for r in renderers: 186 | height_ratios += r.subfigure_height_ratios(dpi) + [6] 187 | 188 | fig = plt.figure(num=None, figsize=(14, height), dpi=dpi) 189 | gs = gridspec.GridSpec(len(height_ratios), 1, hspace=0.1, height_ratios=height_ratios) 190 | 191 | gs_start = 0 192 | for renderer in renderers: 193 | renderer.render(fig, gs, gs_start, start_time, end_time) 194 | gs_start += len(renderer.subfigure_height_ratios(dpi)) + 1 195 | 196 | plt.xticks([i for i in range(0, end_time-start_time, 100)]) 197 | plt.show() 198 | 199 | 200 | def draw_firing_distributions(layer_firings, groups=None, dpi=100): 201 | fig = plt.figure(num=None, figsize=(2 * len(layer_firings), 0.6), dpi=dpi) 202 | gs = gridspec.GridSpec(1, len(layer_firings), hspace=1.0) 203 | for i, firings in enumerate(layer_firings): 204 | rates = firing_rates(firings) 205 | df = pd.DataFrame(rates, index = range(len(rates)), columns = ['r']) 206 | ax = fig.add_subplot(gs[i]) 207 | ax.spines['bottom'].set_color('grey') 208 | ax.spines['top'].set_color('grey') 209 | ax.spines['left'].set_color('grey') 210 | ax.spines['right'].set_color('grey') 211 | ax.set_xlim((0, df['r'].max() + 0.01)) 212 | ax.set_title('Layer ' + str(i) + ": " + str(len(rates)) + ' neurons', fontsize=9) 213 | bins = max(17, df['r'].nunique() // 7) 214 | df['r'].hist(bins=bins, grid=False,
xlabelsize=7, ylabelsize=7, ax=ax) 215 | 216 | plt.show() 217 | -------------------------------------------------------------------------------- /jupyter_examples/README.md: -------------------------------------------------------------------------------- 1 | # examples in jupyter 2 | 3 | Consider these examples demonstrations of the api. Like a manual! :) 4 | 5 | More in-depth explorations will be in 'jupyter_explorations'. 6 | -------------------------------------------------------------------------------- /jupyter_examples/example_0_single_neuron.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# Hello world: print 1 neuron trace" 8 | ] 9 | }, 10 | { 11 | "cell_type": "code", 12 | "execution_count": 1, 13 | "metadata": { 14 | "collapsed": false 15 | }, 16 | "outputs": [ 17 | { 18 | "name": "stderr", 19 | "output_type": "stream", 20 | "text": [ 21 | "/Applications/Anaconda/anaconda/envs/mlbook/lib/python3.6/site-packages/h5py/__init__.py:34: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. 
In future, it will be treated as `np.float64 == np.dtype(float).type`.\n", 22 | " from ._conv import register_converters as _register_converters\n" 23 | ] 24 | } 25 | ], 26 | "source": [ 27 | "import numpy as np\n", 28 | "from spikeflow import BPNNModel, IzhikevichNeuronLayer\n", 29 | "import spikeflow.drawing_utils.trace_renderers as rend" 30 | ] 31 | }, 32 | { 33 | "cell_type": "markdown", 34 | "metadata": {}, 35 | "source": [ 36 | "# Create a neuron layer with a single neuron, and create the model" 37 | ] 38 | }, 39 | { 40 | "cell_type": "code", 41 | "execution_count": 2, 42 | "metadata": { 43 | "collapsed": false 44 | }, 45 | "outputs": [], 46 | "source": [ 47 | "model_input_shape = (1,)\n", 48 | "\n", 49 | "nl = IzhikevichNeuronLayer.layer_from_tuples('n1', [\n", 50 | " IzhikevichNeuronLayer.C(a=0.010, b=0.2, c=-65.0, d=6.0, t=30.0, v0=0.0)\n", 51 | "])\n", 52 | "\n", 53 | "model = BPNNModel.compiled_model(model_input_shape, [nl], [])" 54 | ] 55 | }, 56 | { 57 | "cell_type": "markdown", 58 | "metadata": {}, 59 | "source": [ 60 | "# Run the model for 2000 timesteps" 61 | ] 62 | }, 63 | { 64 | "cell_type": "code", 65 | "execution_count": 3, 66 | "metadata": { 67 | "collapsed": false 68 | }, 69 | "outputs": [], 70 | "source": [ 71 | "traces = []\n", 72 | "\n", 73 | "def end_time_step_callback(i, graph, sess, results):\n", 74 | " traces.append(results)\n", 75 | " \n", 76 | "data = (np.ones(1,)*(7 if i > 1200 else 0) for i in range(0, 2000))\n", 77 | " \n", 78 | "model.run_time(data, end_time_step_callback)" 79 | ] 80 | }, 81 | { 82 | "cell_type": "markdown", 83 | "metadata": {}, 84 | "source": [ 85 | "# Extract the data we want (neuron 0's traces) and display" 86 | ] 87 | }, 88 | { 89 | "cell_type": "code", 90 | "execution_count": 4, 91 | "metadata": { 92 | "collapsed": false 93 | }, 94 | "outputs": [ 95 | { 96 | "data": { 97 | "image/png": 
"<base64-encoded PNG omitted: Izhikevich 'Layer 0 Neuron' trace figure, timesteps 0 to 2000>\n", 98 | "text/plain": [ 99 | "" 100 | ] 101 | }, 102 | "metadata": {}, 103 | "output_type": "display_data" 104 | } 105 | ], 106 | "source": [ 107 | "neuron_layer_0_traces = np.array([r['n1'] for r in traces])\n", 108 | 
"rend.render_figure([rend.IzhikevichNeuronTraceRenderer(neuron_layer_0_traces, \n", 110 | " 'Layer 0 Neuron')], \n", 111 | " 0, 2000)" 112 | ] 113 | }, 114 | { 115 | "cell_type": "markdown", 116 | "metadata": {}, 117 | "source": [ 118 | "# Often ignore first 1000 timesteps (let model settle with 0 input)" 119 | ] 120 | }, 121 | { 122 | "cell_type": "code", 123 | "execution_count": 5, 124 | "metadata": { 125 | "collapsed": false 126 | }, 127 | "outputs": [ 128 | { 129 | "data": { 130 | "image/png": "iVBORw0KGgoAAAANSUhEUgAABI8AAAB0CAYAAAAM7xLLAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAAPYQAAD2EBqD+naQAAIABJREFUeJzt3Xm8HFWd9/HP72YnQNgCAVEgCIiCMoAoMCMdEZUHNxSR\nYRkQBvfBwWUk8OhNo7LoDKADo4KAbCryoCggiDC0CyAMbuwyhB0CCRC2rOTm9/xxTpFK0fvt7qru\n+32/XufV3VWnqk/3uedW9a/OOWXujoiIiIiIiIiISDVDeRdARERERERERESKS8EjERERERERERGp\nScEjERERERERERGpScEjERERERERERGpScEjERERERERERGpScEjERERERERERGpScEjERERERER\nERGpScEjERERERERERGpScEjERERERERERGpScEjERERERERERGpScEjERER6TozO8zM3Mx2zrss\nnWJm25rZ1Wb2opk9Y2YXmNn0Jrd9MH4f/1llXSmu26/zpS4eM9vNzH5vZovN7Akz+7aZrZl3uURE\nRGQVBY9EREREWmRmmwK/BV4LHAv8O7AP8Gszm9jCro40s026UMS+YGY7ANcBawCfA74PfAy4JM9y\niYiIyOrG510AERERkaIxsyFgorsvrZHlWGAqsJO7Pxy3uQX4NXAYcGYTb3MnsA1wDHDUaMvcCWa2\nhrsv7uFbngAsBEru/nwsw4PAWWb2Tne/podlERERkRrU80hEREQKwcwmmtnxZvZHM3vOzBaZ2e/M\nbFYqj8UhXz+vsv3kuN33UssmmVnZzO4zs2Vm9oiZfcPMJmW2dTM73cwOMrM7gWXAu+sU90PAFUng\nCMDdrwXuBfZv8iM/CJxPk72PzOxVZnaOmT0ZP8udZnZ4Jk8yPHDzzPJkKFwptaxiZneY2U5m9lsz\nW0wI5iTrPxXfY5mZPW5mZ5jZOpn9Jvt4vZldH4eePWZm/9bE51kb2Au4MAkcRecDL9L89ygiIiJd\npuCRiIiIFMXawD8DFeBLwBxgOvCrOLwJd3fgQmBvM1svs/174z4uhJd7D/0C+AJwOfAvwGXA0cDF\nVd7/7cCpcd1nCcGdVzCzVwEbArdWWX0L8HeNP+rLvk7oCX5MvUxmthHwB+AdwOmxfPcBZ5vZv7bw\nflnrA1cBfwH+Fbg+vt8c4AzgceDzwKXAx4FrzGxCZh/rAlcDf4157wFONrO9G7z39oTPvtr36O7L\nY3la+R5FRESkizRsTURERIpiIbB5DB4AYGZnEYIR/wIcERefDxxH6Jny3dT2BxMCPr+Prw8kBFv2\ncPdkGWZ2B/BdM9vN3W9Mbb8NsL2739WgnBvHx3lV1s0D1jOzSe6+rMF+cPf7zewCQu+jE9292j4h\nBJnGxfI9HZd918x+BMwxs++5+5JG71fFDOAT7p7urTUdmA1cA+zt7ivj8nsIgauDgXNT+9gE+Cd3\nvyDmOxt4iFBfV9
V570bf4z+08XlERESkC9TzSERERArB3UeSwJGZDcWeRUnPlB1T+e4FbgYOSpbF\nvHsDF8XeSQAfBu4G7jGzDZIE/Hdc//JwuOg3TQSOAKbEx2rBoaWZPM34GnV6H5mZEYbJXR5fpj/L\nr4BppL6fFi1j9UAQhIDbROC0JHAUnQU8T5gYPO1FYm8veLnn0C3AzAbv3eh7bOU7FBERkS5S8EhE\nREQKw8wONbPbCMGDp4EFhGDFtEzW84HdzWyz+PrDwATgglSerYA3xH2k071x/YaZfT7QZDGTHj6T\nqqybnMnTkLvfTyj3x8xs4ypZpgPrEO5Clv0sSeAn+1ma9Vi6p1eUfKd/y5RzOXB/an3i0VTALrGQ\nMJytnkbfYzs9qURERKQLNGxNRERECsHMDgZ+QJiX6JvAfGCEMIRqy0z2HxPmJzqIMMnzwcCt7p4O\neAwBtxNuAV/NI5nXzQYrkmFW1QI9GwPPNDNkLePrwCGEuZ4uy6xLLvZdCJxXY/vb4mM2iJMYV2N5\nJwI0IzWWW4PtGn2Pj7ddIhEREekoBY9ERESkKPYj9Gz5YLoni5mVsxnd/RkzuxI4yMwuAnYnTPic\nNhd4E3BdlZ4xbXP3x8xsAbBzldW7ECZ7bnWfc83sQsKk1DdnVi8AXgDGxTu61bMwPq6TWZ7tLVTP\nQ/FxG0J9AOFueMAWQKMyNOsOYAXhe/xJ5n12SC8TERGRfGnYmoiIiBRF0oPl5R4rZvYWYNca+S8A\nXk/opTRC6I2U9hPgVcCR2Q3NbIqZTR1FWS8F3mNmr07tc09ga+CSNvf5NcLQu9Vuc+/uI/H9PmRm\n22U3ihNcJ+bGx7el1o8jDHlr1rXAcuCoON9S4gjC8MErW9hXTe7+XHyvg81srdSqQ4A1af97FBER\nkQ5TzyMRERHppcPN7N1Vln8LuAL4IPCz2KtoC+ATwF2EYELWlYR5kT4MXOXu8zPrLyDekc3MZgE3\nEIZvvS4ufxeZ28S34IT4vteb2bdi+b5IGCaXnYC6KaneR4dWWX0MYYLvm+Md6O4C1iNMlP2O+Bx3\nv9PM/gCcGCcRfwY4gBbO+dx9gZmdCAwDV5vZLwi9kD4F/A+pybE74DjgRuA3ZnYmsCnweeAad7+6\ng+8jIiIio6DgkYiIiPTSJ2ss/0FMMwhDt95FCJAcTAjSlLIbuPtyM7uYENS4oMr6lWb2AeBo4J+A\nfYHFhKFY32LVxNktc/dHzGwP4BTgJEJPnSuBz7cx31Ha1wifebU5itz9STPbBfgKIcD2KULg7E7C\nPElpBwHfIwScngXOBq4Hft1sIdx9Thya9xnC3FLPAGcCx7r7S61/rJrv8yczewdwcnyfF2J5Z3fq\nPURERGT0rINTAIiIiIj0lJmdShhONcPdF+ddHhEREZFBpDmPREREpC+Z2WRCL51LFTgSERER6R4N\nWxMREZG+YmYbEub52Q9YnzAETURERES6RMEjERER6TevBy4C5gNHuftfci6PiIiIyEDTnEciIiIi\nIiIiIlKT5jwSEREREREREZGaFDwSEREREREREZGaBit4ZDYHM8+ke/IuloiIiIiIiIhIvxrECbPv\nJNyBJbEir4KISBVm44BFwKS8iyIiIiIiIjLGvAP361rdaBCDRytwfyLvQohITZsSAkfvAa7JuSwi\nIiIiIiJjyUg7GxUreGS2FTAL2JDskDr345vcy1aYPQ4sBW4CZuP+cJ33nMQre0Asw31Zk+8nIq3Z\nMj7ei/tLuZZEREREREREGjJ3z7sMgdmRwHeAp4AngHTBHPcdm9jH3sCawN+AjYFh4FXAdri/UGOb\nOTFfWhn3OS2VX0SaY/bPwPeAKbgvz7s4IiIiIiIiUl+RgkcPAf+F+8kd2NengS8CMwi9q07C/f/W\nyKueRyK9ZHYCcCDum+ddFBEREREREWmsSMGj54EdcL9/lPv5CHA+8AngZqACrAVs
hvv8UZZSREbL\n7GJgOu5vz7soIiIiIiIi0thQ4yw9cwnwzg7s53PAWbifCzwMTACWAYd3YN8iMnozgbl5F0JERERE\nRESaU6QJs+8DvorZW4HbgdUn0nX/dsM9mJ0C7AScjdluQBlYAVwH7Fpjm2rD1vZ4xfuLSKe8Fvhp\n3oUQERERERGR5hRp2NoDddY67jOb2MdlwPsJgZ/5wO+B44CPA3vg/pYq28zhlRNmi0h37YP7L/Mu\nhIiIiIiIiDRWnOBRJ5htAjwG7Ib7Tanl36B28EgTZouIiIiIiIiI1JDvsLUwzOzLuC+Kz2tx3D/f\nxB6fAkaAjTLLNwKeqL5nX0aYE0lERERERERERDLynvPo7wgTWifPa2mue5T7csz+COwJXAaA2VB8\nfXrbpRSRXFjZpgMPAnv4sN+ac3FklKxsk4FHgcN82K/IuzwiY4GV7Sxgsg/7IXmXRWQssLLtBfw/\nYGMf9sV5l0dk0FnZJhDOLz/hw/6zvMszyPINHrnPqvp8dE75z1248Nhj7YBFE1l75lE8e8FPmbLr\no5zbof2LSO9sD6wBbA0oeNT/tgPWB7bKuyAiY8jfA8/lXQiRMWQWsDYwHXgo57KIjAWvBTZE55dd\nl3fPo46zOTC0Ek7/Jfa2h7Cv7oH9w+EwMsQgze4kMla8Lj6ul2sppFOSHqbr51oKkTHCyjaJcDJd\n76YkItJZO8bH9VHwSKQX3hAfdX7ZZUN5F6ALPrdyiO998n98gzfM94kXb8eMkSFeAA7Pu2Ai0rJt\n4qMOBoMhCR4pGCjSG9sA41CbE+kJK5uhY51IryXBI7W5LhuonkdWtonATsCJyTIf9pVWtmuBXWts\nU/Vuaz6su62JFEDS80jBo8GgE2qR3touPq5rZRvyYV+Za2lEBt/GhOEzoGOdSK8kxzq1uS4btJ5H\nGxCusD2ZWf4kMKPGNrMJcwGk0+xuFVBEWpL0PNLBoM9Z2cYBb4wvVZ8ivZFcjTVgWp4FERkjdkw9\n17FOpDfU86hHWg4eWdm+b2UrdaEseTmRcEKVTifW3UJEus7KtgawWXypnkf9b0fC5Od3o/oU6ZXt\ngeRuT2p3It33D8B84CnU5kS6Lo4i2hpYgtpc17UzbG06cLWVbQHwY+BCH/a/drZYbXsKGAE2yizf\nCHii2gZxeJqGqIkUT9JL5Q50JWEQvB9YCFwBfDjnsogMvBiAnwX8EtgP/R8V6YX3EY5zu6M2J9IL\nexFGHl0N7JJzWQZeyz2PfNjfTxjP+1XgzcCfrGx3WtmOtbJt3tnitVy25cAfgT2TZVa2ofj6przK\nJSJtORCYB/wKXUkYBO8DriRckdUJtUj37QOsCZwRX6vdiXSRlW1rwlyNPweeRm1OpBf+EbgT+A1q\nc13X1oTZPuwLgTOBM61smxIq7XDg+Hb32YiV7UFWDWFJzPZhPymV5zXABOAzVrZDgUsIPZGmAud2\no1wi0nlx8vsDCe12AQoe9TUr2/aE4TPHA2sBa1vZJviwv5RvyUQG2kHArcAt8bVOqkW660BgKXAt\ncCRqcyJdZWVbk9Cz/URCwHaKlW2KD/uSfEs2uEY1YbaVbQKwM/AWYHNeOVF1p32F0OspSf+ZKss4\nwlXtp4GTCOMeDydceXu3D3u3yyYinfNxQsDoPOAZYJ3YxqXPxHo7E/gboSv/M3HVurkVSmTAWdne\nQzih/i7hfGgZCsKLdE3sdfQl4Awf9sWEY53anEh3nUK4IcQFrDq/VNC2i9oKHlnZZlnZziIEi34A\nPA+8B9i0c0Wr6gUf9idSaVFq3TuB1wMH+7DP9mHfCPgkocv2n7tcLhHpECvbbsA3gdN92O8gBIRB\nwYa+Y2VbC7iYMAb9CB/2pejgLtJVVra9gQsJwdpzfNid0O7U5kS6wMr2RsIQ+8cIF7pBbU6ka6xs\n461sXyf08PusD/vD6PyyJ1oeYmZle4xQKVcD
HwMuj5NO98IxVrYvAw8DPwRO9WFfEdftCtye6WH0\nK+A7hNv3VQ0gxRnaJ2UWL+vhZxIRXu7J+GlCz8FbgC/GVUnwaH3CpPhSYLGn0VuADwCHEYYNf9CH\n/YaYRQd3kQ6zss0A/g/wEcLFtF8SLqZ5zKL5V0Q6KF4ceRfwIWB/4C5gn9jrCBQ8EukoK5sB2xE6\nrBwOzCT09js7ZtH5ZQ+0Mz/RHOASH/ZnO1yWRr4N/Inwh7EbYWzjxsDn4voZvHLY3JOpdbXMBoYz\ny8qEzykiXWZl2w74KHAw4W6O/wV8PhXA1cEgR/FgPZEQBErSmsAGhPpK0pbAtoTbpU4kTIx9CXCS\nD/sjqV0mwUDVp0gNVrbxwBoxTY2P6xLaWtL2NgW2iWljwIE/EOah/IkP+8rULvVDVqSOeIOdyYS2\nNgWYRrhotUFM6xPu3rwNYVLsZB7WvxJ+i3w3c+H5aWA9K5ulgrgiEsXzy0msOtYlx7ukzSXnl1uw\n6li3FrCIMFXN/j7s6c4hOr/sgZaDRz7sZ3Xqza1sJxEihvVs68N+jw/7Kallt1nZlhEm7J49yl5C\nJxLGS6ap15FIF1nZ1iH8wPko4a6NTxOGWZzlw35nJntyMHijle0RwjDZFzM/jLpVTiMM7x0i3Aa0\nWqq1rp1tOrG/CYTgzYRRPp/CqmBRvfmmFhEmNX8A+C1hfqNbgVt82Eeq5F8YHzUXxBgW21bSvpK/\n4XrPO52v1W3GEc6ZknZS7Xm7y7Inz2vEdbWsJPTCnEeYS+x3hF4P1/mwL6ixjeZfkaTdFbWN1Wtz\n7bS9es+TY1y6zU2u89WtJLShBcD/EoZj3w381of9/hrbPBPfZ03ghTr7lgHW5LEur+NgtXXJeWQ7\n7apRe0yCs+lUbwqdZYRj3UPAbYQLkn8mtLtqv9WTji061nVRV+6M1oL/IMyZVE+tf8q3EMq/OeHk\n6QnCvBppG8XHJ2rtPP7xrfYHaGU7wcrWoFgi0qb1gEMIP5iuInT5vsKHfXmN/E8TJnz9bmqZW9mW\nACsyaSQmCAcka+MxG5QpmpEqaWXq+XLgpZiWZx7Tzxc1yLs05qmVngIWtHpHCx/2ZVa2F4HPWtnW\nJ/wvf4jwf3pFs1doUydk6QBf0X8Mad+rp37zEuH/zEujeL6kyrJlhDa1uE5aQgi8LgCebSN4/jTw\nQSvbN4AKMBd4FFjcSq+ITEC9G8G/Iv6tDtK++021ttRqu1tWY3nSrhbXeP4C4Tj3FO23OYCzrGxX\nEm4l/lDcV7ULK1Vlgg/ZlFcAIq+20I9l77cflCtp3K4arU+Oadnl6TZWLyXtblErxycf9hEr20LC\nXdenEc4vHySMRHqpA+eXtVK185te5BvtPi/2YX+ome8kLdfgUbxCVusqWSM7EP7A58fXNwHHWdk2\n9GFPlu1F6KVwV4v7PoDQ/VtEOm8EOI0wIfa8Rpl92Jdb2bYidBFfm9CVfBrhquH4mMalno8ntF8n\n/I9o5dGpHpzJBmiaWd7uuprbDFDX908SAohfZ/WrvR57lb7EKw/c2ef9dkKWSOp3ZYPnnc6Xfb48\n5zL01b4HoO19i/C/8RBWzScHgJUtOdl3qre19PN+NNq20qntV+Twnnl8zo5sMwBt7gbC6IZkLrKX\nWdlGCG1uBdXbWvp1P0rOq/L+O11RgDL01b4HoN19Gvgn4KuE3wmJsXB+mZZug7Xq/WZCQLsl5n3w\nN2Jl25UwAev1hCsBuwKnAlf5sB8a84z7yVtZMn3S6l29r36CykkH+Kxel1lEROqzsk0mzJW0GWFc\n+yRCMGkCqx/4mn2eDr4V7uRuAE7KpM/FK6qbxbQp4eQ6uXFItaD7yjrLarW5wgQoejG8WaSR2Mt2\nJqHdrUU4zk0iBHSTv9dWjnlFDlIMQgBC+lw8v0za3IasOs6lzy+bOc5lzy8bBWTaWd+pPC//L+lm\nG+yX4NGO
hEl0X0eo+AeAC4BT0mMer/1ve+QX81hy0cNsarB46nh+8tnXcszn9vXn8ym5iIiIiIiI\niEh/64vgUbMqFXsQOK1U8tPyLouIiIiIiIiIyCAYxOBRMuThYeCHwKmlkq+os03SjS1tWak0qju4\niYhIFZWKTSfMVfcbwgTAUwj/t6fUed7qvA9LCRMzJo9df17vOCOSt0rFriN04f8zzbW5cS2+xTKq\nt41utruXSqUBOomVgVKp2McIoyYupbk2V+9ub9WspMfHOWBpqaShoFJMlYqtS5io/gbC3Q7rtbVk\nWSuczrSlVs8vR1osZ1flfbe1plUqdhywD2Gi7OWlkq9TJdv5hEmydyDMo/FVYBPg6Dq7ng0MZ5aV\ngTmjLLKIiLzS7vHx0FKp8V0eKhUzVt3itdGJd7312efrNciTvajQqJwjtHYSsYwwaXXyuLzKsnbX\nLdcJviQqFRtPmDfy+FLJv9HkNs20uVbb4jRgRoPtWrGyUrFWTsaTu+t2sq2l21yhTvAld7sDfy2V\n/CMNc/LysW4S7bW5eu1vWoPtWvotWKnYclr7Adyx41qVZQogS9quhMmuP1oq+X2NMsc2N5HRt7ns\n87UarF9tfuYmypncqbXZAFTS7hql35ZK/mQrZYE+CB5VKnYS8KXM4imViqX/WWwL/C+wL+F2z7sC\nGwMXA0dVKnZMnZ5EJwKnZJap15FIQcR/7kcTAsHZ24T27FEnKB2zO/AYoXdoQ/F7T04cezZ/XaVi\nQ1Q/kR/tycW68fnEuP9aj632/MiWfwWtn5y/1EZqd7uaST/CO+5NwFTC1dimlEqe1McL3SpUVpM/\nnttpc2unnk9MpWy7G9U5cQwgt/NDuONtqEZaUWfdiI5xHbc78MtmM8fvP/nht7BbhcqKweV22lej\nNrcRqyYGr9XmJjHKO1vFYFYvjnUdPy6qzXXc7oSe7XObyRy//ySQ8mwXy7WaSsXGMfpzyuyyqcAG\nrLr4WSulj3N7Ai0Hj9oatlap2FfqrS+V/PiWd1r7vaYD66cW7UvoLbRLatn9hC/gCmCTJIpWqVgZ\n+Aqwfankd3SqTCLSO5WKbU6YJP9hwsnveELUvtrjqH5wNzBCb4NWyW2dq6WVddb1Mk/6AOKZx2rL\nDKgADzd7NXasiicX9X7o1nocTZ4JTaSJVZZ1+ra2Tv0fwElbzD5vd10n9pF+Xeuuf/XumNKxvKWS\newx+jovpU4QLZdNKJV/aWlWMHfE7y7aPZtvPaJc10/bSqRu3cG8UeGq2DTV63ol91Ho+2jsftbtt\ncoxL2twM4EHggFLJL266BsaYGDAeR2ePY83kbbW9JanTRmi+vXX7uNVOvnp34+v4sa1KgvC/MGl3\nVwNPl0r+wearYGxJXRidBCyKF4pa0u5Vln0zrycAWxD+kOYCHQselUq+AFiQvK5UbB7h5OiedL5K\nxXYFbs90v0oi9+vV2r/mPBIpvN3j446lkj9dL2PqB1Ot4FIRHqe0kH9cjTSUeuxHF+VdgKKLvW+S\nbsiFFgNd7Z6Mt5LGp9K4Gs+rvZ7cxnb11hVapVI1lvd7BY7qi0M9k54fhRaPdZ1sV43y1GobtdrJ\n5Ca3abbNdTpA3Su/z7sARRZ7fiQBiUU5F6euVKCrlQsr7aZ2jk2TgDWayNfsum5ejO2mz+VdgCKL\nx7lRnVu2dRJUKvnfZZdVKrY28APgZ+0Wpp7UnEc7ERpo1huBN2WGsyWm1tm15jwSKbbdgXsaBY7g\n5X+KKwlXbgZePJlJX3WpFmCql0abJ2GZx3rPFwNXtv2hpXBioGuEPvjR3Qnxh3ujk/ChOmlcG+ta\n2WYcr+wxeHNXvgzJRTzWJcMtBl7qwlCjH8H12l2z7andvMbqPTPmlUr+WFe+EOm5TKCr8Bd1RisV\nLGu2vbVzXOvEMS99nFtGC0NFpT0du4JWKvnzlYoNA5cDFzSzTY35jLK2jb
2MJgKXEOa8eFeVfCMA\nV1992PK3v/1HKydOXPYw8HPgiwBmTAQOcOf8zHaa80ik2HYDbsy7EEUUT2aSg6aI9ED84Z7MpSEi\nXTbWLgyJ5C0TLNPvYnlZp7tfT4upWf9B6K1Uz/0ApZIPA1Qq9oYa+e4EPnjyyeeMP/nkczd2Z36l\nYlsQgkdPEGY+PxdWDx7F4WmrNYpKxX5eo9t3Le10p211m0F5j3a20Xt0d5uivscQoev7G4Fvt7G9\niIiIiIiIdEBbwaNKxY7KLDLC3c0OAa5qdj/Z+YxG6SYAMx+69NIZd1cq8+cCdxF6Kt0FvA54LrtR\njTmP9tecRyIiIiIiIiIi7fc8OjrzeiUhCHQeYRhYx1Uq9hrCxNfrxdc7xFX3lUr+4qxZK0+YOfP2\nFeDjDz307henTn1u5lprLXzzkiVTX3j00W1uIUzofXWVXWvOIxERERERERGRGtqdMHuLTrx5i3Me\nHQ8cmlr+5/g4C6iAXfb88+tdD3b0u9997iaTJi1ZvnTp1EcmTlw67Yc/nP1jwi0zL62yf815JFJQ\nZqwJ/Bo41J178y6PjI4Z4wj1eaw7f8i7PCJjgRlfAtZwf8WFMhHpAjPeBJwOvMNdvylEus0MA64B\njnfnd3mXZ5CZe7Wbk/VGpWLTgfUbZLu/VPKXJ6WsVOww4LRSydepltmMQ4GL3Vlaqdg+wBXAZA1D\nE+k/ZuxCuEvQEe6ck3d5ZHTMeDXwMHCcOyfkXR6RscCMG4Gp7rwp77KIjAVmfBY4DdjWnXvyLo/I\noDNjBjAPmONOOe/yDLJOT5jdkg7PeQSAO+elXu4ALFTgSKRvbZl5lP6m+hTpvS2BqWaYO/ldMRQZ\nO9LHOgWPRLpP55c9kmvwqBWpOY9eA4zLznlUqdh7gY1mzVp5JuDgNjS0wtzN3Vfdxtqdcb0vvYi0\naWbmUfqb6lOkh+LQ3w3jy+nA/ByLIzJW6Fgn0ltqcz3SN8EjGs55xEvAp8vl/ZaZOcuWTZn3wAPb\nXX3xxV+4fmRkaIe4rcb7i/QXHQwGi+pTpLfSc1TORMEjkV7QsU6kt9TmeiTXOY96yYwDgY+48/68\nyyIizTGjAuwBPO3OBjkXR0bJjB8BBwAOTNFEoiLdZcYHgJ/Flwe7c1Ge5REZdGYMAYuBScDl7rwv\n5yKJDDwzzgcOiS+nurM4z/IMsqG8C9BDfwD2zLsQItKSmYQ7Ja5vxrScyyKjl9SnAZvlWxSRMWEm\n4YfsU+iKrEgvbEwIHD2I2pxIryTnl7B6j1vpsDERPDJjCnAU8FjeZRGR5pgxCdiUcGt30MFgEMxk\nVX3qpFqk+2YCc2NSmxPpvqSd/RqYGW8hLiLdpfPLHhm4YWtmLITV7iZiwFqEK28Hu/OLJvZxW5eK\nJyLNmwC8Dtgf+AnhisILeRZIRsWA7Qjdis8BngQW5loikcG3GXA9sAh4H/BAvsURGXjTCDf3OQD4\nMXAnsDLXEokMvu2Bw4DvAM/EJPUd6c7NrW40iMGjw1g9eLQSWADc7N74h4qZTQJmAye6u+bjKDjV\nV/9RnfUX1Vf/UZ31F9VX/1Gd9RfVV/9RnfUX1dfYMXDBIwAz1gGOALaNi+4Cznbnucbb2trAc8A0\nd3++e6WUTlB99R/VWX9RffUf1Vl/UX31H9VZf1F99R/VWX9RfY0dAzfnkRk7A/cBRwPrxXQ0MNeM\nHfMsm4iIiIiIiIhIvxmfdwG64FTgcsI4vhUAZowHvg+cBrwtx7KJiIiIiIiIiPSVQQwe7UwqcATg\nzgozvgG4Lcd7AAAIDUlEQVTcml+xRERERERERET6z8ANWwOeJ9zlIOvVNHenpmVAOT5K8am++o/q\nrL+ovvqP6qy/qL76j+qsv6i++o/qrL+ovsaIgZsw24xvA/sCXwBujIt3B74JXOrOv+ZVNhERERER\nERGRfjOIw9a+ADhwPqs+30vAd4Bj8i
qUiIiIiIiIiEg/GrieRwkz1gC2jC/nurM4z/KIiIiIiIiI\niPSjgQ0eiYiIiIiIiIjI6A3ihNkiIiIiIiIiItIhCh6lmNmnzexBM1tqZjeb2S55l2ksMrO3mdnl\nZva4mbmZfSCz3szseDObZ2ZLzOxaM9sqk2eymZ1hZk+b2YtmdqmZbdTbTzI2mNlsM/sfM3vBzOab\n2WVmtk0mj+qsQMzsk2Z2m5k9H9NNZrZ3ar3qq8DM7Jj4v/G01DLVWYGY2ZxYR+l0T2q96qtgzOxV\nZnZh/L6XmNntZrZzar3qrEDi+Xq2jbmZnRHXq74KxMzGmdlXzeyBWB9zzezLZmapPKqzgjGztczs\nNDN7KNbJjWb25tR61dkYo+BRZGYfAU4h3GZwR+CvwK/MbMNcCzY2TSV8/5+usf7fgKOATwBvARYR\n6mpyKs+pwHuBDwN7AJsAP+1Wgce4PYAzgLcCewETgGvMbGoqj+qsWB4l3EBgJ2Bn4L+Bn5vZG+J6\n1VdBxZO2jwO3ZVapzornTmDjVPr71DrVV4GY2brADYQbrOwNvB74PLAwlU11VixvZvX2tVdcfkl8\nVH0Vy5eATwKfAbaNr/8N+JdUHtVZ8Xyf0LYOAbYHrgGuNbNXxfWqs7HG3ZXCvE83A6enXg8BjwHH\n5F22sZwId877QOq1AfOAL6SWTQOWAgekXi8H9kvleV3c11vz/kyDnoDp8bt+m+qsfxLwDHCE6qu4\nCVgTuBd4B1ABTovLVWcFS8Ac4C811qm+CpaAk4Df1VmvOit4Ak4D7ot1pfoqWAKuAM7OLLsUuDA+\nV50VLAFTgBXAPpnlfwS+pjobm0k9jwAzm0i4An9tsszdV8bXu+ZVLqlqC2AGq9fVc4TgX1JXOxF6\nv6Tz3AM8jOqzF6bFx2fio+qswGJX8gMIPf5uQvVVZGcAV7r7tZnlqrNi2srC8Ov7zewiM3tNXK76\nKp73Abea2SUWhl//2cyOTK1XnRVYPI8/GDjHw69T1Vfx3AjsaWZbA5jZmwi9Ma+K61VnxTMeGEcI\nBqUtIdSd6mwMGp93AQpiA0LjeDKz/ElCdFSKY0Z8rFZXM1J5lrv7s3XySBeY2RDh6t8N7n5HXKw6\nKyAz254QLJoMvAjs6+53mdluMYvqq0BigG9HwlCNLLWx4rkZOAz4G2FIzTDwOzPbDtVXEc0kDKk5\nBTiB0M6+bWbL3f08VGdF9wFgHeAH8bXqq3hOAtYG7jGzEcLvruPc/aK4XnVWMO7+gpndBHzZzO4m\nfM//SAj63IfqbExS8EhEOukMYDtWn9tDiulvwA6EnmL7AeeZ2R75FkmqMbNXA98C9nL37BVAKSB3\nvyr18jYzuxl4CNgfuDufUkkdQ8Ct7n5sfP3nGOj7BHBefsWSJh0BXOXuj+ddEKlpf+Ag4EDCfHA7\nAKeZ2eMxQCvFdAhwDmEqlxHgT8CPCD2KZAzSsLXgKUKDyM78vhHwRO+LI3Uk9VGvrp4AJprZOnXy\nSIeZ2enAe4BZ7v5oapXqrIDcfbm73+fuf3T32YRJ6j+L6quIdgI2BP5kZivMbAVh0smj4vPkqp/q\nrKDiVdd7gdeiNlZE84C7MsvuBpKhhqqzgjKzzQjzwH0/tVj1VTzfBE529x+7++3ufgFhIuXZcb3q\nrIDcfa6770GYc/HV7r4LYRja/ajOxiQFjwg/ogiTf+2ZLIvDb/YkDOuQ4niA8M8mXVdrE2b4T+rq\nj4Q7pqTzbEM4CVR9dli8TefpwL7A2939gUwW1Vl/GAImofoqousIdznZIZVuBS6Kz5OTONVZQZnZ\nmoTA0TzUxoroBmCbzLKtCb3FQHVWZB8F5gNXppapvopnDcLky2kjrPotqjorMHdf5O7z4p0p3wX8\nHNXZ2JT3jN1FScBHCBOCHUq4heT3CLdo3Sjvso21RIhuJz+QHDg6Pn9NXP+lWDfvI/yguozw42ly\nah
/fIZz0zSJctb8RuDHvzzaICfgv4FlCT4gZqTQllUd1VqAEnAi8Ddg81seJwErCsCjVVx8kUndb\nU50VLwH/Hv8nbg7sBvwaWABMV30VLxHmOHoJOJYQ5DuQcMvpg1J5VGcFS4TAw0PASVXWqb4KlAjz\nUT0K7BP/L+4b/yeerDorbiIEit5NmBx7L+AvwB+ACaqzsZlyL0CREvCZ+Me9jDDZ5VvyLtNYTECJ\nEDTKph/E9QYcT4h2LyXM4L91Zh+TCfPvPBNPAH8KzMj7sw1iqlFXDhyWyqM6K1ACzgYejP/r5sf6\n2Ev11T+JVwaPVGcFSsCPgcdjG3s0vt5S9VXcRBh2fXusj7uBIzPrVWcFS8A74/nG1lXWqb4KlIC1\nCDdUeYhwt665hNu9T1SdFTcR5qqaG49l84DTgWmqs7GbLFaqiIiIiIiIiIjIK2jOIxERERERERER\nqUnBIxERERERERERqUnBIxERERERERERqUnBIxERERERERERqUnBIxERERERERERqUnBIxERERER\nERERqUnBIxERERERERERqUnBIxERERERERERqUnBIxERERERERERqUnBIxERERERERERqUnBIxER\nERERERERqUnBIxERERERERERqen/A7l6RJO8oWucAAAAAElFTkSuQmCC\n", 131 | "text/plain": [ 132 | "" 133 | ] 134 | }, 135 | "metadata": {}, 136 | "output_type": "display_data" 137 | } 138 | ], 139 | "source": [ 140 | "rend.render_figure([rend.IzhikevichNeuronTraceRenderer(neuron_layer_0_traces, \n", 141 | " 'Layer 0 Neuron')], \n", 142 | " 1000, 2000)" 143 | ] 144 | } 145 | ], 146 | "metadata": { 147 | "kernelspec": { 148 | "display_name": "Python 3", 149 | "language": "python", 150 | "name": "python3" 151 | }, 152 | "language_info": { 153 | "codemirror_mode": { 154 | "name": "ipython", 155 | "version": 3 156 | }, 157 | "file_extension": ".py", 158 | "mimetype": "text/x-python", 159 | "name": "python", 160 | "nbconvert_exporter": "python", 161 | "pygments_lexer": "ipython3", 162 | "version": "3.6.0" 163 | }, 164 | "toc": { 165 | "colors": { 166 | "hover_highlight": "#DAA520", 167 | "running_highlight": "#FF0000", 168 | "selected_highlight": "#FFD700" 169 | }, 170 | "moveMenuLeft": true, 171 | "nav_menu": { 172 | "height": "103px", 173 | "width": "252px" 174 | }, 175 | "navigate_menu": true, 176 | "number_sections": true, 177 | "sideBar": true, 178 | "threshold": 4, 179 | "toc_cell": false, 180 | "toc_section_display": "block", 181 | "toc_window_display": false, 182 | "widenNotebook": false 183 | } 184 | }, 185 | "nbformat": 4, 186 | "nbformat_minor": 2 187 | } 188 
| -------------------------------------------------------------------------------- /jupyter_examples/example_12_spike_processes.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# spike processes: basically just lists of timesteps at which neurons fired" 8 | ] 9 | }, 10 | { 11 | "cell_type": "code", 12 | "execution_count": 1, 13 | "metadata": { 14 | "collapsed": false 15 | }, 16 | "outputs": [ 17 | { 18 | "name": "stderr", 19 | "output_type": "stream", 20 | "text": [ 21 | "/Applications/Anaconda/anaconda/envs/mlbook/lib/python3.6/site-packages/h5py/__init__.py:34: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.\n", 22 | " from ._conv import register_converters as _register_converters\n" 23 | ] 24 | } 25 | ], 26 | "source": [ 27 | "import numpy as np\n", 28 | "from spikeflow import firing_to_spike_process, firings_to_spike_processes\n", 29 | "from spikeflow import spike_process_to_firing, spike_processes_to_firings\n", 30 | "from spikeflow import spike_process_delta_times" 31 | ] 32 | }, 33 | { 34 | "cell_type": "markdown", 35 | "metadata": {}, 36 | "source": [ 37 | "# Create a couple firing records" 38 | ] 39 | }, 40 | { 41 | "cell_type": "code", 42 | "execution_count": 2, 43 | "metadata": { 44 | "collapsed": false 45 | }, 46 | "outputs": [ 47 | { 48 | "name": "stdout", 49 | "output_type": "stream", 50 | "text": [ 51 | "firing 1: (8,) [ True False False True False False False True]\n", 52 | "firing 2: (8,) [False False True False False True True False]\n" 53 | ] 54 | } 55 | ], 56 | "source": [ 57 | "f1 = np.array([True, False, False, True, False, False, False, True])\n", 58 | "f2 = np.array([False, False, True, False, False, True, True, False])\n", 59 | "print('firing 1:', f1.shape, f1)\n", 60 | 
"print('firing 2:', f2.shape, f2)" 61 | ] 62 | }, 63 | { 64 | "cell_type": "markdown", 65 | "metadata": {}, 66 | "source": [ 67 | "# Convert the firing records to spike processes" 68 | ] 69 | }, 70 | { 71 | "cell_type": "code", 72 | "execution_count": 3, 73 | "metadata": { 74 | "collapsed": false 75 | }, 76 | "outputs": [ 77 | { 78 | "name": "stdout", 79 | "output_type": "stream", 80 | "text": [ 81 | "spike process 1: (3,) [0 3 7]\n", 82 | "spike process 2: (3,) [2 5 6]\n" 83 | ] 84 | } 85 | ], 86 | "source": [ 87 | "s1 = firing_to_spike_process(f1)\n", 88 | "s2 = firing_to_spike_process(f2)\n", 89 | "print('spike process 1:', s1.shape, s1)\n", 90 | "print('spike process 2:', s2.shape, s2)" 91 | ] 92 | }, 93 | { 94 | "cell_type": "markdown", 95 | "metadata": {}, 96 | "source": [ 97 | "# Convert a spike process back to a firing record" 98 | ] 99 | }, 100 | { 101 | "cell_type": "code", 102 | "execution_count": 4, 103 | "metadata": { 104 | "collapsed": false 105 | }, 106 | "outputs": [ 107 | { 108 | "name": "stdout", 109 | "output_type": "stream", 110 | "text": [ 111 | "recovered firing: (8,) [ True False False True False False False True]\n" 112 | ] 113 | } 114 | ], 115 | "source": [ 116 | "rf1 = spike_process_to_firing(s1)\n", 117 | "print('recovered firing:', rf1.shape, rf1)" 118 | ] 119 | }, 120 | { 121 | "cell_type": "markdown", 122 | "metadata": {}, 123 | "source": [ 124 | "# Create a firings record (for multiple neurons)" 125 | ] 126 | }, 127 | { 128 | "cell_type": "code", 129 | "execution_count": 5, 130 | "metadata": { 131 | "collapsed": false 132 | }, 133 | "outputs": [ 134 | { 135 | "name": "stdout", 136 | "output_type": "stream", 137 | "text": [ 138 | "firings: (8, 2)\n", 139 | "[[ True False]\n", 140 | " [False False]\n", 141 | " [False True]\n", 142 | " [ True False]\n", 143 | " [False False]\n", 144 | " [False True]\n", 145 | " [False True]\n", 146 | " [ True False]]\n" 147 | ] 148 | } 149 | ], 150 | "source": [ 151 | "tf = np.vstack([f1, f2]).T\n", 152 
| "print('firings:', tf.shape)\n", 153 | "print(tf)" 154 | ] 155 | }, 156 | { 157 | "cell_type": "markdown", 158 | "metadata": {}, 159 | "source": [ 160 | "# Convert a firings record to (multiple) spike processes" 161 | ] 162 | }, 163 | { 164 | "cell_type": "code", 165 | "execution_count": 6, 166 | "metadata": { 167 | "collapsed": false 168 | }, 169 | "outputs": [ 170 | { 171 | "name": "stdout", 172 | "output_type": "stream", 173 | "text": [ 174 | "spike processes: 2\n", 175 | "[array([0, 3, 7]), array([2, 5, 6])]\n" 176 | ] 177 | } 178 | ], 179 | "source": [ 180 | "s = firings_to_spike_processes(tf)\n", 181 | "print('spike processes:', len(s))\n", 182 | "print(s)" 183 | ] 184 | }, 185 | { 186 | "cell_type": "markdown", 187 | "metadata": {}, 188 | "source": [ 189 | "# Convert spike processes back to firings record" 190 | ] 191 | }, 192 | { 193 | "cell_type": "code", 194 | "execution_count": 7, 195 | "metadata": { 196 | "collapsed": false 197 | }, 198 | "outputs": [ 199 | { 200 | "name": "stdout", 201 | "output_type": "stream", 202 | "text": [ 203 | "Recovered firings shape (8, 2)\n", 204 | "[[ True False]\n", 205 | " [False False]\n", 206 | " [False True]\n", 207 | " [ True False]\n", 208 | " [False False]\n", 209 | " [False True]\n", 210 | " [False True]\n", 211 | " [ True False]]\n" 212 | ] 213 | } 214 | ], 215 | "source": [ 216 | "rf = spike_processes_to_firings(s)\n", 217 | "print('Recovered firings shape', rf.shape)\n", 218 | "print(rf)" 219 | ] 220 | }, 221 | { 222 | "cell_type": "markdown", 223 | "metadata": {}, 224 | "source": [ 225 | "# Same, but pad length to something" 226 | ] 227 | }, 228 | { 229 | "cell_type": "code", 230 | "execution_count": 8, 231 | "metadata": { 232 | "collapsed": false 233 | }, 234 | "outputs": [ 235 | { 236 | "name": "stdout", 237 | "output_type": "stream", 238 | "text": [ 239 | "Recovered firings shape (12, 2)\n", 240 | "[[ True False]\n", 241 | " [False False]\n", 242 | " [False True]\n", 243 | " [ True False]\n", 244 | " [False 
False]\n", 245 | " [False True]\n", 246 | " [False True]\n", 247 | " [ True False]\n", 248 | " [False False]\n", 249 | " [False False]\n", 250 | " [False False]\n", 251 | " [False False]]\n" 252 | ] 253 | } 254 | ], 255 | "source": [ 256 | "rf = spike_processes_to_firings(s, max_length=12)\n", 257 | "print('Recovered firings shape', rf.shape)\n", 258 | "print(rf)" 259 | ] 260 | }, 261 | { 262 | "cell_type": "markdown", 263 | "metadata": {}, 264 | "source": [ 265 | "# padding uses max of max_length and max of spike processe (so max lengths < max(spike_processes) has no effect)" 266 | ] 267 | }, 268 | { 269 | "cell_type": "code", 270 | "execution_count": 9, 271 | "metadata": { 272 | "collapsed": false 273 | }, 274 | "outputs": [ 275 | { 276 | "name": "stdout", 277 | "output_type": "stream", 278 | "text": [ 279 | "Recovered firings shape (8, 2)\n", 280 | "[[ True False]\n", 281 | " [False False]\n", 282 | " [False True]\n", 283 | " [ True False]\n", 284 | " [False False]\n", 285 | " [False True]\n", 286 | " [False True]\n", 287 | " [ True False]]\n" 288 | ] 289 | } 290 | ], 291 | "source": [ 292 | "rf = spike_processes_to_firings(s, max_length=4)\n", 293 | "print('Recovered firings shape', rf.shape)\n", 294 | "print(rf)" 295 | ] 296 | }, 297 | { 298 | "cell_type": "markdown", 299 | "metadata": {}, 300 | "source": [ 301 | "# For kicks, compute delta times between two spike processes" 302 | ] 303 | }, 304 | { 305 | "cell_type": "code", 306 | "execution_count": 10, 307 | "metadata": { 308 | "collapsed": false 309 | }, 310 | "outputs": [ 311 | { 312 | "name": "stdout", 313 | "output_type": "stream", 314 | "text": [ 315 | "Delta times shape (9,)\n", 316 | "[ 2 5 6 -1 2 3 -5 -2 -1]\n" 317 | ] 318 | } 319 | ], 320 | "source": [ 321 | "delta_times = spike_process_delta_times(s1, s2)\n", 322 | "print('Delta times shape', delta_times.shape)\n", 323 | "print(delta_times)" 324 | ] 325 | } 326 | ], 327 | "metadata": { 328 | "kernelspec": { 329 | "display_name": "Python 3", 330 | 
"language": "python", 331 | "name": "python3" 332 | }, 333 | "language_info": { 334 | "codemirror_mode": { 335 | "name": "ipython", 336 | "version": 3 337 | }, 338 | "file_extension": ".py", 339 | "mimetype": "text/x-python", 340 | "name": "python", 341 | "nbconvert_exporter": "python", 342 | "pygments_lexer": "ipython3", 343 | "version": "3.6.0" 344 | }, 345 | "toc": { 346 | "colors": { 347 | "hover_highlight": "#DAA520", 348 | "running_highlight": "#FF0000", 349 | "selected_highlight": "#FFD700" 350 | }, 351 | "moveMenuLeft": true, 352 | "nav_menu": { 353 | "height": "175px", 354 | "width": "252px" 355 | }, 356 | "navigate_menu": true, 357 | "number_sections": true, 358 | "sideBar": true, 359 | "threshold": 4, 360 | "toc_cell": false, 361 | "toc_section_display": "block", 362 | "toc_window_display": false, 363 | "widenNotebook": false 364 | } 365 | }, 366 | "nbformat": 4, 367 | "nbformat_minor": 2 368 | } 369 | -------------------------------------------------------------------------------- /jupyter_examples/example_13_stdp_basics.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# STDP Basics, 'offline'\n", 8 | "\n", 9 | "Demonstrates calculate dw from the stdp learning rule, applied to spike processes and firing records (as can be obtained from the model.step_time end timestep callback)." 10 | ] 11 | }, 12 | { 13 | "cell_type": "code", 14 | "execution_count": 1, 15 | "metadata": { 16 | "collapsed": false 17 | }, 18 | "outputs": [ 19 | { 20 | "name": "stderr", 21 | "output_type": "stream", 22 | "text": [ 23 | "/Applications/Anaconda/anaconda/envs/mlbook/lib/python3.6/site-packages/h5py/__init__.py:34: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. 
In future, it will be treated as `np.float64 == np.dtype(float).type`.\n", 24 | " from ._conv import register_converters as _register_converters\n" 25 | ] 26 | } 27 | ], 28 | "source": [ 29 | "import numpy as np\n", 30 | "from spikeflow import firing_to_spike_process, firings_to_spike_processes\n", 31 | "from spikeflow import spike_process_delta_times\n", 32 | "from spikeflow import STDPParams, stdp_offline_dw, stdp_offline_dw_process, stdp_offline_dw_processes, stdp_offline_dw_firings" 33 | ] 34 | }, 35 | { 36 | "cell_type": "markdown", 37 | "metadata": {}, 38 | "source": [ 39 | "# Make some data: create a couple of firing records and spike processes" 40 | ] 41 | }, 42 | { 43 | "cell_type": "code", 44 | "execution_count": 2, 45 | "metadata": { 46 | "collapsed": false 47 | }, 48 | "outputs": [ 49 | { 50 | "name": "stdout", 51 | "output_type": "stream", 52 | "text": [ 53 | "firing 1: (8,) [ True False False True True False False True]\n", 54 | "firing 2: (8,) [False False True False False True True False]\n", 55 | "spike process 1: (4,) [0 3 4 7]\n", 56 | "spike process 2: (3,) [2 5 6]\n" 57 | ] 58 | } 59 | ], 60 | "source": [ 61 | "f1 = np.array([True, False, False, True, True, False, False, True])\n", 62 | "f2 = np.array([False, False, True, False, False, True, True, False])\n", 63 | "print('firing 1:', f1.shape, f1)\n", 64 | "print('firing 2:', f2.shape, f2)\n", 65 | "\n", 66 | "s1 = firing_to_spike_process(f1)\n", 67 | "s2 = firing_to_spike_process(f2)\n", 68 | "print('spike process 1:', s1.shape, s1)\n", 69 | "print('spike process 2:', s2.shape, s2)" 70 | ] 71 | }, 72 | { 73 | "cell_type": "markdown", 74 | "metadata": {}, 75 | "source": [ 76 | "# Apply the STDP learning rule to calculate the change in weight in various ways" 77 | ] 78 | }, 79 | { 80 | "cell_type": "code", 81 | "execution_count": 3, 82 | "metadata": { 83 | "collapsed": false 84 | }, 85 | "outputs": [], 86 | "source": [ 87 | "stdp_params = STDPParams(APlus=1.0, AMinus=1.0, TauPlus=10.0, TauMinus=10.0)" 88 | 
] 89 | }, 90 | { 91 | "cell_type": "markdown", 92 | "metadata": {}, 93 | "source": [ 94 | "## ... calculate from time deltas, to check the math" 95 | ] 96 | }, 97 | { 98 | "cell_type": "code", 99 | "execution_count": 4, 100 | "metadata": { 101 | "collapsed": false 102 | }, 103 | "outputs": [ 104 | { 105 | "name": "stdout", 106 | "output_type": "stream", 107 | "text": [ 108 | "Delta times shape (12,)\n", 109 | "[ 2 5 6 -1 2 3 -2 1 2 -5 -2 -1]\n", 110 | "dw shape (12,)\n", 111 | "[ 0.81873075 0.60653066 0.54881164 -0.90483742 0.81873075 0.74081822\n", 112 | " -0.81873075 0.90483742 0.81873075 -0.60653066 -0.81873075 -0.90483742]\n" 113 | ] 114 | } 115 | ], 116 | "source": [ 117 | "delta_times = spike_process_delta_times(s1, s2)\n", 118 | "print('Delta times shape', delta_times.shape)\n", 119 | "print(delta_times)\n", 120 | "\n", 121 | "dw = stdp_offline_dw(delta_times, stdp_params)\n", 122 | "print('dw shape', dw.shape)\n", 123 | "print(dw)" 124 | ] 125 | }, 126 | { 127 | "cell_type": "markdown", 128 | "metadata": {}, 129 | "source": [ 130 | "## ... directly from spike processes" 131 | ] 132 | }, 133 | { 134 | "cell_type": "code", 135 | "execution_count": 5, 136 | "metadata": { 137 | "collapsed": false 138 | }, 139 | "outputs": [ 140 | { 141 | "name": "stdout", 142 | "output_type": "stream", 143 | "text": [ 144 | "SUM dw from time deltas: 1.2035231918177673\n", 145 | "SUM dw from spike processes: 1.2035231918177673\n" 146 | ] 147 | } 148 | ], 149 | "source": [ 150 | "dw_p = stdp_offline_dw_process(s1, s2, stdp_params)\n", 151 | "print('SUM dw from time deltas:', sum(dw))\n", 152 | "print('SUM dw from spike processes:', dw_p)" 153 | ] 154 | }, 155 | { 156 | "cell_type": "markdown", 157 | "metadata": {}, 158 | "source": [ 159 | "## ... 
but like a 1x1 weight matrix" 160 | ] 161 | }, 162 | { 163 | "cell_type": "code", 164 | "execution_count": 6, 165 | "metadata": { 166 | "collapsed": false 167 | }, 168 | "outputs": [ 169 | { 170 | "name": "stdout", 171 | "output_type": "stream", 172 | "text": [ 173 | "dW shape (1, 1)\n", 174 | "[[1.2035232]]\n" 175 | ] 176 | } 177 | ], 178 | "source": [ 179 | "dW = stdp_offline_dw_processes(np.ones((1,1), dtype=np.float32), [s1], [s2], stdp_params)\n", 180 | "print('dW shape', dW.shape)\n", 181 | "print(dW)" 182 | ] 183 | }, 184 | { 185 | "cell_type": "markdown", 186 | "metadata": {}, 187 | "source": [ 188 | "## ... or like a bigger, sparse weight matrix" 189 | ] 190 | }, 191 | { 192 | "cell_type": "code", 193 | "execution_count": 7, 194 | "metadata": { 195 | "collapsed": false 196 | }, 197 | "outputs": [ 198 | { 199 | "name": "stdout", 200 | "output_type": "stream", 201 | "text": [ 202 | "W shape (2, 3)\n", 203 | "[[1. 0. 0.]\n", 204 | " [0. 1. 1.]]\n", 205 | "dW shape (2, 3)\n", 206 | "[[1.2035232 0. 0. ]\n", 207 | " [0. 1.2035232 0. ]]\n" 208 | ] 209 | } 210 | ], 211 | "source": [ 212 | "W = np.array([[1, 0, 0], [0, 1, 1]], dtype=np.float32)\n", 213 | "dW = stdp_offline_dw_processes(W, [s1, s1], [s2, s2, s1], stdp_params)\n", 214 | "print('W shape', W.shape)\n", 215 | "print(W)\n", 216 | "print('dW shape', dW.shape)\n", 217 | "print(dW)" 218 | ] 219 | }, 220 | { 221 | "cell_type": "markdown", 222 | "metadata": {}, 223 | "source": [ 224 | "## ... same but from firing matrices\n", 225 | "\n", 226 | "(as can be extracted from model.step_time's end timestep callback)" 227 | ] 228 | }, 229 | { 230 | "cell_type": "code", 231 | "execution_count": 8, 232 | "metadata": { 233 | "collapsed": false 234 | }, 235 | "outputs": [ 236 | { 237 | "name": "stdout", 238 | "output_type": "stream", 239 | "text": [ 240 | "f_in shape (8, 2)\n", 241 | "f_out shape (8, 3)\n", 242 | "dWf shape (2, 3)\n", 243 | "[[1.2035232 0. 0. ]\n", 244 | " [0. 1.2035232 0. 
]]\n" 245 | ] 246 | } 247 | ], 248 | "source": [ 249 | "f_in = np.vstack([f1, f1]).T\n", 250 | "f_out = np.vstack([f2, f2, f1]).T\n", 251 | "print('f_in shape', f_in.shape)\n", 252 | "print('f_out shape', f_out.shape)\n", 253 | "\n", 254 | "dWf = stdp_offline_dw_firings(W, f_in, f_out, stdp_params)\n", 255 | "print('dWf shape', dWf.shape)\n", 256 | "print(dWf)" 257 | ] 258 | } 259 | ], 260 | "metadata": { 261 | "kernelspec": { 262 | "display_name": "Python 3", 263 | "language": "python", 264 | "name": "python3" 265 | }, 266 | "language_info": { 267 | "codemirror_mode": { 268 | "name": "ipython", 269 | "version": 3 270 | }, 271 | "file_extension": ".py", 272 | "mimetype": "text/x-python", 273 | "name": "python", 274 | "nbconvert_exporter": "python", 275 | "pygments_lexer": "ipython3", 276 | "version": "3.6.0" 277 | }, 278 | "toc": { 279 | "colors": { 280 | "hover_highlight": "#DAA520", 281 | "running_highlight": "#FF0000", 282 | "selected_highlight": "#FFD700" 283 | }, 284 | "moveMenuLeft": true, 285 | "nav_menu": { 286 | "height": "175px", 287 | "width": "252px" 288 | }, 289 | "navigate_menu": true, 290 | "number_sections": true, 291 | "sideBar": true, 292 | "threshold": 4, 293 | "toc_cell": false, 294 | "toc_section_display": "block", 295 | "toc_window_display": false, 296 | "widenNotebook": false 297 | } 298 | }, 299 | "nbformat": 4, 300 | "nbformat_minor": 2 301 | } 302 | -------------------------------------------------------------------------------- /jupyter_examples/example_1_three_identity_neurons.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# Three identity neurons" 8 | ] 9 | }, 10 | { 11 | "cell_type": "code", 12 | "execution_count": 1, 13 | "metadata": { 14 | "collapsed": false 15 | }, 16 | "outputs": [ 17 | { 18 | "name": "stderr", 19 | "output_type": "stream", 20 | "text": [ 21 | 
"/Applications/Anaconda/anaconda/envs/mlbook/lib/python3.6/site-packages/h5py/__init__.py:34: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.\n", 22 | " from ._conv import register_converters as _register_converters\n" 23 | ] 24 | } 25 | ], 26 | "source": [ 27 | "import numpy as np\n", 28 | "from spikeflow import BPNNModel, IdentityNeuronLayer\n", 29 | "import spikeflow.drawing_utils.trace_renderers as rend" 30 | ] 31 | }, 32 | { 33 | "cell_type": "markdown", 34 | "metadata": {}, 35 | "source": [ 36 | "# Create a model layer with 3 neurons in a single layer" 37 | ] 38 | }, 39 | { 40 | "cell_type": "code", 41 | "execution_count": 2, 42 | "metadata": { 43 | "collapsed": false 44 | }, 45 | "outputs": [], 46 | "source": [ 47 | "model_input_shape = (3,)\n", 48 | "\n", 49 | "nl = IdentityNeuronLayer('n1', 3)\n", 50 | "\n", 51 | "model = BPNNModel.compiled_model(model_input_shape, [nl], [])" 52 | ] 53 | }, 54 | { 55 | "cell_type": "markdown", 56 | "metadata": {}, 57 | "source": [ 58 | "# Run the model for 1000 timesteps" 59 | ] 60 | }, 61 | { 62 | "cell_type": "code", 63 | "execution_count": 3, 64 | "metadata": { 65 | "collapsed": false 66 | }, 67 | "outputs": [], 68 | "source": [ 69 | "traces = []\n", 70 | "\n", 71 | "def end_time_step_callback(i, graph, sess, results):\n", 72 | " traces.append(results)\n", 73 | " \n", 74 | "data = (np.ones(3,)*(7 if i > 200 else 0) for i in range(0, 1000))\n", 75 | " \n", 76 | "model.run_time(data, end_time_step_callback)" 77 | ] 78 | }, 79 | { 80 | "cell_type": "markdown", 81 | "metadata": {}, 82 | "source": [ 83 | "# Extract the data we want and display" 84 | ] 85 | }, 86 | { 87 | "cell_type": "code", 88 | "execution_count": 4, 89 | "metadata": { 90 | "collapsed": false 91 | }, 92 | "outputs": [ 93 | { 94 | "data": { 95 | "image/png": 
"iVBORw0KGgoAAAANSUhEUgAABHoAAAC9CAYAAADMbRpqAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAAPYQAAD2EBqD+naQAAIABJREFUeJzt3Xm4XWV99//3J4GESYkoCCJqVEQpPipS2+pPAWeetj4O\nDD4QhUpxQKUg2gLWGiyKQx8UCxXUWGS41FIsVSmItDigAuIEBoGLMBpGZSYQhnx/f6y1YWezc84+\nJyfZ+yTv13Wta59932vd67v2zQnnfM89pKqQJEmSJEnS9Ddj2AFIkiRJkiRpapjokSRJkiRJWkOY\n6JEkSZIkSVpDmOiRJEmSJElaQ5jokSRJkiRJWkOY6JEkSZIkSVpDmOiRJEmSJElaQ5jokSRJkiRJ\nWkOY6JEkSZIkSVpDmOiRJEmSJElaQ5jokSRJj5FknySVZIdhxzJVkjwvyVlJ7klyW5KTkmw64LXX\ntJ/HP/ep26mt23Xqox49SV6a5LwkS5LclOTzSTYadlySJKlhokeSJK3xkjwV+CHwbOAw4J+APwe+\nl2TWBJraL8lTVkGI00KSFwL/DWwAfAD4MvBO4NRhxiVJkh61zrADkCRJWllJZgCzqur+FZxyGLAh\n8OKquq695kLge8A+wBcHuM1CYBvgEOCAlY15KiTZoKqWrMZbfgK4Hdipqu5qY7gG+FKS11bV2asx\nFkmS1IcjeiRJ0qQkmZXkY0l+nuTOJPcm+VGSnbvOSTvt6T/7XL9ee93xXWWzkxye5MokS5Ncn+TT\nSWb3XFtJjkmyV5KFwFLg9WOE+xbgO50kD0BVnQNcAew+4CNfA5zIgKN6kmyZ5CtJbm6fZWGSd/Sc\n05ki94ye8s50sJ26yr6f5DdJXpzkh0mW0CReOvX7t/dYmuSGJMcmmdPTbqeNbZOc206/Wpzkbwd4\nnscDrwFO7iR5WicC9zD45yhJklYhEz2SJGmyHg/8NfB94O+A+cCmwHfbKT5UVQEnA7sk2aTn+r9s\n2zgZHhmV8y3gg8C3gfcDpwMHAd/oc/9XAp9t6/6GJhHzGEm2BDYDLupTfSHwovEf9REfpxkRfchY\nJyV5MnA+8GrgmDa+K4EFSQ6cwP16PRE4E/gVcCBwbnu/+cCxwA3AwcBpwLuAs5Os29PGE4CzgF+3\n514GfCrJLuPc+/k0z77c51hVD7TxTORzlCRJq4hTtyRJ0mTdDjyj/UUfgCRfokkcvB/Yty0+Efgw\nzYiP47qun0eTnDmvfb8nTWJkx6rqlJHkN8BxSV5aVT/pun4b4PlVdek4cW7Rvt7Yp+5GYJMks6tq\n6TjtUFVXJTmJZlTPkVXVr01oEkIz2/j+0JYdl+RrwPwkx1fVfePdr4/NgXdXVfcoqE2BQ4GzgV2q\nallbfhlNkmke8K9dbTwFeHtVndSetwC4lqa/zhzj3uN9ji+fxPNIkqQp5ogeSZI0KVX1cCfJk2RG\nO2KnM+Jj+67zrgAuAPbqlLXn7gKc0o76AdgN+C1wWZIndQ7gf9r6R6aEtX4wQJIHYP32tV8i5/6e\ncwZxBGOM6kkSmqli327fdj/Ld4GN6fp8JmgpyydtoEmOzQI+10nytL4E3EWz6HS3e2hHUcEjI3Iu\nBJ45zr3H+xwn8hlKkqRVxESPJEmatCR7J7mY5hf9PwC30iQWNu459UTgZUme3r7fDVgXOKnrnK2B\nP2rb6D6uaOs362nz6gHD7Iycmd2nbr2ec8ZVVVfRxP3OJFv0OWVTYA7NblS9z9JJ0vQ+y6AWd4+g\nanU+08t74nwAuKqrvuN3Xcm1jttppnSNZbzPcTIjlCRJ0hRz6pYkSZqUJPOAE2jW0fkMcAvwMM00\nomf1nP51mvV09qJZQHgecFFVdScnZgCX0Gzb3c/1Pe8HTSx0phr1S8psAdw2yLStHh8H3kazNtHp\nPXWdP6SdDHx1Bddf3L72Jlw6Zq6gfCqSKQ+voDzjXDfe53jDp
COSJElTxkSPJEmarF1pRoy8uXuE\nSJLDe0+sqtuSnAHsleQU4GU0iwl3WwS8APjvPiNOJq2qFie5FdihT/VLaBYSnmibi5KcTLPg8QU9\n1bcCdwMz2529xnJ7+zqnp7x3FM5Yrm1ft6HpD6DZFQ2YC4wXw6B+AzxE8zn+W899XthdJkmShsep\nW5IkabI6I0MeGQmS5E+AP1vB+ScB29KM/nmYZpRPt38DtgT2670wyfpJNlyJWE8D/iLJVl1tvgp4\nDnDqJNs8gmb62XJbk1fVw+393pJku96L2sWTOxa1r6/oqp9JM+1rUOcADwAHtOsDdexLM4XujAm0\ntUJVdWd7r3lJHtdV9TZgIyb/OUqSpCnkiB5JkjSWdyR5fZ/yo4HvAG8G/qMdrTMXeDdwKc0v/r3O\noFnHZzfgzKq6paf+JNqduZLsDPyYZgrTc9vy19F/i/RBfKK977lJjm7j+xDNVLHexY0H0jWqZ+8+\n1YfQLB59QbsT2aXAJjSLML+6/ZqqWpjkfODIdoHq24C3MoGf0arq1iRHAh8FzkryLZrRPfsDP6Nr\n4eUp8GHgJ8APknwReCrNFu1nV9VZU3gfSZI0SSZ6JEnSWN6zgvIT2mNzmulLr6NJZsyjSajs1HtB\nVT2Q5Bs0CYiT+tQvS/JG4CDg7cCbgCU005GO5tFFmSesqq5PsiNwFPBJmhEwZwAHT2J9nm5H0Dzz\ncmvqVNXNSV4C/ANNMmx/miTXQpp1fbrtBRxPkxy6A1gAnAt8b9Agqmp+Oz3tfTRrId0GfBE4rKoe\nnPhjrfA+v0jyauBT7X3ubuM9dKruIUmSVk6mcAq8JEnSmJJ8lmZK0eZVtWTY8UiSJK1pXKNHkiSt\nFknWoxn9cppJHkmSpFXDqVuSJGmVSrIZzbo0uwJPpJmGJUmSpFXARI8kSVrVtgVOAW4BDqiqCW9n\nLkmSpMG4Ro8kSZIkSdIawjV6JEmSJEmS1hAmeiRJkiRJktYQw12jJ5kPfLSn9HKqnjuEaCT1k+wC\n/AsmhiVJkiRpdfoZVbtO9KJRWIx5Ic1OHB0PDSsQSX29BtgA+OKwA5EkSZKktci1k7loFBI9D1F1\n00BnJrOB2T2lS6laOuVRSeqYC/ySqo8MOxBJkiRJ0tgmn+hJtgZ2Bjajd0pH1ccm0NLWJDcA9wM/\nBQ6l6roVnHsoj53qdTgwfwL3kzQxc2m+NyVJkiRJI25y26sn+wFfAH4P3AR0N1JUbT9gO7sAGwGX\nA1vQJHG2BLaj6u4+5zuiR1qdkgB3AB+n6tPDDkeSJEmSNLbJJnquBf6Fqk9NbTSZQzMH7QNULZjS\ntiVNXLIJ8Adgd6pOHXY4kiRJkqSxTXYXnScAU/NLX/JekmtI7ge+CywGnj0lbUtaWXPb16uHGoUk\nSZIkaSCTTfScCrx2ZW++d7567P3MPopmnZ3taXbgeu7dbLQk4e0r276klWaiR5IkSZKmkckuxnwl\n8I8kfwpcAjy4XG3V5wdp5GTm7f8x/uG/ns515wJPAbYC6hY22xj4V+DE5S7ov0bPAcDDk3gGSeN7\nKXA3cNuwA5EkSZIkjW+ya/SM9df9ouqZA7QxawYPL72RLe54MrdsANwKnAes+wNesclO/OAFVWzS\nc818Hrvr1hLg3omEL2lCzqFqz2EHIUmSJEka3+QSPSt70/DLWSxd5yHW2W597lt0Lxvd1anbkt89\n9Q7mbHIvG32zit17LnTXLUmSJEmSpBUYfOpWchTwEarubb9ekaLq4HFaO/0J3P64W9hsu624/oLL\neN4VnYpXc84r/5Tzt3wPx817bMu1FDCpI61GCS8HPs3k1/SSJEmSJE3cr6t450QvmsgaPS8C1u36\nekXGHSJUxeFki1knsPdBr+es0zevmx7dwSt/9UzgjnfXcQ9MIDZJq85fAtsApw07EEmSJElaiyya\nzEVDmbr16N1zAXAhVe9v3
88ArgOOoeqTwwtMUkfCvwMbV/GaYcciSZIkSRrbUBM9SS2bwbIUqSIV\nKqGyjBnLIFQxc2jBSQIg4efAzyczZFCSJEmStHpNdnv1KZI3H8qRu/wvLn7Tujz4hNt5wjX/j4PP\nu5Q/eg2P3V1L0nDMBf592EFIkiRJksY33KlbK5CwJ7BHFf9n2LFIa7OEjYE7gP9bxdeHHY8kSZIk\naWyjuovO+cCrhh2EJOa2r1cPNQpJkiRJ0kBGLtGTsD5wALB42LFIMtEjSZIkSdPJUNfoSbid5bdj\nD/A4YAkwb8A2/moVhCap8Sqa78dbhx2IJEmSJGl8Q951i31YPtGzjOYXyguquH3ANkZvkSFpzXKO\nW6tLkiRJ0vQw9MWYE+YA+wLPa4suBRZUcefwopIkSZIkSZp+hj2iZwfgLOB+4MK2+I+B9YHXVvGL\nYcUmSZIkSZI03Qw70fMj4EpgvyoeasvWAb4MPLOKVwwtOEmSJEmSpGlm2Ime+4AXVXFZT/m2wEVV\nbDCcyCRJkiRJkqafYW+vfhfwtD7lWwF3r+ZYJEmSJEmSprVhJ3q+ASxI2CNhq/Z4K83Ura8NOTZJ\nkiRJkqRpZZ0h3/+DNNurn9gVy4PAF4BDhhWUJEmSJEnSdDT07dUBEjYAntW+XVTFkmHGI0mSJEmS\nNB2NRKJHkiRJkiRJK2/Ya/RIkiRJkiRpipjokSRJkiRJWkOY6JEkSY+RZJ8klWSHYccyVZI8L8lZ\nSe5JcluSk5JsOuC117Sfxz/3qduprdt16qMeLUlem2RBkt8keTjJNcOOSZIkLc9EjyRJWuMleSrw\nQ+DZwGHAPwF/DnwvyawJNLVfkqesghCniz3b407ghiHHIkmS+jDRI0mSpr0kM5KsN8YphwEbAq+s\nqs9X1SeA3YEXAPsMeJuFwEzgkJWJdSol2WA13/Iw4PFV9TLg16v53pIkaQAmeiRJ0qQkmZXkY0l+\nnuTOJPcm+VGSnbvOSTvt6T/7XL9ee93xXWWzkxye5MokS5Ncn+TTSWb3XFtJjkmyV5KFwFLg9WOE\n+xbgO1V1Xaegqs4BrqBJ+AziGuBEBhzVk2TLJF9JcnP7LAuTvKPnnM4UuWf0lHemg+3UVfb9dsrU\ni5P8MMkS4BNd9fu391ia5IYkxyaZ09Nup41tk5ybZEmSxUn+dpAPoKpuqKoHBzlXkiQNh4keSZI0\nWY8H/hr4PvB3wHxgU+C7SV4IUFUFnAzskmSTnuv/sm3jZGhG5QDfAj4IfBt4P3A6cBDwjT73fyXw\n2bbub2gSMY+RZEtgM+CiPtUXAi8a/1Ef8XFgHcYZ1ZPkycD5wKuBY9r4rgQWJDlwAvfr9UTgTOBX\nwIHAue395gPH0kynOhg4DXgXcHaSdXvaeAJwFs2InIOBy4BPJdllJeKSJEkjYp1hByBJkqat24Fn\nVNUDnYIkX6JJHLwf2LctPhH4MM3ImeO6rp9Hk5w5r32/J01iZMeq6pSR5DfAcUleWlU/6bp+G+D5\nVXXpOHFu0b7e2KfuRmCTJLOrauk47VBVVyU5iWZUz5FV1a9NaBJCM9v4/tCWHZfka8D8JMdX1X3j\n3a+PzYF3V1X3KKhNgUOBs4FdqmpZW34ZTZJpHvCvXW08BXh7VZ3UnrcAuJamv86cREySJGmEOKJH\nkiRNSlU93EnytGvkbELzR6SLgO27zrsCuADYq1PWnrsLcEo76gdgN+C3wGVJntQ5gP9p6x+ZEtb6\nwQBJHoD129d+iZz7e84ZxBGMMaonSWimin27fdv9LN8FNqbr85mgpSyftIEmOTYL+FwnydP6EnAX\nzaLT3e6hHUUF0PbhhcAzJxmTJEkaISZ6JEnSpCXZO8nFNAmTPwC30iQWNu459UTgZUme3r7fDVgX\nOKnrnK2BP2rb6D6uaOs362nz6gHD7Iycmd2nbr2ec8ZVVVfRxP3OJFv0OWVTYA7wTh77LJ0
kTe+z\nDGpx9wiqVuczvbwnzgeAq7rqO37XlVzruJ1mSpckSZrmnLolSZImJck84ASadXQ+A9wCPEwzjehZ\nPad/nWY9nb1oFhCeB1xUVd3JiRnAJcAHVnDL63veD5qc6Uyv6peU2QK4bZBpWz0+DryNZm2i03vq\nOn9IOxn46gquv7h97U24dMxcQflkpnv1engF5ZmCtiVJ0pCZ6JEkSZO1K82IkTd3jxBJcnjviVV1\nW5IzgL2SnAK8jGYx4W6LaLY7/+8+I04mraoWJ7kV2KFP9UtoFjaeaJuLkpxMs+DxBT3VtwJ3AzPb\nnb3Gcnv7OqenvHcUzliubV+3oekPoNkVDZgLjBeDJElagzh1S5IkTVZnZMgjI0GS/AnwZys4/yRg\nW5rRPw/TjPLp9m/AlsB+vRcmWT/JhisR62nAXyTZqqvNVwHPAU6dZJtH0Ew/W25r8qp6uL3fW5Js\n13tRu3hyx6L29RVd9TNppn0N6hzgAeCAdn2gjn1pptCdMYG2JEnSNOeIHkmSNJZ3JHl9n/Kjge8A\nbwb+ox2tMxd4N3ApsFGfa86gWcdnN+DMqrqlp/4k2p25kuwM/JhmCtNz2/LX0X+L9EF8or3vuUmO\nbuP7EM1Usd7FjQfSNapn7z7Vh9AsHn1BuxPZpcAmNIswv7r9mqpamOR84Mh2gerbgLcygZ/RqurW\nJEcCHwXOSvItmtE9+wM/o2vh5ZWV5H8Bb2jfPhvYOMnft+9/XVXfnqp7SZKkyTHRI0mSxvKeFZSf\n0B6b00xfeh1NMmMeTUJlp94LquqBJN+gSUCc1Kd+WZI3AgcBbwfeBCyhmY50NI8uyjxhVXV9kh2B\no4BP0oyAOQM4eBLr83Q7guaZl1tTp6puTvIS4B9okmH70yS5FtKs69NtL+B4muTQHcAC4Fzge4MG\nUVXz2+lp76NZC+k24IvAYVX14MQfa4W2B/6xp6zz/qs0O41JkqQhyhROgZckSRpTks/STCnavKqW\nDDseSZKkNY1r9EiSpNUiyXo0o19OM8kjSZK0ajh1S5IkrVJJNqNZl2ZX4Ik007AkSZK0CpjokSRJ\nq9q2wCnALcABVTXh7cwlSZI0GNfokSRJkiRJWkO4Ro8kSZIkSdIaYriJnmQ+SfUclw01JkmSJEmS\npGlqFNboWUizQGPHQ8MKRFIfyTuB44cdhiRJkiStZc6j6uUTvWgUEj0PUXXTsIOQtEIvABYB7xt2\nIJIkSZK0Frl9MheNQqJna5IbgPuBnwKHUnVd3zOT2cDsntKlVC1dtSFKa7W5wEKqzhp2IJIkSZKk\nsU0+0ZNsDewMbEbvWj9VHxuwlQuAfYDLgS2AjwI/ItmOqrv7nH9oe063w4H5g4YtacLmAt8ddhCS\nJEmSpPFNbnv1ZD/gC8DvgZuA7kaKqu0nF03mANcCH6BqQZ96R/RIq1MSYAlwCFVHDzscSZIkSdLY\nJjui5++BD1P1qZWOIHkv8CFgc+DXwGLg2X3PbRI6JnWk1WdzYD3g6mEHIkmSJEka32S3V38CcOpK\n3z3ZAziKZvrV9jQ7cD0X6DdtS9LqN7d9NdEjSZIkSdPAZBM9pwKvXdmbf5l9j7+XDc4EzgXmAFsB\ndQ8bbpjw9pVtX9JKe2b7aqJHkiRJkqaBya7RcyjwAeAM4BLgweXqqz4/QBuzZvLQ0sVsecfm3LwB\ncCtwHrDuYp6y/lNZ/LoqZvZc02+Nnq8Dyyb+EJIG8ExgU6o2HXYgkiRJkqTxTXaNnncC9wA7tke3\nAsZP9MCTivALtt/zf9d/nflIafLp63jaLsCdfa7pv+tW1fxBA5ckSZIkSVpTTW5Ez8reNPxyFkvX\neYh1tluf+xbdy0Z3deq25HdPvYM5m9zLRt+sYveeC911S5IkSZIkaQUGH9GTHAV8hKp7269XpKg6\neJzWTi8ys8h223LpJT/jJb/qVLyBb71hGy6/7kCOnvf
Ylt11S1rdEvYEThl2HJIkSZK0ljmvipdP\n9KKJTN16EbBu19crMu4QoSoOh1l8PgfstQ8n3Pj4uutwAJIZwH7AMX9TRz8wgdgkrTp/ClwP/MOw\nA5EkSZKktcgtk7loKFO3Hr179gC+CrwLuBA4ENgdeC5VNw8vMEkdCd8GUsVfDDsWSZIkSdLYhpro\nSVgWlhEqnbJlzFgGzdvH7LolabVL+A1wbhXvH3YskiRJkqSxTXbXrany5mJGd6ppXZppYXvz2N21\nJK1mCQHmAl8ZdiySJEmSpPENNdFTxel9iv89YSGwB7BgNYckaXmbAhsAVw87EEmSJEnS+GYMO4AV\nOB941bCDkMTc9vWqoUYhSZIkSRrIyCV6EtYHDgAWDzsWSY8kehzRI0mSJEnTwFCnbiXczvLbsQd4\nHLAEmDdgG19eBaFJamwH3FbFXcMORJIkSZI0vmEvxnwQyyd6lgG3AhdUcfuAbWw35VFJ6valYQcg\nSZIkSRrMULdXB0iYA+wLPK8tuhRYUMWdw4tKkiRJkiRp+hlqoidhB+As4H7gwrb4j4H1gddW8Yth\nxSZJkiRJkjTdDDvR8yPgSmC/Kh5qy9YBvgw8s4pXDC04SZIkSZKkaWbYiZ77gBdVcVlP+bbARVVs\nMJzIJEmSJEmSpp9hb69+F/C0PuVbAXev5lgkSZIkSZKmtWEner4BLEjYI2Gr9ngrzdStrw05NkmS\nJEmSpGll2Nurf5Bme/UTu2J5EPgCcMiwgpIkSZIkSZqOhr69OkDCBsCz2reLqlgyzHgkSZIkSZKm\no5FI9EiSJEmSJGnlDXuNHkmSJEmSJE0REz2SJEmSJElrCBM9kiTpMZLsk6SS7DDsWKZKkuclOSvJ\nPUluS3JSkk0HvPaa9vP45z51O7V1u0591KMjyQZJ3pvk7CQ3Jrk7yS+TvCfJzGHHJ0mSGiZ6JEnS\nGi/JU4EfAs8GDgP+Cfhz4HtJZk2gqf2SPGUVhDgdPBP4ZyDAUTS7p14N/AvwlSHGJUmSugx7e3VJ\nkqSVlmQGMKuq7l/BKYcBGwIvrqrr2msuBL4H7AN8cYDbLAS2AQ4BDljZmKdCkg2qanXtVnoT8Pyq\nWthVdnySrwB/leQfq+rK1RSLJElaAUf0SJKkSUkyK8nHkvw8yZ1J7k3yoyQ7d52TdtrTf/a5fr32\nuuO7ymYnOTzJlUmWJrk+yaeTzO65tpIck2SvJAuBpcDrxwj3LcB3OkkegKo6B7gC2H3AR74GOJEB\nR/Uk2TLJV5Lc3D7LwiTv6DmnM0XuGT3lnelgO3WVfT/Jb5K8OMkPkywBPtFVv397j6VJbkhybJI5\nPe122tg2yblJliRZnORvx3ueqvp9T5Kn4z/a1+eN14YkSVr1TPRIkqTJejzw18D3gb8D5gObAt9N\n8kKAqirgZGCXJJv0XP+XbRsnwyOjcr5FMyXo28D7gdOBg4Bv9Ln/K4HPtnV/Q5OIeYwkWwKbARf1\nqb4QeNH4j/qIj9OMiD5krJOSPBk4H3g1cEwb35XAgiQHTuB+vZ4InAn8CjgQOLe933zgWOAG4GDg\nNOBdwNlJ1u1p4wnAWcCv23MvAz6VZJdJxrR5+/r7SV4vSZKmkFO3JEnSZN0OPKOqHugUJPkSTeLg\n/cC+bfGJwIdpRs4c13X9PJrkzHnt+z1pEiM7VlWnjCS/AY5L8tKq+knX9dvQTCW6dJw4t2hfb+xT\ndyOwSZLZVbV0nHaoqquSnEQzqufIqurXJjQJoZltfH9oy45L8jVgfpLjq+q+8e7Xx+bAu6uqexTU\npsChwNnALlW1rC2/jCbJNA/41642ngK8vapOas9bAFxL019nTiSYdn2jA2nW6vnZJJ5HkiRNMUf0\nSJKkSamqhztJniQz2hE769CMnNm+67wrgAuAvTpl7bm7AKe0o34AdgN+C1yW5EmdA/iftv6RKWGt\nHwyQ5AFYv33tl8i
5v+ecQRzBGKN6koRmqti327fdz/JdYGO6Pp8JWsrySRtokmOzgM91kjytLwF3\n0Sw63e0e2lFUAG0fXkiz2PJEHQNsC7yvqh6axPWSJGmKmeiRJEmTlmTvJBfTJEz+ANxKk1jYuOfU\nE4GXJXl6+343YF3gpK5ztgb+qG2j+7iird+sp82rBwyzM3Jmdp+69XrOGVdVXUUT9zuTbNHnlE2B\nOcA7eeyzdJI0vc8yqMXdI6hanc/08p44HwCu6qrv+F1Xcq3jdpopXQNL8iFgP+AjVfVfE7lWkiSt\nOk7dkiRJk5JkHnACzTo6nwFuAR6mmUb0rJ7Tv06zns5eNAsIzwMuqqru5MQM4BLgAyu45fU97wdN\nznSmV/VLymwB3DbItK0eHwfeRrM20ek9dZ0/pJ0MfHUF11/cvvYmXDpmrqB8MtO9ej28gvIM2kCS\nfYBPAcdV1RFTEJMkSZoiJnokSdJk7UozYuTN3SNEkhzee2JV3ZbkDGCvJKcAL6NZ26XbIuAFwH/3\nGXEyaVW1OMmtwA59ql9Cs7DxRNtclORkmgWPL+ipvhW4G5jZ7uw1ltvb1zk95b2jcMZybfu6DU1/\nAI+snzMXGC+GCUnyf4AvA98E3juVbUuSpJXn1C1JkjRZnZEhj4wESfInwJ+t4PyTaNZz+Ux77dd7\n6v8N2JJmOtBykqyfZMOViPU04C+SbNXV5quA5wCnTrLNI2imny23NXlVPdze7y1Jtuu9qF08uWNR\n+/qKrvqZNNO+BnUO8ABwQLs+UMe+NFPozphAW2NK8gqafvshsFfPmkCSJGkEOKJHkiSN5R1JXt+n\n/GjgO8Cbgf9oR+vMBd4NXAps1OeaM2jW8dkNOLOqbumpP4l2Z64kOwM/ppnC9Ny2/HX03yJ9EJ9o\n73tukqPb+D5EM1Wsd3HjgXSN6tm7T/UhNItHX9DuRHYpsAnNIsyvbr+mqhYmOR84sl2g+jbgrUzg\nZ7SqujXJkcBHgbOSfItmdM/+NDthnTzW9YNq11f6Fs10s38Hdls+r8TFVXVxv2slSdLqY6JHkiSN\n5T0rKD+hPTanmb70OppkxjyahMpOvRdU1QNJvkGTgDipT/2yJG8EDgLeDrwJWEIzHeloHl2UecKq\n6vokOwJHAZ+kGQFzBnDwJNbn6XYEzTMvt6ZOVd2c5CXAP9Akw/anSXItpFnXp9tewPE0yaE7gAXA\nucD3Bg2iqua309PeR7MW0m3AF4HDqurBiT9WX3N5dJHtY/vUH86jaw9JkqQhyRROgZckSRpTks/S\nTCnavKo1603GAAANA0lEQVSWDDseSZKkNY1r9EiSpNUiyXo0o19OM8kjSZK0ajh1S5IkrVJJNqNZ\nl2ZX4Ik007AkSZK0CpjokSRJq9q2wCnALcABVTXh7cwlSZI0GNfokSRJkiRJWkO4Ro8kSZIkSdIa\nYriJnmQ+SfUclw01JkmSJEmSpGlqFNboWUizQGPHQ8MKRFIfySHAkcMOQ5IkSZLWMudR9fKJXjQK\niZ6HqLpp2EFIWqGtgV8AOw87EEmSJElaizw8mYtGIdGzNckNwP3AT4FDqbqu75nJbGB2T+lSqpau\n2hCltdpc4Eqq7hp2IJIkSZKksU0+0ZNsTfMX/s3oXeun6mMDtnIBsA9wObAF8FHgRyTbUXV3n/MP\nbc/pdjgwf9CwJU3YXODCYQchSZIkSRrf5LZXT/YDvgD8HrgJ6G6kqNp+ctFkDnAt8AGqFvSpd0SP\ntDol69CMtnsvVccPOxxJkiRJ0tgmO6Ln74EPU/WplY4geS/wIWBz4NfAYuDZfc9tEjomdaTVZytg\nJnD1sAORJEmSJI1vsturPwE4daXvnuwBHEUz/Wp7mh24ngv0m7YlafWb276a6JEkSZKkaWCyiZ5T\ngdeu7M2/zL7H38sGZwLnAnNoRg/UPWy4YcLbV7Z9SSttLs3UzP4LpEuSJEmSRspk1
+g5FPgAcAZw\nCfDgcvVVnx+gjVkzeWjpYra8Y3Nu3gC4FTgPWHcxT1n/qSx+XRUze67pt0bPNRN/AEkDWh+4laqn\nDTsQSZIkSdL4JrtGzzuBe4Ad26NbAeMneuBJRfgF2+/5v+u/znykNPn0dTxtF+DOPtf033Wrav6g\ngUuSJEmSJK2pJjeiZ2VvGn45i6XrPMQ6263PfYvuZaO7OnVb8run3sGcTe5lo29WsXvPhe66JUmS\nJEmStAKDj+hJjgI+QtW97dcrUlQdPE5rpxeZWWS7bbn0kp/xkl91Kt7At96wDZdfdyBHz3tsy+66\nJa1uCR8A/t+w45AkSZKktcx5Vbx8ohcNPqInORd4E1V3tF+vSFH1ykGa/HwOWLQPJ3z38XXX/u09\nZtAs+noMVZ8cLDBJq1LCF4E/A94y7FgkSZIkaS1yXxXXT/SioUzdevTu2QP4KvAu4ELgQGB34LlU\n3Ty8wCR1JHwPuLOKXYcdiyRJkiRpbENN9CQsC8sIlU7ZMmYsg+btY3bdkrTaJVwJ/EcVHxp2LJIk\nSZKksU12162p8uZiRneqaV3gRcDePHZ3LUmrWcJM4GnA1cOORZIkSZI0vqEmeqo4vU/xvycsBPYA\nFqzmkCQtb0uaBKyJHkmSJEmaBmYMO4AVOB941bCDkMTc9tVEjyRJkiRNAyOX6ElYHzgAWDzsWCQ9\nkui5ZphBSJIkSZIGM9SpWwm3A92rQQd4HLAEmDdgG5eugtAkNZ4I3FjF/cMORJIkSZI0vmEvxnwQ\nyyd6lgG3AhdUcft4FyeZDfwbcGRVLV01IWoqtX12KPbZtNDpr4TZ9tf04PfY9GJ/TT/22fRif00/\n9tn0Yn9NP/bZ2mGo26sDJMwB9gWe1xZdCiyo4s7xr83jgTuBjavqrlUXpaaKfTa92F/Tj302vdhf\n0499Nr3YX9OPfTa92F/Tj322dhjqGj0JOwBX0ozs2aQ9DgIWJWw/zNgkSZIkSZKmm2FP3fos8G1g\nvyoeAkhYB/gy8DngFUOMTZIkSZIkaVoZdqJnB7qSPABVPJTwaeCi4YUlSZIkSZI0/Qx7e/W7gKf1\nKd8KuHuA65cCh7evmh7ss+nF/pp+7LPpxf6afuyz6cX+mn7ss+nF/pp+7LO1wFAXY074PPAm4IPA\nT9rilwGfAU6r4sBhxSZJkiRJkjTdDHvq1gdptlc/sSuWB4EvAIcMKyhJkiRJkqTpaOjbqwMkbAA8\nq327qIolw4xHkiRJkiRpOhqJRI8kSZIkSZJW3rAXY5YkSZIkSdIUMdEjSZIkSZK0hpjWiZ4k701y\nTZL7k1yQ5CXDjmltlOQVSb6d5IYkleSNPfVJ8rEkNya5L8k5SbbuOWe9JMcm+UOSe5KcluTJq/dJ\n1g5JDk3ysyR3J7klyelJtuk5xz4bEUnek+TiJHe1x0+T7NJVb1+NuCSHtP82fq6rzH4bEUnmt/3T\nfVzWVW9fjaAkWyY5uf3M70tySZIduurttxHR/qze+z1WSY5t6+2rEZNkZpJ/THJ12yeLknwkSbrO\nsd9GSJLHJflckmvb/vhJkj/uqre/1jLTNtGTZA/gKOBwYHvg18B3k2w21MDWThvSfP7vXUH93wIH\nAO8G/gS4l6av1us657PAXwK7ATsCTwG+uaoCXsvtCBwL/CnwGmBd4OwkG3adY5+Njt/R7EL4YmAH\n4H+A/0zyR229fTXC2h+y3gVc3FNlv42WhcAWXcf/11VnX42YJE8AfkyzU+suwLbAwcDtXafZb6Pj\nj1n+++s1bfmp7at9NXr+DngP8D7gee37vwXe33WO/TZavkzzvfU24PnA2cA5SbZs6+2vtU1VTcsD\nuAA4puv9DGAxcMiwY1ubD6CAN3a9D3Aj8MGuso2B+4G3dr1/ANi165zntm396bCfaU0/gE3bz/oV\n9tn0OIDbgH3tq9E+gI2AK4BXA98HPteW228jd
ADzgV+toM6+GsED+CTwozHq7bcRPoDPAVe2/WRf\njeABfAdY0FN2GnBy+7X9NkIHsD7wEPDnPeU/B46wv9bOY1qO6Ekyi+av2+d0yqpqWfv+z4YVl/qa\nC2zO8n11J02irtNXL6YZVdJ9zmXAddifq8PG7ett7at9NqLaodRvpRlF91Psq1F3LHBGVZ3TU26/\njZ6t00w/virJKUme1pbbV6PpDcBFSU5NMwX5l0n266q330ZU+zP8POAr1fwmaV+Npp8Ar0ryHIAk\nL6AZ6XhmW2+/jZZ1gJk0iZtu99H0m/21Flpn2AFM0pNo/mO+uaf8ZprMo0bH5u1rv77avOucB6rq\njjHO0SqQZAbNX9Z+XFW/aYvtsxGT5Pk0iZ31gHuAN1XVpUle2p5iX42YNiG3Pc2UhV5+j42WC4B9\ngMtpppV8FPhRku2wr0bVM2mmlRwFfILm++zzSR6oqq9iv42yNwJzgBPa9/bVaPok8HjgsiQP0/ze\n9eGqOqWtt99GSFXdneSnwEeS/JbmM/6/NAmaK7G/1krTNdEjaWocC2zH8utRaPRcDryQZvTVrsBX\nk+w43JC0Ikm2Ao4GXlNVvX9d04ipqjO73l6c5ALgWmB34LfDiUrjmAFcVFWHte9/2Sbm3g18dXhh\naQD7AmdW1Q3DDkRj2h3YC9iTZg2zFwKfS3JDm0zV6Hkb8BWapUweBn4BfI1mpI7WQtNy6hbwe5r/\ngHtXAX8ycNPqD0dj6PTHWH11EzAryZwxztEUS3IM8BfAzlX1u64q+2zEVNUDVXVlVf28qg6lWfz8\nb7CvRtWLgc2AXyR5KMlDNIsaHtB+3fmLmv02gtq/Zl4BPBu/x0bVjcClPWW/BTpT7uy3EZTk6TRr\nln25q9i+Gk2fAT5VVV+vqkuq6iSahXoPbevttxFTVYuqakea9QG3qqqX0EzFugr7a600LRM9VfUA\nzeJSr+qUtVNQXkUzvUGj42qafxy6++rxNKu9d/rq5zQ7Z3Sfsw3ND2z25xRrt1c8BngT8Mqqurrn\nFPts9M0AZmNfjar/ptnx4oVdx0XAKe3XnR+67LcRlGQjmiTPjfg9Nqp+DGzTU/YcmpFYYL+Nqr8C\nbgHO6Cqzr0bTBjSL+3Z7mEd/d7TfRlRV3VtVN7a7E74O+E/sr7XTsFeDnuwB7EGz4NTeNNv+HU+z\nreaThx3b2nbQZI47v8wUcFD79dPa+r9r++YNNL/8nE7zi856XW18geYHtJ1p/hr+E+Anw362NfEA\n/gW4g2aEweZdx/pd59hnI3IARwKvAJ7R9sWRwDKaaUH21TQ56Np1y34brQP4p/bfw2cALwW+B9wK\nbGpfjeZBsybPg8BhNEm5PWm2Ct6r6xz7bYQOmgTBtcAn+9TZVyN20Kyh9Dvgz9t/G9/U/rv4Kftt\nNA+apM7raRZefg3wK+B8YF37a+08hh7ASgUP72v/Y1xKs5jinww7prXxAHaiSfD0Hie09QE+RpNJ\nvp9mNffn9LSxHs16Mbe1P6x9E9h82M+2Jh4r6KsC9uk6xz4bkQNYAFzT/jt3S9sXr7GvptfBYxM9\n9tuIHMDXgRva77Hfte+fZV+N9kEz9fiStk9+C+zXU2+/jdABvLb9WeM5fersqxE7gMfRbNZxLc3O\nTYtotumeZb+N5kGzrtKi9v9lNwLHABvbX2vvkbZTJUmSJEmSNM1NyzV6JEmSJEmS9FgmeiRJkiRJ\nktYQJnokSZIkSZLWECZ6JEmSJEmS1hAmeiRJkiRJktYQJnokSZIkSZLWECZ6JEmSJEmS1hAmeiRJ\nkiRJktYQJnokSZIkSZLWECZ6JEmSJEmS1hAmeiRJkiRJktYQ/z/dfmfzeb9+bgAAAABJRU5ErkJg\ngg==\n", 96 | "text/plain": [ 97 | "" 98 | ] 99 | }, 100 | "metadata": {}, 101 | 
"output_type": "display_data" 102 | } 103 | ], 104 | "source": [ 105 | "neuron_layer_0_traces = np.array([r['n1'] for r in traces])\n", 106 | "\n", 107 | "rend.render_figure([rend.IdentityNeuronTraceRenderer(neuron_layer_0_traces, \n", 108 | " 'Layer 0 Neuron')], 0, 1000)" 109 | ] 110 | } 111 | ], 112 | "metadata": { 113 | "kernelspec": { 114 | "display_name": "Python 3", 115 | "language": "python", 116 | "name": "python3" 117 | }, 118 | "language_info": { 119 | "codemirror_mode": { 120 | "name": "ipython", 121 | "version": 3 122 | }, 123 | "file_extension": ".py", 124 | "mimetype": "text/x-python", 125 | "name": "python", 126 | "nbconvert_exporter": "python", 127 | "pygments_lexer": "ipython3", 128 | "version": "3.6.0" 129 | }, 130 | "toc": { 131 | "colors": { 132 | "hover_highlight": "#DAA520", 133 | "running_highlight": "#FF0000", 134 | "selected_highlight": "#FFD700" 135 | }, 136 | "moveMenuLeft": true, 137 | "nav_menu": { 138 | "height": "103px", 139 | "width": "252px" 140 | }, 141 | "navigate_menu": true, 142 | "number_sections": true, 143 | "sideBar": true, 144 | "threshold": 4, 145 | "toc_cell": false, 146 | "toc_section_display": "block", 147 | "toc_window_display": false, 148 | "widenNotebook": false 149 | } 150 | }, 151 | "nbformat": 4, 152 | "nbformat_minor": 2 153 | } 154 | -------------------------------------------------------------------------------- /jupyter_examples/example_2_three_izhikevich_neurons.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# Print 3 Izhikevich neuron traces" 8 | ] 9 | }, 10 | { 11 | "cell_type": "code", 12 | "execution_count": 1, 13 | "metadata": { 14 | "collapsed": false 15 | }, 16 | "outputs": [ 17 | { 18 | "name": "stderr", 19 | "output_type": "stream", 20 | "text": [ 21 | "/Applications/Anaconda/anaconda/envs/mlbook/lib/python3.6/site-packages/h5py/__init__.py:34: FutureWarning: 
Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.\n", 22 | " from ._conv import register_converters as _register_converters\n" 23 | ] 24 | } 25 | ], 26 | "source": [ 27 | "import numpy as np\n", 28 | "from spikeflow import BPNNModel, IzhikevichNeuronLayer\n", 29 | "import spikeflow.drawing_utils.trace_renderers as rend" 30 | ] 31 | }, 32 | { 33 | "cell_type": "markdown", 34 | "metadata": {}, 35 | "source": [ 36 | "# Create a model layer with 3 neurons in a single layer" 37 | ] 38 | }, 39 | { 40 | "cell_type": "code", 41 | "execution_count": 2, 42 | "metadata": { 43 | "collapsed": true 44 | }, 45 | "outputs": [], 46 | "source": [ 47 | "model_input_shape = (3,)\n", 48 | "\n", 49 | "nl = IzhikevichNeuronLayer.layer_from_tuples('n1', [\n", 50 | " IzhikevichNeuronLayer.C(a=0.010, b=0.2, c=-65.0, d=6.0, t=30.0, v0=0.0),\n", 51 | " IzhikevichNeuronLayer.C(a=0.015, b=0.2, c=-65.0, d=6.0, t=30.0, v0=0.0),\n", 52 | " IzhikevichNeuronLayer.C(a=0.020, b=0.2, c=-65.0, d=6.0, t=30.0, v0=0.0)\n", 53 | "])\n", 54 | "\n", 55 | "model = BPNNModel.compiled_model(model_input_shape, [nl], [])" 56 | ] 57 | }, 58 | { 59 | "cell_type": "markdown", 60 | "metadata": {}, 61 | "source": [ 62 | "# Run the model for 2000 timesteps" 63 | ] 64 | }, 65 | { 66 | "cell_type": "code", 67 | "execution_count": 3, 68 | "metadata": { 69 | "collapsed": false 70 | }, 71 | "outputs": [], 72 | "source": [ 73 | "traces = []\n", 74 | "\n", 75 | "def end_time_step_callback(i, graph, sess, results):\n", 76 | " traces.append(results)\n", 77 | " \n", 78 | "data = (np.ones(3,)*(7 if i > 1200 else 0) for i in range(0, 2000))\n", 79 | " \n", 80 | "model.run_time(data, end_time_step_callback)" 81 | ] 82 | }, 83 | { 84 | "cell_type": "markdown", 85 | "metadata": {}, 86 | "source": [ 87 | "# Extract the data we want and display (after 1000 timestep settle-down period)" 88 | ] 89 | }, 90 | { 91 | "cell_type": 
"code", 92 | "execution_count": 4, 93 | "metadata": { 94 | "collapsed": false 95 | }, 96 | "outputs": [ 97 | { 98 | "data": { 99 | "image/png": "iVBORw0KGgoAAAANSUhEUgAABI8AAAE8CAYAAABJk5F+AAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAAPYQAAD2EBqD+naQAAIABJREFUeJzs3Xm8HFWd9/HPL/sCJLIkARIEZUc0iBvwjHbYFBdUBHUG\nFAbFcXTkQXBhUTuNCygKro8KMiCLDjK8RIEBXKBQRwUBUQQB2XcIBEIg5Gb7PX+cU9xKpfv2cru7\nuvt+369Xvaq7tj7dfarq1K/OOWXujoiIiIiIiIiISDXjik6AiIiIiIiIiIj0LgWPRERERERERESk\nJgWPRERERERERESkJgWPRERERERERESkJgWPRERERERERESkJgWPRERERERERESkJgWPRERERERE\nRESkJgWPRERERERERESkJgWPRERERERERESkJgWPRERERERERESkJgWPREREpOPM7DAzczN7VdFp\naRcz28HMrjCzZ81ssZmda2abNLjuvfH3+FaVeaU478D2p7r3mNnuZvY7M1tmZo+a2TfNbL2i0yUi\nIiLDFDwSERERaZKZzQV+A2wNHA98FXgL8Eszm9TEpo4ws806kMS+YGbzgV8D04CjgR8AHwIuLDJd\nIiIisrYJRSdAREREpNeY2Thgkrsvr7HI8cB0YFd3vz+ucx3wS+Aw4PQGPuYWYDvgWODI0aa5Hcxs\nmrsv6+JHfgl4Cii5+zMxDfcCZ5jZvu7+iy6mRURERGpQzSMRERHpCWY2ycxONLMbzGyJmT1nZr81\nswWZZSw2+fpZlfWnxPW+n5k22cwqZnanmQ2Z2QNm9hUzm5xb183s22Z2sJndAgwBbxohue8CLk0D\nRwDu/ivgDuDdDX7le4FzaLD2kZltbmb/aWaPxe9yi5kdnlsmbR64ZW562hSulJmWmNnfzGxXM/uN\nmS0jBHPS+R+JnzFkZg+b2XfMbGZuu+k2djSzq2PTs4fM7FMNfJ8NgH2A89LAUXQO8CyN/44iIiLS\nYQoeiYiISK/YAPggkACfBhYCmwBXxuZNuLsD5wH7mdmGufXfFrdxHrxQe+jnwCeAS4CPARcDHwcu\nqPL5ewKnxXn/lxDcWYeZbQ7MAq6vMvs6YJf6X/UFXyTUBD92pIXMbDbwR2Bv4NsxfXcCZ5rZUU18\nXt5GwOXATcBRwNXx8xYC3wEeBo4BLgL+DfiFmU3MbeNFwBXAX+KytwFfNrP96nz2zoTvvtbv6O4r\nYnqa+R1FRESkg9RsTURERHrFU8CWMXgAgJmdQQhGfAz4QJx8DnACoWbK9zLrH0II+Pwuvv8XQrDl\nDe6eTsPM/gZ8z8x2d/ffZ9bfDtjZ3W+tk85N4/iRKvMeATY0s8nuPlRnO7j73WZ2LqH20UnuXm2b\nEIJM42P6nozTvmdmPwYWmtn33f35ep9XxRzgw+6era21CXAc8AtgP3dfE6ffRghcHQKcldnGZsD7\n3f3cuNyZwH2E/+vyET673u/4Ty18HxEREekA1TwSERGRnuDuq9PAkZmNizWL0popr8wsdwdwLXBw\nOi0uux9wfqydBHAQ8HfgNjPbOB2Aq+L8F5rDRdc0EDgCmBrH1YJDy3PLNOILjFD7yMyM0Ezukvg2\n+12uBGaQ+X2aNMTagSAIAbdJwNfTwFF0BvAMoWPwrGeJtb3ghZpD1wEvqfPZ9X7HZn5DERER6SAF\nj0RERKRnmNmhZvZXQvDgSWARIVgxI7foOcAeZvbi+P4gYCJwbmaZbYCd4jaywx1x/qzcNu9pMJlp\nDZ/JVeZNyS1Tl7vfTUj3h8xs0yqLbALMJDyFLP9d0sBP/rs06qFsTa8o/U1vz
8u9suhO36LhTrXjCD\ncANcyaPesj0hmbAExbyrWdnGEVpI/S5O6sh4N91szcp2EXCZ93ml9cVZZ88Bq4FpmenTgLnVPhCb\np6mJmkjxbA4Y8BQduoMdZlMJ39MDhKYtW9ZfXDpd7CthJ+AKwiOeVUOv+20P3Adsi+Ld7XYB7gAW\nomNgL0j6eFTyqLfsC/wRHcN7wa6Ehx/8CngHHRrvodQ8mgpcZ2V73Mr2VSvbjq0u1FB5n68A/gLs\nn0yLT6LZH7gpr3KJyJDsTbjD/gc6dAc7zJJ9cVLzSN9Z99sBGIdqHvUEK9t6hP3iVXGS4t3dkuTR\nImCSlW1kzuWR9tqXEOskeaRjeJeL/e7+C6Emio7h3W8vQh92N8b3HRnvppNH3ufvADYBTidUl7/D\nynavle0kK9sWrS3ekJwNHGtlO8rKti1wPjAR+F6+xRKRJr2DkPT9Bx26gx1mhxNOOmejk5Be8XpC\nbds7Ucx7wYGEu5aXxfeKd5eysm1CqImSbNvQof1jSMPeBfzM+3wl2p/3ij2B8cD1KOa9YC/gZuD5\n+L4j4z2kPo+8zxd4n3/H+7xEeBLEJcD7gAdbV7Sh8T6/AvgscBqhnfhOwJu9z7OdaItIQVnZJhAu\nlK5GVfYHFdtRH0ZoUuzoO+t6sY+rDwE3eJ8vQSeeveDtwGzv8/tR/xjd7iOEO9TX0J88Ury7lJVt\nO2A7+msV6hjeG95BSCTcjY7hXS0+PfNA+muZQYfGe506zI79LewGvI7wVKS2JWisbI9Y2TwznJBZ\nZnMr2zXAVwiZ3G8Ae3mf39KucolIWxxJ2IavJuxk11OV/bo+QLgrfXl8vwgYE5NK0p0OJNwc+Up8\nrxPPLmZlew3wXgZu44p3F7KyTSQkj77rfb6QDr/QkIacACwAfhPfa/vucla2zQnb+be8z9egmHe7\nzwErgQtS/S13ZLyHlDyysu1rZbuQkCy6hPCI4LfS/2SkdvkioclcMvx3qkwjCXdoxgBvAI4CjibU\nQBKRDmFl2wj4MvAD7/N/oCr7dVnZtge+BlwUaySALja6mpVtOnAhofpz8tQOnXh2qZhMuBh4jNA0\nHxTvrhRrFH4PmACcGydrf97FrGyHElpvfNr7fFmcvAgYH2/SS5eJfR1dRIjzV+Nk7dO7lJVtP+B4\n4Fzv8/lxcsfGeyhPW3sS2BC4jlBl/hcxgzYcXvQ+r/rUNOCNhCqfB8QmandZ2U4BzrKynRo70xaR\nArOyTSEkgQE+H8cL43gy4c6c8PJFxmHABcBDwKdSs9MXG2qy20WsbPsQLi5HAYfFZorQwSciUpuV\nbQdC4mh74E2Zi0vFu4tY2WYQ+ul8K/Au7/OH4iwlj7qQlW0M8GngS8BPgO+nZqdj/twwF03ayMq2\nFSFxtAfwVu/zF+Ms7dO7jJVtFPBvwH8Bvwf+MzW7Y+PddPIIOBW4MlalHW4nxITQY8APgXO8z1fF\neXsC92T6Nvo14UC8PaHTwbXE7O/YzOTlw5gQE+l5MRHyFuA8Qg2jA1LbcnIS1fNPHon7qz0IT2U5\nnJAwvxo42vt8cWpRXWx0iXjneTvgIOBgQjPxm4Ajvc+fSC26CJhsZbNUQkk6THxC7FbAAYQkwpsI\nDw3Y1/v8ttSiHXviKUE87s0E9iP0ffL/iDX5vc+vSS2q/XmXsLJtQOju4xDCzZ8NCLUJT8zst5U8\n6hJxn74lYZ/+FsJ2/hjhZsAfUotqn94FYk3h3QmxPpTQrc+FwKcyFVk6Nt5NJ4+8zy9sR0Ea8A3C\nI0vnE5qlnUFouvbpOH86a99hn5eaV8uJQF9mWpmQJBORNorNTd8OnAzsSsjMf9D7/JHUYumaR10n\nnlhMIiTNJsfxhvQ3z900DtsArwZGEg461wEf9T7/vyqr1cVGgcWLxnGEhOgUQpymAFMJzb+T4ZXA\ntoTm2EsIN0QOAX4e+0hIW0Q4po+Py0q
BxP7HNqA/5snrafTHexYhUTiR0DfCn4FjCR3hZ29oLYrr\nkIKJCd9km06GDQn78ZmEWM8kbNtTACc0QT0RuDBVEyGxjPB70P68gOJ5zPoMjPcUYCNCnDcn7Mu3\niAPA44QapN/1Pr+vymp1DC+4WHMs2c6T8UbAZoRtfDNC7LcnnOOtJuzTP0boliF7nO7YZEK3i+ds\nkxh4zrYBIb+QjverCOfpBjxNaElxvvf5HVVW27HxHkrNo5axsp0JfGGQxbb1Pr/P+/zs1LS7rWzL\nge9Y2U5cx1pCZ9Dfh0BCtY5E2sjKNhk4BjiOcDL1B8JdmRuq1JpITqJebWW7A1jcqpoV8YAwOg5j\n4lDtdbVp4wgX6uMyr6tNq/Z6PcKBYz3CgaaaZ4CnCAehXxP6wLgNuMv7fHWdf00nnlXERF02pulx\nvXnVlhlP6JtkfGrIvs9Om0C40BhTo5jzgSficDOhydJfgVtTTZaqScdcySMGbN9DjW9228/GtZFh\nIiEm2RrOiaWEC8kngL8Rmq/cA/wpU5swaxH9F6I9b4ixrrdssp9udhufRIh5NcsIsX6cUJvsV4Sn\nAt9W74nA3uduZevYC412iUmbZvbh9eYNZdseT9iX1+qP0QnH78eAR4FbCdv2bcA/qtwASNMxvIoq\n52zNxjk7rVZcq23j6WFyHFezGHgyDvcDPyXs2//kfb6oxmcgxHyslW3cIMf6nhG38XU5bqfHY2ku\nzsm8iYRtvNpDexyYSzh+P0lIFt1NqOxyTwPbeEdu37kmjwhtAC8ZZJmHaky/lVD+LQgb51xCc460\naXFcq58kUj2ev8zKdoyVa13Licg6mk546sAE4ApCvy2311l+IeEC66I4LLeyvUA4EV9G2H5XEhIw\nI+KQfj2CsK+olhBqRWeUa2L5llUZp18/l5n2IqGJwguEg0h6vBCY532+cohleiGOPxQT7TcOctLS\nsJiAacXFeB7LrNMTRjNWEeK4lJCsWVrl/XM1lllEiPHCzOv5Ve5GNiqpoXeqle0y4A7v85eGuK6X\nVbkob3Vc2hn7Vp7jrGZgjKsNiwjnG+lYL6E/vulhQRwvGWIyfCGwjZXtC8AvCRej69S3Y40Ls3VN\nwrRi2UY+08pYr6F/P11rG3+BEOvsb2Ax/bFda1iHGx8LgUOtbI8SOsmft643UWK80wmYdmzD7ZzX\nyhP1bLyrDQurTEvvv7PDolTXGs1K9ufHxX5T/tyK/TkMuIlSpFgWfZ+e3Q8k52npY/gi4Hnv8xcY\nmiTmZ1rZfgL8tUXH8GaTrEVappXb+ArW3o9n9/GLqkx/ierb+SLguXU4T18I/IuV7ZOEpNPD67C/\neFkq4dbIcP9Qfq/mHdo1gpXtCOAHwEbe5wusbAcRTqA28T5/Ji7zIUIv9hs3UzvJytaZX4pIZ1gN\nXAKc4n3+dCMfiI803YrQrGdjwt3dcYQ7CeMIO8E1cfDU6+T9yjisiMPKGuNm5i0Flq3DgaOtrGyf\nJXTUt02c9BzhJGgFIeG2inAiNor+k7Jar9MH91YmYNJxSY9rxaAoy6wqWr9CVrbxwFmENvabxMlz\nCReyyU2S1dSOb62Yt/pkvZHvf6jxate8lYPU9Bt2VrbtgNOBNxMS8WsIdz5foj/eRmPbd5Jcb3Ws\n89pu1+nzg9wtzoWV7V+BTxD6PIMQ5yfoj/VKQiKokVin9+etVIR99lA+v7qA+3MjdKnxfkJzVgj7\n82cI8V5B2MYa2Zcnr5OL8mo1KIaq0X16EY7bA+YVbTuPNfK/QmiaPjVOnktIMiTbudPceVs7EjC5\nx24Iny/iNr4boYucAwlxWkmolbqE/muOesfwavv00TQX7wO9z69vuuydkDyysu1JOGD+nnC3elgr\nZAAAGOpJREFUfk/gHOBa7/Oj4jIjCdV/nyI8pWk6cCnh8dUnNfn3VO1IpI2KthPvZla2bYEdCcm3\njQg
Jt+QkciUhibQq9To7Tl63+qSgcAmYbhDvKu9C6E/l1YTq1knMRzAwrtXinH7d6gRMoU7Wu4GV\nbQLh/OjVhJrY4wmxHktIKDUS61W0+IRfsW6P+ES23Qjx3oT+bXsMtWNbLe6tvlgr3MVZN4jXIzsC\nryXEfEP6Yz6K+vvvwWK+rhfl2s7bIF7P7gq8htDRdtLMvdoxvJl9+rqew2kbbwMr2/rA6wnb9+b0\nH8PHMPAYPljcVw5hmDNI8/jqZe6Q5NEuwLcIG9JY4GFCYujsdI2i62+wx0eNYEb6s6vWcPIB+/mX\nh7O8IiIiIiIiIiLdIu8+jxoSeyl//WDLjRrBauCLhEfiJdOyT60QEREREREREZEGdUTyqEkvlkpe\ns4PsrErFxrL2U1CWl0rr9AQ3EZHcVCo2gdDXTbZPiwXAs40O2g92jkrFXg/clJm8iOqxfa7a9FJp\nyJ11yzCrVOx04D9Sk16gemyfq/F6canUAVXPBYBKxe6jv/86CE1JasW22vY+v1QqVr9dUlulYpsQ\nuuHIeon+uD5D7Zgnw0vazjtDpWJvAq5LTVpJ/f159v3z2sY7R6Vi3wb+PTVpOYPv09PT5pdK+TQb\n7cbk0QmVip1CeDTmD4FzSqW6vZefSOiULq1M6MRKRKQTvZaQODoEmEfofLHasHPq9YTsSioVe5Hm\nkk0teRqMDMkehJOPfYFXsHasNyL8LpL3k7IrqFRsCYNfhKbfL9KFSW5eB1SArxFim431tsDe8X21\nR4kvr1Ss1olqtQsUJR9yUqnYZELi6GTgZqpv29Pp3743Yu0HG6ypVGw+jSeTnyuV1u3pfbJOdo7j\nvQj79VrH8G2Af6H2dr6sUrFsbOslnV7QPj03rwOeB95N9X36NGCH1Ptsx+deqVj2BmHd5FOp5Mva\n+y9JHXsAPyV0y5ON9VRgU0J/Z1MJ53SN7tNrJp1atU/vtuTRN4A7gPnAG4AzCB0KfrrOZ84Azs5M\n0912EelkOxOegnJdoycHsbZSrRPUZKh7QVqp2FLiHTBqP7641rA4r7soXWJn4G+lkmdrH1VVqdg4\nBp6oVDt5mUU4wZlK6Kg1a2VMQDxHf8zrPro69foFJSOGplIxI8T7vFLJr2lg+TH0x7TaRUn6RHUj\nqicfvFKx5+k/MV3AwHhm4z5gXqlUzKdSdogd4/gXpZLfM9jClYqNADag+radjvmWqWnZGvhUKpbU\nXHyeteNdb9teCCxTEmKd7Ez4zm9q9HuMLSlqxTwZtgB2p/Y+fUXcp6eP4QuoHff0oH36utkZuLNU\n8hsGWzAeA6aw9jadfb1L6n21G4TpmmzzqX0MrzZtqbbxoYnH5B2Ai0sl/10Dyyf79Gr78fQw2D49\nXUN5PnByqeR3Nlv+wneYXanYmcAXBlls21LJ76tUbHPgfMKd18XAX4ADgElqfiHSmeJB8uvAxoTt\nejGh6nYj48WEats9dQe1UrELgD1LJf+nNv6NeieqrwAmE05ussP4Gqtcw9oXoy+mhsVNvO+5qvqV\nit0F3FYq+bFtWv8owsVGrYuSarGeAkyss9oXGBjvJObpWDby+qVeumipVGwG4ZG+7yiV/OdtWH+t\n5EM6AZWOcbKtr1dntUtY+2LkBQbGslp815o2SG3yrlOp2CeAs4D12pGEi8fYSQxtf17rse8rWPui\ns5F9d7XXy3twf/4/wIalku/Xxr8xiuq1VJPEUq19er3tPLtPr3fMrjdtcS/t0wEqFXsEuLJU8s+1\naf3JDcJaiaYNqL6d16posoq19+nZY3hD+3R6LNlcqdhOwJ3AP5dK/qc2rD/Zp9e7QbgB8MVGbkhk\nDSl5VKnYF+vNL5X8tKZXWvtvJQeujxN+YNOAQwl3QxMPEe6y3wXMBT5HqHF0OeHLeU2p5Pe3qkwi\nMnwqFZtF2MbvILQBn0S4IE3GtZIRaStpPOGUTjwtAZbVGJZm3i8vS
s2ZSsVuBf5eKvlReZclKyad\nal2IVDtJnRTH66XeTwKszp9xQhxfTi4QYrmEELclmSE7rd4ySwm1U5cDK4pwwhPvYi0GPlkq+bfy\nLk9apWKj6Y/3YHGfzMB4T0qNszVhspYw8KS0kdg2+jqJeVHi/Tbg58ArSyV/LO/yJOLF6PoMjHM2\n5un36zMwxkncayUkEstZ+2KkXhwHi3N22jIKEmuASsUuAXYolXy3vMuSFi9QJtLYtl1tf568HjfI\nn1pFJllMa7fv5YTjd2GSkpWKzQF+Vip5vZYTuYjbeTPH8GrH8fVobJ+ejXm1Y3Oz07PH8NzP2yoV\n24BQE+SIUsl/mHd5EnEbn0DteFfbv1fbpw+2ja+m/7w7fQwfyv682rLJOXohtvFKxT4AXAysXyr5\n4rzL06yhJo+yVZxGE6q3rwLmlEq+SwvKVutvHw2cWyr5lMz0g4BfApuWSj4vTvsu8AFgWqnkz9RY\nnzrMFimwSsUOIbQL3qRaZ/iVio0kHNySpMLEIY6z0+rVmKhlBdUTS8mwMjWsyLzPDrXmryEkTtID\ncTyacFfh08AJpZKfO4T/ofBizYgJ1E4uZU9SJ8RhfOp1rWlrVfUdxAr6T0SzQ715ybCScOxcPcg4\n/XoEIdbJsBkh5m9otNlaJ4knsOOpffGZTTTVi3n2/WAntVkraTzWtaatJMQyPawa5P2oWNZxhNrV\newAbFSXB0Sox1mOpHtNq05JxEtfsOPu6GSsYGMNsPGvNy77OxnqwYQ2hz7qxhHh/nNAEuS21CvMW\nE8zJMbjetp2+eVArvtnte7BEZNoaGt9/15qX3p83un0nv/lkmAScBLy/VPJLmyh/x4jb+TgGP3Yn\n05Ibhdnjd619fDNq7dMbPY4n+/PB9uGrMq/Tx/FXA8cB25dKPrvJ8hdeTDimt/FG9um1tu3stNFN\nFCW9jdfbnw/2vto2PtgwkrBfHwMcDEwvlTz9EISOMaQ+j0ol3zk7rVKx9YFLgP9dxzIN1WGEWkfT\nKxWbCOwJvD3O24zQQVw16jBbpNh2BubVeopirNqc3KlomZigSE7exxEOUuPqDPXmJ/PSF/zrZ95n\nhzE1pifJIs8MEA5ozwK3A4P2hdKp4p3C5C5VS8Vk5DgGTzClhzFVplUbJhKaA6SnjSacVIzKjKtN\nSx+zV9OfUFwK/JFQ+7brxARJckdxXivXHbfzJN61Ek3jGTzetX4D61eZlo1ztffJtOTuvDMwEX15\ntyWO4OVYJ//js61cdyoxNVjCoV5c672fVGN+Nq7ZYUTmNYTtO7lQeRa4spXfRZHEpngL4tBSMTFV\na18+nrBPbnR7zk6bXGVadn8+2OtEOoGxBPgdcH1Lv4wCidt5Uquz1vXZkKTO3eolmho5XlcbxhNq\n2KR/D83GPJ3QXEWI/TLgRuCBVn4XRRFr/CTN21oqbuPjqX/jYDxD287XIzTxyi5X7Xhdb/+eSJKN\ni4Bvt/abGD4t7fOoUrHXEjr026JlK+1f98nAW4BdgTGlkltm/lXAO2t8/P+VSn5tjfWq5pFIgVUq\n9nNgdKnkB+VdFpG8xeSWF6GqvbRfTHiMANZ0Y7JIBorxNm3fvSEmOlC8e4uO472jG4/hrX7a2uQ4\nNKSZzrAJWb8rCZ2xvanKcs/H8QeA6+Lr8YS+UmqKSSIlikSKa2fgsrwLIVIEvdaJaK+LJ5uKeY+I\n8e6KCwwZnJIHvUnH8d7RjcfwISWPKhU7PjPJCB1Uvw+oWsOnhv8iNHWr5yGAUsn74t/evsZycwFO\nP/3ynW644b0/cmd57GgXYK4ZY4D3uPODwQpVqVgzHY7W67S1l5YtSjmKsGxRytFpy0Ko/jsxNawH\nzKBLm+OIiIiIiIh0gqHWPPpU5v0aQpvs7wNnNLqSUsmfpXVt2m8CqFQO/8RVV01/X6Uybw4wm1BT\naTbhIvR7MDB5VKPZ2hsI/1Mjm
rlD1Oiy7Vhnr//9bvyf8v777VjnKsI2O5f+p2U9SXiykIiIiIiI\niORgqB1mzxp8qdaqVGxzQkejG8b3O8VZD8bH3P0GmOdu0555ZvMvbLjhvNcDxwC/jU3TXkPooCqr\naofZpZKf2oZ/Q0RERERERESko7S0w+xmNdPnUaVilwBHVZm/b6nkFTPuHD162ehVq0Zvv8UWs91s\nzepVq8YsHzly5ZiHH/6nvwOzgOvceXemDOowW0RERERERESkhryTR1OBVwyy2EOlkq9IfeZo4NxS\nyaekFzJ7ufZQH6EvpcXbb/+nrXbf/TdHXH75iaesXDluDnCVOysQkY5gxvbA3wj9lV2Rd3k6gRlb\nAnOAD7gP2qecdAEzjiB0Kr+Ne3c+6lf6mXElsIM72+ZdFmkvM0YQHmd+tjsn5l0eaT8zdgNuA97h\nrib7vcCM44BvAFPdeS7v8kh7mfE7YJw7e+VdlqFo9dPWmtLKPo/cKQOY8QhwhTvLKpV/PhlY8L3v\nlb/Uir8hIsNuRmYsg9N31nvSMVfyqPvNAGaaYe56MleX24jwtGHtz3uHjuG9Jx1zJY+63wxgXN6F\nGKpck0fNSPV5tDkwMtvnUaVibwOm/f733AzMqFQ4EDgJ+Fo+JRaRFpiZGcvg9J31HsW8t8wkPI1y\nMrAw57JIe2nb7j2Kee9Jx1xPF+5iZhghzmPMGOnO6rzL1KyOSR4BpzGwz6M743hfoAKsBD62336r\nLzRz3MHdHOw/gP9IPuTOyOEqsIisM92Ba56+s96jmPcIM0YBm8S3M1DyqNtp2+49innvUcx7xwbA\n+Ph6GvBUjmUZko5JHpVKfjRwdJ351xE6xD44U417NLAzIfGUfaqaiBSbDqjN03fWe3SnundMB0bE\n1zMIfcJJ90q26RlqptgzdAzvPTqG944ZmddKHuXNnaurTP4fM+4FDgcuHuYiicjQ6YDaPH1nvUcX\nG71jZo3X0p2SbXosof+jlvQTKoWmY3gPiZ3ibxbf6hje/bLH8FvzKshQjRh8ka5xM7B/3oUQkabM\nABYDm5gxOu/CdIjkO9vAjIl5F0bay4yxwMaEmOtio/slFxeL0YVGL5hJiHXyWrpfcgyfEftHke62\nMaGVjI7hvWEGsAZYRocew3sieWTGeOB44Mm8yyIiTZlByMobobmGDC75zqD/bpZ0ryTGt9KhJyLS\nlBnAS8BsFO9ekN6fK95dLlUL5VZCbbNX5FsiGQbJdq1jeG9Imqo9TofGu+uSR2YsMGN+algAvAh8\nEPhczsUTkQaZsT6wPvDnOEl3ZAYRa6FMQ99ZL0lifBOwoRkT8iyMtN1M4AnCiae27+43E7iD8FAY\nxbv7TQXGoGN4L0kfw1XbrPvNJBy/n6BDt++u6/MI+BQM6FBwDaGN+C3uLGhkBWbMbUfBRKQpyZMR\nk5Ooa8xYnldhOkRyQ+Bmwn7wKjOW5Vgeab9xcZxsJ4+add6jX6Vhk4E/Ao8Bx+l8petNAx4lXGic\nZcYpOZdH2iu5Lkv25zfovKfrjQeWE5LE44C5ZuoYv4ttAFwNLAXem/Mx/DB3bmz2Q+befb9PM6YA\nxwDbxkmzgYvdWTT4Z20scCJwhrtrh11wilfnUcw6i+LVeRSzzqJ4dR7FrLMoXp1HMessilfv6Lrk\nkRm7AdcROqJK2onvTsjsvtGdO+p/3tYHFgGT3f2FdpZV1p3i1XkUs86ieHUexayzKF6dRzHrLIpX\n51HMOovi1Tu6sdnaOcAvgGPdWQVgxijgIuBcYO8cyyYiIiIiIiIi0lG6MXm0G6nEEYA7q8z4CnB7\nfsUSEREREREREek8Xfe0NeAFYPMq02cSnromIiIiIiIiIiIN6sbk0RXAxWYcbsbMOLyH0GztRw18\nfjlQjmMpPsWr8yhmnUXx6jyKWWdRvDqPYtZZFK/Oo5h1FsWrR3Rjh9ljgK8CH6a/Wd5K4HzgBHf
9\nqEVEREREREREGtV1yaOEGROAV8W3c9xZkmd5REREREREREQ6Udcmj0REREREREREZN11Y59HIiIi\nIiIiIiLSIkoeiYiIiIiIiIhITUoeiYiIiIiIiIhITUoepZjZx8zsETNbZma3mNkeeZepF5nZ3mb2\nCzN7yszczA7OzDczO83MnjazpWZ2vZltlVlmnJl908yeN7PFZnaVmU0b3v+kN5jZiWZ2m5m9aGbP\nmNnVZrZNZhnFrEDM7CNmdreZvRCHm8zsoNR8xavAzOyEuG88NzVNMSsQMzs1xig93Jear3gVjJlt\nZmaXxe97qZndY2a7peYrZgUSz9ez25ib2TfjfMWrQMxspJmdbmYPx3jMMbNTzMxSyyhmBWNm65nZ\nuWb2aIzJn81s99R8xazHKHkUmdnhwNlAGdgF+CvwazPbONeC9aaJhO//YzXmfx44Hvgw8DrgJUKs\nxqWWOQd4G3AYsA+wKfDTdhW4x+0DfBN4PXAgMBr4jZlNTC2jmBXLE8AJwK7AbsANwM/MbPs4X/Eq\nqHjS9u/A3ZlZilnx3Atskhr+OTVP8SoQM9sA+BOwEjgI2A74DLAgtZhiViy7M3D7OjBOvzKOFa9i\n+QLwEeDjwLbx/eeB41LLKGbFcxFh23of8FrgN8D1ZrZZnK+Y9Rp31xCeOHcLcF7q/QjgSeCEvMvW\nywPgwMGp9wY8DXw2NW0ysAx4T+r9CuDQ1DKviet6fd7/U7cPwNT4Xe+tmHXOAMwHjlG8ijsAk4AH\ngAOACnBunK6YFWwATgXuqjFP8SrYAJwJ3FhnvmJW8AE4F3gwxkrxKtgA/BK4ODPtKuCy+FoxK9gA\njAdWAW/JTP8L8CXFrDcH1TwCzGwM4Q789ck0d18T3++ZV7mkqlnAdAbGahEh+ZfEaldC7Zf0MvcB\nj6F4DofJcTw/jhWzAotVyd9DqPF3E4pXkX0TuMbdr89MV8yKaSsLza8fMrPLzWzzOF3xKp63A7eb\n2ZUWml/faWbHpuYrZgUWz+OPBL7r4epU8SqePwP7m9nWAGa2I6E25rVxvmJWPKOAkYRkUNpSQuwU\nsx40Ku8CFMRGhI1jXmb6PEJ2VIpjehxXi9X01DIr3H1hnWWkDcxsBOHu35/c/W9xsmJWQGb2WkKy\naBywGDjE3Web2RviIopXgcQE3y6EphpZ2saK5xbgaOB+QpOaPuBGM9sBxauItiQ0qTkb+DJhO/uG\nma1w9++jmBXdwcAU4JL4XvEqnjOB9YH7zGw14brrZHe/PM5XzArG3V80s5uAU8zs74Tv+V8JSZ8H\nUcx6kpJHItJK3wR2YGDfHlJM9wM7EWqKHQp838z2ybdIUo2ZzQS+Dhzo7tk7gFJA7n5t6u3dZnYL\n8CjwbuDv+ZRK6hgB3O7uJ8X3d8ZE34eB7+dXLGnQMcC17v5U3gWRmt4NHAG8l9Af3E7AuWb2VEzQ\nSjG9D/guoSuX1cAdwI8INYqkB6nZWvAcYYPI9vw+DZg7/MWROpJ41IvVXGCMmU2ps4y0mJmdB7wV\n2Nfdn0jNUswKyN1XuPuD7v4Xdz+R0En9J1C8imhXYGPgDjNbZWarCJ1OHh9fJ3f9FLOCinddHwBe\njbaxInoamJ2Z9ncgaWqomBWUmb2S0A/cRanJilfxfBU4y91/7O73uPulhI6UT4zzFbMCcvc57r4P\noc/Fme6+B6EZ2kMoZj1JySPCRRSh86/9k2mx+c3+hGYdUhwPE3Y26VitT+jhP4nVXwhPTEkvsw3h\nJFDxbLH4mM7zgEOA/dz94cwiillnGAGMRfEqot8RnnKyU2q4Hbg8vk5O4hSzgjKzSYTE0dNoGyui\nPwHbZKZtTagtBopZkX0AeAa4JjVN8SqeCYTOl9NW038tqpgVmLu/5O5PxydTvgn4GYpZb8q7x+6i\nDMDhhA7BjiI8QvICwiNap+Vdtl4bCNnt5ALJgU/F15vH+V+
IsXk74YLqasLF07jUOs4nnPTtS7hr\n/2fgz3n/b904AN8CFhJqQkxPDeNTyyhmBRqAM4C9gS1iPM4A1hCaRSleHTCQetqaYla8Afha3Cdu\nAbwB+C3wLDBV8SreQOjjaCVwEiHJ917CI6ePSC2jmBVsICQeHgXOrDJP8SrQQOiP6gngLXG/eEjc\nJ56lmBV3ICSK3kzoHPtA4C7gZmC0YtabQ+4FKNIAfDz+uJcTOrt8Xd5l6sUBKBGSRtnhkjjfgNMI\n2e5lhB78t86sYxyh/5358QTwp8D0vP+3bhxqxMqBo1PLKGYFGoCLgUfivu6ZGI8DFa/OGVg7eaSY\nFWgAfgw8FbexJ+L7VylexR0Iza7vifH4O3BsZr5iVrABeGM839i6yjzFq0ADsB7hgSqPEp7WNYfw\nuPcxillxB0JfVXPisexp4DxgsmLWu4PFoIqIiIiIiIiIiKxFfR6JiIiIiIiIiEhNSh6JiIiIiIiI\niEhNSh6JiIiIiIiIiEhNSh6JiIiIiIiIiEhNSh6JiIiIiIiIiEhNSh6JiIiIiIiIiEhNSh6JiIiI\niIiIiEhNSh6JiIiIiIiIiEhNSh6JiIiIiIiIiEhNSh6JiIiIiIiIiEhNSh6JiIiIiIiIiEhN/x9Z\n1RSuRHqGUAAAAABJRU5ErkJggg==\n", 100 | "text/plain": [ 101 | "" 102 | ] 103 | }, 104 | "metadata": {}, 105 | "output_type": "display_data" 106 | } 107 | ], 108 | "source": [ 109 | "neuron_layer_0_traces = np.array([r['n1'] for r in traces])\n", 110 | "\n", 111 | "rend.render_figure([rend.IzhikevichNeuronTraceRenderer(neuron_layer_0_traces, 'Layer 0 Neuron')], 1000, 2000)" 112 | ] 113 | } 114 | ], 115 | "metadata": { 116 | "kernelspec": { 117 | "display_name": "Python 3", 118 | "language": "python", 119 | "name": "python3" 120 | }, 121 | "language_info": { 122 | "codemirror_mode": { 123 | "name": "ipython", 124 | "version": 3 125 | }, 126 | "file_extension": ".py", 127 | "mimetype": "text/x-python", 128 | "name": "python", 129 | "nbconvert_exporter": "python", 130 | "pygments_lexer": "ipython3", 131 | "version": "3.6.0" 132 | }, 133 | "toc": { 134 | "colors": { 135 | "hover_highlight": "#DAA520", 136 | "running_highlight": "#FF0000", 137 | "selected_highlight": "#FFD700" 138 | }, 139 | "moveMenuLeft": true, 140 | "nav_menu": { 141 | "height": "103px", 142 | "width": "252px" 143 | }, 144 | "navigate_menu": true, 145 | "number_sections": true, 146 | "sideBar": true, 147 | "threshold": 4, 148 | "toc_cell": false, 149 | "toc_section_display": "block", 150 | "toc_window_display": false, 151 | "widenNotebook": false 152 | } 153 | 
}, 154 | "nbformat": 4, 155 | "nbformat_minor": 2 156 | } 157 | 
--------------------------------------------------------------------------------
/jupyter_explorations/README.md:
--------------------------------------------------------------------------------
1 | # Explorations in Jupyter
2 | 
3 | Contributions welcome!
--------------------------------------------------------------------------------