# Papers : Biological and Artificial Neural Networks

I have collected papers on **Artificial Neural Networks** related to **Neuroscience** (especially Computational Neuroscience). If a relevant paper is missing from the list, I would appreciate it if you could let me know via an **Issue**.

## Artificial neural networks and computational neuroscience
#### Survey
- D. Cox, T. Dean. "Neural networks and neuroscience-inspired computer vision". *Curr. Biol.* **24**(18), 921-929 (2014). ([sciencedirect](https://www.sciencedirect.com/science/article/pii/S0960982214010392?via%3Dihub))
- A. Marblestone, G. Wayne, K. Kording. "Toward an integration of deep learning and neuroscience". (2016). ([arXiv](https://arXiv.org/abs/1606.03813))
- O. Barak. "Recurrent neural networks as versatile tools of neuroscience research". *Curr. Opin. Neurobiol.* (2017). ([sciencedirect](https://www.sciencedirect.com/science/article/pii/S0959438817300429?via%3Dihub))
- D. Silva, P. Cruz, A. Gutierrez. "Are the long-short term memory and convolution neural net biological system?". *KICS.* **4**(2), 100-106 (2018). ([sciencedirect](https://www.sciencedirect.com/science/article/pii/S2405959518300249))
- N. Kriegeskorte, P. Douglas. "Cognitive computational neuroscience". *Nat. Neurosci.* **21**(9), 1148-1160 (2018). ([arXiv](https://arXiv.org/abs/1807.11819))
- N. Kriegeskorte, T. Golan. "Neural network models and deep learning - a primer for biologists". (2019). ([arXiv](https://arxiv.org/abs/1902.04704))
- K.R. Storrs, N. Kriegeskorte. "Deep Learning for Cognitive Neuroscience". (2019). ([arXiv](https://arxiv.org/abs/1903.01458))
- T.C. Kietzmann, P. McClure, N. Kriegeskorte. "Deep Neural Networks in Computational Neuroscience". *Oxford Research Encyclopaedia of Neuroscience*. (2019). ([Oxford](https://oxfordre.com/neuroscience/view/10.1093/acrefore/9780190264086.001.0001/acrefore-9780190264086-e-46), [bioRxiv](https://www.biorxiv.org/content/10.1101/133504v2))
- J.S. Bowers. "Parallel Distributed Processing Theory in the Age of Deep Networks". *Trends. Cogn. Sci.* (2019). ([sciencedirect](https://www.sciencedirect.com/science/article/pii/S1364661317302164?via%3Dihub))
- R.M. Cichy, D. Kaiser. "Deep Neural Networks as Scientific Models". *Trends. Cogn. Sci.* (2019). ([sciencedirect](https://www.sciencedirect.com/science/article/pii/S1364661319300348?via%3Dihub))
- S. Musall, A.E. Urai, D. Sussillo, A.K. Churchland. "Harnessing behavioral diversity to understand neural computations for cognition". *Curr. Opin. Neurobiol.* (2019). ([sciencedirect](https://www.sciencedirect.com/science/article/pii/S0959438819300285))
- B.A. Richards, T.P. Lillicrap, et al. "A deep learning framework for neuroscience". *Nat. Neurosci.* (2019). ([Nat. Neurosci.](https://www.nature.com/articles/s41593-019-0520-2))
- U. Hasson, S.A. Nastase, A. Goldstein. "Direct Fit to Nature: An Evolutionary Perspective on Biological and Artificial Neural Networks". *Neuron*. (2020). ([Neuron](https://www.cell.com/neuron/fulltext/S0896-6273(19)31044-X))
- A. Saxe, S. Nelli, C. Summerfield. "If deep learning is the answer, then what is the question?". (2020). ([arXiv](https://arxiv.org/abs/2004.07580))

#### Issue
- T.P. Lillicrap, K.P. Kording. "What does it mean to understand a neural network?". (2019). ([arXiv](https://arxiv.org/abs/1907.06374))

### Analysis methods for neural networks
Methods for understanding the neural representations of ANNs.
#### Survey
- D. Barrett, A. Morcos, J. Macke. "Analyzing biological and artificial neural networks: challenges with opportunities for synergy?". (2018). ([arXiv](https://arXiv.org/abs/1810.13373))

#### Neuron Feature
- I. Rafegas, M. Vanrell, L.A. Alexandre. "Understanding trained CNNs by indexing neuron selectivity". (2017). ([arXiv](https://arxiv.org/abs/1702.00382))
- A. Nguyen, J. Yosinski, J. Clune. "Understanding Neural Networks via Feature Visualization: A survey". (2019). ([arXiv](https://arxiv.org/abs/1904.08939))

#### Comparing the representations of neural networks with those of the brain

##### Representational similarity analysis (RSA)
- N. Kriegeskorte, J. Diedrichsen. "Peeling the Onion of Brain Representations". *Annu. Rev. Neurosci.* (2019). ([Annu. Rev. Neurosci.](https://www.annualreviews.org/doi/10.1146/annurev-neuro-080317-061906))

##### Canonical correlation analysis (CCA)
- M. Raghu, J. Gilmer, J. Yosinski, J. Sohl-Dickstein. "SVCCA: Singular Vector Canonical Correlation Analysis for Deep Learning Dynamics and Interpretability". *NIPS.* (2017). ([arXiv](https://arXiv.org/abs/1706.05806))
- H. Wang, et al. "Finding the needle in high-dimensional haystack: A tutorial on canonical correlation analysis". (2018). ([arXiv](https://arxiv.org/abs/1812.02598))

##### Centered kernel alignment (CKA)
- S. Kornblith, M. Norouzi, H. Lee, G. Hinton. "Similarity of Neural Network Representations Revisited". (2019). ([arXiv](https://arxiv.org/abs/1905.00414v1))

##### Representational stability analysis (ReStA)
- S. Abnar, L. Beinborn, R. Choenni, W. Zuidema. "Blackbox meets blackbox: Representational Similarity and Stability Analysis of Neural Language Models and Brains". (2019). ([arXiv](https://arxiv.org/abs/1906.01539))

#### Fixed point analysis for RNNs
- M.B. Ottaway, P.Y. Simard, D.H. Ballard. "Fixed point analysis for recurrent networks". *NIPS.* (1989). ([pdf](https://papers.nips.cc/paper/181-fixed-point-analysis-for-recurrent-networks.pdf))
- D. Sussillo, O. Barak. "Opening the black box: low-dimensional dynamics in high-dimensional recurrent neural networks". *Neural Comput.* **25**(3), 626-649 (2013).
([MIT Press](https://www.mitpressjournals.org/doi/full/10.1162/NECO_a_00409?url_ver=Z39.88-2003&rfr_id=ori:rid:crossref.org&rfr_dat=cr_pub%3dpubmed), [Jupyter notebook](https://github.com/google-research/computation-thru-dynamics/blob/master/notebooks/Fixed%20Point%20Finder%20Tutorial.ipynb))
- M.D. Golub, D. Sussillo. "FixedPointFinder: A Tensorflow toolbox for identifying and characterizing fixed points in recurrent neural networks". *JOSS.* (2018). ([pdf](https://web.stanford.edu/~mgolub/publications/GolubJOSS2018.pdf), [GitHub](https://github.com/mattgolub/fixed-point-finder))
- G.E. Katz, J.A. Reggia. "Using Directional Fibers to Locate Fixed Points of Recurrent Neural Networks". *IEEE.* (2018). ([IEEE](https://ieeexplore.ieee.org/document/8016349))

#### Ablation analysis
- A.S. Morcos, D.G.T. Barrett, N.C. Rabinowitz, M. Botvinick. "On the importance of single directions for generalization". *ICLR.* (2018). ([arXiv](https://arxiv.org/abs/1803.06959))

### Computational psychiatry
I have not yet fully surveyed the papers in this field.
- R.E. Hoffman, U. Grasemann, R. Gueorguieva, D. Quinlan, D. Lane, R. Miikkulainen. "Using computational patients to evaluate illness mechanisms in schizophrenia". *Biol. Psychiatry.* **69**(10), 997–1005 (2011). ([PMC](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3105006/))

## Deep neural networks as models of the brain
Understanding the brain's neural representations is difficult. Neural networks trained on specific tasks (i.e., optimized for specific loss functions) sometimes acquire representations similar to those in the brain. In such cases, we can indirectly infer the purpose of the brain's neural representations.

### Survey
- A.J.E. Kell, J.H. McDermott. "Deep neural network models of sensory systems: windows onto the role of task constraints". *Curr. Opin. Neurobiol.* (2019). ([sciencedirect](https://www.sciencedirect.com/science/article/pii/S0959438818302034))

### Cortical neuron
- P. Poirazi, T. Brannon, B.W. Mel. "Pyramidal Neuron as Two-Layer Neural Network". *Neuron*. **37**(6). (2003). ([Neuron](https://www.cell.com/neuron/fulltext/S0896-6273(03)00149-1))
- D. Beniaguev, I. Segev, M. London. "Single Cortical Neurons as Deep Artificial Neural Networks". (2019). ([bioRxiv](https://www.biorxiv.org/content/10.1101/613141v1))

### Vision
- D. Zipser, R.A. Andersen. "A back-propagation programmed network that simulates response properties of a subset of posterior parietal neurons". *Nature.* **331**, 679–684 (1988). ([Nature](https://www.nature.com/articles/331679a0))
- A. Krizhevsky, I. Sutskever, G. Hinton. "ImageNet classification with deep convolutional neural networks". *NIPS* (2012). ([pdf](https://papers.nips.cc/paper/4824-imagenet-classification-with-deep-convolutional-neural-networks.pdf))
- (cf.) I. Goodfellow, Y. Bengio, A. Courville. "[Deep Learning](https://www.deeplearningbook.org/)". MIT Press. (2016): Chapter 9.10 "The Neuroscientific Basis for Convolutional Networks"
- D. Yamins, et al. "Performance-optimized hierarchical models predict neural responses in higher visual cortex". *PNAS.* **111**(23), 8619-8624 (2014). ([PNAS](https://www.pnas.org/content/111/23/8619))
- S. Khaligh-Razavi, N. Kriegeskorte. "Deep supervised, but not unsupervised, models may explain IT cortical representation". *PLoS Comput. Biol*. **10**(11), (2014). ([PLOS](https://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1003915))
- U. Güçlü, M.A.J. van Gerven. "Deep Neural Networks Reveal a Gradient in the Complexity of Neural Representations across the Ventral Stream". *J. Neurosci.* **35**(27), (2015). ([J. Neurosci.](http://www.jneurosci.org/content/35/27/10005))
- D. Yamins, J. DiCarlo. "Eight open questions in the computational modeling of higher sensory cortex". *Curr. Opin. Neurobiol.* **37**, 114–120 (2016). ([sciencedirect](https://www.sciencedirect.com/science/article/abs/pii/S0959438816300022))
- K.M. Jozwik, N. Kriegeskorte, K.R. Storrs, M. Mur. "Deep Convolutional Neural Networks Outperform Feature-Based But Not Categorical Models in Explaining Object Similarity Judgments". *Front. Psychol*. (2017). ([Front. Psychol.](https://www.frontiersin.org/articles/10.3389/fpsyg.2017.01726/full))
- M.N.U. Laskar, L.G.S. Giraldo, O. Schwartz. "Correspondence of Deep Neural Networks and the Brain for Visual Textures". (2018). ([arXiv](https://arxiv.org/abs/1806.02888))
- I. Kuzovkin, et al. "Activations of Deep Convolutional Neural Network are Aligned with Gamma Band Activity of Human Visual Cortex". *Commun. Biol.* **1** (2018). ([Commun. Biol.](https://www.nature.com/articles/s42003-018-0110-y))
- M. Schrimpf, et al. "Brain-Score: Which Artificial Neural Network for Object Recognition is most Brain-Like?". (2018). ([bioRxiv](https://www.biorxiv.org/content/early/2018/09/05/407007))
- E. Kim, D. Hannan, G. Kenyon. "Deep Sparse Coding for Invariant Multimodal Halle Berry Neurons". *CVPR.* (2018). ([arXiv](https://arXiv.org/abs/1711.07998))
- S. Ocko, J. Lindsey, S. Ganguli, S. Deny. "The emergence of multiple retinal cell types through efficient coding of natural movies". (2018). ([bioRxiv](https://www.biorxiv.org/content/early/2018/10/31/458737))
- Q. Yan, et al. "Revealing Fine Structures of the Retinal Receptive Field by Deep Learning Networks". (2018). ([arXiv](https://arXiv.org/abs/1811.02290))
- H. Wen, J. Shi, W. Chen, Z. Liu. "Deep Residual Network Predicts Cortical Representation and Organization of Visual Features for Rapid Categorization". *Sci. Rep.* (2018). ([Sci. Rep.](https://www.nature.com/articles/s41598-018-22160-9))
- J. Lindsey, S. Ocko, S. Ganguli, S. Deny. "A Unified Theory of Early Visual Representations from Retina to Cortex through Anatomically Constrained Deep CNNs".
(2019). ([arXiv](https://arXiv.org/abs/1901.00945))
- I. Fruend. "Simple, biologically informed models, but not convolutional neural networks describe target detection in naturalistic images". *bioRxiv* (2019). ([bioRxiv](https://www.biorxiv.org/content/10.1101/578633v1))
- A. Doerig, et al. "Capsule Networks but not Classic CNNs Explain Global Visual Processing". (2019). ([bioRxiv](https://www.biorxiv.org/content/10.1101/747394v1))
- A.S. Benjamin, et al. "Hue tuning curves in V4 change with visual context". (2019). ([bioRxiv](https://www.biorxiv.org/content/10.1101/780478v1))
- S. Baek, M. Song, J. Jang, et al. "Spontaneous generation of face recognition in untrained deep neural networks". (2019). ([bioRxiv](https://www.biorxiv.org/content/10.1101/857466v1))

#### Recurrent networks for object recognition
- C.J. Spoerer, P. McClure, N. Kriegeskorte. "Recurrent Convolutional Neural Networks: A Better Model of Biological Object Recognition". *Front. Psychol.* (2017). ([Front. Psychol.](https://www.frontiersin.org/articles/10.3389/fpsyg.2017.01551/full))
- A. Nayebi, D. Bear, J. Kubilius, K. Kar, S. Ganguli, D. Sussillo, J. DiCarlo, D. Yamins. "Task-Driven Convolutional Recurrent Models of the Visual System". (2018). ([arXiv](https://arXiv.org/abs/1807.00053), [GitHub](https://github.com/neuroailab/tnn))
- T.C. Kietzmann, et al. "Recurrence required to capture the dynamic computations of the human ventral visual stream". (2019). ([arXiv](https://arxiv.org/abs/1903.05946))
- K. Qiao, et al. "Category decoding of visual stimuli from human brain activity using a bidirectional recurrent neural network to simulate bidirectional information flows in human visual cortices". (2019). ([arXiv](https://arxiv.org/abs/1903.07783))
- K. Kar, J. Kubilius, K. Schmidt, E.B. Issa, J.J. DiCarlo. "Evidence that recurrent circuits are critical to the ventral stream’s execution of core object recognition behavior". *Nat. Neurosci.* (2019). ([Nat. Neurosci.](https://www.nature.com/articles/s41593-019-0392-5), [bioRxiv](https://www.biorxiv.org/content/10.1101/354753v1))
- T.C. Kietzmann, C.J. Spoerer, L.K.A. Sörensen, R.M. Cichy, O. Hauk, N. Kriegeskorte. "Recurrence is required to capture the representational dynamics of the human visual system". *PNAS.* (2019). ([PNAS](https://www.pnas.org/content/early/2019/10/04/1905544116))

#### Primary visual cortex (V1)
- S.A. Cadena, et al. "Deep convolutional models improve predictions of macaque V1 responses to natural images". *PLOS Comput. Biol.* (2019). ([PLOS](https://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1006897), [bioRxiv](https://www.biorxiv.org/content/10.1101/201764v2))
- A.S. Ecker, et al. "A rotation-equivariant convolutional neural network model of primary visual cortex". *ICLR* (2019). ([OpenReview](https://openreview.net/forum?id=H1fU8iAqKX), [arXiv](https://arxiv.org/abs/1809.10504))

#### Visual illusions
Also see the papers associated with [PredNet](#prednet-deep-predictive-coding-network).
- E.J. Ward. "Exploring Perceptual Illusions in Deep Neural Networks". (2019). ([bioRxiv](https://www.biorxiv.org/content/10.1101/687905v1))
- E.D. Sun, R. Dekel. "ImageNet-trained deep neural network exhibits illusion-like response to the Scintillating Grid". (2019). ([arXiv](https://arxiv.org/abs/1907.09019))

#### Recursive Cortical Network (RCN; non-NN model)
- D. George, et al. "A generative vision model that trains with high data efficiency and breaks text-based CAPTCHAs". *Science* (2017). ([Science](http://science.sciencemag.org/content/358/6368/eaag2612.full), [GitHub](https://github.com/vicariousinc/science_rcn))

#### Weight-shared ResNet as an RNN for object recognition
- Q. Liao, T. Poggio. "Bridging the Gaps Between Residual Learning, Recurrent Neural Networks and Visual Cortex". (2016). ([arXiv](https://arXiv.org/abs/1604.03640))

#### Generating visual super stimuli
- J. Ukita, T. Yoshida, K. Ohki. "Characterisation of nonlinear receptive fields of visual neurons by convolutional neural network". *Sci. Rep.* (2019). ([Sci. Rep.](https://www.nature.com/articles/s41598-019-40535-4))
- C.R. Ponce, et al. "Evolving super stimuli for real neurons using deep generative networks". *Cell*. **177**, 999–1009 (2019). ([bioRxiv](https://www.biorxiv.org/content/10.1101/516484v1), [Cell](https://www.cell.com/cell/fulltext/S0092-8674(19)30391-5))
- P. Bashivan, K. Kar, J.J. DiCarlo. "Neural Population Control via Deep Image Synthesis". *Science.* (2019). ([bioRxiv](https://www.biorxiv.org/content/10.1101/461525v1), [Science](https://science.sciencemag.org/content/364/6439/eaav9436), [GitHub1](https://github.com/dicarlolab/npc), [GitHub2](https://github.com/dicarlolab/retinawarp))
- A.P. Batista, K.P. Kording. "A Deep Dive to Illuminate V4 Neurons". *Trends Neurosci.* (2019). ([Trends Neurosci.](https://www.cell.com/trends/neurosciences/fulltext/S0166-2236(19)30111-0))

#### Visual number sense
- K. Nasr, P. Viswanathan, A. Nieder. "Number detectors spontaneously emerge in a deep neural network designed for visual object recognition". *Sci. Adv.* (2019). ([Sci. Adv.](https://advances.sciencemag.org/content/5/5/eaav7903))

### Auditory cortex
- U. Güçlü, J. Thielen, M. Hanke, M. van Gerven. "Brains on Beats". *NIPS* (2016). ([arXiv](https://arxiv.org/abs/1606.02627))
- A.J.E. Kell, D.L.K. Yamins, E.N. Shook, S.V. Norman-Haignere, J.H. McDermott. "A Task-Optimized Neural Network Replicates Human Auditory Behavior, Predicts Brain Responses, and Reveals a Cortical Processing Hierarchy". *Neuron* **98**(3), (2018). ([sciencedirect](https://www.sciencedirect.com/science/article/pii/S0896627318302502?via%3Dihub))
- T. Koumura, H. Terashima, S. Furukawa.
"Cascaded Tuning to Amplitude Modulation for Natural Sound Recognition". *J. Neurosci.* **39**(28), 5517-5533 (2019). ([J. Neurosci.](https://www.jneurosci.org/content/39/28/5517), [bioRxiv](https://www.biorxiv.org/content/10.1101/308999v2), [GitHub](https://github.com/cycentum/cascaded-am-tuning-for-sound-recognition)) 129 | 130 | ### Motor cortex 131 | - D. Sussillo, M. Churchland, M. Kaufman, K. Shenoy. "A neural network that finds a naturalistic solution for the production of muscle activity". *Nat. Neurosci.* **18**(7), 1025–1033 (2015). ([PubMed](https://www.ncbi.nlm.nih.gov/pubmed/26075643)) 132 | - J.A. Michaels, et al. "A neural network model of flexible grasp movement generation". (2019). ([bioRxiv](https://www.biorxiv.org/content/10.1101/742189v1)) 133 | - J. Merel, M. Botvinick, G. Wayne. "Hierarchical motor control in mammals and machines". *Nat. Commun.* (2019). ([Nat.Commun.](https://www.nature.com/articles/s41467-019-13239-6)) 134 | 135 | ### Spatial coding (Place cells, Grid cells, Head direction cells) 136 | - C. Cueva, X. Wei. "Emergence of grid-like representations by training recurrent neural networks to perform spatial localization". *ICLR.* (2018). ([arXiv](https://arXiv.org/abs/1803.07770)) 137 | - A. Banino, et al. "Vector-based navigation using grid-like representations in artificial agents". *Nature.* **557**(7705), 429–433 (2018). ([pdf](https://deepmind.com/documents/201/Vector-based%20Navigation%20using%20Grid-like%20Representations%20in%20Artificial%20Agents.pdf), [GitHub](https://github.com/deepmind/grid-cells)) 138 | - J.C.R. Whittington. et al. "Generalisation of structural knowledge in the hippocampal-entorhinal system". *NIPS.* (2018). ([arXiv](https://arxiv.org/abs/1805.09042)) 139 | - C.J. Cueva, P.Y. Wang, M. Chin, X. Wei. "Emergence of functional and structural properties of the head direction system by optimization of recurrent neural networks". (2020). 
([arXiv](https://arxiv.org/abs/1912.10189)) 140 | 141 | ### Rodent barrel cortex 142 | - C. Zhuang, J. Kubilius, M. Hartmann, D. Yamins. "Toward Goal-Driven Neural Network Models for the Rodent Whisker-Trigeminal System". *NIPS.* (2017). ([arXiv](https://arxiv.org/abs/1706.07555)) 143 | 144 | ### Convergent Temperature Representations 145 | - M. Haesemeyer, A. Schier, F. Engert. "Convergent temperature representations in artificial and biological neural networks". *Neuron*. (2019). ([bioRxiv](https://www.biorxiv.org/content/early/2018/08/29/390435)), ([Neuron](https://www.cell.com/neuron/fulltext/S0896-62731930601-4)) 146 | 147 | ### Cognitive task 148 | - H.F. Song, G.R. Yang, X.J. Wang. "Reward-based training of recurrent neural networks for cognitive and value-based tasks". *eLife*. **6** (2017). ([eLife](https://elifesciences.org/articles/21492)) 149 | - G.R. Yang, M.R. Joglekar, H.F. Song, W.T. Newsome, X.J. Wang. "Task representations in neural networks trained to perform many cognitive tasks". *Nat. Neurosci.* (2019). ([Nat. Neurosci.](https://www.nature.com/articles/s41593-018-0310-2)) ([GitHub](https://github.com/gyyang/multitask)) 150 | 151 | ### Time perception 152 | - N.F. Hardy, V. Goudar, J.L. Romero-Sosa, D.V. Buonomano. "A model of temporal scaling correctly predicts that motor timing improves with speed". *Nat. Commun.* **9** (2018). ([Nat. Commun.](https://www.nature.com/articles/s41467-018-07161-6)) 153 | - J. Wang, D. Narain, E.A. Hosseini, M. Jazayeri. "Flexible timing by temporal scaling of cortical responses". *Nat. Neurosci.* **21** 102–110(2018). ([Nat. Neurosci.](https://www.nature.com/articles/s41593-017-0028-6)) 154 | - W. Roseboom, Z. Fountas, K. Nikiforou, D. Bhowmik, M. Shanahan, A. K. Seth. "Activity in perceptual classification networks as a basis for human subjective time perception". *Nat. Commun.* **10** (2019). ([Nat. Commun.](https://www.nature.com/articles/s41467-018-08194-7)) 155 | - B. Deverett, et al. 
"Interval timing in deep reinforcement learning agents". *NeurIPS 2019*. (2019). ([arXiv](https://arxiv.org/abs/1905.13469v2)) 156 | - Z. Bi, C. Zhou. "Time representation in neural network models trained to perform interval timing tasks". (2019). ([arXiv](https://arxiv.org/abs/1910.05546)). 157 | 158 | ### Short-term memory task 159 | - K. Rajan, C.D.Harvey, D.W.Tank. "Recurrent Network Models of Sequence Generation and Memory". *Neuron.* **90**(1), 128-142 (2016). ([sciencedirect](https://www.sciencedirect.com/science/article/pii/S0896627316001021?via%3Dihub)) 160 | - A.E. Orhan, W.J. Ma. " A diverse range of factors affect the nature of neural representations underlying short-term memory". *Nat. Neurosci.* (2019). ([Nat. Neurosci.](https://www.nature.com/articles/s41593-018-0314-y)), ([bioRxiv](https://www.biorxiv.org/content/10.1101/244707v3)), ([GitHub](https://github.com/eminorhan/recurrent-memory)) 161 | - N.Y. Masse. et al. "Circuit mechanisms for the maintenance and manipulation of information in working memory". *Nat. Neurosci.* (2019). ([Nat. Neurosci.](https://www.nature.com/articles/s41593-019-0414-3)), ([bioRxiv](https://www.biorxiv.org/content/10.1101/305714v2)) 162 | 163 | ### Language 164 | - J. Chiang, et al. "Neural and computational mechanisms of analogical reasoning". (2019). ([bioRxiv](https://www.biorxiv.org/content/10.1101/596726v1)) 165 | - S. Na, Y.J. Choe, D. Lee, G. Kim. "Discovery of Natural Language Concepts in Individual Units of CNNs". *ICLR.* (2019). ([OpenReview](https://openreview.net/forum?id=S1EERs09YQ)), ([arXiv](https://arxiv.org/abs/1902.07249)) 166 | 167 | #### Language learning 168 | - B.M. Lake, T. Linzen, M. Baroni. "Human few-shot learning of compositional instructions". (2019). ([arXiv](https://arxiv.org/abs/1901.04587)) 169 | - A. Alamia, V. Gauducheau, D. Paisios, R. VanRullen. "Which Neural Network Architecture matches Human Behavior in Artificial Grammar Learning?". (2019). 
([arXiv](https://arxiv.org/abs/1902.04861))

## Neural network architecture based on neuroscience
### Survey
- D. Hassabis, D. Kumaran, C. Summerfield, M. Botvinick. "Neuroscience-Inspired Artificial Intelligence". *Neuron.* **95**(2), 245-258 (2017). ([sciencedirect](https://www.sciencedirect.com/science/article/pii/S0896627317305093))

### PredNet (Deep predictive coding network)
- W. Lotter, G. Kreiman, D. Cox. "Deep predictive coding networks for video prediction and unsupervised learning". *ICLR.* (2017). ([arXiv](https://arXiv.org/abs/1605.08104), [GitHub](https://coxlab.github.io/prednet/))
- E. Watanabe, A. Kitaoka, K. Sakamoto, M. Yasugi, K. Tanaka. "Illusory Motion Reproduced by Deep Neural Networks Trained for Prediction". *Front. Psychol.* (2018). ([Front. Psychol.](https://www.frontiersin.org/articles/10.3389/fpsyg.2018.00345/full))
- M. Fonseca. "Unsupervised predictive coding models may explain visual brain representation". (2019). ([arXiv](https://arxiv.org/abs/1907.00441), [GitHub](https://github.com/thefonseca/algonauts))
- W. Lotter, G. Kreiman, D. Cox. "A neural network trained to predict future video frames mimics critical properties of biological neuronal responses and perception". *Nat. Machine Intelligence*. (2020). ([arXiv](https://arXiv.org/abs/1805.10734), [Nat. Machine Intelligence](https://www.nature.com/articles/s42256-020-0170-9))

### subLSTM
- R. Costa, Y. Assael, B. Shillingford, N. de Freitas, T. Vogels. "Cortical microcircuits as gated-recurrent neural networks". *NIPS.* (2017). ([arXiv](https://arXiv.org/abs/1711.02448))

### Activation functions
- G.S. Bhumbra. "Deep learning improved by biological activation functions". (2018). ([arXiv](https://arxiv.org/abs/1804.11237))

### Normalization
- L.G.S. Giraldo, O. Schwartz. "Integrating Flexible Normalization into Mid-Level Representations of Deep Convolutional Neural Networks". (2018). ([arXiv](https://arxiv.org/abs/1806.01823))
- M.F. Günthner, et al. "Learning Divisive Normalization in Primary Visual Cortex". (2019). ([bioRxiv](https://www.biorxiv.org/content/10.1101/767285v1))

## Reinforcement Learning
I have not yet fully surveyed the papers in this field.
- N. Haber, D. Mrowca, L. Fei-Fei, D. Yamins. "Learning to Play with Intrinsically-Motivated Self-Aware Agents". *NIPS.* (2018). ([arXiv](https://arxiv.org/abs/1802.07442))
- J.X. Wang, et al. "Prefrontal cortex as a meta-reinforcement learning system". *Nat. Neurosci.* (2018). ([Nat. Neurosci.](https://www.nature.com/articles/s41593-018-0147-8), [bioRxiv](https://www.biorxiv.org/content/10.1101/295964v2), [blog](https://deepmind.com/blog/prefrontal-cortex-meta-reinforcement-learning-system/))
- M. Botvinick, et al. "Reinforcement Learning, Fast and Slow". *Trends. Cogn. Sci.* (2019). ([Trends. Cogn. Sci.](https://www.cell.com/trends/cognitive-sciences/fulltext/S1364-6613(19)30061-0))
- E.O. Neftci, B.B. Averbeck. "Reinforcement learning in artificial and biological systems". *Nat. Mach. Intell.* (2019). ([Nat. Mach. Intell.](https://www.nature.com/articles/s42256-019-0025-4))
- W. Dabney, Z. Kurth-Nelson, N. Uchida, C.K. Starkweather, D. Hassabis, R. Munos, M. Botvinick. "A distributional code for value in dopamine-based reinforcement learning". *Nature*. (2020). ([Nature](https://www.nature.com/articles/s41586-019-1924-6), [blog](https://deepmind.com/blog/article/Dopamine-and-temporal-difference-learning-A-fruitful-relationship-between-neuroscience-and-AI))

## Learning and development

### Biologically plausible learning algorithms
#### Survey
- J. Whittington, R. Bogacz. "Theories of Error Back-Propagation in the Brain". *Trends. Cogn. Sci.* (2019). ([sciencedirect](https://www.sciencedirect.com/science/article/pii/S1364661319300129?via%3Dihub))
- T.P. Lillicrap, A. Santoro. "Backpropagation through time and the brain". *Curr. Opin. Neurobiol.* (2019). ([sciencedirect](https://www.sciencedirect.com/science/article/pii/S0959438818302009))
- T.P. Lillicrap, A. Santoro, L. Marris, et al. "Backpropagation and the brain". *Nat. Rev. Neurosci*. (2020). ([Nat. Rev. Neurosci.](https://www.nature.com/articles/s41583-020-0277-3))

#### Equilibrium Propagation
- Y. Bengio, D. Lee, J. Bornschein, T. Mesnard, Z. Lin. "Towards Biologically Plausible Deep Learning". (2015). ([arXiv](https://arXiv.org/abs/1502.04156))
- B. Scellier, Y. Bengio. "Equilibrium Propagation: Bridging the Gap Between Energy-Based Models and Backpropagation". *Front. Comput. Neurosci.* **11**(24), (2017). ([arXiv](https://arXiv.org/abs/1602.05179))
- J. Sacramento, R.P. Costa, Y. Bengio, W. Senn. "Dendritic cortical microcircuits approximate the backpropagation algorithm". *NIPS.* (2018). ([arXiv](https://arXiv.org/abs/1810.11393))

#### Feedback alignment
- T. Lillicrap, D. Cownden, D. Tweed, C. Akerman. "Random synaptic feedback weights support error backpropagation for deep learning". *Nat. Commun.* **7** (2016). ([Nat. Commun.](https://www.nature.com/articles/ncomms13276))
- A. Nøkland. "Direct Feedback Alignment Provides Learning in Deep Neural Networks". (2016). ([arXiv](https://arxiv.org/abs/1609.01596))
- M. Akrout, C. Wilson, P.C. Humphreys, T. Lillicrap, D. Tweed. "Deep Learning without Weight Transport". (2019). ([arXiv](https://arxiv.org/abs/1904.05391))
- B.J. Lansdell, P. Prakash, K.P. Kording. "Learning to solve the credit assignment problem". (2019). ([arXiv](https://arxiv.org/abs/1906.00889))

#### Local error signals
- H. Mostafa, V. Ramesh, G. Cauwenberghs. "Deep Supervised Learning Using Local Errors". *Front. Neurosci.* (2018). ([Front. Neurosci.](https://www.frontiersin.org/articles/10.3389/fnins.2018.00608/full))
- A. Nøkland, L.H. Eidnes. "Training Neural Networks with Local Error Signals". (2019). ([arXiv](https://arXiv.org/abs/1901.06656), [GitHub](https://github.com/anokland/local-loss))

#### Others
- M. Jaderberg, et al. "Decoupled Neural Interfaces using Synthetic Gradients". (2016). ([arXiv](https://arxiv.org/abs/1608.05343))
- N. Ke, A. Goyal, O. Bilaniuk, J. Binas, M. Mozer, C. Pal, Y. Bengio. "Sparse Attentive Backtracking: Temporal Credit Assignment Through Reminding". *NIPS.* (2018). ([arXiv](https://arXiv.org/abs/1809.03702))
- S. Bartunov, A. Santoro, B. Richards, L. Marris, G. Hinton, T. Lillicrap. "Assessing the Scalability of Biologically-Motivated Deep Learning Algorithms and Architectures". *NIPS.* (2018). ([arXiv](https://arXiv.org/abs/1807.04587))
- R. Feldesh. "The Distributed Engram". (2019). ([bioRxiv](https://www.biorxiv.org/content/10.1101/583195v1))
- Y. Amit. "Deep Learning With Asymmetric Connections and Hebbian Updates". *Front. Comput. Neurosci.* (2019). ([Front. Comput. Neurosci.](https://www.frontiersin.org/articles/10.3389/fncom.2019.00018/full), [GitHub](https://github.com/yaliamit/URFB))
- T. Mesnard, G. Vignoud, J. Sacramento, W. Senn, Y. Bengio. "Ghost Units Yield Biologically Plausible Backprop in Deep Neural Networks". (2019). ([arXiv](https://arxiv.org/abs/1911.08585))

#### Issue
- F. Crick. "The recent excitement about neural networks". *Nature*. **337**, 129–132 (1989). ([Nature](https://www.nature.com/articles/337129a0))

### Learning dynamics of neural networks and brains
- J. Shen, M.D. Petkova, F. Liu, C. Tang. "Toward deciphering developmental patterning with deep neural network". (2018). ([bioRxiv](https://www.biorxiv.org/content/early/2018/08/09/374439))
- A.M. Saxe, J.L. McClelland, S. Ganguli.
"A mathematical theory of semantic development in deep neural networks". *PNAS*. (2019). ([arXiv](https://arXiv.org/abs/1810.10531)). ([PNAS](https://www.pnas.org/content/early/2019/05/16/1820226116)) 237 | - D.V. Raman, A.P. Rotondo, T. O’Leary. "Fundamental bounds on learning performance in neural circuits". *PNAS*. (2019). ([PNAS](https://www.pnas.org/content/116/21/10537)) 238 | - R. C. Wilson, A. Shenhav, M. Straccia, J.D. Cohen. "The Eighty Five Percent Rule for optimal learning". *Nat. Commun.* (2019). ([Nat.Commun.](https://www.nature.com/articles/s41467-019-12552-4)) 239 | 240 | ### Few shot Learning 241 | - A. Cortese, B.D. Martino, M. Kawato. "The neural and cognitive architecture for learning from a small sample". *Curr. Opin. Neurobiol.* **55**, 133–141 (2019). ([sciencedirect](https://www.sciencedirect.com/science/article/pii/S0959438818301077)) 242 | 243 | ### A Critique of Pure Learning 244 | - A. Zador. "A Critique of Pure Learning: What Artificial Neural Networks can Learn from Animal Brains". *Nat. Commun.*(2019). ([bioRxiv](https://www.biorxiv.org/content/10.1101/582643v1)). ([Nat. Commun.](https://www.nature.com/articles/s41467-019-11786-6)) 245 | 246 | ## Brain Decoding & Brain-machine interface 247 | - E. Matsuo, I. Kobayashi, S. Nishimoto, S. Nishida, H. Asoh. "Generating Natural Language Descriptions for Semantic Representations of Human Brain Activity". *ACL SRW.* (2016). ([ACL Anthology](https://aclanthology.info/papers/P16-3004/p16-3004)) 248 | - Y. Güçlütürk, U. Güçlü, K. Seeliger, S.E.Bosch, R.J. van Lier, M.A.J. van Gerven. "Reconstructing perceived faces from brain activations with deep adversarial neural decoding". *NIPS* (2017). ([NIPS](https://papers.nips.cc/paper/7012-reconstructing-perceived-faces-from-brain-activations-with-deep-adversarial-neural-decoding)) 249 | - R. Rao. "Towards Neural Co-Processors for the Brain: Combining Decoding and Encoding in Brain-Computer Interfaces". (2018). 
([arXiv](https://arxiv.org/abs/1811.11876)) 250 | - G. Shen, T. Horikawa, K. Majima, Y. Kamitani. "Deep image reconstruction from human brain activity". *PLOS* (2019). ([PLOS](https://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1006633)) 251 | 252 | ## Others 253 | - M.S. Goldman. "Memory without Feedback in a Neural Network". *Neuron* (2009). ([sciencedirect](https://www.sciencedirect.com/science/article/pii/S0896627308010830?via%3Dihub)) 254 | - R. Yuste. "From the neuron doctrine to neural networks". *Nat. Rev. Neurosci.* 16, 487–497 (2015). ([Nat. Rev. Neurosci.](https://www.nature.com/articles/nrn3962)) 255 | - S. Saxena, J.P. Cunningham. "Towards the neural population doctrine". *Curr. Opin. Neurobiol.* (2019). ([sciencedirect](https://www.sciencedirect.com/science/article/pii/S0959438818300990)) 256 | - D.J. Heeger. "Theory of cortical function". *PNAS*. **114**(8), (2017). ([PNAS](https://www.pnas.org/content/114/8/1773)) 257 | - C.C. Chow, Y. Karimipanah. "Before and beyond the Wilson-Cowan equations". (2019). ([arXiv](https://arxiv.org/abs/1907.07821)) 258 | --------------------------------------------------------------------------------
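For a concrete feel for the comparison methods listed under *Analysis methods for neural networks*, here is a minimal sketch of representational similarity analysis (RSA): build a representational dissimilarity matrix (RDM) for each system over the same stimuli, then compare the two RDMs. The toy data, dimensions, and function names are illustrative assumptions, not taken from any listed paper; it assumes NumPy and SciPy are available.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

def rdm(activations):
    """Condensed representational dissimilarity matrix:
    correlation distance between response patterns for every stimulus pair."""
    return pdist(activations, metric="correlation")

def rsa_similarity(acts_a, acts_b):
    """Second-order similarity: Spearman correlation between the two RDMs."""
    return spearmanr(rdm(acts_a), rdm(acts_b)).correlation

rng = np.random.default_rng(0)
stimuli = rng.standard_normal((20, 50))                   # 20 stimuli, 50 features
model = np.tanh(stimuli @ rng.standard_normal((50, 30)))  # hypothetical model responses
brain = np.tanh(stimuli @ rng.standard_normal((50, 40)))  # hypothetical neural responses
print(rsa_similarity(model, brain))  # second-order similarity; needs no unit-to-unit mapping
```

Because the comparison happens at the level of RDMs, the two systems may have different numbers of units, as above (30 vs. 40).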
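Linear centered kernel alignment, from the Kornblith et al. paper listed under *Centered kernel alignment (CKA)*, fits in a few lines. This follows the standard linear-CKA formula from that paper; the example data are illustrative assumptions.

```python
import numpy as np

def linear_cka(x, y):
    """Linear CKA between activation matrices (n_samples x n_features),
    per Kornblith et al. 2019: ||Y^T X||_F^2 / (||X^T X||_F * ||Y^T Y||_F)."""
    x = x - x.mean(axis=0)  # center each feature column
    y = y - y.mean(axis=0)
    return (np.linalg.norm(x.T @ y) ** 2
            / (np.linalg.norm(x.T @ x) * np.linalg.norm(y.T @ y)))

rng = np.random.default_rng(0)
x = rng.standard_normal((100, 10))  # toy activations: 100 samples, 10 units
q, _ = np.linalg.qr(rng.standard_normal((10, 10)))  # random orthogonal transform
print(linear_cka(x, 3.0 * x @ q))   # 1.0: invariant to rotation and isotropic scaling
```

The invariance printed above is exactly the property that makes CKA attractive for comparing layers whose units have no canonical ordering.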
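The technique under the *Fixed point analysis* heading (Sussillo & Barak; the FixedPointFinder toolbox) can be illustrated by numerically minimizing the network's "speed" q(h) = ½‖F(h) − h‖². Below is a minimal NumPy sketch for a vanilla tanh RNN; the toy weights are an assumption, scaled so the map is contractive and the fixed point is unique, and it is not the listed toolbox's API.

```python
import numpy as np

def find_fixed_point(w, h0, lr=0.1, tol=1e-12, max_iter=20000):
    """Gradient descent on the 'speed' q(h) = 0.5 * ||tanh(W h) - h||^2.
    Minima with q ~ 0 are fixed points of the RNN update h <- tanh(W h)."""
    h = h0.copy()
    for _ in range(max_iter):
        f = np.tanh(w @ h)
        r = f - h                  # residual of the fixed point equation
        q = 0.5 * (r @ r)
        if q < tol:
            break
        # dq/dh = (diag(1 - f^2) W - I)^T r
        grad = ((1.0 - f**2)[:, None] * w - np.eye(len(h))).T @ r
        h -= lr * grad
    return h, q

rng = np.random.default_rng(0)
n = 8
w = rng.standard_normal((n, n))
w *= 0.5 / np.linalg.norm(w, 2)    # contractive map: the unique fixed point is the origin
h_star, speed = find_fixed_point(w, rng.standard_normal(n))
print(speed)                       # ~0 once a fixed point has been reached
```

For trained RNNs one runs this from many initial states and inspects the Jacobian at each candidate point; local minima with q > 0 correspond to the "slow points" discussed by Sussillo & Barak.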