Results 1-25 (46)
1.  Nonlinear Hebbian Learning as a Unifying Principle in Receptive Field Formation 
PLoS Computational Biology  2016;12(9):e1005070.
The development of sensory receptive fields has been modeled in the past by a variety of approaches, including normative models such as sparse coding or independent component analysis, and bottom-up models such as spike-timing-dependent plasticity or the Bienenstock-Cooper-Munro model of synaptic plasticity. Here we show that this variety of approaches can all be unified under a single common principle, namely nonlinear Hebbian learning. When nonlinear Hebbian learning is applied to natural images, receptive field shapes are strongly constrained by the input statistics and preprocessing, but exhibit only modest variation across different choices of nonlinearities in neuron models or synaptic plasticity rules. Neither overcompleteness nor sparse network activity is necessary for the development of localized receptive fields. The analysis of alternative sensory modalities, such as auditory models or V2 development, leads to the same conclusions. In all examples, receptive fields can be predicted a priori by reformulating an abstract model as nonlinear Hebbian learning. Thus, nonlinear Hebbian learning and natural statistics can account for many aspects of receptive field formation across models and sensory modalities.
Author Summary
The question of how the brain self-organizes to develop precisely tuned neurons has puzzled neuroscientists at least since the discoveries of Hubel and Wiesel. In the past decades, a variety of theories and models have been proposed to describe receptive field formation, notably of V1 simple cells, from natural inputs. We cut through the jungle of candidate explanations by demonstrating that a single principle is in fact sufficient to explain receptive field development. Our results follow from two major insights. First, we show that many representative models of sensory development are in fact implementing variations of a common principle: nonlinear Hebbian learning. Second, we reveal that nonlinear Hebbian learning is sufficient for receptive field formation from sensory inputs. The surprising result is that our findings are robust to the specific details of a model, which allows for robust predictions of the learned receptive fields. Nonlinear Hebbian learning is therefore general in two senses: it applies to many models developed by theoreticians, and to many sensory modalities studied by experimental neuroscientists.
doi:10.1371/journal.pcbi.1005070
PMCID: PMC5045191  PMID: 27690349
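The learning principle this abstract describes — a Hebbian weight update passed through a nonlinearity, driven by the statistics of the input — can be sketched in a few lines. This is a minimal illustrative sketch, not the authors' implementation: the cubic nonlinearity, the learning rate, and the random data standing in for preprocessed natural image patches are all assumptions made here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for whitened natural image patches (assumption: in the paper,
# the inputs are preprocessed natural stimuli, not Gaussian noise).
X = rng.standard_normal((10000, 64))

w = rng.standard_normal(64)
w /= np.linalg.norm(w)
eta = 0.005  # illustrative learning rate

def g(y):
    # Any expansive nonlinearity; a cubic is one common choice.
    return y ** 3

for x in X:
    y = w @ x                  # neuronal output before the nonlinearity
    w += eta * g(y) * x        # nonlinear Hebbian update: dw ~ g(y) * x
    w /= np.linalg.norm(w)     # norm constraint keeps the weights bounded

print(np.linalg.norm(w))       # stays at 1 by construction
```

With natural image patches in place of the random data, updates of this form converge to localized, oriented receptive fields; with a linear g the same loop reduces to a PCA-like Hebbian rule.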
2.  25th Annual Computational Neuroscience Meeting: CNS-2016 
Sharpee, Tatyana O. | Destexhe, Alain | Kawato, Mitsuo | Sekulić, Vladislav | Skinner, Frances K. | Wójcik, Daniel K. | Chintaluri, Chaitanya | Cserpán, Dorottya | Somogyvári, Zoltán | Kim, Jae Kyoung | Kilpatrick, Zachary P. | Bennett, Matthew R. | Josić, Kresimir | Elices, Irene | Arroyo, David | Levi, Rafael | Rodriguez, Francisco B. | Varona, Pablo | Hwang, Eunjin | Kim, Bowon | Han, Hio-Been | Kim, Tae | McKenna, James T. | Brown, Ritchie E. | McCarley, Robert W. | Choi, Jee Hyun | Rankin, James | Popp, Pamela Osborn | Rinzel, John | Tabas, Alejandro | Rupp, André | Balaguer-Ballester, Emili | Maturana, Matias I. | Grayden, David B. | Cloherty, Shaun L. | Kameneva, Tatiana | Ibbotson, Michael R. | Meffin, Hamish | Koren, Veronika | Lochmann, Timm | Dragoi, Valentin | Obermayer, Klaus | Psarrou, Maria | Schilstra, Maria | Davey, Neil | Torben-Nielsen, Benjamin | Steuber, Volker | Ju, Huiwen | Yu, Jiao | Hines, Michael L. | Chen, Liang | Yu, Yuguo | Kim, Jimin | Leahy, Will | Shlizerman, Eli | Birgiolas, Justas | Gerkin, Richard C. | Crook, Sharon M. | Viriyopase, Atthaphon | Memmesheimer, Raoul-Martin | Gielen, Stan | Dabaghian, Yuri | DeVito, Justin | Perotti, Luca | Kim, Anmo J. | Fenk, Lisa M. | Cheng, Cheng | Maimon, Gaby | Zhao, Chang | Widmer, Yves | Sprecher, Simon | Senn, Walter | Halnes, Geir | Mäki-Marttunen, Tuomo | Keller, Daniel | Pettersen, Klas H. | Andreassen, Ole A. | Einevoll, Gaute T. | Yamada, Yasunori | Steyn-Ross, Moira L. | Alistair Steyn-Ross, D. | Mejias, Jorge F. | Murray, John D. | Kennedy, Henry | Wang, Xiao-Jing | Kruscha, Alexandra | Grewe, Jan | Benda, Jan | Lindner, Benjamin | Badel, Laurent | Ohta, Kazumi | Tsuchimoto, Yoshiko | Kazama, Hokto | Kahng, B. | Tam, Nicoladie D. | Pollonini, Luca | Zouridakis, George | Soh, Jaehyun | Kim, DaeEun | Yoo, Minsu | Palmer, S. E. 
| Culmone, Viviana | Bojak, Ingo | Ferrario, Andrea | Merrison-Hort, Robert | Borisyuk, Roman | Kim, Chang Sub | Tezuka, Taro | Joo, Pangyu | Rho, Young-Ah | Burton, Shawn D. | Bard Ermentrout, G. | Jeong, Jaeseung | Urban, Nathaniel N. | Marsalek, Petr | Kim, Hoon-Hee | Moon, Seok-hyun | Lee, Do-won | Lee, Sung-beom | Lee, Ji-yong | Molkov, Yaroslav I. | Hamade, Khaldoun | Teka, Wondimu | Barnett, William H. | Kim, Taegyo | Markin, Sergey | Rybak, Ilya A. | Forro, Csaba | Dermutz, Harald | Demkó, László | Vörös, János | Babichev, Andrey | Huang, Haiping | Verduzco-Flores, Sergio | Dos Santos, Filipa | Andras, Peter | Metzner, Christoph | Schweikard, Achim | Zurowski, Bartosz | Roach, James P. | Sander, Leonard M. | Zochowski, Michal R. | Skilling, Quinton M. | Ognjanovski, Nicolette | Aton, Sara J. | Zochowski, Michal | Wang, Sheng-Jun | Ouyang, Guang | Guang, Jing | Zhang, Mingsha | Michael Wong, K. Y. | Zhou, Changsong | Robinson, Peter A. | Sanz-Leon, Paula | Drysdale, Peter M. | Fung, Felix | Abeysuriya, Romesh G. | Rennie, Chris J. | Zhao, Xuelong | Choe, Yoonsuck | Yang, Huei-Fang | Mi, Yuanyuan | Lin, Xiaohan | Wu, Si | Liedtke, Joscha | Schottdorf, Manuel | Wolf, Fred | Yamamura, Yoriko | Wickens, Jeffery R. | Rumbell, Timothy | Ramsey, Julia | Reyes, Amy | Draguljić, Danel | Hof, Patrick R. | Luebke, Jennifer | Weaver, Christina M. | He, Hu | Yang, Xu | Ma, Hailin | Xu, Zhiheng | Wang, Yuzhe | Baek, Kwangyeol | Morris, Laurel S. | Kundu, Prantik | Voon, Valerie | Agnes, Everton J. | Vogels, Tim P. | Podlaski, William F. | Giese, Martin | Kuravi, Pradeep | Vogels, Rufin | Seeholzer, Alexander | Podlaski, William | Ranjan, Rajnish | Vogels, Tim | Torres, Joaquin J. | Baroni, Fabiano | Latorre, Roberto | Gips, Bart | Lowet, Eric | Roberts, Mark J. | de Weerd, Peter | Jensen, Ole | van der Eerden, Jan | Goodarzinick, Abdorreza | Niry, Mohammad D. | Valizadeh, Alireza | Pariz, Aref | Parsi, Shervin S. | Warburton, Julia M. 
| Marucci, Lucia | Tamagnini, Francesco | Brown, Jon | Tsaneva-Atanasova, Krasimira | Kleberg, Florence I. | Triesch, Jochen | Moezzi, Bahar | Iannella, Nicolangelo | Schaworonkow, Natalie | Plogmacher, Lukas | Goldsworthy, Mitchell R. | Hordacre, Brenton | McDonnell, Mark D. | Ridding, Michael C. | Zapotocky, Martin | Smit, Daniel | Fouquet, Coralie | Trembleau, Alain | Dasgupta, Sakyasingha | Nishikawa, Isao | Aihara, Kazuyuki | Toyoizumi, Taro | Robb, Daniel T. | Mellen, Nick | Toporikova, Natalia | Tang, Rongxiang | Tang, Yi-Yuan | Liang, Guangsheng | Kiser, Seth A. | Howard, James H. | Goncharenko, Julia | Voronenko, Sergej O. | Ahamed, Tosif | Stephens, Greg | Yger, Pierre | Lefebvre, Baptiste | Spampinato, Giulia Lia Beatrice | Esposito, Elric | Stimberg, Marcel | Marre, Olivier | Choi, Hansol | Song, Min-Ho | Chung, SueYeon | Lee, Dan D. | Sompolinsky, Haim | Phillips, Ryan S. | Smith, Jeffrey | Chatzikalymniou, Alexandra Pierri | Ferguson, Katie | Alex Cayco Gajic, N. | Clopath, Claudia | Angus Silver, R. | Gleeson, Padraig | Marin, Boris | Sadeh, Sadra | Quintana, Adrian | Cantarelli, Matteo | Dura-Bernal, Salvador | Lytton, William W. | Davison, Andrew | Li, Luozheng | Zhang, Wenhao | Wang, Dahui | Song, Youngjo | Park, Sol | Choi, Ilhwan | Shin, Hee-sup | Choi, Hannah | Pasupathy, Anitha | Shea-Brown, Eric | Huh, Dongsung | Sejnowski, Terrence J. | Vogt, Simon M. | Kumar, Arvind | Schmidt, Robert | Van Wert, Stephen | Schiff, Steven J. | Veale, Richard | Scheutz, Matthias | Lee, Sang Wan | Gallinaro, Júlia | Rotter, Stefan | Rubchinsky, Leonid L. | Cheung, Chung Ching | Ratnadurai-Giridharan, Shivakeshavan | Shomali, Safura Rashid | Ahmadabadi, Majid Nili | Shimazaki, Hideaki | Nader Rasuli, S. | Zhao, Xiaochen | Rasch, Malte J. | Wilting, Jens | Priesemann, Viola | Levina, Anna | Rudelt, Lucas | Lizier, Joseph T. | Spinney, Richard E. 
| Rubinov, Mikail | Wibral, Michael | Bak, Ji Hyun | Pillow, Jonathan | Zaho, Yuan | Park, Il Memming | Kang, Jiyoung | Park, Hae-Jeong | Jang, Jaeson | Paik, Se-Bum | Choi, Woochul | Lee, Changju | Song, Min | Lee, Hyeonsu | Park, Youngjin | Yilmaz, Ergin | Baysal, Veli | Ozer, Mahmut | Saska, Daniel | Nowotny, Thomas | Chan, Ho Ka | Diamond, Alan | Herrmann, Christoph S. | Murray, Micah M. | Ionta, Silvio | Hutt, Axel | Lefebvre, Jérémie | Weidel, Philipp | Duarte, Renato | Morrison, Abigail | Lee, Jung H. | Iyer, Ramakrishnan | Mihalas, Stefan | Koch, Christof | Petrovici, Mihai A. | Leng, Luziwei | Breitwieser, Oliver | Stöckel, David | Bytschok, Ilja | Martel, Roman | Bill, Johannes | Schemmel, Johannes | Meier, Karlheinz | Esler, Timothy B. | Burkitt, Anthony N. | Kerr, Robert R. | Tahayori, Bahman | Nolte, Max | Reimann, Michael W. | Muller, Eilif | Markram, Henry | Parziale, Antonio | Senatore, Rosa | Marcelli, Angelo | Skiker, K. | Maouene, M. | Neymotin, Samuel A. | Seidenstein, Alexandra | Lakatos, Peter | Sanger, Terence D. | Menzies, Rosemary J. | McLauchlan, Campbell | van Albada, Sacha J. | Kedziora, David J. | Neymotin, Samuel | Kerr, Cliff C. | Suter, Benjamin A. | Shepherd, Gordon M. G. | Ryu, Juhyoung | Lee, Sang-Hun | Lee, Joonwon | Lee, Hyang Jung | Lim, Daeseob | Wang, Jisung | Lee, Heonsoo | Jung, Nam | Anh Quang, Le | Maeng, Seung Eun | Lee, Tae Ho | Lee, Jae Woo | Park, Chang-hyun | Ahn, Sora | Moon, Jangsup | Choi, Yun Seo | Kim, Juhee | Jun, Sang Beom | Lee, Seungjun | Lee, Hyang Woon | Jo, Sumin | Jun, Eunji | Yu, Suin | Goetze, Felix | Lai, Pik-Yin | Kim, Seonghyun | Kwag, Jeehyun | Jang, Hyun Jae | Filipović, Marko | Reig, Ramon | Aertsen, Ad | Silberberg, Gilad | Bachmann, Claudia | Buttler, Simone | Jacobs, Heidi | Dillen, Kim | Fink, Gereon R. 
| Kukolja, Juraj | Kepple, Daniel | Giaffar, Hamza | Rinberg, Dima | Shea, Steven | Koulakov, Alex | Bahuguna, Jyotika | Tetzlaff, Tom | Kotaleski, Jeanette Hellgren | Kunze, Tim | Peterson, Andre | Knösche, Thomas | Kim, Minjung | Kim, Hojeong | Park, Ji Sung | Yeon, Ji Won | Kim, Sung-Phil | Kang, Jae-Hwan | Lee, Chungho | Spiegler, Andreas | Petkoski, Spase | Palva, Matias J. | Jirsa, Viktor K. | Saggio, Maria L. | Siep, Silvan F. | Stacey, William C. | Bernar, Christophe | Choung, Oh-hyeon | Jeong, Yong | Lee, Yong-il | Kim, Su Hyun | Jeong, Mir | Lee, Jeungmin | Kwon, Jaehyung | Kralik, Jerald D. | Jahng, Jaehwan | Hwang, Dong-Uk | Kwon, Jae-Hyung | Park, Sang-Min | Kim, Seongkyun | Kim, Hyoungkyu | Kim, Pyeong Soo | Yoon, Sangsup | Lim, Sewoong | Park, Choongseok | Miller, Thomas | Clements, Katie | Ahn, Sungwoo | Ji, Eoon Hye | Issa, Fadi A. | Baek, JeongHun | Oba, Shigeyuki | Yoshimoto, Junichiro | Doya, Kenji | Ishii, Shin | Mosqueiro, Thiago S. | Strube-Bloss, Martin F. | Smith, Brian | Huerta, Ramon | Hadrava, Michal | Hlinka, Jaroslav | Bos, Hannah | Helias, Moritz | Welzig, Charles M. | Harper, Zachary J. | Kim, Won Sup | Shin, In-Seob | Baek, Hyeon-Man | Han, Seung Kee | Richter, René | Vitay, Julien | Beuth, Frederick | Hamker, Fred H. | Toppin, Kelly | Guo, Yixin | Graham, Bruce P. | Kale, Penelope J. | Gollo, Leonardo L. | Stern, Merav | Abbott, L. F. | Fedorov, Leonid A. | Giese, Martin A. | Ardestani, Mohammad Hovaidi | Faraji, Mohammad Javad | Preuschoff, Kerstin | Gerstner, Wulfram | van Gendt, Margriet J. | Briaire, Jeroen J. | Kalkman, Randy K. | Frijns, Johan H. M. | Lee, Won Hee | Frangou, Sophia | Fulcher, Ben D. | Tran, Patricia H. P. | Fornito, Alex | Gliske, Stephen V. | Lim, Eugene | Holman, Katherine A. | Fink, Christian G. | Kim, Jinseop S. | Mu, Shang | Briggman, Kevin L. | Sebastian Seung, H. | Wegener, Detlef | Bohnenkamp, Lisa | Ernst, Udo A. | Devor, Anna | Dale, Anders M. | Lines, Glenn T. 
| Edwards, Andy | Tveito, Aslak | Hagen, Espen | Senk, Johanna | Diesmann, Markus | Schmidt, Maximilian | Bakker, Rembrandt | Shen, Kelly | Bezgin, Gleb | Hilgetag, Claus-Christian | van Albada, Sacha Jennifer | Sun, Haoqi | Sourina, Olga | Huang, Guang-Bin | Klanner, Felix | Denk, Cornelia | Glomb, Katharina | Ponce-Alvarez, Adrián | Gilson, Matthieu | Ritter, Petra | Deco, Gustavo | Witek, Maria A. G. | Clarke, Eric F. | Hansen, Mads | Wallentin, Mikkel | Kringelbach, Morten L. | Vuust, Peter | Klingbeil, Guido | De Schutter, Erik | Chen, Weiliang | Zang, Yunliang | Hong, Sungho | Takashima, Akira | Zamora, Criseida | Gallimore, Andrew R. | Goldschmidt, Dennis | Manoonpong, Poramate | Karoly, Philippa J. | Freestone, Dean R. | Soundry, Daniel | Kuhlmann, Levin | Paninski, Liam | Cook, Mark | Lee, Jaejin | Fishman, Yonatan I. | Cohen, Yale E. | Roberts, James A. | Cocchi, Luca | Sweeney, Yann | Lee, Soohyun | Jung, Woo-Sung | Kim, Youngsoo | Jung, Younginha | Song, Yoon-Kyu | Chavane, Frédéric | Soman, Karthik | Muralidharan, Vignesh | Srinivasa Chakravarthy, V. | Shivkumar, Sabyasachi | Mandali, Alekhya | Pragathi Priyadharsini, B. | Mehta, Hima | Davey, Catherine E. | Brinkman, Braden A. W. | Kekona, Tyler | Rieke, Fred | Buice, Michael | De Pittà, Maurizio | Berry, Hugues | Brunel, Nicolas | Breakspear, Michael | Marsat, Gary | Drew, Jordan | Chapman, Phillip D. | Daly, Kevin C. | Bradle, Samual P. | Seo, Sat Byul | Su, Jianzhong | Kavalali, Ege T. | Blackwell, Justin | Shiau, LieJune | Buhry, Laure | Basnayake, Kanishka | Lee, Sue-Hyun | Levy, Brandon A. | Baker, Chris I. | Leleu, Timothée | Philips, Ryan T. | Chhabria, Karishma
BMC Neuroscience  2016;17(Suppl 1):54.
Table of contents
A1 Functional advantages of cell-type heterogeneity in neural circuits
Tatyana O. Sharpee
A2 Mesoscopic modeling of propagating waves in visual cortex
Alain Destexhe
A3 Dynamics and biomarkers of mental disorders
Mitsuo Kawato
F1 Precise recruitment of spiking output at theta frequencies requires dendritic h-channels in multi-compartment models of oriens-lacunosum/moleculare hippocampal interneurons
Vladislav Sekulić, Frances K. Skinner
F2 Kernel methods in reconstruction of current sources from extracellular potentials for single cells and the whole brains
Daniel K. Wójcik, Chaitanya Chintaluri, Dorottya Cserpán, Zoltán Somogyvári
F3 The synchronized periods depend on intracellular transcriptional repression mechanisms in circadian clocks
Jae Kyoung Kim, Zachary P. Kilpatrick, Matthew R. Bennett, Kresimir Josić
O1 Assessing irregularity and coordination of spiking-bursting rhythms in central pattern generators
Irene Elices, David Arroyo, Rafael Levi, Francisco B. Rodriguez, Pablo Varona
O2 Regulation of top-down processing by cortically-projecting parvalbumin positive neurons in basal forebrain
Eunjin Hwang, Bowon Kim, Hio-Been Han, Tae Kim, James T. McKenna, Ritchie E. Brown, Robert W. McCarley, Jee Hyun Choi
O3 Modeling auditory stream segregation, build-up and bistability
James Rankin, Pamela Osborn Popp, John Rinzel
O4 Strong competition between tonotopic neural ensembles explains pitch-related dynamics of auditory cortex evoked fields
Alejandro Tabas, André Rupp, Emili Balaguer-Ballester
O5 A simple model of retinal response to multi-electrode stimulation
Matias I. Maturana, David B. Grayden, Shaun L. Cloherty, Tatiana Kameneva, Michael R. Ibbotson, Hamish Meffin
O6 Noise correlations in V4 area correlate with behavioral performance in visual discrimination task
Veronika Koren, Timm Lochmann, Valentin Dragoi, Klaus Obermayer
O7 Input-location dependent gain modulation in cerebellar nucleus neurons
Maria Psarrou, Maria Schilstra, Neil Davey, Benjamin Torben-Nielsen, Volker Steuber
O8 Analytic solution of cable energy function for cortical axons and dendrites
Huiwen Ju, Jiao Yu, Michael L. Hines, Liang Chen, Yuguo Yu
O9 C. elegans interactome: interactive visualization of Caenorhabditis elegans worm neuronal network
Jimin Kim, Will Leahy, Eli Shlizerman
O10 Is the model any good? Objective criteria for computational neuroscience model selection
Justas Birgiolas, Richard C. Gerkin, Sharon M. Crook
O11 Cooperation and competition of gamma oscillation mechanisms
Atthaphon Viriyopase, Raoul-Martin Memmesheimer, Stan Gielen
O12 A discrete structure of the brain waves
Yuri Dabaghian, Justin DeVito, Luca Perotti
O13 Direction-specific silencing of the Drosophila gaze stabilization system
Anmo J. Kim, Lisa M. Fenk, Cheng Lyu, Gaby Maimon
O14 What does the fruit fly think about values? A model of olfactory associative learning
Chang Zhao, Yves Widmer, Simon Sprecher, Walter Senn
O15 Effects of ionic diffusion on power spectra of local field potentials (LFP)
Geir Halnes, Tuomo Mäki-Marttunen, Daniel Keller, Klas H. Pettersen, Ole A. Andreassen, Gaute T. Einevoll
O16 Large-scale cortical models towards understanding relationship between brain structure abnormalities and cognitive deficits
Yasunori Yamada
O17 Spatial coarse-graining the brain: origin of minicolumns
Moira L. Steyn-Ross, D. Alistair Steyn-Ross
O18 Modeling large-scale cortical networks with laminar structure
Jorge F. Mejias, John D. Murray, Henry Kennedy, Xiao-Jing Wang
O19 Information filtering by partial synchronous spikes in a neural population
Alexandra Kruscha, Jan Grewe, Jan Benda, Benjamin Lindner
O20 Decoding context-dependent olfactory valence in Drosophila
Laurent Badel, Kazumi Ohta, Yoshiko Tsuchimoto, Hokto Kazama
P1 Neural network as a scale-free network: the role of a hub
B. Kahng
P2 Hemodynamic responses to emotions and decisions using near-infrared spectroscopy optical imaging
Nicoladie D. Tam
P3 Phase space analysis of hemodynamic responses to intentional movement directions using functional near-infrared spectroscopy (fNIRS) optical imaging technique
Nicoladie D. Tam, Luca Pollonini, George Zouridakis
P4 Modeling jamming avoidance of weakly electric fish
Jaehyun Soh, DaeEun Kim
P5 Synergy and redundancy of retinal ganglion cells in prediction
Minsu Yoo, S. E. Palmer
P6 A neural field model with a third dimension representing cortical depth
Viviana Culmone, Ingo Bojak
P7 Network analysis of a probabilistic connectivity model of the Xenopus tadpole spinal cord
Andrea Ferrario, Robert Merrison-Hort, Roman Borisyuk
P8 The recognition dynamics in the brain
Chang Sub Kim
P9 Multivariate spike train analysis using a positive definite kernel
Taro Tezuka
P10 Synchronization of burst periods may govern slow brain dynamics during general anesthesia
Pangyu Joo
P11 The ionic basis of heterogeneity affects stochastic synchrony
Young-Ah Rho, Shawn D. Burton, G. Bard Ermentrout, Jaeseung Jeong, Nathaniel N. Urban
P12 Circular statistics of noise in spike trains with a periodic component
Petr Marsalek
P14 Representations of directions in EEG-BCI using Gaussian readouts
Hoon-Hee Kim, Seok-hyun Moon, Do-won Lee, Sung-beom Lee, Ji-yong Lee, Jaeseung Jeong
P15 Action selection and reinforcement learning in basal ganglia during reaching movements
Yaroslav I. Molkov, Khaldoun Hamade, Wondimu Teka, William H. Barnett, Taegyo Kim, Sergey Markin, Ilya A. Rybak
P17 Axon guidance: modeling axonal growth in T-Junction assay
Csaba Forro, Harald Dermutz, László Demkó, János Vörös
P19 Transient cell assembly networks encode persistent spatial memories
Yuri Dabaghian, Andrey Babichev
P20 Theory of population coupling and applications to describe high order correlations in large populations of interacting neurons
Haiping Huang
P21 Design of biologically-realistic simulations for motor control
Sergio Verduzco-Flores
P22 Towards understanding the functional impact of the behavioural variability of neurons
Filipa Dos Santos, Peter Andras
P23 Different oscillatory dynamics underlying gamma entrainment deficits in schizophrenia
Christoph Metzner, Achim Schweikard, Bartosz Zurowski
P24 Memory recall and spike frequency adaptation
James P. Roach, Leonard M. Sander, Michal R. Zochowski
P25 Stability of neural networks and memory consolidation preferentially occur near criticality
Quinton M. Skilling, Nicolette Ognjanovski, Sara J. Aton, Michal Zochowski
P26 Stochastic Oscillation in Self-Organized Critical States of Small Systems: Sensitive Resting State in Neural Systems
Sheng-Jun Wang, Guang Ouyang, Jing Guang, Mingsha Zhang, K. Y. Michael Wong, Changsong Zhou
P27 Neurofield: a C++ library for fast simulation of 2D neural field models
Peter A. Robinson, Paula Sanz-Leon, Peter M. Drysdale, Felix Fung, Romesh G. Abeysuriya, Chris J. Rennie, Xuelong Zhao
P28 Action-based grounding: Beyond encoding/decoding in neural code
Yoonsuck Choe, Huei-Fang Yang
P29 Neural computation in a dynamical system with multiple time scales
Yuanyuan Mi, Xiaohan Lin, Si Wu
P30 Maximum entropy models for 3D layouts of orientation selectivity
Joscha Liedtke, Manuel Schottdorf, Fred Wolf
P31 A behavioral assay for probing computations underlying curiosity in rodents
Yoriko Yamamura, Jeffery R. Wickens
P32 Using statistical sampling to balance error function contributions to optimization of conductance-based models
Timothy Rumbell, Julia Ramsey, Amy Reyes, Danel Draguljić, Patrick R. Hof, Jennifer Luebke, Christina M. Weaver
P33 Exploration and implementation of a self-growing and self-organizing neuron network building algorithm
Hu He, Xu Yang, Hailin Ma, Zhiheng Xu, Yuzhe Wang
P34 Disrupted resting state brain network in obese subjects: a data-driven graph theory analysis
Kwangyeol Baek, Laurel S. Morris, Prantik Kundu, Valerie Voon
P35 Dynamics of cooperative excitatory and inhibitory plasticity
Everton J. Agnes, Tim P. Vogels
P36 Frequency-dependent oscillatory signal gating in feed-forward networks of integrate-and-fire neurons
William F. Podlaski, Tim P. Vogels
P37 Phenomenological neural model for adaptation of neurons in area IT
Martin Giese, Pradeep Kuravi, Rufin Vogels
P38 ICGenealogy: towards a common topology of neuronal ion channel function and genealogy in model and experiment
Alexander Seeholzer, William Podlaski, Rajnish Ranjan, Tim Vogels
P39 Temporal input discrimination from the interaction between dynamic synapses and neural subthreshold oscillations
Joaquin J. Torres, Fabiano Baroni, Roberto Latorre, Pablo Varona
P40 Different roles for transient and sustained activity during active visual processing
Bart Gips, Eric Lowet, Mark J. Roberts, Peter de Weerd, Ole Jensen, Jan van der Eerden
P41 Scale-free functional networks of 2D Ising model are highly robust against structural defects: neuroscience implications
Abdorreza Goodarzinick, Mohammad D. Niry, Alireza Valizadeh
P42 High frequency neuron can facilitate propagation of signal in neural networks
Aref Pariz, Shervin S. Parsi, Alireza Valizadeh
P43 Investigating the effect of Alzheimer’s disease related amyloidopathy on gamma oscillations in the CA1 region of the hippocampus
Julia M. Warburton, Lucia Marucci, Francesco Tamagnini, Jon Brown, Krasimira Tsaneva-Atanasova
P44 Long-tailed distributions of inhibitory and excitatory weights in a balanced network with eSTDP and iSTDP
Florence I. Kleberg, Jochen Triesch
P45 Simulation of EMG recording from hand muscle due to TMS of motor cortex
Bahar Moezzi, Nicolangelo Iannella, Natalie Schaworonkow, Lukas Plogmacher, Mitchell R. Goldsworthy, Brenton Hordacre, Mark D. McDonnell, Michael C. Ridding, Jochen Triesch
P46 Structure and dynamics of axon network formed in primary cell culture
Martin Zapotocky, Daniel Smit, Coralie Fouquet, Alain Trembleau
P47 Efficient signal processing and sampling in random networks that generate variability
Sakyasingha Dasgupta, Isao Nishikawa, Kazuyuki Aihara, Taro Toyoizumi
P48 Modeling the effect of riluzole on bursting in respiratory neural networks
Daniel T. Robb, Nick Mellen, Natalia Toporikova
P49 Mapping relaxation training using effective connectivity analysis
Rongxiang Tang, Yi-Yuan Tang
P50 Modeling neuron oscillation of implicit sequence learning
Guangsheng Liang, Seth A. Kiser, James H. Howard, Jr., Yi-Yuan Tang
P51 The role of cerebellar short-term synaptic plasticity in the pathology and medication of downbeat nystagmus
Julia Goncharenko, Neil Davey, Maria Schilstra, Volker Steuber
P52 Nonlinear response of noisy neurons
Sergej O. Voronenko, Benjamin Lindner
P53 Behavioral embedding suggests multiple chaotic dimensions underlie C. elegans locomotion
Tosif Ahamed, Greg Stephens
P54 Fast and scalable spike sorting for large and dense multi-electrodes recordings
Pierre Yger, Baptiste Lefebvre, Giulia Lia Beatrice Spampinato, Elric Esposito, Marcel Stimberg, Olivier Marre
P55 Sufficient sampling rates for fast hand motion tracking
Hansol Choi, Min-Ho Song
P56 Linear readout of object manifolds
SueYeon Chung, Dan D. Lee, Haim Sompolinsky
P57 Differentiating models of intrinsic bursting and rhythm generation of the respiratory pre-Bötzinger complex using phase response curves
Ryan S. Phillips, Jeffrey Smith
P58 The effect of inhibitory cell network interactions during theta rhythms on extracellular field potentials in CA1 hippocampus
Alexandra Pierri Chatzikalymniou, Katie Ferguson, Frances K. Skinner
P59 Expansion recoding through sparse sampling in the cerebellar input layer speeds learning
N. Alex Cayco Gajic, Claudia Clopath, R. Angus Silver
P60 A set of curated cortical models at multiple scales on Open Source Brain
Padraig Gleeson, Boris Marin, Sadra Sadeh, Adrian Quintana, Matteo Cantarelli, Salvador Dura-Bernal, William W. Lytton, Andrew Davison, R. Angus Silver
P61 A synaptic story of dynamical information encoding in neural adaptation
Luozheng Li, Wenhao Zhang, Yuanyuan Mi, Dahui Wang, Si Wu
P62 Physical modeling of rule-observant rodent behavior
Youngjo Song, Sol Park, Ilhwan Choi, Jaeseung Jeong, Hee-sup Shin
P64 Predictive coding in area V4 and prefrontal cortex explains dynamic discrimination of partially occluded shapes
Hannah Choi, Anitha Pasupathy, Eric Shea-Brown
P65 Stability of FORCE learning on spiking and rate-based networks
Dongsung Huh, Terrence J. Sejnowski
P66 Stabilising STDP in striatal neurons for reliable fast state recognition in noisy environments
Simon M. Vogt, Arvind Kumar, Robert Schmidt
P67 Electrodiffusion in one- and two-compartment neuron models for characterizing cellular effects of electrical stimulation
Stephen Van Wert, Steven J. Schiff
P68 STDP improves speech recognition capabilities in spiking recurrent circuits parameterized via differential evolution Markov Chain Monte Carlo
Richard Veale, Matthias Scheutz
P69 Bidirectional transformation between dominant cortical neural activities and phase difference distributions
Sang Wan Lee
P70 Maturation of sensory networks through homeostatic structural plasticity
Júlia Gallinaro, Stefan Rotter
P71 Corticothalamic dynamics: structure, number of solutions and stability of steady-state solutions in the space of synaptic couplings
Paula Sanz-Leon, Peter A. Robinson
P72 Optogenetic versus electrical stimulation of the parkinsonian basal ganglia. Computational study
Leonid L. Rubchinsky, Chung Ching Cheung, Shivakeshavan Ratnadurai-Giridharan
P73 Exact spike-timing distribution reveals higher-order interactions of neurons
Safura Rashid Shomali, Majid Nili Ahmadabadi, Hideaki Shimazaki, S. Nader Rasuli
P74 Neural mechanism of visual perceptual learning using a multi-layered neural network
Xiaochen Zhao, Malte J. Rasch
P75 Inferring collective spiking dynamics from mostly unobserved systems
Jens Wilting, Viola Priesemann
P76 How to infer distributions in the brain from subsampled observations
Anna Levina, Viola Priesemann
P77 Influences of embedding and estimation strategies on the inferred memory of single spiking neurons
Lucas Rudelt, Joseph T. Lizier, Viola Priesemann
P78 A nearest-neighbours based estimator for transfer entropy between spike trains
Joseph T. Lizier, Richard E. Spinney, Mikail Rubinov, Michael Wibral, Viola Priesemann
P79 Active learning of psychometric functions with multinomial logistic models
Ji Hyun Bak, Jonathan Pillow
P81 Inferring low-dimensional network dynamics with variational latent Gaussian process
Yuan Zhao, Il Memming Park
P82 Computational investigation of energy landscapes in the resting state subcortical brain network
Jiyoung Kang, Hae-Jeong Park
P83 Local repulsive interaction between retinal ganglion cells can generate a consistent spatial periodicity of orientation map
Jaeson Jang, Se-Bum Paik
P84 Phase duration of bistable perception reveals intrinsic time scale of perceptual decision under noisy condition
Woochul Choi, Se-Bum Paik
P85 Feedforward convergence between retina and primary visual cortex can determine the structure of orientation map
Changju Lee, Jaeson Jang, Se-Bum Paik
P86 Computational method classifying neural network activity patterns for imaging data
Min Song, Hyeonsu Lee, Se-Bum Paik
P87 Symmetry of spike-timing-dependent-plasticity kernels regulates volatility of memory
Youngjin Park, Woochul Choi, Se-Bum Paik
P88 Effects of time-periodic coupling strength on the first-spike latency dynamics of a scale-free network of stochastic Hodgkin-Huxley neurons
Ergin Yilmaz, Veli Baysal, Mahmut Ozer
P89 Spectral properties of spiking responses in V1 and V4 change within the trial and are highly relevant for behavioral performance
Veronika Koren, Klaus Obermayer
P90 Methods for building accurate models of individual neurons
Daniel Saska, Thomas Nowotny
P91 A full size mathematical model of the early olfactory system of honeybees
Ho Ka Chan, Alan Diamond, Thomas Nowotny
P92 Stimulation-induced tuning of ongoing oscillations in spiking neural networks
Christoph S. Herrmann, Micah M. Murray, Silvio Ionta, Axel Hutt, Jérémie Lefebvre
P93 Decision-specific sequences of neural activity in balanced random networks driven by structured sensory input
Philipp Weidel, Renato Duarte, Abigail Morrison
P94 Modulation of tuning induced by abrupt reduction of SST cell activity
Jung H. Lee, Ramakrishnan Iyer, Stefan Mihalas
P95 The functional role of VIP cell activation during locomotion
Jung H. Lee, Ramakrishnan Iyer, Christof Koch, Stefan Mihalas
P96 Stochastic inference with spiking neural networks
Mihai A. Petrovici, Luziwei Leng, Oliver Breitwieser, David Stöckel, Ilja Bytschok, Roman Martel, Johannes Bill, Johannes Schemmel, Karlheinz Meier
P97 Modeling orientation-selective electrical stimulation with retinal prostheses
Timothy B. Esler, Anthony N. Burkitt, David B. Grayden, Robert R. Kerr, Bahman Tahayori, Hamish Meffin
P98 Ion channel noise can explain firing correlation in auditory nerves
Bahar Moezzi, Nicolangelo Iannella, Mark D. McDonnell
P99 Limits of temporal encoding of thalamocortical inputs in a neocortical microcircuit
Max Nolte, Michael W. Reimann, Eilif Muller, Henry Markram
P100 On the representation of arm reaching movements: a computational model
Antonio Parziale, Rosa Senatore, Angelo Marcelli
P101 A computational model for investigating the role of cerebellum in acquisition and retention of motor behavior
Rosa Senatore, Antonio Parziale, Angelo Marcelli
P102 The emergence of semantic categories from a large-scale brain network of semantic knowledge
K. Skiker, M. Maouene
P103 Multiscale modeling of M1 multitarget pharmacotherapy for dystonia
Samuel A. Neymotin, Salvador Dura-Bernal, Alexandra Seidenstein, Peter Lakatos, Terence D. Sanger, William W. Lytton
P104 Effect of network size on computational capacity
Salvador Dura-Bernal, Rosemary J. Menzies, Campbell McLauchlan, Sacha J. van Albada, David J. Kedziora, Samuel Neymotin, William W. Lytton, Cliff C. Kerr
P105 NetPyNE: a Python package for NEURON to facilitate development and parallel simulation of biological neuronal networks
Salvador Dura-Bernal, Benjamin A. Suter, Samuel A. Neymotin, Cliff C. Kerr, Adrian Quintana, Padraig Gleeson, Gordon M. G. Shepherd, William W. Lytton
P107 Inter-areal and inter-regional inhomogeneity in co-axial anisotropy of Cortical Point Spread in human visual areas
Juhyoung Ryu, Sang-Hun Lee
P108 Two bayesian quanta of uncertainty explain the temporal dynamics of cortical activity in the non-sensory areas during bistable perception
Joonwon Lee, Sang-Hun Lee
P109 Optimal and suboptimal integration of sensory and value information in perceptual decision making
Hyang Jung Lee, Sang-Hun Lee
P110 A Bayesian algorithm for phoneme Perception and its neural implementation
Daeseob Lim, Sang-Hun Lee
P111 Complexity of EEG signals is reduced during unconsciousness induced by ketamine and propofol
Jisung Wang, Heonsoo Lee
P112 Self-organized criticality of neural avalanche in a neural model on complex networks
Nam Jung, Le Anh Quang, Seung Eun Maeng, Tae Ho Lee, Jae Woo Lee
P113 Dynamic alterations in connection topology of the hippocampal network during ictal-like epileptiform activity in an in vitro rat model
Chang-hyun Park, Sora Ahn, Jangsup Moon, Yun Seo Choi, Juhee Kim, Sang Beom Jun, Seungjun Lee, Hyang Woon Lee
P114 Computational model to replicate seizure suppression effect by electrical stimulation
Sora Ahn, Sumin Jo, Eunji Jun, Suin Yu, Hyang Woon Lee, Sang Beom Jun, Seungjun Lee
P115 Identifying excitatory and inhibitory synapses in neuronal networks from spike trains using sorted local transfer entropy
Felix Goetze, Pik-Yin Lai
P116 Neural network model for obstacle avoidance based on neuromorphic computational model of boundary vector cell and head direction cell
Seonghyun Kim, Jeehyun Kwag
P117 Dynamic gating of spike pattern propagation by Hebbian and anti-Hebbian spike timing-dependent plasticity in excitatory feedforward network model
Hyun Jae Jang, Jeehyun Kwag
P118 Inferring characteristics of input correlations of cells exhibiting up-down state transitions in the rat striatum
Marko Filipović, Ramon Reig, Ad Aertsen, Gilad Silberberg, Arvind Kumar
P119 Graph properties of the functional connected brain under the influence of Alzheimer’s disease
Claudia Bachmann, Simone Buttler, Heidi Jacobs, Kim Dillen, Gereon R. Fink, Juraj Kukolja, Abigail Morrison
P120 Learning sparse representations in the olfactory bulb
Daniel Kepple, Hamza Giaffar, Dima Rinberg, Steven Shea, Alex Koulakov
P121 Functional classification of homologous basal-ganglia networks
Jyotika Bahuguna, Tom Tetzlaff, Abigail Morrison, Arvind Kumar, Jeanette Hellgren Kotaleski
P122 Short term memory based on multistability
Tim Kunze, Andre Peterson, Thomas Knösche
P123 A physiologically plausible, computationally efficient model and simulation software for mammalian motor units
Minjung Kim, Hojeong Kim
P125 Decoding laser-induced somatosensory information from EEG
Ji Sung Park, Ji Won Yeon, Sung-Phil Kim
P126 Phase synchronization of alpha activity for EEG-based personal authentication
Jae-Hwan Kang, Chungho Lee, Sung-Phil Kim
P129 Investigating phase-lags in sEEG data using spatially distributed time delays in a large-scale brain network model
Andreas Spiegler, Spase Petkoski, Matias J. Palva, Viktor K. Jirsa
P130 Epileptic seizures in the unfolding of a codimension-3 singularity
Maria L. Saggio, Silvan F. Siep, Andreas Spiegler, William C. Stacey, Christophe Bernard, Viktor K. Jirsa
P131 Incremental dimensional exploratory reasoning under multi-dimensional environment
Oh-hyeon Choung, Yong Jeong
P132 A low-cost model of eye movements and memory in personal visual cognition
Yong-il Lee, Jaeseung Jeong
P133 Complex network analysis of structural connectome of autism spectrum disorder patients
Su Hyun Kim, Mir Jeong, Jaeseung Jeong
P134 Cognitive motives and the neural correlates underlying human social information transmission, gossip
Jeungmin Lee, Jaehyung Kwon, Jerald D. Kralik, Jaeseung Jeong
P135 EEG hyperscanning detects neural oscillation for the social interaction during the economic decision-making
Jaehwan Jahng, Dong-Uk Hwang, Jaeseung Jeong
P136 Detecting purchase decision based on hyperfrontality of the EEG
Jae-Hyung Kwon, Sang-Min Park, Jaeseung Jeong
P137 Vulnerability-based critical neurons, synapses, and pathways in the Caenorhabditis elegans connectome
Seongkyun Kim, Hyoungkyu Kim, Jerald D. Kralik, Jaeseung Jeong
P138 Motif analysis reveals functionally asymmetrical neurons in C. elegans
Pyeong Soo Kim, Seongkyun Kim, Hyoungkyu Kim, Jaeseung Jeong
P139 Computational approach to preference-based serial decision dynamics: do temporal discounting and working memory affect it?
Sangsup Yoon, Jaehyung Kwon, Sewoong Lim, Jaeseung Jeong
P141 Social stress induced neural network reconfiguration affects decision making and learning in zebrafish
Choongseok Park, Thomas Miller, Katie Clements, Sungwoo Ahn, Eoon Hye Ji, Fadi A. Issa
P142 Descriptive, generative, and hybrid approaches for neural connectivity inference from neural activity data
JeongHun Baek, Shigeyuki Oba, Junichiro Yoshimoto, Kenji Doya, Shin Ishii
P145 Divergent-convergent synaptic connectivities accelerate coding in multilayered sensory systems
Thiago S. Mosqueiro, Martin F. Strube-Bloss, Brian Smith, Ramon Huerta
P146 Swinging networks
Michal Hadrava, Jaroslav Hlinka
P147 Inferring dynamically relevant motifs from oscillatory stimuli: challenges, pitfalls, and solutions
Hannah Bos, Moritz Helias
P148 Spatiotemporal mapping of brain network dynamics during cognitive tasks using magnetoencephalography and deep learning
Charles M. Welzig, Zachary J. Harper
P149 Multiscale complexity analysis for the segmentation of MRI images
Won Sup Kim, In-Seob Shin, Hyeon-Man Baek, Seung Kee Han
P150 A neuro-computational model of emotional attention
René Richter, Julien Vitay, Frederick Beuth, Fred H. Hamker
P151 Multi-site delayed feedback stimulation in parkinsonian networks
Kelly Toppin, Yixin Guo
P152 Bistability in Hodgkin–Huxley-type equations
Tatiana Kameneva, Hamish Meffin, Anthony N. Burkitt, David B. Grayden
P153 Phase changes in postsynaptic spiking due to synaptic connectivity and short term plasticity: mathematical analysis of frequency dependency
Mark D. McDonnell, Bruce P. Graham
P154 Quantifying resilience patterns in brain networks: the importance of directionality
Penelope J. Kale, Leonardo L. Gollo
P155 Dynamics of rate-model networks with separate excitatory and inhibitory populations
Merav Stern, L. F. Abbott
P156 A model for multi-stable dynamics in action recognition modulated by integration of silhouette and shading cues
Leonid A. Fedorov, Martin A. Giese
P157 Spiking model for the interaction between action recognition and action execution
Mohammad Hovaidi Ardestani, Martin Giese
P158 Surprise-modulated belief update: how to learn within changing environments?
Mohammad Javad Faraji, Kerstin Preuschoff, Wulfram Gerstner
P159 A fast, stochastic and adaptive model of auditory nerve responses to cochlear implant stimulation
Margriet J. van Gendt, Jeroen J. Briaire, Randy K. Kalkman, Johan H. M. Frijns
P160 Quantitative comparison of graph theoretical measures of simulated and empirical functional brain networks
Won Hee Lee, Sophia Frangou
P161 Determining discriminative properties of fMRI signals in schizophrenia using highly comparative time-series analysis
Ben D. Fulcher, Patricia H. P. Tran, Alex Fornito
P162 Emergence of narrowband LFP oscillations from completely asynchronous activity during seizures and high-frequency oscillations
Stephen V. Gliske, William C. Stacey, Eugene Lim, Katherine A. Holman, Christian G. Fink
P163 Neuronal diversity in structure and function: cross-validation of anatomical and physiological classification of retinal ganglion cells in the mouse
Jinseop S. Kim, Shang Mu, Kevin L. Briggman, H. Sebastian Seung, the EyeWirers
P164 Analysis and modelling of transient firing rate changes in area MT in response to rapid stimulus feature changes
Detlef Wegener, Lisa Bohnenkamp, Udo A. Ernst
P165 Step-wise model fitting accounting for high-resolution spatial measurements: construction of a layer V pyramidal cell model with reduced morphology
Tuomo Mäki-Marttunen, Geir Halnes, Anna Devor, Christoph Metzner, Anders M. Dale, Ole A. Andreassen, Gaute T. Einevoll
P166 Contributions of schizophrenia-associated genes to neuron firing and cardiac pacemaking: a polygenic modeling approach
Tuomo Mäki-Marttunen, Glenn T. Lines, Andy Edwards, Aslak Tveito, Anders M. Dale, Gaute T. Einevoll, Ole A. Andreassen
P167 Local field potentials in a 4 × 4 mm2 multi-layered network model
Espen Hagen, Johanna Senk, Sacha J. van Albada, Markus Diesmann
P168 A spiking network model explains multi-scale properties of cortical dynamics
Maximilian Schmidt, Rembrandt Bakker, Kelly Shen, Gleb Bezgin, Claus-Christian Hilgetag, Markus Diesmann, Sacha Jennifer van Albada
P169 Using joint weight-delay spike-timing dependent plasticity to find polychronous neuronal groups
Haoqi Sun, Olga Sourina, Guang-Bin Huang, Felix Klanner, Cornelia Denk
P170 Tensor decomposition reveals RSNs in simulated resting state fMRI
Katharina Glomb, Adrián Ponce-Alvarez, Matthieu Gilson, Petra Ritter, Gustavo Deco
P171 Getting in the groove: testing a new model-based method for comparing task-evoked vs resting-state activity in fMRI data on music listening
Matthieu Gilson, Maria AG Witek, Eric F. Clarke, Mads Hansen, Mikkel Wallentin, Gustavo Deco, Morten L. Kringelbach, Peter Vuust
P172 STochastic engine for pathway simulation (STEPS) on massively parallel processors
Guido Klingbeil, Erik De Schutter
P173 Toolkit support for complex parallel spatial stochastic reaction–diffusion simulation in STEPS
Weiliang Chen, Erik De Schutter
P174 Modeling the generation and propagation of Purkinje cell dendritic spikes caused by parallel fiber synaptic input
Yunliang Zang, Erik De Schutter
P175 Dendritic morphology determines how dendrites are organized into functional subunits
Sungho Hong, Akira Takashima, Erik De Schutter
P176 A model of Ca2+/calmodulin-dependent protein kinase II activity in long term depression at Purkinje cells
Criseida Zamora, Andrew R. Gallimore, Erik De Schutter
P177 Reward-modulated learning of population-encoded vectors for insect-like navigation in embodied agents
Dennis Goldschmidt, Poramate Manoonpong, Sakyasingha Dasgupta
P178 Data-driven neural models part II: connectivity patterns of human seizures
Philippa J. Karoly, Dean R. Freestone, Daniel Soundry, Levin Kuhlmann, Liam Paninski, Mark Cook
P179 Data-driven neural models part I: state and parameter estimation
Dean R. Freestone, Philippa J. Karoly, Daniel Soundry, Levin Kuhlmann, Mark Cook
P180 Spectral and spatial information processing in human auditory streaming
Jaejin Lee, Yonatan I. Fishman, Yale E. Cohen
P181 A tuning curve for the global effects of local perturbations in neural activity: Mapping the systems-level susceptibility of the brain
Leonardo L. Gollo, James A. Roberts, Luca Cocchi
P182 Diverse homeostatic responses to visual deprivation mediated by neural ensembles
Yann Sweeney, Claudia Clopath
P183 Opto-EEG: a novel method for investigating functional connectome in mouse brain based on optogenetics and high density electroencephalography
Soohyun Lee, Woo-Sung Jung, Jee Hyun Choi
P184 Biphasic responses of frontal gamma network to repetitive sleep deprivation during REM sleep
Bowon Kim, Youngsoo Kim, Eunjin Hwang, Jee Hyun Choi
P185 Brain-state correlate and cortical connectivity for frontal gamma oscillations in top-down fashion assessed by auditory steady-state response
Younginha Jung, Eunjin Hwang, Yoon-Kyu Song, Jee Hyun Choi
P186 Neural field model of localized orientation selective activation in V1
James Rankin, Frédéric Chavane
P187 An oscillatory network model of Head direction and Grid cells using locomotor inputs
Karthik Soman, Vignesh Muralidharan, V. Srinivasa Chakravarthy
P188 A computational model of hippocampus inspired by the functional architecture of basal ganglia
Karthik Soman, Vignesh Muralidharan, V. Srinivasa Chakravarthy
P189 A computational architecture to model the microanatomy of the striatum and its functional properties
Sabyasachi Shivkumar, Vignesh Muralidharan, V. Srinivasa Chakravarthy
P190 A scalable cortico-basal ganglia model to understand the neural dynamics of targeted reaching
Vignesh Muralidharan, Alekhya Mandali, B. Pragathi Priyadharsini, Hima Mehta, V. Srinivasa Chakravarthy
P191 Emergence of radial orientation selectivity from synaptic plasticity
Catherine E. Davey, David B. Grayden, Anthony N. Burkitt
P192 How do hidden units shape effective connections between neurons?
Braden A. W. Brinkman, Tyler Kekona, Fred Rieke, Eric Shea-Brown, Michael Buice
P193 Characterization of neural firing in the presence of astrocyte-synapse signaling
Maurizio De Pittà, Hugues Berry, Nicolas Brunel
P194 Metastability of spatiotemporal patterns in a large-scale network model of brain dynamics
James A. Roberts, Leonardo L. Gollo, Michael Breakspear
P195 Comparison of three methods to quantify detection and discrimination capacity estimated from neural population recordings
Gary Marsat, Jordan Drew, Phillip D. Chapman, Kevin C. Daly, Samual P. Bradley
P196 Quantifying the constraints for independent evoked and spontaneous NMDA receptor mediated synaptic transmission at individual synapses
Sat Byul Seo, Jianzhong Su, Ege T. Kavalali, Justin Blackwell
P199 Gamma oscillation via adaptive exponential integrate-and-fire neurons
LieJune Shiau, Laure Buhry, Kanishka Basnayake
P200 Visual face representations during memory retrieval compared to perception
Sue-Hyun Lee, Brandon A. Levy, Chris I. Baker
P201 Top-down modulation of sequential activity within packets modeled using avalanche dynamics
Timothée Leleu, Kazuyuki Aihara
Q28 An auto-encoder network realizes sparse features under the influence of desynchronized vascular dynamics
Ryan T. Philips, Karishma Chhabria, V. Srinivasa Chakravarthy
doi:10.1186/s12868-016-0283-6
PMCID: PMC5001212  PMID: 27534393
3.  Enhanced Sensitivity to Rapid Input Fluctuations by Nonlinear Threshold Dynamics in Neocortical Pyramidal Neurons 
PLoS Computational Biology  2016;12(2):e1004761.
The way in which single neurons transform input into output spike trains has fundamental consequences for network coding. Theories and modeling studies based on standard Integrate-and-Fire models implicitly assume that, in response to increasingly strong inputs, neurons modify their coding strategy by progressively reducing their selective sensitivity to rapid input fluctuations. Combining mathematical modeling with in vitro experiments, we demonstrate that, in L5 pyramidal neurons, the firing threshold dynamics adaptively adjust the effective timescale of somatic integration in order to preserve sensitivity to rapid signals over a broad range of input statistics. To this end, we introduce a new Generalized Integrate-and-Fire model featuring nonlinear firing threshold dynamics and conductance-based adaptation that outperforms state-of-the-art neuron models in predicting the spiking activity of neurons responding to a variety of in vivo-like fluctuating currents. Our model allows for efficient parameter extraction and can be analytically mapped to a Generalized Linear Model in which both the input filter—describing somatic integration—and the spike-history filter—accounting for spike-frequency adaptation—dynamically adapt to the input statistics, as experimentally observed. Overall, our results provide new insights into the computational role of different biophysical processes known to underlie adaptive coding in single neurons and support previous theoretical findings indicating that the nonlinear dynamics of the firing threshold due to Na+-channel inactivation regulate the sensitivity to rapid input fluctuations.
Author Summary
Over the last decades, a variety of simplified spiking models have been shown to achieve a surprisingly high performance in predicting the neuronal responses to in vitro somatic current injections. Because of the complex adaptive behavior featured by cortical neurons, this success is however restricted to limited stimulus ranges: model parameters optimized for a specific input regime are often inappropriate to describe the response to input currents with different statistical properties. In the present study, a new spiking neuron model is introduced that captures single-neuron computation over a wide range of input statistics and explains different aspects of the neuronal dynamics within a single framework. Our results indicate that complex forms of single neuron adaptation are mediated by the nonlinear dynamics of the firing threshold and that the input-output transformation performed by cortical pyramidal neurons can be intuitively understood in terms of an enhanced Generalized Linear Model in which both the input filter and the spike-history filter adapt to the input statistics.
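The core mechanism can be sketched as a leaky integrator whose firing threshold is pushed up whenever the voltage stays high, so a strong mean input raises the threshold while sensitivity to fast fluctuations is retained. This is a toy version of the idea, not the paper's fitted model; the coupling form and all parameter values below are illustrative.

```python
def simulate_gif(I, dt=0.1, tau_m=20.0, R=100.0, E_L=-70.0,
                 theta0=-50.0, tau_theta=5.0, beta=0.3, V_theta=-60.0,
                 v_reset=-65.0):
    """Euler integration of a toy GIF neuron whose firing threshold
    theta nonlinearly tracks the membrane potential (illustrative
    parameters, hypothetical coupling). I is a list of input currents,
    one per time step of size dt (ms)."""
    V, theta = E_L, theta0
    spike_times = []
    for k, i_ext in enumerate(I):
        # leaky integration of the input current
        V += dt / tau_m * (E_L - V + R * i_ext)
        # threshold relaxes to theta0 but is pushed up when V > V_theta
        theta += dt / tau_theta * (theta0 - theta + beta * max(V - V_theta, 0.0))
        if V >= theta:
            spike_times.append(k * dt)
            V = v_reset
            theta += 2.0  # spike-triggered threshold jump
    return spike_times
```

Because the threshold rises with the mean depolarization, a constant strong input produces fewer spikes than a naive fixed-threshold model would, while brief transients riding on top of the mean still reach threshold.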
doi:10.1371/journal.pcbi.1004761
PMCID: PMC4764342  PMID: 26907675
4.  A Model of Synaptic Reconsolidation 
Reconsolidation of memories has mostly been studied at the behavioral and molecular level. Here, we put forward a simple extension of existing computational models of synaptic consolidation to capture hippocampal slice experiments that have been interpreted as reconsolidation at the synaptic level. The model implements reconsolidation through stabilization of consolidated synapses by stabilizing entities combined with an activity-dependent reservoir of stabilizing entities that are immune to protein synthesis inhibition (PSI). We derive a reduced version of our model to explore the conditions under which synaptic reconsolidation does or does not occur, often referred to as the boundary conditions of reconsolidation. We find that our computational model of synaptic reconsolidation displays complex boundary conditions. Our results suggest that a limited resource of hypothetical stabilizing molecules or complexes, which may be implemented by protein phosphorylation or different receptor subtypes, can underlie the phenomenon of synaptic reconsolidation.
doi:10.3389/fnins.2016.00206
PMCID: PMC4870270  PMID: 27242410
reconsolidation; synaptic plasticity; neuron modeling; reduced model; memory dynamics
8.  Automated High-Throughput Characterization of Single Neurons by Means of Simplified Spiking Models 
PLoS Computational Biology  2015;11(6):e1004275.
Single-neuron models are useful not only for studying the emergent properties of neural circuits in large-scale simulations, but also for extracting and summarizing in a principled way the information contained in electrophysiological recordings. Here we demonstrate that, using a convex optimization procedure we previously introduced, a Generalized Integrate-and-Fire model can be accurately fitted with a limited amount of data. The model is capable of predicting both the spiking activity and the subthreshold dynamics of different cell types, and can be used for online characterization of neuronal properties. A protocol is proposed that, combined with emergent technologies for automatic patch-clamp recordings, permits automated, in vitro high-throughput characterization of single neurons.
Author Summary
Large-scale, high-throughput data acquisition is revolutionizing the field of neuroscience. Single-neuron electrophysiology is moving from the situation where a highly skilled experimentalist can patch a few cells per day, to a situation where robots will collect large amounts of data. To take advantage of this quantity of data, this technological advance requires a paradigm shift in the experimental design and analysis. Presently, most single-neuron experimental studies rely on old protocols—such as injections of steps and ramps of current—that rarely inform theoreticians and modelers interested in emergent properties of the brain. Here, we describe an efficient protocol for high-throughput in vitro electrophysiology as well as a set of mathematical tools that neuroscientists can use to directly translate experimental data into realistic spiking neuron models. The efficiency of the proposed method makes it suitable for high-throughput data analysis, allowing for the generation of a standardized database of realistic single-neuron models.
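The convexity of the fitting problem means that, for the passive part of such a model, parameter extraction reduces to ordinary least squares on the measured voltage trace. The sketch below illustrates this for a plain leaky integrator with hypothetical parameter values; the paper's full pipeline (spike-triggered currents, moving threshold) is not reproduced.

```python
import numpy as np

def simulate_trace(I, dt=0.1, tau=20.0, E_L=-70.0, R=100.0):
    """Generate a synthetic subthreshold voltage trace with known
    (illustrative) parameters via Euler integration."""
    V = np.empty(len(I) + 1)
    V[0] = E_L
    for k, i_k in enumerate(I):
        V[k + 1] = V[k] + dt / tau * (E_L - V[k] + R * i_k)
    return V

def fit_leaky_params(V, I, dt):
    """Recover (tau_m, E_L, R) by regressing dV/dt on V and I:
    dV/dt = a*V + b*I + c with a=-1/tau, b=R/tau, c=E_L/tau."""
    dVdt = (V[1:] - V[:-1]) / dt
    X = np.column_stack([V[:-1], I, np.ones(len(I))])
    (a, b, c), *_ = np.linalg.lstsq(X, dVdt, rcond=None)
    tau = -1.0 / a
    return tau, c * tau, b * tau
```

On noiseless synthetic data the regression recovers the generating parameters essentially exactly; on real recordings the same linear structure keeps the fit fast and free of local minima, which is what makes high-throughput characterization feasible.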
doi:10.1371/journal.pcbi.1004275
PMCID: PMC4470831  PMID: 26083597
9.  Diverse synaptic plasticity mechanisms orchestrated to form and retrieve memories in spiking neural networks 
Nature Communications  2015;6:6922.
Synaptic plasticity, the putative basis of learning and memory formation, manifests in various forms and across different timescales. Here we show that the interaction of Hebbian homosynaptic plasticity with rapid non-Hebbian heterosynaptic plasticity is, when complemented with slower homeostatic changes and consolidation, sufficient for assembly formation and memory recall in a spiking recurrent network model of excitatory and inhibitory neurons. In the model, assemblies were formed during repeated sensory stimulation and characterized by strong recurrent excitatory connections. Even days after formation, and despite ongoing network activity and synaptic plasticity, memories could be recalled through selective delay activity following the brief stimulation of a subset of assembly neurons. Blocking any component of plasticity prevented stable functioning as a memory network. Our modelling results suggest that the diversity of plasticity phenomena in the brain is orchestrated towards achieving common functional goals.
The brain exhibits a diversity of plasticity mechanisms across different timescales that constitute the putative basis for learning and memory. Here, the authors demonstrate how these different plasticity mechanisms are orchestrated to support the formation of robust and stable neural cell assemblies.
doi:10.1038/ncomms7922
PMCID: PMC4411307  PMID: 25897632
10.  Neuromodulated Spike-Timing-Dependent Plasticity, and Theory of Three-Factor Learning Rules 
Classical Hebbian learning puts the emphasis on joint pre- and postsynaptic activity, but neglects the potential role of neuromodulators. Since neuromodulators convey information about novelty or reward, the influence of neuromodulators on synaptic plasticity is useful not just for action learning in classical conditioning, but also to decide “when” to create new memories in response to a flow of sensory stimuli. In this review, we focus on timing requirements for pre- and postsynaptic activity in conjunction with one or several phasic neuromodulatory signals. While the emphasis of the text is on conceptual models and mathematical theories, we also discuss some experimental evidence for neuromodulation of Spike-Timing-Dependent Plasticity. We highlight the importance of synaptic mechanisms in bridging the temporal gap between sensory stimulation and neuromodulatory signals, and develop a framework for a class of neo-Hebbian three-factor learning rules that depend on presynaptic activity, postsynaptic variables as well as the influence of neuromodulators.
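The skeleton of such a three-factor rule can be sketched in discrete time: a Hebbian pre/post coincidence writes to a decaying eligibility trace, and a later phasic neuromodulatory pulse converts whatever is left of the trace into an actual weight change, bridging the temporal gap between stimulation and the third factor. The pulse-shaped modulator and all constants below are illustrative, not taken from any specific rule in the review.

```python
import math

def three_factor_update(pre_spikes, post_spikes, reward_times,
                        T=1000.0, dt=1.0, tau_e=200.0, eta=0.1):
    """Return the net weight change of one synapse. Coincident pre/post
    spikes (same time bin) increment an eligibility trace e(t) that
    decays with tau_e (ms); a unit neuromodulator pulse at each time in
    reward_times gates the readout of the trace into the weight."""
    pre, post, rewards = set(pre_spikes), set(post_spikes), set(reward_times)
    e, w, t = 0.0, 0.0, 0.0
    while t < T:
        e *= math.exp(-dt / tau_e)   # eligibility trace decay
        if t in pre and t in post:   # Hebbian coincidence (factors 1 and 2)
            e += 1.0
        if t in rewards:             # neuromodulator pulse (factor 3)
            w += eta * e
        t += dt
    return w
```

The rule captures the qualitative signature discussed in the text: a modulator pulse arriving shortly after the coincidence produces a larger change than a late one, and no change occurs without the Hebbian coincidence, however much modulator is delivered.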
doi:10.3389/fncir.2015.00085
PMCID: PMC4717313  PMID: 26834568
STDP; plasticity; neuromodulation; reward learning; novelty; spiking neuron networks; synaptic plasticity (LTP/LTD)
12.  The role of interconnected hub neurons in cortical dynamics 
BMC Neuroscience  2014;15(Suppl 1):P158.
doi:10.1186/1471-2202-15-S1-P158
PMCID: PMC4125080
14.  Connection-type-specific biases make uniform random network models consistent with cortical recordings 
Journal of Neurophysiology  2014;112(8):1801-1814.
Uniform random sparse network architectures are ubiquitous in computational neuroscience, but the implicit hypothesis that they are a good representation of real neuronal networks has been met with skepticism. Here we used two experimental data sets, a study of triplet connectivity statistics and a data set measuring neuronal responses to channelrhodopsin stimuli, to evaluate the fidelity of thousands of model networks. Network architectures comprised three neuron types (excitatory, fast spiking, and nonfast spiking inhibitory) and were created from a set of rules that govern the statistics of the resulting connection types. In a high-dimensional parameter scan, we varied the degree distributions (i.e., how many cells each neuron connects with) and the synaptic weight correlations of synapses from or onto the same neuron. These variations converted initially uniform random and homogeneously connected networks, in which every neuron sent and received equal numbers of synapses with equal synaptic strength distributions, into highly heterogeneous networks in which both the number of synapses per neuron and the average strength of synapses from or onto a neuron were variable. By evaluating the impact of each variable on the network structure and dynamics, and their similarity to the experimental data, we could falsify the uniform random sparse connectivity hypothesis for 7 of 36 connectivity parameters, but we also confirmed the hypothesis in 8 cases. Twenty-one parameters had no substantial impact on the results of the test protocols we used.
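As a toy illustration of one axis of such a scan (using an invented parameterization, not the paper's rule set), one can broaden the in-degree distribution around a fixed mean and compare the resulting heterogeneity with the homogeneous uniform-random baseline.

```python
import random
import statistics

def in_degree_sample(n_neurons=1000, mean_syn=100, spread=0.0, seed=42):
    """Draw how many synapses each neuron receives. spread=0 reproduces
    the homogeneous case (every neuron receives mean_syn synapses);
    spread>0 broadens the in-degree distribution around the same mean.
    The uniform-interval parameterization is purely illustrative."""
    rng = random.Random(seed)
    lo = round(mean_syn * (1 - spread))
    hi = round(mean_syn * (1 + spread))
    return [rng.randint(lo, hi) for _ in range(n_neurons)]

degrees_uniform = in_degree_sample(spread=0.0)   # homogeneous baseline
degrees_broad = in_degree_sample(spread=0.5)     # heterogeneous variant
```

Sweeping `spread` (and analogous knobs for out-degrees and weight correlations) yields the family of networks whose structure and dynamics can then be tested against the experimental statistics.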
doi:10.1152/jn.00629.2013
PMCID: PMC4200009  PMID: 24944218
neuronal network models; random connectivity; layer 2/3 sensory cortex
15.  Stochastic variational learning in recurrent spiking networks 
The ability to learn and perform statistical inference with biologically plausible recurrent networks of spiking neurons is an important step toward understanding perception and reasoning. Here we derive and investigate a new learning rule for recurrent spiking networks with hidden neurons, combining principles from variational learning and reinforcement learning. Our network defines a generative model over spike train histories, and the derived learning rule has the form of a local Spike Timing Dependent Plasticity rule modulated by global factors (neuromodulators) conveying information about “novelty” on statistically rigorous grounds. Simulations show that our model is able to learn both stationary and non-stationary patterns of spike trains. We also propose one experiment that could potentially be performed with animals in order to test the dynamics of the predicted novelty signal.
doi:10.3389/fncom.2014.00038
PMCID: PMC3983494  PMID: 24772078
neural networks; variational learning; spiking neurons; synapses; action potentials
16.  Spike-timing prediction in cortical neurons with active dendrites 
A complete single-neuron model must correctly reproduce the firing of spikes and bursts. We present a study of a simplified model of deep pyramidal cells of the cortex with active dendrites. We hypothesized that we can model the soma and its apical dendrite with only two compartments, without significant loss in the accuracy of spike-timing predictions. The model is based on experimentally measurable impulse-response functions, which transfer the effect of current injected in one compartment to current reaching the other. Each compartment was modeled with a pair of non-linear differential equations and a small number of parameters that approximate the Hodgkin-and-Huxley equations. The predictive power of this model was tested on electrophysiological experiments where noisy current was injected in both the soma and the apical dendrite simultaneously. We conclude that a simple two-compartment model can predict spike times of pyramidal cells stimulated in the soma and dendrites simultaneously. Our results indicate that regenerative activity in the apical dendrite is required to properly account for the dynamics of layer 5 pyramidal cells under in-vivo-like conditions.
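A passive two-compartment skeleton of such a model can be written as two coupled leaky integrators, one for the soma and one for the apical dendrite; the sketch below omits the active dendritic nonlinearity and the measured impulse-response coupling, and all parameters are illustrative.

```python
def simulate_two_comp(I_s, I_d, T=100.0, dt=0.1, tau=20.0,
                      E_L=-70.0, R=60.0, g_c=0.05):
    """Euler integration of two passive compartments (soma V_s, dendrite
    V_d) with constant injected currents I_s, I_d and a symmetric
    coupling term g_c*(V_other - V_self). Returns final voltages (mV)
    after T ms. Parameters are illustrative, not fitted."""
    V_s = V_d = E_L
    for _ in range(int(T / dt)):
        dV_s = ((E_L - V_s) + R * I_s) / tau + g_c * (V_d - V_s)
        dV_d = ((E_L - V_d) + R * I_d) / tau + g_c * (V_s - V_d)
        V_s += dt * dV_s
        V_d += dt * dV_d
    return V_s, V_d
```

Injecting current into the dendrite alone depolarizes the soma through the coupling term, with the dendrite staying more depolarized than the soma, which is the passive behavior on top of which dendritic spikes would act in the full model.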
doi:10.3389/fncom.2014.00090
PMCID: PMC4131408  PMID: 25165443
dendrites; neuron models; cortical neurons; spike train analysis; models; theoretical
17.  Limits to high-speed simulations of spiking neural networks using general-purpose computers 
To understand how the central nervous system performs computations using recurrent neuronal circuitry, simulations have become an indispensable tool for theoretical neuroscience. To study neuronal circuits and their ability to self-organize, increasing attention has been directed toward synaptic plasticity. In particular, spike-timing-dependent plasticity (STDP) creates specific demands for simulations of spiking neural networks. On the one hand, a high temporal resolution is required to capture the millisecond timescale of typical STDP windows. On the other hand, network simulations have to evolve over hours up to days, to capture the timescale of long-term plasticity. To do this efficiently, fast simulation speed is the crucial ingredient rather than large neuron numbers. Using different medium-sized network models consisting of several thousands of neurons and off-the-shelf hardware, we compare the simulation speed of the simulators Brian, NEST, and Neuron, as well as our own simulator Auryn. Our results show that real-time simulations of different plastic network models are possible in parallel simulations in which numerical precision is not a primary concern. Even so, the speed-up margin of parallelism is limited and boosting simulation speeds beyond one tenth of real-time is difficult. By profiling simulation code, we show that the run times of typical plastic network simulations encounter a hard boundary. This limit is partly due to latencies in the inter-process communications and thus cannot be overcome by increased parallelism. Overall, these results show that to study plasticity in medium-sized spiking neural networks, adequate simulation tools are readily available which run efficiently on small clusters. However, to run simulations substantially faster than real-time, special hardware is a prerequisite.
doi:10.3389/fninf.2014.00076
PMCID: PMC4160969  PMID: 25309418
spiking neural networks; network simulator; synaptic plasticity; STDP; parallel computing; computational neuroscience
18.  Synaptic Plasticity in Neural Networks Needs Homeostasis with a Fast Rate Detector 
PLoS Computational Biology  2013;9(11):e1003330.
Hebbian changes of excitatory synapses are driven by and further enhance correlations between pre- and postsynaptic activities. Hence, Hebbian plasticity forms a positive feedback loop that can lead to instability in simulated neural networks. To keep activity at healthy, low levels, plasticity must therefore incorporate homeostatic control mechanisms. We find in numerical simulations of recurrent networks with a realistic triplet-based spike-timing-dependent plasticity rule (triplet STDP) that homeostasis has to detect rate changes on a timescale of seconds to minutes to keep the activity stable. We confirm this result in a generic mean-field formulation of network activity and homeostatic plasticity. Our results strongly suggest the existence of a homeostatic regulatory mechanism that reacts to firing rate changes on the order of seconds to minutes.
Author Summary
Learning and memory in the brain are thought to be mediated through Hebbian plasticity. When a group of neurons is repetitively active together, their connections get strengthened. This can cause co-activation even in the absence of the stimulus that triggered the change. To avoid runaway behavior, it is important to prevent neurons from forming excessively strong connections. This is achieved by regulatory homeostatic mechanisms that constrain the overall activity. Here we study the stability of background activity in a recurrent network model with a plausible Hebbian learning rule and homeostasis. We find that the activity in our model is unstable unless homeostasis reacts to rate changes on a timescale of minutes or faster. Since this timescale is incompatible with most known forms of homeostasis, this implies the existence of a previously unknown, rapid homeostatic regulatory mechanism capable of either gating the rate of plasticity or otherwise adjusting synaptic efficacies on a short timescale.
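The key quantity here is how quickly a rate detector with a given time constant reports a change in firing rate. A leaky estimator responding to a deterministic step change illustrates why a slow detector necessarily reacts late; the numbers below are illustrative, not the paper's fitted values.

```python
import math

def detection_time(tau, r0=5.0, r1=20.0, threshold=0.9, dt=0.01, T=600.0):
    """Time (s) for a leaky rate estimator with time constant tau (s) to
    report a fraction `threshold` of a step change r0 -> r1 Hz applied
    at t=0. Returns None if the step is not detected within T seconds."""
    est = r0
    target = r0 + threshold * (r1 - r0)
    t = 0.0
    while t < T:
        est += dt / tau * (r1 - est)  # first-order low-pass of the rate
        if est >= target:
            return t
        t += dt
    return None
```

The 90% detection time is approximately tau * ln(10), so a detector with tau of a few seconds reacts within seconds, while one with tau of many minutes reacts only after the Hebbian positive feedback has had minutes to act, which is the instability mechanism the simulations expose.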
doi:10.1371/journal.pcbi.1003330
PMCID: PMC3828150  PMID: 24244138
21.  Reinforcement Learning Using a Continuous Time Actor-Critic Framework with Spiking Neurons 
PLoS Computational Biology  2013;9(4):e1003024.
Animals repeat rewarded behaviors, but the physiological basis of reward-based learning has only been partially elucidated. On one hand, experimental evidence shows that the neuromodulator dopamine carries information about rewards and affects synaptic plasticity. On the other hand, the theory of reinforcement learning provides a framework for reward-based learning. Recent models of reward-modulated spike-timing-dependent plasticity have taken first steps toward bridging the gap between the two approaches, but faced two problems. First, reinforcement learning is typically formulated in a discrete framework, ill-adapted to the description of natural situations. Second, biologically plausible models of reward-modulated spike-timing-dependent plasticity require precise calculation of the reward prediction error, yet it remains to be shown how this can be computed by neurons. Here we propose a solution to these problems by extending the continuous temporal difference (TD) learning of Doya (2000) to the case of spiking neurons in an actor-critic network operating in continuous time, and with continuous state and action representations. In our model, the critic learns to predict expected future rewards in real time. Its activity, together with actual rewards, conditions the delivery of a neuromodulatory TD signal to itself and to the actor, which is responsible for action choice. In simulations, we show that such an architecture can solve a Morris water-maze-like navigation task, in a number of trials consistent with reported animal performance. We also use our model to solve the acrobot and the cartpole problems, two complex motor control tasks. Our model provides a plausible way of computing reward prediction error in the brain. Moreover, the analytically derived learning rule is consistent with experimental evidence for dopamine-modulated spike-timing-dependent plasticity.
Author Summary
As every dog owner knows, animals repeat behaviors that earn them rewards. But what is the brain machinery that underlies this reward-based learning? Experimental research points to plasticity of the synaptic connections between neurons, with an important role played by the neuromodulator dopamine, but the exact way synaptic activity and neuromodulation interact during learning is not precisely understood. Here we propose a model explaining how reward signals might interplay with synaptic plasticity, and use the model to solve a simulated maze navigation task. Our model extends an idea from the theory of reinforcement learning: one group of neurons forms an “actor,” responsible for choosing the direction of motion of the animal. Another group of neurons, the “critic,” whose role is to predict the rewards the actor will gain, uses the mismatch between actual and expected reward to teach the synapses feeding both groups. Our learning agent learns to reliably navigate its maze to find the reward. Remarkably, the synaptic learning rule that we derive from theoretical considerations is similar to previous rules based on experimental evidence.
doi:10.1371/journal.pcbi.1003024
PMCID: PMC3623741  PMID: 23592970
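The critic's teaching signal in the continuous-time TD formulation of Doya (2000), which the abstract builds on, is the error delta(t) = r(t) - V(t)/tau + dV/dt: reward received, minus the discounted value, plus the change in value. A minimal discretization (variable names and the toy reward stream are ours):

```python
import numpy as np

def td_error(rewards, values, dt, tau_r):
    """Continuous-time TD error, delta = r - V/tau + dV/dt, discretized
    with a forward difference. A sketch; the variable names are ours."""
    dV = np.diff(values) / dt
    return rewards[:-1] - values[:-1] / tau_r + dV

dt, tau_r = 0.01, 2.0
r = np.ones(100)                 # constant reward stream
V_correct = np.full(100, tau_r)  # self-consistent prediction: V = tau * r
V_low = np.ones(100)             # prediction that underestimates reward

delta_ok = td_error(r, V_correct, dt, tau_r)  # vanishes: nothing to learn
delta_low = td_error(r, V_low, dt, tau_r)     # positive: reward "surprise"
```

When the critic's prediction is self-consistent the error is zero everywhere; an underestimating critic produces a positive error that would strengthen the synapses feeding both actor and critic.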
22.  Changing the responses of cortical neurons from sub- to suprathreshold using single spikes in vivo 
eLife  2013;2:e00012.
Action potential (AP) patterns of sensory cortex neurons encode a variety of stimulus features, but how can a neuron change the feature to which it responds? Here, we show that in vivo a spike-timing-dependent plasticity (STDP) protocol—consisting of pairing a postsynaptic AP with visually driven presynaptic inputs—modifies a neuron's AP response in a bidirectional way that depends on the relative AP timing during pairing. Whereas postsynaptic APs repeatedly following presynaptic activation can convert subthreshold into suprathreshold responses, APs repeatedly preceding presynaptic activation reduce AP responses to visual stimulation. These changes were paralleled by restructuring of the neuron's responses to surround stimulus locations and of the membrane-potential time course. Computational simulations could reproduce the observed subthreshold voltage changes only when presynaptic temporal jitter was included. Together this shows that STDP rules can modify output patterns of sensory neurons and that the timing of single APs plays a crucial role in sensory coding and plasticity.
DOI: http://dx.doi.org/10.7554/eLife.00012.001
eLife digest
Nerve cells, called neurons, are one of the core components of the brain and form complex networks by connecting to other neurons via long, thin ‘wire-like’ processes called axons. Axons can extend across the brain, enabling neurons to form connections—or synapses—with thousands of others. It is through these complex networks that incoming information from sensory organs, such as the eye, is propagated through the brain and encoded.
The basic unit of communication between neurons is the action potential, often called a ‘spike’, which propagates along the network of axons and, through a chemical process at synapses, communicates with the postsynaptic neurons that the axon is connected to. These action potentials excite the neuron that they arrive at, and this excitatory process can generate a new action potential that then propagates along the axon to excite additional target neurons. In the visual areas of the cortex, neurons respond with action potentials when they ‘recognize’ a particular feature in a scene—a process called tuning. How a neuron becomes tuned to certain features in the world and not to others is unclear, as are the rules that enable a neuron to change what it is tuned to. What is clear, however, is that to understand this process is to understand the basis of sensory perception.
Memory formation and storage are thought to occur at synapses. The efficiency of signal transmission between neurons can increase or decrease over time, a process often referred to as synaptic plasticity. But for these synaptic changes to be transmitted to target neurons, the changes must alter the number of action potentials. Although it has been shown in vitro that the efficiency of synaptic transmission—that is, the strength of the synapse—can be altered by changing the order in which the pre- and postsynaptic cells are activated (referred to as ‘spike-timing-dependent plasticity’), this has never been shown to have an effect on the number of action potentials generated in a single neuron in vivo. It is therefore unknown whether this process is functionally relevant.
Now Pawlak et al. report that spike-timing-dependent plasticity in the visual cortex of anaesthetized rats can change the spiking of neurons in the visual cortex. They used a visual stimulus (a bar flashed up for half a second) to activate a presynaptic cell, and triggered a single action potential in the postsynaptic cell a very short time later. By repeatedly activating the cells in this way, they increased the strength of the synaptic connection between the two neurons. After a small number of these pairing activations, presenting the visual stimulus alone to the presynaptic cell was enough to trigger an action potential (a suprathreshold response) in the postsynaptic neuron—even though this was not the case prior to the pairing.
This study shows that timing rules known to change the strength of synaptic connections—and proposed to underlie learning and memory—have functional relevance in vivo, and that the timing of single action potentials can change the functional status of a cortical neuron.
DOI: http://dx.doi.org/10.7554/eLife.00012.002
doi:10.7554/eLife.00012
PMCID: PMC3552422  PMID: 23359858
synaptic plasticity; STDP; visual cortex; circuits; in vivo; spiking patterns; rat
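The bidirectional timing dependence described in this entry is commonly modeled with a pair-based STDP window: potentiation when the postsynaptic spike follows the presynaptic one, depression when it precedes it, both decaying exponentially with the interval. The amplitudes and time constants below are illustrative textbook values, not fitted to this study:

```python
import math

def stdp_dw(dt_ms, a_plus=0.05, a_minus=0.025, tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP window; dt_ms = t_post - t_pre (in milliseconds).
    Post following pre (dt_ms > 0) potentiates; post preceding pre
    (dt_ms < 0) depresses. Parameters are illustrative only."""
    if dt_ms > 0:
        return a_plus * math.exp(-dt_ms / tau_plus)   # potentiation branch
    if dt_ms < 0:
        return -a_minus * math.exp(dt_ms / tau_minus)  # depression branch
    return 0.0
```

Repeated pre-then-post pairings at short intervals thus accumulate positive weight changes, consistent with the sub- to suprathreshold conversion reported in the abstract.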
23.  The Silent Period of Evidence Integration in Fast Decision Making 
PLoS ONE  2013;8(1):e46525.
In a typical experiment on decision making, one out of two possible stimuli is displayed and observers decide which one was presented. Recently, Stanford and colleagues (2010) introduced a new variant of this classical one-stimulus presentation paradigm to investigate the speed of decision making. They found evidence for “perceptual decision making in less than 30 ms”. Here, we extended this one-stimulus compelled-response paradigm to a two-stimulus compelled-response paradigm in which a vernier was followed immediately by a second vernier with opposite offset direction. The two verniers and their offsets fuse. Only one vernier is perceived. When observers are asked to indicate the offset direction of the fused vernier, the offset of the second vernier dominates perception. Even for long vernier durations, the second vernier dominates decisions indicating that decision making can take substantial time. In accordance with previous studies, we suggest that our results are best explained with a two-stage model of decision making where a leaky evidence integration stage precedes a race-to-threshold process.
doi:10.1371/journal.pone.0046525
PMCID: PMC3549915  PMID: 23349660
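The two-stage account favored by the abstract (a leaky evidence-integration stage preceding a race-to-threshold process) can be sketched as follows. The model form and all parameters are our illustration, not the authors' fitted model; the point is that leaky integration lets a later, opposite-signed stimulus dominate the race:

```python
def decide(evidence, leak=0.1, threshold=30.0, dt=1.0):
    """Toy two-stage decision model. Stage 1: a leaky integrator x smooths
    the signed evidence stream. Stage 2: two accumulators race to threshold
    on the positive and negative parts of x. Illustrative parameters."""
    x = 0.0
    a = b = 0.0
    for t, e in enumerate(evidence):
        x += dt * (e - leak * x)   # stage 1: leaky evidence integration
        a += dt * max(x, 0.0)      # stage 2: accumulator for the first offset
        b += dt * max(-x, 0.0)     # stage 2: accumulator for the opposite offset
        if a >= threshold:
            return "first", t
        if b >= threshold:
            return "second", t
    return "none", len(evidence)

# A brief first vernier followed by a longer opposite-offset second vernier:
# the second stimulus dominates the decision, as with the fused verniers.
choice_fused, _ = decide([+1.0] * 5 + [-1.0] * 25)
choice_single, _ = decide([+1.0] * 40)  # a lone stimulus wins the race itself
```

Because the first stimulus's trace leaks away while the second keeps driving the integrator, the second accumulator reaches threshold first, illustrating how decisions can take substantial time even for brief stimuli.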
24.  Reward-based learning under hardware constraints—using a RISC processor embedded in a neuromorphic substrate 
In this study, we propose and analyze in simulations a new, highly flexible method of implementing synaptic plasticity in a wafer-scale, accelerated neuromorphic hardware system. The study focuses on globally modulated STDP, as a special use-case of this method. Flexibility is achieved by embedding a general-purpose processor dedicated to plasticity into the wafer. To evaluate the suitability of the proposed system, we use a reward modulated STDP rule in a spike train learning task. A single layer of neurons is trained to fire at specific points in time with only the reward as feedback. This model is simulated to measure its performance, i.e., the increase in received reward after learning. Using this performance as baseline, we then simulate the model with various constraints imposed by the proposed implementation and compare the performance. The simulated constraints include discretized synaptic weights, a restricted interface between analog synapses and embedded processor, and mismatch of analog circuits. We find that probabilistic updates can increase the performance of low-resolution weights, a simple interface between analog synapses and processor is sufficient for learning, and performance is insensitive to mismatch. Further, we consider communication latency between wafer and the conventional control computer system that is simulating the environment. This latency increases the delay with which the reward is sent to the embedded processor. Because of the time-continuous operation of the analog synapses, this delay can cause the updates to deviate from those in the non-delayed situation. We find that for highly accelerated systems latency has to be kept to a minimum. This study demonstrates the suitability of the proposed implementation to emulate the selected reward modulated STDP learning rule. It is therefore an ideal candidate for implementation in an upgraded version of the wafer-scale system developed within the BrainScaleS project.
doi:10.3389/fnins.2013.00160
PMCID: PMC3778319  PMID: 24065877
neuromorphic hardware; wafer-scale integration; large-scale spiking neural networks; spike-timing dependent plasticity; reinforcement learning; hardware constraints analysis
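The finding that probabilistic updates can rescue low-resolution weights rests on a general technique, often called stochastic rounding: an analog update smaller than one grid step of the discretized weight is applied with probability proportional to its size, so that updates are unbiased on average. A sketch of that technique (grid size, function name, and parameters are our illustration, not the hardware's actual update rule):

```python
import random

def quantized_update(w, dw, step=1.0 / 15, w_max=1.0, rng=random):
    """Apply an analog update dw to a weight stored on a coarse grid
    (here 16 levels in [0, 1], i.e. 4-bit resolution). The sub-step
    remainder is applied probabilistically, so the expected update
    equals dw even when |dw| is far below one grid step."""
    full, frac = divmod(dw / step, 1.0)          # whole steps + remainder
    n = int(full) + (1 if rng.random() < frac else 0)
    return min(max(w + n * step, 0.0), w_max)    # clip to the weight range

rng = random.Random(42)
w_new = quantized_update(7 / 15, 0.004, rng=rng)  # lands on 7/15 or 8/15
```

A deterministic round-to-nearest rule would silently discard every update smaller than half a grid step; the probabilistic version preserves their cumulative effect, which is why it helps at low weight resolution.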
25.  Inference of neuronal network spike dynamics and topology from calcium imaging data 
Two-photon calcium imaging enables functional analysis of neuronal circuits by inferring action potential (AP) occurrence (“spike trains”) from cellular fluorescence signals. It remains unclear how experimental parameters such as signal-to-noise ratio (SNR) and acquisition rate affect spike inference and whether additional information about network structure can be extracted. Here we present a simulation framework for quantitatively assessing how well spike dynamics and network topology can be inferred from noisy calcium imaging data. For simulated AP-evoked calcium transients in neocortical pyramidal cells, we analyzed the quality of spike inference as a function of SNR and data acquisition rate using a recently introduced peeling algorithm. Given experimentally attainable values of SNR and acquisition rate, neural spike trains could be reconstructed accurately and with up to millisecond precision. We then applied statistical neuronal network models to explore how remaining uncertainties in spike inference affect estimates of network connectivity and topological features of network organization. We define the experimental conditions suitable for inferring whether the network has a scale-free structure and determine how well hub neurons can be identified. Our findings provide a benchmark for future calcium imaging studies that aim to reliably infer neuronal network properties.
doi:10.3389/fncir.2013.00201
PMCID: PMC3871709  PMID: 24399936
calcium; action potential; reconstruction; connectivity; scale-free; hub neurons
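The core idea behind a peeling-style spike-inference scheme, as referenced in this entry, is greedy template subtraction: find where the fluorescence trace best matches a single-AP calcium transient, record a spike, subtract the template, and repeat on the residual. The code below is a much-simplified sketch of that idea on noise-free synthetic data, not the published algorithm:

```python
import numpy as np

def peel(trace, template, threshold=0.5):
    """Greedy template 'peeling': repeatedly locate the onset where the
    residual best matches the single-AP transient template, record a spike
    there, and subtract the template. Stops when the best amplitude
    estimate falls below `threshold`. Illustrative sketch only."""
    residual = np.asarray(trace, dtype=float).copy()
    norm = float(np.dot(template, template))
    spikes = []
    while True:
        # amplitude estimate of a template placed at each possible onset
        scores = np.correlate(residual, template, mode="valid") / norm
        i = int(np.argmax(scores))
        if scores[i] < threshold:
            break
        spikes.append(i)
        residual[i:i + len(template)] -= template
    return sorted(spikes), residual

# Synthetic trace: two unit-amplitude transients at frames 10 and 40.
template = np.exp(-np.arange(20) / 5.0)
trace = np.zeros(100)
for onset in (10, 40):
    trace[onset:onset + 20] += template
spikes, residual = peel(trace, template)
```

On this clean trace the sketch recovers both onsets exactly and leaves a zero residual; the study's contribution is quantifying how such inference degrades with realistic SNR and acquisition rate.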