
Current Bioinformatics

ISSN (Print): 1574-8936
ISSN (Online): 2212-392X

Review Article

Supervised Learning in Spiking Neural Networks with Synaptic Delay Plasticity: An Overview

Author(s): Yawen Lan and Qiang Li*

Volume 15, Issue 8, 2020

Pages: 854-865 (12 pages)

DOI: 10.2174/1574893615999200425230713

Abstract

Throughout the central nervous system (CNS), information is communicated between neurons mainly by action potentials (or spikes). Although spike-timing-based neuronal codes offer significant computational advantages over rate-based encoding schemes, the exact spike-timing-based learning mechanism in the brain remains an open question. To close this gap, many weight-based supervised learning algorithms have been proposed for spiking neural networks. However, adjusting synaptic weights alone is insufficient: biological evidence suggests that synaptic delay plasticity also plays an important role in the learning process of biological neural networks. Recently, many learning algorithms have been proposed that adapt both synaptic weights and synaptic delays. The goal of this paper is to give an overview of existing synaptic delay-based learning algorithms for spiking neural networks. We describe the typical learning algorithms and report their experimental results. Finally, we discuss the properties and limitations of each algorithm and compare them.

Keywords: Action potentials, spike-timing, spiking neural networks, biological neural networks, supervised learning, synaptic delay plasticity.
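
To make the idea of joint weight and delay plasticity concrete, the sketch below gives a toy single-neuron example in Python. It is our own illustration, not one of the algorithms reviewed in the article: the PSP kernel, learning rates, target spike time, and the heuristic update rule are all assumptions made for demonstration. Each input synapse delivers one spike, every arrival is shifted by a trainable delay, and an error-driven rule nudges both weights and delays so the neuron's first output spike moves toward a desired firing time.

```python
# Minimal sketch (our own illustration, not an algorithm from the reviewed papers):
# a single spiking neuron whose synaptic weights AND delays are adapted so that
# its first output spike moves toward a desired target time.
import numpy as np

def psp(t, tau=5.0):
    # Alpha-shaped post-synaptic potential kernel; zero before the spike arrives.
    return np.where(t > 0, (t / tau) * np.exp(1.0 - t / tau), 0.0)

def first_spike_time(weights, delays, input_times, threshold=1.0, t_max=60.0, dt=0.1):
    # Evaluate the membrane potential on a time grid; return the first threshold crossing.
    for t in np.arange(0.0, t_max, dt):
        v = np.sum(weights * psp(t - (input_times + delays)))
        if v >= threshold:
            return t
    return None  # neuron stayed silent

rng = np.random.default_rng(0)
n_inputs = 10
input_times = rng.uniform(0.0, 20.0, n_inputs)  # one presynaptic spike per synapse (ms)
weights = rng.uniform(0.2, 0.5, n_inputs)
delays = rng.uniform(0.5, 5.0, n_inputs)        # trainable synaptic delays (ms)
t_desired = 25.0                                # target output spike time (ms, assumed)

eta_w, eta_d = 0.02, 0.05  # learning rates for weights and delays (assumed values)
for epoch in range(300):
    t_out = first_spike_time(weights, delays, input_times)
    if t_out is None:                 # silent neuron: strengthen all synapses slightly
        weights += eta_w
        continue
    err = t_desired - t_out           # > 0: fired too early, < 0: fired too late
    # Heuristic joint update: too early -> weaken weights and lengthen delays,
    # too late -> strengthen weights and shorten delays.
    weights -= eta_w * np.sign(err) * psp(t_out - (input_times + delays))
    delays += eta_d * np.sign(err)
    delays = np.clip(delays, 0.1, 10.0)

print("output spike time after training:", first_spike_time(weights, delays, input_times))
```

Real delay-learning rules such as DL-ReSuMe or gradient-based multilayer methods are considerably more involved; the point of this sketch is only that a single error signal can drive two distinct plasticity mechanisms, weight change and delay change, at the same synapse.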

