
arXiv:1001.4708v1 [physics.bio-ph] 26 Jan 2010

EIGENVALUE DISTRIBUTIONS FOR A CLASS OF COVARIANCE MATRICES WITH APPLICATIONS TO BIENENSTOCK-COOPER-MUNRO NEURONS UNDER NOISY CONDITIONS

Armando Bazzani and Gastone C. Castellani*
Department of Physics and National Institute of Nuclear Physics, University of Bologna,
and Institute for Brain and Neural Systems, Brown University

Leon N. Cooper
Department of Physics, Brown University, Providence, RI 02912,
and Institute for Brain and Neural Systems, Brown University

(Dated: January 26, 2010)

*Electronic address: [email protected]

We analyze the effects of noise correlations in the input to, or among, BCM neurons using the Wigner semicircular law to construct random, positive-definite symmetric correlation matrices and compute their eigenvalue distributions. In the finite-dimensional case, we compare our analytic results with numerical simulations and show the effects of correlations on the lifetimes of synaptic strengths in various visual environments. These correlations can be due either to correlations in the noise from the input LGN neurons, or to correlations in the variability of lateral connections in a network of neurons. In particular, we find that for fixed dimensionality, a large noise variance can give rise to long lifetimes of synaptic strengths. This may be of physiological significance.

PACS numbers: 87.18.Tt, 05.40.-a, 87.18.Sn

I. INTRODUCTION

Receptive field changes in visual cortex, observed in the early period of an animal's postnatal development, are thought to depend on synaptic plasticity [1, 2]; the detailed dynamics of such receptive field modifiability has been used to infer the precise form of this plasticity [3, 4]. In a classic paradigm, called monocular deprivation, vision through one eye is deprived in early development. In this case cells in visual cortex tend to disconnect from the deprived eye [5]. Experiments have shown that if monocular deprivation is produced by monocular lid closure, then a rapid loss of response to the deprived eye occurs, while if the retina is inactivated by a tetrodotoxin (TTX) injection, significantly less loss is observed [6, 7]. These results are consistent with the form of synaptic plasticity proposed by Bienenstock, Cooper and Munro (BCM) [8]. The BCM theory was originally proposed to describe plasticity processes in visual cortex as observed by Hubel and Wiesel [5]. One of the main postulates of this theory is the existence of a critical threshold (the sliding threshold) that depends on past neuronal history in a non-linear way. This nonlinearity is necessary to ensure stability of the synaptic weights. The main predictions of the BCM theory have been confirmed in hippocampal slices and visual cortex, and recently in in vivo inhibitory avoidance learning experiments [9]. The extension of these results to other brain areas, and ultimately to the whole brain, is not confirmed but is under active study. One motivation for this research is that a proposed biophysical mechanism for the BCM rule is based on calcium influx through NMDA receptors and the phosphorylation state of AMPA receptors, and that both receptors are widely distributed within the brain [10]. This biophysical mechanism is at least partly shared by the plasticity rule STDP (spike-timing-dependent plasticity) [11], which describes the synaptic functional change on the basis of action potential timing in connected neurons. The main difference between STDP and BCM is that BCM is an average-time rule and thus does not include microscopic temporal structures (i.e. it works with rates, not spikes) [12, 13]. A further analysis that considers the relation between the BCM and STDP rules will be presented in a future work.

The standard theoretical analysis of monocular deprivation experiments according to BCM (see for example [4]) relies on the seemingly reasonable assumption that in the inactivated situation, activity in the lateral geniculate nucleus (LGN), which is the cortical input, is reduced compared to the lid closure case; as a consequence there is a less rapid loss of response to the deprived eye.
This assumption has been questioned by new experimental results [14]. In this recent study the activity of neurons in LGN was measured during normal vision, when the eyelid of the experimental animals was sutured, and when TTX was injected into the lid-sutured eye. The recordings were made on awake animals while they watched movie clips and sinusoidal gratings. A surprising result of these experiments is that inactivation does not reduce mean activity in LGN compared to lid suture; however, inactivation produced an increase in correlations between different cells within the LGN. Previous experimental results in ferret LGN [15, 16], and further results in mouse LGN [14], indicate that the activity of nearby cells in LGN is correlated, that this correlation falls off as a function of the distance between the receptive fields of the cells, and that these correlations exist even in the absence of retinal activity.

A recent paper [17] has examined the impact of input correlations during deprivation experiments and has shown that correlations in LGN noise can significantly slow down the loss of response to the deprived eye for BCM neurons [8], in agreement with experiments. This paper also examines the effect of such correlations on a class of PCA-type learning rules. Thus correlated LGN noise theoretically leads to persistence of synaptic strength for a period that depends on the level of noise correlation. As a consequence, noise at the neuronal level could play a fundamental role in synaptic plasticity. The effect of white noise on BCM has been studied previously [18]. In this paper we show that noise correlations in the input to, or among, BCM neurons can be studied by using the eigenvalue distributions of random positive-definite symmetric matrices. In a simple but generic case, we explicitly compute the distribution by using the Wigner semicircular law [19, 20], pointing out the role of correlations and the existence of a thermodynamic limit. In the finite-dimensional case, the analytic results are compared with numerical simulations with applications to realistic conditions. We also discuss a transition in the eigenvalue distribution when the noise level is increased. This phenomenon implies the existence of states with very long lifetimes; these could have physiological significance in the development of neuronal systems.

II. BCM NEURON IN MONOCULAR DEPRIVATION

We briefly review the behavior of a Bienenstock, Cooper and Munro (BCM) neuron [8] in monocular deprivation. Let w be the synaptic weights and x the input signals received by the synapses; the BCM synaptic modification rule has the form

    \dot{w} = x\,\phi(y,\theta_m)                                          (1)

where the modification function φ(y, θ_m) depends on the neuron activity level y ∝ x·w (a linear proportionality between the input x and the output y is assumed) and on a moving threshold θ_m, which is a super-linear function of the cell activity history (in a stationary situation θ_m can be related to the time-averaged value ⟨y^k⟩, with k > 1, of a non-linear moment of the neuron activity distribution) [4]. The modification function φ is a non-linear function of the postsynaptic activity y with two zero crossings, one at y = 0 and the other at y = θ_m (see fig. 1). When the neuron activity is above the threshold θ_m we have LTP, whereas LTD appears when the activity is below the threshold. The nonlinear dependence of the threshold on neuron activity solves the stability problem of Hebb's learning rule, preventing a dynamical divergence of the synaptic weights (y = θ_m is an attractive fixed point for a stationary input) [4].

FIG. 1: The BCM function φ(y); when y is close to zero, we can linearize the system (1) by approximating φ ≃ −εy.

In the simplest form the function φ is the quadratic function φ(y) = y² − yθ_m, and the dynamic threshold θ_m is the time average ⟨y²⟩ of the second moment of the neuron activity, which can be replaced by the expectation value E(y²) over the input probability space under the slow-learning assumption. During the MD regime, the input signal x to the closed eye can be represented by a stochastic process with zero mean value and small variance (⟨x²⟩ ≪ 1). One can numerically study the effect of noise correlation by integrating the system (1).
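For concreteness, the following minimal sketch (our own illustrative construction, not code from the paper) integrates eq. (1) for a single neuron with the quadratic choice φ(y, θ_m) = y² − yθ_m and a sliding threshold θ_m obtained as a running time average of y² (i.e. k = 2). The input statistics, learning rate, averaging time and Euler step are arbitrary illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 10            # number of synapses (illustrative)
eta = 1e-3        # learning rate (illustrative)
tau_theta = 200.0 # averaging time for the sliding threshold (illustrative)

w = rng.normal(scale=0.1, size=N)   # synaptic weights
theta_m = 1e-2                      # sliding threshold, running average of y^2

def phi(y, theta):
    """Quadratic BCM modification function phi(y) = y^2 - y*theta."""
    return y * y - y * theta

for t in range(50_000):
    x = rng.normal(loc=0.5, scale=0.2, size=N)   # stationary stochastic input (illustrative)
    y = x @ w                                    # linear neuron activity, y = x.w
    w += eta * x * phi(y, theta_m)               # Euler step of eq. (1)
    theta_m += (y * y - theta_m) / tau_theta     # slow running average of y^2 for theta_m

print("final |w| =", np.linalg.norm(w), " theta_m =", theta_m)
```

With a stationary input the trajectory settles near the attractive fixed point y = θ_m described above; the sketch is only meant to make the structure of eq. (1) explicit.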
Let us introduce the input noise

    x(t) = A\,\xi(t)                                                       (2)

where ξ(t) is a stochastic process in R^N defined by i.i.d. random variables with zero mean value and normalized variance. We consider a symmetric matrix A of the form

    A = \frac{1}{\sqrt{1+q^2}}
        \begin{pmatrix}
          1 & \frac{q}{\sqrt{N}} a_{12} & \dots & \frac{q}{\sqrt{N}} a_{1N} \\
          \frac{q}{\sqrt{N}} a_{12} & 1 & \dots & \frac{q}{\sqrt{N}} a_{2N} \\
          \vdots & \vdots & \ddots & \vdots \\
          \frac{q}{\sqrt{N}} a_{1N} & \frac{q}{\sqrt{N}} a_{2N} & \dots & 1
        \end{pmatrix}, \qquad N \gg 1                                      (3)

where the coefficients a_{ij} are independent realizations of a normalized random variable with zero mean value. The covariance matrix of the noise x(t) is

    A A^T \simeq
        \begin{pmatrix}
          1 & \frac{2q}{\sqrt{N}(1+q^2)} a_{12} & \dots & \frac{2q}{\sqrt{N}(1+q^2)} a_{1N} \\
          \frac{2q}{\sqrt{N}(1+q^2)} a_{12} & 1 & \dots & \frac{2q}{\sqrt{N}(1+q^2)} a_{2N} \\
          \vdots & \vdots & \ddots & \vdots \\
          \frac{2q}{\sqrt{N}(1+q^2)} a_{1N} & \frac{2q}{\sqrt{N}(1+q^2)} a_{2N} & \dots & 1
        \end{pmatrix}

Then for a given N we can vary the correlation among the noise components by varying q ∈ [0, 1] while keeping the noise variance fixed. We have simulated the weight dynamics in eq. (1) using the modification potential φ(y) = y(y − 1), where we set the neuron activity y = (x·w)/N to study the limit N ≫ 1. In this simple model, if the activity y is below the threshold θ_m = 1 we expect LTD and the synaptic weights w relax towards 0. To characterize the relaxation process, we compute the mean square value of the vector w(t); the results are plotted in fig. 2 for increasing values of q. In all cases the system (1) tends to a stationary solution. For weakly correlated noise we recover the equilibrium solution w = 0 as expected, but when the input noise is strongly correlated, the simulations show the existence of nonzero long-time persistent states for the synaptic weights. Moreover, the numerical simulations suggest that the persistent states correspond to eigenvectors of the noise covariance matrix with very small eigenvalues. These results are consistent with experimental observations [14] on the activity of neurons in LGN in the case of monocular deprivation.

FIG. 2: Evolution of the norm of BCM synaptic weights when the input is the correlated noise defined by eq. (2). We consider increasing values of q: q = 0.25, 0.50, 0.75, 1.00 for N = 100. The curves give the evolution of the mean square value of the synaptic weights w as the input noise correlation increases with the noise variance kept fixed (time is in arbitrary units). We note the existence of nonzero long-time persistent states when the correlation is high, whereas for low correlation values the system tends to the equilibrium state w = 0.

A theoretical approach linearizes the system (1) around y ≃ 0 by considering θ_m constant. Then eq. (1) becomes (cf. fig. 1)

    \dot{w} = -\epsilon\, x\,(w \cdot x)                                   (4)

so that on average we have

    \dot{w}_i = -\epsilon \sum_{j=1}^{N} C_{ij} w_j                        (5)

where C is the noise covariance matrix. In the case of uncorrelated input (C_{ij} = n_i^2 δ_{ij}, where n_i^2 is the i-th component noise variance) equation (4) becomes

    \dot{w}_i = -\epsilon\, n_i^2\, w_i

and w_i(t) tends exponentially to zero with a time scale ≃ 1/(εn_i²). In the generic case the system (5) can be solved by diagonalizing the covariance matrix C and expanding the synaptic weights w in the eigenvector base. Each component w_λ in this base then evolves according to

    w_\lambda(t) = w_\lambda(0)\, e^{-\lambda t}

where λ is the corresponding eigenvalue of the covariance matrix. As a consequence, when λ ≪ 1 we have long-time nontrivial persistent states for the synaptic weights. According to the numerical simulations shown in fig. 2, this is the case for correlated noisy input.
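The simulation just described can be sketched as follows (a minimal re-implementation under our own assumptions about the integration step, learning rate and number of steps; it is not the authors' code). It builds the mixing matrix A of eq. (3), drives eq. (1) with x = Aξ and φ(y) = y(y − 1), and tracks the mean square synaptic weight for several values of q, in the spirit of fig. 2.

```python
import numpy as np

def mixing_matrix(N, q, rng):
    """Symmetric mixing matrix A of eq. (3): ones on the diagonal,
    off-diagonal entries (q/sqrt(N)) a_ij with a_ij i.i.d. standard normal,
    all scaled by 1/sqrt(1+q^2) so the noise variance stays close to 1."""
    a = rng.normal(size=(N, N))
    a = np.triu(a, k=1)
    a = a + a.T                        # symmetric coefficients a_ij, zero diagonal
    A = np.eye(N) + (q / np.sqrt(N)) * a
    return A / np.sqrt(1.0 + q**2)

def simulate(q, N=100, steps=10_000, eps=0.05, seed=1):
    """Euler integration of eq. (1) with phi(y) = y(y-1), y = (x.w)/N, x = A xi."""
    rng = np.random.default_rng(seed)
    A = mixing_matrix(N, q, rng)
    w = rng.normal(size=N)
    ms = np.empty(steps)
    for t in range(steps):
        xi = rng.normal(size=N)        # i.i.d. noise, zero mean, unit variance
        x = A @ xi                     # correlated input noise, eq. (2)
        y = (x @ w) / N                # neuron activity in the N >> 1 scaling
        w += eps * x * y * (y - 1.0)   # Euler step of w' = x*phi(y)
        ms[t] = np.mean(w**2)
    return ms

for q in (0.25, 0.5, 0.75, 1.0):
    ms = simulate(q)
    print(f"q={q:4.2f}  mean square of w after {len(ms)} steps: {ms[-1]:.4f}")
```

For small q the mean square weight relaxes towards zero, while for q close to 1 the slow eigendirections of the noise covariance keep part of the weight vector alive, as in the discussion above.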
In order to perform explicit calculations, we assume that the covariance matrix is a random positive-definite symmetric matrix of the form

    C_{kl} = n^2 \left( C^0_{kl} + m V_{kl} + O(m^2) \right)               (6)

where

    C^0 = \begin{pmatrix}
            1 & q & \dots & q \\
            q & 1 & \dots & q \\
            \vdots & \vdots & \ddots & \vdots \\
            q & q & \dots & 1
          \end{pmatrix}, \qquad q \le 1                                    (7)

and V_{kl} is a symmetric random matrix with zero mean value, normalized variance and i.i.d. elements. The parameter q ∈ [0, 1] determines the average correlation among the inputs, and the parameter m² is the variance of the fluctuations of the covariance matrix. The matrix (7) has the eigenvalues

    \lambda_i = \begin{cases} 1 + (N-1)q & i = 1 \\ 1 - q & i = 2, \dots, N \end{cases}        (8)

Therefore when q → 1 (i.e. as the average correlation increases) the eigenvalues λ_{2,...,N} tend to 0 linearly in (1 − q) and the correlation matrix C⁰ becomes degenerate. When m = 0, the effect of correlations is to increase the decay time of w^(k) ∝ 1/(1 − q). In the next section we analyze the effect of the fluctuations V_{kl} on the eigenvalue distributions of the matrix (6) in the thermodynamic limit N → ∞.
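The spectrum (8) of the unperturbed matrix C⁰ is easy to check numerically; the sketch below (illustrative values of N and q, not from the paper) confirms the single large eigenvalue 1 + (N − 1)q and the (N − 1)-fold degenerate eigenvalue 1 − q.

```python
import numpy as np

N, q = 200, 0.8
C0 = np.full((N, N), q) + (1.0 - q) * np.eye(N)   # matrix (7): 1 on the diagonal, q elsewhere

lam = np.linalg.eigvalsh(C0)                      # eigenvalues in ascending order
# eq. (8): one eigenvalue 1 + (N-1)q, the remaining N-1 equal to 1 - q
print("largest eigenvalue  :", lam[-1], " expected:", 1 + (N - 1) * q)
print("degenerate eigenvalue:", lam[0], "...", lam[-2], " expected:", 1 - q)
```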
III. EIGENVALUE DISTRIBUTIONS

We consider the eigenvalue distributions of the matrix C_{kl} (see eq. (6)) using the unperturbed covariance matrix (7). Since C_{kl} is a symmetric positive-definite matrix, we can introduce the representation

    C = (\sqrt{C^0} + m W)^2 = C^0 + m(\sqrt{C^0} W + W \sqrt{C^0}) + m^2 W^2        (9)

where √C⁰ is the symmetric square root of the matrix (7) and W is a symmetric random matrix with i.i.d. elements and zero mean value. By direct calculation we obtain the relation between V and W

    V = \sqrt{C^0}\, W + W \sqrt{C^0}

Then W is uniquely determined as a function of V, and the expectation value of the covariance matrix is

    \langle C \rangle = C^0 + N m^2 I

In order to control the noise variance in the thermodynamic limit N → ∞, we introduce the scaling parameter m₀

    m = m_0 \sqrt{\frac{1-q}{N}}                                           (10)

This is consistent with the noise definition used in the numerical simulations (cf. eq. (3)). The √(1 − q) dependence allows us to control the limit q → 1, where the unperturbed matrix C⁰ is degenerate. In real systems both q and N are fixed by physical conditions, and m₀ is proportional to the fluctuations of the covariance matrix. In order to study the eigenvalue distribution of the covariance matrix C we consider the case where W is a Wigner matrix [21]. Then Wigner's theorem [19, 20] on the eigenvalue distribution can be stated as:

Let W_{ij} be a symmetric N × N random matrix with i.i.d. elements whose common distribution has zero mean value, normalized variance and is even; then the probability P_N of finding an eigenvalue in the interval [√N α, √N β], with −2 < α < β < 2, satisfies in the thermodynamic limit

    \lim_{N \to \infty} P_N(\sqrt{N}\alpha < \mu < \sqrt{N}\beta) = \int_\alpha^\beta \frac{(4-\mu^2)^{1/2}}{2\pi}\, d\mu

To characterize the eigenvalue distribution when the covariance matrix C has the form (9), we first perform the orthogonal transformation O that diagonalizes the unperturbed matrix (7). We then restrict our analysis to the invariant subspace that corresponds to the degenerate eigenvalue 1 − q. In the thermodynamic limit the eigenvalue 1 + (N − 1)q is singular and decouples the corresponding invariant subspace. The covariance matrix (9) takes the form

    C' = O C O^T = \left( \sqrt{1-q}\, I_N + m_0 \sqrt{\frac{1-q}{N}}\, \hat{W} \right)^2, \qquad N \gg 1        (11)

where Ŵ = O W Oᵀ. The eigenvalue equation is

    \det\left[ \lambda I_N - \left( \sqrt{1-q}\, I_N + m_0 \sqrt{\frac{1-q}{N}}\, \hat{W} \right)^2 \right] = 0        (12)

This is equivalent to the condition

    \det\left[ \sqrt{N}\mu\, I_N - \hat{W} \right] = 0                     (13)

where we have the relation

    \lambda = (1 + m_0 \mu)^2 (1-q)                                        (14)

and the µ are the eigenvalues of a Wigner matrix scaled by √N. As a consequence we have the following Lemma:

Any eigenvalue λ of the matrix (11) can be written in the form (14), where √N µ is an eigenvalue of the Wigner matrix Ŵ, so that µ ∈ (−2, 2).

The relation (14) leads to the existence of two regions in the λ spectrum according to the sign of m₀µ + 1. When µ > −1/m₀ we invert the relation (14) by taking the positive branch of the square root

    \mu = \frac{\sqrt{\lambda} - \sqrt{1-q}}{m_0 \sqrt{1-q}}               (15)

but when µ < −1/m₀ we have to consider the negative branch

    \mu = \frac{-\sqrt{\lambda} - \sqrt{1-q}}{m_0 \sqrt{1-q}}              (16)

The value µ = −1/m₀ is a critical value for the λ spectrum, and we expect a singularity in the eigenvalue distribution. The second case occurs only if m₀ > 1/2, so that there exists a critical threshold in the noise level that induces a transition in the λ distribution. Applying Wigner's theorem to obtain the distribution density of µ in the thermodynamic limit, we get

    \lim_{N \to \infty} \rho(\mu) = \frac{(4-\mu^2)^{1/2}}{2\pi}

so that in the case m₀ < 1/2 the distribution ρ(λ) follows from the relation (15) according to [22, 23]

    \rho(\lambda) = \rho(\mu) \left| \frac{d\mu}{d\lambda} \right|
                  = \frac{\sqrt{4 m_0^2 - \left( \sqrt{\lambda/(1-q)} - 1 \right)^2}}{4\, \sqrt{(1-q)\lambda}\; m_0^2\, \pi}        (17)

where

    (1-q)(1-2m_0)^2 < \lambda < (1-q)(1+2m_0)^2

In the case m₀ > 1/2 we split the λ distribution into two parts: when

    (1-q)(2m_0-1)^2 < \lambda < (1-q)(1+2m_0)^2

ρ(λ) is given by eq. (17), whereas when

    0 < \lambda < (1-q)(2m_0-1)^2

the distribution is

    \rho(\lambda) = \frac{\sqrt{4 m_0^2 - \left( \sqrt{\lambda/(1-q)} - 1 \right)^2}}{4\, \sqrt{(1-q)\lambda}\; m_0^2\, \pi}
                  + \frac{\sqrt{4 m_0^2 - \left( \sqrt{\lambda/(1-q)} + 1 \right)^2}}{4\, \sqrt{(1-q)\lambda}\; m_0^2\, \pi}        (18)

In the second case the distribution is continuous for λ > 0 and singular as λ → 0. This result leads to the possibility that the correlation matrix (9) has eigenvalues arbitrarily close to zero as well as larger than one (corresponding to anti-correlation effects in the noise). The unperturbed correlation q among the initial processes x(t) introduces what is essentially a scaling factor that changes the existence interval of the λ spectrum. The critical value m₀ = 1/2 corresponds to a critical value of the fluctuation variance m² (cf. definition (10))

    m^2_{crit} = \frac{1-q}{4N}                                            (19)

From a biophysical point of view, the result (19) implies that for sufficiently correlated noise there exist eigenvectors w^(k) of the synaptic weights that are preserved for a very long time. This might be a possible mechanism for maintaining synaptic weights in neuronal systems in correlated noisy environments.
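The densities (17) and (18) can be evaluated as in the sketch below; it is a direct transcription of the formulas above (the evaluation grid and the normalization check are our own additions), with the negative-branch term switched on only for m₀ > 1/2.

```python
import numpy as np

def rho_lambda(lam, q, m0):
    """Theoretical eigenvalue density of eqs. (17)-(18): the semicircle
    density of mu mapped through lambda = (1 + m0*mu)^2 (1 - q), mu in (-2, 2).
    For m0 > 1/2 the negative branch (16) adds a second term at small lambda."""
    lam = np.atleast_1d(np.asarray(lam, dtype=float))
    u = np.sqrt(lam / (1.0 - q))          # u = sqrt(lambda/(1-q)) = |1 + m0*mu|
    out = np.zeros_like(lam)

    def branch(s):
        # s = -1.0: positive branch (15); s = +1.0: negative branch (16)
        arg = 4.0 * m0**2 - (u + s) ** 2
        term = np.zeros_like(lam)
        ok = (arg > 0) & (lam > 0)
        term[ok] = np.sqrt(arg[ok]) / (4.0 * np.pi * m0**2 * np.sqrt((1.0 - q) * lam[ok]))
        return term

    out += branch(-1.0)                   # eq. (17)
    if m0 > 0.5:
        out += branch(+1.0)               # additional term of eq. (18)
    return out

# numerical check: the density integrates to ~1 on its support
q, m0 = 0.5, 0.6
grid = np.linspace(1e-6, (1 - q) * (1 + 2 * m0) ** 2, 200_000)
vals = rho_lambda(grid, q, m0)
print("integral of rho ~", np.sum(0.5 * (vals[1:] + vals[:-1]) * np.diff(grid)))
```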
IV. NUMERICAL RESULTS ON FINITE DIMENSION COVARIANCE MATRICES

To study the effects of finite dimension on the theoretical results (17) and (18), we have performed numerical simulations using matrices of dimension N = 10³, which might be realistic for a neural network in LGN. In fig. 3 we show the eigenvalue distribution (17) for m₀ = .2 and different values of the correlation (q = .5, .7, .9).

FIG. 3: Plots of the eigenvalue distribution ρ(λ) (see definition (17)) for the correlation matrix (3) using m₀ = .2 and q = .5, .7, .9 (the dash-dotted, dotted and dashed lines respectively). As the average correlation q increases, the eigenvalue spectrum is squeezed towards the origin.

To see the effect of the transition in the distribution function at the critical value (19), we computed the eigenvalue distribution in the cases q = .5 and m₀ = .4, .6, .8. The results are shown in fig. 4.

FIG. 4: Plot of the eigenvalue distribution for the correlation matrix (3) using q = .5 and m₀ = .2, .4, .6, .8 (from left to right and top to bottom). The effect of the transition at the critical value m₀ = .5 is clearly seen: the eigenvalues accumulate at the origin. The continuous line is the theoretical distribution (17), whereas the histograms are numerically computed using a random matrix of order N = 10³.

We observe that, as in the case of low correlation, the eigenvalue distributions become broader than in the correlated case and take values beyond 1 as m₀ increases. When m₀ → 1/2 the distribution becomes singular and the distribution peak tends to zero, so that there exist eigenvectors w^(k) with extremely long decay times in eq. (4).
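A sketch of this finite-dimensional experiment is given below (our own illustrative implementation, using a Gaussian Wigner matrix and the rescaled form (11)); it reports the numerical support of the spectrum against the prediction and the fraction of very small eigenvalues, which grows once m₀ exceeds the critical value 1/2. The cutoff used to count "slow" modes is an arbitrary illustrative choice.

```python
import numpy as np

def covariance_spectrum(N, q, m0, seed=0):
    """Eigenvalues of C' = (sqrt(1-q) I + m0 sqrt((1-q)/N) W)^2, eq. (11),
    with W a Gaussian Wigner matrix (symmetric, zero-mean, unit-variance entries)."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(N, N))
    W = (W + W.T) / np.sqrt(2.0)                   # symmetric, unit-variance off-diagonal
    M = np.sqrt(1 - q) * np.eye(N) + m0 * np.sqrt((1 - q) / N) * W
    return np.linalg.eigvalsh(M @ M)               # spectrum of the squared matrix

N, q = 1000, 0.5
for m0 in (0.2, 0.4, 0.6, 0.8):
    lam = covariance_spectrum(N, q, m0)
    lo = 0.0 if m0 > 0.5 else (1 - q) * (1 - 2 * m0) ** 2   # predicted lower edge
    hi = (1 - q) * (1 + 2 * m0) ** 2                        # predicted upper edge
    slow = np.mean(lam < 0.05)                              # fraction of very slow modes (illustrative cutoff)
    print(f"m0={m0:.1f}  min={lam.min():.3f}  max={lam.max():.3f}  "
          f"predicted support ~[{lo:.3f}, {hi:.3f}]  fraction lambda<0.05: {slow:.2f}")
```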
V. GENERALIZATION TO A NEURAL NETWORK

For a network of neurons, eq. (1) can be generalized [24] to

    \dot{w}^{(k)} = x^{(k)}\, \phi(y^{(k)}, \theta_m^{(k)}), \qquad k = 1, \dots, N        (20)

where the index k refers to the k-th neuron of the network and N is the number of neurons. We then have

    y^{(k)} = x^{(k)} \cdot w^{(k)} + \sum_{l=1}^{N} L_{kl}\, y^{(l)}                      (21)

where we have introduced lateral connections L among the neurons. We can have both excitatory and inhibitory neural connections according to the sign of L_{kl}. In what follows we assume that L is a symmetric matrix with ‖L‖ ≪ 1, where ‖·‖ denotes the usual matrix norm. Therefore the neurons form a bidirectional symmetric network, and we can invert the relation (21) to obtain

    y^{(k)} = \sum_{l=1}^{N} (I-L)^{-1}_{kl}\, x^{(l)} \cdot w^{(l)}                       (22)

For monocular deprivation eq. (20) becomes a network generalization of eq. (4)

    \dot{w}^{(k)} = -\epsilon^{(k)}\, x^{(k)}\, y^{(k)}, \qquad k = 1, \dots, N            (23)

where the ε^(k) are suitable positive constants. By substituting eq. (22) into (23), we get the linear system

    \dot{w}^{(k)} = -\epsilon^{(k)}\, x^{(k)} \sum_{l=1}^{N} (I-L)^{-1}_{kl}\, x^{(l)} \cdot w^{(l)}, \qquad k = 1, \dots, N        (24)

In what follows we regard the inter-neuron connections L as non-modifiable. Moreover, for simplicity we reduce the external input to one dimension for each neuron, so that both w^(k) and x^(k) are scalar. In order to study the dynamical properties of the solutions we average over the fast-scale variations of the noise; the equations (24) become

    \dot{w}^{(k)} = -\epsilon^{(k)} \sum_{l=1}^{N} (I-L)^{-1}_{kl}\, C_{kl}\, w^{(l)}, \qquad k = 1, \dots, N        (25)

where C_{kl} is the noise covariance matrix between the inputs of the k-th and l-th neurons. We consider the connection matrix L as a symmetric random matrix, so that according to our hypotheses the matrix (I − L)⁻¹ is a symmetric positive-definite matrix. Then (25) can be written in the form

    \dot{w}^{(k)} = -\epsilon^{(k)} \sum_{l=1}^{N} \hat{L}_{kl}\, w^{(l)}, \qquad k = 1, \dots, N        (26)

where

    \hat{L}_{kl} = (I-L)^{-1}_{kl}\, C_{kl}                                (27)

is still a symmetric positive-definite matrix. If the input signals are independent (C_{kl} = n_k^2 δ_{kl}), we obtain

    \dot{w}^{(k)} = -\epsilon^{(k)}\, n_k^2\, (I-L)^{-1}_{kk}\, w^{(k)}, \qquad k = 1, \dots, N

and the w^(k) decay exponentially towards zero. Again the interesting case is when the covariance matrix is not diagonal and the connections L are defined by a random matrix; from a biological point of view we are modeling an ensemble of neurons which are stimulated by correlated inputs and have bilateral connections with random weights. When L̂ has the form (cf. eq. (6))

    \hat{L}_{kl} = C^0_{kl} + m V_{kl} + O(m^2)                            (28)

we can apply the method used for a single neuron to study the eigenvalue distribution. But in the network case we have a new biophysical interpretation of the critical value (19): if the number of neurons is sufficiently large and/or the fluctuations in the bilateral neural connections exceed a threshold, the network is able to develop w^(k) eigenvectors with extremely long lifetimes in the presence of noisy correlated input. This phenomenon suggests a mechanism for long network lifetimes in noisy environments. The possibility that correlated noise plays a role in the dynamics of a neural network has been investigated in [25] using different models.
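The averaged network dynamics (25)-(27) can be sketched as follows (an illustrative construction with arbitrary parameter values, taking ε^(k) = 1 for simplicity). Note that eq. (27) is an entrywise (Schur) product of (I − L)⁻¹ and C, which preserves symmetry and positive definiteness; the slowest decay times are read off as 1/λ from the smallest eigenvalues of L̂.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 50

# Symmetric lateral connections with small norm, ||L|| << 1 (illustrative scale)
L = rng.normal(scale=0.02, size=(N, N))
L = (L + L.T) / 2.0

# Noise covariance between the inputs of neurons k and l: the correlated
# form C0 + m V used for the single neuron (illustrative q and m, chosen
# small enough that C stays positive definite)
q, m = 0.9, 0.005
V = rng.normal(size=(N, N))
V = (V + V.T) / np.sqrt(2.0)
C = np.full((N, N), q) + (1 - q) * np.eye(N) + m * V

# Effective matrix of eqs. (26)-(27): entrywise product of (I-L)^(-1) and C
# (a Schur product of two symmetric positive-definite matrices, hence positive definite)
Lhat = np.linalg.inv(np.eye(N) - L) * C          # '*' is elementwise, as in eq. (27)

lam = np.linalg.eigvalsh(Lhat)                   # ascending eigenvalues
print("smallest eigenvalues          :", lam[:3])
print("longest decay times 1/lambda  :", 1.0 / lam[:3])
```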
VI. CONCLUSIONS

We have analyzed the effects of noise correlations in the input to, or among, BCM neurons, using the Wigner semicircular law to construct random, positive-definite symmetric correlation matrices and computing their eigenvalue distributions. In the finite-dimensional case, our analytic results are compared with numerical simulations. We thus show the effects of correlations on the lifetimes of the synaptic strengths in various visual environments. In the case of a single neuron the noise correlations arise from the input LGN neurons, whereas in a neuron network the correlations arise also from the lateral connections. If the dimensionality of the system is fixed, we show that when the fluctuations of the covariance matrix exceed a critical threshold, synapses with long lifetimes arise. These results may be of physiological significance and can be tested experimentally.

[1] M. F. Bear and C. D. Rittenhouse, J. Neurobiol. 41(1), 83 (1999).
[2] F. Sengpiel and P. C. Kind, Curr. Biol. 12(23), R818 (2002).
[3] B. S. Blais, N. Intrator, H. Shouval, and L. N. Cooper, Proceedings of the National Academy of Sciences 10(7), 1797 (1998).
[4] L. N. Cooper, N. Intrator, B. S. Blais, and H. Z. Shouval, Theory of Cortical Plasticity (World Scientific, New Jersey, 2004).
[5] T. Wiesel and D. Hubel, Journal of Physiology 180, 106 (1962).
[6] C. D. Rittenhouse, H. Z. Shouval, M. A. Paradiso, and M. F. Bear, Nature 397, 347 (1999).
[7] M. Y. Frenkel and M. F. Bear, Neuron 44, 917 (2004).
[8] E. L. Bienenstock, L. N. Cooper, and P. W. Munro, Journal of Neuroscience 2, 32 (1982).
[9] J. R. Whitlock, A. J. Heynen, M. G. Shuler, and M. F. Bear, Science 313, 1093 (2006).
[10] G. C. Castellani, E. M. Quinlan, L. N. Cooper, and H. Z. Shouval, Proceedings of the National Academy of Sciences 98(22), 12772 (2001).
[11] L. F. Abbott and S. B. Nelson, Nature Neuroscience 3, 1178 (2000).
[12] W. Gerstner, R. Kempter, J. L. van Hemmen, and H. Wagner, Nature 383, 76 (1996).
[13] W. Gerstner and W. M. Kistler, Biological Cybernetics 87, 404 (2002).
[14] M. L. Arnold, J. Heynen, R. H. Haslinger, and M. F. Bear, Nature Neuroscience 12, 390 (2009).
[15] M. Weliky and L. C. Katz, Science 285, 599 (1999).
[16] T. Ohshiro and M. Weliky, Nat. Neurosci. 9(12), 1541 (2006).
[17] B. S. Blais, L. N. Cooper, and H. Shouval, Effect of correlated LGN firing rates on predictions for monocular eye closure vs monocular retinal inactivation, submitted for publication.
[18] A. Bazzani, D. Remondini, N. Intrator, and G. C. Castellani, Neural Computation 15(7), 1621 (2003).
[19] E. P. Wigner, The Annals of Mathematics 67(2), 325 (1958).
[20] E. P. Wigner, SIAM Review 9, 1 (1967).
[21] M. L. Mehta, Random Matrices, vol. 142 (Elsevier Pure and Applied Mathematics, 2004).
[22] V. Marchenko and L. Pastur, Mat. Sb. 72, 507 (1967).
[23] L. A. Pastur, Annales de l'I.H.P., section A 64, 325 (1996).
[24] G. C. Castellani, N. Intrator, H. Shouval, and L. N. Cooper, Network: Comput. Neural Syst. 10, 111 (1999).
[25] E. Glatt, H. Busch, F. Kaiser, and A. Zaikin, Phys. Rev. E 73, 026216 (2006).
