Program

You can import the events of FCPNLO into your Google calendar by pressing the Subscribe button at the bottom right.

Luz Roncal
Sep 28 @ 9:30 am – 10:30 am

On extension problems and Hardy inequalities in the Heisenberg group.

We prove Hardy-type inequalities for fractional powers of the sublaplacian in the Heisenberg group. In order to get these inequalities, we study the extension problem associated to the sublaplacian. Solutions of the extension problem are written down explicitly and used to establish a trace Hardy inequality that will lead to a Hardy inequality with sharp constants.
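
For orientation only (this is the classical Euclidean model case, recalled here as background and not taken from the abstract), the extension problem for the fractional Laplacian \((-\Delta)^s\), \(0<s<1\), asks for a function \(u(x,y)\) on \(\mathbb{R}^n\times(0,\infty)\) satisfying
\[
\Delta_x u + \frac{1-2s}{y}\,\partial_y u + \partial_{yy} u = 0, \qquad u(x,0)=f(x),
\]
with the nonlocal operator recovered from the boundary flux, \(-\lim_{y\to 0^+} y^{1-2s}\,\partial_y u(x,y) = c_s\,(-\Delta)^s f(x)\). The talk concerns the counterpart of this problem for the sublaplacian in the Heisenberg group, where the explicit solutions mentioned above are constructed.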

Several new results concerning the extension problem in the Heisenberg group are also obtained, including characterisations of all solutions of the extension problem satisfying \(L^p\) integrability, and a study of the higher-order extension problem.

This is joint work with S. Thangavelu (Indian Institute of Science, Bangalore, India).

Enrico Valdinoci
Sep 28 @ 10:30 am – 11:30 am

Chaotic orbits for nonlocal equations and applications to atom dislocation dynamics in crystals.

We consider a nonlocal equation driven by a perturbed periodic potential. We construct multibump solutions that connect one integer point to another one in a prescribed way.

In particular, heteroclinic, homoclinic and chaotic trajectories are constructed.

This result regarding symbolic dynamics in a fractional framework is part of a study of the Peierls-Nabarro model for crystal dislocations. The associated evolution equation can be studied in the mesoscopic and macroscopic limit. Namely, the dislocation function has the tendency to concentrate at single points of the crystal, where the size of the slip coincides with the natural periodicity of the medium.

These dislocation points evolve according to the external stress and an interior potential, which can be either repulsive or attractive, depending on the relative orientations of the dislocations. For opposite orientations, collisions occur, after which the system relaxes exponentially fast.
Coffee Break
Sep 28 @ 11:30 am – 11:45 am
Serena Dipierro
Sep 28 @ 11:45 am – 12:30 pm

Symmetry properties for long-range phase coexistence models.

We discuss some recent results on nonlocal phase transitions modelled by the fractional Allen-Cahn equation, also in connection with the surfaces minimising a nonlocal perimeter functional. In particular, we consider the “genuinely nonlocal regime”, in which the diffusion operator is of order less than 1, and present some rigidity and symmetry results.
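
For concreteness (the normalisation below is a standard choice and is not taken from the abstract), the fractional Allen-Cahn equation can be written as
\[
(-\Delta)^s u + W'(u) = 0 \quad \text{in } \mathbb{R}^n, \qquad W(u)=\tfrac14\,(1-u^2)^2,
\]
and the genuinely nonlocal regime corresponds to \(s\in(0,\tfrac12)\), so that the diffusion operator \((-\Delta)^s\) has order \(2s<1\).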

Laura Sacerdote
Sep 28 @ 12:30 pm – 1:30 pm

A consistency problem in neural modelling. Coherence between input and output can be obtained using heavy-tailed distributions.

The coherence between the input and the output of single units is a sometimes underestimated problem in network modeling.

An example in this direction is given by Integrate and Fire models, used to describe the membrane potential dynamics of a neuron in a network. The focus of these models is the description of the inter-times between events (the interspike intervals, ISIs), i.e. of the output of the neuron. Models of this type describe the membrane potential evolution through a suitable stochastic process, and the output of the neuron corresponds to the first passage time of this process through a boundary. However, the input mechanism determining the membrane potential dynamics often disregards the fact that the input is itself a function of the output of other units.

The seminal idea for these models goes back to 1964, when Gerstein and Mandelbrot proposed the Integrate and Fire model to account for the observed stable behavior of the interspike interval distribution. They suggested modelling the membrane potential dynamics through a Wiener process in order to obtain the inverse Gaussian distribution for the inter-times between successive spikes of the neuron, i.e. its output. The use of the Wiener process was first motivated by its property of being the continuous limit of a random walk; later, the randomized random walk was proposed to account for the continuous time characterizing the membrane potential dynamics. In this model, the arrival of inputs from the network determines jumps of fixed size in the membrane potential value. When the membrane potential attains a threshold value \(S\), the neuron releases a spike and the process restarts anew. Furthermore, the inter-times between jumps are exponentially distributed.
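
As a minimal illustrative sketch (not part of the abstract; the parameter values, function names and the use of Python are assumptions made here for illustration), the randomized-random-walk Integrate and Fire mechanism just described can be simulated as follows: inputs arrive after exponentially distributed waiting times, each input moves the membrane potential by a fixed jump, and the ISI is the first passage time through the threshold \(S\).

```python
import random

# Minimal sketch of a randomized-random-walk Integrate and Fire unit
# in the spirit of Gerstein & Mandelbrot (1964). Parameter values are
# illustrative only.

def isi_exponential_input(rng, rate=1.0, jump=1.0, p_up=0.6, S=10.0):
    """Return one interspike interval: the first passage time through S."""
    v, t = 0.0, 0.0
    while v < S:
        t += rng.expovariate(rate)                    # exponential inter-input time
        v += jump if rng.random() < p_up else -jump   # fixed-size jump, up or down
    return t

if __name__ == "__main__":
    rng = random.Random(0)
    isis = [isi_exponential_input(rng) for _ in range(5000)]
    print("mean ISI:", sum(isis) / len(isis))
```

With an excitatory bias (p_up > 0.5) the walk crosses the threshold with probability one, so each simulated ISI is finite.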

Unfortunately, this last hypothesis contradicts the heavy-tailed distribution of the output, since the incoming inputs are themselves outputs of other neurons. Later, many variants of the original model appeared in the literature. Their aim was to improve the realism of the model, but unfortunately they lost sight of its initial motivation: the heavy tails of the observed output distribution.

However, these models (or their variants) are generally recognized as a good compromise between realism and ease of use, and have been proposed to model large networks. These facts motivate us to rethink the Integrate and Fire model, allowing heavy-tailed distributions both for the ISIs of the neurons surrounding the modeled neuron and for its output.

Here, we propose to start the model formulation from this main property, i.e. the heavy tails exhibited by the ISIs. This approach allows us to formulate an Integrate and Fire model coherent with these features. The ideal framework for this rethinking involves regularly varying random variables and vectors.

We assume that each input to a unit corresponds to the output of one of \(N < \infty\) neurons of the network. The inter-times between spikes of the same neuron are independent random variables with regularly varying distribution. Different neurons of the network are not independent, due to the network connections. The only hypothesis that we introduce to account for this dependence is very general, namely that their ISIs determine a regularly varying vector. Under these hypotheses we prove that the output inter-time of the considered neuron, described through an Integrate and Fire model, is a regularly varying random variable.
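
A similarly hedged sketch of the heavy-tailed setting (again, every parameter and helper below is an assumption made for illustration, not the construction of the paper): feeding the same unit with regularly varying, Pareto-distributed inter-input times and inspecting the tail of the resulting output ISIs, one expects an estimated tail index close to that of the inputs.

```python
import math
import random

# Illustrative sketch: Integrate and Fire unit driven by heavy-tailed,
# regularly varying inter-input times (Pareto with tail index ALPHA).
# All values and helper names are assumptions made for illustration.

ALPHA = 1.5   # tail index of the Pareto inter-input times
JUMP = 1.0    # fixed jump of the membrane potential per input
S = 10.0      # spiking threshold

def pareto_time(rng, alpha=ALPHA, x_min=0.1):
    """Pareto waiting time: P(T > t) = (x_min / t)**alpha for t >= x_min."""
    u = 1.0 - rng.random()            # uniform in (0, 1]
    return x_min * u ** (-1.0 / alpha)

def isi_heavy_tailed_input(rng):
    """One output ISI: first passage of the jump process through S."""
    v, t = 0.0, 0.0
    while v < S:
        t += pareto_time(rng)         # regularly varying inter-input time
        v += JUMP                     # purely excitatory fixed-size jumps
    return t

def hill_estimate(sample, k=200):
    """Crude Hill-type estimate of the tail index from the k largest values."""
    xs = sorted(sample)[-k:]          # k largest ISIs, ascending
    logs = [math.log(x / xs[0]) for x in xs[1:]]
    return len(logs) / sum(logs)

if __name__ == "__main__":
    rng = random.Random(1)
    isis = [isi_heavy_tailed_input(rng) for _ in range(20000)]
    # A finite sum of iid regularly varying inter-input times keeps the
    # tail index, so the estimate should be roughly ALPHA.
    print("estimated tail index:", hill_estimate(isis))
```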

The next step of this modeling procedure requires a suitable rescaling of the obtained process, in order to obtain a time-fractional limit for the process describing the membrane potential evolution. We already have some results in this direction, which allow us to write down the Laplace transform of the first passage time of the rescaled process through the threshold \(S\), but some mathematical steps still need to be improved to account for the dependence between jump times in the limiting process as well.

References
[1] Bingham, N. H., Goldie, C. M. and Teugels, J. L. Regular Variation. Volume 27 of Encyclopedia of Mathematics and its Applications. Cambridge University Press, Cambridge, 1989. ISBN 0-521-37943-1.
[2] Gal, A. and Marom, S. Entrainment of the Intrinsic Dynamics of Single Isolated Neurons by Natural-Like Input. The Journal of Neuroscience 33(18), pp. 7912–7918, 2013.
[3] Gerstein, G. L. and Mandelbrot, B. Random walk models for the activity of a single neuron. Biophys. J. 4, pp. 41–68, 1964.
[4] Holden, A. V. A Note on Convolution and Stable Distributions in the Nervous System. Biol. Cybern. 20, pp. 171–173, 1975.
[5] Jessen, A. H. and Mikosch, T. Regularly varying functions. Publ. Inst. Math. (Beograd) (N.S.) 80(94), pp. 171–192, 2006. ISSN 0350-1302.
[6] Lindner, B. Superposition of many independent spike trains is generally not a Poisson process. Physical Review E 73, 022901, 2006.
[7] Kyprianou, A. Fluctuations of Lévy Processes with Applications. Springer-Verlag, 2014.
[8] Persi, E., Horn, D., Volman, V., Segev, R. and Ben-Jacob, E. Modeling of Synchronized Bursting Events: the Importance of Inhomogeneity. Neural Computation 16, pp. 2577–2595, 2004.
[9] Tsubo, Y., Isomura, Y. and Fukai, T. Power-Law Inter-Spike Interval Distributions Infer a Conditional Maximization of Entropy in Cortical Neurons. PLOS Computational Biology 8(4), e1002461, 2012.

Lunch
Sep 28 @ 3:15 pm – 5:00 pm