Program

You can find the preliminary list of abstracts attached to this page. The contributions have been roughly divided into macro areas: Biophysics (those mainly dealing with cells or cell constituents), Sociophysics (those mainly dealing with human behavior), and Physics of Complexity (those dealing with more abstract systems), although in general a single contribution covers more than one field.

Contents

  1. 1 Wednesday 27/6 morning
    1. 1.1 Can we improve the traffic flow by an optimal behavior of the drivers?
    2. 1.2 A numerical model for the bibliometric H-index
    3. 1.3 Structural properties of DNA promoters
    4. 1.4 Analysis of noise-induced bistability in Michaelis Menten single-step enzymatic cycle
    5. 1.5 Cognitive modelling of epidemics
  2. 2 Wednesday 27/6 afternoon
    1. 2.1 The Lyapunov spectrum of cellular automata unravelled
    2. 2.2 The role of Transposable Elements in shaping the combinatorial interaction of Transcription Factors.
    3. 2.3 Reconstruction and analysis of protein-protein interaction networks in Plasmodium
    4. 2.4 Collective dynamics, extensivity and non-additivity in sparse networks
  3. 3 Thursday 28/6 morning
    1. 3.1 From lattice gas cellular automata to the lattice Boltzmann equation method
    2. 3.2 Stochastic Turing Patterns on a Network
    3. 3.3 Expanding the transfer entropy to identify information subgraphs in complex systems
    4. 3.4 Large scale organization of chromatin
    5. 3.5 Modelling gene evolution
  4. 4 Thursday 28/6 afternoon
    1. 4.1 Experiments of science dissemination
    2. 4.2 Non-Gaussian fluctuations in stochastic models with absorbing barriers
    3. 4.3 Enhanced Stochastic Oscillation in a Model of Cellular Calcium Dynamics
  5. 5 Thursday 28/6 afternoon - Poster Session
    1. 5.1 Thermodynamics formalism for chemical master equations
    2. 5.2 A micro-environmental study of the Zn+2 - Aβ 1-16 structural properties
    3. 5.3 Mathematical modeling of miRNA mediated sponge interaction.
    4. 5.4 Community-detection cellular automata with local and long-range connectivity
    5. 5.5 Small group dynamics: a minority game experiment
  6. 6 Thursday 28/6 18:00 Public event
    1. 6.1 La fisica nella vita di tutti i giorni - Physics in everyday life
  7. 7 Friday 29/6 morning
    1. 7.1 Landslide modeling: application to warning system for Civil Protection purposes and theoretical approach based on molecular dynamics
    2. 7.2 Agent based mobility models of the Etrurian protocities formation: from satellite photos and thematic maps to the comprehension of the born of the modern society. A forecasting system for the territory and cultural management and conservation.
    3. 7.3 Randomness perception: representativeness or encoding?
    4. 7.4 From the Social Cognition and the Cognitive Heuristics to the modelling of the Self Awareness: the Tri-Partite Model

1 Wednesday 27/6 morning

1.1 Can we improve the traffic flow by an optimal behavior of the drivers?

Bastien Chopard and Christophe Hanggeli
Computer Science Department, University of Geneva, Switzerland

We consider a traffic situation where a perturbation limits the flow for a given period of time. Such a perturbation often produces stop-and-go waves which are unpleasant for the drivers. The question is whether, by informing the cars in advance of the presence of a jam and asking them to adopt an optimal behavior, we can increase the traffic flow upstream of the perturbation. We propose an analytical study of the growth and depletion of a traffic queue, based on the incoming and outgoing flows and the density of cars. We validate our theoretical result numerically with a cellular automata simulation. We show that changing the car density within the congested region cannot change the travel time to reach the end of the perturbation. However, the movement of the cars can be made smoother.
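As an illustration of the kind of traffic cellular automaton mentioned above, the sketch below implements the standard Nagel–Schreckenberg model on a ring road. This is an assumption for illustration only: the abstract does not specify which CA rule the authors use, and all parameter values here are invented.

```python
import random

def nasch_step(pos, vel, length, vmax=5, p_slow=0.3):
    """One parallel update of the Nagel-Schreckenberg traffic CA
    (illustrative sketch, not necessarily the authors' model)."""
    order = sorted(range(len(pos)), key=lambda i: pos[i])
    new_pos, new_vel = list(pos), list(vel)
    for k, i in enumerate(order):
        j = order[(k + 1) % len(order)]          # car ahead on the ring
        gap = (pos[j] - pos[i] - 1) % length     # free cells in front
        v = min(vel[i] + 1, vmax, gap)           # accelerate, avoid collision
        if v > 0 and random.random() < p_slow:   # random deceleration
            v -= 1
        new_vel[i] = v
        new_pos[i] = (pos[i] + v) % length
    return new_pos, new_vel

# 30 cars on a ring of 100 cells, initially at rest
random.seed(0)
pos = random.sample(range(100), 30)
vel = [0] * 30
for _ in range(200):
    pos, vel = nasch_step(pos, vel, 100)
flow = sum(vel) / 100                            # mean flux per cell
```

The random-deceleration step is what produces the spontaneous stop-and-go waves discussed in the abstract; setting `p_slow = 0` yields smooth, deterministic motion.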

1.2 A numerical model for the bibliometric H-index

Georgia Ionescu, Bastien Chopard
Computer Science Department, University of Geneva, Switzerland

The H-index is a metric aimed at capturing the quality of scientific production. For a given set of publications, the H-index is computed as the maximum number H of these publications that have received at least H citations each. Whether this metric is a fair way to estimate the quality of a scientist or a group of scientists has been widely discussed in the literature [2]. Here we consider a simple agent-based model that mimics a population of scientists, or a population of papers. We show that simple rules can be proposed to explain the power-law distribution observed by Redner [3]. From this model, we can predict the value of the H-index as a function of the number N of papers and the number M of citations. Our results can be compared to the real data and to the formula

     H = (M^2 / (4N))^(1/3)

found in the literature [1]. Our model allows us to consider other questions such as: (1) What is the H-index of a community as a function of the H-index of its members? (2) For scientists with a high H-index, what part of the citations they received is due to scientists with a low H-index? In other words, is it possible to quantify how much famous researchers need less famous researchers to establish their reputation? Or could the famous researchers be seen as forming self-sufficient elite groups, where the citations received from within the group would be sufficient to establish their high H-index?

References

[1] Juan E. Iglesias and Carlos Pecharromán. Scaling the h-index for different scientific ISI fields. Scientometrics, 73(3):303–320, 2007.

[2] Franck Laloë and Rémy Mosseri. Bibliometric evaluation of individual researchers: not even right... not even wrong! Europhysics News, 40(5):26–29, 2009.

[3] S. Redner. How popular is your paper? An empirical study of the citation distribution. The European Physical Journal B, 1998.
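The H-index definition and the scaling formula above can be checked directly. A minimal sketch, in which the citation counts are invented for illustration:

```python
def h_index(citations):
    """H = largest H such that H papers have at least H citations each."""
    cs = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(cs, start=1):
        if c >= i:
            h = i
    return h

def h_estimate(N, M):
    """Scaling formula from the literature: H ~ (M^2 / (4N))^(1/3)."""
    return (M**2 / (4 * N)) ** (1.0 / 3.0)

papers = [50, 30, 22, 15, 12, 8, 6, 5, 3, 1, 1, 0]
print(h_index(papers))                             # -> 6
print(round(h_estimate(len(papers), sum(papers)), 1))  # -> 7.9
```

For this toy record the scaling formula slightly overestimates the exact H-index, which is the kind of deviation the agent-based model above is designed to probe.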

1.3 Structural properties of DNA promoters

Lucia Pettinato
Università di Firenze

We show that a combination of spectral methods allows one to organize the DNA promoters of any species into equivalence classes that correspond to the presence of specific regular subsequences.

1.4 Analysis of noise-induced bistability in Michaelis Menten single-step enzymatic cycle

Enrico Giampieri
Physics Dept. of Bologna University and INFN Bologna

In this presentation we discuss noise-induced bistability in a specific circuit with many biological implications, namely a single-step enzymatic cycle described by Michaelis–Menten equations under the quasi-steady-state assumption. We study the biological feasibility of this phenomenon, considering a small and discrete number of molecules involved in the circuit, and we characterize the conditions necessary for it. We show that the intrinsic noise (due to the stochastic character of the master equation approach) of the one-dimensional substrate reaction alone is not sufficient to achieve bistability; we then characterize analytically the necessary conditions on the enzyme number fluctuations. We implement numerically two biologically plausible circuits that show bistability over different parameter windows, as predicted by our results, providing hints about how such a phenomenon could be exploited in real biological systems.

1.5 Cognitive modelling of epidemics

Andrea Guazzini
Department of Psychology, University of Florence.

Self awareness is one of the most attractive and fascinating attributes of human cognition. Understanding its key features will presumably have a relevant impact on the design of future self-aware ICT applications and on the forecasting of systems dominated by self-aware entities. The modelling of these concepts would have a large impact on our knowledge of social systems, ranging from small-group and micro-trading dynamics to the modelling of cultural evolution and of societies. In this case study we first introduce a new theoretical framework for modelling probabilistic reasoning and cognitive heuristics at a computational level, together with some functions already developed within the framework. Then two relevant examples of application are addressed: the “local community detection” problem and the “epidemics forecasting” problem.

2 Wednesday 27/6 afternoon

2.1 The Lyapunov spectrum of cellular automata unravelled

J.M. Baetens, B. De Baets
KERMIT, Department of Mathematical Modelling, Statistics and Bioinformatics
Ghent University, Coupure links 653, 9000 Gent, Belgium

Although the maximum Lyapunov exponent (MLE) has been used so far to gain insight into cellular automaton (CA) dynamics, CAs are high-dimensional dynamical systems that are tied to an entire Lyapunov spectrum. Motivated by the important role that the full Lyapunov spectrum plays in the characterization of dynamical systems based upon a continuous phase space, it is worthwhile to address the meaning of the Lyapunov spectrum in the framework of two-state CAs. Of course, the meaning of these Lyapunov spectra should be investigated more closely before any statements can be made about their importance for characterizing CAs. Preliminary results on the spectra of rules whose Jacobian matrix is constant throughout the CA evolution, such as rules 15, 90, 105, 150 and 255, indicate that all exponents except the MLE yield information on the way defects originating from a single initial perturbation are distributed across the cellular space. More specifically, upon ordering the cells of a CA by the number of defects they bear, these exponents seem to give insight into the ratio of the difference between the number of defects in two consecutive cells of this ordered list to the difference between the number of defects in two other consecutive cells, whereas the MLE yields information on the total number of defects. As such, it appears that the Lyapunov spectrum of two-state CAs allows for assessing how defects are spread across the cellular space. However, as soon as the Jacobian matrix is not constant throughout the CA evolution, this conclusion no longer seems to hold, which can probably be attributed to the fact that the above reasoning needs to be supplemented with a kind of mean-field approximation that accounts for the time-varying Jacobian.

2.2 The role of Transposable Elements in shaping the combinatorial interaction of Transcription Factors.

Michele Caselle
Dip. Fisica, Università di Torino

In the last few years several studies have shown that transposable elements (TEs) in the human genome are significantly associated with transcription factor binding sites, and that in several cases their expansion within the genome led to a substantial rewiring of the regulatory network. Here we suggest another possible role played by TEs in the evolution of regulatory networks. We discuss a set of evidence supporting the idea that the evolution of particular patterns of combinatorial interactions among transcription factors (TFs) was mediated and supported by the expansion of specific classes of TEs in the human genome.

To address this issue we studied the binding of Estrogen Receptor alpha (ERalpha) to DNA using two public chromatin immunoprecipitation sequencing (ChIP-seq) datasets on MCF7 cell lines, corresponding to different modalities of exposure to estrogen. We performed a genome-wide analysis of transposable elements overlapping ChIP-seq binding peaks and found a remarkable enrichment of a few well-defined types and classes of transposable elements. Among these enriched TEs, a prominent role was played by MIR (Mammalian Interspersed Repeats) transposons. These TEs underwent a dramatic expansion at the beginning of the mammalian radiation and then stabilized. We conjecture that the special affinity of ERalpha for the MIR class of TEs could be at the origin of the important role which ERalpha assumed in mammals.

We then performed a genome-wide scan for putative transcription factor binding sites (TFBSs) overlapping ChIP-seq peaks and repetitive regions, employing canonical positional weight matrices (PWMs). We found strong enrichment and correlated presence of a few TFBSs within the ChIP-seq binding peaks. In several cases these TFs correspond to known cofactors of ERalpha, thus supporting the idea of a co-regulatory role of these co-localized TFs. Most of these correlations turned out to be strictly associated with specific classes of TEs, suggesting the presence of a well-defined “transposon code” within the regulatory network.

Altogether our results support the idea that transposition events, besides rewiring the network, also played a central role in the emergence and success of combinatorial gene regulation in complex eukaryotes, and that the evolution of specific combinations of TF interactions was actually mediated and driven by the expansion of a few specific classes of transposable elements.

2.3 Reconstruction and analysis of protein-protein interaction networks in Plasmodium

Elisabetta Pizzi
Dipartimento di Malattie Infettive, Parassitarie ed Immunomediate Istituto Superiore di Sanità

The reconstruction of protein-protein interaction networks constitutes one of the most promising computational tools for exploring biological processes. The large amount of post-genomics data collected in recent years now makes it possible to approach this difficult task also in the case of malaria parasites. Plasmodium parasites are characterized by a very complicated life cycle which includes diverse cellular forms. In this work we adopted a Bayesian approach to reconstruct protein-protein interaction networks reflecting the interactomes of three different stages of P. falciparum. The analysis of these probabilistic networks allowed us to establish the overall architectures of the possible interactomes and to highlight sub-networks reflecting biological processes related to the sexual development of the parasite cell.

2.4 Collective dynamics, extensivity and non-additivity in sparse networks

Stefano Luccioli, Simona Olmi, Antonio Politi, Alessandro Torcini
ISC-CNR Firenze

The dynamics of sparse networks is investigated both at the microscopic and the macroscopic level, upon varying the connectivity. In all cases (chaotic maps, Stuart-Landau oscillators, and leaky integrate-and-fire neuron models), we find that a few tens of random connections are sufficient to sustain a nontrivial (and possibly irregular) collective dynamics. At the same time, the microscopic evolution turns out to be extensive, both in the presence and in the absence of a macroscopic evolution. This result is quite remarkable, considering the non-additivity of the underlying dynamical rule.

Ref: S. Luccioli, S. Olmi, A. Politi and A. Torcini, “Collective dynamics in sparse networks”, submitted to Phys. Rev. Lett.

3 Thursday 28/6 morning

3.1 From lattice gas cellular automata to the lattice Boltzmann equation method

Raul Rechtman
Centro de Investigación en Energía, Universidad Nacional Autónoma de México, Apdo. Postal 34, 62580 Temixco, Mor., Mexico.

The lattice Boltzmann equation method (LBEM) is a finite difference scheme used to numerically simulate flows. The method evolved historically from lattice gas cellular automata, simple models that exhibit fluid dynamics behavior. Some aspects of the history of the method will be addressed in this talk. The method has been used extensively in the simulation of the most varied flows. A first example is thermal levitation: a particle whose density and temperature are larger than those of the surrounding fluid can float or levitate. A second example is the study of vortex-induced vibrations of a cylinder attached to a spring.

3.2 Stochastic Turing Patterns on a Network

Malbor Asllani(1), Francesca Di Patti(2), Duccio Fanelli(2)
1 Dipartimento di Scienza e Alta Tecnologia, Università degli Studi dell’Insubria, via Valleggio 11, 22100 Como, Italy
2 Dipartimento di Energetica “Sergio Stecco”, Università degli Studi di Firenze, via S. Marta 3, 50139 Firenze, Italy and INFN, Sezione di Firenze

The process of stochastic Turing instability on a network is discussed for a specific case study, the stochastic Brusselator model. The system is shown to spontaneously differentiate into activator-rich and activator-poor nodes, outside the region of parameters classically associated with the deterministic Turing instability. This phenomenon, as revealed by direct stochastic simulations, is explained analytically, and eventually traced back to the finite-size corrections stemming from the inherent graininess of the scrutinized medium.

3.3 Expanding the transfer entropy to identify information subgraphs in complex systems

Sebastiano Stramaglia
Università di Bari (Italy)

We propose a formal expansion of the transfer entropy to highlight irreducible sets of variables which provide information about the future state of each assigned target. Multiplets characterized by a high value will be associated with informational circuits present in the system, with an informational character (synergetic or redundant) which can be associated with the sign of the contribution.
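For readers unfamiliar with the underlying quantity, a minimal plug-in estimate of the pairwise transfer entropy (history length 1) that the proposed expansion builds upon could look as follows. This is a generic textbook estimator, not the authors' expansion; the coupled series are invented for illustration.

```python
import random
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """Plug-in estimate of TE_{Y->X} (in bits) for discrete series with
    history length 1: sum p(x1,x0,y0) log2[ p(x1|x0,y0) / p(x1|x0) ]."""
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))
    pairs_xy = Counter(zip(x[:-1], y[:-1]))
    pairs_xx = Counter(zip(x[1:], x[:-1]))
    singles = Counter(x[:-1])
    n = len(x) - 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_xy = c / pairs_xy[(x0, y0)]          # p(x1 | x0, y0)
        p_cond_x = pairs_xx[(x1, x0)] / singles[x0]  # p(x1 | x0)
        te += p_joint * log2(p_cond_xy / p_cond_x)
    return te

# y drives x with one step of delay: TE_{Y->X} should be large (~1 bit),
# while TE_{X->Y} should be close to zero.
random.seed(1)
y = [random.randint(0, 1) for _ in range(5000)]
x = [0] + y[:-1]                      # x copies y with lag 1
print(transfer_entropy(x, y) > transfer_entropy(y, x))   # -> True
```

The expansion proposed in the talk goes beyond such pairwise estimates, isolating irreducible multiplets of source variables; the sketch only shows the base quantity being expanded.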

3.4 Large scale organization of chromatin

Mario Nicodemi
Università di Napoli “Federico II”

(missing)

3.5 Modelling gene evolution

Giulia Menichetti
Physics Dept. of Bologna University

We show preliminary results on the simulation of the evolution of gene length in humans and in some other eukaryotes. We start by characterizing the gene-length distribution in the various chromosomes and then propose a simplified model for reproducing such a distribution. We suppose that the intronic regions undergo a partial copy-paste process, while the coding part is not subjected to the same kind of evolution: the latter is thus considered constant and negligible for determining gene lengths. The main idea of this work is to take a first step toward the comprehension of eukaryotic genome evolution.

4 Thursday 28/6 afternoon

4.1 Experiments of science dissemination

Giovanna Pacini, Franco Bagnoli
Dep. Energetica and CSDC, Università di Firenze

A science café is a meeting between the public and experts on a scientific and/or technological topic, held in an informal place such as a pub, a restaurant or a bar. A science café is not a conference: the experts introduce themselves and the theme of the discussion, but this part is kept to a minimum. The engine of the meeting is always the questions, reactions and discussions of the public, gently animated by a moderator. In Florence this activity is carried on by the “Associazione Culturale caffè-scienza”, founded and kept alive by many academics and CNR researchers but also by many “ordinary” people. The association participates, through CSDC, the Interdepartmental Centre for the Study of Complex Dynamics of the University of Florence, in the European project “Scicafè”. The project’s main targets are promoting the idea of the science café as an effective communication tool for science and technology; creating a European network of science cafés in places with different geographical, demographic and cultural characteristics; and operating as a vehicle for the promotion of the public understanding of science and of the public debate on scientific issues.

Within the European project we have started experimenting with some new techniques and modalities of science dissemination, to extend the dissemination of good practices and to increase the local audience. We shall illustrate some of these experiments, such as audio and video streaming and radio broadcasting.

4.2 Non-Gaussian fluctuations in stochastic models with absorbing barriers

Claudia Cianci, Duccio Fanelli, Francesca Di Patti
Dip. Energetica and CSDC, Università di Firenze

The dynamics of a one-dimensional stochastic model is studied in the presence of an absorbing boundary. The distribution of fluctuations is analytically characterized within the generalized van Kampen expansion, accounting for higher-order corrections beyond the conventional Gaussian approximation. The theory is shown to successfully capture the non-Gaussian traits of the sought distribution, returning an excellent agreement with the simulations, for all times and arbitrarily close to the absorbing barrier. At large times, a compact analytical solution for the distribution of fluctuations is also obtained, bridging the gap with previous investigations within the van Kampen picture and without resorting to alternative strategies, as elsewhere hypothesized.

4.3 Enhanced Stochastic Oscillation in a Model of Cellular Calcium Dynamics

Laura Cantini, Emma Massi, Claudia Cianci, Duccio Fanelli
Dip. Energetica and CSDC, Università di Firenze

Calcium oscillations in cells play a role of paramount importance and are thought to be implicated in a large variety of cellular processes. Mathematical models have been proposed that reproduce the observed dynamics, making it possible to gain insight into the scrutinized phenomenon. These models are often deterministic in nature: the interacting molecules are ideally assumed to yield continuous concentrations that evolve self-consistently as dictated by coupled ordinary or partial differential equations. Individual effects, stemming from the intimate discreteness of the analyzed medium, can however prove crucial, by modifying significantly the approximate mean-field predictions. In particular, the stochastic component of the microscopic dynamics can induce the emergence of regular macroscopic patterns, both in time and in space. In this paper we present a stochastic model for calcium dynamics and show that self-organized quasi-cycles can spontaneously emerge, in a region of parameters for which the corresponding deterministic dynamics converges to a stable fixed point for the concentrations. The study is carried out both analytically and numerically. The master equation which governs the underlying stochastic dynamics is studied via the celebrated van Kampen system-size expansion, and the power spectrum of the fluctuations is calculated analytically. The theoretical predictions are challenged against stochastic simulations, returning an excellent quantitative agreement.

5 Thursday 28/6 afternoon - Poster Session

5.1 Thermodynamics formalism for chemical master equations

Luciana de Oliveira
Physics Dept. of Bologna University

Chemical master equations (CMEs) are relevant theoretical tools to describe the effects of fluctuations on biochemical reactions. The corresponding stationary states contain information on the long-term behavior of the system. The CME can describe the relaxation process toward an equilibrium state characterized by the detailed balance condition, or toward a non-equilibrium stationary state (NESS) characterized by the presence of chemical currents. The possible significance of NESSs in biochemical reactions is currently debated, in particular in relation to the plasticity mechanisms of biological systems. We propose the use of a thermodynamic formalism to study the properties of NESSs from the point of view of entropy production and energy exchange with the environment.
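As a toy illustration of the distinction drawn above, the following sketch computes the stationary state of a small master equation and its Schnakenberg entropy production rate, which vanishes if and only if detailed balance holds. The three-state rate matrix is invented for illustration and is not taken from the talk.

```python
import numpy as np

def stationary(K):
    """Stationary distribution of rate matrix K (K[i, j] = rate i -> j),
    solved from d pi/dt = L pi = 0 together with normalization."""
    n = K.shape[0]
    L = K.T - np.diag(K.sum(axis=1))    # master-equation generator
    A = np.vstack([L, np.ones(n)])      # append sum(pi) = 1
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

def entropy_production(K, pi):
    """Schnakenberg entropy production rate; zero iff detailed balance."""
    ep = 0.0
    n = K.shape[0]
    for i in range(n):
        for j in range(i + 1, n):
            jf, jb = pi[i] * K[i, j], pi[j] * K[j, i]   # forward/backward fluxes
            if jf > 0 and jb > 0:
                ep += (jf - jb) * np.log(jf / jb)
    return ep

# a driven 3-state cycle: asymmetric rates sustain a net chemical current
K = np.array([[0., 2., 1.],
              [1., 0., 2.],
              [2., 1., 0.]])
pi = stationary(K)
print(entropy_production(K, pi))   # positive: a NESS, not equilibrium
```

Making the rates symmetric (K[i, j] = K[j, i]) restores detailed balance and drives the entropy production to zero, which is the operational distinction between equilibrium and NESS used in the abstract.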

5.2 A micro-environmental study of the Zn+2 - Aβ 1-16 structural properties

A. Maiorana(1), T. Marino(2), V. Minicozzi(3), S. Morante(3), N. Russo(2)
1) Università Cattolica del Sacro Cuore - Roma, Italy
2) Dipartimento di Chimica, Università della Calabria - Rende (CS), Italy
3) Dipartimento di Fisica, Università di Roma “Tor Vergata”, and INFN, Sezione di Roma “Tor Vergata” - Roma, Italy

Relying on a combination of classical and ab-initio methods, we study the influence of the nature of the local physico-chemical environment on the structural features of β-amyloid peptides complexed with Zn+2 ions. The analysis is carried out by comparing the different metal coordination modes and the long-range peptide folding structures obtained in extensive classical as well as ab-initio simulations, when the system is either in water or in the so-called “gas phase”, and/or when different force fields for the Zn+2 ion and its ligands are used. This investigation has two main results. The first is that the precise Zn+2 coordination mode emerging from classical simulations markedly depends on the partial charge attributed to the ion and to the atoms surrounding it, but these structural differences are completely washed out when the resulting classical configurations are submitted to a quantum minimization. Secondly, although the presence of water does not affect the Zn+2 inner coordination shell, it significantly influences the long-range peptide folding propensity.

5.3 Mathematical modeling of miRNA mediated sponge interaction.

Andrea Riba
Dip. Fisica, Università di Torino

Using stochastic equations, we discuss the behaviour of a particular class of miRNA-mediated regulatory circuits in which a master miRNA regulates a transcription factor and, together with it, a target protein-coding gene. We show that, taking into account the so-called sponge effect, this circuit is able to accelerate the expression of the target gene and to correlate the stochastic fluctuations of the transcription factor and the target gene, thus improving the stability and robustness of this transcriptional regulation.

5.4 Community-detection cellular automata with local and long-range connectivity

Franco Bagnoli(1,3), Andrea Guazzini(2,3), Emanuele Massaro(1,3)
1 Dept. Energy Università di Firenze. Also INFN, sez. Firenze.
2 Dept. Psychology and CSDC, Università di Firenze
3 CSDC, Università di Firenze.

We explore a community-detection cellular automata algorithm based on information diffusion followed by a non-linear processing phase whose dynamics is inspired by human heuristics. The main point of the method is that of furnishing different “views” of the clustering levels from an individual point of view. We apply the method to networks with local connectivity and long-range rewiring.

5.5 Small group dynamics: a minority game experiment

A. Cini(1) and A. Guazzini(1,2)
1) CSDC, University of Florence, via S. Marta 3, I-50139 Firenze, Italy.
2) Department of Psychology, University of Florence, Via di San Salvi 12, 50100, Firenze, Italy.

Recently, the concept of cognitive heuristic has been connected to a new approach to the exploration of human social interactions. The idea is to consider the cognitive system as a satisficer rather than an optimizer. The implicit assumption is that people try to make the minimal effort needed to optimize their social interaction only with respect to the particular task they are facing.

We investigated this aspect experimentally by studying the behaviour of a small group of people interacting in a virtual environment (an improved chat system [ref]). We designed the experimental set-up so as to keep under control most communication aspects, leaving little space to non-controlled communication. Moreover, we concentrated on the non-semantic aspects of communication, in order to be as context-free as possible.

We present here the results of a minority game situation, in which there is no winning strategy for reaching consensus among the majority of participants, and we compare the outcome of these experiments with that of similar set-ups without any task (blank modality) and with a majority game.
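For readers unfamiliar with the payoff structure, a minimal sketch of a standard minority game round with random players is given below. Note that the experiment described here uses a frustrated variant (only the second-biggest voting clusters win), so this is only the textbook baseline, with all parameters invented for illustration.

```python
import random

def minority_game_round(choices):
    """Agents on the less-popular side win; with this payoff it is
    impossible for a majority of players to win in the same round."""
    ones = sum(choices)
    minority = 1 if ones < len(choices) - ones else 0
    return [c == minority for c in choices]

random.seed(2)
n_agents, n_rounds = 11, 500       # odd number of agents: no ties
wins = [0] * n_agents
for _ in range(n_rounds):
    choices = [random.randint(0, 1) for _ in range(n_agents)]
    for i, won in enumerate(minority_game_round(choices)):
        wins[i] += won

# by construction, fewer than half of all agent-rounds can be wins
print(sum(wins) < n_agents * n_rounds / 2)   # -> True
```

The built-in frustration (at most (n_agents - 1) / 2 winners per round) is what removes any globally winning strategy, mirroring the "no winning strategy for reaching consensus" property of the experimental task.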

The main goal of the present work is the characterization of how a small group of people builds and structures its communication network and the related affinities during a short virtual group interaction, and of what differences can be revealed by comparing different conditions.

We show how our experimental framework captures some fundamental aspects of the subject’s behaviour in a small group virtual dynamics.

We exposed 150 different subjects to three experimental modalities. All the experiments consisted of a web-based chat session with 10 participants, all instructed in the same way and associated with a random and neutral identity. Subjects were physically separated so as to avoid non-controlled communication.

In the first, blank modality, we proposed that the subjects engage in free chatting, without any restriction. The only requirement was to assess their affinity space after the end of the session, reporting it on their “private radar”. In the second, topic modality, we introduced a polarizing subject into the discussion and asked the participants to develop their own opinion about the topic. In the third modality we proposed a frustrated minority game based on a voting procedure where only the second-biggest clusters were awarded. The subjects were asked to vote three times per experiment, every 15 minutes, about different topics concerning the task. After the first two training votes, the subjects were informed about the results. Finally, the third vote was considered the valid one, and the subjects were invited to try to be part of the winning cluster(s).

A classical statistical approach has been used to test the experimental hypotheses and to refine the useful observables. We used the Bravais–Pearson product-moment correlation (r) to test the relations among the quantitative variables, and we compared the different experimental conditions using ANOVA and Student’s t-tests. We fit the “models” of the cognitive strategies with a preliminary linear regression method. As for the voting game, we show how the subjects develop a very good ability to face the “frustrated task” in which they were participating. We compared the results of the experimental votes with those produced by an appropriate random (null) model, measuring the Z-scores in order to assess the randomness of the players’ behaviour.

Considering first the peculiar results of the voting modality, we observed that all the participants are able, in the third vote, to belong to a cluster with a high probability of victory. During the first two votes subjects apparently adopt other voting strategies, and the distribution of the final cluster sizes reveals that only in the third vote do the subjects try to win, producing only small clusters composed of one, two or three members.

Subjects’ strategies seem to effectively approximate the distribution of the probability of victory as a function of cluster size for a random voting process, but with a sort of correction on it, i.e., not voting entirely at random.

Noteworthy, in the voting experiment the votes were not associated with the affinity. In other words, in this modality the affinity between subjects appeared less able to affect the communicative dynamics of the group with respect to the other two experimental conditions.

In summary, the first third of the experiment seems to correspond to the characteristic time for the construction of the first “social structure”, which also in this experiment is maintained until the end of the session.

The degree centrality of the communication network is the measure which confirms some relevant differences among the experimental conditions.

Although the final state of the public communication spaces in all the modalities is always a fully connected network (defining the links in a continuous way), the average values of node centrality allow one to discriminate between the modalities. Subjects belonging to the voting and to the blank modalities show a significantly greater centrality with respect to the topic modality, also in the public channels of communication.

Noteworthy, the average centrality in the affinity spaces couples together the voting and topic modalities. This result suggests a greater final degree of segregation for the voting and the topic modalities with respect to the affinity space, regardless of the number and the kind of interactions among subjects.

Besides degree centrality, betweenness has been used as a measure of the segregation of the networks under scrutiny.

The average betweenness in the affinity space shows that the average degree of separation of the network is greater in the voting modality, while the blank and topic modalities are indistinguishable. Despite this, the affinity space does not appear correlated with the composition of the clusters generated by the three votes, nor with the real preferences expressed after the sessions. This last result suggests that the affinity dynamics neither correlates with nor affects the voting task.

Concerning the communicative variables, betweenness delineates a different scenario from centrality, suggesting that communication in the blank and topic modalities follows a different regime from the voting one, in which the average betweenness in the private channels is greater than in the others.

The affinity among individuals appears to be sensitive to different aspects of the task, and is apparently assessed by the subjects in different ways depending on its nature. The subjects appear to adapt the cognitive heuristics used to assess their affinity with others to the constraints imposed by the task.

A linear regression method has been used to test these hypotheses. The three significant best-fitting models indicate different strategies adopted by the subjects. In particular, the explained variance of the model is significantly greater for the blank modality (70%), where the affinity dynamics appears related to the number of interactions and to their moods, regardless of content. On the contrary, in the topic and voting modalities the explained variances are 33% and 43%, respectively.
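As a minimal illustration of the kind of analysis described above (the data and variable names below are hypothetical, not the experimental records), an ordinary least-squares fit with its explained variance can be sketched as:

```python
# Minimal ordinary-least-squares sketch: fit y = a*x + b and report the
# explained variance R^2, the quantity used to compare the three modalities.
# Data and variable names are illustrative, not the experimental ones.

def fit_and_r2(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx                          # slope
    b = my - a * mx                        # intercept
    ss_res = sum((yi - (a * xi + b)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    r2 = 1.0 - ss_res / ss_tot             # explained variance
    return a, b, r2

# e.g. final affinity vs. number of interactions (fictitious data)
interactions = [1, 2, 3, 4, 5, 6]
affinity = [0.2, 0.25, 0.4, 0.45, 0.55, 0.6]
a, b, r2 = fit_and_r2(interactions, affinity)
print(f"slope={a:.3f} intercept={b:.3f} R2={r2:.2f}")
```

For this nearly linear toy data the explained variance comes out close to 1; the experimental values quoted above (70%, 33%, 43%) are the analogous R² figures per modality.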

In summary, the results show that in the blank modality it is possible to forecast the final affinity between any two subjects, while this is not possible in more structured tasks. The interpretation of this result is that, in the absence of a specific task, people tend to structure their communication space according to their affinity, while for structured tasks other dimensions become more important.

6 Thursday 28/6 18:00 Public event

6.1 La fisica nella vita di tutti i giorni - Physics in everyday life

Franco Bagnoli
Università di Firenze

Esperimenti con materiali alla portata di tutti che mostrano come fenomeni apparentemente diversi possono essere ricondotti alle stesse leggi fisiche. Esperimenti di Meccanica, elettromagnetismo, ottica, dinamica dei fluidi, struttura della materia.

Experiments using everyday materials showing how apparently different phenomena can be explained by the same physical laws. Experiments on mechanics, electromagnetism, optics, fluid dynamics, and the structure of matter.

7 Friday 29/6 morning

7.1 Landslide modeling: application to warning system for Civil Protection purposes and theoretical approach based on molecular dynamics

Gianluca Martelloni, Franco Bagnoli
University of Florence, Department of Energy Engineering and CSDC, Florence (IT)

In this work we propose landslide modeling at regional and national scale for Civil Protection purposes. In addition, a 2D computational mesoscopic modeling approach based on molecular dynamics (MD) is developed for shallow and deep landslides triggered by rainfall. The former model is based on statistical rainfall thresholds for forecasting the triggering of shallow and deep landslides. This model, named SIGMA, is built to operate in a warning system at regional scale: it has been calibrated on landslide events of the period 2004-2007 and validated on a further set of data (2008-2010). The originality of the model lies in the calibration method of the rainfall thresholds, based on an optimization technique that reduces the false alarms of the regional warning system. The validation results are very good, and the SIGMA model will therefore be implemented within the first half of 2012 in the alert system of the Emilia-Romagna region (Italy). For the correct treatment of landslides induced by snow melting, a snow melt model (SMM) has also been developed for integration with the statistical model based on rainfall thresholds. The SMM is entirely original, as the only available data are the time series of temperature, rainfall and snowpack depth; hence the model is built without the data usually necessary for this type of modeling. The SMM is calibrated with a heuristic optimization algorithm (the optimized flexible simplex) and validated using several sets of snowpack depth data. The simulations show that the integrated SIGMA-SMM system is globally more efficient than SIGMA alone, as many landslide events due to snow melting are correctly detected. Finally, a modified version of SIGMA is built to operate at the national level in an integrated system for shallow landslide forecasting.
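The abstract does not give the SIGMA formulation, so the following is only an illustrative sketch of what a statistical rainfall-threshold alert rule can look like: an alert is raised when the cumulative rainfall over a moving window exceeds the mean of the historical windowed series by a multiple of its standard deviation. The window length, the sigma multiplier and the data are all assumptions, to be calibrated (in the actual model, against recorded landslide events so as to minimize false alarms).

```python
# Illustrative statistical rainfall-threshold rule (NOT the actual SIGMA
# model): flag days whose trailing cumulative rainfall exceeds
# mean + n_sigma * std of the windowed historical series.

def alert_days(daily_rain, window=3, n_sigma=1.5):
    """Return indices of days whose trailing `window`-day rainfall sum
    exceeds mean + n_sigma * std of all such sums in the series."""
    sums = [sum(daily_rain[max(0, i - window + 1):i + 1])
            for i in range(len(daily_rain))]
    mean = sum(sums) / len(sums)
    var = sum((s - mean) ** 2 for s in sums) / len(sums)
    threshold = mean + n_sigma * var ** 0.5
    return [i for i, s in enumerate(sums) if s > threshold]

rain = [0, 2, 1, 0, 3, 40, 55, 10, 0, 1]   # mm/day, with an extreme event
print(alert_days(rain))                     # → [6, 7]
```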
The 2D MD model is based on interacting particles and describes the features of a (fictitious) granular material along a slope: in the case of a shallow landslip a horizontal layer with the thickness of one particle is simulated, while in the case of a deep landslide a vertical section with a wider thickness is considered. For shallow instability movements, we consider the triggering to be caused by the decrease of the static friction along the sliding surface. The triggering of the landslip requires two conditions to be exceeded: a threshold speed of the particles and a condition on the static friction between particles and slope surface, the latter based on the Mohr-Coulomb failure criterion. Moreover, the interaction force between particles is defined through a potential that, in the absence of experimental data, we have modeled as the Lennard-Jones 2-1 potential. In addition, only for deep landslide modeling, a filtration model is considered in order to take into account the increase of the pore pressure, which is the real cause of triggering. For the prediction of the particle positions, during and after a rainfall, we use an MD method, which proves very suitable for simulating this type of system. The outcomes of the simulations are quite satisfactory, and we can claim that this type of modeling can represent a new method for simulating landslides triggered by rainfall. In our simulations, emerging phenomena such as fractures, detachments and arching can be observed. In particular, the model reproduces well the energy and time distribution of avalanches, analogous to the observed Gutenberg-Richter and Omori power-law distributions for earthquakes. Moreover, the distribution of the mean kinetic energy of a landslide shows a transition from Gaussian to log-normal to power law as the viscosity coefficient decreases to zero. This behavior is compatible with slow (high-viscosity) and rapid (low-viscosity) landslides.
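The two ingredients named above can be sketched in a few lines. For the Lennard-Jones 2-1 potential we use a common generalized form with exponents 2 and 1, which has its minimum -eps at r = sigma; the parameter values (eps, sigma, cohesion, friction angle) are illustrative assumptions, not the calibrated ones of the actual model.

```python
import math

# Sketch of the two ingredients named in the abstract: a generalized
# Lennard-Jones (2,1) pair potential and the Mohr-Coulomb failure criterion.
# All parameter values here are illustrative assumptions.

def lj_2_1_potential(r, eps=1.0, sigma=1.0):
    """Generalized LJ (2,1) potential; minimum -eps at r = sigma."""
    return eps * ((sigma / r) ** 2 - 2.0 * (sigma / r))

def lj_2_1_force(r, eps=1.0, sigma=1.0):
    """Radial force -dV/dr (positive = repulsive)."""
    return eps * (2.0 * sigma ** 2 / r ** 3 - 2.0 * sigma / r ** 2)

def mohr_coulomb_fails(shear_stress, normal_stress, cohesion=0.1, phi_deg=30.0):
    """True if the shear stress exceeds the Mohr-Coulomb strength
    tau_max = c + sigma_n * tan(phi)."""
    tau_max = cohesion + normal_stress * math.tan(math.radians(phi_deg))
    return shear_stress > tau_max

# the potential has its minimum -eps at the equilibrium distance r = sigma ...
print(lj_2_1_potential(1.0))         # → -1.0
print(lj_2_1_force(1.0))             # → 0.0 (equilibrium)
# ... and sliding starts once the driving shear passes the threshold
print(mohr_coulomb_fails(0.5, 0.5))  # → True: 0.5 > 0.1 + 0.5*tan(30°) ≈ 0.39
```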
The main advantage of these Lagrangian methods consists in the capability of following the trajectory of a single particle, possibly identifying its dynamical properties. Finally, for a large range of values of the model parameters, we observe a characteristic velocity pattern, with acceleration increments, typical of real landslides. It is therefore possible to apply the method of the inverse surface displacement velocity (Fukuzono, 1985) for predicting the failure time.
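The inverse-velocity method mentioned above can be sketched briefly: near failure, 1/v often decreases roughly linearly with time, so a straight-line fit of 1/v extrapolated down to zero yields a predicted failure time. The data below are synthetic, not output of the model described in the abstract.

```python
# Sketch of Fukuzono's (1985) inverse surface displacement velocity method:
# fit a straight line to 1/v(t) and extrapolate it to zero to predict the
# failure time t_f. Synthetic data, for illustration only.

def predict_failure_time(times, velocities):
    inv_v = [1.0 / v for v in velocities]
    n = len(times)
    mt = sum(times) / n
    mi = sum(inv_v) / n
    num = sum((t - mt) * (i - mi) for t, i in zip(times, inv_v))
    den = sum((t - mt) ** 2 for t in times)
    slope = num / den                  # expected negative near failure
    intercept = mi - slope * mt
    return -intercept / slope          # time at which 1/v reaches zero

# synthetic accelerating slope: v(t) = 1/(10 - t), so failure at t = 10
times = [0.0, 2.0, 4.0, 6.0, 8.0]
velocities = [1.0 / (10.0 - t) for t in times]
print(predict_failure_time(times, velocities))   # → 10.0 (1/v exactly linear here)
```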

Keywords: landslide, rainfall thresholds, snow melt modeling, molecular dynamics, Lagrangian modeling, particle based model, power law.

7.2 Agent-based mobility models of Etrurian protocity formation: from satellite photos and thematic maps to understanding the birth of modern society. A forecasting system for territory and cultural management and conservation.

G. Pelfer(1), A. Guazzini(1,2), G. Martelloni(1,3)
1 University of Florence and Centre for the study of complex dynamics.
2 Department of Psychology, University of Florence.
3 Department of Energetics, University of Florence.

Introduction: The transition from household societies to protocities is one of the most attractive topics in the archaeological and anthropological domains. Nowadays, models of paleodemography, paleoeconomy and paleoproductivity can be combined with sociophysical and econophysical models in order to understand and explain the dynamics of the phenomenon known as “synecism”. This phenomenon is considered the motor of the formation process of large and complex protourban settlements, from the last phase of the Final Bronze Age until the beginning of the First Iron Age. Among others, the Villanovan Period, concerning ancient Etruria in the Middle Tyrrhenian area during the mentioned age, is considered one of the most representative examples of this process (the so-called Villanovan Revolution). Modern satellite and remote-sensing observation systems, coupled with appropriate forecasting models (sociophysical and econophysical), allow the major theoretical assumptions in the study area to be effectively investigated. Numerical simulations will maximize the usefulness of such databases, as well as support the optimization and validation of the models and theories themselves. As a benchmark for the outputs of the models, we calculate spatial correlations using the available spatial and topographical attributes (elevation, vegetation, resources, slope, etc ...), which will be merged with the historical and archaeological information. Finally, such a general framework would satisfy the requirements of Cultural Resource management, valorization and preservation, as well as of environmental management in general.

Methodology: The methodological approach to the modelling can be summarized in three different areas, as follows. The first area deals with the representation of the environment features and critical factors. The available maps can be imported into a GIS environment, and the models can then be implemented in MATLAB software interfaced with the GIS system. Another possibility is the use of the IDL programming language, optimized for image processing and provided with a graphical user interface like MATLAB, where it is possible to write scripts in the native language or as C routines. Otherwise, the elaborated maps could be imported into the NETLOGO space for the implementation of the mentioned models.

The choice of the most suitable platform will depend on an evaluation of the speed, efficiency and effectiveness of such integrated systems. The second area represents the mobile part of the model. The sociophysical aspects will be represented as a cellular automaton and will incorporate the most relevant cognitive aspects related to the cultural and demographic evolutionary dynamics, as well as those related to mobility and environment exploration. A last fundamental ingredient for a comprehensive model of protocity formation is represented by the econophysical aspects of the system dynamics. Such constraints will be inspired by the archaeological and historical theories and will accordingly affect the population evolution and the “gain” functions that rule over it.

Expected Results:

Management and valorization of the territory and of the Italian Cultural and Archaeological Heritage. Understanding of the sociocultural phenomena affecting the origin of cities and protocities, and of the social, ecological and economic parameters that helped the development of such a new form of social organization, which grew, definitively, into a paradigm of modern human society. Local forecasting systems and predictive models that can be applied to different domains related to territory and Cultural Heritage management. A dedicated toolbox for archaeological users can be developed for forecasting the birth and formation of protocities in space and time, according to the dynamics of such historical processes. Starting from a dated historical map, the software will be able to reproduce protocity formation using an optimization algorithm based on geographical and geospatial information for the determination of the initial distribution of households.

Keywords:

Cultural Anthropology, Computational Archaeology, Geographical Information Systems (GIS), Sociophysics of Protocities, Econophysics of ancient societies

7.3 Randomness perception: representativeness or encoding?

Giorgio Gronchi
Dip. Psicologia, Università di Firenze

The probabilistic analysis of cognition is a recent framework that employs Bayesian statistics to model various aspects of the human cognitive system. After a brief description of this perspective, we present a Bayesian model of randomness perception (Griffiths and Tenenbaum, 2003, 2004). The randomness perception task is addressed in terms of the statistical problem of model selection: given a string, inferring whether the process that generated it was random or regular. A basic finding is that people rate sequences with an excess of alternation as more random than prescribed by information theory (the overalternating bias). There are two explanations: local representativeness (Kahneman and Tversky, 1972) and the implicit encoding hypothesis (Falk and Konold, 1997). The measure random(X) of the Bayesian model was used to compare predictions derived from the two explanations in a series of reaction time experiments. Results are discussed in relation to relevant methodological issues and future research.
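To make the model-selection idea concrete, one can compare the likelihood of a binary string under a fair-coin "random" generator and under a simple repetition-biased "regular" generator. This is a toy stand-in, not the actual random(X) measure of Griffiths and Tenenbaum; the p_repeat parameter is an assumption.

```python
import math

# Toy sketch of randomness perception as Bayesian model selection:
# score(seq) = log P(seq | random) - log P(seq | regular).
# "Random" = fair coin; "regular" = Markov chain that repeats the previous
# symbol with probability p_repeat (an assumed parameter). Illustrative
# stand-in for the Griffiths & Tenenbaum (2003) model, not the model itself.

def log_random_score(seq, p_repeat=0.8):
    """Positive score means the string looks more 'random' than 'regular'."""
    n = len(seq)
    log_p_random = n * math.log(0.5)
    # regular model: first symbol uniform, then repeat with prob p_repeat
    log_p_regular = math.log(0.5)
    for prev, cur in zip(seq, seq[1:]):
        log_p_regular += math.log(p_repeat if cur == prev else 1.0 - p_repeat)
    return log_p_random - log_p_regular

# an alternating string scores as more random than a constant one
print(log_random_score("HTHTHTHT") > log_random_score("HHHHHHHH"))  # → True
```

Under this repetition-only "regular" model a perfectly alternating string gets a high randomness score, whereas the overalternating bias is precisely that people treat such strings as *more* random than probability theory warrants, which is what the two competing explanations try to account for.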

7.4 From Social Cognition and Cognitive Heuristics to the Modelling of Self-Awareness: the Tri-Partite Model

Franco Bagnoli(1,3) and Andrea Guazzini(2,3)
1 Department of Energetics, University of Florence. Also INFN, sez. Firenze.
2 Department of Psychology, University of Florence.
3 University of Florence and Centre for the study of complex dynamics.

The common questions of psychological research throughout the past century have frequently concerned the way an organism becomes aware of its environment and is able to make decisions by inferring unknown aspects of it.

The scientific advancements of modern Cognitive Science have been coupled with new concepts and ideas that have made the discipline more scientifically rigorous, even if frequently too qualitative to be implemented or nested into other domains [1]. Among these key concepts, probably the most attractive, and recently quite inflated, is that of Cognitive Heuristics [2].

This lecture presents a general framework based on the idea that the cognitive brain relies on continuously generated memory-based predictions, drawing either on information gathered from the senses or on existing knowledge. The framework integrates three primary components. The first concerns domain-related associations, which are formed by a lifetime of practice in extracting repeating patterns and statistical regularities from our environment and storing them as a particular form of memory [3]. The second is the concept of analogies, the term denoting the process of seeking correlations between an event and existing representations in memory and knowledge. Finally, these analogies activate associated representations that translate into predictions or inference processes. In this work we propose a relative shift of perspective from the previous, “classical” one, maintaining a tripartite structure for the model representing the cognitive system, and introducing some recent results and insights from both the neuropsychological and the cognitive literature.

[1] Neuberg, S.L., Kenrick, D.T., Schaller, M. Evolutionary social psychology. In S. T. Fiske, D. Gilbert, & G. Lindzey (Eds.), Handbook of Social Psychology (5th ed., pp. 761-796). New York: John Wiley & Sons, (2010).

[2] Simon, H.A. A Behavioral Model of Rational Choice. The Quarterly Journal of Economics, Vol. 69, No. 1, pp. 99-118, (1955).

[3] Rao, R.P. and Ballard, D.H. Predictive coding in the visual cortex: a functional interpretation of some extra-classical receptive-field effects, Nat. Neurosci. 2, 79-87, (1999).


Franco Bagnoli,
21 Jun 2012, 07:09