Telecom Paris
Dep. Informatique & Réseaux, J-L. Dessalles, November 2019
Module Athens TPT-09
Emergence in Complex Systems
From nature to engineering
In this story, the swallows’ tendency to perch (on a wire, say) increases smoothly with time. But swallows tend to imitate each other: their tendency to join a group of perched birds is proportional to the group’s size. The simulation shows that at some point, a collective decision takes place. This offers an insight into the kind of collective processes involved when birds decide to migrate away.
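The dynamics described above can be sketched as a toy model. This is a simplification written for illustration, not the code of Swallows.py; the parameter names echo the program's NbAgents, TimeSlope and GroupSlope, but the values and the function are ours:

```python
import random

def simulate(nb_agents=100, time_slope=0.0005, group_slope=0.02, steps=400):
    """Toy model: each bird's perching probability grows with time
    and with the number of birds already perched (imitation)."""
    random.seed(0)
    perched = 0
    history = []
    for t in range(steps):
        # individual perching probability: time pressure + imitation
        p = min(1.0, time_slope * t + group_slope * perched)
        # every bird re-decides at each step with the current probability
        perched = sum(1 for _ in range(nb_agents) if random.random() < p)
        history.append(perched)
    return history

history = simulate()
# The perched count stays low for a while, then rises abruptly once the
# imitation term (group_slope * perched) dominates: a collective decision.
```

The abrupt transition comes from positive feedback: each newly perched bird raises every other bird's perching probability.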
The program Swallows.py (in the directory Other/Swallows) implements this collective decision process. You can run it by executing the local command Starter in that directory. The Configuration Editor allows you to change parameter values.
The blue curve shows the proportion of the population that is perched. Observe that the number of swallows on the wire increases dramatically at some point in time. The yellow curve shows the increasing perching probability (× 1000). Run the simulation several times with the same parameters and observe that the decision point varies.
Look at the definition of the various parameters (NbAgents, TimeSlope, GroupSlope, GroupOffset, Inertia). How does collective decision time vary depending on each of these parameters? |
In this story, ‘ants’ move in search of food. They move randomly, avoiding cells that have already been visited and marked with a repulsive pheromone. When they find food, they head back toward the colony, laying down attractive pheromone in decreasing amounts.
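As a rough sketch (not the actual program's code), the two mechanisms, repulsion on explored cells and decreasing attractive deposits on the way home, could look like this. The function names and the pp_max default are ours; PPMax is one of the program's parameters:

```python
import random

def choose_cell(neighbours, repulsive, attractive):
    """Pick the next cell: prefer attractive pheromone, avoid repulsive one.
    A small random term breaks ties and keeps exploration going."""
    def score(cell):
        return attractive.get(cell, 0) - repulsive.get(cell, 0) + random.random()
    return max(neighbours, key=score)

def homeward_deposit(attractive, path, pp_max=10.0):
    """Lay attractive pheromone on the way back to the nest,
    in amounts that decrease with distance from the food source."""
    for i, cell in enumerate(reversed(path)):
        attractive[cell] = attractive.get(cell, 0) + pp_max / (i + 1)
```

For example, after `homeward_deposit(attractive, ['nest', 'mid', 'food'])`, the cell next to the food carries the strongest deposit, so later ants are drawn toward the source.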
Run the program by executing the local command Starter. You should see that after a while, ants reach food sources and exploit them until they are exhausted.
This implementation makes use of two kinds of pheromone, a negative one and a positive one. Set the quantity of negative pheromone to zero. Do you notice any change? To make comparisons, you may set Simulation/RandomSeed to any non-zero value to make the program deterministic. |
→ Restore the negative pheromone before continuing.
Place food sources deterministically at various distances from the nest (e.g. along a line that is not aligned with the nest). You may also vary the number of food sources. Copy the corresponding portion of code in the box below. Does the ant colony exhaust nearby sources before distant ones? Describe your observations in a few precise words. |
Several parameters are involved in pheromone control (NPDeposit, PPMax, Evaporation, SniffingDistance, Saturation, Exploration). Choose one of them, study its effect, and give a precise description of your results. |
In classical networks:
The routing table r gives, at node i and for destination d, the probability of choosing n as the next node.
The table is updated by incoming ants: an ant that originated from d and arrived through n_{0} increases the probability of choosing n_{0} as next node by δr, and all probabilities are then divided by 1 + δr so that they still sum to 1.
δr decreases with the ant’s age.
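The update rule above can be written down directly. This is a sketch with our own function and variable names, not the code of AntNet.py; a nested dict stands in for the routing table, and in the full algorithm delta_r would itself decrease with the ant's age:

```python
def update_routing(routing, node, dest, came_from, delta_r):
    """An ant that originated from `dest` and arrived at `node` via
    `came_from` reinforces `came_from`; the whole row is then divided
    by (1 + delta_r) so that probabilities still sum to 1."""
    row = routing[node][dest]          # next-hop probabilities at `node`
    row[came_from] += delta_r
    for n in row:
        row[n] /= 1 + delta_r

# Example: at node A, heading to D, an ant arrives from B with delta_r = 0.2.
routing = {'A': {'D': {'B': 0.5, 'C': 0.5}}}
update_routing(routing, 'A', 'D', 'B', 0.2)
# B's probability rises to 0.7 / 1.2, C's drops to 0.5 / 1.2; the row still sums to 1.
```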
The program AntNet.py (in the directory Other/AntNet) is a preliminary implementation of the AntNet algorithm. ‘Ants’ move probabilistically through the network and get older each time they reach a node. At each node, they update the local routing table.
Run the program by executing the local command Starter. You may run the program on a simple network (described in the file Network1.ntw) by loading the configuration file AntnetTest_Typical.evo. Or you can generate a random network by loading the configuration file Antnet_Typical.evo. You should see that after a short while, a test message (in blue) finds its way through the network between the two nodes indicated by larger dots. You may change the speed of moving ants (to accelerate display) using the ruler.
Reduce the number of ants in either experiment (test or random). How many ants are necessary for the network to find the shortest path? Provide an example. |
Many real-life networks are randomly generated: the Facebook friendship network, the Web, the Twitter follower network, the sexual-contact graph (who had sex with whom), the citation graph (who cites whom in scientific papers). These "natural" networks (natural in the sense that they are generated by many individuals without global coordination) look similar: they are all "scale-free" networks.
A network is said to be "scale-free" if its degree distribution follows a power law: a huge number of nodes have only a few neighbours, while a small number of nodes have very many neighbours.
If you take the example of a graph where nodes are Facebook users and edges are friendship links between them, then this network is scale-free: a few people have a huge number of friends, while most of us have only between 50 and 500 friends.
This property of natural networks can be explained by the notion of "preferential attachment": when a node chooses its neighbours, it prefers nodes with higher degree. For example, if you are in a group of people and you don’t know anyone, you will naturally try to befriend the most popular person in the group.
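The "sample the population and pick the most popular" mechanism that the program's PA strategy and SampleSize parameter refer to can be sketched like this (the function name and the numbers are ours, not taken from the program):

```python
import random

def preferential_pick(degrees, sample_size):
    """Examine a random sample of nodes and link to the best-connected one."""
    sample = random.sample(list(degrees), sample_size)
    return max(sample, key=degrees.get)

random.seed(1)
degrees = {n: 1 for n in range(10)}   # 10 nodes, initial degree 1 each
for _ in range(500):                  # add 500 links preferentially
    degrees[preferential_pick(degrees, sample_size=5)] += 1
# A few nodes end up with most of the links: the rich-get-richer effect
# behind scale-free degree distributions.
```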
To simulate preferential attachment, go to the Evolife/Other/PreferentialAttachment folder and run the program through the local command Starter.
This program simulates a population of individuals who have to choose other individuals to link to.
Choose the ‘random’ strategy in the ‘Network’ section. Each node sends a (bidirectional) link to several other randomly chosen nodes. Run the program with 400 nodes and 4 links per node.
Observe the degree distribution (note: the scale is logarithmic (base 2); the distribution starts from LinksPerNode + 1, not from 0). Explain why the maximum of the degree distribution corresponds to 2 * LinksPerNode. |
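As a hint for where the 2 * LinksPerNode figure comes from: each node sends LinksPerNode links, and on average receives LinksPerNode more from the others. A quick sketch, our own simplified reconstruction (it ignores duplicate links, unlike the real program):

```python
import random

def random_degrees(nb_nodes=400, links_per_node=4):
    """Each node sends links_per_node links to random other nodes;
    both endpoints see their degree increase."""
    random.seed(0)
    degree = [0] * nb_nodes
    for node in range(nb_nodes):
        others = [n for n in range(nb_nodes) if n != node]
        for target in random.sample(others, links_per_node):
            degree[node] += 1
            degree[target] += 1
    return degree

degrees = random_degrees()
# Mean degree is exactly 2 * links_per_node = 8 (sent + received links),
# and the distribution peaks close to that value.
```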
Choose the ‘PA’ strategy (preferential attachment) and set SampleSize to a large value, typically one quarter of the total number of nodes. Run the program again. You should observe that some nodes become more popular and have more neighbours.
What can you say about the curve formed by the blue dots (degree distribution)?
Why is it so? |
How does it compare with the ‘random attachment’ case? |
The preceding method is time-consuming: each agent has to spend time studying a representative sample of the population. If you set SampleSize to a lower value, such as one tenth of the network’s size, you may observe that the effect disappears.
Preferential attachment may result from much lighter mechanisms.
Suppose that our nodes represent scientists. Two scientists in the graph are connected if they wrote a paper together. Imagine you are a scientist looking for someone to write a paper with. You could use the above method and select the most popular author among your fellow scientists, i.e. the one who wrote the most papers with your other colleagues. But you are a lazy scientist! So you adopt a much easier strategy: you just pick a random paper in your domain and decide to work with one of the scientists who wrote it. In other words, you just select an edge of the graph and choose one of the nodes connected by that edge.
In the findNeighbour function of the Graph class, implement this method for finding a neighbour (the edge list self.Links is a list of couples representing connected nodes). Test the method by choosing the ‘picking’ strategy. Copy your method below. |
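For reference, a hedged sketch of what such a method could look like. The real Graph class in the program has more to it; only the fact that self.Links is a list of couples is taken from the text above:

```python
import random

class Graph:
    def __init__(self, links):
        self.Links = links    # list of couples of connected nodes

    def findNeighbour(self):
        """Pick a random edge, then one of its two endpoints.
        A node of degree k sits on k edges, so it is chosen with
        probability proportional to its degree: preferential
        attachment at almost no computational cost."""
        node1, node2 = random.choice(self.Links)
        return random.choice([node1, node2])
```

For instance, in a graph with edges (1,2), (2,3), (2,4), node 2 occupies three of the six endpoint slots, so it is picked about half the time.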
What do you observe concerning the blue dot curve?
Do you have any idea why it is so? |
At a cocktail party, everyone must raise their voice to cover the noise of neighbouring conversations. Inevitably, the global noise level increases and conversational circles shrink. At the beginning, agents just talk (or sing) by themselves. You can see that agents progressively raise their voice level. The situation may stabilize at an intermediate level or evolve to a situation in which everyone shouts at the maximum level.
Note that the system is quite sensitive to parameter values:
Consider the Influence parameter. Below a minimal value (e.g. 20), neighbouring cells cannot even hear the minimal voice level (e.g. 5).
Observe that even this minimal value will most probably lead, eventually, to a noise surge.
Try to find a combination of density threshold and influence below which noise remains localized.
Open Cocktail.py and locate the attenuation function. As you can see, voice attenuation depends linearly on the inverse of distance.
Change this to a steeper decrease, e.g. the inverse of the squared distance (copy the Python line in the box below). Can you easily create a situation in which noise emerges but remains localized? For which parameter values? Do you see a difference with the inverse-linear case? |
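As a hypothetical illustration only (the actual attenuation function in Cocktail.py is not reproduced here, and its signature may differ): if attenuation returned the voice level divided by the distance, the inverse-square variant would read:

```python
def attenuation(voice_level, distance):
    """Perceived voice level at a given distance (hypothetical signature).
    Inverse-square decay instead of the original inverse-linear decay."""
    if distance == 0:
        return voice_level
    return voice_level / distance ** 2   # was: voice_level / distance
```

With inverse-square decay, distant conversations contribute far less noise, which should make it easier for a noise surge to remain localized.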