by Paul Curzon, Queen Mary University of London
Ant colonies are really good at adapting to changing situations: far better than humans. Sameena Shah wondered if Artificial Intelligence agents might do better by learning their intelligent behaviour from ants rather than us. She has suggested we could learn from the ants too.
Inspired as a child by watching ants in the mud adapt to new routes to food, and later, as an adult, by ants raiding her milk powder, Sameena Shah studied for her PhD how ant colonies solve a classic problem in computer science: finding the shortest path between points in a network. For ants this means finding the shortest paths between food and the nest, something they are very good at. When foraging ants find a source of food they leave a pheromone (i.e., scent) trail as they return, a bit like Hansel and Gretel leaving a trail of breadcrumbs. Other ants follow existing trails to find the food as directly as possible, leaving their own trails as they do. Ants mostly follow the trail containing the most pheromone, though not always. Because shorter paths are completed more quickly, there and back, they gain pheromone faster than longer ones, so yet more ants follow them. This further reinforces the shortest trail as the one to follow.
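This reinforcement loop can be sketched in a few lines of Python. The path lengths, deposit amounts and number of ants here are illustrative assumptions, just to show the feedback at work: the shorter trail is topped up more per trip, so it attracts ever more ants.

```python
import random

# Two trails from nest to food (lengths are illustrative assumptions).
# A shorter round trip means an ant deposits pheromone more often per
# unit time, modelled here as a deposit inversely proportional to length.
lengths = {"short": 1.0, "long": 10.0}
pheromone = {"short": 1.0, "long": 1.0}   # equal, weak starting trails

random.seed(42)
for _ in range(1000):
    # Ants mostly follow the strongest trail, but not always: the choice
    # is probabilistic, in proportion to each trail's pheromone level.
    path = random.choices(list(pheromone), weights=pheromone.values())[0]
    pheromone[path] += 1.0 / lengths[path]   # the returning ant reinforces it
```

After many ants, the short trail ends up far stronger than the long one: positive feedback has locked the colony onto the better path.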
There are lots of variations on the way ants actually behave, and computer scientists are exploring these variations as ways for AI agents to work together to solve problems. Sameena devised a new algorithm called EigenAnt to investigate such ant colony-based problem solving. It turns out that if the basic ant algorithm above is used, longer trails do not disappear even when a shorter path is found, particularly if it is found after a long delay: the original best path has such a strong trail that it continues to be followed even after a better one appears. Computer-based algorithms add a step whereby all trails fade away at the same rate, so that only ones still being followed stay around. This is better but still not perfect. Sameena’s EigenAnt algorithm instead removes pheromone selectively. Her software ants select paths using probabilities based on the strength of the trail: any existing trail could be chosen, but stronger trails are more likely to be. When a software ant chooses a trail, it adds its own pheromone but also removes some of the existing pheromone from the trail, by an amount that depends on the probability of that path being chosen in the first place. This mirrors what real ants do, as studies have shown they leave less pheromone on some trails than others.
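A minimal sketch of this selective-removal idea in Python might look like the following. The constants and the exact update rule are illustrative assumptions for a toy model, not Shah’s published EigenAnt equations: the point is just that the chosen trail both gains pheromone and loses some, in proportion to how likely it was to be chosen.

```python
import random

# Three candidate paths; the lengths are illustrative assumptions.
lengths = [1.0, 2.0, 4.0]
pheromone = [1.0, 1.0, 1.0]   # equal starting trails

DEPOSIT = 0.1   # pheromone added per trip, scaled by path quality (assumed)
REMOVAL = 0.2   # removal rate, scaled by selection probability (assumed)

def step():
    """One software ant: pick a trail probabilistically, then update it."""
    total = sum(pheromone)
    probs = [p / total for p in pheromone]
    # Any trail can be chosen, but stronger trails are more likely to be.
    i = random.choices(range(len(pheromone)), weights=probs)[0]
    # Reinforce the chosen trail (shorter path means a bigger deposit)...
    pheromone[i] += DEPOSIT / lengths[i]
    # ...but also remove some pheromone, in proportion to how likely this
    # trail was to be chosen: the selective-removal idea.
    pheromone[i] -= REMOVAL * probs[i] * pheromone[i]

random.seed(1)
for _ in range(5000):
    step()
```

Because removal bites hardest on the most-followed trails, weaker trails are not wiped out wholesale the way uniform evaporation wipes them out, yet the shortest path still ends up with the strongest trail.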
Sameena proved mathematical properties of her algorithm as well as running simulations of it. This showed that EigenAnt does find the shortest path and never settles on something less than the best. Better still, it also adapts to changing situations. If a new shorter path arises then the software ants switch to it!
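The switching behaviour can be illustrated with a toy simulation: a colony settles on the better of two paths, then a shorter path appears partway through. Again, the update rule and constants below are a simplified stand-in assumed for illustration, not the published EigenAnt equations.

```python
import random

# Two paths to food at first; lengths and constants are illustrative.
lengths = [3.0, 5.0]
pheromone = [1.0, 1.0]
DEPOSIT, REMOVAL = 0.2, 0.1   # assumed toy constants

def step():
    """One software ant: probabilistic choice, deposit, selective removal."""
    total = sum(pheromone)
    probs = [p / total for p in pheromone]
    i = random.choices(range(len(pheromone)), weights=probs)[0]
    pheromone[i] += DEPOSIT / lengths[i]
    pheromone[i] -= REMOVAL * probs[i] * pheromone[i]

random.seed(7)
for _ in range(3000):
    step()

# A new, shorter path is discovered, starting with only a weak trail.
lengths.append(1.0)
pheromone.append(0.5)
for _ in range(3000):
    step()

leader_after = pheromone.index(max(pheromone))   # index of the strongest trail
```

Despite starting with the weakest trail, the new shortest path ends up as the leader: the software ants switch to it rather than staying stuck on the old favourite.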
Sameena won the award for the best PhD in India
There are all sorts of computer science uses for this kind of algorithm, such as in ever-changing computer networks, where we always want to route data via the current quickest route. Sameena, however, has also suggested we humans could learn from this rather remarkable adaptability of ants. We are very bad at adapting to new situations, often getting stuck on poor solutions because of our initial biases. The more successful a particular life path has been for us, the more likely we are to keep following it, behaving in the same way, even when the situation changes. Sameena found this out when she took her dream job as a hedge fund manager. It didn’t go well. Since then, after changing tack, she has been phenomenally successful, first developing AIs for news providers, and then more recently for a bank. As she says: don’t worry if your current career path doesn’t lead to success, there are many other paths to follow. Be willing to adapt and you will likely find something better. We need to nurture lots of possible life paths, not just blindly focus on one.
EPSRC supports this blog through research grant EP/W033615/1.