AI Detecting the Scribes of the Dead Sea Scrolls

Computer science and artificial intelligence have provided a new way to do science: it was in fact one of the earliest uses of the computer. They are now giving scholars new ways to do research in other disciplines, such as ancient history, too. Artificial intelligence has been used in a novel way to help understand how the Dead Sea Scrolls were written, and it turns out scribes in ancient Judea worked in teams.

The Dead Sea Scrolls are a collection of almost a thousand ancient manuscripts, roughly two thousand years old, that were found in caves near the Dead Sea. The collection includes the oldest known surviving copies of biblical texts.

The cave where most of the Dead Sea Scrolls were found.

Researchers from the University of Groningen used artificial intelligence techniques to analyse a digitised version of the longest scroll in the collection, known as the Great Isaiah Scroll. They picked one letter, aleph, which appears thousands of times throughout the document, and analysed it in detail.

Two kinds of artificial intelligence program were used. The first, feature extraction, was based on computer vision and image processing, and was needed to recognise features in the images. At one level these features are the characters themselves, but, more subtly, the aim here was for the features to correspond to ink traces reflecting the actual muscle movements of the scribes.
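A minimal sketch of this kind of feature extraction might look like the following (the function name and the choice of features are illustrative assumptions, not the researchers’ actual pipeline): binarise an image of a single letter, thin it to a one-pixel-wide skeleton approximating the ink trace, then measure stroke thickness along that trace.

```python
# Hypothetical sketch of ink-trace feature extraction, not the study's
# actual method. Requires numpy, scipy and scikit-image.
import numpy as np
from scipy.ndimage import distance_transform_edt
from skimage.filters import threshold_otsu
from skimage.morphology import skeletonize

def letter_features(grey: np.ndarray) -> np.ndarray:
    """Summarise one letter image as a small vector of stroke statistics."""
    ink = grey < threshold_otsu(grey)   # dark pixels count as ink
    trace = skeletonize(ink)            # one-pixel-wide ink trace
    # The distance from a trace pixel to the nearest background pixel is
    # roughly half the stroke width at that point.
    widths = 2 * distance_transform_edt(ink)[trace]
    return np.array([widths.mean(), widths.std(), trace.sum()])
```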

The second was machine learning. Machine learning programs are good at spotting patterns in data, grouping the data into things that are similar and things that are different. A typical textbook example would be giving the program images of cats and of dogs. It would spot the pattern of features that corresponds to dogs and the different pattern of features that corresponds to cats, and group each image under one pattern or the other.
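As a toy illustration of that grouping idea (not the study’s actual algorithm), here is a sketch that uses k-means clustering from scikit-learn on made-up feature vectors standing in for the two kinds of image:

```python
# Toy illustration of grouping data into two patterns with k-means.
# The feature vectors are synthetic stand-ins for extracted image features.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
cats = rng.normal(loc=0.0, scale=1.0, size=(100, 3))  # one pattern
dogs = rng.normal(loc=1.5, scale=1.0, size=(100, 3))  # a shifted pattern
data = np.vstack([cats, dogs])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(data)
# Most of the first 100 items land in one cluster, most of the rest in the
# other, even though no image was ever labelled 'cat' or 'dog' by hand.
print(labels[:100].sum(), labels[100:].sum())
```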

Here the data was all those alephs, or more specifically the features extracted from them. Essentially, the aim was to find patterns based on the muscle movements of the original scribe of each letter. To the human eye the writing throughout the document looks very, very uniform, suggesting a single scribe wrote the whole document. If that were the case, only one pattern would be found, one that all the letters belonged to, with no clear way to split them. Despite this, the artificial intelligence evidence suggests there were actually two scribes involved. There were two patterns.

The research team found, by analysing the way the letters were written, that there were two clear groupings of letters. Letters in one group were written in one way, and those in the other in a slightly different way. There were very subtle differences in the way strokes were written, such as in their thickness and in the positions of the connections between strokes. This could just be down to variations in the way a single writer formed letters at different times. However, the differences were not random: they split very clearly at a point halfway through the scroll. This suggests there were two writers, each working on a different part. Because the characters were otherwise so uniform, the two scribes must have been making an effort to carefully mirror each other’s writing style, so that the letters looked the same to the naked eye.
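A sketch of how one might check that the difference is a clean split rather than random variation (an illustrative stand-in for the study’s statistical analysis, assuming each letter’s cluster label is known in scroll order):

```python
# Hypothetical sketch: given each aleph's cluster label (0 or 1) in the
# order the letters appear in the scroll, find the single split point that
# best separates the two clusters. A clean split with few 'misplaced'
# labels points to two scribes rather than random variation. Assumes
# cluster 0 is the earlier scribe; flip the labels if it is not.
import numpy as np

def best_split(labels: np.ndarray) -> tuple[int, int]:
    """Return (split index, number of labels on the 'wrong' side)."""
    n = len(labels)
    errors = [labels[:i].sum() + (n - i) - labels[i:].sum()
              for i in range(1, n)]
    best = int(np.argmin(errors))
    return best + 1, int(errors[best])

# One stray letter in the first half; the best split is still at index 5.
print(best_split(np.array([0, 0, 1, 0, 0, 1, 1, 1, 1, 1])))  # (5, 1)
```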

The research team have not only found out something interesting about the Dead Sea Scrolls, but also demonstrated a new way to study ancient handwriting. With a few exceptions, the scribes who wrote the ancient documents that survive to the modern day, like the Dead Sea Scrolls, are anonymous. Thanks to leading-edge computer science, though, we now have a new way to find out more about them.

Explore the digitised version of the Dead Sea Scrolls yourself at www.deadseascrolls.org.il

– Paul Curzon, Queen Mary University of London

Losing the match? Follow the science. Change the kit!

Artificial intelligence software has shown that two different Manchester United gaffers got it right in believing that kit and stadium seat colours matter if the team are going to win.

It is 1996. Sir Alex Ferguson’s Manchester United are doing the unthinkable. At half time they are losing 3-0 to lowly Southampton. Then the team return to the pitch for the second half and they’ve changed their kit. No longer are they wearing their normal grey away kit but are in blue and white, and their performance improves (if not enough to claw back such a big lead). The match becomes infamous for that kit change: the genius gaffer blaming the team’s poor performance on their kit seemed silly to most. Just play better football if you want to win!

Jump forward to 2021, and Manchester United Manager Ole Gunnar Solskjaer, who originally joined United as a player in that same year, 1996, tells a press conference that the club are changing the stadium seats to improve the team’s performance!

Is this all a repeat of previously successful mind games to deflect from poor performances? Or superstition, dressed up as canny management, perhaps. Actually, no. Both managers were following the science.

Ferguson wasn’t just following some gut instinct: he had been employing a vision scientist, Professor Gail Stephenson, who had been brought into the club to help improve the players’ visual awareness, getting them to exercise the muscles in their eyes, not just their legs! She had pointed out to Ferguson that the grey kit would make it harder for the players to pick each other out quickly. The Southampton match was presumably the final straw that gave him the excuse to follow her advice.

She was very definitely right, and modern vision-based artificial intelligence technology agrees with her! Colours do make things easier or harder to notice, and can slow decision making in a way that matters on the pitch. Twenty-five years ago the problem was grey kit merging into the grey background of the crowd. Now it is red shirts merging into the background of an empty stadium of red seats.

It is all about how our brain processes the visual world and the saliency of objects. Saliency is just how much an object stands out, and that depends on how our brain processes information. Objects are much easier to pick out if they have high contrast with their surroundings: a red shirt on a black background, for example.
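Real saliency models are far more sophisticated, but a toy sketch of the contrast idea (the function and parameter names are made up) could be as simple as scoring each pixel by how much it differs from its local neighbourhood:

```python
# Toy contrast-based saliency sketch, far simpler than real models like
# DragonflyAI: pixels that differ most from their surroundings score high.
import numpy as np
from scipy.ndimage import uniform_filter

def contrast_saliency(grey: np.ndarray, neighbourhood: int = 31) -> np.ndarray:
    """Return a saliency map in [0, 1] for a greyscale image."""
    pixels = grey.astype(float)
    local_mean = uniform_filter(pixels, size=neighbourhood)
    contrast = np.abs(pixels - local_mean)
    return contrast / contrast.max()  # grey shirt on grey crowd scores low
```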

Peter McOwan and Hamit Soyel at Queen Mary combined vision research and computer science, creating an artificial intelligence (AI) that sees like a human, in the sense that it predicts in real time what will and won’t stand out to us (see DragonflyAI: I see what you see). They used the program to analyse images from that infamous football match before and after the kit change, and showed that the AI agreed with Gail Stephenson and Alex Ferguson. The players really were much easier for their team mates to see in the second half (see the DragonflyAI version of the scenes below).

DragonflyAI highlights areas of a scene that are more salient to humans and so easier to notice. Red areas stand out the most. In the left image, wearing the grey kit, Ryan Giggs merges into the background. In the right image, in the blue and white kit, he is highly salient (red).

Details matter, and science, including computer science and artificial intelligence, can help teams that want to win in all sorts of ways. So if you want an edge over the opposition, hire an AI to analyse the stadium scene at your next match. Changing the colour of the seats really could make a difference.

Find out more about DragonflyAI: https://dragonflyai.co/

– Paul Curzon, Queen Mary University of London

DragonflyAI: I see what you see

What use is a computer that sees like a human? Can’t computers do better than us? Well, such a computer can predict what we will and will not see, and there is BIG money to be made doing that!

The Hong Kong Skyline


Peter McOwan’s team at Queen Mary spent 10 years doing exploratory research into the way our brains really see the world: exploring illusions, inventing games to test the ideas, and creating a computer model to test their understanding. Ultimately they created a program that sees like a human. But what practical use is a program that mirrors the oddities of the way we see the world? Surely a computer can do better than us, noticing all the things that we miss or misunderstand? Well, for starters, the research opens up exciting possibilities for new applications, especially for marketeers.

The Hong Kong Skyline as seen by DragonflyAI


A fruitful avenue to emerge is ‘visual analytics’ software: applications that predict what humans will and will not notice. Our world is full of competing demands, overloading us with information. All around us things vie to catch our attention, whether a shop window display, a road sign warning of danger or an advertising poster.

Imagine a shop has a big new promotion designed to entice people in, but no more people enter than normal. No one notices the display; their attention is elsewhere. Another company runs a web ad campaign, but it has no effect, as people’s eyes are pulled elsewhere on the screen. A third company pays to have its products appear in a blockbuster film. Again, a waste of money: in surveys afterwards no one knew the products had been there. A town council puts up a new warning sign at a dangerous bend in the road, but the crashes continue. These are all situations where predicting in advance where people will look allows you to get it right. In the past this was done either by long and expensive user testing, perhaps using software that tracks where people look, or by having teams of ‘experts’ discuss what they think will happen. What if a program could make the predictions in a fraction of a second? What if you could tweak things repeatedly until your important messages could not be missed?

Queen Mary’s Hamit Soyel turned the research models into a program called DragonflyAI, which does exactly that. The program analyses all kinds of imagery in real time and predicts the places where people’s attention will, and will not, be drawn. It works whether the content is moving or not, and whether it is in the real world, completely virtual, or both. This gives marketeers the power to predict, and so influence, human attention, making sure people see the things they want them to see. The software quickly caught the attention of big global companies like NBC Universal, GSK and Jaywing, who now use the technology.
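DragonflyAI itself is proprietary, but the classic spectral residual saliency model that ships with OpenCV’s contrib package illustrates the same idea; a sketch (the file names are made up):

```python
# Sketch using OpenCV's spectral residual saliency model as a stand-in
# for a commercial predictor like DragonflyAI.
# Requires the opencv-contrib-python package.
import cv2

image = cv2.imread("shop_window.jpg")            # hypothetical input image
model = cv2.saliency.StaticSaliencySpectralResidual_create()
ok, saliency_map = model.computeSaliency(image)  # values in [0, 1]
if ok:
    # Bright areas are where attention is predicted to be drawn.
    cv2.imwrite("saliency_map.png", (saliency_map * 255).astype("uint8"))
```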

Find out more about DragonflyAI: https://dragonflyai.co/