CS4FN Advent 2023 – Day 5: snowman: analog hydraulic computers (aka water computers), digital compression, and a puzzle

This post is behind the 5th ‘door’ of the CS4FN Christmas Computing Advent Calendar – we’re publishing a computing-themed (and sometimes festive-themed) post every day until Christmas Day. Today’s picture is a snowman, and what’s a snowman made of but frozen water?

Image drawn and digitised by Jo Brodie.

1. You can make a computer out of water!

In 1936 Vladimir Lukyanov got creative with some pipes and pumps and built a computer, called a water (or hydraulic) integrator, which could store water temporarily in some bits and pump water to other bits. The movement of the water, and where it ended up, showed him the answer – a physical representation of some Very Hard Sums (sums, equations and calculations that are easier now thanks to much faster computers).

A simple and effective way of using water to show a mathematical relationship popped up on QI and the video below demonstrates Pythagoras’ Theorem rather nicely.

Video from the BBC via their YouTube channel.

In 1939 Lukyanov published an article about his analog hydraulic computer in the journal of the ‘Otdeleniye Technicheskikh Nauk’ (‘Отделение технических наук’ in Russian, the Department of Technical Sciences – these days we’d probably say Engineering Sciences) and in 1955 this was translated by the Massachusetts Institute of Technology (MIT) for the US Army’s “Arctic Construction and Frost Effects Laboratory”. You can see a copy of his translated ‘Hydraulic Apparatus for Engineering Computations’ at the Internet Archive.

In a rather pleasing coincidence for this blog post (that you might think was by design rather than just good fortune) this device was actually put to work by the US Army to study the freezing and thawing not of snowmen but of soil (i.e. the ground). It’s particularly useful, if you’re building and maintaining a military airfield (or even just roads), to know how well the concrete runway will survive changes in weather (and how well your aircraft’s wheels will survive after meeting it).

For a modern take on the ‘hydrodynamic calculating machine’ aka water computer see this video from science communicator Steve Mould in which he creates a computer that can do some simple additions.

Video by Steve Mould via his YouTube channel.

2. The puzzle of digital compression

Our snowman’s been sitting around for a while and his ice has probably become a bit compacted, so he might be taking up less space (or he might have melted). Compression is a technique computer scientists use to make big data files smaller.

Big files take a long time to transfer from one place to another. The more data, the longer it takes, and the more memory is needed to store the information. Compressing the files saves space. Data on computers is stored as long sequences of characters – ultimately as binary 1s and 0s. The idea with compression is that we use an algorithm to change the way the information is represented so that fewer characters are needed to store exactly the same information.

That involves using special codes. Each common word or phrase is replaced by a shorter sequence of symbols. A long file can be made much shorter if it has lots of similar sequences, just as the message below has been shortened. A second algorithm can then be used to get the original back. We’ve turned the idea into a puzzle that involves matching patterns from the code book. Can you work out what the original message was? (Answer tomorrow, and another snowman-themed puzzle coming soon.)

The code: NG1 AMH5 IBEC2 84F6JKO 7JDLC93 (clue: Spooky apparitions are about to appear on Christmas Eve).

The code book (match the letter or number to the word it codes for).
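
If you’d like to see how this kind of code-book compression works in general, here’s a minimal Python sketch. The code book below is made up purely for illustration – it isn’t the real code book from the puzzle, so it won’t decode the message above.

```python
# A toy dictionary-based compressor: each common word is replaced
# by a short code, just like the puzzle's code book.
# NOTE: this code book is invented for illustration only.
CODE_BOOK = {
    "1": "merry",
    "2": "christmas",
    "3": "snowman",
}
# Reverse lookup: word -> code, used when compressing.
REVERSE = {word: code for code, word in CODE_BOOK.items()}

def compress(message: str) -> str:
    """Replace every code-book word with its shorter code."""
    return " ".join(REVERSE.get(w, w) for w in message.lower().split())

def decompress(coded: str) -> str:
    """Reverse the substitution to get the original message back."""
    return " ".join(CODE_BOOK.get(token, token) for token in coded.split())

original = "merry christmas from the snowman"
coded = compress(original)
print(coded)              # 1 2 from the 3
print(decompress(coded))  # merry christmas from the snowman
```

Real compression programs work along similar lines, but build their code books automatically by scanning the file for sequences that repeat.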

3. Answer to yesterday’s puzzle

The creation of this post was funded by UKRI, through grant EP/K040251/2 held by Professor Ursula Martin, and forms part of a broader project on the development and impact of computing.


Click the tree to visit our CS4FN Christmas Computing Advent Calendar

EPSRC supports this blog through research grant EP/W033615/1.

Composing from Compression

Recoloured Cranium head abstract image by Gordon Johnson from Pixabay

Computers compress files to save space. But compression also allows them to create music!

Music is special. It’s one of the things, like language, that makes us human, separating us from animals. It’s also special as art, because it doesn’t exist as an object in the world – it depends on human memory. “But what about CDs? They’re objects in the world”, you might say, and you’d be right, but the CD is not the music. The CD contains data files of numbers. Those numbers are translated by electronics into the movements of a loudspeaker, to create sound waves. Even the sound waves aren’t music! They only become music when a human hears them, because understanding music is about noticing repetition, variation and development in its structure. That’s why songs have verses and choruses: so we can find a starting point to understand their structure. In fact, we’re so good at understanding musical structure, we don’t even notice we’re doing it.

What’s more, music affects us emotionally: we get excited (using the same chemicals that get us excited when we’re in love or ready to flee danger) when we hear the anthem section of a trance track, or recognise the big theme returning at the end of a symphony.

Surprisingly, brains seem to understand musical structure in a way that’s like the algorithms computer scientists use to compress data. It’s better to store data compressed than uncompressed, because it takes less storage space. We think that’s why brains do it too.

Even more surprisingly, brains also seem to be able to learn the best way to store compressed music data. Computers use bits as their basic storage unit, but we can make groups of bits work like other things (numbers, words, pictures, angry birds…); brains seem to do something similar. For example, pitch (high vs. low notes) in sequence is an important part of music: we build melodies by lining up notes of different pitch one after the other. As we learn to hear music (starting before birth, and continuing throughout life), we learn to remember pitch in ever more efficient ways, giving our compression algorithms better and better chances to compress well. And so we remember music better.

Our team use compression algorithms to understand how music works in the human mind. We have discovered that, when our programs compress music, they can sometimes predict musical structures, even if neither they nor a human have “heard” them before. To compress something, you find large sections of repeated data and replace each with a label saying “this is one of those”. It’s like labelling a book with its title: if you’ve read Lord of the Rings, when I say the title you know what I mean without me telling the whole story. If we do this to the internal structure of music, there are little repetitions everywhere, and the order in which they appear is what makes up the music’s structure.

If we compress music, but then decompress it in a different way, we can get a new piece of music in a similar style or genre. We have evidence that human composers do that too!
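
To make that idea concrete, here’s a minimal Python sketch – very much a toy, not the team’s actual research software. It spots the repeated motifs in a short melody, stores each distinct one once under a label (the compressed form), then ‘decompresses’ the labels in a different order to compose a variation.

```python
# Toy illustration: compress a melody by labelling repeated motifs,
# then decompress the labels in a new order to make a variation.
# The melody and the fixed motif length are invented for this example.
MELODY = ["C", "D", "E", "C", "D", "E", "G", "F", "E", "C", "D", "E"]

def find_motifs(notes, length=3):
    """Label each run of `length` notes; repeated runs share a label."""
    motifs, labels = {}, []
    for i in range(0, len(notes) - len(notes) % length, length):
        chunk = tuple(notes[i:i + length])
        if chunk not in motifs:
            motifs[chunk] = f"M{len(motifs)}"   # M0, M1, ...
        labels.append(motifs[chunk])
    return motifs, labels

motifs, structure = find_motifs(MELODY)
print(structure)   # ['M0', 'M0', 'M1', 'M0'] - the compressed form

# 'Decompress differently': reorder the labels to compose a variation.
lookup = {label: list(chunk) for chunk, label in motifs.items()}
variation = [note for label in ["M0", "M1", "M1", "M0"]
             for note in lookup[label]]
print(variation)   # ['C','D','E', 'G','F','E', 'G','F','E', 'C','D','E']
```

The compressed form is the structure of the piece; shuffling the labels keeps the motifs (the style) while changing the structure (the piece).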

What our programs are doing is learning to create new music. There’s a long way to go before they produce music you’ll want to dance to – but we’re getting there!

Geraint Wiggins, Queen Mary University of London

